Sample records for rate testing methodology

  1. Single Event Test Methodologies and System Error Rate Analysis for Triple Modular Redundant Field Programmable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Allen, Gregory; Edmonds, Larry D.; Swift, Gary; Carmichael, Carl; Tseng, Chen Wei; Heldt, Kevin; Anderson, Scott Arlo; Coe, Michael

    2010-01-01

    We present a test methodology for estimating system error rates of Field Programmable Gate Arrays (FPGAs) mitigated with Triple Modular Redundancy (TMR). The test methodology is founded in a mathematical model, which is also presented. Accelerator data from a 90 nm Xilinx Military/Aerospace-grade FPGA are shown to fit the model. Fault injection (FI) results are discussed and related to the test data. Design implementation and the corresponding impact of multiple bit upset (MBU) are also discussed.
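
The record cites a mathematical model without reproducing it. As a hedged illustration of why TMR-mitigated parts still accumulate system errors, the sketch below uses a generic assumption (Poisson configuration upsets per scrub cycle, failure when at least two of three redundant domains are corrupted); it is not the paper's model.

```python
import math

# Generic TMR failure sketch (assumed model, not the paper's): configuration
# upsets arrive as a Poisson process; lam is the expected upsets per redundant
# domain per scrub interval. TMR masks a single corrupted domain, so a system
# error requires >= 2 of the 3 domains hit within one scrub cycle.
def p_domain_upset(lam):
    return 1.0 - math.exp(-lam)          # P(at least one upset in a domain)

def p_tmr_failure(lam):
    p = p_domain_upset(lam)
    return 3 * p**2 * (1 - p) + p**3     # exactly two, or all three, domains hit

for lam in (1e-4, 1e-3, 1e-2):
    print(f"lam={lam:g}  P(system error per scrub)={p_tmr_failure(lam):.3e}")
```

For small lam the failure probability scales as 3*lam**2, which is why scrubbing faster (smaller lam per cycle) pays off quadratically.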

  2. Preloading To Accelerate Slow-Crack-Growth Testing

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John P.; Choi, Sung R.; Pawlik, Ralph J.

    2004-01-01

    An accelerated-testing methodology has been developed for measuring the slow-crack-growth (SCG) behavior of brittle materials. Like the prior methodology, the accelerated-testing methodology involves dynamic fatigue (constant stress-rate) testing, in which a load or a displacement is applied to a specimen at a constant rate. SCG parameters or life prediction parameters needed for designing components made of the same material as that of the specimen are calculated from the relationship between (1) the strength of the material as measured in the test and (2) the applied stress rate used in the test. Despite its simplicity and convenience, dynamic fatigue testing as practiced heretofore has one major drawback: it is extremely time-consuming, especially at low stress rates. The present accelerated methodology reduces the time needed to test a specimen at a given rate of applied load, stress, or displacement. Instead of starting the test from zero applied load or displacement as in the prior methodology, one preloads the specimen and increases the applied load at the specified rate (see Figure 1). One might expect the preload to alter the results of the test and indeed it does, but fortunately, it is possible to account for the effect of the preload in interpreting the results. The accounting is done by calculating the normalized strength (defined as the strength in the presence of preload divided by the strength in the absence of preload) as a function of (1) the preloading factor (defined as the preload stress divided by the strength in the absence of preload) and (2) an SCG parameter, denoted n, that is used in a power-law crack-speed formulation. Figure 2 presents numerical results from this theoretical calculation.
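
The normalized-strength accounting described above can be sketched numerically. The closed form below is the one commonly derived for a power-law crack-velocity formulation; it is an assumption for illustration, not quoted from the record.

```python
# Assumed closed form (power-law SCG derivation, not quoted from the record):
#   S_p / S_0 = (1 + alpha**(n + 1)) ** (1 / (n + 1))
# where alpha = preload stress / strength without preload, and n is the SCG
# power-law exponent.
def normalized_strength(alpha, n):
    """Strength with preload divided by strength without preload."""
    return (1.0 + alpha ** (n + 1)) ** (1.0 / (n + 1))

for alpha in (0.0, 0.5, 0.8, 0.9):
    # Starting the ramp at the preload skips a fraction alpha of the test time.
    print(f"alpha={alpha:.1f}  S_p/S_0={normalized_strength(alpha, 20):.6f}  "
          f"time saved={alpha:.0%}")
```

For a typical exponent (n near 20 for many brittle materials), even a 90% preload shifts the measured strength by under 1%, which is what makes the acceleration attractive.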

  3. The UCERF3 grand inversion: Solving for the long‐term rate of ruptures in a fault system

    USGS Publications Warehouse

    Page, Morgan T.; Field, Edward H.; Milner, Kevin; Powers, Peter M.

    2014-01-01

    We present implementation details, testing, and results from a new inversion‐based methodology, known colloquially as the “grand inversion,” developed for the Uniform California Earthquake Rupture Forecast (UCERF3). We employ a parallel simulated annealing algorithm to solve for the long‐term rate of all ruptures that extend through the seismogenic thickness on major mapped faults in California while simultaneously satisfying available slip‐rate, paleoseismic event‐rate, and magnitude‐distribution constraints. The inversion methodology enables the relaxation of fault segmentation and allows for the incorporation of multifault ruptures, which are needed to remove magnitude‐distribution misfits that were present in the previous model, UCERF2. The grand inversion is more objective than past methodologies, as it eliminates the need to prescriptively assign rupture rates. It also provides a means to easily update the model as new data become available. In addition to UCERF3 model results, we present verification of the grand inversion, including sensitivity tests, tuning of equation set weights, convergence metrics, and a synthetic test. These tests demonstrate that while individual rupture rates are poorly resolved by the data, integrated quantities such as magnitude–frequency distributions and, most importantly, hazard metrics, are much more robust.
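
The inversion idea can be shown with a toy sketch: minimize the misfit of a non-negative rupture-rate vector against linear constraints by simulated annealing. This is illustrative only (an invented two-rupture system and single-rate perturbations); it is not UCERF3 code.

```python
import math, random

# Toy "grand inversion" sketch: solve for non-negative rupture rates x that
# minimize ||A x - d||^2, where rows of A encode slip-rate / paleoseismic /
# magnitude-distribution constraints and d holds the target data.
def anneal(A, d, n_iter=20000, t0=1.0, seed=1):
    rng = random.Random(seed)
    nrow, ncol = len(A), len(A[0])
    x = [0.0] * ncol

    def misfit(v):
        return sum((sum(A[i][j] * v[j] for j in range(ncol)) - d[i]) ** 2
                   for i in range(nrow))

    e = misfit(x)
    for k in range(n_iter):
        t = t0 / (1 + k)                                    # simple cooling schedule
        j = rng.randrange(ncol)
        trial = list(x)
        trial[j] = max(0.0, trial[j] + rng.gauss(0.0, 0.1))  # rates stay >= 0
        e_trial = misfit(trial)
        if e_trial < e or rng.random() < math.exp(-(e_trial - e) / t):
            x, e = trial, e_trial
    return x, e

# Invented two-rupture system: rupture 1 loads fault A; ruptures 1 and 2 both
# load fault B. The data vector d holds the target rates.
A = [[1.0, 0.0],
     [1.0, 1.0]]
d = [0.5, 1.5]
x, e = anneal(A, d)
print([round(v, 3) for v in x], round(e, 6))
```

The sketch also mirrors a finding quoted above: the misfit pins down integrated quantities well, while individual rates are only loosely resolved when constraints are few.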

  4. Methodology for testing infrared focal plane arrays in simulated nuclear radiation environments

    NASA Astrophysics Data System (ADS)

    Divita, E. L.; Mills, R. E.; Koch, T. L.; Gordon, M. J.; Wilcox, R. A.; Williams, R. E.

    1992-07-01

    This paper summarizes test methodology for focal plane array (FPA) testing that can be used for benign (clear) and radiation environments, and describes the use of custom dewars and integrated test equipment in an example environment. The test methodology, consistent with American Society for Testing and Materials (ASTM) standards, is presented for the total accumulated gamma dose, transient dose rate, gamma flux, and neutron fluence environments. The merits and limitations of using Cobalt 60 for gamma environment simulations and of using various fast-neutron reactors and neutron sources for neutron simulations are presented. Test result examples are presented to demonstrate test data acquisition and FPA parameter performance under different measurement conditions and environmental simulations.

  5. An Investigation into the Effect of the Mode of Presentation on Contract Evaluation When Cost/Schedule Control System Criteria Is Used.

    DTIC Science & Technology

    1986-09-01

    inversely related to years of experience. ... IV. Methodology. The methods used to test the research hypotheses were experimentation and survey. Two test... ... Attribute Ratings vs Mode of Presentation (Paired T-test) ... Abstract: This research focused

  6. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires.

    PubMed

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying propensity for seed attachment and transport rate on boot soles and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found that the percentage attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test effects of different influencing factors on these two dispersal vectors.

  8. Accelerated Testing Methodology in Constant Stress-Rate Testing for Advanced Structural Ceramics: A Preloading Technique

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Gyekenyesi, John P.; Huebert, Dean; Bartlett, Allen; Choi, Han-Ho

    2001-01-01

    A preloading technique was used as an accelerated testing methodology in constant stress-rate ('dynamic fatigue') testing for two different brittle materials. The theory developed previously for fatigue strength as a function of preload was further verified through extensive constant stress-rate testing of glass-ceramic and CRT glass in room-temperature distilled water. The preloading technique was also used in this study to identify the prevailing failure mechanisms at elevated temperatures, particularly at lower test rates, in which a series of mechanisms can be associated simultaneously with material failure, resulting in a significant strength increase or decrease. Two advanced ceramics, a SiC whisker-reinforced composite silicon nitride and a 96 wt% alumina, were used at elevated temperatures. It was found that the preloading technique can be used as an additional tool to pinpoint the dominant failure mechanism associated with such a considerable strength increase or decrease.

  10. Methodological framework for heart rate variability analysis during exercise: application to running and cycling stress testing.

    PubMed

    Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel

    2018-05-01

    Standard methodologies for heart rate variability (HRV) analysis, and its physiological interpretation as a marker of autonomic nervous system condition, have been widely published for rest but much less so for exercise. A methodological framework for HRV analysis during exercise is proposed, which deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). This is applied to 23 male subjects who underwent different tests: maximal and submaximal, running and cycling; the ECG, respiratory frequency and oxygen consumption were recorded simultaneously. High-frequency (HF) power estimates obtained with the proposed methodology differ substantially from those based on the standard fixed band: for medium and high levels of exercise and recovery, HF power is 20 to 40% higher. When cycling, HF power is around 40% higher than when running, while CC power is around 20% stronger in running.
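
The band-redefinition step can be sketched minimally (a hedged reading of the record, not the authors' code): HF power is integrated over a band centred on the measured respiratory frequency instead of the fixed 0.15 to 0.4 Hz band, which matters once exercise pushes respiration above 0.4 Hz.

```python
# Sketch: HF power from a precomputed HRV spectrum, using either the standard
# fixed band or a band centred on the measured respiratory frequency.
# The 0.125 Hz half-width is an invented illustrative choice.
def band_power(spectrum, lo, hi):
    """spectrum: list of (freq_hz, power) pairs; total power in [lo, hi]."""
    return sum(p for f, p in spectrum if lo <= f <= hi)

def hf_power(spectrum, resp_freq_hz=None, half_width=0.125):
    if resp_freq_hz is None:                 # standard fixed HF band
        return band_power(spectrum, 0.15, 0.40)
    return band_power(spectrum, resp_freq_hz - half_width,
                      resp_freq_hz + half_width)

# Toy spectrum with a respiration peak at 0.55 Hz (heavy exercise):
spectrum = [(0.10, 1.0), (0.30, 2.0), (0.55, 3.0)]
print(hf_power(spectrum))                     # fixed band misses the 0.55 Hz peak
print(hf_power(spectrum, resp_freq_hz=0.55))  # respiration-centred band captures it
```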

  11. Two-step rating-based 'double-faced applicability' test for sensory analysis of spread products as an alternative to descriptive analysis with trained panel.

    PubMed

    Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong

    2018-03-01

    Descriptive analysis with a trained sensory panel has thus far been the best-defined methodology for characterizing various products. However, in practical terms, the intensive training required for descriptive analysis has been recognized as a serious drawback. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test with a novel output measure of applicability magnitude (d'A) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d'A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and the two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can provide a direct measure of the applicability magnitude of sensory attributes of the samples tested, in terms of d'A, for sensory characterization of individual samples and multiple-sample comparisons. This suggests that when an appropriate list of attributes for the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d'A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process.

  12. Constructing a question bank based on script concordance approach as a novel assessment methodology in surgical education.

    PubMed

    Aldekhayel, Salah A; Alselaim, Nahar A; Magzoub, Mohi Eldin; Al-Qattan, Mohammad M; Al-Namlah, Abdullah M; Tamim, Hani; Al-Khayal, Abdullah; Al-Habdan, Sultan I; Zamakhshary, Mohammed F

    2012-10-24

    Script Concordance Test (SCT) is a new assessment tool that reliably assesses clinical reasoning skills. Previous descriptions of developing SCT-question banks were merely subjective. This study addresses two gaps in the literature: 1) conducting the first phase of a multistep validation process of SCT in Plastic Surgery, and 2) providing an objective methodology to construct a question bank based on SCT. After developing a test blueprint, 52 test items were written. Five validation questions were developed and a validation survey was established online. Seven reviewers were asked to answer this survey. They were recruited from two countries, Saudi Arabia and Canada, to improve the test's external validity. Their ratings were transformed into percentages. Analysis was performed to compare reviewers' ratings by looking at correlations, ranges, means, medians, and overall scores. Scores of reviewers' ratings were between 76% and 95% (mean 86% ± 5). We found poor correlations between reviewers (Pearson's: +0.38 to -0.22). Ratings of individual validation questions ranged between 0 and 4 (on a scale 1-5). Means and medians of these ranges were computed for each test item (mean: 0.8 to 2.4; median: 1 to 3). A subset of test items comprising 27 items was generated based on a set of inclusion and exclusion criteria. This study proposes an objective methodology for validation of SCT-question bank. Analysis of validation survey is done from all angles, i.e., reviewers, validation questions, and test items. Finally, a subset of test items is generated based on a set of criteria.
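
The final selection step might look like the sketch below. The record does not publish its inclusion and exclusion criteria, so the rule and thresholds here (min_mean, max_range) are hypothetical.

```python
# Hypothetical item-selection rule (the record's actual criteria are not given):
# retain a test item when reviewers' ratings (1-5 scale) are high on average
# and not too dispersed across reviewers.
from statistics import mean

def select_items(ratings_by_item, min_mean=4.0, max_range=2):
    """ratings_by_item: {item_id: [rating, ...]} -> retained item ids."""
    keep = []
    for item, ratings in ratings_by_item.items():
        if mean(ratings) >= min_mean and max(ratings) - min(ratings) <= max_range:
            keep.append(item)
    return keep

print(select_items({"q1": [4, 5, 4], "q2": [1, 5, 3], "q3": [5, 5, 4]}))
```

Poor inter-reviewer correlation, as reported above, shows up here as a large per-item rating range, which is exactly what the dispersion cut screens out.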

  13. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    PubMed

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

    The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability.
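
The error-rate idea behind the model can be sketched as follows. All distribution parameters below are invented for illustration, and the real model fits a normal mixture to observed diameters rather than assuming two known components.

```python
import math

# Sketch (invented parameters): true zone diameters follow a two-component
# normal mixture (resistant / susceptible); measured diameters add
# methodological noise; the categorization error rate at a susceptible
# breakpoint D is the mixture mass falling on the wrong side of D.
def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def error_rate(D, w_res, mu_res, mu_sus, sd_pop, sd_method):
    sd = math.hypot(sd_pop, sd_method)        # population + methodological spread
    p_res_called_sus = 1.0 - norm_cdf(D, mu_res, sd)  # resistant, diameter >= D
    p_sus_called_res = norm_cdf(D, mu_sus, sd)        # susceptible, diameter < D
    return w_res * p_res_called_sus + (1 - w_res) * p_sus_called_res

# Sweep candidate breakpoints (mm) for an invented antibiotic:
for D in (16, 18, 20, 22):
    print(D, error_rate(D, w_res=0.3, mu_res=10, mu_sus=28,
                        sd_pop=2.0, sd_method=1.0))
```

A zone of methodological uncertainty then corresponds to declining to classify diameters near the crossover where neither component clearly dominates.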

  14. Sediment bioaccumulation test with Lumbriculus variegatus (EPA test method 100.3) effects of feeding and organism loading rate

    EPA Science Inventory

    Sediment bioaccumulation test methodology of USEPA and ASTM in 2000 specifies that the Lumbriculus variegatus should not be fed during the 28-day exposure and recommends an organism loading rate of total organic carbon in sediment to organism dry weight of no less than 50:1. It ...

  15. Pressure Decay Testing Methodology for Quantifying Leak Rates of Full-Scale Docking System Seals

    NASA Technical Reports Server (NTRS)

    Dunlap, Patrick H., Jr.; Daniels, Christopher C.; Wasowski, Janice L.; Garafolo, Nicholas G.; Penney, Nicholas; Steinetz, Bruce M.

    2010-01-01

    NASA is developing a new docking system to support future space exploration missions to low-Earth orbit and the Moon. This system, called the Low Impact Docking System, is a mechanism designed to connect the Orion Crew Exploration Vehicle to the International Space Station, the lunar lander (Altair), and other future Constellation Project vehicles. NASA Glenn Research Center is playing a key role in developing the main interface seal for this docking system. This seal will be relatively large, with an outside diameter in the range of 54 to 58 in. (137 to 147 cm). As part of this effort, a new test apparatus has been designed, fabricated, and installed to measure leak rates of candidate full-scale seals under simulated thermal, vacuum, and engagement conditions. Using this test apparatus, a pressure decay testing and data processing methodology has been developed to quantify full-scale seal leak rates. Tests performed on untreated 54 in. diameter seals at room temperature in a fully compressed state resulted in leak rates below the requirement of 0.0025 lbm of air per day (0.0011 kg/day).
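
The pressure decay calculation itself reduces to the ideal-gas law. A minimal sketch, assuming an isothermal test volume and invented example numbers (this is not NASA's data-processing code):

```python
# Isothermal ideal-gas sketch: m = P V / (R T), so the leaked mass rate follows
# directly from the measured pressure decay dP/dt of the sealed test volume.
R_AIR = 53.35          # ft*lbf / (lbm*R), specific gas constant of air

def leak_rate_lbm_per_day(volume_ft3, dP_dt_psi_per_hr, temp_rankine):
    dP_dt_psf_per_hr = dP_dt_psi_per_hr * 144.0      # psi -> lbf/ft^2
    mdot_lbm_per_hr = volume_ft3 * dP_dt_psf_per_hr / (R_AIR * temp_rankine)
    return mdot_lbm_per_hr * 24.0

# Invented example: a 2 ft^3 volume at room temperature (530 R) losing 1e-4 psi/hr
print(leak_rate_lbm_per_day(2.0, 1e-4, 530.0))
```

The computed figure can then be compared directly against the 0.0025 lbm/day requirement quoted above.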

  16. Inactivating Influenza Viruses on Surfaces Using Hydrogen Peroxide or Triethylene Glycol at Low Vapor Concentrations

    DTIC Science & Technology

    2009-04-01

    would not be expected to have a very significant effect on the TEG vapor concentration. Tests of Natural Die-Off Rate. Our normal methodology for... to get consistent results. For tests to measure the natural die-off rate of influenza viruses, this methodology could not be used because the duration of the test was too long, so an alternative procedure was employed. In preparation for an experimental test to measure the natural die-off rate

  17. Consequences of switching from a fixed 2 : 1 ratio of amoxicillin/clavulanate (CLSI) to a fixed concentration of clavulanate (EUCAST) for susceptibility testing of Escherichia coli.

    PubMed

    Leverstein-van Hall, Maurine A; Waar, Karola; Muilwijk, Jan; Cohen Stuart, James

    2013-11-01

    The CLSI recommends a fixed 2 : 1 ratio of co-amoxiclav for broth microdilution susceptibility testing of Enterobacteriaceae, while EUCAST recommends a fixed 2 mg/L clavulanate concentration. The aims of this study were: (i) to determine the influence of a switch from CLSI to EUCAST methodology on Escherichia coli susceptibility rates; (ii) to compare susceptibility results obtained using EUCAST-compliant microdilution with those from disc diffusion and the Etest; and (iii) to evaluate the clinical outcome of patients with E. coli sepsis treated with co-amoxiclav in relation to the susceptibility results obtained using either method. Resistance rates were determined in three laboratories that switched from CLSI to EUCAST cards with the Phoenix system (Becton Dickinson) as well as in 17 laboratories that continued to use CLSI cards with the VITEK 2 system (bioMérieux). In one laboratory, isolates were simultaneously tested by both the Phoenix system and either disc diffusion (n = 471) or the Etest (n = 113). Medical and laboratory records were reviewed for E. coli sepsis patients treated with co-amoxiclav monotherapy. Only laboratories that switched methodology showed an increase in resistance rates - from 19% in 2010 to 31% in 2011 (P < 0.0001). All isolates that tested susceptible by microdilution were also susceptible by disc diffusion or the Etest, but of 326 isolates that tested resistant by microdilution, 43% and 59% tested susceptible by disc diffusion and the Etest, respectively. Among the 89 patients included there was a better correlation between clinical response and measured MICs using the Phoenix system than the Etest. EUCAST methodology resulted in higher co-amoxiclav E. coli resistance rates than CLSI methodology, but correlated better with clinical outcome. EUCAST-compliant microdilution and disc diffusion provided discrepant results.

  18. Using a Lean Six Sigma Approach to Yield Sustained Pressure Ulcer Prevention for Complex Critical Care Patients.

    PubMed

    Donovan, Elizabeth A; Manta, Christine J; Goldsack, Jennifer C; Collins, Michelle L

    2016-01-01

    Under value-based purchasing, Medicare withholds reimbursements for hospital-acquired pressure ulcer occurrence and rewards hospitals that meet performance standards. With little evidence of a validated prevention process, nurse managers are challenged to find evidence-based interventions. The aim of this study was to reduce the unit-acquired pressure ulcer (UAPU) rate on targeted intensive care and step-down units by 15% using Lean Six Sigma (LSS) methodology. An interdisciplinary team designed a pilot program using LSS methodology to test 4 interventions: standardized documentation, equipment monitoring, patient out-of-bed-to-chair monitoring, and a rounding checklist. During the pilot, the UAPU rate decreased from 4.4% to 2.8%, exceeding the goal of a 15% reduction. The rate remained below the goal through the program control phase at 2.9%, demonstrating a statistically significant reduction after intervention implementation. The program significantly reduced UAPU rates in high-risk populations. LSS methodologies are a sustainable approach to reducing hospital-acquired conditions that should be broadly tested and implemented.
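
The record reports the drop from 4.4% to 2.8% as statistically significant but gives no denominators. A two-proportion z-test of the kind that could support such a claim, with invented patient counts:

```python
import math

# Pooled two-proportion z-test (invented sample sizes; the study's actual
# denominators and test are not given in the record).
def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_two_sided

z, p = two_proportion_z(x1=66, n1=1500, x2=42, n2=1500)   # 4.4% vs 2.8%
print(round(z, 2), round(p, 4))
```

With these invented counts the difference clears the conventional 0.05 threshold; smaller samples with the same proportions would not.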

  19. How effective is drug testing as a workplace safety strategy? A systematic review of the evidence.

    PubMed

    Pidd, Ken; Roche, Ann M

    2014-10-01

    The growing prevalence of workplace drug testing and the narrow scope of previous reviews of the evidence base necessitate a comprehensive review of research concerning the efficacy of drug testing as a workplace strategy. A systematic qualitative review of relevant research published between January 1990 and January 2013 was undertaken. Inclusion criteria were studies that evaluated the effectiveness of drug testing in deterring employee drug use or reducing workplace accident or injury rates. Methodological adequacy was assessed using a published assessment tool specifically designed to assess the quality of intervention studies. A total of 23 studies were reviewed and assessed, six of which reported on the effectiveness of testing in reducing employee drug use and 17 of which reported on occupational accident or injury rates. No studies involved randomised control trials. Only one study was assessed as demonstrating strong methodological rigour; that study found random alcohol testing reduced fatal accidents in the transport industry. The majority of studies reviewed contained methodological weaknesses, including inappropriate study design, limited sample representativeness, the use of ecological data to evaluate individual behaviour change, and failure to adequately control for potentially confounding variables. This latter finding is consistent with previous reviews and indicates that the evidence base for the effectiveness of testing in improving workplace safety is at best tenuous. Better dissemination of the current evidence in relation to workplace drug testing is required to support evidence-informed policy and practice. There is also a pressing need for more methodologically rigorous research to evaluate the efficacy and utility of drug testing.

  20. Single event test methodology for integrated optoelectronics

    NASA Technical Reports Server (NTRS)

    Label, Kenneth A.; Cooley, James A.; Stassinopoulos, E. G.; Marshall, Paul; Crabtree, Christina

    1993-01-01

    A single event upset (SEU), defined as a transient or glitch on the output of a device, and its applicability to integrated optoelectronics are discussed in the context of spacecraft design and the need for more than a bit error rate viewpoint for testing and analysis. A methodology for testing integrated optoelectronic receivers and transmitters for SEUs is presented, focusing on the actual test requirements and system schemes needed for integrated optoelectronic devices. Two main causes of single event effects in the space environment, including protons and galactic cosmic rays, are considered along with ground test facilities for simulating the space environment.
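
The bit-error-rate bookkeeping that the paper argues is insufficient on its own still underlies SEU testing: a device cross-section computed from accelerator counts, folded with an environment flux. A generic sketch with invented numbers:

```python
# Standard SEU reduction (generic practice, invented numbers): cross-section is
# observed errors per unit particle fluence; an on-orbit event rate follows by
# folding the cross-section with the expected particle flux.
def cross_section_cm2(errors, fluence_particles_per_cm2):
    return errors / fluence_particles_per_cm2

def event_rate_per_day(sigma_cm2, flux_particles_per_cm2_per_s):
    return sigma_cm2 * flux_particles_per_cm2_per_s * 86400.0

sigma = cross_section_cm2(errors=120, fluence_particles_per_cm2=1e7)
print(sigma)
print(event_rate_per_day(sigma, flux_particles_per_cm2_per_s=5.0))
```

The paper's point is that for optoelectronic links this scalar rate must be supplemented with transient capture at the system level, since a glitch's effect depends on when and where it lands.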

  1. Effective Rating Scale Development for Speaking Tests: Performance Decision Trees

    ERIC Educational Resources Information Center

    Fulcher, Glenn; Davidson, Fred; Kemp, Jenny

    2011-01-01

    Rating scale design and development for testing speaking is generally conducted using one of two approaches: the measurement-driven approach or the performance data-driven approach. The measurement-driven approach prioritizes the ordering of descriptors onto a single scale. Meaning is derived from the scaling methodology and the agreement of…

  2. Problem Solving in Biology: A Methodology

    ERIC Educational Resources Information Center

    Wisehart, Gary; Mandell, Mark

    2008-01-01

    A methodology is described that teaches science process by combining informal logic and a heuristic for rating factual reliability. This system facilitates student hypothesis formation, testing, and evaluation of results. After problem solving with this scheme, students are asked to examine and evaluate arguments for the underlying principles of…

  3. Signal processing methodologies for an acoustic fetal heart rate monitor

    NASA Technical Reports Server (NTRS)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    Research and development of real-time signal-processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor is presented. A linear predictor algorithm is utilized for detection of the heart tone event, and additional processing derives heart rate. The linear predictor is adaptively 'trained' in a least-mean-square-error sense on generic fetal heart tones recorded from patients. A real-time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. The comparative data provide favorable indications of the feasibility of the acoustic monitor for clinical use.
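
The detection scheme can be sketched with a plain LMS-adaptive linear predictor; this is an assumed illustration, not the authors' implementation, and the synthetic sinusoidal "tone" stands in for recorded fetal heart tones.

```python
import math

# LMS-adaptive linear predictor sketch: train on heart-tone-like data; a large
# prediction error on new data then flags an event, and beat-to-beat event
# spacing would give heart rate.
def lms_train(signal, order=4, mu=0.01, passes=3):
    w = [0.0] * order
    for _ in range(passes):
        for k in range(order, len(signal)):
            x = signal[k - order:k][::-1]          # most recent sample first
            y = sum(wi * xi for wi, xi in zip(w, x))
            e = signal[k] - y                      # prediction error
            w = [wi + mu * e * xi for wi, xi in zip(w, x)]
    return w

def prediction_error(w, signal):
    order = len(w)
    errs = []
    for k in range(order, len(signal)):
        x = signal[k - order:k][::-1]
        errs.append(signal[k] - sum(wi * xi for wi, xi in zip(w, x)))
    return errs

# Demo: after training on a synthetic periodic "tone", the residual is small.
tone = [math.sin(0.3 * i) for i in range(400)]
w = lms_train(tone)
errs = prediction_error(w, tone)
print(max(abs(e) for e in errs[200:]))
```

On noisy data, a threshold on the running prediction-error power would separate tone events from background, mirroring the detection step described above.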

  4. MUSIC-Expected maximization gaussian mixture methodology for clustering and detection of task-related neuronal firing rates.

    PubMed

    Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A

    2017-01-15

    Researchers often rely on simple methods to identify involvement of neurons in a particular motor task. The historical approach has been to inspect large groups of neurons and subjectively separate neurons into groups based on the expertise of the investigator. In cases where neuron populations are small, it is reasonable to inspect these neuronal recordings and their firing rates carefully to avoid data omissions. In this paper, a new methodology is presented for automatic, objective classification of neurons recorded in association with behavioral tasks into groups. By identifying characteristics of neurons in a particular group, the investigator can then identify functional classes of neurons based on their relationship to the task. The methodology is based on integration of a multiple signal classification (MUSIC) algorithm to extract relevant features from the firing rate and an expectation-maximization Gaussian mixture algorithm (EM-GMM) to cluster the extracted features. The methodology is capable of identifying and clustering similar firing rate profiles automatically based on specific signal features. An empirical wavelet transform (EWT) was used to validate the features found in the MUSIC pseudospectrum and the resulting signal features captured by the methodology. Additionally, this methodology was used to inspect behavioral elements of neurons to physiologically validate the model. This methodology was tested using a set of data collected from awake behaving non-human primates.
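
The clustering stage can be illustrated with a minimal one-dimensional EM Gaussian mixture. The MUSIC feature extraction is omitted here, and each neuron is reduced to a single invented feature (say, its peak task-modulated firing rate), so this is a sketch of the EM-GMM idea only.

```python
import math

# Minimal 1-D, two-component EM Gaussian-mixture sketch (not the paper's code).
def em_gmm_1d(data, iters=100):
    mus = [min(data), max(data)]          # deterministic initialization
    sigmas = [1.0, 1.0]
    weights = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            ps = [w * math.exp(-((x - m) ** 2) / (2 * s * s))
                  / (s * math.sqrt(2 * math.pi))
                  for w, m, s in zip(weights, mus, sigmas)]
            tot = sum(ps) or 1e-300
            resp.append([p / tot for p in ps])
        # M-step: re-estimate weights, means, variances
        for j in range(2):
            nj = sum(r[j] for r in resp) or 1e-12
            weights[j] = nj / len(data)
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, data)) / nj
            sigmas[j] = max(math.sqrt(var), 1e-3)   # floor to avoid collapse
    labels = [max(range(2), key=lambda j: r[j]) for r in resp]
    return labels, mus

rates = [5.1, 4.8, 5.5, 4.9, 20.2, 19.7, 21.0, 20.5]   # invented features (Hz)
labels, mus = em_gmm_1d(rates)
print(labels, [round(m, 2) for m in mus])
```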

  5. Methodology for Flight Relevant Arc-Jet Testing of Flexible Thermal Protection Systems

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Bruce, Walter E., III; Mesick, Nathaniel J.; Sutton, Kenneth

    2013-01-01

    A methodology to correlate flight aeroheating environments to the arc-jet environment is presented. For a desired hot-wall flight heating rate, the methodology provides the arc-jet bulk enthalpy for the corresponding cold-wall heating rate. A series of analyses was conducted to examine the effects of the test-sample model holder geometry on the overall performance of the test sample. The analyses were compared with arc-jet test samples, and the associated challenges and issues are presented. The transient flight environment was calculated for the Hypersonic Inflatable Aerodynamic Decelerator (HIAD) Earth Atmospheric Reentry Test (HEART) vehicle, a planned demonstration vehicle that uses a large inflatable, flexible thermal protection system to reenter the Earth's atmosphere from the International Space Station. A series of correlations was developed to define the relevant arc-jet test environment that properly approximates the HEART flight environment. The computed arc-jet environments were compared with the measured arc-jet values to define the uncertainty of the correlated environment. The results show that, for a given flight surface heat flux and a fully catalytic TPS, the flight-relevant arc-jet heat flux increases with the arc-jet bulk enthalpy, while for a non-catalytic TPS the arc-jet heat flux decreases with the bulk enthalpy.

  6. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Statistical Rating Organization (NRSRO) for this security, as of the reporting date Wt Avg Gross Margin Gross... Nationally Recognized Statistical Rating Organization (NRSRO) for this security, as of the reporting date [c... The most current rating issued by any Nationally Recognized Statistical Rating Organization (NRSRO...

  7. Assessing Aircraft Susceptibility to Nonlinear Aircraft-Pilot Coupling/Pilot-Induced Oscillations

    NASA Technical Reports Server (NTRS)

    Hess, R.A.; Stout, P. W.

    1997-01-01

    A unified approach for assessing aircraft susceptibility to aircraft-pilot coupling (or pilot-induced oscillations) which was previously reported in the literature and applied to linear systems is extended to nonlinear systems, with emphasis upon vehicles with actuator rate saturation. The linear methodology provided a tool for predicting: (1) handling qualities levels, (2) pilot-induced oscillation rating levels and (3) a frequency range in which pilot-induced oscillations are likely to occur. The extension to nonlinear systems provides a methodology for predicting the latter two quantities. Eight examples are presented to illustrate the use of the technique. The dearth of experimental flight-test data involving systematic variation and assessment of the effects of actuator rate limits presently prevents a more thorough evaluation of the methodology.

  8. Microplastic Generation in the Marine Environment Through Degradation and Fragmentation

    NASA Astrophysics Data System (ADS)

    Perryman, M. E.; Jambeck, J.; Woodson, C. B.; Locklin, J.

    2016-02-01

    Plastic use has become requisite in our global economy; as the population continues to increase, so too will plastic production. At its end of life, some amount of plastic is mismanaged and ends up in the ocean. Once there, various environmental stresses eventually fragment plastic into microplastic pieces, now ubiquitous in the marine environment. Microplastics pose a serious threat to marine biota and possibly humans. Though the general mechanisms of microplastic formation are known, the rate and extent are not. No standard methodology currently exists for testing the formation of microplastics, so we developed a replicable and flexible one and used it to test the effects of UV, thermal, and mechanical stress on various types of plastic. We tested for fragmentation by measuring weight and size distribution, and looked for signs of degraded plastic using Fourier transform infrared spectroscopy. Though our results did not show any signs of fragmentation, we did see degradation. Additionally, we established a sound methodology and provided a benchmark for additional studies.

  9. Kentucky highway rating system

    DOT National Transportation Integrated Search

    2003-03-01

    This study had two goals: 1. Formulate a new method for generating roadway adequacy ratings; 2. Construct an appropriate data set and then test the method by comparing it to the results of the HPMS-AP method. The recommended methodology builds on the...

  10. Global trends in the incidence and prevalence of type 2 diabetes in children and adolescents: a systematic review and evaluation of methodological approaches.

    PubMed

    Fazeli Farsani, S; van der Aa, M P; van der Vorst, M M J; Knibbe, C A J; de Boer, A

    2013-07-01

    This study aimed to systematically review what has been reported on the incidence and prevalence of type 2 diabetes in children and adolescents, to scrutinise the methodological issues observed in the included studies and to prepare recommendations for future research and surveillance. PubMed, the Cochrane Database of Systematic Reviews, Scopus, EMBASE and Web of Science were searched from inception to February 2013. Population-based studies on incidence and prevalence of type 2 diabetes in children and adolescents were summarised and methodologically evaluated. Owing to substantial methodological heterogeneity and considerable differences in study populations, a quantitative meta-analysis was not performed. Among 145 potentially relevant studies, 37 population-based studies met the inclusion criteria. Variations in the incidence and prevalence rates of type 2 diabetes in children and adolescents were mainly related to age of the study population, calendar time, geographical regions and ethnicity, resulting in a range of 0-330 per 100,000 person-years for incidence rates, and 0-5,300 per 100,000 population for prevalence rates. Furthermore, a substantial variation in the methodological characteristics was observed for response rates (60-96%), ascertainment rates (53-99%), diagnostic tests and criteria used to diagnose type 2 diabetes. Worldwide incidence and prevalence of type 2 diabetes in children and adolescents vary substantially among countries, age categories and ethnic groups, and this can be explained by variations in population characteristics and methodological dissimilarities between studies.

  11. Life Prediction/Reliability Data of Glass-Ceramic Material Determined for Radome Applications

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Gyekenyesi, John P.

    2002-01-01

    Brittle materials such as ceramics are candidate materials for a variety of structural applications over a wide range of temperatures. However, the process of slow crack growth, which can occur in any loading configuration, limits the service life of structural components. Therefore, it is important to accurately determine the slow-crack-growth parameters required for component life prediction using an appropriate test methodology. This test methodology should also be useful in determining the influence of component processing and composition variables on the slow-crack-growth behavior of newly developed or existing materials, thereby allowing the component processing and composition to be tailored and optimized to specific needs. Through the American Society for Testing and Materials (ASTM), the authors recently developed two test methods to determine the life prediction parameters of ceramics. The two test standards, ASTM C 1368 for room temperature and ASTM C 1465 for elevated temperatures, were published in the 2001 Annual Book of ASTM Standards, Vol. 15.01. Briefly, the test method employs constant stress-rate (or dynamic fatigue) testing to determine flexural strengths as a function of the applied stress rate. The merit of this test method lies in its simplicity: strengths are measured in a routine manner in flexure at four or more applied stress rates, with an appropriate number of test specimens at each rate. The slow-crack-growth parameters necessary for life prediction are then determined from a simple relationship between the strength and the applied stress rate. Extensive life prediction testing was conducted at the NASA Glenn Research Center using the developed ASTM C 1368 test method to determine the life prediction parameters of a glass-ceramic material that the Navy will use for radome applications.
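
    The data reduction behind the constant stress-rate method is essentially a one-line fit: strength follows sigma_f proportional to (stress rate)^(1/(n+1)), so the slow-crack-growth parameter n comes from the slope of a log-log regression of strength against applied stress rate. A self-contained sketch with made-up numbers (not measured data):

```python
import math

def scg_parameter(rates, strengths):
    """Least-squares slope of log(strength) vs log(rate); n = 1/slope - 1."""
    xs = [math.log10(r) for r in rates]
    ys = [math.log10(s) for s in strengths]
    n_pts = len(xs)
    xbar, ybar = sum(xs) / n_pts, sum(ys) / n_pts
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return 1.0 / slope - 1.0        # from slope = 1/(n+1)

# Synthetic data generated with n = 19 (strength 100 MPa at 1 MPa/s):
rates = [0.01, 0.1, 1.0, 10.0]                            # stress rates, MPa/s
strengths = [100.0 * r ** (1.0 / 20.0) for r in rates]    # flexural strengths, MPa
print(round(scg_parameter(rates, strengths)))  # 19
```

    Real data scatter, so the standards prescribe multiple specimens per rate; the regression then runs on the mean (or median) strengths.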

  12. Methodological Quality of National Guidelines for Pediatric Inpatient Conditions

    PubMed Central

    Hester, Gabrielle; Nelson, Katherine; Mahant, Sanjay; Eresuma, Emily; Keren, Ron; Srivastava, Rajendu

    2014-01-01

    Background Guidelines help inform standardization of care for quality improvement (QI). The Pediatric Research in Inpatient Settings (PRIS) network published a prioritization list of inpatient conditions with high prevalence, cost, and variation in resource utilization across children’s hospitals. The methodological quality of guidelines for priority conditions is unknown. Objective To rate the methodological quality of national guidelines for 20 priority pediatric inpatient conditions. Design We searched sources including PubMed for national guidelines published 2002–2012. Guidelines specific to one organism, test or treatment, or institution were excluded. Guidelines were rated by two raters using a validated tool (AGREE II) with an overall rating on a 7-point scale (7 = highest). Inter-rater reliability was measured with a weighted kappa coefficient. Results 17 guidelines met inclusion criteria for 13 conditions; 7 conditions yielded no relevant national guidelines. The highest methodological quality guidelines were for asthma, tonsillectomy, and bronchiolitis (mean overall ratings 7, 6.5, and 6.5, respectively); the lowest were for sickle cell disease (2 guidelines) and dental caries (mean overall ratings 4, 3.5, and 3, respectively). The overall weighted kappa was 0.83 (95% confidence interval 0.78–0.87). Conclusions We identified a group of moderate to high methodological quality national guidelines for priority pediatric inpatient conditions. Hospitals should consider these guidelines to inform QI initiatives. PMID:24677729
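
    The agreement statistic reported above can be illustrated with a small sketch: a weighted kappa for two raters on a 7-point scale, using quadratic weights (one common choice; the study does not state its weighting scheme, and the ratings below are invented, not the study's data).

```python
def weighted_kappa(r1, r2, categories=7):
    """Quadratically weighted kappa for two raters on a 1..categories scale."""
    w = [[1 - ((i - j) / (categories - 1)) ** 2 for j in range(categories)]
         for i in range(categories)]
    n = len(r1)
    obs = [[0.0] * categories for _ in range(categories)]
    for a, b in zip(r1, r2):
        obs[a - 1][b - 1] += 1.0 / n
    p1 = [sum(row) for row in obs]                                   # rater-1 marginals
    p2 = [sum(obs[i][j] for i in range(categories)) for j in range(categories)]
    po = sum(w[i][j] * obs[i][j] for i in range(categories) for j in range(categories))
    pe = sum(w[i][j] * p1[i] * p2[j] for i in range(categories) for j in range(categories))
    return (po - pe) / (1 - pe)

rater1 = [7, 6, 6, 4, 3, 3, 2, 5, 6, 7]   # invented AGREE II overall ratings
rater2 = [7, 7, 6, 4, 4, 3, 2, 5, 5, 7]
print(round(weighted_kappa(rater1, rater2), 2))  # 0.95 for these close ratings
```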

  13. Comment on Hall et al. (2017), "How to Choose Between Measures of Tinnitus Loudness for Clinical Research? A Report on the Reliability and Validity of an Investigator-Administered Test and a Patient-Reported Measure Using Baseline Data Collected in a Phase IIa Drug Trial".

    PubMed

    Sabour, Siamak

    2018-03-08

    The purpose of this letter, in response to Hall, Mehta, and Fackrell (2017), is to provide important knowledge about methodology and statistical issues in assessing the reliability and validity of an audiologist-administered tinnitus loudness matching test and a patient-reported tinnitus loudness rating. The author uses reference textbooks and published articles regarding scientific assessment of the validity and reliability of a clinical test to discuss the statistical test and the methodological approach in assessing validity and reliability in clinical research. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess reliability and validity. The qualitative variables of sensitivity, specificity, positive predictive value, negative predictive value, false positive and false negative rates, likelihood ratio positive and likelihood ratio negative, as well as odds ratio (i.e., ratio of true to false results), are the most appropriate estimates to evaluate validity of a test compared to a gold standard. In the case of quantitative variables, depending on distribution of the variable, Pearson r or Spearman rho can be applied. Diagnostic accuracy (validity) and diagnostic precision (reliability or agreement) are two completely different methodological issues. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess validity.
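
    The validity estimates the letter enumerates all fall out of a single 2x2 table against a gold standard. A minimal sketch with hypothetical counts (tp, fp, fn, tn are invented for illustration):

```python
def diagnostic_validity(tp, fp, fn, tn):
    """Standard 2x2-table validity estimates against a gold standard."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),                 # positive predictive value
        "npv": tn / (tn + fn),                 # negative predictive value
        "lr_pos": sens / (1 - spec),           # likelihood ratio positive
        "lr_neg": (1 - sens) / spec,           # likelihood ratio negative
        "odds_ratio": (tp * tn) / (fp * fn),   # ratio of true to false results
    }

m = diagnostic_validity(tp=80, fp=10, fn=20, tn=90)
print(m["sensitivity"], m["specificity"], m["odds_ratio"])  # 0.8 0.9 36.0
```

    For the quantitative case the letter mentions, a Pearson r (or Spearman rho, for non-normal data) between test and criterion plays the analogous role.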

  14. REE radiation fault model: a tool for organizing and communicating radiation test data and constructing COTS-based spaceborne computing systems

    NASA Technical Reports Server (NTRS)

    Ferraro, R.; Some, R.

    2002-01-01

    The growth in data rates of instruments on future NASA spacecraft continues to outstrip the improvement in communications bandwidth and processing capabilities of radiation-hardened computers. Sophisticated autonomous operations strategies will further increase the processing workload. Given the reductions in spacecraft size and available power, standard radiation-hardened computing systems alone will not be able to address the requirements of future missions. The REE project was intended to overcome this obstacle by developing a COTS-based supercomputer suitable for use as a science and autonomy data processor in most space environments. This development required a detailed knowledge of system behavior in the presence of Single Event Effect (SEE)-induced faults so that mitigation strategies could be designed to recover system-level reliability while maintaining the COTS throughput advantage. The REE project has developed a suite of tools and a methodology for predicting SEU-induced transient fault rates in a range of natural space environments from ground-based radiation testing of component parts. In this paper we provide an overview of this methodology and tool set, with a concentration on the radiation fault model and its use in the REE system development methodology. Using test data reported elsewhere in this and other conferences, we predict upset rates for a particular COTS single-board computer configuration in several space environments.

  15. Statistical Properties of SEE Rate Calculation in the Limits of Large and Small Event Counts

    NASA Technical Reports Server (NTRS)

    Ladbury, Ray

    2007-01-01

    This viewgraph presentation reviews the statistical properties of Single Event Effects (SEE) rate calculations. The goal of SEE rate calculation is to bound the SEE rate; the question is by how much. The presentation covers: (1) understanding errors on SEE cross sections, (2) methodology: maximum likelihood and confidence contours, (3) tests with simulated data, and (4) applications.
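
    One standard piece of this machinery is bounding a Poisson event rate from an observed count: for k observed events, the one-sided upper confidence limit on the Poisson mean mu solves P(X <= k | mu) = alpha. A sketch solved by bisection (illustrative only; the presentation's maximum-likelihood and confidence-contour treatment is more general):

```python
import math

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu ** i / math.factorial(i) for i in range(k + 1))

def poisson_upper_limit(k, alpha=0.05):
    """Upper confidence limit on mu: smallest mu with P(X <= k | mu) = alpha."""
    lo, hi = 0.0, 10.0 * (k + 1) + 10.0
    for _ in range(100):               # bisection on a monotone-decreasing CDF
        mid = (lo + hi) / 2
        if poisson_cdf(k, mid) > alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Zero events observed: the 95% upper limit on the mean is -ln(0.05) ~ 3.0,
# so the rate is bounded by ~3.0 / fluence even with no observed upsets.
print(round(poisson_upper_limit(0), 2))   # 3.0
```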

  16. A Guide for Setting the Cut-Scores to Minimize Weighted Classification Errors in Test Batteries

    ERIC Educational Resources Information Center

    Grabovsky, Irina; Wainer, Howard

    2017-01-01

    In this article, we extend the methodology of the Cut-Score Operating Function that we introduced previously and apply it to a testing scenario with multiple independent components and different testing policies. We derive analytically the overall classification error rate for a test battery under the policy when several retakes are allowed for…
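
    The idea of minimizing a weighted classification error over candidate cut scores can be sketched under simple assumptions: normal score distributions for masters and non-masters with invented parameters. This is only the underlying optimization, not the authors' Cut-Score Operating Function, which handles multiple components and retake policies.

```python
import math

def norm_cdf(x, mu, sd):
    """Normal CDF via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sd * math.sqrt(2))))

def weighted_error(c, w_fn=1.0, w_fp=1.0):
    """Weighted misclassification at cut score c (all parameters invented):
       E(c) = w_fn * P(score < c | master) + w_fp * P(score >= c | non-master)."""
    false_neg = norm_cdf(c, mu=75, sd=8)          # masters who fail
    false_pos = 1 - norm_cdf(c, mu=55, sd=8)      # non-masters who pass
    return w_fn * false_neg + w_fp * false_pos

best = min(range(40, 91), key=weighted_error)
print(best)  # 65: equal weights and equal spreads put the cut midway
```

    Raising w_fp relative to w_fn (i.e., treating a false pass as worse than a false fail) pushes the minimizing cut score upward, which is the policy lever the operating-function framework formalizes.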

  17. Development and Field Test of an Audit Tool and Tracer Methodology for Clinician Assessment of Quality in End-of-Life Care.

    PubMed

    Bookbinder, Marilyn; Hugodot, Amandine; Freeman, Katherine; Homel, Peter; Santiago, Elisabeth; Riggs, Alexa; Gavin, Maggie; Chu, Alice; Brady, Ellen; Lesage, Pauline; Portenoy, Russell K

    2018-02-01

    Quality improvement in end-of-life care generally acquires data from charts or caregivers. "Tracer" methodology, which assesses real-time information from multiple sources, may provide complementary information. The objective of this study was to develop a valid brief audit tool that can guide assessment and rate care when used in a clinician tracer to evaluate the quality of care for the dying patient. To identify items for a brief audit tool, 248 items were created to evaluate overall quality, quality in specific content areas (e.g., symptom management), and specific practices. Collected into three instruments, these items were used to interview professional caregivers and evaluate the charts of hospitalized patients who died. Evidence that this information could be validly captured using a small number of items was obtained through factor analyses, canonical correlations, and group comparisons. A nurse manager field tested tracer methodology using candidate items to evaluate the care provided to other patients who died. The survey of 145 deaths provided chart data and data from 445 interviews (26 physicians, 108 nurses, 18 social workers, and nine chaplains). The analyses yielded evidence of construct validity for a small number of items, demonstrating significant correlations between these items and content areas identified as latent variables in factor analyses. Criterion validity was suggested by significant differences in the ratings on these items between the palliative care unit and other units. The field test evaluated 127 deaths, demonstrated the feasibility of tracer methodology, and informed reworking of the candidate items into the 14-item Tracer EoLC v1. The Tracer EoLC v1 can be used with tracer methodology to guide the assessment and rate the quality of end-of-life care. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  18. Methodology in the assessment of complex human performance : the effects of signal rate on monitoring a dynamic process.

    DOT National Transportation Integrated Search

    1969-04-01

    Male subjects were tested after extensive training as two five-man 'crews' in an experiment designed to examine the effects of signal rate on the performance of a task involving the monitoring of a dynamic process. Performance was measured using thre...

  19. FY17 Status Report on Testing Supporting the Inclusion of Grade 91 Steel as an Acceptable Material for Application of the EPP Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Messner, Mark C.; Sham, Sam; Wang, Yanli

    This report summarizes the experiments performed in FY17 on Gr. 91 steels. The testing of Gr. 91 has technical significance because, currently, it is the only approved material for Class A construction that is strongly cyclic softening. Specific FY17 testing includes the following activities for Gr. 91 steel. First, two types of key feature testing have been initiated: two-bar thermal ratcheting and Simplified Model Testing (SMT). The goal is to qualify the Elastic-Perfectly Plastic (EPP) design methodologies and to support incorporation of these rules for Gr. 91 into the ASME Division 5 Code. The preliminary SMT test results show that compression hold is the most damaging mode for Gr. 91 under the SMT creep-fatigue testing condition. Two-bar thermal ratcheting test results over a temperature range of 350 to 650°C were compared with the EPP strain limits code case evaluation, and the results show that the EPP strain limits code case is conservative. The material information obtained from these key feature tests can also be used to verify the material model. Second, to provide experimental data in support of the viscoplastic material model development at Argonne National Laboratory, selective tests were performed to evaluate the effect of cyclic softening on strain rate sensitivity and creep rates. The results show that prior cyclic loading history decreases the strain rate sensitivity and increases creep rates. In addition, isothermal cyclic stress-strain curves were generated at six different temperatures, and nonisothermal thermomechanical testing was also performed to provide data to calibrate the viscoplastic material model.

  20. The State of Retrieval System Evaluation.

    ERIC Educational Resources Information Center

    Salton, Gerald

    1992-01-01

    The current state of information retrieval (IR) evaluation is reviewed with criticisms directed at the available test collections and the research and evaluation methodologies used, including precision and recall rates for online searches and laboratory tests not including real users. Automatic text retrieval systems are also discussed. (32…

  1. Audit of Trichomonas vaginalis test requesting by community referrers after a change from culture to molecular testing, including a cost analysis.

    PubMed

    Bissessor, Liselle; Wilson, Janet; McAuliffe, Gary; Upton, Arlo

    2017-06-16

    Trichomonas vaginalis (TV) prevalence varies among different communities and peoples. The availability of robust molecular platforms for the detection of TV has advanced diagnosis; however, molecular tests are more costly than phenotypic methodologies, and testing all urogenital samples is expensive. We recently replaced culture methods with the Aptima Trichomonas vaginalis nucleic acid amplification test, performed on specific request and as reflex testing by the laboratory, and have audited this change. Data were collected from August 2015 (microbroth culture and microscopy) and August 2016 (Aptima TV assay), including referrer, testing volumes, results and test cost estimates. In August 2015, 10,299 vaginal swabs were tested; in August 2016, 2,189 specimens (urogenital swabs and urines) were tested. The positivity rate went from 0.9% to 5.3%, and overall more TV infections were detected in 2016. The number needed to test and the cost for one positive TV result, respectively, were 111 and $902.55 in 2015, and 19 and $368.92 in 2016. Request volumes and positivity rates differed among referrers. The methodology change was associated with higher overall detection of TV and reductions in the number needed to test and the cost for one TV diagnosis. Our audit suggests that there is room for improvement in TV test requesting in our community.
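
    The audit's headline figures follow from two small formulas: the number needed to test (NNT) for one positive is 1/positivity, and the cost per positive is NNT times the per-test cost. A sketch; the positive count and per-test cost below are back-calculated illustrations consistent with the reported 0.9% and $902.55, not the laboratory's actual figures:

```python
def audit(n_tested, n_positive, cost_per_test):
    """Positivity, number needed to test, and cost per positive result."""
    positivity = n_positive / n_tested
    nnt = 1 / positivity
    return positivity, nnt, nnt * cost_per_test

# Hypothetical reconstruction of the August 2015 arm of the audit:
pos, nnt, cost_per_positive = audit(n_tested=10299, n_positive=93,
                                    cost_per_test=8.15)
print(round(pos * 100, 1), round(nnt))  # 0.9 111
```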

  2. Rating the methodological quality of single-subject designs and n-of-1 trials: introducing the Single-Case Experimental Design (SCED) Scale.

    PubMed

    Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon

    2008-08-01

    Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range k = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. 
Items from the scale can also be used as a checklist in the design, reporting and critical appraisal of single-subject designs, thereby assisting to improve standards of single-case methodology.

  3. Methodology for dynamic biaxial tension testing of pregnant uterine tissue.

    PubMed

    Manoogian, Sarah; McNally, Craig; Calloway, Britt; Duma, Stefan

    2007-01-01

    Placental abruption accounts for 50% to 70% of fetal losses in motor vehicle crashes. Since automobile crashes are the leading cause of traumatic fetal injury mortality in the United States, research of this injury mechanism is important. Before research can adequately evaluate current and future restraint designs, a detailed model of the pregnant uterine tissues is necessary. The purpose of this study is to develop a methodology for testing the pregnant uterus in biaxial tension at a rate normally seen in a motor vehicle crash. Since the majority of previous biaxial work has established methods for quasi-static testing, this paper combines previous research and new methods to develop a custom designed system to strain the tissue at a dynamic rate. Load cells and optical markers are used for calculating stress strain curves of the perpendicular loading axes. Results for this methodology show images of a tissue specimen loaded and a finite verification of the optical strain measurement. The biaxial test system dynamically pulls the tissue to failure with synchronous motion of four tissue grips that are rigidly coupled to the tissue specimen. The test device models in situ loading conditions of the pregnant uterus and overcomes previous limitations of biaxial testing. A non-contact method of measuring strains combined with data reduction to resolve the stresses in two directions provides the information necessary to develop a three dimensional constitutive model of the material. Moreover, future research can apply this method to other soft tissues with similar in situ loading conditions.
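
    The stress-strain data reduction described (load cells for stress, optical markers for strain) reduces to two formulas per axis: engineering stress is force over cross-sectional area, and strain is the change in marker separation over the initial separation. A sketch with invented specimen dimensions and loads:

```python
def engineering_stress(force_n, width_m, thickness_m):
    """Engineering stress (Pa) from the load-cell force on one axis."""
    return force_n / (width_m * thickness_m)

def optical_strain(d0_m, d_m):
    """Engineering strain from optical marker separation: initial d0, current d."""
    return (d_m - d0_m) / d0_m

# One sampled instant on one of the two perpendicular axes:
stress_x = engineering_stress(force_n=12.0, width_m=0.02, thickness_m=0.005)
strain_x = optical_strain(d0_m=0.010, d_m=0.012)
print(round(stress_x), round(strain_x, 3))  # 120000 0.2
```

    At dynamic rates both quantities are sampled synchronously on each axis, giving the paired stress-strain histories needed for a biaxial constitutive fit.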

  4. Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology

    PubMed Central

    Comella, Cynthia L.; Fox, Susan H.; Bhatia, Kailash P.; Perlmutter, Joel S.; Jinnah, Hyder A.; Zurowski, Mateusz; McDonald, William M.; Marsh, Laura; Rosen, Ami R.; Waliczek, Tracy; Wright, Laura J.; Galpern, Wendy R.; Stebbins, Glenn T.

    2016-01-01

    We present the methodology utilized for development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating scale, or CCDRS. The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no scale currently evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of reliability and validity of the CCDRS are described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as novel symptom-based or disease-modifying therapies. PMID:27088112

  5. Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology.

    PubMed

    Comella, Cynthia L; Fox, Susan H; Bhatia, Kailash P; Perlmutter, Joel S; Jinnah, Hyder A; Zurowski, Mateusz; McDonald, William M; Marsh, Laura; Rosen, Ami R; Waliczek, Tracy; Wright, Laura J; Galpern, Wendy R; Stebbins, Glenn T

    2015-06-01

    We present the methodology utilized for development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating scale, or CCDRS. The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no scale currently evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of reliability and validity of the CCDRS are described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as novel symptom-based or disease-modifying therapies.

  6. Maintaining Equivalent Cut Scores for Small Sample Test Forms

    ERIC Educational Resources Information Center

    Dwyer, Andrew C.

    2016-01-01

    This study examines the effectiveness of three approaches for maintaining equivalent performance standards across test forms with small samples: (1) common-item equating, (2) resetting the standard, and (3) rescaling the standard. Rescaling the standard (i.e., applying common-item equating methodology to standard setting ratings to account for…

  7. Assessment of Integrated Pedestrian Protection Systems with Autonomous Emergency Braking (AEB) and Passive Safety Components.

    PubMed

    Edwards, Mervyn; Nathanson, Andrew; Carroll, Jolyon; Wisch, Marcus; Zander, Oliver; Lubbe, Nils

    2015-01-01

    Autonomous emergency braking (AEB) systems fitted to cars for pedestrians have been predicted to offer substantial benefit. On this basis, consumer rating programs, for example the European New Car Assessment Programme (Euro NCAP), are developing rating schemes to encourage fitment of these systems. One of the questions that needs to be answered to do this fully is how the assessment of the speed reduction offered by the AEB is integrated with the current assessment of the passive safety for mitigation of pedestrian injury. Ideally, this should be done on a benefit-related basis. The objective of this research was to develop a benefit-based methodology for assessment of integrated pedestrian protection systems with AEB and passive safety components. The method should include weighting procedures to ensure that it represents injury patterns from accident data and replicates an independently estimated benefit of AEB. A methodology has been developed to calculate the expected societal cost of pedestrian injuries, assuming that all pedestrians in the target population (i.e., pedestrians impacted by the front of a passenger car) are impacted by the car being assessed, taking into account the impact speed reduction offered by the car's AEB (if fitted) and the passive safety protection offered by the car's frontal structure. For rating purposes, the cost for the assessed car is normalized by comparing it to the cost calculated for a reference car. The speed reductions measured in AEB tests are used to determine the speed at which each pedestrian in the target population will be impacted. Injury probabilities for each impact are then calculated using the results from Euro NCAP pedestrian impactor tests and injury risk curves. These injury probabilities are converted into cost using "harm"-type costs for the body regions tested. These costs are weighted and summed. 
Weighting factors were determined using accident data from Germany and Great Britain and an independently estimated AEB benefit. German and Great Britain versions of the methodology are available. The methodology was used to assess cars with good, average, and poor Euro NCAP pedestrian ratings, in combination with a current AEB system. The fitment of a hypothetical A-pillar airbag was also investigated. It was found that the decrease in casualty injury cost achieved by fitting an AEB system was approximately equivalent to that achieved by increasing the passive safety rating from poor to average. Because the assessment was influenced strongly by the level of head protection offered in the scuttle and windscreen area, a hypothetical A-pillar airbag showed high potential to reduce overall casualty cost. A benefit-based methodology for assessment of integrated pedestrian protection systems with AEB has been developed and tested. It uses input from AEB tests and Euro NCAP passive safety tests to give an integrated assessment of the system performance, which includes consideration of effects such as the change in head impact location caused by the impact speed reduction given by the AEB.
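
    The weighting-and-normalization scheme described above can be illustrated with a toy calculation. Everything below is hypothetical: the injury-risk curve, cost figure, speed distribution, and weights are invented stand-ins, not the Euro NCAP data or the published cost model.

```python
# Hypothetical sketch of the benefit-based rating idea: for each pedestrian case
# in the target population, reduce the impact speed by the AEB's measured speed
# reduction, convert an injury probability into a "harm"-type cost, apply
# accident-data weights, and normalize against a reference car.
# All numbers and function forms here are illustrative, not the published model.

def injury_probability(impact_speed_kmh, passive_score):
    """Toy injury-risk curve: probability rises with impact speed and
    falls with the car's passive-safety score (0 = poor, 1 = good)."""
    base = min(1.0, impact_speed_kmh / 80.0)
    return base * (1.0 - 0.5 * passive_score)

def expected_cost(cases, passive_score, aeb_speed_reduction_kmh, cost_per_injury=100_000):
    """Weighted expected societal cost over the target population."""
    total = 0.0
    for speed, weight in cases:  # (original impact speed, accident-data weight)
        reduced = max(0.0, speed - aeb_speed_reduction_kmh)
        total += weight * injury_probability(reduced, passive_score) * cost_per_injury
    return total

# Target population: impact speeds with accident-data weights (illustrative).
cases = [(20, 0.3), (40, 0.5), (60, 0.2)]

reference = expected_cost(cases, passive_score=0.5, aeb_speed_reduction_kmh=0.0)
assessed = expected_cost(cases, passive_score=0.8, aeb_speed_reduction_kmh=10.0)

rating = assessed / reference  # < 1 means better than the reference car
print(round(rating, 3))
```

    A rating below 1 indicates lower expected casualty cost than the reference car; in this invented example the better passive score plus a 10 km/h AEB speed reduction cuts the expected cost by roughly 40%.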

  8. Carbon Dioxide Washout Testing Using Various Inlet Vent Configurations in the Mark-III Space Suit

    NASA Technical Reports Server (NTRS)

    Korona, F. Adam; Norcross, Jason; Conger, Bruce; Navarro, Moses

    2014-01-01

    Requirements for using a space suit during ground testing include providing adequate carbon dioxide (CO2) washout for the suited subject. Acute CO2 exposure can lead to symptoms including headache, dyspnea, lethargy, and eventually unconsciousness or even death. Symptoms depend on several factors including inspired partial pressure of CO2 (ppCO2), duration of exposure, metabolic rate of the subject, and physiological differences between subjects. Computational Fluid Dynamics (CFD) analysis has predicted that the configuration of the suit inlet vent has a significant effect on oronasal CO2 concentrations. The main objective of this test was to characterize inspired oronasal ppCO2 for a variety of inlet vent configurations in the Mark-III suit across a range of workload and flow rates. Data and trends observed during testing along with refined CFD models will be used to help design an inlet vent configuration for the Z-2 space suit. The testing methodology used in this test builds upon past CO2 washout testing performed on the Z-1 suit, Rear Entry I-Suit, and the Enhanced Mobility Advanced Crew Escape Suit. Three subjects performed two test sessions each in the Mark-III suit to allow for comparison between tests. Six different helmet inlet vent configurations were evaluated during each test session. Suit pressure was maintained at 4.3 psid. Suited test subjects walked on a treadmill to generate metabolic workloads of approximately 2000 and 3000 BTU/hr. Supply airflow rates of 6 and 4 actual cubic feet per minute were tested at each workload. Subjects wore an oronasal mask with an open port in front of the mouth and were allowed to breathe freely. Oronasal ppCO2 was monitored real-time via gas analyzers with sampling tubes connected to the oronasal mask. Metabolic rate was calculated from the CO2 production measured by an additional gas analyzer at the air outlet from the suit. 
Real-time metabolic rate measurements were used to adjust the treadmill workload to meet target metabolic rates. This paper provides detailed descriptions of the test hardware, methodology and results, as well as implications for future inlet vent designs and ground testing.
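
    As a rough illustration of deriving metabolic rate from outlet-gas measurements, the sketch below converts a CO2 production rate into BTU/hr under an assumed respiratory quotient. The RQ, energy equivalent, and example flow value are textbook approximations, not the calibration actually used in this test.

```python
# Illustrative estimate of metabolic rate from the CO2 production measured at
# the suit outlet. An assumed respiratory quotient (RQ) converts VCO2 to VO2,
# and a standard energy equivalent (~4.85 kcal per litre O2) plus unit
# conversions give BTU/hr. The actual test hardware and calibration are not
# reproduced here.

KCAL_PER_L_O2 = 4.85   # approximate energy equivalent of oxygen
BTU_PER_KCAL = 3.968

def metabolic_rate_btu_hr(vco2_l_min, rq=0.85):
    """Estimate metabolic rate (BTU/hr) from CO2 production (L/min, STPD)."""
    vo2_l_min = vco2_l_min / rq          # RQ = VCO2 / VO2
    kcal_per_min = vo2_l_min * KCAL_PER_L_O2
    return kcal_per_min * BTU_PER_KCAL * 60.0

print(round(metabolic_rate_btu_hr(1.47)))
```

    With VCO2 near 1.47 L/min, this approximation lands close to the 2000 BTU/hr target workload used in the test.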

  9. Aerospace Payloads Leak Test Methodology

    NASA Technical Reports Server (NTRS)

    Lvovsky, Oleg; Grayson, Cynthia M.

    2010-01-01

    Pressurized and sealed aerospace payloads can leak on orbit. When dealing with toxic or hazardous materials, requirements for fluid and gas leakage rates have to be properly established and, most importantly, reliably verified using the best Nondestructive Test (NDT) method available. Such verification can be implemented through application of various leak test methods, which are the subject of this paper; the purpose is to show the approach taken by the National Aeronautics and Space Administration (NASA) to verifying payload leakage rate requirements. The scope of this paper is mostly a detailed description of the 14 recommended leak test methods.

  10. On the identification of cohesive parameters for printed metal-polymer interfaces

    NASA Astrophysics Data System (ADS)

    Heinrich, Felix; Langner, Hauke H.; Lammering, Rolf

    2017-05-01

    The mechanical behavior of printed electronics on fiber-reinforced composites is investigated. A methodology based on cohesive zone models is employed, considering interfacial strengths, stiffnesses, and critical strain energy release rates. A double cantilever beam test and an end notched flexure test are carried out to experimentally determine critical strain energy release rates under fracture modes I and II. Numerical simulations are performed in Abaqus 6.13 to model both tests. Using these simulations, an inverse parameter identification is run to determine the full set of cohesive parameters.

  11. Delamination onset in polymeric composite laminates under thermal and mechanical loads

    NASA Technical Reports Server (NTRS)

    Martin, Roderick H.

    1991-01-01

    A fracture mechanics damage methodology to predict edge delamination is described. The methodology accounts for residual thermal stresses, cyclic thermal stresses, and cyclic mechanical stresses. The modeling is based on the classical lamination theory and a sublaminate theory. The prediction methodology determines the strain energy release rate, G, at the edge of a laminate and compares it with the fatigue and fracture toughness of the composite. To verify the methodology, isothermal static tests at 23, 125, and 175 C and tension-tension fatigue tests at 23 and 175 C were conducted on laminates. The material system used was a carbon/bismaleimide, IM7/5260. Two quasi-isotropic layups were used. Also, 24 ply unidirectional double cantilever beam specimens were tested to determine the fatigue and fracture toughness of the composite at different temperatures. Raising the temperature increased the value of G at the edge for these layups and also lowered the fatigue and fracture toughness of the composite. The static stress at edge delamination was not affected by temperature, but the number of cycles to edge delamination decreased.

  12. The Relationship between ISO 9000 Participation and Educational Outcomes of Schools

    ERIC Educational Resources Information Center

    Bae, Sang Hoon

    2007-01-01

    Purpose: The study seeks to examine the relationship between the implementation of the ISO 9000 quality management system and educational outcomes of schools, measured by student achievement on the state-mandated tests and school attendance rates (graduation rates, in the case of high schools). Design/methodology/approach: The study was conducted…

  13. Fatigue Life Methodology for Bonded Composite Skin/Stringer Configurations

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Paris, Isabelle L.; OBrien, T. Kevin; Minguet, Pierre J.

    2001-01-01

    A methodology is presented for determining the fatigue life of composite structures based on fatigue characterization data and geometric nonlinear finite element (FE) analyses. To demonstrate the approach, predicted results were compared to fatigue tests performed on specimens which represented a tapered composite flange bonded onto a composite skin. In a first step, tension tests were performed to evaluate the debonding mechanisms between the flange and the skin. In a second step, a 2D FE model was developed to analyze the tests. To predict matrix cracking onset, the relationship between the tension load and the maximum principal stresses transverse to the fiber direction was determined through FE analysis. Transverse tension fatigue life data were used to generate an onset fatigue life P-N curve for matrix cracking. The resulting prediction was in good agreement with data from the fatigue tests. In a third step, a fracture mechanics approach based on FE analysis was used to determine the relationship between the tension load and the critical energy release rate. Mixed mode energy release rate fatigue life data were used to create a fatigue life onset G-N curve for delamination. The resulting prediction was in good agreement with data from the fatigue tests. Further, the prediction curve for cumulative life to failure was generated from the previous onset fatigue life curves. The results showed that the methodology offers a significant potential to predict cumulative fatigue life of composite structures.

  14. Methodology for speech assessment in the Scandcleft project--an international randomized clinical trial on palatal surgery: experiences from a pilot study.

    PubMed

    Lohmander, A; Willadsen, E; Persson, C; Henningsson, G; Bowden, M; Hutters, B

    2009-07-01

    To present the methodology for speech assessment in the Scandcleft project and discuss issues from a pilot study. Description of methodology and blinded test for speech assessment. Speech samples and instructions for data collection and analysis for comparisons of speech outcomes across the five included languages were developed and tested. PARTICIPANTS AND MATERIALS: Randomly selected video recordings of 10 5-year-old children from each language (n = 50) were included in the project. Speech material consisted of test consonants in single words, connected speech, and syllable chains with nasal consonants. Five experienced speech and language pathologists participated as observers. Analyses comprised narrow phonetic transcription of test consonants translated into cleft speech characteristics, ordinal-scale rating of resonance, and rating of perceived velopharyngeal closure (VPC). A velopharyngeal composite score (VPC-sum) was extrapolated from raw data. Intra-rater agreement comparisons were performed. Intra-rater agreement for consonant analysis ranged from 53% to 89%; for hypernasality on high vowels in single words it ranged from 20% to 80%; and agreement between the VPC-sum and the overall rating of VPC was 78%. Pooling data of speakers of different languages in the same trial and comparing speech outcome across trials seems possible if the assessment of speech concerns consonants and is confined to speech units that are phonetically similar across languages. Agreed conventions and rules are important. A composite variable for perceptual assessment of velopharyngeal function during speech seems usable, whereas the method for hypernasality evaluation requires further testing.

  15. Design Development Test and Evaluation (DDT and E) Considerations for Safe and Reliable Human Rated Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Miller, James; Leggett, Jay; Kramer-White, Julie

    2008-01-01

    A team directed by the NASA Engineering and Safety Center (NESC) collected methodologies for how best to develop safe and reliable human rated systems and how to identify the drivers that provide the basis for assessing safety and reliability. The team also identified techniques, methodologies, and best practices to assure that NASA can develop safe and reliable human rated systems. The results are drawn from a wide variety of resources, from experts involved with the space program since its inception to the best-practices espoused in contemporary engineering doctrine. This report focuses on safety and reliability considerations and does not duplicate or update any existing references. Neither does it intend to replace existing standards and policy.

  16. Statistical inference for template aging

    NASA Astrophysics Data System (ADS)

    Schuckers, Michael E.

    2006-04-01

    A change in classification error rates for a biometric device is often referred to as template aging. Here we offer two methods for determining whether the effect of time is statistically significant. The first is the use of a generalized linear model to determine whether these error rates change linearly over time; this approach generalizes previous work assessing the impact of covariates using generalized linear models. The second approach uses likelihood ratio test methodology. The focus here is on statistical methods for estimation, not on the underlying cause of the change in error rates over time. These methodologies are applied to data from the National Institute of Standards and Technology Biometric Score Set Release 1. The results of these applications are discussed.
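
    As an illustration of the second approach, the sketch below runs a binomial likelihood-ratio test of a constant error rate against period-specific rates. The counts are invented; the paper's analysis of the NIST Biometric Score Set is not reproduced here.

```python
# A minimal sketch of a likelihood-ratio test for template aging: error counts
# per time period are modelled as binomial; H0 fits one common error rate,
# H1 fits one rate per period. The data below are invented for illustration.
import math

def binom_loglik(errors, trials, p):
    p = min(max(p, 1e-12), 1 - 1e-12)
    return errors * math.log(p) + (trials - errors) * math.log(1 - p)

def lr_test(errors_by_period, trials_by_period):
    """Return the likelihood-ratio statistic (df = periods - 1)."""
    total_e, total_t = sum(errors_by_period), sum(trials_by_period)
    p0 = total_e / total_t                       # pooled rate under H0
    ll0 = sum(binom_loglik(e, t, p0)
              for e, t in zip(errors_by_period, trials_by_period))
    ll1 = sum(binom_loglik(e, t, e / t)
              for e, t in zip(errors_by_period, trials_by_period))
    return 2.0 * (ll1 - ll0)

errors = [12, 18, 25, 31]          # false matches observed in each period
trials = [1000, 1000, 1000, 1000]  # comparisons per period

stat = lr_test(errors, trials)
# The 95% critical value of chi-square with 3 degrees of freedom is about 7.815.
print(stat > 7.815)
```

    With these invented counts the statistic exceeds the 5% critical value, so a constant error rate would be rejected.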

  17. Error-rate prediction for programmable circuits: methodology, tools and studied cases

    NASA Astrophysics Data System (ADS)

    Velazco, Raoul

    2013-05-01

    This work presents an approach to predict the error rates due to Single Event Upsets (SEU) occurring in programmable circuits as a consequence of the impact of energetic particles present in the environment in which the circuits operate. For a chosen application, the error rate is predicted by combining the results obtained from radiation ground testing with the results of fault injection campaigns performed off-beam, during which huge numbers of SEUs are injected during the execution of the studied application. The goal of this strategy is to obtain accurate results about different applications' error rates without using particle accelerator facilities, thus significantly reducing the cost of the sensitivity evaluation. As a case study, this methodology was applied to a complex processor, the PowerPC 7448, executing a program issued from a real space application, and to a crypto-processor application implemented in an SRAM-based FPGA and accepted to be embedded in the payload of a scientific satellite of NASA. The accuracy of predicted error rates was confirmed by comparing, for the same circuit and application, predictions with measurements issued from radiation ground testing performed at the Cyclone cyclotron of the Heavy Ion Facility (HIF) at Louvain-la-Neuve (Belgium).
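
    The prediction strategy can be summarized in one line of arithmetic: the upset rate from ground-test cross-sections and environment flux, multiplied by the fraction of injected faults that became application errors. The sketch below uses invented numbers and a simplified rate model (no LET spectrum folding).

```python
# Hedged sketch of the prediction strategy described above: application error
# rate = (device static cross-section x particle flux) x the fraction of
# injected SEUs that actually produced an application error. Values below are
# invented; real campaigns use measured cross-sections and orbit-specific fluxes.

def predicted_error_rate(static_cross_section_cm2, flux_particles_cm2_day,
                         errors_observed, faults_injected):
    upset_rate_per_day = static_cross_section_cm2 * flux_particles_cm2_day
    error_fraction = errors_observed / faults_injected  # from fault injection
    return upset_rate_per_day * error_fraction

# Example: 1e-7 cm^2 static cross-section, 100 particles/cm^2/day,
# 2,500 application errors out of 100,000 injected SEUs.
rate = predicted_error_rate(1e-7, 100.0, 2500, 100_000)
print(rate)  # application errors per day
```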

  18. The Validity and Responsiveness of Isometric Lower Body Multi-Joint Tests of Muscular Strength: a Systematic Review.

    PubMed

    Drake, David; Kennedy, Rodney; Wallace, Eric

    2017-12-01

    Researchers and practitioners working in sports medicine and science require valid tests to determine the effectiveness of interventions and enhance understanding of mechanisms underpinning adaptation. Such decision making is influenced by the supportive evidence describing the validity of tests within current research. The objective of this study is to review the validity of lower body isometric multi-joint tests' ability to assess muscular strength and determine the current level of supporting evidence. Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) guidelines were followed in a systematic fashion to search, assess and synthesize existing literature on this topic. Electronic databases such as Web of Science, CINAHL and PubMed were searched up to 18 March 2015. Potential inclusions were screened against eligibility criteria relating to types of test, measurement instrument, properties of validity assessed and population group and were required to be published in English. The Consensus-based Standards for the Selection of health Measurement Instruments (COSMIN) checklist was used to assess methodological quality and measurement property rating of included studies. Studies rated as fair or better in methodological quality were included in the best evidence synthesis. Fifty-nine studies met the eligibility criteria for quality appraisal. The ten studies that rated fair or better in methodological quality were included in the best evidence synthesis. The most frequently investigated lower body isometric multi-joint tests for validity were the isometric mid-thigh pull and isometric squat. The validity of each of these tests was strong in terms of reliability and construct validity. The evidence for responsiveness of tests was found to be moderate for the isometric squat test and unknown for the isometric mid-thigh pull. No tests using the isometric leg press met the criteria for inclusion in the best evidence synthesis. 
Researchers and practitioners can use the isometric squat and isometric mid-thigh pull with confidence in terms of reliability and construct validity. Further work to investigate other validity components such as criterion validity, smallest detectable change and responsiveness to resistance exercise interventions may be beneficial to the current level of evidence.

  19. A Monte Carlo Approach for Adaptive Testing with Content Constraints

    ERIC Educational Resources Information Center

    Belov, Dmitry I.; Armstrong, Ronald D.; Weissman, Alexander

    2008-01-01

    This article presents a new algorithm for computerized adaptive testing (CAT) when content constraints are present. The algorithm is based on shadow CAT methodology to meet content constraints but applies Monte Carlo methods and provides the following advantages over shadow CAT: (a) lower maximum item exposure rates, (b) higher utilization of the…

  20. Prediction of work metabolism from heart rate measurements in forest work: some practical methodological issues.

    PubMed

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Auger, Isabelle; Leone, Mario

    2015-01-01

    Individual heart rate (HR) to workload relationships were determined using 93 submaximal step-tests administered to 26 healthy participants attending physical activities in a university training centre (laboratory study) and 41 experienced forest workers (field study). Predicted maximum aerobic capacity (MAC) was compared to measured MAC from a maximal treadmill test (laboratory study) to test the effect of two age-predicted maximum HR equations (220-age and 207-0.7 × age) and two clothing insulation levels (0.4 and 0.91 clo) during the step-test. Work metabolism (WM) estimated from forest work HR was compared against concurrent work V̇O2 measurements while taking into account the HR thermal component. Results show that MAC and WM can be accurately predicted from work HR measurements and the simple regression models developed in this study (1% group mean prediction bias and up to 25% expected prediction bias for a single individual). Neither clothing insulation nor the choice of age-predicted maximum HR equation had an impact on predicted MAC. Practitioner summary: This study sheds light on four practical methodological issues faced by practitioners regarding the use of HR methodology to assess WM in actual work environments. More specifically, the effect of wearing work clothes and the use of two different maximum HR prediction equations on the ability of a submaximal step-test to assess MAC are examined, as well as the accuracy of using an individual's step-test HR to workload relationship to predict WM from HR data collected during actual work in the presence of thermal stress.
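
    A minimal sketch of the individual calibration idea, assuming a linear HR-to-V̇O2 relation fitted from step-test stages. The data and units below are invented; the paper's actual regression models are more elaborate.

```python
# Illustrative version of an individual HR-to-workload calibration: fit a
# linear relation between heart rate and oxygen uptake from submaximal
# step-test stages, then predict work metabolism from HR recorded on the job.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Step-test stages: heart rate (bpm) vs measured VO2 (mL/kg/min), invented.
hr = [95, 110, 125, 140]
vo2 = [12.0, 17.0, 22.0, 27.0]

slope, intercept = fit_line(hr, vo2)

work_hr = 120.0  # mean working heart rate observed in the field
predicted_wm = slope * work_hr + intercept
print(round(predicted_wm, 1))
```

    In this invented example the fitted line predicts a working V̇O2 of about 20.3 mL/kg/min at 120 bpm.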

  1. Test of the Constancy - Velocity Hypothesis: Navy Unit Functioning and Performance over 12 Years.

    DTIC Science & Technology

    1988-01-31

    Recoverable fragments from the report's table of contents: Enlistment Rate, Change of Velocity; Velocity, Climate Change, and Upgrade Rate; Joint Effects of Culture/Climate and Velocity; Conclusions about the Role Played by Velocity. The report (a) examined change in organizational systems over time and (b) systematically tested different methodological approaches to organizational

  2. Sweating Rate and Sweat Sodium Concentration in Athletes: A Review of Methodology and Intra/Interindividual Variability.

    PubMed

    Baker, Lindsay B

    2017-03-01

    Athletes lose water and electrolytes as a consequence of thermoregulatory sweating during exercise and it is well known that the rate and composition of sweat loss can vary considerably within and among individuals. Many scientists and practitioners conduct sweat tests to determine sweat water and electrolyte losses of athletes during practice and competition. The information gleaned from sweat testing is often used to guide personalized fluid and electrolyte replacement recommendations for athletes; however, unstandardized methodological practices and challenging field conditions can produce inconsistent/inaccurate results. The primary objective of this paper is to provide a review of the literature regarding the effect of laboratory and field sweat-testing methodological variations on sweating rate (SR) and sweat composition (primarily sodium concentration [Na+]). The simplest and most accurate method to assess whole-body SR is via changes in body mass during exercise; however, potential confounding factors to consider are non-sweat sources of mass change and trapped sweat in clothing. In addition, variability in sweat [Na+] can result from differences in the type of collection system used (whole body or localized), the timing/duration of sweat collection, skin cleaning procedure, sample storage/handling, and analytical technique. Another aim of this paper is to briefly review factors that may impact intra/interindividual variability in SR and sweat [Na+] during exercise, including exercise intensity, environmental conditions, heat acclimation, aerobic capacity, body size/composition, wearing of protective equipment, sex, maturation, aging, diet, and/or hydration status. In summary, sweat testing can be a useful tool to estimate athletes' SR and sweat Na+ loss to help guide fluid/electrolyte replacement strategies, provided that data are collected, analyzed, and interpreted appropriately.
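
    The body-mass method for whole-body SR described above amounts to a short mass-balance calculation. The sketch below applies the simple corrections named in the review (fluid intake, urine, trapped sweat) to invented example numbers; respiratory and metabolic mass losses are neglected.

```python
# Minimal sketch of the whole-body sweating-rate calculation: change in body
# mass during exercise, corrected for fluid intake, urine output, and (if
# measured) sweat trapped in clothing. Numbers below are invented.

def sweating_rate_l_per_h(pre_mass_kg, post_mass_kg, fluid_intake_kg,
                          urine_kg, trapped_sweat_kg, duration_h):
    sweat_loss_kg = (pre_mass_kg - post_mass_kg) + fluid_intake_kg \
                    - urine_kg + trapped_sweat_kg
    return sweat_loss_kg / duration_h  # 1 kg of sweat ~ 1 L

# Example: 70.0 -> 69.2 kg over 1 h with 0.5 L drunk, no urine,
# and 0.1 kg of sweat recovered from clothing.
sr = sweating_rate_l_per_h(70.0, 69.2, 0.5, 0.0, 0.1, 1.0)
print(round(sr, 2))
```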

  3. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    NASA Astrophysics Data System (ADS)

    Izzuddin, Nur; Sunarsih, Priyanto, Agoes

    2015-05-01

    A marine diesel engine simulator, in which engine rotation is controlled and transmitted through the propeller shaft, offers a new methodology for self-propulsion tests that tracks fuel saving in real time as a vessel operates in the open seas. In that context, this paper presents a real-time marine diesel engine simulator system that tracks the real performance of a ship through a computer-simulated model. Mathematical models of the marine diesel engine and the propeller are used in the simulation to estimate fuel rate, engine rotating speed, and propeller thrust and torque, and thus achieve the target vessel speed. The inputs and outputs form a real-time control system of fuel saving rate and propeller rotating speed representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate fuel saving by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will help users analyze different vessel speed conditions to obtain better characteristics and hence optimize the fuel saving rate.
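
    One piece of such a simulator, the propeller model, can be sketched with the standard open-water relations (thrust T = KT·ρ·n²·D⁴, torque Q = KQ·ρ·n²·D⁵). The KT/KQ fits and all numbers below are invented placeholders, not the paper's model.

```python
# Hedged sketch of the propeller side of a self-propulsion simulator:
# open-water coefficients KT and KQ (here crude linear fits in advance ratio J,
# purely illustrative) give thrust and torque from propeller speed, diameter,
# and vessel advance speed. A real simulator would couple this to the engine model.

RHO = 1025.0  # sea-water density, kg/m^3

def advance_ratio(v_advance, n_rps, diameter):
    return v_advance / (n_rps * diameter)

def thrust_and_torque(v_advance, n_rps, diameter):
    j = advance_ratio(v_advance, n_rps, diameter)
    kt = 0.45 - 0.35 * j        # illustrative open-water fits
    kq = 0.060 - 0.045 * j
    thrust = kt * RHO * n_rps**2 * diameter**4   # N
    torque = kq * RHO * n_rps**2 * diameter**5   # N*m
    return thrust, torque

t, q = thrust_and_torque(v_advance=5.0, n_rps=2.0, diameter=4.0)
print(round(t), round(q))
```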

  4. Client Perceptions of Helpfulness in Therapy: a Novel Video-Rating Methodology for Examining Process Variables at Brief Intervals During a Single Session.

    PubMed

    Cocklin, Alexandra A; Mansell, Warren; Emsley, Richard; McEvoy, Phil; Preston, Chloe; Comiskey, Jody; Tai, Sara

    2017-11-01

    The value of clients' reports of their experiences in therapy is widely recognized, yet quantitative methodology has rarely been used to measure clients' self-reported perceptions of what is helpful over a single session. A video-rating method was developed to gather data at brief intervals using process measures of client perceived experience and standardized measures of working alliance (Session Rating Scale; SRS). Data were collected over the course of a single video-recorded session of cognitive therapy (Method of Levels Therapy; Carey, 2006; Mansell et al., 2012). We examined the acceptability and feasibility of the methodology and tested the concurrent validity of the measure by utilizing theory-led constructs. Eighteen therapy sessions were video-recorded and clients each rated a 20-minute session of therapy at two-minute intervals using repeated measures. A multi-level analysis was used to test for correlations between perceived levels of helpfulness and client process variables. The design proved to be feasible. Concurrent validity was borne out through high correlations between constructs. A multi-level regression examined the independent contributions of client process variables to client perceived helpfulness. Client perceived control (b = 0.39, 95% CI 0.05 to 0.73), the ability to talk freely (b = 0.30, SE = 0.11, 95% CI 0.09 to 0.51) and therapist approach (b = 0.31, SE = 0.14, 95% CI 0.04 to 0.57) predicted client-rated helpfulness. We identify a feasible and acceptable method for studying continuous measures of helpfulness and their psychological correlates during a single therapy session.

  5. Addressing Angular Single-Event Effects in the Estimation of On-Orbit Error Rates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, David S.; Swift, Gary M.; Wirthlin, Michael J.

    2015-12-01

    Our study describes complications that angular direct ionization events introduce into space error rate predictions. In particular, the prevalence of multiple-cell upsets and a breakdown in the application of effective linear energy transfer in modern-scale devices can skew error rates approximated from currently available estimation models. Moreover, this paper highlights the importance of angular testing and proposes a methodology to extend existing error estimation tools to properly consider angular strikes in modern-scale devices. Finally, these techniques are illustrated with test data provided from a modern 28 nm SRAM-based device.
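
    For context, the classical effective-LET adjustment that the paper argues breaks down in modern-scale devices is simply a path-length correction through a thin sensitive volume:

```python
# The classical "effective LET" approximation scales a particle's
# normal-incidence LET by 1/cos(theta) to account for the longer path through
# a thin sensitive volume at tilt angle theta (rectangular-parallelepiped
# assumption). This sketch only shows the classical formula, not the paper's
# extended methodology.
import math

def effective_let(let_normal, tilt_deg):
    """Classical effective LET for a thin sensitive volume."""
    return let_normal / math.cos(math.radians(tilt_deg))

print(round(effective_let(10.0, 60.0), 1))  # twice the normal-incidence LET
```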

  6. Method to determine the position-dependant metal correction factor for dose-rate equivalent laser testing of semiconductor devices

    DOEpatents

    Horn, Kevin M.

    2013-07-09

    A method reconstructs the charge collection from regions beneath opaque metallization of a semiconductor device, as determined from focused laser charge collection response images, and thereby derives a dose-rate dependent correction factor for subsequent broad-area, dose-rate equivalent, laser measurements. The position- and dose-rate dependencies of the charge-collection magnitude of the device are determined empirically and can be combined with a digital reconstruction methodology to derive an accurate metal-correction factor that permits subsequent absolute dose-rate response measurements to be derived from laser measurements alone. Broad-area laser dose-rate testing can thereby be used to accurately determine the peak transient current, dose-rate response of semiconductor devices to penetrating electron, gamma- and x-ray irradiation.

  7. CO2 Washout Testing Using Various Inlet Vent Configurations in the Mark-III Space Suit

    NASA Technical Reports Server (NTRS)

    Korona, F. Adam; Norcross, Jason; Conger, Bruce; Navarro, Moses

    2014-01-01

    Requirements for using a space suit during ground testing include providing adequate carbon dioxide (CO2) washout for the suited subject. Acute CO2 exposure can lead to symptoms including headache, dyspnea, lethargy and eventually unconsciousness or even death. Symptoms depend on several factors including inspired partial pressure of CO2 (ppCO2), duration of exposure, metabolic rate of the subject and physiological differences between subjects. Computational Fluid Dynamic (CFD) analysis has predicted that the configuration of the suit inlet vent has a significant effect on oronasal CO2 concentrations. The main objective of this test was to characterize inspired oronasal ppCO2 for a variety of inlet vent configurations in the Mark-III suit across a range of workload and flow rates. Data and trends observed during testing along with refined CFD models will be used to help design an inlet vent configuration for the Z-2 space suit. The testing methodology used in this test builds upon past CO2 washout testing performed on the Z-1 suit, Rear Entry I-Suit (REI) and the Enhanced Mobility Advanced Crew Escape Suit (EM-ACES). Three subjects performed two test sessions each in the Mark-III suit to allow for comparison between tests. Six different helmet inlet vent configurations were evaluated during each test session. Suit pressure was maintained at 4.3 psid. Suited test subjects walked on a treadmill to generate metabolic workloads of approximately 2000 and 3000 BTU/hr. Supply airflow rates of 6 and 4 actual cubic feet per minute (ACFM) were tested at each workload. Subjects wore an oronasal mask with an open port in front of the mouth and were allowed to breathe freely. Oronasal ppCO2 was monitored real-time via gas analyzers with sampling tubes connected to the oronasal mask. Metabolic rate was calculated from the total oxygen consumption and CO2 production measured by additional gas analyzers at the air outlet from the suit. 
Real-time metabolic rate measurements were used to adjust the treadmill workload to meet target metabolic rates. This paper provides detailed descriptions of the test hardware, methodology and results, as well as implications for future inlet vent designs and ground testing.

  9. Testing for variation in taxonomic extinction probabilities: a suggested methodology and some results

    USGS Publications Warehouse

    Conroy, M.J.; Nichols, J.D.

    1984-01-01

    Several important questions in evolutionary biology and paleobiology involve sources of variation in extinction rates. In all cases of which we are aware, extinction rates have been estimated from data in which the probability that an observation (e.g., a fossil taxon) will occur is related both to extinction rates and to what we term encounter probabilities. Any statistical method for analyzing fossil data should at a minimum permit separate inferences on these two components. We develop a method for estimating taxonomic extinction rates from stratigraphic range data and for testing hypotheses about variability in these rates. We use this method to estimate extinction rates and to test the hypothesis of constant extinction rates for several sets of stratigraphic range data. The results of our tests support the hypothesis that extinction rates varied over the geologic time periods examined. We also present a test that can be used to identify periods of high or low extinction probabilities and provide an example using Phanerozoic invertebrate data. Extinction rates should be analyzed using stochastic models, in which it is recognized that stratigraphic samples are random variates and that sampling is imperfect.
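    The core of such a test can be illustrated with a simplified sketch that ignores the encounter-probability component the authors emphasize: estimate a per-interval extinction probability as the fraction of taxa at risk that disappear, then compare a single pooled rate against per-interval rates with a likelihood-ratio (G) statistic. The function names and the plain binomial model are illustrative assumptions, not the paper's estimator.

```python
import math

def extinction_probs(at_risk, extinct):
    """Per-interval MLE of extinction probability: d_i / n_i."""
    return [d / n for n, d in zip(at_risk, extinct)]

def g_statistic(at_risk, extinct):
    """Likelihood-ratio (G) statistic for H0: one constant extinction
    probability across all intervals. Approximately chi-square with
    k-1 degrees of freedom under H0 (data assumed 0 < d_i < n_i)."""
    pooled = sum(extinct) / sum(at_risk)

    def ll(n, d, p):
        # binomial log-likelihood kernel
        return d * math.log(p) + (n - d) * math.log(1.0 - p)

    return sum(2.0 * (ll(n, d, d / n) - ll(n, d, pooled))
               for n, d in zip(at_risk, extinct))
```

    With, say, 10, 30, and 10 extinctions out of 100 taxa at risk in three intervals, the per-interval estimates are 0.1, 0.3, 0.1 and G is roughly 18, well beyond the 5.99 critical value for 2 degrees of freedom, so a constant rate would be rejected.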

  10. Working group written presentation: Solar radiation

    NASA Technical Reports Server (NTRS)

    Slemp, Wayne S.

    1989-01-01

    The members of the Solar Radiation Working Group arrived at two major solar radiation technology needs: (1) generation of a long term flight data base; and (2) development of a standardized UV testing methodology. The flight data base should include 1 to 5 year exposure of optical filters, windows, thermal control coatings, hardened coatings, polymeric films, and structural composites. The UV flux and wavelength distribution, as well as particulate radiation flux and energy, should be measured during this flight exposure. A standard testing methodology is needed to establish techniques for highly accelerated UV exposure that correlate well with flight test data. Currently, UV exposure can be accelerated only to about 3 solar constants while still correlating well with flight exposure data. With space missions of up to 30 years, acceleration rates of 30 to 100X are needed for efficient laboratory testing.

  11. Methodologies for Optimum Capital Expenditure Decisions for New Medical Technology

    PubMed Central

    Landau, Thomas P.; Ledley, Robert S.

    1980-01-01

    This study deals with the development of a theory and an analytical model to support decisions regarding capital expenditures for complex new medical technology. Formal methodologies and quantitative techniques developed by applied mathematicians and management scientists can be used by health planners to develop cost-effective plans for the utilization of medical technology on a community or region-wide basis. In order to maximize the usefulness of the model, it was developed and tested against multiple technologies. The types of technologies studied include capital and labor-intensive technologies, technologies whose utilization rates vary with hospital occupancy rate, technologies whose use can be scheduled, and limited-use and large-use technologies.

  12. A Descent Rate Control Approach to Developing an Autonomous Descent Vehicle

    NASA Astrophysics Data System (ADS)

    Fields, Travis D.

    Circular parachutes have been used for aerial payload/personnel deliveries for over 100 years. In the past two decades, significant work has been done to improve the landing accuracies of cargo deliveries for humanitarian and military applications. This dissertation discusses the approach developed in which a circular parachute is used in conjunction with an electro-mechanical reefing system to manipulate the landing location. Rather than attempt to steer the autonomous descent vehicle directly, control of the landing location is accomplished by modifying the amount of time spent in a particular wind layer. Descent rate control is performed by reversibly reefing the parachute canopy. The first stage of the research investigated the use of a single actuation during descent (with periodic updates), in conjunction with a curvilinear target. Simulation results using real-world wind data are presented, illustrating the utility of the methodology developed. Additionally, hardware development and flight-testing of the single actuation autonomous descent vehicle are presented. The next phase of the research focuses on expanding the single actuation descent rate control methodology to incorporate a multi-actuation path-planning system. By modifying the parachute size throughout the descent, the controllability of the system greatly increases. The trajectory planning methodology developed provides a robust approach to accurately manipulate the landing location of the vehicle. The primary benefits of this system are the inherent robustness to release location errors and the ability to overcome vehicle uncertainties (mass, parachute size, etc.). A separate application of the path-planning methodology is also presented. An in-flight path-prediction system was developed for use in high-altitude ballooning by utilizing the path-planning methodology developed for descent vehicles. 
The developed onboard system improves landing location predictions in-flight using collected flight information during the ascent and descent. Simulation and real-world flight tests (using the developed low-cost hardware) demonstrate the significance of the improvements achievable when flying the developed system.
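    The wind-layer mechanism the dissertation exploits can be sketched in a few lines: with a piecewise-constant wind profile, horizontal drift is the sum over layers of wind speed times the time spent in each layer, and reefing changes that time through the descent rate. The data layout and function name here are illustrative assumptions, not the dissertation's simulation.

```python
def landing_offset(layers):
    """Downwind landing displacement (m) for a vehicle falling through
    stacked wind layers.

    layers: sequence of (thickness_m, wind_mps, descent_mps) tuples,
    ordered from release altitude down; descent_mps is the descent
    rate produced by the reefing state chosen for that layer.
    """
    return sum(thickness / descent * wind
               for thickness, wind, descent in layers)
```

    Two 1000 m layers with opposing 5 m/s winds cancel at a uniform 5 m/s descent (offset 0 m), but reefing to descend at 8 m/s in the first layer and 4 m/s in the second shifts the landing point by 625 - 1250 = -625 m, with no direct steering at all.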

  13. Fatigue criterion to system design, life and reliability

    NASA Technical Reports Server (NTRS)

    Zaretsky, E. V.

    1985-01-01

    A generalized methodology for structural life prediction, design, and reliability based upon a fatigue criterion is advanced. The life prediction methodology is based in part on the work of W. Weibull and of G. Lundberg and A. Palmgren. The approach incorporates the computed lives of the elemental stress volumes of a complex machine element to predict system life. The results of coupon fatigue testing can be incorporated into the analysis, allowing life and component or structural renewal rates to be predicted with reasonable statistical certainty.
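    The volume-summation idea can be sketched with the standard Lundberg-Palmgren combination rule: at equal survival probability, elemental Weibull lives combine as L_sys = (sum of L_i^(-e))^(-1/e), where e is the Weibull slope. This is a sketch under that assumption, not the paper's full formulation.

```python
def system_life(element_lives, weibull_slope):
    """Combine elemental fatigue lives into a system life via the
    Lundberg-Palmgren rule: L_sys = (sum L_i**-e) ** (-1/e).
    A system of many stressed volumes is always shorter-lived than
    its longest-lived element."""
    e = weibull_slope
    return sum(life ** -e for life in element_lives) ** (-1.0 / e)
```

    Two identical elements with 100-hour lives and a Weibull slope of 1.1 give a system life of about 53 hours, which illustrates why a complex machine element is predicted to fail sooner than any of its stress volumes considered alone.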

  14. HRR Upgrade to mass loss calorimeter and modified Schlyter test for FR Wood

    Treesearch

    Mark A. Dietenberger; Charles R. Boardman

    2013-01-01

    Enhanced Heat Release Rate (HRR) methodology has been extended to the Mass Loss Calorimeter (MLC) and the Modified Schlyter flame spread test to evaluate fire retardant effectiveness used on wood based materials. Modifications to MLC include installation of thermopile on the chimney walls to correct systematic errors to the sensible HRR calculations to account for...

  15. 18 CFR 342.4 - Other rate changing methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Other rate changing methodologies. 342.4 Section 342.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... METHODOLOGIES AND PROCEDURES § 342.4 Other rate changing methodologies. (a) Cost-of-service rates. A carrier may...

  16. Assessing Communication Skills of Medical Students in Objective Structured Clinical Examinations (OSCE) - A Systematic Review of Rating Scales

    PubMed Central

    Cömert, Musa; Zill, Jördis Maria; Christalle, Eva; Dirmaier, Jörg; Härter, Martin; Scholl, Isabelle

    2016-01-01

    Background Teaching and assessment of communication skills have become essential in medical education. The Objective Structured Clinical Examination (OSCE) has been found to be an appropriate means of assessing communication skills within medical education. Studies have demonstrated the importance of a valid assessment of medical students’ communication skills. Yet, the validity of the performance scores depends fundamentally on the quality of the rating scales used in an OSCE. Thus, this systematic review aimed at providing an overview of existing rating scales, describing their underlying definition of communication skills, determining the methodological quality of psychometric studies and the quality of psychometric properties of the identified rating scales. Methods We conducted a systematic review to identify psychometrically tested rating scales, which have been applied in OSCE settings to assess communication skills of medical students. Our search strategy comprised three databases (EMBASE, PsycINFO, and PubMed), reference tracking and consultation of experts. We included studies that reported psychometric properties of communication skills assessment rating scales used in OSCEs by examiners only. The methodological quality of included studies was assessed using the COnsensus based Standards for the selection of health status Measurement INstruments (COSMIN) checklist. The quality of psychometric properties was evaluated using the quality criteria of Terwee and colleagues. Results Data of twelve studies reporting on eight rating scales on communication skills assessment in OSCEs were included. Five of eight rating scales were explicitly developed based on a specific definition of communication skills. The methodological quality of studies was mainly poor. The psychometric quality of the eight rating scales was mainly intermediate. 
Discussion Our results reveal that future psychometric evaluation studies focusing on improving the methodological quality are needed in order to yield psychometrically sound results of the OSCEs assessing communication skills. This is especially important given that most OSCE rating scales are used for summative assessment, and thus have an impact on medical students’ academic success. PMID:27031506

  18. Effect of Load Rate on Ultimate Tensile Strength of Ceramic Matrix Composites at Elevated Temperatures

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Gyekenyesi, John P.

    2001-01-01

    The strengths of three continuous fiber-reinforced ceramic composites, including SiC/CAS-II, SiC/MAS-5 and SiC/SiC, were determined as a function of test rate in air at 1100 to 1200 C. All three composite materials exhibited a strong dependency of strength on test rate, similar to the behavior observed in many advanced monolithic ceramics at elevated temperatures. The application of the preloading technique, as well as the prediction of life from one loading configuration (constant stress rate) to another (constant stress loading), suggested that the overall macroscopic failure mechanism of the composites is one governed by a power-law type of damage evolution/accumulation, analogous to the slow crack growth commonly observed in advanced monolithic ceramics. It was further found that constant stress-rate testing could be used as an alternative life-prediction test methodology even for composite materials, at least for a short range of lifetimes and when ultimate strength is used as the failure criterion.

  19. Accelerated Testing Methodology Developed for Determining the Slow Crack Growth of Advanced Ceramics

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Gyekenyesi, John P.

    1998-01-01

    Constant stress-rate ("dynamic fatigue") testing has been used for several decades to characterize the slow crack growth behavior of glass and structural ceramics at both ambient and elevated temperatures. The advantage of such testing over other methods lies in its simplicity: strengths are measured in a routine manner at four or more stress rates by applying a constant displacement or loading rate. The slow crack growth parameters required for component design can be estimated from a relationship between strength and stress rate. With the proper use of preloading in constant stress-rate testing, test time can be reduced appreciably. If a preload corresponding to 50 percent of the strength is applied to the specimen prior to testing, 50 percent of the test time can be saved as long as the applied preload does not change the strength. In fact, it has been a common, empirical practice in the strength testing of ceramics or optical fibers to apply some preloading (<40 percent). The purpose of this work at the NASA Lewis Research Center is to study the effect of preloading on measured strength in order to add a theoretical foundation to the empirical practice.
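    Under the power-law crack-growth assumption, the normalized strength (strength with preload divided by strength without) takes a simple closed form in the preloading factor alpha and the SCG parameter n, which shows why moderate preloads leave the measured strength essentially unchanged. The expression below is one form consistent with power-law slow crack growth, offered as a sketch rather than the exact relation derived at Lewis.

```python
def normalized_strength(alpha, n):
    """Ratio of strength measured with preload to strength without,
    for preloading factor alpha (preload stress / no-preload strength)
    and power-law slow-crack-growth parameter n.
    Assumed closed form: (1 + alpha**(n+1)) ** (1/(n+1))."""
    return (1.0 + alpha ** (n + 1)) ** (1.0 / (n + 1))
```

    For a typical ceramic with n = 20, a 50 percent preload raises the expected strength by well under 0.01 percent while cutting the test time in half, which is the empirical practice this work set out to justify.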

  20. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Izzuddin, Nur; Sunarsih; Priyanto, Agoes

    As a vessel operates in the open seas, a marine diesel engine simulator whose engine rotation is controlled to transmit power through the propeller shaft offers a new methodology for self propulsion tests that tracks fuel saving in real time. Considering this circumstance, this paper presents a real-time marine diesel engine simulator system that tracks the performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate fuel rate, engine rotating speed, and thrust and torque of the propeller, and thus achieve the target vessel's speed. The input and output form a real-time control system of fuel saving rate and propeller rotating speed representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate fuel saving by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will help users analyze different vessel speeds to obtain better characteristics and hence optimize the fuel saving rate.
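    The engine-propeller coupling at the heart of such a simulator can be sketched as a one-state model: shaft speed evolves under the imbalance between engine torque and the standard propeller torque law Q_prop = K_Q * rho * n^2 * D^5. The integration scheme and all parameter values below are illustrative assumptions, not the paper's model.

```python
import math

def shaft_speed(Q_engine, J, KQ, rho, D, dt=0.01, t_end=60.0):
    """Euler-integrate 2*pi*J*dn/dt = Q_engine - KQ*rho*n*|n|*D**5
    (n in rev/s, J in kg*m^2, torques in N*m) and return the shaft
    speed reached at t_end. At steady state the engine torque balances
    propeller torque, so n -> sqrt(Q_engine / (KQ*rho*D**5))."""
    n = 0.0
    for _ in range(int(t_end / dt)):
        Q_prop = KQ * rho * n * abs(n) * D ** 5
        n += (Q_engine - Q_prop) / (2.0 * math.pi * J) * dt
    return n
```

    With, for example, 50 kN·m of engine torque, K_Q = 0.03, seawater density 1025 kg/m³, a 4 m propeller and J = 2000 kg·m², the speed settles near sqrt(50000/31488), about 1.26 rev/s, and the fuel rate model would then be driven from this speed.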

  1. Determination of Heritage SSME Pogo Suppressor Resistance and Inertance from Waterflow Pulse Testing

    NASA Technical Reports Server (NTRS)

    McDougal, Chris; Eberhart, Chad; Lee, Erik

    2016-01-01

    Waterflow tests of a heritage Space Shuttle Main Engine pogo suppressor were performed to experimentally quantify the resistance and inertance provided by the suppressor. Measurements of dynamic pressure and flow rate in response to pulsing flow were made throughout the test loop. A unique system identification methodology combined all sensor measurements with a one-dimensional perturbational flow model of the complete water flow loop to spatially translate physical measurements to the device under test. Multiple techniques were then employed to extract the effective resistance and inertance for the pogo suppressor. Parameters such as steady flow rate, perturbational flow rate magnitude, and pulse frequency were investigated to assess their influence on the behavior of the pogo suppressor dynamic response. These results support validation of the RS-25 pogo suppressor performance for use on the Space Launch System Core Stage.

  2. Variability in testing policies and impact on reported Clostridium difficile infection rates: results from the pilot Longitudinal European Clostridium difficile Infection Diagnosis surveillance study (LuCID).

    PubMed

    Davies, K; Davis, G; Barbut, F; Eckert, C; Petrosillo, N; Wilcox, M H

    2016-12-01

    Lack of standardised Clostridium difficile testing is a potential confounder when comparing infection rates. We used an observational, systematic, prospective large-scale sampling approach to investigate variability in C. difficile sampling to understand C. difficile infection (CDI) incidence rates. In-patient and institutional data were gathered from 60 European hospitals (across three countries). Testing methodology, testing/CDI rates and case profiles were compared between countries and institution types. The mean annual CDI rate per hospital was lowest in the UK and highest in Italy (1.5 vs. 4.7 cases/10,000 patient bed days [pbds], p < 0.001). The testing rate was highest in the UK compared with Italy and France (50.7/10,000 pbds vs. 31.5 and 30.3, respectively, p < 0.001). Only 58.4 % of diarrhoeal samples were tested for CDI across all countries. Overall, only 64 % of hospitals used recommended testing algorithms for laboratory testing. Small hospitals were significantly more likely to use standalone toxin tests (SATTs). There was an inverse correlation between hospital size and CDI testing rate. Hospitals using SATTs or assays not detecting toxin reported significantly higher CDI rates than those using recommended methods, despite similar testing frequencies. These data are consistent with higher false-positive rates in such (non-recommended) testing scenarios. Cases in Italy, and those diagnosed by SATTs or methods not detecting toxin, were significantly older. Testing occurred significantly earlier in the UK. Assessment of testing practice is paramount to the accurate interpretation and comparison of CDI rates.

  3. Determining solid-fluid interface temperature distribution during phase change of cryogenic propellants using transient thermal modeling

    NASA Astrophysics Data System (ADS)

    Bellur, K.; Médici, E. F.; Hermanson, J. C.; Choi, C. K.; Allen, J. S.

    2018-04-01

    Control of boil-off of cryogenic propellants is a continuing technical challenge for long duration space missions. Predicting phase change rates of cryogenic liquids requires an accurate estimation of solid-fluid interface temperature distributions in regions where a contact line or a thin liquid film exists. This paper describes a methodology to predict inner wall temperature gradients with and without evaporation using discrete temperature measurements on the outer wall of a container. Phase change experiments with liquid hydrogen and methane in cylindrical test cells of various materials and sizes were conducted at the Neutron Imaging Facility at the National Institute of Standards and Technology. Two types of tests were conducted. The first involved thermal cycling of an evacuated cell (dry) and the second involved controlled phase change with cryogenic liquids (wet). During both types of tests, temperatures were measured using Si-diode sensors mounted on the exterior surface of the test cells. Heat is transferred to the test cell by conduction through a helium exchange gas and through the cryostat sample holder. Thermal conduction through the sample holder is shown to be the dominant mode, with the rate of heat transfer limited by six independent contact resistances. An iterative methodology is employed to determine the contact resistances between the various components of the cryostat stick insert, test cell and lid using the dry test data. After the contact resistances are established, inner wall temperature distributions during wet tests are calculated.

  4. About subjective evaluation of adaptive video streaming

    NASA Astrophysics Data System (ADS)

    Tavakoli, Samira; Brunnström, Kjell; Garcia, Narciso

    2015-03-01

    The usage of HTTP Adaptive Streaming (HAS) technology by content providers is increasing rapidly. With video content available in multiple qualities, HAS allows the quality of the downloaded video to adapt to current network conditions, providing smooth video playback. However, the time-varying video quality by itself introduces a new type of impairment. The quality adaptation can be done in different ways. To find the adaptation strategy that best maximizes users' perceptual quality, it is necessary to investigate the subjective perception of adaptation-related impairments. However, the novelty of these impairments and their comparatively long duration make most standardized assessment methodologies less suited for studying HAS degradation. Furthermore, in traditional testing methodologies, the quality of the video in audiovisual services is often evaluated separately, not in the presence of audio. Nevertheless, jointly evaluating audio and video within a subjective test is a relatively under-explored research field. In this work, we address the research question of determining the appropriate assessment methodology to evaluate sequences with time-varying quality due to adaptation. This was done by studying the influence of different adaptation-related parameters through two subjective experiments using a methodology developed to evaluate long test sequences. To study the impact of audio presence on quality assessment by the test subjects, one of the experiments was conducted in the presence of audio stimuli. The experimental results were subsequently compared with another experiment using the standardized single stimulus Absolute Category Rating (ACR) methodology.

  5. From SNOMED CT to Uberon: Transferability of evaluation methodology between similarly structured ontologies.

    PubMed

    Elhanan, Gai; Ochs, Christopher; Mejino, Jose L V; Liu, Hao; Mungall, Christopher J; Perl, Yehoshua

    2017-06-01

    To examine whether disjoint partial-area taxonomy, a semantically-based evaluation methodology that has been successfully tested in SNOMED CT, will perform with similar effectiveness on Uberon, an anatomical ontology that belongs to a structurally similar family of ontologies as SNOMED CT. A disjoint partial-area taxonomy was generated for Uberon. One hundred randomly selected test concepts that overlap between partial-areas were matched to a same size control sample of non-overlapping concepts. The samples were blindly inspected for non-critical issues and presumptive errors first by a general domain expert whose results were then confirmed or rejected by a highly experienced anatomical ontology domain expert. Reported issues were subsequently reviewed by Uberon's curators. Overlapping concepts in Uberon's disjoint partial-area taxonomy exhibited a significantly higher rate of all issues. Clear-cut presumptive errors trended similarly but did not reach statistical significance. A sub-analysis of overlapping concepts with three or more relationship types indicated a much higher rate of issues. Overlapping concepts from Uberon's disjoint abstraction network are quite likely (up to 28.9%) to exhibit issues. The results suggest that the methodology can transfer well between same family ontologies. Although Uberon exhibited relatively few overlapping concepts, the methodology can be combined with other semantic indicators to expand the process to other concepts within the ontology that will generate high yields of discovered issues. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist.

    PubMed

    Terwee, Caroline B; Mokkink, Lidwine B; Knol, Dirk L; Ostelo, Raymond W J G; Bouter, Lex M; de Vet, Henrica C W

    2012-05-01

    The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. The scoring system was developed based on discussions among experts and testing of the scoring system on 46 articles from a systematic review. Four response options were defined for each COSMIN item (excellent, good, fair, and poor). A quality score per measurement property is obtained by taking the lowest rating of any item in a box ("worst score counts"). Specific criteria for excellent, good, fair, and poor quality for each COSMIN item are described. In defining the criteria, the "worst score counts" algorithm was taken into consideration. This means that only fatal flaws were defined as poor quality. The scores of the 46 articles show how the scoring system can be used to provide an overview of the methodological quality of studies included in a systematic review of measurement properties. Based on experience in testing this scoring system on 46 articles, the COSMIN checklist with the proposed scoring system seems to be a useful tool for assessing the methodological quality of studies included in systematic reviews of measurement properties.
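    The "worst score counts" algorithm is simple enough to state in code. The rating labels are those defined in the paper; the data layout and function name are ours.

```python
RATINGS = ["poor", "fair", "good", "excellent"]  # ordered worst to best

def box_score(item_ratings):
    """COSMIN quality score for one box: the lowest rating among the
    box's items ('worst score counts'), so a single fatal flaw drives
    the whole box to 'poor'."""
    return min(item_ratings, key=RATINGS.index)
```

    A box rated excellent on every design item but poor on one statistical item scores poor overall, which is why the authors reserved "poor" for fatal flaws only.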

  7. Test Standard Developed for Determining the Slow Crack Growth of Advanced Ceramics at Ambient Temperature

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Salem, Jonathan A.

    1998-01-01

    The service life of structural ceramic components is often limited by the process of slow crack growth. Therefore, it is important to develop an appropriate testing methodology for accurately determining the slow crack growth design parameters necessary for component life prediction. In addition, an appropriate test methodology can be used to determine the influences of component processing variables and composition on the slow crack growth and strength behavior of newly developed materials, thus allowing the component process to be tailored and optimized to specific needs. At the NASA Lewis Research Center, work to develop a standard test method to determine the slow crack growth parameters of advanced ceramics was initiated by the authors in early 1994 in the C 28 (Advanced Ceramics) committee of the American Society for Testing and Materials (ASTM). After about 2 years of required balloting, the draft written by the authors was approved and established as a new ASTM test standard: ASTM C 1368-97, Standard Test Method for Determination of Slow Crack Growth Parameters of Advanced Ceramics by Constant Stress-Rate Flexural Testing at Ambient Temperature. Briefly, the test method uses constant stress-rate testing to determine strengths as a function of stress rate at ambient temperature. Strengths are measured in a routine manner at four or more stress rates by applying constant displacement or loading rates. The slow crack growth parameters required for design are then estimated from a relationship between strength and stress rate. This new standard will be published in the Annual Book of ASTM Standards, Vol. 15.01, in 1998. Currently, a companion draft ASTM standard for determination of the slow crack growth parameters of advanced ceramics at elevated temperatures is being prepared by the authors and will be presented to the committee by the middle of 1998. 
Consequently, Lewis will maintain an active leadership role in advanced ceramics standardization within ASTM. In addition, the authors have been and are involved with several international standardization organizations including the Versailles Project on Advanced Materials and Standards (VAMAS), the International Energy Agency (IEA), and the International Organization for Standardization (ISO). The associated standardization activities involve fracture toughness, strength, elastic modulus, and the machining of advanced ceramics.
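    The estimation step the standard prescribes can be sketched directly: under power-law slow crack growth, log strength is linear in log stress rate with slope 1/(n + 1), so n falls out of an ordinary least-squares fit. A minimal sketch, with an illustrative function name and synthetic data; real measurements would carry scatter.

```python
import math

def scg_exponent(stress_rates, strengths):
    """Estimate the slow-crack-growth parameter n from constant
    stress-rate ('dynamic fatigue') data via the relation
    log10(strength) = log10(stress_rate) / (n + 1) + const,
    using an ordinary least-squares slope on the log-log data."""
    xs = [math.log10(r) for r in stress_rates]
    ys = [math.log10(s) for s in strengths]
    xbar = sum(xs) / len(xs)
    ybar = sum(ys) / len(ys)
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return 1.0 / slope - 1.0
```

    Noise-free data generated with n = 19 are recovered exactly; with real scatter, strengths at four or more stress rates spanning several decades are needed, matching the four-or-more stress rates the test method calls for.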

  8. Determining the ventilation and aerosol deposition rates from routine indoor-air measurements.

    PubMed

    Halios, Christos H; Helmis, Costas G; Deligianni, Katerina; Vratolis, Sterios; Eleftheriadis, Konstantinos

    2014-01-01

    Measurement of air exchange rate provides critical information in energy and indoor-air quality studies. Continuous measurement of ventilation rates is a rather costly exercise and requires specific instrumentation. In this work, an alternative methodology is proposed and tested, in which the air exchange rate is calculated from routine indoor and outdoor measurements of a common pollutant such as SO2, and the uncertainties induced in the calculations are analytically determined. The application of this methodology is demonstrated for three residential microenvironments in Athens, Greece, and the results are compared against ventilation rates calculated from differential pressure measurements. The calculated time-resolved ventilation rates were applied to the mass balance equation to estimate the particle loss rate, which was found to agree with literature values at an average of 0.50 h(-1). The proposed method was further evaluated by applying a mass balance numerical model for the calculation of indoor aerosol number concentrations, using the previously calculated ventilation rate, the outdoor measured number concentrations and the particle loss rates as input values. The model results for the indoor concentrations were found to compare well with the experimentally measured values.
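    The tracer-based calculation reduces to a one-zone mass balance, dC_in/dt = lambda * (C_out - C_in), for a pollutant such as SO2 with no indoor sources or sinks; with C_out roughly constant over the interval, lambda has a closed-form solution. A sketch under those simplifying assumptions; the paper's uncertainty propagation is omitted here.

```python
import math

def air_exchange_rate(c_in_start, c_in_end, c_out, hours):
    """Air exchange rate lambda (1/h) from two indoor concentration
    readings, assuming dC_in/dt = lambda*(C_out - C_in) with constant
    outdoor concentration and no indoor sources or sinks.
    Exact solution: C_in(t) = C_out + (C_in(0) - C_out)*exp(-lambda*t).
    """
    return -math.log((c_in_end - c_out) / (c_in_start - c_out)) / hours
```

    Indoor SO2 decaying from 30 to about 17.4 ppb over two hours against a steady 10 ppb outdoor level gives lambda = 0.5 per hour; the particle loss rate then follows from the aerosol mass balance once lambda is known.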

  9. Compression and Instrumented Indentation Measurements on Biomimetic Polymers

    DTIC Science & Technology

    2006-09-01

    styrene-isoprene triblock copolymer gels are tested and compared using both macro-scale and micro-scale measurements. A methodology is presented to...at stress states and strain rates not available to bulk measurement equipment. In this work, a ballistic gelatin and two styrene-isoprene triblock

  10. Convective Heat Transfer Scaling of Ignition Delay and Burning Rate with Heat Flux and Stretch Rate in the Equivalent Low Stretch Apparatus

    NASA Technical Reports Server (NTRS)

    Olson, Sandra

    2011-01-01

To better evaluate the buoyant contributions to the convective cooling (or heating) inherent in normal-gravity material flammability test methods, we derive a convective heat transfer correlation that can be used to account for the forced convective stretch effects on the net radiant heat flux for both ignition delay time and burning rate. The Equivalent Low Stretch Apparatus (ELSA) uses an inverted cone heater to minimize buoyant effects while at the same time providing a forced stagnation flow on the sample, which ignites and burns as a ceiling fire. Ignition delay and burning rate data are correlated with incident heat flux and convective heat transfer and compared to results from other test methods and fuel geometries, using similarity to determine the equivalent stretch rates and thus convective cooling (or heating) rates for those geometries. With this correlation methodology, buoyant effects inherent in normal-gravity material flammability test methods can be estimated, to better apply the test results to low stretch environments relevant to spacecraft material selection.

  11. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.

  12. A Longitudinal Study on Human Outdoor Decomposition in Central Texas.

    PubMed

    Suckling, Joanna K; Spradley, M Katherine; Godde, Kanya

    2016-01-01

The development of a methodology that estimates the postmortem interval (PMI) from stages of decomposition is a goal for which forensic practitioners strive. A proposed equation (Megyesi et al. 2005) that utilizes total body score (TBS) and accumulated degree days (ADD) was tested using longitudinal data collected from human remains donated to the Forensic Anthropology Research Facility (FARF) at Texas State University-San Marcos. Exact binomial tests examined the rate at which the equation successfully predicted ADD. Statistically significant differences were found between the ADD estimated by the equation and the observed value for each decomposition stage. Differences remained significant after carnivore-scavenged donations were removed from the analysis. The low success rates of the equation in predicting ADD from TBS, together with the wide standard errors, demonstrate the need to re-evaluate the use of this equation and methodology for PMI estimation in different environments; rather, multivariate methods and equations should be derived that are environmentally specific. © 2015 American Academy of Forensic Sciences.
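The exact binomial test used in this kind of validation can be reproduced with a short stdlib sketch. The counts below are invented for illustration and are not the study's data:

```python
from math import comb

def exact_binom_p(k, n, p0):
    """Two-sided exact binomial p-value (method of small p-values):
    sum the probabilities of every outcome no more likely than the
    observed count k under H0: success probability = p0."""
    pmf = [comb(n, i) * p0 ** i * (1 - p0) ** (n - i) for i in range(n + 1)]
    pk = pmf[k]
    return sum(p for p in pmf if p <= pk * (1 + 1e-9))

# Hypothetical: only 70 of 100 ADD predictions captured when a nominal
# 95% capture rate was expected -- strong evidence of miscalibration.
print(exact_binom_p(70, 100, 0.95) < 0.05)  # → True
```
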

  13. Mortality prediction using TRISS methodology in the Spanish ICU Trauma Registry (RETRAUCI).

    PubMed

    Chico-Fernández, M; Llompart-Pou, J A; Sánchez-Casado, M; Alberdi-Odriozola, F; Guerrero-López, F; Mayor-García, M D; Egea-Guerrero, J J; Fernández-Ortega, J F; Bueno-González, A; González-Robledo, J; Servià-Goixart, L; Roldán-Ramírez, J; Ballesteros-Sanz, M Á; Tejerina-Alvarez, E; Pino-Sánchez, F I; Homar-Ramírez, J

    2016-10-01

To validate Trauma and Injury Severity Score (TRISS) methodology as an auditing tool in the Spanish ICU Trauma Registry (RETRAUCI), a prospective, multicenter registry evaluation was carried out in thirteen Spanish Intensive Care Units (ICUs), enrolling individuals with traumatic disease and available data admitted to the participating ICUs. Predicted mortality using TRISS methodology was compared with that observed in the pilot phase of the RETRAUCI from November 2012 to January 2015. Discrimination was evaluated using receiver operating characteristic (ROC) curves and the corresponding areas under the curves (AUCs) (95% CI), and calibration using the Hosmer-Lemeshow (HL) goodness-of-fit test. A value of p<0.05 was considered significant. The main outcome measures were predicted and observed mortality. A total of 1405 patients were analyzed. The observed mortality rate was 18% (253 patients), while the predicted mortality rate was 16.9%. The area under the ROC curve was 0.889 (95% CI: 0.867-0.911). Patients with blunt trauma (n=1305) had an area under the ROC curve of 0.887 (95% CI: 0.864-0.910), and those with penetrating trauma (n=100) had an area under the curve of 0.919 (95% CI: 0.859-0.979). In the global sample, the HL test yielded a value of 25.38 (p=0.001): 27.35 (p<0.0001) in blunt trauma and 5.91 (p=0.658) in penetrating trauma. TRISS methodology underestimated mortality in patients with low predicted mortality and overestimated it in patients with high predicted mortality. TRISS methodology in the evaluation of severe trauma in Spanish ICUs showed good discrimination but inadequate calibration, particularly in blunt trauma. Copyright © 2015 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.
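The two validation statistics used above, the ROC AUC (discrimination) and the Hosmer-Lemeshow statistic (calibration), can be sketched from first principles. This is a simplified illustration (no tie handling in the AUC, statistic only with no chi-square p-value lookup), not the registry's analysis code:

```python
def auc(y, p):
    """Area under the ROC curve via the Mann-Whitney rank identity
    (ties in the predicted risks are ignored for brevity)."""
    order = sorted(range(len(p)), key=lambda i: p[i])
    ranks = [0.0] * len(p)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    n1 = sum(y)
    n0 = len(y) - n1
    rank_sum = sum(r for r, yi in zip(ranks, y) if yi)
    return (rank_sum - n1 * (n1 + 1) / 2) / (n0 * n1)

def hosmer_lemeshow(y, p, g=10):
    """Hosmer-Lemeshow statistic: observed vs expected events across
    g groups ordered by predicted risk."""
    pairs = sorted(zip(p, y), key=lambda t: t[0])  # stable sort on risk
    size = len(pairs) // g
    chi2 = 0.0
    for k in range(g):
        grp = pairs[k * size:(k + 1) * size] if k < g - 1 else pairs[(g - 1) * size:]
        o = sum(yi for _, yi in grp)   # observed events in the group
        e = sum(pi for pi, _ in grp)   # expected events in the group
        n = len(grp)
        chi2 += (o - e) ** 2 / (e * (1 - e / n))
    return chi2

# Perfectly separated toy data gives AUC = 1.0
print(auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))  # → 1.0
```

A well-calibrated model yields a small HL statistic (observed counts match expected counts in every risk group); the large values reported above are what signals miscalibration.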

  14. Fire Hazards from Combustible Ammunition, Methodology Development. Phase I

    DTIC Science & Technology

    1980-06-01

5.3 Flame Length, Flame Diameter and Mass Burning Rate 37 5.4 Flame Emissive Power 41 5.5 Fire Plume Axial Gas Velocity 41 5.6 Flame Temperature...B.2 Exit Velocity 93 B.3 Rate of Energy Flow 93 B.4 Chamber Characteristics 94 B.5 Flame Length 95 B.6 Flame Lift Angle 95 B.7 Summary 97...Viewing Flame in Test Series 5 17. Flame Length Scaling 18. Scaling Trends for Mass Burning Rate 19. Effective Flame Emissive Power versus Flame

  15. A Robust Semi-Parametric Test for Detecting Trait-Dependent Diversification.

    PubMed

    Rabosky, Daniel L; Huang, Huateng

    2016-03-01

    Rates of species diversification vary widely across the tree of life and there is considerable interest in identifying organismal traits that correlate with rates of speciation and extinction. However, it has been challenging to develop methodological frameworks for testing hypotheses about trait-dependent diversification that are robust to phylogenetic pseudoreplication and to directionally biased rates of character change. We describe a semi-parametric test for trait-dependent diversification that explicitly requires replicated associations between character states and diversification rates to detect effects. To use the method, diversification rates are reconstructed across a phylogenetic tree with no consideration of character states. A test statistic is then computed to measure the association between species-level traits and the corresponding diversification rate estimates at the tips of the tree. The empirical value of the test statistic is compared to a null distribution that is generated by structured permutations of evolutionary rates across the phylogeny. The test is applicable to binary discrete characters as well as continuous-valued traits and can accommodate extremely sparse sampling of character states at the tips of the tree. We apply the test to several empirical data sets and demonstrate that the method has acceptable Type I error rates. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
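The permutation logic described above can be sketched briefly. Note that the published test uses structured permutations that respect phylogenetic pseudoreplication; this illustration shuffles character states freely and is a simplification, with invented data:

```python
import random

def strapp_like_p(rates, states, n_perm=2000, seed=1):
    """Permutation p-value for a difference in mean tip diversification
    rate between two binary character states. The published test
    permutes in a phylogenetically structured way; plain label
    shuffling here is for illustration only."""
    def stat(s):
        g1 = [r for r, x in zip(rates, s) if x]
        g0 = [r for r, x in zip(rates, s) if not x]
        return abs(sum(g1) / len(g1) - sum(g0) / len(g0))

    obs = stat(states)
    rng = random.Random(seed)
    s = list(states)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(s)          # break the trait-rate association
        if stat(s) >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

With tip rates strongly segregated by state the p-value is small; with states scattered at random across the same rates it is not.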

  16. Lumbar Sympathetic Plexus Block as a Treatment for Postamputation Pain: Methodology for a Randomized Controlled Trial.

    PubMed

    McCormick, Zachary L; Hendrix, Andrew; Dayanim, David; Clay, Bryan; Kirsling, Amy; Harden, Norman

    2018-03-08

We present a technical protocol for rigorous assessment of patient-reported outcomes and psychophysical testing relevant to lumbar sympathetic blocks for the treatment of postamputation pain (PAP). This description is intended to inform future prospective investigation. A series of four participants from a blinded, randomized, sham-controlled trial is described, conducted at a tertiary, urban, academic pain medicine center. The four participants each had a single lower limb amputation and associated chronic PAP. Participants were randomized to receive a lumbar sympathetic block with 0.25% bupivacaine or sham needle placement. Patient-rated outcome measures included the numerical rating scale (NRS) for pain, the McGill Pain Questionnaire-Short Form, the Center for Epidemiological Studies Depression Scale, the Pain and Anxiety Symptoms Scale-short version, and the Pain Disability Index (PDI). Psychophysical and biometric testing was also performed, which included vibration sensation testing, pinprick sensation testing, brush sensation testing, Von Frey repeated weighted pinprick sensation, and thermal quantitative sensory testing. In the four described cases, treatment of PAP with a single lumbar sympathetic block, but not sham intervention, resulted in reduction of both residual limb pain and phantom limb pain as well as perceived disability on the PDI at three-month follow-up. An appropriately powered randomized controlled study using this methodology may not only aid in determining the possible clinical efficacy of lumbar sympathetic block in PAP, but could also improve our understanding of the underlying pathophysiologic mechanisms of PAP.

  17. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals

    PubMed Central

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-01-01

Survival analysis methods have gained widespread use in the field of oncology. For reliable results, the methodological process and reporting quality are crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. The aims were to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to suggest desirable precautions in the analysis, and to offer advice for authors, readers, and editors. A total of 242 survival analysis articles, drawn from 1492 articles published in 4 leading Chinese oncology journals in 2013, were evaluated according to 16 established items for the proper use and reporting of survival analysis. The application rates of Kaplan–Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used parametric methods for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 reported rates under 80% (the lowest being 75.25%) and 55 reported 100%. The reporting rates of all types of survival endpoint were lower than 10%. Of the 100 articles that reported a loss to follow-up, 11 stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks of the proportional hazards assumption, no report of testing for interactions and collinearity between independent variables, and no report of the sample size calculation. Thirty-six articles (32.74%) reported the method of independent variable selection. These defects could make the reported results inaccurate, misleading, or difficult to interpret. There are gaps in the conduct and reporting of survival analysis in studies published in Chinese oncology journals, and severe deficiencies were noted. Greater endorsement by journals of reporting guidelines for survival analysis may improve article quality and the dissemination of reliable evidence to oncology clinicians. We recommend that authors, readers, reviewers, and editors consider survival analysis more carefully and cooperate more closely with statisticians and epidemiologists. PMID:29390340
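For reference, the Kaplan–Meier estimator that dominates these articles reduces to a few lines; the data below are invented for illustration:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate: at each distinct event time t,
    S *= (1 - d_t / n_t), with d_t deaths at t and n_t subjects still
    at risk; event = 1 marks a death, 0 marks censoring."""
    event_times = sorted({t for t, e in zip(times, events) if e})
    s, curve = 1.0, []
    for t in event_times:
        n_t = sum(1 for tt in times if tt >= t)                    # at risk
        d_t = sum(1 for tt, e in zip(times, events) if tt == t and e)  # deaths
        s *= 1.0 - d_t / n_t
        curve.append((t, s))
    return curve
```

For example, with deaths at times 1, 2, and 4 and a censoring at time 3, the curve steps through S = 0.75, 0.5, and 0.0; the censored subject leaves the risk set without producing a step.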

  18. A Wavelet-Based Methodology for Grinding Wheel Condition Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, T. W.; Ting, C.F.; Qu, Jun

    2007-01-01

Grinding wheel surface condition changes as more material is removed. This paper presents a wavelet-based methodology for grinding wheel condition monitoring based on acoustic emission (AE) signals. Grinding experiments in creep feed mode were conducted to grind alumina specimens with a resinoid-bonded diamond wheel using two different conditions. During the experiments, AE signals were collected when the wheel was 'sharp' and when the wheel was 'dull'. Discriminant features were then extracted from each raw AE signal segment using the discrete wavelet decomposition procedure. An adaptive genetic clustering algorithm was finally applied to the extracted features in order to distinguish different states of grinding wheel condition. The test results indicate that the proposed methodology can achieve 97% clustering accuracy for the high material removal rate condition, 86.7% for the low material removal rate condition, and 76.7% for the combined grinding conditions if the base wavelet, the decomposition level, and the GA parameters are properly selected.
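The feature-extraction step can be illustrated with a hand-rolled Haar transform. This is a minimal sketch, not the paper's pipeline: the paper selects the base wavelet and decomposition level (a library such as PyWavelets would normally be used), and the Haar wavelet and per-band detail energies here are illustrative choices:

```python
def haar_dwt(x):
    """One level of the Haar DWT: orthonormal pairwise averages
    (approximation) and differences (detail)."""
    a = [(x[i] + x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]
    d = [(x[i] - x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]
    return a, d

def band_energies(x, levels=3):
    """Detail-band energy per decomposition level, a common
    discriminant feature for AE-based condition monitoring."""
    feats = []
    for _ in range(levels):
        x, d = haar_dwt(x)
        feats.append(sum(c * c for c in d))
    return feats

# A flat signal has zero energy in every detail band
print(band_energies([1.0] * 8))  # → [0.0, 0.0, 0.0]
```

Because the transform is orthonormal, the detail energies plus the final approximation energy equal the signal energy, which makes the features easy to sanity-check.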

  19. Variable frame rate transmission - A review of methodology and application to narrow-band LPC speech coding

    NASA Astrophysics Data System (ADS)

    Viswanathan, V. R.; Makhoul, J.; Schwartz, R. M.; Huggins, A. W. F.

    1982-04-01

The variable frame rate (VFR) transmission methodology developed, implemented, and tested in the years 1973-1978 for efficiently transmitting linear predictive coding (LPC) vocoder parameters extracted from the input speech at a fixed frame rate is reviewed. With the VFR method, parameters are transmitted only when their values have changed sufficiently over the interval since their preceding transmission. Two distinct approaches to automatic implementation of the VFR method are discussed. The first bases the transmission decisions on comparisons between the parameter values of the present frame and the last transmitted frame. The second, which is based on a functional perceptual model of speech, compares the parameter values of all the frames that lie in the interval between the present frame and the last transmitted frame against a linear model of parameter variation over that interval. Also considered is the application of VFR transmission to the design of narrow-band LPC speech coders with average bit rates of 2000-2400 bits/s.
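The first transmission strategy described above reduces to a simple threshold rule. A toy sketch with invented frame data (each frame is a tuple of parameter values; the threshold is an illustrative choice, not a value from the review):

```python
def vfr_select(frames, threshold):
    """First VFR strategy: transmit a frame only when some parameter
    differs from the last *transmitted* frame by more than
    `threshold`; the first frame is always sent."""
    sent = [0]
    for i in range(1, len(frames)):
        last = frames[sent[-1]]
        if max(abs(a - b) for a, b in zip(frames[i], last)) > threshold:
            sent.append(i)
    return sent

# Toy one-parameter track: slow drift, then a jump worth transmitting
frames = [(0.0,), (0.05,), (0.1,), (1.0,), (1.02,)]
print(vfr_select(frames, 0.5))  # → [0, 3]
```

During steady speech the drift never crosses the threshold and frames are skipped; the average bit rate falls accordingly, which is the source of the 2000-2400 bits/s figures quoted above.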

  20. The tear turnover and tear clearance tests - a review.

    PubMed

    Garaszczuk, Izabela K; Montes Mico, Robert; Iskander, D Robert; Expósito, Alejandro Cerviño

    2018-03-01

The aim is to provide a summary of methods available for the assessment of tear turnover and tear clearance rates. The review defines tear clearance and tear turnover and describes their implications for ocular surface health. Additionally, it describes the main types of techniques for measuring tear turnover, including fluorescein tear clearance tests, techniques utilizing the electromagnetic spectrum and a tracer molecule, and novel experimental techniques utilizing optical coherence tomography and fluorescein profilometry. Areas covered: Internet databases (PubMed, Science Direct, Google Scholar) and the most frequently cited references were used as the principal sources of information on tear turnover rate and tear clearance rate, their measurement methodologies and equipment, as well as their definitions and implications for anterior eye surface health and function. Keywords used for the data search were as follows: tear turnover, tear clearance, fluorescein clearance, scintigraphy, fluorophotometry, tear flow, drainage, tear meniscus dynamics, Krehbiel flow and lacrimal functional unit. Expert commentary: After decades, the topic of tear turnover assessment has been reintroduced. Recently, new techniques have been developed to propose less invasive, less time-consuming and simpler methodologies for the assessment of tear dynamics that have the potential to be utilized in clinical practice.

  1. [Socio-cultural and ethical factors involved in the diagnosis of schistosomiasis mansoni in an area of low endemicity].

    PubMed

    Gonçalves, Margareth Maria Lessa; Barreto, Magali Muniz Gonçalves; Maldonado, Arnaldo; Maione, Vanessa Regal; Rey, Luís; Soares, Marisa da Silveira

    2005-01-01

Five annual parasitological surveys and one serological survey, respectively based on the Kato-Katz and free sedimentation methods and the Western blot technique, were conducted in Sumidouro, Rio de Janeiro, Brazil, a county endemic for schistosomiasis. Possible influences of the use of these methodologies on social, cultural, and ethical aspects of the study population were also evaluated. Having the opportunity to choose among the different techniques was a decisive factor influencing participation by the population. Prevalence rates of positive results for stool tests were: 11.6% (1995); 8.8% (1996); 12.2% (1998); 5.9% (1999); and 3.2% (2000). In the period during which the serological survey was performed, the use of laboratory testing in association with analysis of clinical data and available data on transmission and treatment generated a diagnostic procedure termed "coproseroepidemiology". This methodology contributed to significant improvements in the accuracy of measurement of local schistosomiasis prevalence, indicating that epidemiological surveillance could help prevent the recurrence of high prevalence rates. The fact that Biomphalaria glabrata was replaced by Melanoides tuberculata in the main transmission focus contributed to a significant decrease in infection rates.

  2. Methodological evaluation and comparison of five urinary albumin measurements.

    PubMed

    Liu, Rui; Li, Gang; Cui, Xiao-Fan; Zhang, Dong-Ling; Yang, Qing-Hong; Mu, Xiao-Yan; Pan, Wen-Jie

    2011-01-01

Microalbuminuria is an indicator of kidney damage and a risk factor for the progression of kidney disease, cardiovascular disease, and other conditions. Therefore, accurate and precise measurement of urinary albumin is critical. However, there are no reference measurement procedures or reference materials for urinary albumin. Nephelometry, turbidimetry, the colloidal gold method, radioimmunoassay, and chemiluminescence immunoassay were subjected to methodological evaluation based on imprecision, recovery rate, linearity, haemoglobin interference rate, and verification of the reference interval. We then tested 40 urine samples from diabetic patients by each method and compared the results between assays. The results indicate that nephelometry has the best analytical performance among the five methods, with an average intra-assay coefficient of variation (CV) of 2.6%, an average inter-assay CV of 1.7%, a mean recovery of 99.6%, linearity of R=1.00 from 2 to 250 mg/l, and an interference rate of <10% at haemoglobin concentrations of <1.82 g/l. The correlation (r) between assays ranged from 0.701 to 0.982, and the Bland-Altman plots indicated that each assay provided significantly different results from the others. Nephelometry is the clinical urinary albumin method with the best analytical performance in our study. © 2011 Wiley-Liss, Inc.
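Two of the evaluation metrics used above, the coefficient of variation and spike recovery, have simple definitions worth stating precisely. The replicate values below are illustrative, not the study's measurements:

```python
import statistics

def cv_percent(xs):
    """Coefficient of variation (%): 100 * sample SD / mean."""
    return 100.0 * statistics.stdev(xs) / statistics.mean(xs)

def recovery_percent(spiked, base, added):
    """Spike recovery (%): 100 * (spiked result - base result) /
    amount of analyte added."""
    return 100.0 * (spiked - base) / added

# Illustrative replicates of a 10 mg/l control sample
print(round(cv_percent([9.0, 10.0, 11.0]), 1))  # → 10.0
```
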

  3. On the upper ocean turbulent dissipation rate due to microscale breakers and small whitecaps

    NASA Astrophysics Data System (ADS)

    Banner, Michael L.; Morison, Russel P.

    2018-06-01

In ocean wave modelling, accurately computing the evolution of the wind-wave spectrum depends on the source terms and the spectral bandwidth used. The wave dissipation rate source term, which spectrally quantifies wave breaking and other dissipative processes, remains poorly understood, including the spectral bandwidth needed to capture the essential model physics. The observational study of Sutherland and Melville (2015a; SM15a) investigated the relative dissipation rate contributions of breaking waves, from large-scale whitecaps to microbreakers. They concluded that a large fraction of wave energy was dissipated by microbreakers. However, in strong contrast with their findings, our analysis of their data and other recent data sets shows that for young seas, microbreakers and small whitecaps contribute only a small fraction of the total breaking wave dissipation rate. For older seas, we find microbreakers and small whitecaps contribute a large fraction of the breaking wave dissipation rate, but this is only a small fraction of the total dissipation rate, which is now dominated by non-breaking contributions. Hence, for all the wave age conditions observed, microbreakers make an insignificant contribution to the total wave dissipation rate in the wave boundary layer. We tested the sensitivity of the results to the SM15a whitecap analysis methodology by transforming the SM15a breaking data using our breaking crest processing methodology. This resulted in the small-scale breaking waves making an even smaller contribution to the total wave dissipation rate, and so the result is independent of the breaker processing methodology. Comparison with other near-surface total TKE dissipation rate observations also supports this conclusion. These contributions to the spectral dissipation rate in ocean wave models are small and need not be explicitly resolved.

  4. Somatic and gastrointestinal in vivo biotransformation rates of hydrophobic chemicals in fish.

    PubMed

    Lo, Justin C; Campbell, David A; Kennedy, Christopher J; Gobas, Frank A P C

    2015-10-01

To improve current bioaccumulation assessment methods, a methodology is developed, applied, and investigated for measuring in vivo biotransformation rates of hydrophobic organic substances in the body (soma) and gastrointestinal tract of fish. The method resembles the Organisation for Economic Co-operation and Development (OECD) 305 dietary bioaccumulation test but includes reference chemicals to determine both somatic and gastrointestinal biotransformation rates of test chemicals. Somatic biotransformation rate constants for the test chemicals ranged between 0 d⁻¹ and 0.38 (standard error [SE] 0.03) d⁻¹. Gastrointestinal biotransformation rate constants varied from 0 d⁻¹ to 46 (SE 7) d⁻¹. Gastrointestinal biotransformation contributed more to the overall biotransformation in fish than somatic biotransformation for all test substances but one. Results suggest that biomagnification tests can reveal the full extent of biotransformation in fish. The common presumption that the liver is the main site of biotransformation may not apply to many substances exposed through the diet. The results suggest that the application of quantitative structure-activity relationships (QSARs) for somatic biotransformation rates and hepatic in vitro models to assess the effect of biotransformation on bioaccumulation can underestimate biotransformation rates and overestimate the biomagnification potential of chemicals that are biotransformed in the gastrointestinal tract. With some modifications, the OECD 305 test can generate somatic and gastrointestinal biotransformation data to develop biotransformation QSARs and test in vitro-in vivo biotransformation extrapolation methods. © 2015 SETAC.

  5. A simple randomisation procedure for validating discriminant analysis: a methodological note.

    PubMed

    Wastell, D G

    1987-04-01

    Because the goal of discriminant analysis (DA) is to optimise classification, it designedly exaggerates between-group differences. This bias complicates validation of DA. Jack-knifing has been used for validation but is inappropriate when stepwise selection (SWDA) is employed. A simple randomisation test is presented which is shown to give correct decisions for SWDA. The general superiority of randomisation tests over orthodox significance tests is discussed. Current work on non-parametric methods of estimating the error rates of prediction rules is briefly reviewed.
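The randomisation idea can be sketched with a stand-in classifier. In this simplified, synthetic illustration a 1-D nearest-centroid rule replaces the full stepwise discriminant analysis; the key point, refitting the entire procedure on each label-shuffled dataset, is what makes the test valid for SWDA:

```python
import random

def nearest_centroid_acc(x, y):
    """Resubstitution accuracy of a 1-D nearest-centroid rule
    (a stand-in for the full stepwise discriminant procedure)."""
    m0 = sum(v for v, g in zip(x, y) if g == 0) / y.count(0)
    m1 = sum(v for v, g in zip(x, y) if g == 1) / y.count(1)
    pred = [int(abs(v - m1) < abs(v - m0)) for v in x]
    return sum(p == g for p, g in zip(pred, y)) / len(y)

def randomisation_p(x, y, n_perm=999, seed=7):
    """Randomisation test: refit the classifier on label-shuffled
    data and count how often chance matches or beats the observed
    accuracy."""
    obs = nearest_centroid_acc(x, y)
    rng, ys, hits = random.Random(seed), list(y), 0
    for _ in range(n_perm):
        rng.shuffle(ys)
        if nearest_centroid_acc(x, ys) >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

Because the classifier is rebuilt inside every permutation, the optimistic bias of the fitting procedure inflates the null distribution by exactly as much as it inflates the observed accuracy, which is why the decision remains correct even for stepwise selection.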

  6. Measuring standing balance in multiple sclerosis: Further progress towards an automatic and reliable method in clinical practice.

    PubMed

    Keune, Philipp M; Young, William R; Paraskevopoulos, Ioannis T; Hansen, Sascha; Muenssinger, Jana; Oschmann, Patrick; Müller, Roy

    2017-08-15

Balance deficits in multiple sclerosis (MS) are often monitored by means of observer-rated tests. These may provide reliable data, but may also be time-consuming, subject to inter-rater variability, and potentially insensitive to mild fluctuations throughout the clinical course. On the other hand, laboratory assessments are often not available. The Nintendo Wii Balance Board (WBB) may represent a low-cost solution. The purpose of the current study was to examine the methodological quality of WBB data in MS (internal consistency, test-retest reliability), convergent validity with observer-rated tests (Berg Balance Scale, BBS; Timed Up and Go Test, TUG), and discriminative validity concerning clinical status (Expanded Disability Status Scale, EDSS). Standing balance was assessed with the WBB for 4 min in 63 MS patients at two assessment points, four months apart. Additionally, patients were examined with the BBS, TUG and the EDSS. A period of 4 min on the WBB provided data characterized by excellent internal consistency and test-retest reliability. Significant correlations between WBB data and results of the BBS and TUG were obtained after merely 2 min on the board. An EDSS median-split revealed that higher EDSS values (>3) were associated with significantly increased postural sway on the WBB. WBB measures reflecting postural sway are methodologically robust in MS, involving excellent internal consistency and test-retest reliability. They are also characterized by convergent validity with other considerably lengthier observer-rated balance measures (BBS) and sensitive to broader clinical characteristics (EDSS). The WBB may hence represent an effective, easy-to-use monitoring tool for MS patients in clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Accelerated Testing Methodology for the Determination of Slow Crack Growth of Advanced Ceramics

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Salem, Jonathan A.; Gyekenyesi, John P.

    1997-01-01

Constant stress-rate (dynamic fatigue) testing has been used for several decades to characterize the slow crack growth behavior of glass and ceramics at both ambient and elevated temperatures. The advantage of constant stress-rate testing over other methods lies in its simplicity: strengths are measured in a routine manner at four or more stress rates by applying a constant crosshead speed or constant loading rate. The slow crack growth parameters (n and A) required for design can be estimated from the relationship between strength and stress rate. With the proper use of preloading in constant stress-rate testing, an appreciable saving of test time can be achieved. If a preload corresponding to 50% of the strength is applied to the specimen prior to testing, 50% of the test time can be saved, as long as the strength remains unchanged regardless of the applied preload. In fact, it has been common, empirical practice in strength testing of ceramics or optical fibers to apply some preloading (less than 40%). The purpose of this work is to study the effect of preloading on the strength, in order to lay a theoretical foundation for this empirical practice. For this purpose, analytical and numerical solutions of strength as a function of preloading were developed. To verify the solutions, constant stress-rate testing was conducted using glass and alumina at room temperature, and alumina, silicon nitride, and silicon carbide at elevated temperatures, over a range of preloadings from 0 to 90%.
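Under the power-law crack-velocity formulation, equating the crack-growth integral of sigma(t)^n for a pure ramp and for an instantaneous preload followed by the same ramp gives a closed form for the normalized strength: sigma_f' / sigma_f = (1 + alpha^(n+1))^(1/(n+1)), where alpha is the preloading factor. This is our reading of the power-law model, not a formula quoted from the paper, so treat it as a hedged sketch:

```python
def normalized_strength(alpha, n):
    """Strength with preload / strength without preload, for
    preloading factor alpha = (preload stress)/(no-preload strength)
    and SCG exponent n, assuming power-law crack growth and a linear
    stress ramp after an instantaneous preload:
        sigma_f' = sigma_f * (1 + alpha**(n+1)) ** (1/(n+1))
    (derived form, stated here as an assumption)."""
    return (1.0 + alpha ** (n + 1)) ** (1.0 / (n + 1))

# For typical ceramic n values (roughly 20-40), even a 50% preload
# leaves the measured strength essentially unchanged:
print(round(normalized_strength(0.5, 20), 6))  # → 1.0
```

This is why the 50% preload in the accelerated methodology saves half the test time while perturbing the measured strength by only a few parts in 10^8 for n = 20; only at very high preloads (alpha approaching 1) does the correction become noticeable.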

  8. Comparing trends in cancer rates across overlapping regions.

    PubMed

    Li, Yi; Tiwari, Ram C

    2008-12-01

Monitoring and comparing trends in cancer rates across geographic regions or over different time periods have been major tasks of the National Cancer Institute's (NCI) Surveillance, Epidemiology, and End Results (SEER) Program as it profiles healthcare quality and informs healthcare resource allocation within a spatial-temporal framework. A fundamental difficulty, however, arises when such comparisons have to be made for regions or time intervals that overlap, for example, comparing the change in trends of mortality rates in a local area (e.g., the mortality rate of breast cancer in California) with a more global level (i.e., the national mortality rate of breast cancer). In view of the sparsity of available methodologies, this article develops a simple corrected Z-test that accounts for such overlapping. The performance of the proposed test over the two-sample "pooled" t-test that assumes independence across comparison groups is assessed via the Pitman asymptotic relative efficiency as well as Monte Carlo simulations and applications to the SEER cancer data. The proposed test will be important for the SEER*Stat software, maintained by the NCI, for the analysis of the SEER data.
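The structural idea of the correction can be sketched abstractly. The published test's variance and covariance estimators are more involved, so the function below only illustrates why ignoring the overlap matters; the numeric values are invented:

```python
import math

def corrected_z(r1, v1, r2, v2, cov):
    """Z statistic comparing two rate estimates whose sampling errors
    covary because the regions (or periods) overlap; cov is the
    covariance term that the naive pooled test wrongly sets to zero."""
    return (r1 - r2) / math.sqrt(v1 + v2 - 2.0 * cov)

# For nested regions (e.g., a state within the nation) the covariance
# is positive, so the corrected test is more powerful than the naive
# one that treats the two estimates as independent:
print(corrected_z(10.0, 4.0, 9.0, 1.0, 0.9) > corrected_z(10.0, 4.0, 9.0, 1.0, 0.0))  # → True
```
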

  9. 78 FR 27400 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-10

    ... tests ``furnished in a place of residence used as the patient's home,'' and are designed to ensure that... specific attachment designed to capture physician-owned hospital ownership and investment interest data was... interested entities in the Advance Notice of Methodological Changes for MA Payment Rates (every February) and...

  10. Assessing Faculty Performance: A Test of Method.

    ERIC Educational Resources Information Center

    Clark, Mary Jo; Blackburn, Robert T.

    A methodology for evaluating faculty work performance was discussed, using data obtained from a typical liberal arts college faculty. Separate evaluations of teaching effectiveness and of overall contributions to the college for 45 full-time faculty (85% response rate) were collected from administrators, faculty colleagues, students, and from the…

  11. Important Publications in the Area of Photovoltaic Performance |

    Science.gov Websites

    , 2011, DOI: 978-0-12-385934-1. Photoelectrochemical Water Splitting: Standards, Experimental Methods Energy Systems Testing, Solar Energy 73, 443-467 (2002). D.R. Myers, K. Emery, and C. Gueymard, Revising Performance Evaluation Methodologies for Energy Ratings," Proc. 24th IEEE Photovoltaic Specialists Conf

  12. Investigation of Super Learner Methodology on HIV-1 Small Sample: Application on Jaguar Trial Data.

    PubMed

    Houssaïni, Allal; Assoumou, Lambert; Marcelin, Anne Geneviève; Molina, Jean Michel; Calvez, Vincent; Flandre, Philippe

    2012-01-01

Background. Many statistical models have been tested to predict phenotypic or virological response from genotypic data. A statistical framework called Super Learner has been introduced either to compare different methods/learners (discrete Super Learner) or to combine them in a Super Learner prediction method. Methods. The Jaguar trial is used to apply the Super Learner framework. The Jaguar study is an "add-on" trial comparing the efficacy of adding didanosine to an ongoing failing regimen. Our aim was also to investigate the impact of using different cross-validation strategies and different loss functions. Four different repartitions between training set and validation set were tested through two loss functions. Six statistical methods were compared. We assessed performance by evaluating R² values and accuracy by calculating the rates of patients being correctly classified. Results. Our results indicated that the more recent Super Learner methodology of building a new predictor based on a weighted combination of different methods/learners provided good performance. A simple linear model provided similar results to those of this new predictor. Slight discrepancies arose between the two loss functions investigated, and also between results based on cross-validated risks and results from the full dataset. The Super Learner methodology and the linear model correctly classified around 80% of patients. The difference between the lowest and highest rates was around 10 percent. The number of mutations retained by the different learners also varied from 1 to 41. Conclusions. The more recent Super Learner methodology, combining the predictions of many learners, provided good performance on our small dataset.

  13. Human Response to Low-Intensity Sonic Booms Heard Indoors and Outdoors

    NASA Technical Reports Server (NTRS)

    Sullivan, Brenda M.; Klos, Jacob; Buehrle, Ralph D.; McCurdy, David A.; Haering, Edward A., Jr.

    2010-01-01

Test subjects seated inside and outside a house were exposed to low-intensity N-wave sonic booms during a 3-week test period in June 2006. The house was instrumented to measure the booms both inside and out. F-18 aircraft were flown to achieve a variety of boom overpressures from approximately 0.1 to 0.6 psf. During four test days, seventy-seven test subjects heard the booms while seated inside and outside the house. Using the Magnitude Estimation methodology and artificial reference sounds, the subjects rated the annoyance of the booms. Since the same subjects heard similar booms both inside and outside the house, comparative ratings of indoor and outdoor annoyance were obtained. For a given metric level, indoor subjects gave higher annoyance scores than outdoor subjects. For a given boom, annoyance scores inside were on average the same as those outside. In a post-test questionnaire, the majority of subjects rated the indoor booms as more annoying than the outdoor ones. These results are discussed in this paper.

  14. Delta Clipper-Experimental In-Ground Effect on Base-Heating Environment

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    1998-01-01

    A quasitransient in-ground effect method is developed to study the effect of vertical landing on a launch vehicle base-heating environment. This computational methodology is based on a three-dimensional, pressure-based, viscous flow, chemically reacting, computational fluid dynamics formulation. Important in-ground base-flow physics such as the fountain-jet formation, plume growth, air entrainment, and plume afterburning are captured with the present methodology. Convective and radiative base-heat fluxes are computed for comparison with those of a flight test. The influence of the laminar Prandtl number on the convective heat flux is included in this study. A radiative direction-dependency test is conducted using both the discrete ordinate and finite volume methods. Treatment of the plume afterburning is found to be very important for accurate prediction of the base-heat fluxes. Convective and radiative base-heat fluxes predicted by the model using a finite rate chemistry option compared reasonably well with flight-test data.

  15. The ALHAMBRA survey: accurate merger fractions derived by PDF analysis of photometrically close pairs

    NASA Astrophysics Data System (ADS)

    López-Sanjuan, C.; Cenarro, A. J.; Varela, J.; Viironen, K.; Molino, A.; Benítez, N.; Arnalte-Mur, P.; Ascaso, B.; Díaz-García, L. A.; Fernández-Soto, A.; Jiménez-Teja, Y.; Márquez, I.; Masegosa, J.; Moles, M.; Pović, M.; Aguerri, J. A. L.; Alfaro, E.; Aparicio-Villegas, T.; Broadhurst, T.; Cabrera-Caño, J.; Castander, F. J.; Cepa, J.; Cerviño, M.; Cristóbal-Hornillos, D.; Del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Martínez, V. J.; Perea, J.; Prada, F.; Quintana, J. M.

    2015-04-01

Aims: Our goal is to develop and test a novel methodology to compute accurate close-pair fractions with photometric redshifts. Methods: We improved the currently used methodologies for estimating the merger fraction fm from photometric redshifts by (i) using the full probability distribution functions (PDFs) of the sources in redshift space; (ii) including the variation in the luminosity of the sources with z in both the sample selection and the luminosity ratio constraint; and (iii) splitting individual PDFs into red and blue spectral templates to work reliably with colour selections. We tested the performance of our new methodology with the PDFs provided by the ALHAMBRA photometric survey. Results: The merger fractions and rates from the ALHAMBRA survey agree very well with those from spectroscopic work, both for the general population and for red and blue galaxies. With the merger rate of bright (MB ≤ -20 - 1.1z) galaxies evolving as (1 + z)n, the power-law index n is higher for blue galaxies (n = 2.7 ± 0.5) than for red galaxies (n = 1.3 ± 0.4), confirming previous results. Integrating the merger rate over cosmic time, we find that the average number of mergers per galaxy since z = 1 is Nmred = 0.57 ± 0.05 for red galaxies and Nmblue = 0.26 ± 0.02 for blue galaxies. Conclusions: Our new methodology statistically exploits all the available information provided by photometric redshift codes and yields accurate measurements of the merger fraction by close pairs using photometric redshifts alone. Current and future photometric surveys will benefit from this new methodology. Based on observations collected at the German-Spanish Astronomical Center, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie (MPIA) at Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC). The catalogues, probabilities, and figures of the ALHAMBRA close pairs detected in Sect. 5.1 are available at https://cloud.iaa.csic.es/alhambra/catalogues/ClosePairs

  16. Weapon Simulator Test Methodology Investigation: Comparison of Live Fire and Weapon Simulator Test Methodologies and the Effects of Clothing and Individual Equipment on Marksmanship

    DTIC Science & Technology

    2016-09-15

Final report; dates covered: October 2014 – August 2015.

  17. Predicting Failure Progression and Failure Loads in Composite Open-Hole Tension Coupons

    NASA Technical Reports Server (NTRS)

    Arunkumar, Satyanarayana; Przekop, Adam

    2010-01-01

Failure types and failure loads in carbon-epoxy [45n/90n/-45n/0n]ms laminate coupons with central circular holes subjected to tensile load are simulated using a progressive failure analysis (PFA) methodology. The progressive failure methodology is implemented using a VUMAT subroutine within the ABAQUS(TradeMark)/Explicit nonlinear finite element code. The degradation model adopted in the present PFA methodology uses an instantaneous complete stress reduction (COSTR) approach to simulate damage at a material point when failure occurs. In-plane modeling parameters such as element size and shape are held constant in the finite element models, irrespective of laminate thickness and hole size, to predict failure loads and failure progression. Comparison with published test data indicates that this methodology accurately simulates brittle, pull-out, and delamination failure types. The sensitivity of the failure progression and the failure load to analytical loading rates and solver precision is demonstrated.

  18. Azithromycin treatment failure for Chlamydia trachomatis among heterosexual men with nongonococcal urethritis

    PubMed Central

    Kissinger, Patricia; White, Scott; Manhart, Lisa E.; Schwebke, Jane; Taylor, Stephanie N; Mena, Leandro; Khosropour, Christine M; Wilcox, Larissa; Schmidt, Norine; Martin, David H

    2016-01-01

Background Three recent prospective studies have suggested that the 1 g dose of azithromycin for Chlamydia trachomatis (Ct) was less effective than expected, reporting a wide range of treatment failure rates (5.8%–22.6%). Reasons for the disparate results could be attributed to geographic or methodological differences. The purpose of this study was to re-examine the studies and attempt to harmonize methodologies to reduce misclassification resulting from false positives from early test-of-cure (TOC) or from reinfection due to sexual exposure rather than treatment failure. Methods Men who have sex with women who received 1 g azithromycin under directly observed therapy (DOT) for presumptive treatment of nongonococcal urethritis (NGU) with confirmed Ct were included. Baseline screening was performed on urethral swabs or urine, and TOC screening was performed on urine using nucleic acid amplification tests (NAAT). Post-treatment vaginal sexual exposure was elicited at TOC. Data from the three studies were obtained and re-analyzed. Rates of positive Ct re-tests were examined for all cases, and a sensitivity analysis was conducted to either reclassify potential false positives/reinfections as negative or remove them from the analysis. Results The crude treatment failure rate was 12.8% (31/242). The rate was 6.2% (15/242) when potential false positives/reinfections were reclassified as negative, or 10.9% (15/138) when these were excluded from the analysis. Conclusion In these samples of men who have sex with women with Ct-related NGU, azithromycin treatment failure was between 6.2% and 12.8%. This range of failure is lower than previously published but higher than the World Health Organization's target chlamydia treatment failure rate of < 5%. PMID:27631353
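The three rates quoted in the abstract follow directly from the reported counts; a quick arithmetic check:

```python
# Recomputing the failure rates from the reported counts.
def pct(numer, denom):
    """Percentage rounded to one decimal place."""
    return round(100.0 * numer / denom, 1)

crude = pct(31, 242)          # all re-test positives counted as failures
reclassified = pct(15, 242)   # potential false positives/reinfections counted as negative
excluded = pct(15, 138)       # those cases removed from the denominator entirely
```

The reclassified and excluded analyses share the same numerator (15 remaining failures); only the treatment of the ambiguous cases in the denominator differs, which is why the two sensitivity estimates bracket different values.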

  19. Compendium of information on identification and testing of materials for plastic solar thermal collectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGinniss, V.D.; Sliemers, F.A.; Landstrom, D.K.

    1980-07-31

This report is intended to organize and summarize prior and current literature concerning the weathering, aging, durability, degradation, and testing methodologies as applied to materials for plastic solar thermal collectors. Topics covered include (1) rate of aging of polymeric materials; (2) environmental factors affecting performance; (3) evaluation and prediction of service life; (4) measurement of physical and chemical properties; (5) discussion of evaluation techniques and specific instrumentation; (6) degradation reactions and mechanisms; (7) weathering of specific polymeric materials; and (8) exposure testing methodology. Major emphasis has been placed on defining the current state of the art in plastics degradation and on identifying information that can be utilized in applying appropriate and effective aging tests for use in projecting service life of plastic solar thermal collectors. This information will also be of value where polymeric components are utilized in the construction of conventional solar collectors or any application where plastic degradation and weathering are prime factors in material selection.

  20. 77 FR 10767 - Rate Adjustments for Indian Irrigation Projects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-23

    ... Irrigation Project on the proposed rates about the following issues: (1) The methodology for O&M rate setting... BIA's responses are provided below. Comment: The BIA's methodology for setting the 2013 O&M assessment rate was unreasonable. Response: The methodology used by the BIA to determine the 2013 O&M assessment...

  1. CO2 Washout Testing Using Various Inlet Vent Configurations in the Mark-III Space Suit

    NASA Technical Reports Server (NTRS)

    Korona, F. Adam; Norcross, Jason; Conger, Bruce; Navarro, Moses

    2014-01-01

    Requirements for using a space suit during ground testing include providing adequate carbon dioxide (CO2) washout for the suited subject. Acute CO2 exposure can lead to symptoms including headache, dyspnea, lethargy and eventually unconsciousness or even death. Symptoms depend on several factors including inspired partial pressure of CO2 (ppCO2), duration of exposure, metabolic rate of the subject and physiological differences between subjects. Computational Fluid Dynamic (CFD) analysis has predicted that the configuration of the suit inlet vent has a significant effect on oronasal CO2 concentrations. The main objective of this test is to characterize inspired oronasal ppCO2 for a variety of inlet vent configurations in the Mark-III space suit across a range of workload and flow rates. As a secondary objective, results will be compared to the predicted CO2 concentrations and used to refine existing CFD models. These CFD models will then be used to help design an inlet vent configuration for the Z-2 space suit, which maximizes oronasal CO2 washout. This test has not been completed, but is planned for January 2014. The results of this test will be incorporated into this paper. The testing methodology used in this test builds upon past CO2 washout testing performed on the Z-1 suit, Rear Entry I-Suit (REI) and the Enhanced Mobility Advanced Crew Escape Suit (EM-ACES). Three subjects will be tested in the Mark-III space suit with each subject performing two test sessions to allow for comparison between tests. Six different helmet inlet vent configurations will be evaluated during each test session. Suit pressure will be maintained at 4.3 psid. Subjects will wear the suit while walking on a treadmill to generate metabolic workloads of approximately 2000 and 3000 BTU/hr. Supply airflow rates of 6 and 4 actual cubic feet per minute (ACFM) will be tested at each workload. 
Subjects will wear an oronasal mask with an open port in front of the mouth and will be allowed to breathe freely. Oronasal ppCO2 will be monitored real-time via gas analyzers with sampling tubes connected to the oronasal mask. Metabolic rate will be calculated from the total oxygen consumption and CO2 production measured by additional gas analyzers at the air outlet from the suit. Real-time metabolic rate measurements will be used to adjust the treadmill workload to meet target metabolic rates. This paper provides detailed descriptions of the test hardware, methodology and results, as well as implications for future inlet vent design and ground testing in the Mark-III.
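The abstract states that metabolic rate is calculated from measured oxygen consumption and CO2 production but does not give the formula. The sketch below assumes the standard Weir equation for indirect calorimetry plus a kcal-to-BTU conversion, purely for illustration; the function name and the example gas-exchange values are hypothetical.

```python
# Hypothetical sketch of the metabolic-rate calculation: the paper does not
# state its formula, so this assumes the standard Weir equation for
# indirect calorimetry as a stand-in.
KCAL_TO_BTU = 3.968  # approximate conversion factor

def metabolic_rate_btu_per_hr(vo2_l_min, vco2_l_min):
    """Weir equation: energy expenditure from O2 uptake and CO2 output (L/min)."""
    kcal_per_min = 3.941 * vo2_l_min + 1.106 * vco2_l_min
    return kcal_per_min * 60.0 * KCAL_TO_BTU

# e.g. ~1.7 L/min O2 uptake at an assumed respiratory quotient of 0.85,
# which lands near the 2000 BTU/hr workload target mentioned above
rate = metabolic_rate_btu_per_hr(1.7, 1.7 * 0.85)
```

In the protocol described, a real-time version of this calculation drives the treadmill adjustment loop that holds subjects at the target workloads.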

  2. Computer simulation of thermal modeling of primary lithium cells

    NASA Technical Reports Server (NTRS)

    Young, I. Cho; Frank, Harvey; Halpert, Gersid

    1987-01-01

The objective was to gain a better understanding of the safety problem of primary Li-SOCl2 and Li-SO2 cells by carrying out detailed thermal modeling work. In particular, the transient heat generation rates during moderate and extremely high discharge rate tests of Li-SOCl2 cells were predicted and compared with those from the electrochemical heating. The difference between the two may be attributed to lithium corrosion and other chemical reactions. The present program was also tested for charging of Li-SO2. In addition, the present methodology should be applicable to other primary cylindrical cells as well as to rechargeable battery analyses with minor modifications.

  3. 78 FR 4369 - Rates for Interstate Inmate Calling Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-22

    .... Marginal Location Methodology. In 2008, ICS providers submitted the ICS Provider Proposal for ICS rates. The ICS Provider Proposal uses the ``marginal location'' methodology, previously adopted by the... ``marginal location'' methodology provides a ``basis for rates that represent `fair compensation' as set...

  4. Life Limiting Behavior in Interlaminar Shear of Continuous Fiber-Reinforced Ceramic Matrix Composites at Elevated Temperatures

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Calomino, Anthony M.; Bansal, Narottam P.; Verrilli, Michael J.

    2006-01-01

Interlaminar shear strength of four different fiber-reinforced ceramic matrix composites was determined with double-notch shear test specimens as a function of test rate at elevated temperatures ranging from 1100 to 1316 C in air. Life limiting behavior, represented as interlaminar shear strength degradation with decreasing test rate, was significant for 2-D crossplied SiC/MAS-5 and 2-D plain-woven C/SiC composites, but insignificant for 2-D plain-woven SiC/SiC and 2-D woven Sylramic (Dow Corning, Midland, Michigan) SiC/SiC composites. A phenomenological, power-law delayed failure model was proposed to account for and quantify the rate dependency of interlaminar shear strength of the composites. Additional stress rupture testing in interlaminar shear was conducted at elevated temperatures to validate the proposed model. The model was in good agreement with the SiC/MAS-5 and C/SiC composites, but in poor to reasonable agreement with Sylramic SiC/SiC. Constant shear stress-rate testing was proposed as a possible life prediction testing methodology for ceramic matrix composites subjected to interlaminar shear at elevated temperatures when short lifetimes are expected.
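In power-law delayed-failure (constant stress-rate) analysis, strength scales with applied stress rate as strength ∝ (rate)^(1/(n+1)), so the life-prediction exponent n falls out of a log-log fit. A minimal sketch on noiseless synthetic data (the rates, strengths, and n value are illustrative, not the composites' measured values):

```python
import numpy as np

# Constant stress-rate (dynamic fatigue) analysis: strength scales as
# (stress rate)^(1/(n+1)), so n is recovered from the log-log slope.
def scg_exponent(stress_rates, strengths):
    slope, _intercept = np.polyfit(np.log10(stress_rates), np.log10(strengths), 1)
    return 1.0 / slope - 1.0

# Synthetic data with a known n = 19 (no scatter, for illustration).
n_true = 19.0
rates = np.array([0.001, 0.1, 10.0, 1000.0])            # stress rates, MPa/s
strengths = 400.0 * rates ** (1.0 / (n_true + 1.0))     # MPa
n_est = scg_exponent(rates, strengths)
```

With real scatter the fit would carry uncertainty, and a large n (shallow slope) indicates weak rate sensitivity, i.e. the "insignificant" degradation reported for the SiC/SiC composites.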

  5. Optimization of personalized therapies for anticancer treatment.

    PubMed

    Vazquez, Alexei

    2013-04-12

As of today, there are hundreds of targeted therapies for the treatment of cancer, many of which have companion biomarkers that are used to inform treatment decisions. If we were to consider this whole arsenal of targeted therapies as a treatment option for every patient, we would very soon reach a scenario where each patient is positive for several markers, suggesting treatment with several targeted therapies. Given the documented side effects of anticancer drugs, it is clear that such a strategy is unfeasible. Here, we propose a strategy that optimizes the design of combinatorial therapies to achieve the best response rates with minimal toxicity. In this methodology, markers are assigned to drugs such that we achieve a high overall response rate while using personalized combinations of minimal size. We tested this methodology on an in silico cancer patient cohort, constructed from in vitro data for 714 cell lines and 138 drugs reported by the Sanger Institute. Our analysis indicates that, even in the context of personalized medicine, combinations of three or more drugs are required to achieve high response rates. Furthermore, patient-to-patient variations in pharmacokinetics have a significant impact on the overall response rate: a 10-fold increase in pharmacokinetic variation resulted in a significant drop in the overall response rate. The design of optimal combinatorial therapy for anticancer treatment requires a transition from the one-drug/one-biomarker approach to global strategies that simultaneously assign markers to a catalog of drugs. The methodology reported here provides a framework to achieve this transition.
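The marker-to-drug assignment problem can be caricatured as set cover: each drug's marker identifies a set of predicted responders, and the goal is to cover the cohort with few drugs. A hedged greedy sketch (the paper's actual optimization is not specified in the abstract; drug names and patient sets below are invented):

```python
# Hypothetical greedy sketch of the marker->drug assignment idea: at each
# step, pick the drug whose marker covers the most still-uncovered patients.
def greedy_drug_panel(responders_by_drug, n_drugs):
    """responders_by_drug: dict drug -> set of patient ids predicted to respond."""
    covered, panel = set(), []
    for _ in range(n_drugs):
        drug = max(responders_by_drug,
                   key=lambda d: len(responders_by_drug[d] - covered))
        gain = responders_by_drug[drug] - covered
        if not gain:                       # no drug adds new responders
            break
        panel.append(drug)
        covered |= gain
    return panel, covered

cohort = {
    "drugA": {1, 2, 3, 4},
    "drugB": {4, 5, 6},
    "drugC": {6, 7},
}
panel, covered = greedy_drug_panel(cohort, 2)
```

Greedy set cover is a standard approximation; the abstract's conclusion that three or more drugs are needed corresponds to the panel size required before the marginal gain drops off.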

  6. Are mind wandering rates an artifact of the probe-caught method? Using self-caught mind wandering in the classroom to test, and reject, this possibility.

    PubMed

    Varao-Sousa, Trish L; Kingstone, Alan

    2018-06-26

    Mind wandering (MW) reports often rely on individuals responding to specific external thought probes. Researchers have used this probe-caught method almost exclusively, due to its reliability across a wide range of testing situations. However, it remains an open question whether the probe-caught MW rates in more complex settings converge with those for simpler tasks, because of the rather artificial and controlled nature of the probe-caught methodology itself, which is shared across the different settings. To address this issue, we measured MW in a real-world lecture, during which students indicated whether they were mind wandering by simply catching themselves (as one would normally do in real life) or by catching themselves and responding to thought probes. Across three separate lectures, self-caught MW reports were stable and unaffected by the inclusion of MW probes. That the probe rates were similar to those found in prior classroom research and did not affect the self-caught MW rates strongly suggests that the past consistency of probe-caught MW rates across a range of different settings is not an artifact of the thought-probe method. Our study also indicates that the self-caught MW methodology is a reliable way to acquire MW data. The extension of measurement techniques to include students' self-caught reports provides valuable information about how to successfully and naturalistically monitor MW in lecture settings, outside the laboratory.

  7. Optimal Rating Procedures and Methodology for NAEP Open- Ended Items. Working Paper Series.

    ERIC Educational Resources Information Center

    Patz, Richard J.; Wilson, Mark; Hoskens, Machteld

    The National Assessment of Educational Progress (NAEP) collects data in the form of repeated, discrete measures (test items) with hierarchical structure for both measures and subjects, that is complex by any standard. This complexity has been managed through a "divide and conquer" approach of isolating and evaluating sources of…

  8. Identifying treatment effect heterogeneity in clinical trials using subpopulations of events: STEPP.

    PubMed

    Lazar, Ann A; Bonetti, Marco; Cole, Bernard F; Yip, Wai-Ki; Gelber, Richard D

    2016-04-01

    Investigators conducting randomized clinical trials often explore treatment effect heterogeneity to assess whether treatment efficacy varies according to patient characteristics. Identifying heterogeneity is central to making informed personalized healthcare decisions. Treatment effect heterogeneity can be investigated using subpopulation treatment effect pattern plot (STEPP), a non-parametric graphical approach that constructs overlapping patient subpopulations with varying values of a characteristic. Procedures for statistical testing using subpopulation treatment effect pattern plot when the endpoint of interest is survival remain an area of active investigation. A STEPP analysis was used to explore patterns of absolute and relative treatment effects for varying levels of a breast cancer biomarker, Ki-67, in the phase III Breast International Group 1-98 randomized clinical trial, comparing letrozole to tamoxifen as adjuvant therapy for postmenopausal women with hormone receptor-positive breast cancer. Absolute treatment effects were measured by differences in 4-year cumulative incidence of breast cancer recurrence, while relative effects were measured by the subdistribution hazard ratio in the presence of competing risks using O-E (observed-minus-expected) methodology, an intuitive non-parametric method. While estimation of hazard ratio values based on O-E methodology has been shown, a similar development for the subdistribution hazard ratio has not. Furthermore, we observed that the subpopulation treatment effect pattern plot analysis may not produce results, even with 100 patients within each subpopulation. After further investigation through simulation studies, we observed inflation of the type I error rate of the traditional test statistic and sometimes singular variance-covariance matrix estimates that may lead to results not being produced. 
This is due to the lack of sufficient number of events within the subpopulations, which we refer to as instability of the subpopulation treatment effect pattern plot analysis. We introduce methodology designed to improve stability of the subpopulation treatment effect pattern plot analysis and generalize O-E methodology to the competing risks setting. Simulation studies were designed to assess the type I error rate of the tests for a variety of treatment effect measures, including subdistribution hazard ratio based on O-E estimation. This subpopulation treatment effect pattern plot methodology and standard regression modeling were used to evaluate heterogeneity of Ki-67 in the Breast International Group 1-98 randomized clinical trial. We introduce methodology that generalizes O-E methodology to the competing risks setting and that improves stability of the STEPP analysis by pre-specifying the number of events across subpopulations while controlling the type I error rate. The subpopulation treatment effect pattern plot analysis of the Breast International Group 1-98 randomized clinical trial showed that patients with high Ki-67 percentages may benefit most from letrozole, while heterogeneity was not detected using standard regression modeling. The STEPP methodology can be used to study complex patterns of treatment effect heterogeneity, as illustrated in the Breast International Group 1-98 randomized clinical trial. For the subpopulation treatment effect pattern plot analysis, we recommend a minimum of 20 events within each subpopulation. © The Author(s) 2015.
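STEPP's overlapping subpopulations can be pictured as sliding windows over the covariate-sorted patient list. A minimal illustration (the window and step sizes are arbitrary assumptions, not the trial's settings):

```python
# Sketch of STEPP's sliding-window construction: overlapping subpopulations
# along a sorted covariate (Ki-67-like values), each window holding a fixed
# number of patients and advancing by a fixed step.
def stepp_subpopulations(values, window=100, step=40):
    order = sorted(range(len(values)), key=lambda i: values[i])
    subpops, start = [], 0
    while start + window <= len(order):
        subpops.append(order[start:start + window])
        start += step
    # ensure the largest covariate values are covered by a final window
    if subpops and subpops[-1][-1] != order[-1]:
        subpops.append(order[-window:])
    return subpops

vals = list(range(300))                      # 300 patients, already sorted
pops = stepp_subpopulations(vals, window=100, step=40)
```

Fixing the number of *patients* per window, as here, is exactly what can leave too few *events* per window; the stabilized methodology described above instead pre-specifies the event count per subpopulation.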

  9. Considerations for Assessing Maximum Critical Temperatures in Small Ectothermic Animals: Insights from Leaf-Cutting Ants

    PubMed Central

    Ribeiro, Pedro Leite; Camacho, Agustín; Navas, Carlos Arturo

    2012-01-01

The thermal limits of individual animals were originally proposed as a link between animal physiology and thermal ecology. Although this link is valid in theory, the evaluation of physiological tolerances involves some problems that are the focus of this study. One rationale was that heating rates should influence upper critical limits, so that ecological thermal limits need to consider experimental heating rates. In addition, if thermal limits are not surpassed in experiments, subsequent tests of the same individual should yield similar results or produce evidence of hardening. Finally, several non-controlled variables such as time under experimental conditions and procedures may affect results. To analyze these issues we conducted an integrative study of upper critical temperatures in a single species, the ant Atta sexdens rubropilosa, an animal model providing large numbers of individuals of diverse sizes but similar genetic makeup. Our specific aims were to test 1) the influence of heating rates in the experimental evaluation of upper critical temperature, 2) the assumptions of absence of physical damage and reproducibility, and 3) sources of variance often overlooked in the thermal-limits literature; and 4) to introduce some experimental approaches that may help researchers to separate physiological and methodological issues. The upper thermal limits were influenced by both heating rates and body mass. In the latter case, the effect was physiological rather than methodological. The critical temperature decreased during subsequent tests performed on the same individual ants, even one week after the initial test. Accordingly, upper thermal limits may have been overestimated by our (and typical) protocols. Heating rates, body mass, procedures independent of temperature and other variables may affect the estimation of upper critical temperatures.
Therefore, based on our data, we offer suggestions to enhance the quality of measurements, and offer recommendations to authors aiming to compile and analyze databases from the literature. PMID:22384147

  10. A methodology to condition distorted acoustic emission signals to identify fracture timing from human cadaver spine impact tests.

    PubMed

    Arun, Mike W J; Yoganandan, Narayan; Stemper, Brian D; Pintar, Frank A

    2014-12-01

While studies have used acoustic sensors to determine fracture initiation time in biomechanical studies, a systematic procedure for processing the acoustic signals has not been established. The objective of this study was to develop a methodology for conditioning distorted acoustic emission data using signal processing techniques to identify fracture initiation time. The methodology was developed from testing a human cadaver lumbar spine column. Acoustic sensors were glued to all vertebrae, high-rate impact loading was applied, load-time histories were recorded (load cell), and fracture was documented using CT. A compression fracture occurred at L1 while the other vertebrae were intact. FFTs of the raw voltage-time traces were used to determine an optimum frequency range associated with high decibel levels. Signals were bandpass filtered in this range. A bursting pattern was found in the fractured vertebra while signals from the other vertebrae were silent. The bursting time was associated with the time of fracture initiation. Force at fracture was determined using this time and the force-time data. The methodology is independent of selecting parameters a priori, such as fixing a voltage level(s) or bandpass frequency and/or using the force-time signal, and allows determination of force based on the time identified during signal processing. The methodology can be used for different body regions in cadaver experiments. Copyright © 2014 Elsevier Ltd. All rights reserved.
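The conditioning chain described (FFT to locate a high-energy band, bandpass filtering, then burst-onset detection) can be sketched as follows. The band, threshold, sampling rate, and synthetic trace are illustrative assumptions, not the study's values, and the brick-wall FFT mask stands in for whatever filter the authors used:

```python
import numpy as np

# Sketch of the signal-conditioning steps: bandpass by masking the spectrum,
# then take the first threshold crossing of the filtered trace as burst onset.
def burst_onset(signal, fs, band, threshold):
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])       # brick-wall bandpass
    filtered = np.fft.irfft(spec * mask, n=len(signal))
    above = np.flatnonzero(np.abs(filtered) > threshold)
    return above[0] / fs if above.size else None

fs = 10000.0
t = np.arange(0, 0.1, 1.0 / fs)                          # 1000 samples
sig = 0.05 * np.sin(2 * np.pi * 50 * t)                  # low-frequency background
sig[600:900] += np.sin(2 * np.pi * 2000 * t[600:900])    # "fracture" burst at 60 ms
onset = burst_onset(sig, fs, band=(1500.0, 2500.0), threshold=0.5)
```

On this synthetic trace the low-frequency background is rejected by the mask and the detected onset lands at the start of the 2 kHz burst, after which the force at fracture would be read off the load-cell trace at that time.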

  11. Analytical solutions for efficient interpretation of single-well push-pull tracer tests

    NASA Astrophysics Data System (ADS)

    Huang, Junqi; Christ, John A.; Goltz, Mark N.

    2010-08-01

    Single-well push-pull tracer tests have been used to characterize the extent, fate, and transport of subsurface contamination. Analytical solutions provide one alternative for interpreting test results. In this work, an exact analytical solution to two-dimensional equations describing the governing processes acting on a dissolved compound during a modified push-pull test (advection, longitudinal and transverse dispersion, first-order decay, and rate-limited sorption/partitioning in steady, divergent, and convergent flow fields) is developed. The coupling of this solution with inverse modeling to estimate aquifer parameters provides an efficient methodology for subsurface characterization. Synthetic data for single-well push-pull tests are employed to demonstrate the utility of the solution for determining (1) estimates of aquifer longitudinal and transverse dispersivities, (2) sorption distribution coefficients and rate constants, and (3) non-aqueous phase liquid (NAPL) saturations. Employment of the solution to estimate NAPL saturations based on partitioning and non-partitioning tracers is designed to overcome limitations of previous efforts by including rate-limited mass transfer. This solution provides a new tool for use by practitioners when interpreting single-well push-pull test results.
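For the partitioning-tracer application mentioned above, the classical equilibrium relation links the retardation of a partitioning tracer relative to a conservative one to the NAPL saturation. A minimal sketch of that relation only (the paper's contribution is to add rate-limited mass transfer, which this deliberately omits; the travel times and partition coefficient are invented):

```python
# Classical equilibrium partitioning-tracer estimate of NAPL saturation:
# R = ratio of partitioning to conservative tracer mean travel times,
# S_N = (R - 1) / (R - 1 + K_nw).
def napl_saturation(t_partitioning, t_conservative, K_nw):
    """S_N from tracer mean travel times and the NAPL-water partition coefficient."""
    R = t_partitioning / t_conservative   # retardation factor
    return (R - 1.0) / (R - 1.0 + K_nw)

S = napl_saturation(t_partitioning=30.0, t_conservative=24.0, K_nw=5.0)
```

When mass transfer is rate-limited, the partitioning tracer's apparent retardation understates the true value, which is why this equilibrium estimate biases S_N low and motivates the kinetic treatment in the paper.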

  12. Borehole flowmeter logging for the accurate design and analysis of tracer tests.

    PubMed

    Basiricò, Stefano; Crosta, Giovanni B; Frattini, Paolo; Villa, Alberto; Godio, Alberto

    2015-04-01

    Tracer tests often give ambiguous interpretations that may be due to the erroneous location of sampling points and/or the lack of flow rate measurements through the sampler. To obtain more reliable tracer test results, we propose a methodology that optimizes the design and analysis of tracer tests in a cross borehole mode by using vertical borehole flow rate measurements. Experiments using this approach, herein defined as the Bh-flow tracer test, have been performed by implementing three sequential steps: (1) single-hole flowmeter test, (2) cross-hole flowmeter test, and (3) tracer test. At the experimental site, core logging, pumping tests, and static water-level measurements were previously carried out to determine stratigraphy, fracture characteristics, and bulk hydraulic conductivity. Single-hole flowmeter testing makes it possible to detect the presence of vertical flows as well as inflow and outflow zones, whereas cross-hole flowmeter testing detects the presence of connections along sets of flow conduits or discontinuities intercepted by boreholes. Finally, the specific pathways and rates of groundwater flow through selected flowpaths are determined by tracer testing. We conclude that the combined use of single and cross-borehole flowmeter tests is fundamental to the formulation of the tracer test strategy and interpretation of the tracer test results. © 2014, National Ground Water Association.

  13. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals.

    PubMed

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-12-01

    Survival analysis methods have gained widespread use in the filed of oncology. For achievement of reliable results, the methodological process and report quality is crucial. This review provides the first examination of methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals.To examine methodological and reporting quality of survival analysis, to identify some common deficiencies, to desirable precautions in the analysis, and relate advice for authors, readers, and editors.A total of 242 survival analysis articles were included to be evaluated from 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for proper use and reporting of survival analysis.The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively, no article used the parametric method for survival analysis. Multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 articles were under 80% and the lowest was 75.25%, 55 articles were100%. The report rates of all types of survival endpoint were lower than 10%. Eleven of 100 articles which reported a loss to follow-up had stated how to treat it in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks for proportional hazard assumption; no report of testing for interactions and collinearity between independent variables; no report of calculation method of sample size. Thirty-six articles (32.74%) reported the methods of independent variable selection. 
These defects could make the reported results inaccurate, misleading, or difficult to interpret. There are gaps in the conduct and reporting of survival analysis in studies published in Chinese oncology journals, and some deficiencies were severe. Wider endorsement by journals of reporting guidelines for survival analysis may improve article quality and the dissemination of reliable evidence to oncology clinicians. We recommend that authors, readers, reviewers, and editors consider survival analysis more carefully and cooperate more closely with statisticians and epidemiologists. Copyright © 2017 The Authors. Published by Wolters Kluwer Health, Inc. All rights reserved.
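
    Editor's note: the Kaplan-Meier method evaluated above (used in 91.74% of the reviewed articles) can be sketched in a few lines. The cohort data below are hypothetical, invented purely to illustrate the estimator; they are not from the review.

```python
# Minimal Kaplan-Meier estimator, sketched for illustration only;
# the cohort below is hypothetical, not data from the review.
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns [(t, S(t))] evaluated at each distinct event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    survival = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = n_at_t = 0
        while i < len(order) and times[order[i]] == t:  # group tied times
            deaths += events[order[i]]
            n_at_t += 1
            i += 1
        if deaths:
            survival *= 1.0 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= n_at_t
    return curve

# Hypothetical cohort: follow-up in months, 0 marks a censored patient.
times = [6, 7, 10, 15, 19, 25]
events = [1, 0, 1, 1, 0, 1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

    Censored patients leave the risk set without stepping the survival curve down, which is exactly the property the parametric alternatives mentioned in the review trade away for smoothness.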

  14. Test-Retest Reliability of Pediatric Heart Rate Variability: A Meta-Analysis.

    PubMed

    Weiner, Oren M; McGrath, Jennifer J

    2017-01-01

    Heart rate variability (HRV), an established index of autonomic cardiovascular modulation, is associated with health outcomes (e.g., obesity, diabetes) and mortality risk. Time- and frequency-domain HRV measures are commonly reported in longitudinal adult and pediatric studies of health. While test-retest reliability has been established among adults, less is known about the psychometric properties of HRV among infants, children, and adolescents. The objective was to conduct a meta-analysis of the test-retest reliability of time- and frequency-domain HRV measures from infancy to adolescence. Electronic searches (PubMed, PsycINFO; January 1970-December 2014) identified studies with nonclinical samples aged ≤ 18 years; ≥ 2 baseline HRV recordings separated by ≥ 1 day; and sufficient data for effect size computation. Forty-nine studies (N = 5,170) met inclusion criteria. Methodological variables coded included factors relevant to study protocol, sample characteristics, electrocardiogram (ECG) signal acquisition and preprocessing, and HRV analytical decisions. Fisher's Z was derived as the common effect size. Analyses were age-stratified (infant/toddler < 5 years, n = 3,329; child/adolescent 5-18 years, n = 1,841) due to marked methodological differences across the pediatric literature. Meta-analytic results revealed HRV demonstrated moderate reliability; child/adolescent studies (Z = 0.62, r = 0.55) had significantly higher reliability than infant/toddler studies (Z = 0.42, r = 0.40). Relative to other reported measures, HF exhibited the highest reliability among infant/toddler studies (Z = 0.42, r = 0.40), while rMSSD exhibited the highest reliability among child/adolescent studies (Z = 1.00, r = 0.76). 
Moderator analyses indicated greater reliability with shorter test-retest interval length, reported exclusion criteria based on medical illness/condition, lower proportion of males, prerecording acclimatization period, and longer recording duration; differences were noted across age groups. HRV is reliable among pediatric samples. Reliability is sensitive to pertinent methodological decisions that require careful consideration by the researcher. Limited methodological reporting precluded several a priori moderator analyses. Suggestions for future research, including standards specified by Task Force Guidelines, are discussed.
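
    Editor's note: the abstract reports each effect size both as Fisher's Z and as a correlation r; the two are linked by the standard transform r = tanh(Z), Z = atanh(r). A quick check of the reported pairs (the formulas are standard; only the pairings are taken from the abstract):

```python
import math

def z_to_r(z):
    """Back-transform a Fisher's Z effect size to a correlation r."""
    return math.tanh(z)

def r_to_z(r):
    """Fisher's variance-stabilizing transform of a correlation r."""
    return math.atanh(r)

# (Z, r) pairs as reported in the abstract above.
for z, r in [(0.62, 0.55), (0.42, 0.40), (1.00, 0.76)]:
    print(z, round(z_to_r(z), 2), r)  # recomputed r matches reported r
```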

  15. Test-Retest Reliability of Pediatric Heart Rate Variability

    PubMed Central

    Weiner, Oren M.; McGrath, Jennifer J.

    2017-01-01

    Heart rate variability (HRV), an established index of autonomic cardiovascular modulation, is associated with health outcomes (e.g., obesity, diabetes) and mortality risk. Time- and frequency-domain HRV measures are commonly reported in longitudinal adult and pediatric studies of health. While test-retest reliability has been established among adults, less is known about the psychometric properties of HRV among infants, children, and adolescents. The objective was to conduct a meta-analysis of the test-retest reliability of time- and frequency-domain HRV measures from infancy to adolescence. Electronic searches (PubMed, PsycINFO; January 1970–December 2014) identified studies with nonclinical samples aged ≤ 18 years; ≥ 2 baseline HRV recordings separated by ≥ 1 day; and sufficient data for effect size computation. Forty-nine studies (N = 5,170) met inclusion criteria. Methodological variables coded included factors relevant to study protocol, sample characteristics, electrocardiogram (ECG) signal acquisition and preprocessing, and HRV analytical decisions. Fisher’s Z was derived as the common effect size. Analyses were age-stratified (infant/toddler < 5 years, n = 3,329; child/adolescent 5–18 years, n = 1,841) due to marked methodological differences across the pediatric literature. Meta-analytic results revealed HRV demonstrated moderate reliability; child/adolescent studies (Z = 0.62, r = 0.55) had significantly higher reliability than infant/toddler studies (Z = 0.42, r = 0.40). Relative to other reported measures, HF exhibited the highest reliability among infant/toddler studies (Z = 0.42, r = 0.40), while rMSSD exhibited the highest reliability among child/adolescent studies (Z = 1.00, r = 0.76). 
Moderator analyses indicated greater reliability with shorter test-retest interval length, reported exclusion criteria based on medical illness/condition, lower proportion of males, prerecording acclimatization period, and longer recording duration; differences were noted across age groups. HRV is reliable among pediatric samples. Reliability is sensitive to pertinent methodological decisions that require careful consideration by the researcher. Limited methodological reporting precluded several a priori moderator analyses. Suggestions for future research, including standards specified by Task Force Guidelines, are discussed. PMID:29307951

  16. Recession Curve Generation for the Space Shuttle Solid Rocket Booster Thermal Protection System Coatings

    NASA Technical Reports Server (NTRS)

    Kanner, Howard S.; Stuckey, C. Irvin; Davis, Darrell W.; Davis, Darrell (Technical Monitor)

    2002-01-01

    Ablatable Thermal Protection System (TPS) coatings are used on the Space Shuttle Vehicle Solid Rocket Boosters to protect the aluminum structure from excessive temperatures. The methodology used to characterize the recession of such materials is outlined. Details of the tests, including the facility, the test articles, and test article processing, are also presented. The recession rates are collapsed into an empirical power-law relation, and a design curve is defined by applying a 95th-percentile Student's t distribution to the nominal results. Actual test results are presented for the acreage TPS material currently in use.
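
    Editor's note: collapsing recession data into an empirical power law typically amounts to a least-squares fit in log-log space. The sketch below uses synthetic data to show the mechanics; the actual SRB fitting constants and the design-curve statistics are not reproduced here.

```python
import math

def fit_power_law(x, y):
    """Least-squares fit of y = a * x**b via linear regression on (ln x, ln y).
    Returns (a, b). Data used here are synthetic, not SRB test values."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx = sum(lx) / n
    my = sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / \
        sum((u - mx) ** 2 for u in lx)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic check: data generated from r = 2.0 * q**0.8 are recovered exactly.
q = [1.0, 2.0, 4.0, 8.0]
r = [2.0 * v ** 0.8 for v in q]
a, b = fit_power_law(q, r)
print(round(a, 3), round(b, 3))
```

    A design curve then shifts this nominal fit by a one-sided tolerance factor; the Student's t percentile mentioned in the abstract plays that role.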

  17. Statistical reporting inconsistencies in experimental philosophy

    PubMed Central

    Colombo, Matteo; Duev, Georgi; Nuijten, Michèle B.; Sprenger, Jan

    2018-01-01

    Experimental philosophy (x-phi) is a young field of research at the intersection of philosophy and psychology. It aims to make progress on philosophical questions by using experimental methods traditionally associated with the psychological and behavioral sciences, such as null hypothesis significance testing (NHST). Motivated by recent discussions about a methodological crisis in the behavioral sciences, questions have been raised about the methodological standards of x-phi. Here, we focus on one aspect of this question, namely the rate of inconsistencies in statistical reporting. Previous research has examined the extent to which published articles in psychology and other behavioral sciences present statistical inconsistencies in reporting the results of NHST. In this study, we used the R package statcheck to detect statistical inconsistencies in x-phi, and compared rates of inconsistencies in psychology and philosophy. We found that rates of inconsistencies in x-phi are lower than in the psychological and behavioral sciences. From the point of view of statistical reporting consistency, x-phi seems to do no worse, and perhaps even better, than psychological science. PMID:29649220
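
    Editor's note: statcheck works by recomputing the p-value from each reported test statistic and comparing it with the p-value as printed. A simplified illustration of that idea for z statistics only (the real package also handles t, F, chi-square, and r, and its tolerance rules for rounding are more nuanced); the reported values below are hypothetical:

```python
from statistics import NormalDist

def p_from_z(z):
    """Two-sided p-value for a reported z statistic."""
    return 2.0 * (1.0 - NormalDist().cdf(abs(z)))

def consistent(z, reported_p, tol=0.01):
    """Flag whether a reported p matches the p recomputed from z,
    in the spirit of statcheck (which covers more test families)."""
    return abs(p_from_z(z) - reported_p) <= tol

# Hypothetical reported results: (z, p as printed in an article).
print(consistent(1.96, 0.05))   # True: p(1.96) is ~0.050
print(consistent(1.96, 0.20))   # False: inconsistent report
```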

  18. Comparison of different methods for extraction and purification of human Papillomavirus (HPV) DNA from serum samples

    NASA Astrophysics Data System (ADS)

    Azizah, N.; Hashim, U.; Nadzirah, Sh.; Arshad, M. K. Md; Ruslinda, A. R.; Gopinath, Subash C. B.

    2017-03-01

    The sensitivity and reliability of PCR for diagnostic and research purposes require efficient, unbiased procedures for the extraction and purification of nucleic acids. One of the major limitations of PCR-based tests is inhibition of the amplification process by substances present in clinical samples. This study compares different techniques for the extraction and purification of viral DNA from serum samples in terms of recovery efficiency (yield of DNA), purity of the extracted DNA, and rate of inhibition. The best extraction methods for serum samples were the phenol/chloroform method and the silica-gel extraction procedure. With respect to DNA purity, the phenol/chloroform method produced the most satisfactory results in serum samples compared with the silica gel. The presence of inhibitors was overcome by all DNA extraction methods in serum samples, as evidenced by semiquantitative PCR amplification.

  19. Trip Energy Estimation Methodology and Model Based on Real-World Driving Data for Green Routing Applications: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holden, Jacob; Van Til, Harrison J; Wood, Eric W

    A data-informed model to predict energy use for a proposed vehicle trip has been developed in this paper. The methodology leverages nearly 1 million miles of real-world driving data to generate the estimation model. Driving is categorized at the sub-trip level by average speed, road gradient, and road network geometry, then aggregated by category. An average energy consumption rate is determined for each category, creating an energy rates look-up table. Proposed vehicle trips are then categorized in the same manner, and estimated energy rates are appended from the look-up table. The methodology is robust and applicable to almost any type of driving data. The model has been trained on vehicle global positioning system data from the Transportation Secure Data Center at the National Renewable Energy Laboratory and validated against on-road fuel consumption data from testing in Phoenix, Arizona. The estimation model has demonstrated an error range of 8.6% to 13.8%. The model results can be used to inform control strategies in routing tools, such as change in departure time, alternate routing, and alternate destinations to reduce energy consumption. This work provides a highly extensible framework that allows the model to be tuned to a specific driver or vehicle type.
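
    Editor's note: the categorize-aggregate-query pipeline described above can be sketched as a small look-up table. Bin widths, energy rates, and routes below are hypothetical placeholders, not NREL's categories or data.

```python
from collections import defaultdict

def build_rate_table(segments):
    """Average energy rate (e.g. kWh/mi) per (speed_bin, grade_bin) category.
    segments: (avg_speed_mph, grade_pct, energy_per_mile) tuples.
    The 15 mph / 2% bin widths are hypothetical."""
    def category(speed, grade):
        return (int(speed // 15), int(grade // 2))
    sums = defaultdict(lambda: [0.0, 0])
    for speed, grade, rate in segments:
        acc = sums[category(speed, grade)]
        acc[0] += rate
        acc[1] += 1
    return {cat: total / n for cat, (total, n) in sums.items()}, category

def estimate_trip(rate_table, category, route):
    """Sum miles * looked-up rate over (miles, avg_speed, grade) route legs."""
    return sum(miles * rate_table[category(speed, grade)]
               for miles, speed, grade in route)

# Hypothetical training segments and a proposed two-leg route.
training = [(30, 0, 0.30), (34, 0, 0.34), (65, 0, 0.25), (62, 2, 0.29)]
table, cat = build_rate_table(training)
trip = [(5.0, 32, 0), (10.0, 63, 0)]
print(round(estimate_trip(table, cat, trip), 2))
```

    A green-routing tool would evaluate `estimate_trip` for each candidate route and pick the cheapest; a production table would also need a fallback for categories never seen in training.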

  20. 76 FR 34270 - Federal-State Extended Benefits Program-Methodology for Calculating “on” or “off” Total...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ...--Methodology for Calculating ``on'' or ``off'' Total Unemployment Rate Indicators for Purposes of Determining...'' or ``off'' total unemployment rate (TUR) indicators to determine when extended benefit (EB) periods...-State Extended Benefits Program--Methodology for Calculating ``on'' or ``off'' Total Unemployment Rate...

  1. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....3.2, Mortgage Amortization Schedule Inputs 3-32, Loan Group Inputs for Mortgage Amortization... Prepayment Explanatory Variables F 3.6.3.5.2, Multifamily Default and Prepayment Inputs 3-38, Loan Group... Group inputs for Gross Loss Severity F 3.3.4, Interest Rates Outputs3.6.3.3.4, Mortgage Amortization...

  2. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....3.2, Mortgage Amortization Schedule Inputs 3-32, Loan Group Inputs for Mortgage Amortization... Prepayment Explanatory Variables F 3.6.3.5.2, Multifamily Default and Prepayment Inputs 3-38, Loan Group... Group inputs for Gross Loss Severity F 3.3.4, Interest Rates Outputs3.6.3.3.4, Mortgage Amortization...

  3. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ....3.2, Mortgage Amortization Schedule Inputs 3-32, Loan Group Inputs for Mortgage Amortization... Prepayment Explanatory Variables F 3.6.3.5.2, Multifamily Default and Prepayment Inputs 3-38, Loan Group... Group inputs for Gross Loss Severity F 3.3.4, Interest Rates Outputs3.6.3.3.4, Mortgage Amortization...

  4. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Test Methodology and Specifications

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ....3.2, Mortgage Amortization Schedule Inputs 3-32, Loan Group Inputs for Mortgage Amortization... Prepayment Explanatory Variables F 3.6.3.5.2, Multifamily Default and Prepayment Inputs 3-38, Loan Group... Group inputs for Gross Loss Severity F 3.3.4, Interest Rates Outputs3.6.3.3.4, Mortgage Amortization...

  5. Review of PCR methodology.

    DOT National Transportation Integrated Search

    1998-03-01

    This study was conducted to review the Pavement Condition Rating (PCR) methodology currently used by the Ohio DOT. The results of the literature search in this connection indicated that many highway agencies use a similar methodology to rate thei...

  6. Estimating breeding proportions and testing hypotheses about costs of reproduction with capture-recapture data

    USGS Publications Warehouse

    Nichols, James D.; Hines, James E.; Pollock, Kenneth H.; Hinz, Robert L.; Link, William A.

    1994-01-01

    The proportion of animals in a population that breeds is an important determinant of population growth rate. Usual estimates of this quantity from field sampling data assume that the probability of appearing in the capture or count statistic is the same for animals that do and do not breed. A similar assumption is required by most existing methods used to test ecologically interesting hypotheses about reproductive costs using field sampling data. However, in many field sampling situations breeding and nonbreeding animals are likely to exhibit different probabilities of being seen or caught. In this paper, we propose the use of multistate capture-recapture models for these estimation and testing problems. This methodology permits a formal test of the hypothesis of equal capture/sighting probabilities for breeding and nonbreeding individuals. Two estimators of breeding proportion (and associated standard errors) are presented, one for the case of equal capture probabilities and one for the case of unequal capture probabilities. The multistate modeling framework also yields formal tests of hypotheses about reproductive costs to future reproduction, survival, or both fitness components. The general methodology is illustrated using capture-recapture data on female meadow voles, Microtus pennsylvanicus. Resulting estimates of the proportion of reproductively active females showed strong seasonal variation, as expected, with low breeding proportions in midwinter. We found no evidence of reproductive costs exacted in subsequent survival or reproduction. We believe that this methodological framework has wide application to problems in animal ecology concerning breeding proportions and phenotypic reproductive costs.

  7. Controlled Social Interaction Tasks to Measure Self-Perceptions: No Evidence of Positive Illusions in Boys with ADHD.

    PubMed

    Jiang, Yuanyuan; Johnston, Charlotte

    2017-08-01

    Studies have suggested that children with Attention-Deficit/Hyperactivity Disorder (ADHD) possess a Positive Illusory Bias (PIB) where they have higher self-perceptions of competence than more objective measures of their competence. However, recent research calls into question the primary methodology of these studies, that is, difference scores. This study investigated the PIB in boys with ADHD within the social domain using a novel methodology that refrains from using difference scores. Eighty-one 8- to 12-year-old boys with and without ADHD completed social interaction tasks where their actual social performance was made comparable, allowing for tests of between-group differences in self-perceptions that do not rely on difference scores. In addition, to examine whether clarity of social feedback moderates the presence of the PIB, the social tasks presented unclear, clear positive, or clear negative feedback. Boys rated how well they performed in each social interaction task, and these ratings were compared between ADHD and non-ADHD groups. Compared to the non-ADHD group, boys with ADHD did not show a PIB in their ratings of performance on the social tasks. There also was no moderation of boys' ratings by type of feedback received. In contrast, when the PIB was calculated using difference scores based on child and parent ratings of child competence, boys with ADHD showed a PIB compared to boys without ADHD. These findings call attention to the need to re-examine the phenomenon of the PIB using methodologies outside of difference scores.

  8. Cell-Free DNA Analysis in Maternal Blood: Differences in Estimates between Laboratories with Different Methodologies Using a Propensity Score Approach.

    PubMed

    Bevilacqua, Elisa; Jani, Jacques C; Letourneau, Alexandra; Duiella, Silvia F; Kleinfinger, Pascale; Lohmann, Laurence; Resta, Serena; Cos Sanchez, Teresa; Fils, Jean-François; Mirra, Marilyn; Benachi, Alexandra; Costa, Jean-Marc

    2018-06-13

    To evaluate the failure rate and performance of cell-free DNA (cfDNA) testing, mainly in terms of detection rates for trisomy 21, performed by 2 laboratories using different analytical methods. cfDNA testing with available outcomes was performed on 2,870 pregnancies with the Harmony™ Prenatal Test, using the targeted digital analysis of selected regions (DANSR) method, and on 2,635 pregnancies with the "Cerba test," using the genome-wide massively parallel sequencing (GW-MPS) method. Propensity score analysis was used to match patients between the 2 groups, and the detection rates for trisomy 21 were compared between the 2 laboratories. In all, 2,811 patients in the Harmony group and 2,530 patients in the Cerba group had no trisomy 21, 18, or 13. After matching, comparison of patient characteristics indicated a higher no-result rate in the Harmony group (1.30%) than in the Cerba group (0.75%; p = 0.039). All 41 cases of trisomy 21 in the Harmony group and all 93 cases in the Cerba group were detected. Both methods of cfDNA testing showed low no-result rates and comparable performance in detecting trisomy 21, although GW-MPS had a slightly lower no-result rate than the DANSR method. © 2018 S. Karger AG, Basel.
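
    Editor's note: at its simplest, comparing two no-result rates is a two-proportion test. The sketch below uses counts implied by the reported rates (rounded) and a pooled z-test, so it illustrates the shape of the comparison rather than reproducing the study's propensity-matched analysis or its exact p-value.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference in proportions (pooled SE).
    Illustrative only; the study used propensity-score matching."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2.0 * (1.0 - NormalDist().cdf(abs(z)))

# No-result counts implied by the reported rates
# (about 1.30% of 2,811 vs 0.75% of 2,530), rounded to whole patients.
z, p = two_proportion_z(37, 2811, 19, 2530)
print(round(z, 2), round(p, 3))
```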

  9. Gas dynamic design of the pipe line compressor with 90% efficiency. Model test approval

    NASA Astrophysics Data System (ADS)

    Galerkin, Y.; Rekstin, A.; Soldatova, K.

    2015-08-01

    The gas dynamic design of a 32 MW pipeline compressor was made for PAO SMPO (Sumy, Ukraine). The technical specification required a compressor efficiency of 90%. The customer chose a favorable scheme: a single-stage design with a console (overhung) impeller and axial inlet. The authors used their standard optimization methodology for 2D impellers, and an original methodology of internal scroll profiling was used to minimize efficiency losses. The radically improved 5th version of the Universal Modeling Method computer programs was used for precise calculation of the expected performances. The customer performed model tests at 1:2 scale. The tests confirmed the calculated parameters at the design point (maximum efficiency of 90%) and over the whole range of flow rates. As far as the authors know, no comparable compressor has achieved such efficiency. The principles and methods of gas dynamic design are presented below. The data for the 32 MW compressor were presented by the customer in a report at the 16th International Compressor Conference (September 2014, Saint Petersburg) and later transferred to the authors.

  10. Measuring self-rated health status among resettled adult refugee populations to inform practice and policy - a scoping review.

    PubMed

    Dowling, Alison; Enticott, Joanne; Russell, Grant

    2017-12-08

    The health status of refugees is a significant factor in determining their success in resettlement and relies heavily on self-rated measures of refugee health. The selection of robust and appropriate self-rated health measurement tools is challenging due to the number and methodological variation in the use of assessment tools across refugee health studies. This study describes the existing self-report health measures which have been used in studies of adult refugees living in the community to allow us to address the challenges of selecting appropriate assessments to measure health within refugee groups. Electronic databases searched were Ovid Medline, CINAHL, Scopus, and Embase. This review identified 45 different self-rated health measurements in 183 studies. Most of the studies were cross sectional explorations of the mental health status of refugees living in community settings within Western nations. A third of the tools were designed specifically for use within refugee populations. More than half of the identified measurement tools have been evaluated for reliability and/or validity within refugee populations. Much variation was found in the selection, development and testing of measurement tools across the reviewed studies. This review shows that there are currently a number of reliable and valid tools available for use in refugee health research; however, further work is required to achieve consistency in the quality and in the use of these tools. Methodological guidelines are required to assist researchers and clinicians in the development and testing of self-rated health measurement tools for use in refugee research.

  11. Review of PCR methodology : executive summary.

    DOT National Transportation Integrated Search

    1998-01-01

    This study was conducted to review the Pavement Condition Rating (PCR) methodology currently used by the Ohio DOT. The results of the literature search in this connection indicated that many highway agencies use a similar methodology to rate thei...

  12. Investigating Dynamics of Eccentricity in Turbomachines

    NASA Technical Reports Server (NTRS)

    Baun, Daniel

    2010-01-01

    A methodology (and hardware and software to implement the methodology) has been developed as a means of investigating coupling between certain rotordynamic and hydrodynamic phenomena in turbomachines. Originally, the methodology was intended for application in an investigation of coupled rotordynamic and hydrodynamic effects postulated to have caused high synchronous vibration in the space shuttle's high-pressure oxygen turbopump (HPOTP). The methodology can also be applied in investigating (for the purpose of developing means of suppressing) undesired hydrodynamic rotor/stator interactions in turbomachines in general. The methodology and the types of phenomena that can be investigated by use of the methodology are best summarized by citing the original application as an example. In that application, in consideration of the high synchronous vibration in the space-shuttle main engine (SSME) HPOTP, it was determined to be necessary to perform tests to investigate the influence of inducer eccentricity and/or synchronous whirl motion on inducer hydrodynamic forces under prescribed flow and cavitation conditions. It was believed that manufacturing tolerances of the turbopump resulted in some induced runout of the pump rotor. Such runout, if oriented with an inducer blade, would cause that blade to run with tip clearance smaller than the tip clearances of the other inducer blades. It was hypothesized that the resulting hydraulic asymmetry, coupled with alternating blade cavitation, could give rise to the observed high synchronous vibration. In tests performed to investigate this hypothesis, prescribed rotor whirl motions have been imposed on a 1/3-scale water-rig version of the SSME LPOTP inducer (which is also a 4-bladed inducer with cavitation dynamics similar to those of the HPOTP) in a magnetic-bearing test facility. 
The particular magnetic-bearing test facility, through active vibration control, affords a capability to impose, on the rotor, whirl orbits having shapes and whirl rates prescribed by the user, and to simultaneously measure the resulting hydrodynamic forces generated by the impeller. Active control also made it possible to modulate the inducer-blade running tip clearance and consequently effect alternating blade cavitation. The measured hydraulic forces have been compared and correlated with shroud dynamic-pressure measurements.

  13. Teaching research methodology in medical schools: students' attitudes towards and knowledge about science.

    PubMed

    Hren, Darko; Lukić, Ivan Kresimir; Marusić, Ana; Vodopivec, Ivana; Vujaklija, Ana; Hrabak, Maja; Marusić, Matko

    2004-01-01

    To explore the relationship between the teaching of scientific methodology in Year 2 of the medical curriculum and student attitudes towards and knowledge about science and scientific methodology. An anonymous questionnaire survey was developed for this purpose at Zagreb University School of Medicine, Croatia. A total of 932 students from all 6 years took part (response rate 58%). Outcomes were the score on an attitude scale with 45 Likert-type statements and the score on a knowledge test consisting of 8 multiple-choice questions. The average attitude score for all students was 166 +/- 22 out of a maximum of 225, indicating a positive attitude towards science and scientific research. The students' average score on the knowledge test was 3.2 +/- 1.7 on 8 questions. Students who had finished Year 2 had the highest mean attitude (173 +/- 24) and knowledge (4.7 +/- 1.7) scores compared with other year groups (P < 0.001, ANOVA and Tukey post hoc test). For students who had attended the mandatory Year 2 course on the principles of scientific research in medicine (Years 3 to 6), multiple linear regression analysis showed that knowledge test score (B = 3.4; SE = 0.4; 95% confidence interval 2.5-4.2; P < 0.001) and average grades (B = 7.6; SE = 1.5; 95% CI 4.6-10.6; P < 0.001) were significant predictors of attitude towards science, but sex and failure to pass a year were not (B = -0.6; SE = 1.7; 95% CI -3.9 to 2.6; P = 0.707; and B = -3.1; SE = 1.9; 95% CI -6.8 to 5.7; P = 0.097, respectively). Medical students have generally positive attitudes towards science and scientific research in medicine. Attendance of a course on research methodology is related to a positive attitude towards science.

  14. 78 FR 69647 - Drill Pipe From the People's Republic of China: Notice of Court Decision Not in Harmony With...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... drill pipe green tubes and the labor wage rate in the less-than-fair-value investigation. \\1\\ Downhole... Department revised the labor wage rate and applied the wage rate methodology from Labor Methodologies.\\4\\ On... States, 604 F.3d 1363, 1372 (Fed. Cir. 2010) (``Dorbest''); see also Antidumping Methodologies in...

  15. CO2 Washout Testing of NASA Space Suits

    NASA Technical Reports Server (NTRS)

    Norcross, Jason

    2012-01-01

    During the presentation "CO2 Washout Testing of NASA Spacesuits," Jason Norcross discussed the results of recent carbon dioxide (CO2) washout testing of NASA spacesuits, including the Rear Entry I-suit (REI), the Enhanced Mobility Advanced Crew Escape Suit (EM-ACES), and possibly the ACES and Z-1 EVA prototype. When a spacesuit is used during ground testing, adequate CO2 washout must be provided for the suited subject. Symptoms of acute CO2 exposure depend on the partial pressure of CO2 (ppCO2) available to enter the lungs during respiration. The primary factors during ground-based testing that influence the ppCO2 level in the oronasal area are the metabolic rate of the subject and the air flow through the suit. These tests were done to characterize inspired oronasal ppCO2 for the range of workloads and flow rates at which ground testing is nominally performed. Norcross described the spacesuits, test hardware, methodology, and results, as well as implications for future ground testing and verification of flight requirements.

  16. Job Proximity and the Urban Employment Problem: Do Suitable Nearby Jobs Improve Neighbourhood Employment Rates?: A Comment.

    ERIC Educational Resources Information Center

    Houston, Donald

    1998-01-01

    Discusses methodology to examine the problem of spatial mismatch of jobs, showing how the simple accessibility measures used by Daniel Immergluck (1998) are poor reflections of the availability of jobs to an individual and explaining why a gravity model is a favorable alternative. Also discusses the unsuitability of aggregate data for testing the…

  17. 40 CFR Appendix A to Subpart Ddddd... - Methodology and Criteria for Demonstrating Eligibility for the Health-Based Compliance Alternatives

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... rate, type of control devices, process parameters (e.g., maximum heat input), and non-process... control systems (if applicable) and explain why the conditions are worst-case. (c) Number of test runs... located at the outlet of the control device and prior to any releases to the atmosphere. (e) Collection of...

  18. 40 CFR Appendix A to Subpart Ddddd... - Methodology and Criteria for Demonstrating Eligibility for the Health-Based Compliance Alternatives

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... rate, type of control devices, process parameters (e.g., maximum heat input), and non-process... control systems (if applicable) and explain why the conditions are worst-case. (c) Number of test runs... located at the outlet of the control device and prior to any releases to the atmosphere. (e) Collection of...

  19. 40 CFR Appendix A to Subpart Ddddd... - Methodology and Criteria for Demonstrating Eligibility for the Health-Based Compliance Alternatives

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... rate, type of control devices, process parameters (e.g., maximum heat input), and non-process... control systems (if applicable) and explain why the conditions are worst-case. (c) Number of test runs... located at the outlet of the control device and prior to any releases to the atmosphere. (e) Collection of...

  20. Real-Life/Real-Time Elderly Fall Detection with a Triaxial Accelerometer

    PubMed Central

    2018-01-01

    The consequences of a fall on an elderly person can be reduced if the accident is attended by medical personnel within the first hour. Independent elderly people often stay alone for long periods of time, putting them at greater risk if they suffer a fall. The literature offers several approaches for detecting falls with embedded devices or smartphones using a triaxial accelerometer, but most of these approaches have not been tested with the target population or cannot feasibly be implemented in real-life conditions. In this work, we propose a fall detection methodology based on a non-linear classification feature and a Kalman filter with a periodicity detector to reduce the false positive rate. This methodology requires a sampling rate of only 25 Hz; it does not require large computations or memory, and it is robust across devices. We tested our approach with the SisFall dataset, achieving 99.4% accuracy. We then validated it with a new round of simulated activities with young adults and an elderly person. Finally, we gave the devices to three elderly persons for full-day validations; they continued with their normal lives and the devices behaved as expected. PMID:29621156
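
    Editor's note: a minimal sketch of the filter-then-threshold idea behind such detectors: a 1-D Kalman smoother over acceleration magnitude followed by an impact threshold. The noise parameters, threshold, and synthetic trace below are made up for illustration; they are not the authors' non-linear feature or tuned values.

```python
import math

def kalman_1d(zs, q=0.05, r=0.5):
    """Smooth a scalar signal with a constant-state Kalman filter.
    q: process noise, r: measurement noise (hypothetical values)."""
    x, p = zs[0], 1.0
    out = []
    for z in zs:
        p += q              # predict: state assumed constant, variance grows
        k = p / (p + r)     # Kalman gain
        x += k * (z - x)    # update toward the measurement
        p *= (1 - k)
        out.append(x)
    return out

def detect_fall(ax, ay, az, threshold=2.0):
    """Flag a fall when the smoothed acceleration magnitude (in g)
    exceeds a hypothetical impact threshold."""
    mag = [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(ax, ay, az)]
    return max(kalman_1d(mag)) > threshold

# Synthetic 25 Hz-style trace: rest (~1 g) followed by a sharp impact spike.
axis = [1.0] * 20 + [3.5, 4.0, 3.0]
zeros = [0.0] * len(axis)
print(detect_fall(axis, zeros, zeros))  # True
```

    The smoothing step is what suppresses single-sample spikes; a periodicity detector, as in the paper, would further reject rhythmic activities such as walking.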

  1. Real-Life/Real-Time Elderly Fall Detection with a Triaxial Accelerometer.

    PubMed

    Sucerquia, Angela; López, José David; Vargas-Bonilla, Jesús Francisco

    2018-04-05

    The consequences of a fall on an elderly person can be reduced if the accident is attended by medical personnel within the first hour. Independent elderly people often stay alone for long periods of time, putting them at greater risk if they suffer a fall. The literature offers several approaches for detecting falls with embedded devices or smartphones using a triaxial accelerometer. Most of these approaches have not been tested with the target population or cannot be feasibly implemented in real-life conditions. In this work, we propose a fall detection methodology based on a non-linear classification feature and a Kalman filter with a periodicity detector to reduce the false positive rate. This methodology requires a sampling rate of only 25 Hz; it does not require large computations or memory and it is robust across devices. We tested our approach with the SisFall dataset, achieving 99.4% accuracy. We then validated it with a new round of simulated activities with young adults and an elderly person. Finally, we gave the devices to three elderly persons for full-day validations. They continued with their normal life and the devices behaved as expected.

  2. Can business and economics students perform elementary arithmetic?

    PubMed

    Standing, Lionel G; Sproule, Robert A; Leung, Ambrose

    2006-04-01

    Business and economics majors (N=146) were tested on the D'Amore Test of Elementary Arithmetic, which employs third-grade test items from 1932. Only 40% of the subjects passed the test by answering 10 out of 10 items correctly. Self-predicted scores were a good predictor of actual scores, but performance was not associated with demographic variables, grades in calculus courses, liking for science or computers, or mathematics anxiety. Scores decreased over the subjects' initial years on campus. The hardest test item, with an error rate of 23%, required the subject to evaluate (36 x 7) + (33 x 7). The results are similar to those of Standing in 2006, despite methodological changes intended to maximize performance.
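For reference, the item in question can be checked directly; the distributive law reduces it to a single multiplication:

```python
# The hardest test item: evaluate (36 x 7) + (33 x 7).
direct = (36 * 7) + (33 * 7)
# The distributive law gives the same result as 7 * (36 + 33) = 7 * 69.
factored = 7 * (36 + 33)
```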

  3. Identification of critical sediment source areas at regional level

    NASA Astrophysics Data System (ADS)

    Fargas, D.; Casasnovas, J. A. Martínez; Poch, R.

    In order to identify critical sediment sources in large catchments, using easily available terrain information at regional scale, a methodology has been developed to obtain a qualitative assessment necessary for further studies. The main objective of the model is to use basic terrain data related to the erosive processes which contribute to the production, transport and accumulation of sediments through the main water paths in the watershed. The model is based on the selection of homogeneous zones regarding drainage density and lithology, achieved by joining the spatial basic units by a rating system. The values of drainage density are rated according to an erosion class (Bucko & Mazurova, 1958). The lithology is rated by erosion indexes, adapted from FAO (1977). The combination and reclassification of the results brings about five qualitative classes of sediment emission risk. This methodology has been tested and validated for the watershed of the Joaquín Costa reservoir (NE Spain), with a surface of 1,500 km². The mapping scale was 1:100,000 and the model was implemented through a vector GIS (Arc/Info). The prediction was checked by means of photo-interpretation and field work, which gave an accuracy of 78.5%. The proposed methodology has proved useful as an initial approach for erosion assessment and soil conservation planning at the regional level, and also to select priority areas where further analyses can be developed.

  4. 49 CFR 1109.4 - Mandatory mediation in rate cases to be considered under the stand-alone cost methodology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Mandatory mediation in rate cases to be considered... § 1109.4 Mandatory mediation in rate cases to be considered under the stand-alone cost methodology. (a) A... methodology must engage in non-binding mediation of its dispute with the railroad upon filing a formal...

  5. Eutrophication risk assessment in coastal embayments using simple statistical models.

    PubMed

    Arhonditsis, G; Eleftheriadou, M; Karydis, M; Tsirtsis, G

    2003-09-01

    A statistical methodology is proposed for assessing the risk of eutrophication in marine coastal embayments. The procedure followed was the development of regression models relating the levels of chlorophyll a (Chl) with the concentration of the limiting nutrient (usually nitrogen) and the renewal rate of the systems. The method was applied in the Gulf of Gera, Island of Lesvos, Aegean Sea, and a surrogate for renewal rate was created using the Canberra metric as a measure of the resemblance between the Gulf and the oligotrophic waters of the open sea in terms of their physical, chemical and biological properties. The Chl-total dissolved nitrogen-renewal rate regression model was the most significant, accounting for 60% of the variation observed in Chl. Predicted distributions of Chl for various combinations of the independent variables, based on Bayesian analysis of the models, enabled comparison of the outcomes of specific scenarios of interest as well as further analysis of the system dynamics. The present statistical approach can be used as a methodological tool for testing the resilience of coastal ecosystems under alternative managerial schemes and levels of exogenous nutrient loading.
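A minimal sketch of the kind of regression underlying such a model, here with a single predictor and purely illustrative data (the paper's model uses total dissolved nitrogen and renewal rate jointly, plus Bayesian analysis):

```python
def least_squares(x, y):
    """Ordinary least-squares fit y = a*x + b; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    a = sxy / sxx
    b = my - a * mx
    r2 = sxy * sxy / (sxx * syy)
    return a, b, r2

# Hypothetical paired observations (values illustrative only):
tdn = [10, 20, 30, 40, 50, 60]        # total dissolved nitrogen, ug/L
chl = [1.1, 1.9, 3.2, 3.8, 5.1, 5.9]  # chlorophyll a, ug/L
slope, intercept, r_squared = least_squares(tdn, chl)
```

The fitted line always passes through the sample means, so `slope * mean(tdn) + intercept` recovers `mean(chl)` exactly; `r_squared` plays the role of the 60%-of-variation figure quoted in the abstract.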

  6. Design of clinical trials of antidepressants: should a placebo control arm be included?

    PubMed

    Fritze, J; Möller, H J

    2001-01-01

    There is no doubt that available antidepressants are efficacious and effective. Nevertheless, more effective drugs with improved tolerability are needed. With this need in mind, some protagonists claim that future antidepressants should be proved superior to, or at least as effective as, established antidepressants, making placebo control methodologically dispensable in clinical trials. Moreover, the use of placebo control is criticised as unethical because it might result in effective treatment being withheld. There are, however, a number of methodological reasons why placebo control is indispensable for the proof of efficacy of antidepressants. Comparing investigational antidepressants only with standard antidepressants and not placebo yields ambiguous results that are difficult to interpret, be it in superiority or equivalence testing, and this method of assessment requires larger sample sizes than those required with the use of placebo control. Experimental methodology not adhering to the optimal study design is ethically questionable. Restricting the testing of investigational antidepressants only to superiority over standard antidepressants is an obstacle to therapeutic progress in terms of tolerability and the detection of innovative mechanisms of action from which certain subgroups of future patients might benefit. The use of a methodology that requires larger samples for testing of superiority or equivalence is also ethically questionable. In view of the high placebo response rates in trials of antidepressants, placebo treatment does not mean withholding effective treatment. Accepting the necessity of the clinical evaluation of new, potentially ineffective antidepressants implicitly means accepting placebo control as ethically justified. Three- or multi-arm comparisons including placebo and an active reference represent the optimal study design.

  7. The development of test methodology for testing glassy materials

    NASA Technical Reports Server (NTRS)

    Tucker, Dennis S.

    1987-01-01

    The inherent brittleness of glass invariably leads to a large variability in strength data and a time dependence in strength (i.e., static fatigue). Loading rate plays a large role in strength values. Glass is found to be weaker when supporting loads over long periods as compared to glass which undergoes rapid loading. In this instance the purpose of rapid loading is to fail the glass before any significant crack growth occurs. However, a decrease in strength occurs with a decrease in loading rate, pursuant to substantial crack extension. These properties complicate the structural design allowable for the utilization of glass components in applications such as mirrors for the Space Telescope and AXAF for Spacelab and the space station.

  8. Mild cognitive impairment: historical development and summary of research

    PubMed Central

    Golomb, James; Kluger, Alan; Ferris, Steven H

    2004-01-01

    This review article broadly traces the historical development, diagnostic criteria, clinical and neuropathological characteristics, and treatment strategies related to mild cognitive impairment (MCI). The concept of MCI is considered in the context of other terms that have been developed to characterize the elderly with varying degrees of cognitive impairment. Criteria based on clinical global scale ratings, cognitive test performance, and performance on other domains of functioning are discussed. Approaches employing clinical, neuropsychological, neuroimaging, biological, and molecular genetic methodology used in the validation of MCI are considered, including results from cross-sectional, longitudinal, and postmortem investigations. Results of recent drug treatment studies of MCI and related methodological issues are also addressed. PMID:22034453

  9. Short-term forecasting of turbidity in trunk main networks.

    PubMed

    Meyers, Gregory; Kapelan, Zoran; Keedwell, Edward

    2017-11-01

    Water discolouration is an increasingly important and expensive issue due to rising customer expectations, tighter regulatory demands and ageing Water Distribution Systems (WDSs) in the UK and abroad. This paper presents a new turbidity forecasting methodology capable of aiding operational staff and enabling proactive management strategies. The turbidity forecasting methodology developed here is completely data-driven and does not require a hydraulic or water quality network model, which is expensive to build and maintain. The methodology is tested and verified on a real trunk main network with observed turbidity measurement data. Results obtained show that the methodology can detect if discolouration material is mobilised, estimate if sufficient turbidity will be generated to exceed a preselected threshold and approximate how long the material will take to reach the downstream meter. Classification-based forecasts of turbidity can be reliably made up to 5 h ahead, although at the expense of increased false alarm rates. The methodology presented here could be used as an early warning system enabling a multitude of cost-beneficial proactive management strategies as an alternative to expensive trunk mains cleaning programs.

  10. Proactive telephone counseling for smoking cessation: meta-analyses by recruitment channel and methodological quality.

    PubMed

    Tzelepis, Flora; Paul, Christine L; Walsh, Raoul A; McElduff, Patrick; Knight, Jenny

    2011-06-22

    Systematic reviews have demonstrated that proactive telephone counseling increases smoking cessation rates. However, these reviews did not differentiate studies by recruitment channel, did not adequately assess methodological quality, and combined different measures of abstinence. Twenty-four randomized controlled trials published before December 31, 2008, were included: seven with active recruitment, 16 with passive recruitment, and one with mixed recruitment. We rated methodological quality on selection bias, study design, confounders, blinding, data collection methods, withdrawals, and dropouts, according to the Quality Assessment Tool for Quantitative Studies. We conducted random effects meta-analysis to pool the results according to abstinence type and follow-up time for studies overall and segregated by recruitment channel and methodological quality. The level of statistical heterogeneity was quantified by I(2). All statistical tests were two-sided. Methodological quality ratings indicated two strong, 10 moderate, and 12 weak studies. Overall, compared with self-help materials or no intervention control groups, proactive telephone counseling had a statistically significantly greater effect on point prevalence abstinence (nonsmoking at follow-up or abstinent for at least 24 hours, 7 days before follow-up) at 6-9 months (relative risk [RR] = 1.26, 95% confidence interval [CI] = 1.11 to 1.43, P < .001, I(2) = 21.4%) but not at 12-15 months after recruitment. This pattern also emerged when studies were segregated by recruitment channel (active, passive) or methodological quality (strong/moderate, weak). Overall, the positive effect on prolonged/continuous abstinence (abstinent for 3 months or longer before follow-up) was also statistically significantly greater at 6-9 months (RR = 1.58, CI = 1.26 to 1.98, P < .001, I(2) = 49.1%) and 12-18 months after recruitment (RR = 1.40, CI = 1.23 to 1.60, P < .001, I(2) = 18.5%). With the exception of point prevalence abstinence in the long term, these data support previous results showing that proactive telephone counseling has a positive impact on smoking cessation. Proactive telephone counseling increased prolonged/continuous abstinence long term for both actively and passively recruited smokers.
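The random-effects pooling described above can be sketched with the DerSimonian-Laird estimator, a common choice for random effects meta-analysis (the review does not specify its estimator); the study-level log relative risks and variances below are illustrative, not data from the review:

```python
import math

def dersimonian_laird(log_rr, var):
    """Pool log relative risks with a DerSimonian-Laird random-effects model.

    Returns (pooled_rr, i_squared_percent).
    """
    w = [1.0 / v for v in var]                       # inverse-variance weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_star = [1.0 / (v + tau2) for v in var]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_rr)) / sum(w_star)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return math.exp(pooled), i2

# Four hypothetical studies with heterogeneous effects.
pooled_rr, i2 = dersimonian_laird(
    [0.10, 0.60, 0.05, 0.70], [0.01, 0.01, 0.01, 0.01])
```

`i2` is the I(2) heterogeneity statistic quoted throughout the abstract; `pooled_rr` corresponds to the pooled RR values.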

  11. Azithromycin Treatment Failure for Chlamydia trachomatis Among Heterosexual Men With Nongonococcal Urethritis.

    PubMed

    Kissinger, Patricia J; White, Scott; Manhart, Lisa E; Schwebke, Jane; Taylor, Stephanie N; Mena, Leandro; Khosropour, Christine M; Wilcox, Larissa; Schmidt, Norine; Martin, David H

    2016-10-01

    Three recent prospective studies have suggested that the 1-g dose of azithromycin for Chlamydia trachomatis (Ct) was less effective than expected, reporting a wide range of treatment failure rates (5.8%-22.6%). Reasons for the disparate results could be attributed to geographic or methodological differences. The purpose of this study was to reexamine the studies and attempt to harmonize methodologies to reduce misclassification as a result of false positives from early test-of-cure (TOC) or reinfection as a result of sexual exposure rather than treatment failure. Men who had sex with women, who received 1-g azithromycin under directly observed therapy for presumptive treatment of nongonococcal urethritis with confirmed Ct were included. Baseline screening was performed on urethral swabs or urine, and TOC screening was performed on urine using nucleic acid amplification tests. Posttreatment vaginal sexual exposure was elicited at TOC. Data from the 3 studies were obtained and reanalyzed. Rates of Ct retest positive were examined for all cases, and a sensitivity analysis was conducted to either reclassify potential false positives/reinfections as negative or remove them from the analysis. The crude treatment failure rate was 12.8% (31/242). The rate when potential false positives/reinfections were reclassified as negative was 6.2% (15/242) or when these were excluded from analysis was 10.9% (15/138). In these samples of men who have sex with women with Ct-related nongonococcal urethritis, azithromycin treatment failure was between 6.2% and 12.8%. This range of failure is lower than previously published but higher than the desired World Health Organization's target chlamydia treatment failure rate of < 5%.

  12. Sex Does Not Matter: Gender Bias and Gender Differences in Peer Assessments of Contributions to Group Work

    ERIC Educational Resources Information Center

    Tucker, Richard

    2014-01-01

    This paper considers the possibility of gender bias in peer ratings for contributions to team assignments, as measured by an online self-and-peer assessment tool. The research was conducted to determine whether peer assessment led to reliable and fair marking outcomes. The methodology of Falchikov and Magin was followed in order to test their…

  13. Study of the Effect of Swelling on Irradiation Assisted Stress Corrosion Cracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teysseyre, Sebastien Paul

    2016-09-01

    This report describes the methodology used to study the effect of swelling on the crack growth rate of an irradiation-assisted stress corrosion crack propagating in highly irradiated type 304 stainless steel irradiated to 33 dpa in the Experimental Breeder Reactor-II. The material selection, specimen design, experimental apparatus, and processes are described. The results of the current test are presented.

  14. 12 CFR 46.6 - Stress test methodologies and practices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Stress test methodologies and practices. 46.6 Section 46.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY ANNUAL STRESS TEST § 46.6 Stress test methodologies and practices. (a) Potential impact on capital. During each quarter of...

  15. 12 CFR 46.6 - Stress test methodologies and practices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 1 2013-01-01 2013-01-01 false Stress test methodologies and practices. 46.6 Section 46.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY ANNUAL STRESS TEST § 46.6 Stress test methodologies and practices. (a) Potential impact on capital. During each quarter of...

  16. Contributions of Various Radiological Sources to Background in a Suburban Environment

    DOE PAGES

    Milvenan, Richard D.; Hayes, Robert B.

    2016-11-01

    This work is a brief overview and comparison of dose rates stemming from both indoor and outdoor natural background radiation and household objects within a suburban environment in North Carolina. Combined gamma and beta dose rates were taken from indoor objects that ranged from the potassium in fruit to the americium in smoke detectors. For outdoor measurements, various height and time data samples were collected to show fluctuations in dose rate due to temperature inversion and geometric attenuation. Although each sample tested proved to have a statistically significant increase over background using Student's t-test, no sample proved to be more than a minor increase in natural radiation dose. Furthermore, the relative contributions from natural radioactivity such as potassium in foods and common household items are shown to be easily distinguished from background using standard handheld instrumentation when applied in a systematic, methodological manner.
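The comparison against background can be sketched with a Welch's t statistic on hypothetical dose-rate readings (the values below are illustrative, not data from the study):

```python
from statistics import mean, variance

def welch_t(sample, background):
    """Welch's t statistic for two independent groups
    (statistics.variance is the n-1 sample variance)."""
    n1, n2 = len(sample), len(background)
    se = (variance(sample) / n1 + variance(background) / n2) ** 0.5
    return (mean(sample) - mean(background)) / se

# Hypothetical dose-rate readings in uSv/h (illustrative values only):
detector = [0.14, 0.15, 0.13, 0.16, 0.15]    # near a smoke detector
background = [0.10, 0.11, 0.09, 0.10, 0.11]  # ambient background
t_stat = welch_t(detector, background)
```

A large `t_stat` signals a statistically significant elevation over background, even when the absolute increase is minor, which is exactly the pattern the abstract reports.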

  17. Hypothesis-Testing Demands Trustworthy Data—A Simulation Approach to Inferential Statistics Advocating the Research Program Strategy

    PubMed Central

    Krefeld-Schwalb, Antonia; Witte, Erich H.; Zenker, Frank

    2018-01-01

    In psychology as elsewhere, the main statistical inference strategy to establish empirical effects is null-hypothesis significance testing (NHST). The recent failure to replicate allegedly well-established NHST-results, however, implies that such results lack sufficient statistical power, and thus feature unacceptably high error-rates. Using data-simulation to estimate the error-rates of NHST-results, we advocate the research program strategy (RPS) as a superior methodology. RPS integrates Frequentist with Bayesian inference elements, and leads from a preliminary discovery against a (random) H0-hypothesis to a statistical H1-verification. Not only do RPS-results feature significantly lower error-rates than NHST-results, RPS also addresses key-deficits of a “pure” Frequentist and a standard Bayesian approach. In particular, RPS aggregates underpowered results safely. RPS therefore provides a tool to regain the trust the discipline had lost during the ongoing replicability-crisis. PMID:29740363
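The data-simulation idea, estimating NHST error rates by Monte Carlo, can be sketched as follows. The sample sizes, effect size, and z cutoff are illustrative assumptions, not the RPS procedure itself:

```python
import random
from statistics import mean, variance

random.seed(42)

def z_test_rejects(n, effect=0.0, crit=1.96):
    """Simulate one two-group study; True if |z| exceeds the 5% cutoff."""
    a = [random.gauss(0.0, 1.0) for _ in range(n)]
    b = [random.gauss(effect, 1.0) for _ in range(n)]
    se = (variance(a) / n + variance(b) / n) ** 0.5
    return abs((mean(b) - mean(a)) / se) > crit

trials = 2000
# Under a true null (effect = 0), every rejection is a false positive.
false_positive_rate = sum(z_test_rejects(50) for _ in range(trials)) / trials
# A real but modest effect with a small sample is usually missed:
# this is the underpowered-study problem the abstract describes.
power_small_n = sum(z_test_rejects(10, effect=0.5) for _ in range(trials)) / trials
```

With the fixed seed, the simulated false-positive rate lands near the nominal 5%, while the power of the small-n study stays well below the conventional 80% target.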

  18. Contributions of Various Radiological Sources to Background in a Suburban Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milvenan, Richard D.; Hayes, Robert B.

    This work is a brief overview and comparison of dose rates stemming from both indoor and outdoor natural background radiation and household objects within a suburban environment in North Carolina. Combined gamma and beta dose rates were taken from indoor objects that ranged from the potassium in fruit to the americium in smoke detectors. For outdoor measurements, various height and time data samples were collected to show fluctuations in dose rate due to temperature inversion and geometric attenuation. Although each sample tested proved to have a statistically significant increase over background using Students t-test, no sample proved to be moremore » than a minor increase in natural radiation dose. Furthermore, the relative contributions from natural radioactivity such as potassium in foods and common household items are shown to be easily distinguished from background using standard handheld instrumentation when applied in a systematic, methodological manner.« less

  19. A New High Channel-Count, High Scan-Rate, Data Acquisition System for the NASA Langley Transonic Dynamics Tunnel

    NASA Technical Reports Server (NTRS)

    Ivanco, Thomas G.; Sekula, Martin K.; Piatak, David J.; Simmons, Scott A.; Babel, Walter C.; Collins, Jesse G.; Ramey, James M.; Heald, Dean M.

    2016-01-01

    A data acquisition system upgrade project, known as AB-DAS, is underway at the NASA Langley Transonic Dynamics Tunnel. AB-DAS will soon serve as the primary data system and will substantially increase the scan-rate capabilities and analog channel count while maintaining other unique aeroelastic and dynamic test capabilities required of the facility. AB-DAS is configurable, adaptable, and enables buffet and aeroacoustic tests by synchronously scanning all analog channels and recording the high scan-rate time history values for each data quantity. AB-DAS is currently available for use as a stand-alone data system with limited capabilities while development continues. This paper describes AB-DAS, the design methodology, and the current features and capabilities. It also outlines the future work and projected capabilities following completion of the data system upgrade project.

  20. Biological treatment of toxic petroleum spent caustic in fluidized bed bioreactor using immobilized cells of Thiobacillus RAI01.

    PubMed

    Potumarthi, Ravichandra; Mugeraya, Gopal; Jetty, Annapurna

    2008-12-01

    In the present studies, a newly isolated Thiobacillus sp. was used for the treatment of synthetic spent sulfide caustic in a laboratory-scale fluidized bed bioreactor. The sulfide oxidation was tested using Ca-alginate immobilized Thiobacillus sp. Initially, response surface methodology was applied for the optimization of four parameters to check the sulfide oxidation efficiency in batch mode. The reactor was then operated in continuous mode for 51 days at different sulfide loading rates and retention times to test sulfide oxidation and sulfate and thiosulfate formation. Sulfide conversions in the range of 90-98% were obtained at almost all sulfide loading rates and hydraulic retention times. However, increased loading rates resulted in lower sulfide oxidation capacity. All the experiments were conducted at a constant pH of around 6 and a temperature of 30 ± 5 °C.

  1. Hypothesis-Testing Demands Trustworthy Data-A Simulation Approach to Inferential Statistics Advocating the Research Program Strategy.

    PubMed

    Krefeld-Schwalb, Antonia; Witte, Erich H; Zenker, Frank

    2018-01-01

    In psychology as elsewhere, the main statistical inference strategy to establish empirical effects is null-hypothesis significance testing (NHST). The recent failure to replicate allegedly well-established NHST-results, however, implies that such results lack sufficient statistical power, and thus feature unacceptably high error-rates. Using data-simulation to estimate the error-rates of NHST-results, we advocate the research program strategy (RPS) as a superior methodology. RPS integrates Frequentist with Bayesian inference elements, and leads from a preliminary discovery against a (random) H0-hypothesis to a statistical H1-verification. Not only do RPS-results feature significantly lower error-rates than NHST-results, RPS also addresses key-deficits of a "pure" Frequentist and a standard Bayesian approach. In particular, RPS aggregates underpowered results safely. RPS therefore provides a tool to regain the trust the discipline had lost during the ongoing replicability-crisis.

  2. Testing the non-unity of rate ratio under inverse sampling.

    PubMed

    Tang, Man-Lai; Liao, Yi Jie; Ng, Hong Keung Tony; Chan, Ping Shing

    2007-08-01

    Inverse sampling is considered to be a more appropriate sampling scheme than the usual binomial sampling scheme when subjects arrive sequentially, when the underlying response of interest is acute, and when maximum likelihood estimators of some epidemiologic indices are undefined. In this article, we study various statistics for testing non-unity rate ratios in case-control studies under inverse sampling. These include the Wald, unconditional score, likelihood ratio and conditional score statistics. Three methods (the asymptotic, conditional exact, and Mid-P methods) are adopted for P-value calculation. We evaluate the performance of different combinations of test statistics and P-value calculation methods in terms of their empirical sizes and powers via Monte Carlo simulation. In general, asymptotic score and conditional score tests are preferable because their actual type I error rates are well controlled around the pre-chosen nominal level and their powers are comparatively the largest. The exact version of the Wald test is recommended if one wants to control the actual type I error rate at or below the pre-chosen nominal level. If larger power is expected and fluctuations of size around the pre-chosen nominal level are acceptable, then the Mid-P version of the Wald test is a desirable alternative. We illustrate the methodologies with a real example from a heart disease study.
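The inverse sampling scheme itself is easy to simulate. The sketch below checks the classical result that, when sampling until r cases are observed, Haldane's estimator (r-1)/(n-1) is unbiased for the event probability while the naive MLE r/n overestimates it (the parameters are illustrative, and this is the sampling scheme only, not the paper's rate-ratio tests):

```python
import random

random.seed(7)

def inverse_sample(p, r):
    """Draw Bernoulli(p) subjects one by one until r cases are seen;
    return the total number of subjects n."""
    n = cases = 0
    while cases < r:
        n += 1
        if random.random() < p:
            cases += 1
    return n

# Compare the naive MLE r/n with Haldane's unbiased (r-1)/(n-1).
p_true, r = 0.2, 5
ns = [inverse_sample(p_true, r) for _ in range(20000)]
mean_naive = sum(r / n for n in ns) / len(ns)
mean_haldane = sum((r - 1) / (n - 1) for n in ns) / len(ns)
```

Averaged over many replications, `mean_haldane` sits on top of `p_true` while `mean_naive` is biased upward, which is one reason inverse sampling calls for its own estimators and tests.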

  3. 42 CFR 416.171 - Determination of payment rates for ASC services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Determination of payment rates for ASC services... Determination of payment rates for ASC services. (a) Standard methodology. The standard methodology for determining the national unadjusted payment rate for ASC services is to calculate the product of the...

  4. A mixture gatekeeping procedure based on the Hommel test for clinical trial applications.

    PubMed

    Brechenmacher, Thomas; Xu, Jane; Dmitrienko, Alex; Tamhane, Ajit C

    2011-07-01

    When conducting clinical trials with hierarchically ordered objectives, it is essential to use multiplicity adjustment methods that control the familywise error rate in the strong sense while taking into account the logical relations among the null hypotheses. This paper proposes a gatekeeping procedure based on the Hommel (1988) test, which offers power advantages compared to other p value-based tests proposed in the literature. A general description of the procedure is given and details are presented on how it can be applied to complex clinical trial designs. Two clinical trial examples are given to illustrate the methodology developed in the paper.
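The gatekeeping idea, testing secondary hypotheses only when the primary family passes, can be sketched with a simplified serial procedure using Bonferroni within each family. This illustrates gatekeeping generally, not the paper's Hommel-based mixture procedure, and all p values are hypothetical:

```python
def serial_gatekeeper(primary_p, secondary_p, alpha=0.05):
    """Serial gatekeeping sketch: Bonferroni within each family, and the
    secondary family is tested only if every primary hypothesis is rejected.
    This controls the familywise error rate in the strong sense."""
    k = len(primary_p)
    primary_rejected = [p <= alpha / k for p in primary_p]
    if all(primary_rejected):
        m = len(secondary_p)
        secondary_rejected = [p <= alpha / m for p in secondary_p]
    else:
        secondary_rejected = [False] * len(secondary_p)
    return primary_rejected, secondary_rejected

# Gate open: both primary p values clear 0.05 / 2 = 0.025.
prim_open, sec_open = serial_gatekeeper([0.004, 0.012], [0.020, 0.300])
# Gate closed: one primary p value fails, so no secondary test is run.
prim_closed, sec_closed = serial_gatekeeper([0.004, 0.040], [0.001, 0.001])
```

Replacing the within-family Bonferroni step with a more powerful test such as Hommel's is exactly the kind of refinement the paper develops.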

  5. Visualization and statistical comparisons of microbial communities using R packages on Phylochip data.

    PubMed

    Holmes, Susan; Alekseyenko, Alexander; Timme, Alden; Nelson, Tyrrell; Pasricha, Pankaj Jay; Spormann, Alfred

    2011-01-01

    This article explains the statistical and computational methodology used to analyze species abundances collected using the LBNL PhyloChip in a study of Irritable Bowel Syndrome (IBS) in rats. Some tools already available for the analysis of ordinary microarray data are useful in this type of statistical analysis. For instance, in correcting for multiple testing we use familywise error rate control and step-down tests (available in the multtest package). Once the most significant species are chosen, we use the hypergeometric tests familiar from testing GO categories to test specific phyla and families. We provide examples of normalization, multivariate projections, batch effect detection and integration of phylogenetic covariation, as well as tree equalization and robustification methods.
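The hypergeometric enrichment test mentioned above can be sketched in a few lines; the taxon counts are hypothetical:

```python
from math import comb

def hypergeom_pvalue(k, K, n, N):
    """One-sided enrichment p value: P(X >= k) when drawing n taxa without
    replacement from N, of which K belong to the phylum of interest."""
    tail = sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1))
    return tail / comb(N, n)

# Hypothetical counts: 100 profiled taxa, 10 in the phylum; the significant
# list holds 12 taxa, 5 of them from that phylum.
p_enriched = hypergeom_pvalue(5, 10, 12, 100)
# Seeing at least 1 of the 10 among 12 draws is unremarkable by comparison.
p_null = hypergeom_pvalue(1, 10, 12, 100)
```

A small `p_enriched` indicates the phylum is over-represented among the significant taxa relative to chance, the same logic used for GO category tests.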

  6. A methodology of SiP testing based on boundary scan

    NASA Astrophysics Data System (ADS)

    Qin, He; Quan, Haiyang; Han, Yifei; Zhu, Tianrui; Zheng, Tuo

    2017-10-01

    System in Package (SiP) devices play an important role in portable, aerospace, and military electronics thanks to their microminiaturization, light weight, high density, and high reliability. As system scale grows exponentially, SiP testing increasingly struggles with system complexity and fault location. This paper proposes a testing methodology and process for SiP systems based on boundary scan technology, combining the characteristics of SiP systems with boundary scan theory from PCB circuit and embedded core testing. The hardware requirements of the SiP system under test are specified, and a hardware test platform has been constructed. The methodology offers high test efficiency and accurate fault location.

  7. Novel methodology for pharmaceutical expenditure forecast.

    PubMed

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    The value appreciation of new drugs across countries today features a disruption that is making the historical data that are used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. This methodology was independent of historical data and appeared to be highly flexible and adapted to test robustness and provide probabilistic analysis to support policy decision making.

  8. Quality of reporting of surveys in critical care journals: a methodologic review.

    PubMed

    Duffett, Mark; Burns, Karen E; Adhikari, Neill K; Arnold, Donald M; Lauzier, François; Kho, Michelle E; Meade, Maureen O; Hayani, Omar; Koo, Karen; Choong, Karen; Lamontagne, François; Zhou, Qi; Cook, Deborah J

    2012-02-01

    Adequate reporting is needed to judge methodologic quality and assess the risk of bias of surveys. The objective of this study was to describe the methodology and quality of reporting of surveys published in five critical care journals. The data sources were all issues (1996-2009) of the American Journal of Respiratory and Critical Care Medicine, Critical Care, Critical Care Medicine, Intensive Care Medicine, and Pediatric Critical Care Medicine. Two reviewers hand-searched all issues in duplicate. We included publications of self-administered questionnaires of health professionals and excluded surveys that were part of a multi-method study or measured the effect of an intervention. Data were abstracted in duplicate. We included 151 surveys. The frequency of survey publication increased at an average rate of 0.38 surveys per 1000 citations per year from 1996-2009 (p for trend = 0.001). The median number of respondents and reported response rates were 217 (interquartile range 90 to 402) and 63.3% (interquartile range 45.0% to 81.0%), respectively. Surveys originated predominantly from North America (United States [40.4%] and Canada [18.5%]). Surveys most frequently examined stated practice (78.8%) and attitudes or opinions (60.3%), and less frequently knowledge (9.9%). The frequencies of reporting on survey design and methods were: 1) instrument development: domains (59.1%), item generation (33.1%), item reduction (12.6%); 2) instrument testing: pretesting or pilot testing (36.2%) and assessments of clarity (25.2%) or clinical sensibility (15.7%); and 3) clinimetric properties: qualitative or quantitative description of at least one of face, content, or construct validity, intra- or inter-rater reliability, or consistency (28.5%). The reporting of five key elements of survey design and conduct did not significantly change over time. Surveys, primarily conducted in North America and focused on self-reported practice, are increasingly published in highly cited critical care journals. 
More uniform and comprehensive reporting will facilitate assessment of methodologic quality.

  9. Appropriate Use Criteria in Dermatopathology: Initial Recommendations from the American Society of Dermatopathology.

    PubMed

    Vidal, Claudia I; Armbrect, Eric A; Andea, Aleodor A; Bohlke, Angela K; Comfere, Nneka I; Hughes, Sarah R; Kim, Jinah; Kozel, Jessica A; Lee, Jason B; Linos, Konstantinos; Litzner, Brandon R; Missall, Tricia A; Novoa, Roberto A; Sundram, Uma; Swick, Brian L; Hurley, M Yadira; Alam, Murad; Argenyi, Zsolt; Duncan, Lyn M; Elston, Dirk M; Emanuel, Patrick O; Ferringer, Tammie; Fung, Maxwell A; Hosler, Gregory A; Lazar, Alexander J; Lowe, Lori; Plaza, Jose A; Prieto, Victor G; Robinson, June K; Schaffer, Andras; Subtil, Antonio; Wang, Wei-Lien

    2018-04-21

    Appropriate use criteria (AUC) provide physicians with guidance in test selection and can affect health care delivery, reimbursement policy, and physician decision-making. The American Society of Dermatopathology (ASDP), with input from the American Academy of Dermatology (AAD) and the College of American Pathologists (CAP), sought to develop AUC in dermatopathology. The RAND/UCLA appropriateness methodology, which combines evidence-based medicine, clinical experience, and expert judgment, was used to develop the AUC. With the number of ratings predetermined at 3, AUC were developed for 211 clinical scenarios (CS) involving 12 ancillary studies (AS). Consensus was reached for 188 (89%) CS, with 93 (44%) considered "usually appropriate", 52 (25%) "rarely appropriate", and 43 (20%) of "uncertain appropriateness". The methodology requires a focus on appropriateness without comparison between tests and irrespective of cost. The ultimate decision of when to order a specific test rests with the physician and should be one where the expected benefit exceeds the negative consequences. This publication outlines the appropriateness recommendations - AUC for 12 tests used in dermatopathology. Importantly, these recommendations may change in light of new evidence. Results deemed of "uncertain appropriateness", and scenarios where consensus was not reached, may benefit from further research. Copyright © 2018. Published by Elsevier Inc.

  10. Analysis of Flowfields over Four-Engine DC-X Rockets

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Cornelison, Joni

    1996-01-01

    The objective of this study is to validate a computational methodology for the aerodynamic performance of an advanced conical launch vehicle configuration. The computational methodology is based on a three-dimensional, viscous flow, pressure-based computational fluid dynamics formulation. Both wind-tunnel and ascent flight-test data are used for validation. Emphasis is placed on multiple-engine power-on effects. Computational characterization of the base drag in the critical subsonic regime is the focus of the validation effort; until recently, almost no multiple-engine data existed for a conical launch vehicle configuration. Parametric studies using high-order difference schemes are performed for the cold-flow tests, whereas grid studies are conducted for the flight tests. The computed vehicle axial force coefficients, forebody, aftbody, and base surface pressures compare favorably with those of tests. The results demonstrate that with adequate grid density and proper distribution, a high-order difference scheme, finite rate afterburning kinetics to model the plume chemistry, and a suitable turbulence model to describe separated flows, plume/air mixing, and boundary layers, computational fluid dynamics is a tool that can be used to predict the low-speed aerodynamic performance for rocket design and operations.

  11. Unified high-temperature behavior of thin-gauge superalloys

    NASA Astrophysics Data System (ADS)

    England, Raymond Oliver

    This research proposes a methodology for accelerated testing in the area of high-temperature creep and oxidation resistance for thin-gauge superalloy materials. Traditional long-term creep (stress-relaxation) and oxidation tests are completed to establish a baseline. The temperature range used in this study is between 1200 and 1700°F. The alloys investigated are Incoloy MA 956, Waspaloy, Haynes 214, Haynes 242, Haynes 230, and Incoloy 718. The traditional creep test involves loading the specimens to a constant test mandrel radius of curvature, and measuring the retained radius of curvature as a function of time. The accelerated creep test uses a servohydraulic test machine to conduct single specimen, variable strain-rate load relaxation experiments. Standard metallographic evaluations are used to determine extent and morphology of attack in the traditional oxidation tests, while the accelerated oxidation test utilizes thermogravimetric analysis to obtain oxidation rate data. The traditional long-term creep testing indicates that the mechanically-alloyed material Incoloy MA 956 and Haynes alloy 214 may be suitable for long-term, high-temperature (above 1400°F) structural applications. The accelerated creep test produced a continuous linear function of log stress versus strain rate which can be used to calculate creep rate. The long-term and traditional oxidation tests indicate that Al2O3 scale formers such as Incoloy MA 956 and Haynes 214 are much more resistant to high-temperature oxidation than Cr2O3 scale formers such as Waspaloy. Both accelerated tests can be completed within roughly one day, and can evaluate multiple test temperatures using standardized single specimens. These simple experiments can be correlated with traditional long-term tests which require years to complete.
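The "continuous linear function" produced by the accelerated creep test can be used to back out creep rates. A minimal sketch, assuming the usual power-law (Norton) form in which log stress and log strain rate are linearly related; the data points and fitted exponent are hypothetical, not measurements from this study:

```python
import math

# Hypothetical (log10 stress [MPa], log10 strain rate [1/s]) pairs from a
# load-relaxation test; a power law is a straight line on these axes.
data = [(1.8, -8.2), (1.9, -7.4), (2.0, -6.6), (2.1, -5.8)]

def fit_power_law(points):
    """Least-squares line: log10(rate) = n * log10(stress) + log10(A)."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
    n = sum((x - xbar) * (y - ybar) for x, y in points) / \
        sum((x - xbar) ** 2 for x in xs)
    log_a = ybar - n * xbar
    return n, log_a

def creep_rate(stress_mpa, n, log_a):
    """Predicted steady-state creep rate at a given stress."""
    return 10 ** (log_a + n * math.log10(stress_mpa))

n, log_a = fit_power_law(data)
```

Once `n` and `log_a` are fitted from a one-day accelerated test, `creep_rate` interpolates the creep rate at any service stress in the tested range.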

  12. Load and resistance factor rating (LRFR) in New York State : volume II.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The propo...

  13. Load and resistance factor rating (LRFR) in NYS : volume II final report.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...

  14. Terminology and Methodology Related to the Use of Heart Rate Responsivity in Infancy Research

    ERIC Educational Resources Information Center

    Woodcock, James M.

    1971-01-01

    Methodological problems in measuring and interpreting infantile heart rate reactivity in research are discussed. Various ways of describing cardiac activity are listed. Attention is given to the relationship between resting state and heart rate responsivity. (Author/WY)

  15. Load and resistance factor rating (LRFR) in NYS : volume I final report.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...

  16. Load and resistance factor rating (LRFR) in New York State : volume I.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The propo...

  17. Highlights in emergency medicine medical education research: 2008.

    PubMed

    Farrell, Susan E; Coates, Wendy C; Khun, Gloria J; Fisher, Jonathan; Shayne, Philip; Lin, Michelle

    2009-12-01

    The purpose of this article is to highlight medical education research studies published in 2008 that were methodologically superior and whose outcomes were pertinent to teaching and education in emergency medicine. Through a PubMed search of the English language literature in 2008, 30 medical education research studies were independently identified as hypothesis-testing investigations and measurements of educational interventions. Six reviewers independently rated and scored all articles based on eight anchors, four of which related to methodologic criteria. Articles were ranked according to their total rating score. A ranking agreement among the reviewers of 83% was established a priori as a minimum for highlighting articles in this review. Five medical education research studies met the a priori criteria for inclusion and are reviewed and summarized here. Four of these employed experimental or quasi-experimental methodology. Although technology was not a component of the structured literature search employed to identify the candidate articles for this review, 14 of the articles identified, including four of the five highlighted articles, employed or studied technology as a focus of the educational research. Overall, 36% of the reviewed studies were supported by funding; three of the highlighted articles were funded studies. This review highlights quality medical education research studies published in 2008, with outcomes of relevance to teaching and education in emergency medicine. It focuses on research methodology, notes current trends in the use of technology for learning in emergency medicine, and suggests future avenues for continued rigorous study in education.

  18. Poster — Thur Eve — 56: Design of Quality Assurance Methodology for VMAT system on Agility System equipped with CVDR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thind, K; Tolakanahalli, R

    2014-08-15

    The aim of this study was to analyze the feasibility of designing comprehensive QA plans using iComCAT for Elekta machines equipped with the Agility multileaf collimator and continuously variable dose rate. Test plans with varying MLC speed, gantry speed, and dose rate were created and delivered in a controlled manner. A strip test was designed with three 1 cm MLC positions and delivered using dynamic, StepNShoot and VMAT techniques. Plans were also designed to test error in MLC position with various gantry speeds and various MLC speeds. The delivered fluence was captured using the electronic portal imaging device. Gantry speed was found to be within tolerance as per the Canadian standards. MLC positioning errors at higher MLC speeds, combined with gravity effects, add more than 2 mm of discrepancy. More tests need to be performed to evaluate MLC performance using independent measurement systems. The treatment planning system, with the end-to-end testing necessary for commissioning, was also investigated and found to have >95% passing rates within 3%/3mm gamma criteria. Future studies involve performing an off-axis gantry starshot pattern and repeating the tests on three matched Elekta linear accelerators.

  19. Joint Test Protocol: Environmentally Friendly Zirconium Oxide Pretreatment Demonstration

    DTIC Science & Technology

    2013-12-01

    Loss of paint adhesion is the primary failure mode for conversion pretreatment coatings on aluminum and steel; the test methodology for pencil hardness... There is a need to implement innovative and cost-effective replacement technologies to address the multiple health, safety...

  20. Determination of Time Dependent Virus Inactivation Rates

    NASA Astrophysics Data System (ADS)

    Chrysikopoulos, C. V.; Vogler, E. T.

    2003-12-01

    A methodology is developed for estimating temporally variable virus inactivation rate coefficients from experimental virus inactivation data. The methodology consists of a technique for slope estimation of normalized virus inactivation data in conjunction with a resampling parameter estimation procedure. The slope estimation technique is based on a relatively flexible geostatistical method known as universal kriging. Drift coefficients are obtained by nonlinear fitting of bootstrap samples and the corresponding confidence intervals are obtained by bootstrap percentiles. The proposed methodology yields more accurate time dependent virus inactivation rate coefficients than those estimated by fitting virus inactivation data to a first-order inactivation model. The methodology is successfully applied to a set of poliovirus batch inactivation data. Furthermore, the importance of accurate inactivation rate coefficient determination on virus transport in water saturated porous media is demonstrated with model simulations.
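The core of the methodology, estimating a time-dependent inactivation rate as the local negative slope of log-normalized concentration data with bootstrap percentile confidence intervals, can be sketched as follows. Simple finite-difference slopes stand in for the paper's universal-kriging estimator, and the batch data are hypothetical:

```python
import math
import random

# Hypothetical (time [h], C/C0) batch inactivation data.
times = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
conc = [1.0, 0.55, 0.33, 0.22, 0.16, 0.12]

def local_rates(t, c):
    """lambda(t) ~ -d ln(C/C0)/dt via forward differences on each interval."""
    logc = [math.log(x) for x in c]
    return [-(logc[i + 1] - logc[i]) / (t[i + 1] - t[i])
            for i in range(len(t) - 1)]

def bootstrap_ci(t, c, n_boot=2000, seed=1):
    """Percentile CI for the mean inactivation rate by resampling intervals."""
    rng = random.Random(seed)
    base = local_rates(t, c)
    means = []
    for _ in range(n_boot):
        sample = [rng.choice(base) for _ in base]
        means.append(sum(sample) / len(sample))
    means.sort()
    return means[int(0.025 * n_boot)], means[int(0.975 * n_boot)]

rates = local_rates(times, conc)
lo, hi = bootstrap_ci(times, conc)
```

A strictly first-order model would force a single constant rate; here the per-interval slopes are allowed to decline over time, which is the behavior the abstract reports for the poliovirus data.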

  1. Assessing the Impact of Clothing and Individual Equipment (CIE) on Soldier Physical, Biomechanical, and Cognitive Performance Part 1: Test Methodology

    DTIC Science & Technology

    2018-02-01

    Soldier Equipment Configuration Impact on Performance: Establishing a Test Methodology for the... Performance of Medium Rucksack Prototypes. An investigation: comparison of live-fire and weapon simulator test methodologies and of three extremity armor...

  2. Instruments to assess self-care among healthy children: A systematic review of measurement properties.

    PubMed

    Urpí-Fernández, Ana-María; Zabaleta-Del-Olmo, Edurne; Montes-Hidalgo, Javier; Tomás-Sábado, Joaquín; Roldán-Merino, Juan-Francisco; Lluch-Canut, María-Teresa

    2017-12-01

    To identify, critically appraise, and summarize the measurement properties of instruments to assess self-care in healthy children. Assessing self-care is an important consideration for nursing practice and nursing research, yet no systematic review has summarized measurement instruments validated in healthy children. Psychometric review in accordance with the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) panel. MEDLINE, CINAHL, PsycINFO, Web of Science and Open Grey were searched from their inception to December 2016. Validation studies with a healthy child population were included. The search was not restricted by language. Two reviewers independently assessed the methodological quality of included studies using the COSMIN checklist. Eleven studies, assessing the measurement properties of ten instruments, were included in the review. There was a maximum of two studies per instrument. None of the studies evaluated test-retest reliability, measurement error, criterion validity, or responsiveness. Internal consistency and structural validity were rated as "excellent" or "good" in four studies. Four studies were rated as "excellent" in content validity. Cross-cultural validity was rated as "poor" in the two studies (three instruments) in which cultural adaptation was carried out. The available evidence does not allow firm conclusions about the reliability and validity of the instruments identified. Future research should focus on generating evidence about a wider range of measurement properties of these instruments using a rigorous methodology, as well as on testing the instruments in different countries and child populations. © 2017 John Wiley & Sons Ltd.

  3. Development of Improved Accelerated Corrosion Qualification Test Methodology for Aerospace Materials

    DTIC Science & Technology

    2014-11-01

    Exposures included irradiation and ozone gas. A cumulative damage model for predicting atmospheric corrosion rates of 1010 steel was developed using inputs from weather data: temperature, relative humidity (%RH), and levels of atmospheric contaminants (chloride, SO2, and ozone). Materials tested included silver, Al alloys 7075, 2024, and 6061, copper, and steel, with an ozone generator and ozone monitor in the exposure setup.

  4. 42 CFR 413.337 - Methodology for calculating the prospective payment rates.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...-STAGE RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.337 Methodology for calculating the...

  5. 42 CFR 413.337 - Methodology for calculating the prospective payment rates.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-STAGE RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.337 Methodology for calculating the...

  6. Evaluation of the Carefusion Alaris PC infusion pump for hyperbaric oxygen therapy conditions: Technical report.

    PubMed

    Smale, Andrew; Tsouras, Theo

    2017-01-01

    We present a standardized test methodology and results for our evaluation of the Carefusion Alaris PC infusion pump, comprising the model 8015 PC Unit and the model 8100 Large Volume Pump (LVP) module. The evaluation consisted of basic suitability testing, internal component inspection, surface temperature measurement of selected internal components, and critical performance testing (infusion rate accuracy and occlusion alarm pressure) during conditions of typical hyperbaric oxygen (HBO₂) treatment in our facility's class A multiplace chamber. We have found that the pumps pose no enhanced risk as an ignition source, and that the pumps operate within manufacturer's specifications for flow rate and occlusion alarms at all stages of HBO₂ treatments, up to 4.0 ATA and pressurization and depressurization rates up to 180 kPa/minute. The pumps do not require purging with air or nitrogen and can be used unmodified, subject to the following conditions: pumps are undamaged, clean, fully charged, and absent from alcohol cleaning residue; pumps are powered from the internal NiMH battery only; maximum pressure exposure 4.0 ATA; maximum pressurization and depressurization rate of 180 kPa/minute; LVP modules locked in place with retaining screws. Copyright© Undersea and Hyperbaric Medical Society.

  7. MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes

    USGS Publications Warehouse

    Williams, B.K.

    1988-01-01

    Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
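The value-improvement technique mentioned above can be illustrated with textbook value iteration on a small discounted, stochastic MDP; the two-state "population" example below is a hypothetical stand-in for the mallard harvest model:

```python
# States: 0 = low population, 1 = high population.
# Actions: 0 = low harvest, 1 = high harvest.
# P[a][s][s2] = transition probability, R[a][s] = expected reward (assumed values).
P = [
    [[0.7, 0.3], [0.2, 0.8]],   # low harvest: population tends to grow
    [[0.9, 0.1], [0.6, 0.4]],   # high harvest: population tends to shrink
]
R = [[0.0, 1.0], [0.5, 2.0]]    # high harvest pays more now
GAMMA = 0.95                    # discount factor

def value_iteration(tol=1e-8):
    """Iterate V <- max_a [R + gamma * P V] until the update is below tol."""
    V = [0.0, 0.0]
    while True:
        Q = [[R[a][s] + GAMMA * sum(P[a][s][s2] * V[s2] for s2 in range(2))
              for a in range(2)] for s in range(2)]
        new_v = [max(Q[s]) for s in range(2)]
        if max(abs(new_v[s] - V[s]) for s in range(2)) < tol:
            # Greedy policy with respect to the converged values.
            policy = [max(range(2), key=lambda a: Q[s][a]) for s in range(2)]
            return new_v, policy
        V = new_v

V, policy = value_iteration()
```

Policy-improvement methods reach the same fixed point by alternating policy evaluation with greedy updates; for discounted problems like this one, both families converge to the optimal policy.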

  8. Novel methodology to obtain salient biomechanical characteristics of insole materials.

    PubMed

    Lavery, L A; Vela, S A; Ashry, H R; Lanctot, D R; Athanasiou, K A

    1997-06-01

    Viscoelastic inserts are commonly used as artificial shock absorbers to prevent neuropathic foot ulcerations by decreasing pressure on the sole of the foot. Unfortunately, there is little scientific information available to guide physicians in the selection of appropriate insole materials. Therefore, a novel methodology was developed to form a rational platform for biomechanical characterizations of insole material durability, which consisted of in vivo gait analysis and in vitro bioengineering measurements. Results show significant differences in the compressive stiffness of the tested insoles and the rate of change over time in both compressive stiffness and peak pressures measured. Good correlations were found between pressure-time integral and Young's modulus (r2 = 0.93), and total energy applied and Young's modulus (r2 = 0.87).

  9. The inverse problem of brain energetics: ketone bodies as alternative substrates

    NASA Astrophysics Data System (ADS)

    Calvetti, D.; Occhipinti, R.; Somersalo, E.

    2008-07-01

    Little is known about brain energy metabolism under ketosis, although there is evidence that ketone bodies have a neuroprotective role in several neurological disorders. We investigate the inverse problem of estimating reaction fluxes and transport rates in the different cellular compartments of the brain, when the data amounts to a few measured arterial venous concentration differences. By using a recently developed methodology to perform Bayesian Flux Balance Analysis and a new five compartment model of the astrocyte-glutamatergic neuron cellular complex, we are able to identify the preferred biochemical pathways during shortage of glucose and in the presence of ketone bodies in the arterial blood. The analysis is performed in a minimally biased way, therefore revealing the potential of this methodology for hypothesis testing.

  10. Systematic review of communication partner training in aphasia: methodological quality.

    PubMed

    Cherney, Leora R; Simmons-Mackie, Nina; Raymer, Anastasia; Armstrong, Elizabeth; Holland, Audrey

    2013-10-01

    Twenty-three studies identified from a previous systematic review examining the effects of communication partner training on persons with aphasia and their communication partners were evaluated for methodological quality. Two reviewers rated the studies on defined methodological quality criteria relevant to each study design. There were 11 group studies, seven single-subject participant design studies, and five qualitative studies. Quality scores were derived for each study. The mean inter-rater reliability of scores for each study design ranged from 85-93%, with Cohen's Kappa indicating substantial agreement between raters. Methodological quality of research on communication partner training in aphasia was highly varied. Overall, group studies employed the least rigorous methodology as compared to single subject and qualitative research. Only two of 11 group studies complied with more than half of the quality criteria. No group studies reported therapist blinding and only one group study reported participant blinding. Across all types of studies, the criterion of treatment fidelity was most commonly omitted. Failure to explicitly report certain methodological quality criteria may account for low ratings. Using methodological rating scales specific to the type of study design may help improve the methodological quality of aphasia treatment studies, including those on communication partner training.

  11. CO2 Washout Testing of the REI and EM-ACES Space Suits

    NASA Technical Reports Server (NTRS)

    Mitchell, Kathryn C.; Norcross, Jason

    2012-01-01

    When a space suit is used during ground testing, adequate carbon dioxide (CO2) washout must be provided for the suited subject. Symptoms of acute CO2 exposure depend on partial pressure of CO2 (ppCO2), metabolic rate of the subject, and other factors. This test was done to characterize inspired oronasal ppCO2 in the Rear Entry I-Suit (REI) and the Enhanced Mobility Advanced Crew Escape Suit (EM-ACES) for a range of workloads and flow rates for which ground testing is nominally performed. Three subjects were tested in each suit. In all but one case, each subject performed the test twice. Suit pressure was maintained at 4.3 psid. Subjects wore the suit while resting, performing arm ergometry, and walking on a treadmill to generate metabolic workloads of about 500 to 3000 BTU/hr. Supply airflow was varied between 6, 5, and 4 actual cubic feet per minute (ACFM) at each workload. Subjects wore an oronasal mask with an open port in front of the mouth and were allowed to breathe freely. Oronasal ppCO2 was monitored in real time by gas analyzers with sampling tubes connected to the mask. Metabolic rate was calculated from the total CO2 production measured by an additional gas analyzer at the suit air outlet. Real-time metabolic rate was used to adjust the arm ergometer or treadmill workload to meet target metabolic rates. In both suits, inspired CO2 was affected mainly by the metabolic rate of the subject: increased metabolic rate significantly (P < 0.05) increased inspired ppCO2. Decreased air flow caused small increases in inspired ppCO2. The effect of flow was more evident at metabolic rates ≥ 2000 BTU/hr. CO2 washout values of the EM-ACES were slightly but not significantly better than those of the REI suit. Regression equations were developed for each suit to predict the mean inspired ppCO2 as a function of metabolic rate and suit flow rate. 
This paper provides detailed descriptions of the test hardware, methodology, and results as well as implications for future ground testing in the REI-suit and EM-ACES.
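Fitting regression equations that predict mean inspired ppCO2 from metabolic rate and suit flow rate is ordinary least squares with two predictors plus an intercept. A minimal sketch on synthetic data; the coefficients are made up for illustration and are not the paper's:

```python
import numpy as np

# Hypothetical (metabolic rate [BTU/hr], flow [ACFM], inspired ppCO2 [mmHg])
# generated from ppCO2 = b0 + b1*met + b2*flow with assumed coefficients.
b_true = (1.0, 0.004, -0.8)
met = [500, 1000, 1500, 2000, 2500, 3000, 1000, 2000]
flow = [6, 6, 5, 5, 4, 4, 4, 6]
y = [b_true[0] + b_true[1] * m + b_true[2] * f for m, f in zip(met, flow)]

# Design matrix with an intercept column; solve the least-squares problem.
A = np.column_stack([np.ones(len(met)), met, flow])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(m, f):
    """Mean inspired ppCO2 predicted at metabolic rate m and flow f."""
    return coef[0] + coef[1] * m + coef[2] * f
```

With the assumed negative flow coefficient, the fitted surface reproduces the qualitative finding above: higher metabolic rate raises inspired ppCO2 and higher supply flow lowers it.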

  12. 75 FR 62403 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-08

    ... Project: 2011-2014 National Survey on Drug Use and Health: Methodological Field Tests (OMB No. 0930-0290..., SAMHSA received a three-year renewal of its generic clearance for methodological field tests. This will be a request for another renewal of the generic approval to continue methodological tests over the...

  13. Eye-Tracking as a Tool in Process-Oriented Reading Test Validation

    ERIC Educational Resources Information Center

    Solheim, Oddny Judith; Uppstad, Per Henning

    2011-01-01

    The present paper addresses the continuous need for methodological reflection on how to validate inferences made on the basis of test scores. Validation is a process that requires many lines of evidence. In this article we discuss the potential of eye tracking methodology in process-oriented reading test validation. Methodological considerations…

  14. A comparative review of nurse turnover rates and costs across countries.

    PubMed

    Duffield, Christine M; Roche, Michael A; Homer, Caroline; Buchan, James; Dimitrelis, Sofia

    2014-12-01

    To compare nurse turnover rates and costs from four studies in four countries (US, Canada, Australia, New Zealand) that used the same costing methodology: the original Nursing Turnover Cost Calculation Methodology. Measuring and comparing the costs and rates of turnover is difficult because of differences in definitions and methodologies. Comparative review. Searches were carried out within CINAHL, Business Source Complete and Medline for studies that used the original Nursing Turnover Cost Calculation Methodology and reported on both costs and rates of nurse turnover, published in or before 2014. A comparative review of turnover data was conducted using four studies that employed the original Nursing Turnover Cost Calculation Methodology. Costing data items were converted to percentages, while total turnover costs were converted to US 2014 dollars and adjusted according to inflation rates, to permit cross-country comparisons. Despite using the same methodology, Australia reported significantly higher turnover costs ($48,790) due to higher termination (~50% of indirect costs) and temporary replacement costs (~90% of direct costs). Costs were almost 50% lower in the US ($20,561), Canada ($26,652) and New Zealand ($23,711). Turnover rates also varied significantly across countries, with the highest rate reported in New Zealand (44·3%), followed by the US (26·8%), Canada (19·9%) and Australia (15·1%). A significant proportion of turnover costs is attributed to temporary replacement, highlighting the importance of nurse retention. The authors suggest a minimum dataset is also required to eliminate potential variability across countries, states, hospitals and departments. © 2014 John Wiley & Sons Ltd.
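The cross-country comparison step, converting each study's total turnover cost to US 2014 dollars and adjusting for inflation, can be sketched as below. The exchange rates, inflation factors, and local costs are placeholder values, not those used in the review:

```python
def to_usd_2014(local_cost, usd_per_local_unit, cumulative_inflation):
    """Convert a study-year, local-currency cost to US 2014 dollars.

    cumulative_inflation is the price-level ratio 2014 / study year,
    e.g. 1.25 means prices rose 25% between the study year and 2014.
    """
    return local_cost * usd_per_local_unit * cumulative_inflation

# Placeholder inputs per country: (local cost, FX rate to USD, inflation factor).
studies = {
    "Australia":   (45_000, 0.90, 1.20),
    "US":          (15_000, 1.00, 1.30),
    "Canada":      (22_000, 0.91, 1.25),
    "New Zealand": (21_000, 0.83, 1.28),
}

costs_2014 = {country: to_usd_2014(*v) for country, v in studies.items()}
```

Normalizing to one currency and one price year is what makes the per-nurse cost figures directly comparable despite the studies being run in different years and currencies.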

  15. An investigation of the efficacy of collaborative virtual reality systems for moderated remote usability testing.

    PubMed

    Chalil Madathil, Kapil; Greenstein, Joel S

    2017-11-01

    Collaborative virtual reality-based systems have integrated high fidelity voice-based communication, immersive audio and screen-sharing tools into virtual environments. Such three-dimensional collaborative virtual environments can mirror the collaboration among usability test participants and facilitators when they are physically collocated, potentially enabling moderated usability tests to be conducted effectively when the facilitator and participant are located in different places. We developed a virtual collaborative three-dimensional remote moderated usability testing laboratory and employed it in a controlled study to compare the effectiveness of moderated usability testing in a collaborative virtual reality-based environment with two other moderated usability testing methods: the traditional lab approach and Cisco WebEx, a web-based conferencing and screen sharing approach. Using a mixed methods experimental design, 36 test participants and 12 test facilitators were asked to complete representative tasks on a simulated online shopping website. The dependent variables included the time taken to complete the tasks; the usability defects identified and their severity; and the subjective ratings on the workload index, presence and satisfaction questionnaires. Remote moderated usability testing using a collaborative virtual reality system performed similarly to the other two methodologies in terms of the total number of defects identified, the number of high-severity defects identified, and the time taken to complete the tasks. The overall workload experienced by the test participants and facilitators was lowest in the traditional lab condition. No significant differences were identified for the workload experienced with the virtual reality and the WebEx conditions. However, test participants experienced greater involvement and a more immersive experience in the virtual environment than in the WebEx condition. 
The ratings for the virtual environment condition were not significantly different from those for the traditional lab condition. The results of this study suggest that participants were productive and enjoyed the virtual lab condition, indicating the potential of a virtual world based approach as an alternative to conventional approaches for synchronous usability testing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Inhibition of acetylcholinesterase in guppies (Poecilia reticulata) by chlorpyrifos at sublethal concentrations: Methodological aspects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van der Wel, H.; Welling, W.

    1989-04-01

    Acetylcholinesterase activity is a potential biochemical indicator of toxic stress in fish and a sensitive parameter for testing water for the presence of organophosphates. A number of methodological aspects regarding the determination of the in vivo effect of chlorpyrifos on acetylcholinesterase in guppies have been investigated. It was found that with acetylthiocholine as a substrate, the contribution of pseudocholinesterase to the total cholinesterase activity can be neglected. Protection of acetylcholinesterase of guppies exposed to chlorpyrifos from additional, artifactual in vitro enzyme inhibition during homogenization is necessary. Very low concentrations of acetone in the exposure medium, resulting from dilution of the stock solution of chlorpyrifos in acetone, can result in large decreases in the oxygen content of this medium. This may affect the uptake rate of the toxic compound and, thereby, cholinesterase inhibition. Very low, sublethal concentrations of chlorpyrifos result in high inhibition levels of acetylcholinesterase (80-90%) in guppies within 2 weeks of continuous exposure. Recovery of the enzyme activity occurs after the exposed animals are kept in clean medium for 4 days, but the rate of recovery is considerably lower than the rate of inhibition.

  17. The Relevance of External Quality Assessment for Molecular Testing for ALK Positive Non-Small Cell Lung Cancer: Results from Two Pilot Rounds Show Room for Optimization

    PubMed Central

    Tembuyser, Lien; Tack, Véronique; Zwaenepoel, Karen; Pauwels, Patrick; Miller, Keith; Bubendorf, Lukas; Kerr, Keith; Schuuring, Ed; Thunnissen, Erik; Dequeker, Elisabeth M. C.

    2014-01-01

    Background and Purpose: Molecular profiling should be performed on all advanced non-small cell lung cancer with non-squamous histology to allow treatment selection. Currently, this should include EGFR mutation testing and testing for ALK rearrangements. ROS1 is another emerging target. ALK rearrangement status is a critical biomarker to predict response to tyrosine kinase inhibitors such as crizotinib. To promote high quality testing in non-small cell lung cancer, the European Society of Pathology has introduced an external quality assessment scheme. This article summarizes the results of the first two pilot rounds organized in 2012–2013. Materials and Methods: Tissue microarray slides consisting of cell lines and resection specimens were distributed with the request for routine ALK testing using IHC or FISH. Participation in ALK FISH testing included the interpretation of four digital FISH images. Results: Data from 173 different laboratories were obtained. Results demonstrate decreased error rates in the second round for both ALK FISH and ALK IHC, although the error rates were still high and the need for external quality assessment in laboratories performing ALK testing is evident. Error rates obtained by FISH were lower than by IHC. The lowest error rates were observed for the interpretation of digital FISH images. Conclusion: There was a large variety in FISH enumeration practices. Based on the results from this study, recommendations for the methodology, analysis, interpretation and result reporting were issued. External quality assessment is a crucial element to improve the quality of molecular testing. PMID:25386659

  18. The relevance of external quality assessment for molecular testing for ALK positive non-small cell lung cancer: results from two pilot rounds show room for optimization.

    PubMed

    Tembuyser, Lien; Tack, Véronique; Zwaenepoel, Karen; Pauwels, Patrick; Miller, Keith; Bubendorf, Lukas; Kerr, Keith; Schuuring, Ed; Thunnissen, Erik; Dequeker, Elisabeth M C

    2014-01-01

    Molecular profiling should be performed on all advanced non-small cell lung cancer with non-squamous histology to allow treatment selection. Currently, this should include EGFR mutation testing and testing for ALK rearrangements. ROS1 is another emerging target. ALK rearrangement status is a critical biomarker to predict response to tyrosine kinase inhibitors such as crizotinib. To promote high quality testing in non-small cell lung cancer, the European Society of Pathology has introduced an external quality assessment scheme. This article summarizes the results of the first two pilot rounds organized in 2012-2013. Tissue microarray slides consisting of cell lines and resection specimens were distributed with the request for routine ALK testing using IHC or FISH. Participation in ALK FISH testing included the interpretation of four digital FISH images. Data from 173 different laboratories were obtained. Results demonstrate decreased error rates in the second round for both ALK FISH and ALK IHC, although the error rates were still high and the need for external quality assessment in laboratories performing ALK testing is evident. Error rates obtained by FISH were lower than by IHC. The lowest error rates were observed for the interpretation of digital FISH images. There was a large variety in FISH enumeration practices. Based on the results from this study, recommendations for the methodology, analysis, interpretation and result reporting were issued. External quality assessment is a crucial element to improve the quality of molecular testing.

  19. Application of an Artificial Neural Network to the Prediction of OH Radical Reaction Rate Constants for Evaluating Global Warming Potential.

    PubMed

    Allison, Thomas C

    2016-03-03

    Rate constants for reactions of chemical compounds with hydroxyl radical are a key quantity used in evaluating the global warming potential of a substance. Experimental determination of these rate constants is essential, but it can also be difficult and time-consuming to produce. High-level quantum chemistry predictions of the rate constant can suffer from the same issues. Therefore, it is valuable to devise estimation schemes that can give reasonable results on a variety of chemical compounds. In this article, the construction and training of an artificial neural network (ANN) for the prediction of rate constants at 298 K for reactions of hydroxyl radical with a diverse set of molecules is described. Input to the ANN consists of counts of the chemical bonds and bends present in the target molecule. The ANN is trained using 792 (•)OH reaction rate constants taken from the NIST Chemical Kinetics Database. The mean unsigned percent error (MUPE) for the training set is 12%, and the MUPE of the testing set is 51%. It is shown that the present methodology yields rate constants of reasonable accuracy for a diverse set of inputs. The results are compared to high-quality literature values and to another estimation scheme. This ANN methodology is expected to be of use in a wide range of applications for which (•)OH reaction rate constants are required. The model uses only information that can be gathered from a 2D representation of the molecule, making the present approach particularly appealing, especially for screening applications.
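    The mean unsigned percent error (MUPE) reported for the training and testing sets can be computed as a simple pointwise average; a minimal sketch (the function name and sample rate constants are illustrative, not from the paper):

```python
def mupe(predicted, actual):
    """Mean unsigned percent error, in percent, between predicted and
    actual rate constants (element-wise)."""
    if len(predicted) != len(actual) or not predicted:
        raise ValueError("inputs must be equal-length, non-empty sequences")
    return 100.0 * sum(abs(p - a) / abs(a)
                       for p, a in zip(predicted, actual)) / len(predicted)

# Hypothetical predictions vs. reference values (arbitrary units):
print(round(mupe([1.1, 0.9, 2.0], [1.0, 1.0, 2.0]), 1))  # 6.7
```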

  20. A systematic review of randomized trials assessing human papillomavirus testing in cervical cancer screening

    PubMed Central

    Patanwala, Insiyyah Y.; Bauer, Heidi M.; Miyamoto, Justin; Park, Ina U.; Huchko, Megan J.; Smith-McCune, Karen K.

    2013-01-01

    Our objective was to assess the sensitivity and specificity of human papillomavirus (HPV) testing for cervical cancer screening in randomized trials. We conducted a systematic literature search of the following databases: MEDLINE, CINAHL, EMBASE, and Cochrane. Eligible studies were randomized trials comparing HPV-based to cytology-based screening strategies, with disease status determined by colposcopy/biopsy for participants with positive results. Disease rates (cervical intraepithelial neoplasia [CIN]2 or greater and CIN3 or greater), sensitivity, and positive predictive value were abstracted or calculated from the articles. Six studies met inclusion criteria. Relative sensitivities of HPV testing-based strategies vs cytology for detecting CIN3 or greater ranged from 0.8 to 2.1. The main limitation of our study was that testing methodologies and screening/management protocols were highly variable across studies. Screening strategies in which a single initial HPV-positive test led to colposcopy were more sensitive than cytology but resulted in higher colposcopy rates. These results have implications for cotesting with HPV and cytology as recommended in the United States. PMID:23159693
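    The sensitivity and positive predictive value abstracted by the review come from standard 2×2 screening counts; a minimal sketch with hypothetical numbers (not data from any of the six trials):

```python
def sensitivity(true_pos, false_neg):
    """Fraction of diseased participants the test detects: TP / (TP + FN)."""
    return true_pos / (true_pos + false_neg)

def ppv(true_pos, false_pos):
    """Fraction of positive results with true disease: TP / (TP + FP)."""
    return true_pos / (true_pos + false_pos)

# Hypothetical CIN3+ detection counts in two arms of a trial:
hpv_sens = sensitivity(true_pos=90, false_neg=10)    # 0.90
cyto_sens = sensitivity(true_pos=60, false_neg=40)   # 0.60
print(round(hpv_sens / cyto_sens, 2))  # relative sensitivity: 1.5
```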

  1. ChargeOut! : discounted cash flow compared with traditional machine-rate analysis

    Treesearch

    Ted Bilek

    2008-01-01

    ChargeOut!, a discounted cash-flow methodology in spreadsheet format for analyzing machine costs, is compared with traditional machine-rate methodologies. Four machine-rate models are compared and a common data set representative of logging skidders’ costs is used to illustrate the differences between ChargeOut! and the machine-rate methods. The study found that the...
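    The key difference between ChargeOut!'s discounted cash-flow approach and a traditional machine-rate average is the time value of money; a simplified sketch (hypothetical costs and discount rate, not the ChargeOut! spreadsheet itself):

```python
def npv(rate, cashflows):
    """Net present value of end-of-year cashflows for years 1..n."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

def equivalent_annual_cost(rate, cashflows):
    """Level annual payment with the same NPV as the cost stream."""
    n = len(cashflows)
    annuity_factor = (1 - (1 + rate) ** -n) / rate
    return npv(rate, cashflows) / annuity_factor

# Hypothetical skidder operating costs over three years, 6% discount rate:
costs = [50_000, 52_000, 54_000]
machine_rate_avg = sum(costs) / len(costs)   # simple average: 52,000
eac = equivalent_annual_cost(0.06, costs)    # discounting weights early years more
print(eac < machine_rate_avg)  # True
```

Because discounting down-weights the later, more expensive years, the discounted cost per year here falls below the undiscounted machine-rate average.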

  2. Force on Force Modeling with Formal Task Structures and Dynamic Geometry

    DTIC Science & Technology

    2017-03-24

    task framework, derived using the MMF methodology to structure a complex mission. It further demonstrated the integration of effects from a range of...application methodology was intended to support a combined developmental testing (DT) and operational testing (OT) strategy for selected systems under test... methodology to develop new or modify existing Models and Simulations (M&S) to: • Apply data from multiple, distributed sources (including test

  3. Field Test of the Methodology for Succession Planning for Technical Experts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cain, Ronald A.; Kirk, Bernadette Lugue; Agreda, Carla L.

    This report complements A Methodology for Succession Planning for Technical Experts (Ron Cain, Shaheen Dewji, Carla Agreda, Bernadette Kirk, July 2017), which describes a methodology for identifying and evaluating the loss of key technical skills at nuclear operations facilities. This report targets the methodology for identifying critical skills, hereafter referred to as “core competencies”. The methodology has been field tested by interviewing selected retiring subject matter experts (SMEs).

  4. Whole-Genome Sequencing and Assembly with High-Throughput, Short-Read Technologies

    PubMed Central

    Sundquist, Andreas; Ronaghi, Mostafa; Tang, Haixu; Pevzner, Pavel; Batzoglou, Serafim

    2007-01-01

    While recently developed short-read sequencing technologies may dramatically reduce the sequencing cost and eventually achieve the $1000 goal for re-sequencing, their limitations prevent the de novo sequencing of eukaryotic genomes with the standard shotgun sequencing protocol. We present SHRAP (SHort Read Assembly Protocol), a sequencing protocol and assembly methodology that utilizes high-throughput short-read technologies. We describe a variation on hierarchical sequencing with two crucial differences: (1) we select a clone library from the genome randomly rather than as a tiling path and (2) we sample clones from the genome at high coverage and reads from the clones at low coverage. We assume that 200 bp read lengths with a 1% error rate and inexpensive random fragment cloning on whole mammalian genomes is feasible. Our assembly methodology is based on first ordering the clones and subsequently performing read assembly in three stages: (1) local assemblies of regions significantly smaller than a clone size, (2) clone-sized assemblies of the results of stage 1, and (3) chromosome-sized assemblies. By aggressively localizing the assembly problem during the first stage, our method succeeds in assembling short, unpaired reads sampled from repetitive genomes. We tested our assembler using simulated reads from D. melanogaster and human chromosomes 1, 11, and 21, and produced assemblies with large sets of contiguous sequence and a misassembly rate comparable to other draft assemblies. Tested on D. melanogaster and the entire human genome, our clone-ordering method produces accurate maps, thereby localizing fragment assembly and enabling the parallelization of the subsequent steps of our pipeline. Thus, we have demonstrated that truly inexpensive de novo sequencing of mammalian genomes will soon be possible with high-throughput, short-read technologies using our methodology. PMID:17534434
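    The clone- and read-coverage trade-off underlying SHRAP can be illustrated with the idealized Lander-Waterman model, in which a target sequenced to depth c has an expected covered fraction of 1 − e^(−c); this is a textbook simplification, and the numbers below are hypothetical:

```python
import math

def fraction_covered(coverage):
    """Idealized Lander-Waterman estimate: expected fraction of a target
    covered by at least one read at the given sequencing depth."""
    return 1.0 - math.exp(-coverage)

def depth(read_len, n_reads, target_len):
    """Average sequencing depth (coverage) over the target."""
    return read_len * n_reads / target_len

# Hypothetical: 3,000 reads of 200 bp over a 150 kb clone gives 4x depth.
c = depth(200, 3000, 150_000)
print(c, round(fraction_covered(c), 3))  # 4.0 0.982
```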

  5. Targeted Pressure Management During CO 2 Sequestration: Optimization of Well Placement and Brine Extraction

    DOE PAGES

    Cihan, Abdullah; Birkholzer, Jens; Bianchi, Marco

    2014-12-31

    Large-scale pressure increases resulting from carbon dioxide (CO2) injection in the subsurface can potentially impact caprock integrity, induce reactivation of critically stressed faults, and drive CO2 or brine through conductive features into shallow groundwater. Pressure management involving the extraction of native fluids from storage formations can be used to minimize pressure increases while maximizing CO2 storage. However, brine extraction requires pumping, transportation, possibly treatment, and disposal of substantial volumes of extracted brackish or saline water, all of which can be technically challenging and expensive. This paper describes a constrained differential evolution (CDE) algorithm for optimal well placement and injection/extraction control with the goal of minimizing brine extraction while achieving predefined pressure constraints. The CDE methodology was tested on a simple optimization problem whose solution can be partially obtained with a gradient-based optimization methodology. The CDE successfully estimated the true global optimum for both the extraction well location and the extraction rate needed for the test problem. A more complex example application of the developed strategy was also presented for a hypothetical CO2 storage scenario in a heterogeneous reservoir with a critically stressed fault near an injection zone. Through the CDE optimization algorithm coupled to a numerical vertically-averaged reservoir model, we successfully estimated optimal rates and locations for CO2 injection and brine extraction wells while simultaneously satisfying multiple pressure buildup constraints to avoid fault activation and caprock fracturing. The study shows that the CDE methodology is also a very promising tool for other optimization problems related to GCS, such as reducing the ‘Area of Review’, monitoring design, reducing the risk of leakage, and increasing storage capacity and trapping.
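    The penalty-based flavor of constrained differential evolution can be conveyed with a minimal DE/rand/1/bin sketch; the toy objective stands in for a brine-extraction rate and the constraint for a pressure-buildup limit, and every parameter here is illustrative rather than from the paper:

```python
import random

def cde_minimize(obj, constraint, bounds, pop_size=30, gens=200,
                 f=0.7, cr=0.9, penalty=1e6, seed=1):
    """Minimal DE/rand/1/bin with a penalty for one constraint g(x) <= 0."""
    rng = random.Random(seed)

    def fitness(x):
        g = constraint(x)
        return obj(x) + (penalty * g if g > 0 else 0.0)

    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [min(max(a[d] + f * (b[d] - c[d]), lo), hi)
                     if rng.random() < cr else pop[i][d]
                     for d, (lo, hi) in enumerate(bounds)]
            if fitness(trial) <= fitness(pop[i]):      # greedy selection
                pop[i] = trial
    return min(pop, key=fitness)

# Toy stand-in: minimize an "extraction rate" x subject to a "pressure"
# constraint 10 - 2x <= 0, i.e. x >= 5; the constrained optimum is x = 5.
best = cde_minimize(obj=lambda x: x[0],
                    constraint=lambda x: 10 - 2 * x[0],
                    bounds=[(0.0, 20.0)])
print(round(best[0], 3))  # should land near the constrained optimum of 5.0
```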

  6. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation

    DTIC Science & Technology

    2016-05-01

    identifying and mapping flaw size distributions on glass surfaces for predicting mechanical response. International Journal of Applied Glass ...ARL-TN-0756 ● MAY 2016 US Army Research Laboratory Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation...Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation by Clayton M Weiss Oak Ridge Institute for Science and Education

  7. 75 FR 78720 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-16

    .... Proposed Project: 2011-2014 National Survey on Drug Use and Health: Methodological Field Tests (OMB No..., SAMHSA received a 3-year renewal of its generic clearance for methodological field tests. This will be a request for another renewal of the generic approval to continue methodological tests over the next 3 years...

  8. Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments

    DTIC Science & Technology

    2016-03-24

    NUWC-NPT Technical Report 12,186 24 March 2016 Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology , and...Microfibers, Test Methodology , and Experiments 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) Anthony B...5 4 MEASUREMENTS AND EXPERIMENTAL APPARATUS ...........................................9 5 SAMPLE PREPARATION

  9. TESTING TREE-CLASSIFIER VARIANTS AND ALTERNATE MODELING METHODOLOGIES IN THE EAST GREAT BASIN MAPPING UNIT OF THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT (SW REGAP)

    EPA Science Inventory

    We tested two methods for dataset generation and model construction, and three tree-classifier variants to identify the most parsimonious and thematically accurate mapping methodology for the SW ReGAP project. Competing methodologies were tested in the East Great Basin mapping un...

  10. 76 FR 67515 - Self-Regulatory Organizations; Chicago Mercantile Exchange, Inc.; Notice of Filing and Order...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-01

    ..., determined by the Clearing House using stress test methodology equal to the theoretical two largest IRS Clearing Member losses produced by such stress test or such other methodology determined by the IRS Risk... portion, determined by the Clearing House using stress test methodology equal to the theoretical third and...

  11. De-individualized psychophysiological strain assessment during a flight simulation test—Validation of a space methodology

    NASA Astrophysics Data System (ADS)

    Johannes, Bernd; Salnitski, Vyacheslav; Soll, Henning; Rauch, Melina; Hoermann, Hans-Juergen

    For the evaluation of an operator's skill, reliability indicators of work quality as well as of psychophysiological states during the work have to be considered. The methodology and measurement equipment presented herein were developed and tested in numerous terrestrial and space experiments using a simulation of a spacecraft docking on a space station. However, in this study the method was applied to a comparable terrestrial task—the flight simulator test (FST) used in the DLR selection procedure for ab initio pilot applicants for passenger airlines. This provided a large amount of data for statistical verification of the space methodology. For the evaluation of the strain level of applicants during the FST, psychophysiological measurements were used to construct a "psychophysiological arousal vector" (PAV), which is sensitive to various individual reaction patterns of the autonomic nervous system to mental load. Its changes and increases are interpreted as "strain". In the first evaluation study, 614 subjects were analyzed. The subjects first underwent a calibration procedure for the assessment of their autonomic outlet type (AOT), and on the following day they performed the FST, which included three tasks and was evaluated by instructors applying well-established and standardized rating scales. This new method may promote a wide range of future applications in aviation and space psychology.

  12. Utilization of genetic tests: analysis of gene-specific billing in Medicare claims data.

    PubMed

    Lynch, Julie A; Berse, Brygida; Dotson, W David; Khoury, Muin J; Coomer, Nicole; Kautter, John

    2017-08-01

    We examined the utilization of precision medicine tests among Medicare beneficiaries through analysis of gene-specific tier 1 and 2 billing codes developed by the American Medical Association in 2012. We conducted a retrospective cross-sectional study. The primary source of data was 2013 Medicare 100% fee-for-service claims. We identified claims billed for each laboratory test, the number of patients tested, expenditures, and the diagnostic codes indicated for testing. We analyzed variations in testing by patient demographics and region of the country. Pharmacogenetic tests were billed most frequently, accounting for 48% of the expenditures for new codes. The most common indications for testing were breast cancer, long-term use of medications, and disorders of lipid metabolism. There was underutilization of guideline-recommended tumor mutation tests (e.g., epidermal growth factor receptor) and substantial overutilization of a test discouraged by guidelines (methylenetetrahydrofolate reductase). Methodology-based tier 2 codes represented 15% of all claims billed with the new codes. The highest rate of testing per beneficiary was in Mississippi and the lowest rate was in Alaska. Gene-specific billing codes significantly improved our ability to conduct population-level research of precision medicine. Analysis of these data in conjunction with clinical records should be conducted to validate findings. Genet Med advance online publication 26 January 2017.

  13. Classification of malignant and benign lung nodules using taxonomic diversity index and phylogenetic distance.

    PubMed

    de Sousa Costa, Robherson Wector; da Silva, Giovanni Lucca França; de Carvalho Filho, Antonio Oseas; Silva, Aristófanes Corrêa; de Paiva, Anselmo Cardoso; Gattass, Marcelo

    2018-05-23

    Lung cancer is the leading cause of cancer death worldwide and has one of the lowest survival rates after diagnosis. Therefore, this study proposes a methodology for diagnosing lung nodules as benign or malignant tumors based on image processing and pattern recognition techniques. Mean phylogenetic distance (MPD) and taxonomic diversity index (Δ) were used as texture descriptors. Finally, the genetic algorithm in conjunction with the support vector machine was applied to select the best training model. The proposed methodology was tested on computed tomography (CT) images from the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), with the best sensitivity of 93.42%, specificity of 91.21%, accuracy of 91.81%, and area under the ROC curve of 0.94. The results demonstrate the promising performance of texture extraction techniques using mean phylogenetic distance and taxonomic diversity index combined with phylogenetic trees. Graphical Abstract: Stages of the proposed methodology.

  14. Acquisition Challenge: The Importance of Incompressibility in Comparing Learning Curve Models

    DTIC Science & Technology

    2015-10-01

    parameters for all four learning models used in the study. The learning rate factor, b, is the slope of the linear regression line, which in this case is...incorporated within the DoD acquisition environment. This study tested three alternative learning models (the Stanford-B model, DeJong’s learning formula...appropriate tools to calculate accurate and reliable predictions. However, conventional learning curve methodology has been in practice since the pre

  15. 18 CFR 342.4 - Other rate changing methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Other rate changing methodologies. 342.4 Section 342.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... regard to the applicable ceiling level under § 342.3. (b) Market-based rates. A carrier may attempt to...

  16. Exact test-based approach for equivalence test with parameter margin.

    PubMed

    Cassie Dong, Xiaoyu; Bian, Yuanyuan; Tsong, Yi; Wang, Tianhua

    2017-01-01

    The equivalence test has a wide range of applications in pharmaceutical statistics in which we need to test for similarity between two groups. In recent years, the equivalence test has been used in assessing the analytical similarity between a proposed biosimilar product and a reference product. More specifically, the mean values of the two products for a given quality attribute are compared against an equivalence margin of the form ±f × σ_R, a function of the reference variability. In practice, this margin is unknown and is estimated from the sample as ±f × S_R. If we use this estimated margin with the classic t-test statistic in the equivalence test for the means, both Type I and Type II error rates may inflate. To resolve this issue, we develop an exact-based test method and compare it with other proposed methods, such as the Wald test, the constrained Wald test, and the Generalized Pivotal Quantity (GPQ), in terms of Type I error rate and power. Application of these methods to data analysis is also provided in this paper. This work focuses on the development and discussion of the general statistical methodology and is not limited to the application of analytical similarity.
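    A minimal sketch of the setup described above: two one-sided tests against the estimated margin ±f × S_R, using a large-sample normal quantile in place of the t critical value (illustrative only; the paper's point is that plugging in S_R this way can distort error rates, and its exact-based method addresses that):

```python
import statistics
from statistics import NormalDist

def equivalence_test(test_lot, ref_lot, f=1.5, alpha=0.05):
    """Declare mean-equivalence if both one-sided tests reject, using the
    estimated margin f * S_R (treated as fixed, the naive practice)."""
    m_t, m_r = statistics.mean(test_lot), statistics.mean(ref_lot)
    s_t, s_r = statistics.stdev(test_lot), statistics.stdev(ref_lot)
    se = (s_t**2 / len(test_lot) + s_r**2 / len(ref_lot)) ** 0.5
    margin = f * s_r
    z = NormalDist().inv_cdf(1 - alpha)  # large-sample critical value
    t_lower = (m_t - m_r + margin) / se  # H0: diff <= -margin
    t_upper = (m_t - m_r - margin) / se  # H0: diff >= +margin
    return t_lower > z and t_upper < -z

ref = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]
similar = [10.1, 10.0, 9.9, 10.2, 9.8, 10.0, 10.1, 9.9]
shifted = [x + 5.0 for x in similar]
print(equivalence_test(similar, ref), equivalence_test(shifted, ref))  # True False
```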

  17. Objective impairments of gait and balance in adults living with HIV-1 infection: a systematic review and meta-analysis of observational studies.

    PubMed

    Berner, Karina; Morris, Linzette; Baumeister, Jochen; Louw, Quinette

    2017-08-01

    Gait and balance deficits are reported in adults with HIV infection and are associated with reduced quality of life. Current research suggests an increased fall incidence in this population, with fall rates among middle-aged adults with HIV approximating those in seronegative elderly populations. Gait and postural balance rely on a complex interaction of the motor system, sensory control, and cognitive function. However, due to disease progression and complications related to ongoing inflammation, these systems may be compromised in people with HIV. Consequently, locomotor impairments may result that can contribute to higher-than-expected fall rates. The aim of this review was to synthesize the evidence regarding objective gait and balance impairments in adults with HIV, and to emphasize those which could contribute to increased fall risk. This review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. An electronic search of published observational studies was conducted in March 2016. Methodological quality was assessed using the NIH Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies. Narrative synthesis of gait and balance outcomes was performed, and meta-analyses where possible. Seventeen studies were included, with fair to low methodological quality. All studies used clinical tests for gait assessment. Gait outcomes assessed were speed, initiation time and cadence. No studies assessed kinetics or kinematics. Balance was assessed using both instrumented and clinical tests. Outcomes were mainly related to center of pressure, postural reflex latencies, and timed clinical tests. There is some agreement that adults with HIV walk slower and have increased center of pressure excursions and long-loop postural reflex latencies, particularly under challenging conditions. Gait and balance impairments exist in people with HIV, resembling fall-associated parameters in the elderly. 
Impairments are more pronounced during challenging conditions, might be associated with disease severity, are not influenced by antiretroviral therapy, and might not be associated with peripheral neuropathy. Results should be interpreted cautiously due to overall poor methodological quality and heterogeneity. Locomotor impairments in adults with HIV are currently insufficiently quantified. Future research involving more methodological uniformity is warranted to better understand such impairments and to inform clinical decision-making, including fall-prevention strategies, in this population.

  18. 76 FR 50993 - Agency Information Collection Activities: Proposed Collection; Comment Request-Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ...: Proposed Collection; Comment Request--Generic Clearance to Conduct Methodological Testing, Surveys, Focus... proposed information collection. This information collection will conduct research by methodological... Methodological Testing, Surveys, Focus Groups, and Related Tools to Improve the Management of Federal Nutrition...

  19. Analysis of Two Advanced Smoothing Algorithms.

    DTIC Science & Technology

    1985-09-01

    59 B. METHODOLOGY . ......... ........... 60 6 C. TESTING AND RESULTS ---- LINEAR UNDERLYING FUNCTION...SMOOTHING ALGORITHMS ...... .................... 94 A. GENERAL ......... ....................... .. 94 B. METHODOLOGY ............................ .95 C...to define succinctly. 59 B. METHODOLOGY There is no established procedure to follow in testing the efficiency and effectiveness of a smoothing

  20. Proposed Objective Odor Control Test Methodology for Waste Containment

    NASA Technical Reports Server (NTRS)

    Vos, Gordon

    2010-01-01

    The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentration quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented human odor detection threshold of 0.025 ppm, and a limit of quantitation of 15 ppb.
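    Comparing a measured mass concentration against ppm-based thresholds like those above requires a gas-phase unit conversion; a minimal sketch, assuming the conventional 24.45 L/mol molar volume at 25 °C and 1 atm and a molar mass of about 130.19 g/mol for isoamyl acetate (the measured value is hypothetical):

```python
MOLAR_VOLUME_25C = 24.45  # L/mol, ideal gas at 25 degrees C and 1 atm
MW_IAA = 130.19           # g/mol, isoamyl acetate

def mg_per_m3_to_ppm(mg_per_m3, mw):
    """Convert a mass concentration to parts per million by volume."""
    return mg_per_m3 * MOLAR_VOLUME_25C / mw

def ppm_to_mg_per_m3(ppm, mw):
    """Inverse conversion: ppm by volume to mg per cubic meter."""
    return ppm * mw / MOLAR_VOLUME_25C

# Is a hypothetical measured 0.2 mg/m3 of IAA above the documented
# 0.025 ppm human odor detection threshold?
print(mg_per_m3_to_ppm(0.2, MW_IAA) > 0.025)  # True
```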

  1. Numerical study on the sequential Bayesian approach for radioactive materials detection

    NASA Astrophysics Data System (ADS)

    Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng

    2013-01-01

    A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive detection. Compared with commonly adopted detection methods based on statistical theory, the sequential Bayesian approach offers the advantage of shorter verification time during the analysis of spectra that contain low total counts, especially for complex radionuclide compositions. In this paper, a simulation experiment platform implementing the sequential Bayesian approach was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator using Monte Carlo sampling theory to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented respectively by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
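    The sequential flavor of the approach can be sketched as a per-interval Bayesian update of the probability that a source is present, assuming Poisson counts at a known background rate versus background plus source (a simplification of Candy's formulation; all rates and counts here are hypothetical):

```python
import math

def poisson_pmf(k, lam):
    """Probability of observing k counts given Poisson mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def sequential_posterior(counts, bkg_rate, src_rate, prior=0.5):
    """After each counting interval, update P(source present) given
    Poisson counts with mean bkg_rate (background only) or
    bkg_rate + src_rate (source present)."""
    p = prior
    history = []
    for k in counts:
        l1 = poisson_pmf(k, bkg_rate + src_rate)  # source present
        l0 = poisson_pmf(k, bkg_rate)             # background only
        p = p * l1 / (p * l1 + (1 - p) * l0)
        history.append(p)
    return history

# Counts per interval drifting above a background of 3 counts/interval:
posterior = sequential_posterior([4, 6, 5, 7, 6], bkg_rate=3.0, src_rate=2.0)
print(posterior[-1] > 0.9)  # True: evidence for a source accumulates
```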

  2. [Optimization of the process of enzymatic hydrolysis of icariin to baohuoside I by cellulase based on Plackett-Burman design combined with CCD response surface methodology].

    PubMed

    Song, Chuan-xia; Chen, Hong-mei; Dai, Yu; Kang, Min; Hu, Jia; Deng, Yun

    2014-11-01

    To optimize the process by which icariin is hydrolyzed to baohuoside I by cellulase, using a Plackett-Burman design combined with central composite design (CCD) response surface methodology. The main influencing factors were selected by Plackett-Burman design, and CCD response surface methodology was then used to optimize the hydrolysis process. Taking substrate concentration, buffer pH, and reaction time as independent variables and the conversion rate of icariin as the dependent variable, a complete quadratic response surface was fitted between the independent and dependent variables; the optimum hydrolysis conditions were identified from 3D surface charts, and verification tests and predictive analyses were performed. The best enzymatic hydrolysis conditions were: substrate concentration 8.23 mg/mL, buffer pH 5.12, reaction time 35.34 h. The optimum process for hydrolyzing icariin to baohuoside I by cellulase was thus determined by Plackett-Burman design combined with CCD response surface methodology. The optimized enzymatic hydrolysis process is simple, convenient, accurate, reproducible, and predictable.
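    The full quadratic response-surface fit used in CCD designs is ordinary least squares on linear, squared, and interaction terms. The sketch below uses synthetic placeholder data (not the study's measurements); the "true" surface is an invented quadratic, so the fit recovers it exactly.

```python
import numpy as np

# Quadratic response-surface fit: conversion rate y regressed on substrate
# concentration (x1), buffer pH (x2) and reaction time (x3) with squared and
# interaction terms. Data are synthetic placeholders.
rng = np.random.default_rng(0)
X = rng.uniform([4, 4.5, 20], [12, 6, 48], size=(20, 3))   # x1, x2, x3
y = 60 - (X[:, 0] - 8)**2 - 5*(X[:, 1] - 5.1)**2 - 0.01*(X[:, 2] - 35)**2

def quadratic_design(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

beta, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
y_hat = quadratic_design(X) @ beta
print(np.allclose(y_hat, y, atol=1e-6))   # exact quadratic is recovered
```

    In practice the fitted surface would then be maximized (analytically or on a grid) to locate the optimum conditions, which is what the 3D surface charts visualize.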

  3. Attrition Rate of Oxygen Carriers in Chemical Looping Combustion Systems

    NASA Astrophysics Data System (ADS)

    Feilen, Harry Martin

    This project developed an evaluation methodology for determining, accurately and rapidly, the attrition resistance of oxygen carrier materials used in chemical looping technologies. Existing test protocols for evaluating the attrition resistance of granular materials are conducted under non-reactive, ambient-temperature conditions. They do not accurately reflect the actual behavior under the unique process conditions of chemical looping, including high temperatures and cyclic operation between oxidizing and reducing atmospheres. This project developed a test method and equipment that represent a significant improvement over existing protocols. Experimental results obtained from this project have shown that hematite exhibits different modes of attrition, arising both from mechanical stresses and from structural changes in the particles caused by chemical reaction at high temperature. The test methodology has also proven effective in tracking reactivity changes of the material with continued use, a property which, in addition to attrition, determines material life. Consumption/replacement cost due to attrition or loss of reactivity is a critical factor in the economic application of chemical looping technology. This test method will allow rapid evaluation of a wide range of materials that are best suited for this technology. The most important anticipated public benefit of this project is the acceleration of the development of chemical looping technology for lowering greenhouse gas emissions from fossil fuel combustion.

  4. SU-F-BRD-04: Robustness Analysis of Proton Breast Treatments Using An Alpha-Stable Distribution Parameterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van den Heuvel, F; Hackett, S; Fiorini, F

    Purpose: Currently, planning systems allow robustness calculations to be performed, but a generalized assessment methodology is not yet available. We introduce and evaluate a methodology to quantify the robustness of a plan on an individual patient basis. Methods: We introduce the notion of characterizing a treatment instance (i.e. one single fraction delivery) by describing the dose distribution within an organ as an alpha-stable distribution. The parameters of the distribution (shape(α), scale(γ), position(δ), and symmetry(β)) will vary continuously (in a mathematical sense) as the distributions change with the different positions. The rate of change of the parameters provides a measure of the robustness of the treatment. The methodology is tested in a planning study of 25 patients with known residual errors at each fraction. Each patient was planned using Eclipse with an IBA proton beam model. The residual error space for every patient was sampled 30 times, yielding 31 treatment plans for each patient and dose distributions in 5 organs. The parameters' change rate as a function of Euclidean distance from the original plan was analyzed. Results: More than 1,000 dose distributions were analyzed. For 4 of the 25 patients the change rate of the scale parameter (γ) was considerably higher than the lowest change rate, indicating a lack of robustness. The sign of the shape change rate (α) also seemed indicative, but the experiment lacked the power to prove significance. Conclusion: There are indications that this robustness measure is a valuable tool to allow a more patient-individualized approach to the determination of margins. In a further study we will also evaluate this robustness measure using photon treatments, and evaluate the impact of using breath-hold techniques and of a Monte Carlo-based dose deposition calculation. A principal component analysis is also planned.
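    The "change rate of a parameter as a function of Euclidean distance" can be estimated as the slope of a linear fit. The sketch below is a hedged illustration with synthetic numbers, not the study's data: a steep slope of the scale parameter γ versus plan shift flags a non-robust plan.

```python
import numpy as np

# Estimate the change rate of the fitted scale parameter (gamma) of each
# resampled plan versus its Euclidean distance from the nominal plan, as the
# slope of a straight-line fit. Data are synthetic placeholders.
def scale_change_rate(distances, gammas):
    slope, _ = np.polyfit(distances, gammas, 1)
    return slope

dist = np.linspace(0.0, 5.0, 31)       # shifts of 31 plan samples (mm)
robust = 2.0 + 0.01 * dist             # gamma nearly constant with shift
fragile = 2.0 + 0.40 * dist            # gamma grows quickly with shift

print(scale_change_rate(dist, robust) < scale_change_rate(dist, fragile))
```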

  5. Brittle materials at high-loading rates: an open area of research

    NASA Astrophysics Data System (ADS)

    Forquin, Pascal

    2017-01-01

    Brittle materials are extensively used in many civil and military applications involving high-strain-rate loadings such as: blasting or percussive drilling of rocks, ballistic impact against ceramic armour or transparent windshields, plastic explosives used to damage or destroy concrete structures, soft or hard impacts against concrete structures and so on. With all of these applications, brittle materials are subjected to intense loadings characterized by medium to extremely high strain rates (few tens to several tens of thousands per second) leading to extreme and/or specific damage modes such as multiple fragmentation, dynamic cracking, pore collapse, shearing, mode II fracturing and/or microplasticity mechanisms in the material. Additionally, brittle materials exhibit complex features such as a strong strain-rate sensitivity and confining pressure sensitivity that justify expending greater research efforts to understand these complex features. Currently, the most popular dynamic testing techniques used for this are based on the use of split Hopkinson pressure bar methodologies and/or plate-impact testing methods. However, these methods do have some critical limitations and drawbacks when used to investigate the behaviour of brittle materials at high loading rates. The present theme issue of Philosophical Transactions A provides an overview of the latest experimental methods and numerical tools that are currently being developed to investigate the behaviour of brittle materials at high loading rates. This article is part of the themed issue 'Experimental testing and modelling of brittle materials at high strain rates'.

  6. Brittle materials at high-loading rates: an open area of research.

    PubMed

    Forquin, Pascal

    2017-01-28

    Brittle materials are extensively used in many civil and military applications involving high-strain-rate loadings such as: blasting or percussive drilling of rocks, ballistic impact against ceramic armour or transparent windshields, plastic explosives used to damage or destroy concrete structures, soft or hard impacts against concrete structures and so on. With all of these applications, brittle materials are subjected to intense loadings characterized by medium to extremely high strain rates (few tens to several tens of thousands per second) leading to extreme and/or specific damage modes such as multiple fragmentation, dynamic cracking, pore collapse, shearing, mode II fracturing and/or microplasticity mechanisms in the material. Additionally, brittle materials exhibit complex features such as a strong strain-rate sensitivity and confining pressure sensitivity that justify expending greater research efforts to understand these complex features. Currently, the most popular dynamic testing techniques used for this are based on the use of split Hopkinson pressure bar methodologies and/or plate-impact testing methods. However, these methods do have some critical limitations and drawbacks when used to investigate the behaviour of brittle materials at high loading rates. The present theme issue of Philosophical Transactions A provides an overview of the latest experimental methods and numerical tools that are currently being developed to investigate the behaviour of brittle materials at high loading rates. This article is part of the themed issue 'Experimental testing and modelling of brittle materials at high strain rates'. © 2016 The Author(s).

  7. Synthesis and Process Optimization of Electrospun PEEK-Sulfonated Nanofibers by Response Surface Methodology

    PubMed Central

    Boaretti, Carlo; Roso, Martina; Lorenzetti, Alessandra; Modesti, Michele

    2015-01-01

    In this study electrospun nanofibers of partially sulfonated polyether ether ketone have been produced as a preliminary step for a possible development of composite proton exchange membranes for fuel cells. Response surface methodology has been employed for the modelling and optimization of the electrospinning process, using a Box-Behnken design. The investigation, based on a second order polynomial model, has been focused on the analysis of the effect of both process (voltage, tip-to-collector distance, flow rate) and material (sulfonation degree) variables on the mean fiber diameter. The final model has been verified by a series of statistical tests on the residuals and validated by a comparison procedure of samples at different sulfonation degrees, realized according to optimized conditions, for the production of homogeneous thin nanofibers. PMID:28793427

  8. ICS-II USA research design and methodology.

    PubMed

    Rana, H; Andersen, R M; Nakazono, T T; Davidson, P L

    1997-05-01

    The purpose of the WHO-sponsored International Collaborative Study of Oral Health Outcomes (ICS-II) was to provide policy-makers and researchers with detailed, reliable, and valid data on the oral health situation in their countries or regions, together with comparative data from other dental care delivery systems. ICS-II used a cross-sectional design with no explicit control groups or experimental interventions. A standardized methodology was developed and tested for collecting and analyzing epidemiological, sociocultural, economic, and delivery system data. Respondent information was obtained by household interviews, and clinical examinations were conducted by calibrated oral epidemiologists. Discussed are the sampling design characteristics for the USA research locations, response rates, sample sizes for interview and oral examination data, weighting procedures, and statistical methods. SUDAAN was used to adjust variance calculations, since complex sampling designs were used.

  9. Synthesis and Process Optimization of Electrospun PEEK-Sulfonated Nanofibers by Response Surface Methodology.

    PubMed

    Boaretti, Carlo; Roso, Martina; Lorenzetti, Alessandra; Modesti, Michele

    2015-07-07

    In this study electrospun nanofibers of partially sulfonated polyether ether ketone have been produced as a preliminary step for a possible development of composite proton exchange membranes for fuel cells. Response surface methodology has been employed for the modelling and optimization of the electrospinning process, using a Box-Behnken design. The investigation, based on a second order polynomial model, has been focused on the analysis of the effect of both process (voltage, tip-to-collector distance, flow rate) and material (sulfonation degree) variables on the mean fiber diameter. The final model has been verified by a series of statistical tests on the residuals and validated by a comparison procedure of samples at different sulfonation degrees, realized according to optimized conditions, for the production of homogeneous thin nanofibers.

  10. Working on a Standard Joint Unit: A pilot test.

    PubMed

    Casajuana, Cristina; López-Pelayo, Hugo; Mercedes Balcells, María; Miquel, Laia; Teixidó, Lídia; Colom, Joan; Gual, Antoni

    2017-09-29

    Assessing cannabis consumption remains complex due to the lack of reliable registration systems. We tested the feasibility of establishing a Standard Joint Unit (SJU) which considers the main cannabinoids with implications for health, through a naturalistic approach.  Methodology. Pilot study with current cannabis users of four areas of Barcelona: universities, nightclubs, an out-patient mental health service, and cannabis associations. We designed and administered a questionnaire on cannabis use patterns and determined the willingness to donate a joint for analysis. Descriptive statistics were used to analyze the data. Forty volunteers answered the questionnaire (response rate 95%); most of them were men (72.5%) and young adults (median age 24.5 years; IQR 8.75 years) who consume daily or nearly daily (70%). Most participants consume marihuana (85%) and roll their joints with a median of 0.25 g of marihuana. Two out of three (67.5%) stated they were willing to donate a joint. Obtaining an SJU with the planned methodology has proved to be feasible. Pre-testing resulted in improvements to the questionnaire and to the remuneration offered to incentivize donations. Establishing an SJU is essential to improve our knowledge of cannabis-related outcomes.

  11. ACL reconstruction in patients aged 40 years and older: a systematic review and introduction of a new methodology score for ACL studies.

    PubMed

    Brown, Christopher A; McAdams, Timothy R; Harris, Alex H S; Maffulli, Nicola; Safran, Marc R

    2013-09-01

    Treatment of the anterior cruciate ligament (ACL)-deficient knee in older patients remains a core debate. To perform a systematic review of studies that assessed outcomes in patients aged 40 years and older treated with ACL reconstruction and to provide a new methodological scoring system that is directed at critical assessment of studies evaluating ACL surgical outcomes: the ACL Methodology Score (AMS). Systematic review. A comprehensive literature search was performed from 1995 to 2012 using MEDLINE, EMBASE, and Scopus. Inclusion criteria for studies were primary ACL injury, patient age of 40 years and older, and mean follow-up of at least 21 months after reconstruction. Nineteen studies met the inclusion criteria from the 371 abstracts from MEDLINE and 880 abstracts from Scopus. Clinical outcomes (International Knee Documentation Committee [IKDC], Lysholm, and Tegner activity scores), joint stability measures (Lachman test, pivot-shift test, and instrumented knee arthrometer assessment), graft type, complications, and reported chondral or meniscal injury were evaluated in this review. A new methodology scoring system was developed to be specific at critically analyzing ACL outcome studies and used to examine each study design. Nineteen studies describing 627 patients (632 knees; mean age, 49.0 years; range, 42.6-60.0 years) were included in the review. The mean time to surgery was 32.0 months (range, 2.9-88.0 months), with a mean follow-up of 40.2 months (range, 21.0-114.0 months). The IKDC, Lysholm, and Tegner scores and knee laxity assessment indicated favorable results in the studies that reported these outcomes. Patients did not demonstrate a significant difference between graft types and functional outcome scores or stability assessment. The mean AMS was 43.9 ± 7.2 (range, 33.5-57.5). 
The level of evidence rating did not positively correlate with the AMS, which suggests that the new AMS system may be able to detect errors in methodology or reporting that may not be taken into account by the classic level of evidence rating. Patients aged 40 years and older with an ACL injury can have satisfactory outcomes after reconstruction. However, the quality of currently available data is still limited, such that further well-designed studies are needed to determine long-term efficacy and to better inform our patients with regard to expected outcomes.

  12. Methods of analysis of speech rate: a pilot study.

    PubMed

    Costa, Luanna Maria Oliveira; Martins-Reis, Vanessa de Oliveira; Celeste, Letícia Côrrea

    2016-01-01

    To describe the performance of fluent adults in different measures of speech rate. The study included 24 fluent adults, of both genders, speakers of Brazilian Portuguese, who were born and still living in the metropolitan region of Belo Horizonte, state of Minas Gerais, aged between 18 and 59 years. Participants were grouped by age: G1 (18-29 years), G2 (30-39 years), G3 (40-49 years), and G4 (50-59 years). The speech samples were obtained following the methodology of the Speech Fluency Assessment Protocol. In addition to the measures of speech rate proposed by the protocol (speech rate in words and syllables per minute), the speech rate in phonemes per second and the articulation rate with and without disfluencies were calculated. We used the nonparametric Friedman test and the Wilcoxon test for multiple comparisons. Groups were compared using the nonparametric Kruskal-Wallis test. The significance level was 5%. There were significant differences between measures of speech rate involving syllables. The multiple comparisons showed that all three measures were different. There was no effect of age for the studied measures. These findings corroborate previous studies. The inclusion of temporal acoustic measures such as speech rate in phonemes per second and articulation rates with and without disfluencies can be a complementary approach in the evaluation of speech rate.
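    The measures named above reduce to simple ratios of unit counts to duration. The sketch below uses invented counts and durations for a hypothetical sample, not protocol data.

```python
# Speech-rate measures: units per minute, phonemes per second, and an
# articulation rate that discounts time spent in disfluencies. All numbers
# below are hypothetical.
def rate_per_minute(units: int, seconds: float) -> float:
    return units * 60.0 / seconds

def articulation_rate(syllables: int, seconds: float,
                      disfluency_seconds: float = 0.0) -> float:
    """Syllables per second, optionally excluding disfluent time."""
    return syllables / (seconds - disfluency_seconds)

words, syllables, phonemes = 150, 290, 620
dur = 65.0            # total sample duration (s)
disfluent = 4.0       # time spent in disfluencies (s)

print(round(rate_per_minute(words, dur), 1))       # words per minute
print(round(rate_per_minute(syllables, dur), 1))   # syllables per minute
print(round(phonemes / dur, 2))                    # phonemes per second
print(round(articulation_rate(syllables, dur, disfluent), 2))
```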

  13. A methodology for testing fault-tolerant software

    NASA Technical Reports Server (NTRS)

    Andrews, D. M.; Mahmood, A.; Mccluskey, E. J.

    1985-01-01

    A methodology for testing fault-tolerant software is presented. There are problems associated with testing fault-tolerant software because many errors are masked or corrected by voters, limiters, or automatic channel synchronization. This methodology illustrates how the same strategies used for testing fault-tolerant hardware can be applied to testing fault-tolerant software. For example, one strategy used in testing fault-tolerant hardware is to disable the redundancy during testing. A similar testing strategy is proposed for software, namely, to move the major emphasis of testing earlier in the development cycle (before the redundancy is in place), thus reducing the possibility that undetected errors will be masked when limiters and voters are added.

  14. Standardized Laboratory Test Requirements for Hardening Equipment to Withstand Wave Impact Shock in Small High Speed Craft

    DTIC Science & Technology

    2017-02-06

    The engineering rationale, assumptions, and methodology for transitioning craft acceleration data to laboratory shock test requirements are summarized, and example requirements are presented for small high-speed craft structure, equipment, shock isolation seats, and human performance at sea.

  15. Methodology for designing accelerated aging tests for predicting life of photovoltaic arrays

    NASA Technical Reports Server (NTRS)

    Gaines, G. B.; Thomas, R. E.; Derringer, G. C.; Kistler, C. W.; Bigg, D. M.; Carmichael, D. C.

    1977-01-01

    A methodology for designing aging tests in which life prediction was paramount was developed. The methodology builds upon experience with regard to aging behavior in those material classes which are expected to be utilized as encapsulant elements, viz., glasses and polymers, and upon experience with the design of aging tests. The experiences were reviewed, and results are discussed in detail.

  16. Assessing the Medication Adherence App Marketplace From the Health Professional and Consumer Vantage Points

    PubMed Central

    2017-01-01

    Background Nonadherence produces considerable health consequences and economic burden to patients and payers. One approach to improve medication nonadherence that has gained interest in recent years is the use of smartphone adherence apps. The development of smartphone adherence apps has increased rapidly since 2012; however, literature evaluating the clinical use and effectiveness of smartphone adherence apps to improve medication adherence is generally lacking. Objective The aims of this study were to (1) provide an updated evaluation and comparison of medication adherence apps in the marketplace by assessing the features, functionality, and health literacy (HL) of the highest-ranking adherence apps and (2) indirectly measure the validity of our rating methodology by determining the relationship between our app evaluations and Web-based consumer ratings. Methods Two independent reviewers assessed the features and functionality using a 4-domain rating tool of all adherence apps identified based on developer claims. The same reviewers downloaded and tested the 100 highest-ranking apps including an additional domain for assessment of HL. Pearson product correlations were estimated between the consumer ratings and our domain and total scores. Results A total of 824 adherence apps were identified; of these, 645 unique apps were evaluated after applying exclusion criteria. The median initial score based on descriptions was 14 (max of 68; range 0-60). As a result, 100 of the highest-scoring unique apps underwent user testing. The median overall user-tested score was 31.5 (max of 73; range 0-60). The majority of the adherence apps that underwent user testing had a consumer rating score reported in their respective online marketplace. The mean consumer rating was 3.93 (SD 0.84). The total user-tested score was positively correlated with consumer ratings (r=.1969, P=.04). 
Conclusions More adherence apps are available in the Web-based marketplace, and the quality of these apps varies considerably. Consumer ratings are positively but weakly correlated with user-testing scores suggesting that our rating tool has some validity but that consumers and clinicians may assess adherence app quality differently. PMID:28428169
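    The indirect validity check described above is a Pearson product-moment correlation between user-tested scores and consumer star ratings. The sketch below computes it from first principles; the eight score/rating pairs are invented placeholders, not the study's data (which reported r=.1969 over 100 apps).

```python
import math

# Pearson correlation between app quality scores and consumer star ratings.
# Values are hypothetical placeholders for illustration.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

scores = [20, 35, 31, 45, 28, 50, 33, 41]          # user-tested scores
stars = [3.1, 4.0, 3.6, 4.4, 3.4, 4.6, 3.9, 4.2]   # marketplace ratings
print(round(pearson_r(scores, stars), 2))
```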

  17. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    PubMed

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse-grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm on selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for detection of defects in a 50 mm thick coarse-grained austenitic stainless steel specimen. Signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in the 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB extra enhancement in SNR is achieved as compared to the sum of selected IMF approach. The application of the minimisation algorithm with the EEMD-processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal-to-noise ratio. This methodology was further employed for successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.
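    The dB figures quoted above follow from the standard SNR definition. The check below is generic, not the paper's EEMD pipeline; the echo and noise signals are synthetic surrogates.

```python
import numpy as np

# SNR in dB from mean signal power over mean noise power, used to verify an
# improvement figure. Signals are synthetic placeholders.
def snr_db(signal: np.ndarray, noise: np.ndarray) -> float:
    return 10.0 * np.log10(np.mean(signal**2) / np.mean(noise**2))

rng = np.random.default_rng(1)
echo = np.sin(2 * np.pi * np.linspace(0, 4, 400))    # defect-echo surrogate
noise_raw = rng.normal(0, 0.8, 400)                  # grain-scatter noise
noise_den = rng.normal(0, 0.1, 400)                  # residual after denoising

improvement = snr_db(echo, noise_den) - snr_db(echo, noise_raw)
print(improvement > 15)   # better than 15 dB, as reported for the FBH
```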

  18. Radiation Susceptibility Assessment of Off the Shelf (OTS) Hardware

    NASA Technical Reports Server (NTRS)

    Culpepper, William X.; Nicholson, Leonard L. (Technical Monitor)

    2000-01-01

    The reduction in budgets, shortening of schedules, and the necessity of flying near-state-of-the-art technology have forced projects and designers to utilize not only modern, non-space-rated EEE parts but also OTS boards, subassemblies, and systems. New instrumentation, communications, portable computers, and navigation systems for the International Space Station, Space Shuttle, and Crew Return Vehicle are examples of the realization of this paradigm change at the Johnson Space Center. Because of this change, there has been a shift in the radiation assessment methodology from individual part testing using low-energy heavy ions to board- and box-level testing using high-energy particle beams. Highlights of several years of board- and system-level testing are presented along with lessons learned, present areas of concern, insights into test costs, and future challenges.

  19. COTS Ceramic Chip Capacitors: An Evaluation of the Parts and Assurance Methodologies

    NASA Technical Reports Server (NTRS)

    Brusse, Jay A.; Sampson, Michael J.

    2004-01-01

    Commercial-Off-The-Shelf (COTS) multilayer ceramic chip capacitors (MLCCs) are continually evolving to reduce physical size and increase volumetric efficiency. Designers of high reliability aerospace and military systems are attracted to these attributes of COTS MLCCs and would like to take advantage of them while maintaining the high standards for long-term reliable operation they are accustomed to when selecting military qualified established reliability (MIL-ER) MLCCs. However, MIL-ER MLCCs are not available in the full range of small chip sizes with high capacitance as found in today's COTS MLCCs. The objectives for this evaluation were to assess the long-term performance of small case size COTS MLCCs and to identify effective, lower-cost product assurance methodologies. Fifteen (15) lots of COTS X7R dielectric MLCCs from four (4) different manufacturers and two (2) MIL-ER BX dielectric MLCCs from two (2) of the same manufacturers were evaluated. Both 0805 and 0402 chip sizes were included. Several voltage ratings were tested, ranging from a high of 50 volts to a low of 6.3 volts. The evaluation consisted of a comprehensive screening and qualification test program based upon MIL-PRF-55681 (i.e., voltage conditioning, thermal shock, moisture resistance, 2000-hour life test, etc.). In addition, several lot characterization tests were performed including Destructive Physical Analysis (DPA), Highly Accelerated Life Test (HALT) and Dielectric Voltage Breakdown Strength. The data analysis included a comparison of the 2000-hour life test results (used as a metric for long-term performance) relative to the screening and characterization test results. Results of this analysis indicate that the long-term life performance of COTS MLCCs is variable -- some lots perform well, some lots perform poorly. 
DPA and HALT were found to be promising lot characterization tests to identify substandard COTS MLCC lots prior to conducting more expensive screening and qualification tests. The results indicate that lot-specific screening and qualification are still recommended for high reliability applications. One significant and concerning observation is that MIL-type voltage conditioning (100 hours at twice rated voltage, 125 C) was not an effective screen for removing infant mortality parts in the particular lots of COTS MLCCs evaluated.

  20. Isotope-labelled urea to test colon drug delivery devices in vivo: principles, calculations and interpretations.

    PubMed

    Maurer, Marina J M; Schellekens, Reinout C A; Wutzke, Klaus D; Stellaard, Frans

    2013-01-01

    This paper describes various methodological aspects that were encountered during the development of a system to monitor the in vivo behaviour of a newly developed colon delivery device that enables oral drug treatment of inflammatory bowel diseases. [(13)C]urea was chosen as the marker substance. Release of [(13)C]urea in the ileocolonic region is proven by the exhalation of (13)CO2 in breath due to bacterial fermentation of [(13)C]urea. The (13)CO2 exhalation kinetics allows the calculation of a lag time as a marker for delay of release, a pulse time as a marker for the speed of drug release, and the fraction of the dose that is fermented. To determine the total bioavailability, the fraction of the dose absorbed from the intestine must also be quantified. Initially, this was done by calculating the time-dependent [(13)C]urea appearance in the body urea pool via measurement of (13)C abundance and concentration of plasma urea. Thereafter, a new methodology was successfully developed to obtain the bioavailability data by measurement of the urinary excretion rate of [(13)C]urea. These techniques required two experimental days: one to test the coated device, another to test the uncoated device to obtain reference values for the situation in which 100% of [(13)C]urea is absorbed. This approach is hampered by large day-to-day variations in urea metabolism. Finally, a completely non-invasive, one-day test was worked out based on a dual isotope approach applying simultaneous administration of [(13)C]urea in a coated device and [(15)N2]urea in an uncoated device. All aspects of isotope-related analytical methodologies and the required calculation and correction systems are described.

  1. Assessment of Methodological Quality of Economic Evaluations in Belgian Drug Reimbursement Applications

    PubMed Central

    Simoens, Steven

    2013-01-01

    Objectives This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474

  2. Assessment of methodological quality of economic evaluations in belgian drug reimbursement applications.

    PubMed

    Simoens, Steven

    2013-01-01

    This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation.

  3. Syndromes of collateral-reported psychopathology for ages 18-59 in 18 Societies

    PubMed Central

    Ivanova, Masha Y.; Achenbach, Thomas M.; Rescorla, Leslie A.; Turner, Lori V.; Árnadóttir, Hervör Alma; Au, Alma; Caldas, J. Carlos; Chaalal, Nebia; Chen, Yi Chuen; da Rocha, Marina M.; Decoster, Jeroen; Fontaine, Johnny R.J.; Funabiki, Yasuko; Guðmundsson, Halldór S.; Kim, Young Ah; Leung, Patrick; Liu, Jianghong; Malykh, Sergey; Marković, Jasminka; Oh, Kyung Ja; Petot, Jean-Michel; Samaniego, Virginia C.; Silvares, Edwiges Ferreira de Mattos; Šimulionienė, Roma; Šobot, Valentina; Sokoli, Elvisa; Sun, Guiju; Talcott, Joel B.; Vázquez, Natalia; Zasępa, Ewa

    2017-01-01

    The purpose was to advance research and clinical methodology for assessing psychopathology by testing the international generalizability of an 8-syndrome model derived from collateral ratings of adult behavioral, emotional, social, and thought problems. Collateral informants rated 8,582 18–59-year-old residents of 18 societies on the Adult Behavior Checklist (ABCL). Confirmatory factor analyses tested the fit of the 8-syndrome model to ratings from each society. The primary model fit index (Root Mean Square Error of Approximation) showed good model fit for all societies, while secondary indices (Tucker Lewis Index, Comparative Fit Index) showed acceptable to good fit for 17 societies. Factor loadings were robust across societies and items. Of the 5,007 estimated parameters, 4 (0.08%) were outside the admissible parameter space, but 95% confidence intervals included the admissible space, indicating that the 4 deviant parameters could be due to sampling fluctuations. The findings are consistent with previous evidence for the generalizability of the 8-syndrome model in self-ratings from 29 societies, and support the 8-syndrome model for operationalizing phenotypes of adult psychopathology from multi-informant ratings in diverse societies. PMID:29399019
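
    The primary fit index named above can be computed with the standard population-discrepancy formula. The sketch below uses invented chi-square, degrees-of-freedom and sample-size values, not figures from the study.

```python
import math

# Hedged sketch: a common formula for the Root Mean Square Error of
# Approximation, computed from a model chi-square statistic.

def rmsea(chi2, df, n):
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

fit = rmsea(chi2=350.0, df=300, n=500)
print(round(fit, 4))  # values <= ~0.06 are conventionally read as good fit
```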

  4. On the Determination of Magnesium Degradation Rates under Physiological Conditions.

    PubMed

    Nidadavolu, Eshwara Phani Shubhakar; Feyerabend, Frank; Ebel, Thomas; Willumeit-Römer, Regine; Dahms, Michael

    2016-07-28

    The current physiological in vitro tests of Mg degradation follow the procedure stated according to the ASTM standard. This standard, although useful in predicting the initial degradation behavior of an alloy, has its limitations in interpreting the same for longer periods of immersion in cell culture media. This is an important consequence as the alloy's degradation is time dependent. Even if two different alloys show similar corrosion rates in a short term experiment, their degradation characteristics might differ with increased immersion times. Furthermore, studies concerning Mg corrosion extrapolate the corrosion rate from a single time point measurement to the order of a year (mm/y), which might not be appropriate because of time dependent degradation behavior. In this work, the above issues are addressed and a new methodology of performing long-term immersion tests in determining the degradation rates of Mg alloys was put forth. For this purpose, cast and extruded Mg-2Ag and powder pressed and sintered Mg-0.3Ca alloy systems were chosen. DMEM Glutamax +10% FBS (Fetal Bovine Serum) +1% Penicillin streptomycin was used as cell culture medium. The advantages of such a method in predicting the degradation rates in vivo deduced from in vitro experiments are discussed.
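
    The time dependence the authors stress can be made concrete with the standard mass-loss conversion used in ASTM G31-style calculations. The function below is an illustrative sketch, not the paper's own procedure, and the specimen numbers are invented.

```python
def degradation_rate_mm_per_year(mass_loss_g, area_cm2, hours, density_g_cm3):
    # 8.76e4 converts g / (cm^2 * h) / (g/cm^3) = cm/h into mm/year
    return 8.76e4 * mass_loss_g / (area_cm2 * hours * density_g_cm3)

MG_DENSITY = 1.74  # g/cm^3, approximate density of pure Mg

# The same specimen measured at two immersion times: extrapolating the
# short-term point to a full year overstates the long-term rate.
print(degradation_rate_mm_per_year(0.010, 2.0, 72, MG_DENSITY))   # ~3.5 mm/y
print(degradation_rate_mm_per_year(0.020, 2.0, 720, MG_DENSITY))  # ~0.7 mm/y
```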

  5. Effectiveness of interventions to reduce ordering of thyroid function tests: a systematic review.

    PubMed

    Zhelev, Zhivko; Abbott, Rebecca; Rogers, Morwenna; Fleming, Simon; Patterson, Anthea; Hamilton, William Trevor; Heaton, Janet; Thompson Coon, Jo; Vaidya, Bijay; Hyde, Christopher

    2016-06-03

    To evaluate the effectiveness of behaviour changing interventions targeting ordering of thyroid function tests. Systematic review. MEDLINE, EMBASE and the Cochrane Database up to May 2015. We included studies evaluating the effectiveness of behaviour change interventions aiming to reduce ordering of thyroid function tests. Randomised controlled trials (RCTs), non-randomised controlled studies and before and after studies were included. There were no language restrictions. 2 reviewers independently screened all records identified by the electronic searches and reviewed the full text of any deemed potentially relevant. Study details were extracted from the included papers and their methodological quality assessed independently using a validated tool. Disagreements were resolved through discussion and arbitration by a third reviewer. Meta-analysis was not used. 27 studies (28 papers) were included. They evaluated a range of interventions including guidelines/protocols, changes to funding policy, education, decision aids, reminders and audit/feedback; often intervention types were combined. The most common outcome measured was the rate of test ordering, but the effect on appropriateness, test ordering patterns and cost were also measured. 4 studies were RCTs. The majority of the studies were of poor or moderate methodological quality. The interventions were variable and poorly reported. Only 4 studies reported unsuccessful interventions but there was no clear pattern to link effect and intervention type or other characteristics. The results suggest that behaviour change interventions are effective particularly in reducing the volume of thyroid function tests. However, due to the poor methodological quality and reporting of the studies, the likely presence of publication bias and the questionable relevance of some interventions to current day practice, we are unable to draw strong conclusions or recommend the implementation of specific intervention types. 
Further research is thus justified. Trial registration number: CRD42014006192.

  6. Effectiveness of interventions to reduce ordering of thyroid function tests: a systematic review

    PubMed Central

    Abbott, Rebecca; Rogers, Morwenna; Fleming, Simon; Patterson, Anthea; Hamilton, William Trevor; Heaton, Janet; Vaidya, Bijay; Hyde, Christopher

    2016-01-01

    Objectives To evaluate the effectiveness of behaviour changing interventions targeting ordering of thyroid function tests. Design Systematic review. Data sources MEDLINE, EMBASE and the Cochrane Database up to May 2015. Eligibility criteria for selecting studies We included studies evaluating the effectiveness of behaviour change interventions aiming to reduce ordering of thyroid function tests. Randomised controlled trials (RCTs), non-randomised controlled studies and before and after studies were included. There were no language restrictions. Study appraisal and synthesis methods 2 reviewers independently screened all records identified by the electronic searches and reviewed the full text of any deemed potentially relevant. Study details were extracted from the included papers and their methodological quality assessed independently using a validated tool. Disagreements were resolved through discussion and arbitration by a third reviewer. Meta-analysis was not used. Results 27 studies (28 papers) were included. They evaluated a range of interventions including guidelines/protocols, changes to funding policy, education, decision aids, reminders and audit/feedback; often intervention types were combined. The most common outcome measured was the rate of test ordering, but the effect on appropriateness, test ordering patterns and cost were also measured. 4 studies were RCTs. The majority of the studies were of poor or moderate methodological quality. The interventions were variable and poorly reported. Only 4 studies reported unsuccessful interventions but there was no clear pattern to link effect and intervention type or other characteristics. Conclusions The results suggest that behaviour change interventions are effective particularly in reducing the volume of thyroid function tests. 
However, due to the poor methodological quality and reporting of the studies, the likely presence of publication bias and the questionable relevance of some interventions to current day practice, we are unable to draw strong conclusions or recommend the implementation of specific intervention types. Further research is thus justified. Trial registration number CRD42014006192. PMID:27259523

  7. The Air Force Manufacturing Technology (MANTECH): Technology transfer methodology as exemplified by the radar transmit/receive module program

    NASA Technical Reports Server (NTRS)

    Houpt, Tracy; Ridgely, Margaret

    1991-01-01

    The Air Force Manufacturing Technology program is involved with the improvement of radar transmit/receive modules for use in active phased array radars for advanced fighter aircraft. Improvements in all areas of manufacture and test of these modules resulting in order of magnitude improvements in the cost of and the rate of production are addressed, as well as the ongoing transfer of this technology to the Navy.

  8. Application Of The Iberdrola Licensing Methodology To The Cofrentes BWR-6 110% Extended Power Up-rate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier

    Iberdrola (Spanish utility) and Iberdrola Ingenieria (its engineering branch) have spent the last two years developing the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. IBERDROLA has an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses for the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS), and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case for the TLFW transient. This paper shows the benefits of having an in-house design and licensing methodology, and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of Total Loss of Feedwater with the Cofrentes Retran Model is included as an example of this process. (authors)

  9. Validating a Finite Element Model of a Structure Subjected to Mine Blast with Experimental Modal Analysis

    DTIC Science & Technology

    2017-11-01

    The Under-body Blast Methodology (UBM) for the Test and Evaluation (T&E) program was established to provide a capability for the US Army Test and Evaluation Command to assess the vulnerability of vehicles to under-body blast. Finite element (FE) models are part of the current UBM for T&E methodology.

  10. Wavelet-based information filtering for fault diagnosis of electric drive systems in electric ships.

    PubMed

    Silva, Andre A; Gupta, Shalabh; Bazzi, Ali M; Ulatowski, Arthur

    2017-09-22

    Electric machines and drives have enjoyed extensive applications in the field of electric vehicles (e.g., electric ships, boats, cars, and underwater vessels) due to their ease of scalability and wide range of operating conditions. This stems from their ability to generate the desired torque and power levels for propulsion under various external load conditions. However, as with most electrical systems, electric drives are prone to component failures that can degrade their performance, reduce efficiency, and require expensive maintenance. Therefore, for safe and reliable operation of electric vehicles, there is a need for automated early diagnostics of critical failures such as broken rotor bars and electrical phase failures. In this regard, this paper presents a fault diagnosis methodology for electric drives in electric ships. This methodology utilizes the two-dimensional (i.e., scale-shift) wavelet transform of the sensor data to filter optimal information-rich regions, which can enhance diagnosis accuracy as well as reduce the computational complexity of the classifier. The methodology was tested on sensor data generated from an experimentally validated simulation model of electric drives under various cruising speed conditions. The results, in comparison with other existing techniques, show a high correct classification rate with low false alarm and miss detection rates.
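
    The scale-shift map the abstract refers to is the squared magnitude of a continuous wavelet transform over (scale, shift). The hand-rolled Morlet sketch below illustrates that map on a synthetic sensor trace; it is not the paper's filtering method, and the signal and scales are invented.

```python
import numpy as np

def morlet(t, w0=6.0):
    """Complex Morlet wavelet (admissibility correction omitted)."""
    return np.pi ** -0.25 * np.exp(1j * w0 * t - t ** 2 / 2)

def scale_shift_map(signal, scales, dt=1.0):
    """|CWT|^2 over (scale, shift): the 2-D map from which information-rich
    regions would be selected."""
    n = len(signal)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        t = np.arange(-n // 2, n // 2) * dt / s
        w = np.conj(morlet(t))[::-1] / np.sqrt(s)
        out[i] = np.abs(np.convolve(signal, w, mode="same")) ** 2
    return out

# Synthetic "current sensor" trace: one dominant oscillation of period 20.
x = np.sin(2 * np.pi * np.arange(256) / 20.0)
scales = np.array([5.0, 10.0, 19.0, 40.0])
energy = scale_shift_map(x, scales).sum(axis=1)
print(int(np.argmax(energy)))  # scale ~ w0 * T / (2*pi) ~ 19 dominates
```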

  11. Development and application of an analysis methodology for interpreting ambiguous historical pressure data in the WIPP gas-generation experiments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felicione, F. S.

    2006-01-23

    The potential for generation of gases in transuranic (TRU) waste by microbial activity, chemical interactions, corrosion, and radiolysis was addressed in the Argonne National Laboratory-West (ANL-West) Gas-Generation Experiments (GGE). Data was collected over several years by simulating the conditions in the Waste Isolation Pilot Plant (WIPP) after the eventual intrusion of brine into the repository. Fourteen test containers with various actual TRU waste immersed in representative brine were inoculated with WIPP-relevant microbes, pressurized with inert gases, and kept in an inert-atmosphere environment for several years to provide estimates of the gas-generation rates that will be used in computer models for future WIPP Performance Assessments. Modest temperature variations occurred during the long-term ANL-West experiments. Although the experiment temperatures always remained well within the experiment specifications, the small temperature variation was observed to affect the test container pressure far more than had been anticipated. In fact, the pressure variations were so large, and seemingly erratic, that it was impossible to discern whether the data was even valid and whether the long-term pressure trend was increasing, decreasing, or constant. The result was that no useful estimates of gas-generation rates could be deduced from the pressure data. Several initial attempts were made to quantify the pressure fluctuations by relating these to the measured temperature variation, but none was successful. The work reported here carefully analyzed the pressure measurements to determine if these were valid or erroneous data. It was found that a thorough consideration of the physical phenomena that were occurring can, in conjunction with suitable gas laws, account quite accurately for the pressure changes that were observed.
Failure of the earlier attempts to validate the data was traced to the omission of several phenomena, the most important being the variation in the headspace volume caused by thermal expansion and contraction within the brine and waste. A further effort was directed at recovering useful results from the voluminous archived pressure data. An analytic methodology to do this was developed. This methodology was applied to each archived pressure measurement to nullify temperature and other effects to yield an adjusted pressure, from which gas-generation rates could be calculated. A review of the adjusted-pressure data indicated that generated-gas concentrations among these containers after approximately 3.25 years of test operation ranged from zero to over 17,000 ppm by volume. Four test containers experienced significant gas generation. All test containers that showed evidence of significant gas generation contained carbon-steel in the waste, indicating that corrosion was the predominant source of gas generation.
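
    The two effects identified above (ideal-gas temperature dependence plus the headspace-volume change from brine thermal expansion) can be sketched as a pressure adjustment. This is an illustrative toy, not the report's actual methodology; the expansion coefficient and all numbers are assumed.

```python
def adjusted_pressure(p_meas, t_meas_k, t_ref_k, headspace_l, brine_l,
                      beta_per_k=2.1e-4):
    """Refer a measured headspace pressure to a reference temperature.
    beta_per_k: assumed volumetric expansion coefficient of the brine."""
    # brine expansion shrinks the gas headspace as temperature rises
    v_meas = headspace_l - beta_per_k * (t_meas_k - t_ref_k) * brine_l
    # ideal gas: p * V / T is conserved for a fixed amount of gas
    return p_meas * (v_meas / headspace_l) * (t_ref_k / t_meas_k)

# A 1 K warming inflates the raw reading; adjustment recovers the baseline.
print(adjusted_pressure(101.70, 300.0, 299.0, 0.25, 2.0))  # ~101.2 kPa
```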

  12. Circle of life: rationale, design, and baseline results of an HIV prevention intervention among young American Indian adolescents of the Northern Plains.

    PubMed

    Kaufman, Carol E; Mitchell, Christina M; Beals, Janette; Desserich, Jennifer A; Wheeler, Cindy; Keane, Ellen M; Whitesell, Nancy Rumbaugh; Sam, Angela; Sedey, Cory

    2010-03-01

    In spite of significant disparities in sexual health outcomes for American Indian youth, no studies exist examining the effectiveness of HIV-prevention interventions. Circle of Life is an HIV-prevention intervention specifically developed for American Indian middle-school youth. We describe the rationale, methodology, and baseline results of a longitudinal randomized trial of Circle of Life conducted among American Indian youth aged 11-15 in a reservation community. The innovative design includes two pre-intervention waves to determine patterns of behavior prior to the intervention that might be associated with a differential impact of the intervention on sexual risk. We used one-way analysis of variance and chi-square tests to test for significant differences between randomized group assignment at each baseline wave and generalized estimating equations (GEE) to test significant differences in the rate of change in outcomes by group longitudinally. We present the collaborative and adaptive strategies for consenting, assenting, and data collection methodology in this community. Achieved response rates are comparable to other similar studies. Results from the two baseline waves indicate that few outcomes significantly varied by randomized intervention assignment. Ten percent of youth reported having had sex at Wave 1, rising to 15% at Wave 2. Among those who had had sex, the majority (>70%) reported using a condom at last sex. The project is well positioned to carry out the longitudinal assessments of the intervention to determine the overall impact of the Circle of Life and the differential impact by pre-intervention patterns of behavior across youth.
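
    The baseline balance check described above (chi-square tests for differences between randomized groups) can be sketched for a 2x2 table. The implementation is hand-rolled so it is self-contained, and the counts are invented for illustration, not trial data.

```python
import math

# Hedged sketch: 2x2 chi-square test (df = 1, no continuity correction) of
# "ever had sex" against randomized group assignment at one baseline wave.

def chi2_2x2(a, b, c, d):
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # survival function of chi2, df = 1
    return chi2, p

# intervention: 12 yes / 108 no;  comparison: 15 yes / 105 no
chi2, p = chi2_2x2(12, 108, 15, 105)
print(f"chi2={chi2:.2f}, p={p:.2f}")  # p > 0.05: groups balanced at baseline
```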

  13. Time-discounting and tobacco smoking: a systematic review and network analysis

    PubMed Central

    Barlow, Pepita; McKee, Martin; Reeves, Aaron; Galea, Gauden; Stuckler, David

    2017-01-01

    Background: Tobacco smoking harms health, so why do people smoke and fail to quit? An explanation originating in behavioural economics suggests a role for time-discounting, which describes how the value of a reward, such as better health, decreases with delay to its receipt. A large number of studies test the relationship of time-discounting with tobacco outcomes but the temporal pattern of this relationship and its variation according to measurement methods remain unclear. We review the association between time-discounting and smoking across (i) the life course, from initiation to cessation, and (ii) diverse discount measures. Methods: We identified 69 relevant studies in Web of Science and PubMed. We synthesized findings across methodologies and evaluated discount measures, study quality and cross-disciplinary fertilization. Results: In 44 out of 54 studies, smokers more greatly discounted the future than non-smokers and, in longitudinal studies, higher discounting predicted future smoking. Smokers with lower time-discount rates achieved higher quit rates. Findings were consistent across studies measuring discount rates using hypothetical monetary or cigarette reward scenarios. The methodological quality of the majority of studies was rated as ‘moderate’ and co-citation analysis revealed an isolation of economics journals and a dearth of studies in public health. Conclusion: There is moderate yet consistent evidence that high time-discounting is a risk factor for smoking and unsuccessful cessation. Policy scenarios assuming a flat rate of population discounting may inadequately capture smokers’ perceptions of costs and benefits. PMID:27818375
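
    The time-discounting construct reviewed above is often fitted with a hyperbolic discount function. The sketch below is illustrative only; the k values and the reward scenario are invented.

```python
# Hyperbolic discounting: present value of a delayed reward falls as k grows.

def hyperbolic_value(amount, delay_days, k):
    """Present value of a delayed reward under hyperbolic discounting."""
    return amount / (1 + k * delay_days)

benefit = 100.0                                    # a health gain one year away
high_k = hyperbolic_value(benefit, 365, k=0.01)    # steep discounter
low_k = hyperbolic_value(benefit, 365, k=0.001)    # shallow discounter
print(round(high_k, 1), round(low_k, 1))           # 21.5 vs 73.3
```

    A smoker with the higher k assigns the delayed health benefit roughly a third of the value the low-k individual does, which is the mechanism the reviewed studies link to smoking and failed cessation.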

  14. How to assess and compare inter-rater reliability, agreement and correlation of ratings: an exemplary analysis of mother-father and parent-teacher expressive vocabulary rating pairs

    PubMed Central

    Stolarova, Margarita; Wolf, Corinna; Rinker, Tanja; Brielmann, Aenne

    2014-01-01

    This report has two main purposes. First, we combine well-known analytical approaches to conduct a comprehensive assessment of agreement and correlation of rating pairs and to disentangle these often confused concepts, providing a best-practice example on concrete data and a tutorial for future reference. Second, we explore whether a screening questionnaire developed for use with parents can be reliably employed with daycare teachers when assessing early expressive vocabulary. A total of 53 vocabulary rating pairs (34 parent–teacher and 19 mother–father pairs) collected for two-year-old children (12 bilingual) are evaluated. First, inter-rater reliability both within and across subgroups is assessed using the intra-class correlation coefficient (ICC). Next, based on this analysis of reliability and on the test-retest reliability of the employed tool, inter-rater agreement is analyzed, and the magnitude and direction of rating differences are considered. Finally, Pearson correlation coefficients of standardized vocabulary scores are calculated and compared across subgroups. The results underline the necessity to distinguish between reliability measures, agreement and correlation. They also demonstrate the impact of the employed reliability measure on agreement evaluations. This study provides evidence that parent–teacher ratings of children's early vocabulary can achieve agreement and correlation comparable to those of mother–father ratings on the assessed vocabulary scale. Bilingualism of the evaluated child decreased the likelihood of raters' agreement. We conclude that future reports of agreement, correlation and reliability of ratings will benefit from better definition of terms and stricter methodological approaches. The methodological tutorial provided here holds the potential to increase comparability across empirical reports and can help improve research practices and knowledge transfer to educational and therapeutic settings. PMID:24994985
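
    The distinction the report insists on (correlation is not agreement) can be shown in a few lines. The scores below are invented for illustration: two raters can correlate perfectly while never agreeing on the level.

```python
from statistics import mean

def pearson(x, y):
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

mothers = [20, 35, 50, 65, 80]   # expressive-vocabulary raw scores (invented)
fathers = [30, 45, 60, 75, 90]   # systematically 10 words higher

r = pearson(mothers, fathers)
bias = mean(f - m for f, m in zip(fathers, mothers))
print(r, bias)  # r = 1.0, yet the raters disagree by a constant 10 words
```

    An ICC computed on these data would be penalized by the constant offset, which is why the report evaluates reliability, agreement and correlation separately.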

  15. Crewmember Performance Before, During, And After Spaceflight

    PubMed Central

    Kelly, Thomas H; Hienz, Robert D; Zarcone, Troy J; Wurster, Richard M; Brady, Joseph V

    2005-01-01

    The development of technologies for monitoring the welfare of crewmembers is a critical requirement for extended spaceflight. Behavior analytic methodologies provide a framework for studying the performance of individuals and groups, and brief computerized tests have been used successfully to examine the impairing effects of sleep, drug, and nutrition manipulations on human behavior. The purpose of the present study was to evaluate the feasibility and sensitivity of repeated performance testing during spaceflight. Four National Aeronautics and Space Administration crewmembers were trained to complete computerized questionnaires and performance tasks at repeated regular intervals before and after a 10-day shuttle mission and at times that interfered minimally with other mission activities during spaceflight. Two types of performance, Digit-Symbol Substitution trial completion rates and response times during the most complex Number Recognition trials, were altered slightly during spaceflight. All other dimensions of the performance tasks remained essentially unchanged over the course of the study. Verbal ratings of Fatigue increased slightly during spaceflight and decreased during the postflight test sessions. Arousal ratings increased during spaceflight and decreased postflight. No other consistent changes in rating-scale measures were observed over the course of the study. Crewmembers completed all mission requirements in an efficient manner with no indication of clinically significant behavioral impairment during the 10-day spaceflight. These results support the feasibility and utility of computerized task performances and questionnaire rating scales for repeated measurement of behavior during spaceflight. PMID:16262187

  16. Dynamic fatigue testing of Zerodur glass-ceramic

    NASA Technical Reports Server (NTRS)

    Tucker, Dennis S.

    1988-01-01

    The inherent brittleness of glass invariably leads to a large variability in strength data and a time dependence in strength. Loading rate plays a large role in strength values. Glass is found to be weaker when supporting loads over long periods of time as compared to glass which undergoes rapid loading. These properties complicate the structural design allowables for the utilization of glass components in an application such as the Advanced X-ray Astrophysics Facility (AXAF). The test methodology to obtain parameters which can be used to predict the reliability and lifetime of the Zerodur glass-ceramic to be used for the mirrors in the AXAF is described.
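
    In dynamic fatigue analysis, under power-law slow crack growth, log(strength) is linear in log(stress rate) with slope 1/(n+1), so the SCG parameter n falls out of a linear fit. The sketch below uses synthetic data generated with n = 30, not Zerodur measurements.

```python
import math

# Synthetic dynamic-fatigue data: strength grows as (stress rate)^(1/(n+1)).
n_true = 30
rates = [0.01, 0.1, 1.0, 10.0, 100.0]                         # MPa/s
strengths = [100.0 * r ** (1 / (n_true + 1)) for r in rates]  # MPa

# Least-squares slope on log-log axes recovers 1/(n+1).
xs = [math.log10(r) for r in rates]
ys = [math.log10(s) for s in strengths]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
n_est = 1.0 / slope - 1.0
print(round(n_est))  # recovers 30
```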

  17. Assessment of change in knowledge about research methods among delegates attending research methodology workshop.

    PubMed

    Shrivastava, Manisha; Shah, Nehal; Navaid, Seema

    2018-01-01

    In an era of evidence-based medicine, research is an essential part of the medical profession, whether clinical or academic. A research methodology workshop intends to help participants who are new to the research field as well as those already doing empirical research. The present study was conducted to assess changes in knowledge of the participants of a research methodology workshop through a structured questionnaire. With administrative and ethical approval, a four-day research methodology workshop was planned. Before the workshop began, participants completed a structured questionnaire (pre-test) containing 20 multiple-choice questions (Q1-Q20) related to the topics to be covered; a similar post-test questionnaire was administered after the workshop. The mean pre- and post-test scores were calculated and the results analyzed and compared. Of the 153 delegates, 45 (29%) were males and 108 (71%) were females. 92 (60%) participants consented to fill in the pre-test questionnaire and 68 (44%) filled in the post-test questionnaire. The mean pre-test and post-test scores at 95% confidence interval were 7.62 (SD ±3.220) and 9.66 (SD ±2.477), respectively. The difference was significant by paired-sample t-test (P < 0.003). Knowledge of the delegates increased after attending the research methodology workshop. Participatory research methodology workshops are a good method of imparting knowledge; the long-term effects need to be evaluated.
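
    The paired-sample analysis described above can be sketched as follows. The pre/post scores are invented (scores out of 20, mirroring the design), not the study's data.

```python
from statistics import mean, stdev

# Paired t statistic: mean of within-participant differences over its
# standard error; compare against a t table with n-1 degrees of freedom.

def paired_t(pre, post):
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / len(diffs) ** 0.5)

pre = [6, 8, 7, 5, 9, 10, 7, 6]
post = [9, 10, 9, 8, 10, 12, 9, 8]
t = paired_t(pre, post)
print(round(t, 2))  # well above the 5% critical value (2.36 at 7 df)
```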

  18. Novel optoelectronic methodology for testing of MOEMS

    NASA Astrophysics Data System (ADS)

    Pryputniewicz, Ryszard J.; Furlong, Cosme

    2003-01-01

    Continued demands for delivery of high performance micro-optoelectromechanical systems (MOEMS) place unprecedented requirements on methods used in their development and operation. Metrology is a major and inseparable part of these methods. Optoelectronic methodology is an essential field of metrology. Due to its scalability, optoelectronic methodology is particularly suitable for testing of MOEMS where measurements must be made with ever increasing accuracy and precision. This was particularly evident during the last few years, characterized by miniaturization of devices, when requirements for measurements have rapidly increased as the emerging technologies introduced new products, especially, optical MEMS. In this paper, a novel optoelectronic methodology for testing of MOEMS is described and its applications are illustrated with representative examples. These examples demonstrate capability to measure submicron deformations of various components of the micromirror device, under operating conditions, and show viability of the optoelectronic methodology for testing of MOEMS.

  19. An enhanced methodology for spacecraft correlation activity using virtual testing tools

    NASA Astrophysics Data System (ADS)

    Remedia, Marcello; Aglietti, Guglielmo S.; Appolloni, Matteo; Cozzani, Alessandro; Kiley, Andrew

    2017-11-01

    Test planning and post-test correlation activity have been issues of growing importance in the last few decades and many methodologies have been developed to either quantify or improve the correlation between computational and experimental results. In this article the methodologies established so far are enhanced with the implementation of a recently developed procedure called Virtual Testing. In the context of fixed-base sinusoidal tests (commonly used in the space sector for correlation), there are several factors in the test campaign that affect the behaviour of the satellite and are not normally taken into account when performing analyses: different boundary conditions created by the shaker's own dynamics, non-perfect control system, signal delays etc. All these factors are the core of the Virtual Testing implementation, which will be thoroughly explained in this article and applied to the specific case of Bepi-Colombo spacecraft tested on the ESA QUAD Shaker. Correlation activity will be performed in the various stages of the process, showing important improvements observed after applying the final complete methodology.

  20. Performance Modeling of Experimental Laser Lightcrafts

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Chen, Yen-Sen; Liu, Jiwen; Myrabo, Leik N.; Mead, Franklin B., Jr.; Turner, Jim (Technical Monitor)

    2001-01-01

    A computational plasma aerodynamics model is developed to study the performance of a laser propelled Lightcraft. The computational methodology is based on a time-accurate, three-dimensional, finite-difference, chemically reacting, unstructured grid, pressure-based formulation. The underlying physics are added and tested systematically using a building-block approach. The physics modeled include non-equilibrium thermodynamics, non-equilibrium air-plasma finite-rate kinetics, specular ray tracing, laser beam energy absorption and refraction by plasma, non-equilibrium plasma radiation, and plasma resonance. A series of transient computations are performed at several laser pulse energy levels and the simulated physics are discussed and compared with those of tests and literatures. The predicted coupling coefficients for the Lightcraft compared reasonably well with those of tests conducted on a pendulum apparatus.

  1. Novel methodology for pharmaceutical expenditure forecast

    PubMed Central

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    Background and objective The way new drugs are valued across countries is undergoing a disruption that makes the historical data used for forecasting pharmaceutical expenditure unreliable. Forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology for pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the ‘EU Pharmaceutical expenditure forecast’; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). Methods 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings from generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and of new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis.
Results This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. Conclusions This methodology was independent of historical data and appeared to be highly flexible and adapted to test robustness and provide probabilistic analysis to support policy decision making. PMID:27226843
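    The probabilistic sensitivity analysis of step 5 can be sketched as a Monte Carlo loop over the uncertain inputs; the distributions, ranges, and sales figure below are placeholders for illustration, not the report's calibrated values:

```python
import random

def generic_savings(originator_sales, discount, penetration):
    """Budget saving when a share `penetration` of sales shifts to a generic
    priced at (1 - discount) of the originator price."""
    return originator_sales * penetration * discount

def probabilistic_sensitivity(originator_sales, n=10000, seed=42):
    """Monte Carlo over uncertain discount and penetration (illustrative
    uniform ranges). Returns the median and a 95% uncertainty interval."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        discount = rng.uniform(0.3, 0.7)      # generic price discount
        penetration = rng.uniform(0.4, 0.9)   # generic volume share at peak
        draws.append(generic_savings(originator_sales, discount, penetration))
    draws.sort()
    return draws[n // 2], draws[int(0.025 * n)], draws[int(0.975 * n)]

median, lo, hi = probabilistic_sensitivity(100e6)  # 100 M originator market
```

The same loop extends naturally to the other inputs listed above (time to launch, time to peak sales, brand price impact) by sampling each from its own distribution.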

  2. Integrating Test-Form Formatting into Automated Test Assembly

    ERIC Educational Resources Information Center

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…

  3. Mechanical Behavior of Glidcop Al-15 at High Temperature and Strain Rate

    NASA Astrophysics Data System (ADS)

    Scapin, M.; Peroni, L.; Fichera, C.

    2014-05-01

    Strain rate and temperature are variables of fundamental importance for defining the mechanical behavior of materials. In some elastic-plastic models, the effects of these two quantities are assumed to act independently. This assumption can, in some cases, greatly simplify the experimental phase of parameter identification for the material model. Nevertheless, in several applications the material is subjected to dynamic loads at very high temperature, for example in machining operations or high-energy deposition on metals. In these cases, treating the effects of strain rate and temperature as decoupled may not be acceptable. In this perspective, this work describes a methodology for testing materials while varying both strain rate and temperature, applied to the mechanical characterization of Glidcop Al-15, a copper-based composite reinforced with an alumina dispersion, often used in nuclear applications. The tests at high strain rate were performed using a Hopkinson bar setup for direct tensile tests. The specimen was heated with an induction-coil system, and the temperature was controlled using signals from thermocouples welded directly onto the specimen surface. Varying the strain rate, Glidcop Al-15 shows a moderate strain-rate sensitivity at room temperature, which increases considerably at high temperature: material thermal softening and strain-rate hardening are strongly coupled. The experimental data were fitted using a modified formulation of the Zerilli-Armstrong model, which reproduces this kind of behavior with a good level of accuracy.
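    The coupled softening/hardening described here is exactly what a Zerilli-Armstrong-type flow-stress law captures through its T·ln(strain rate) term; a hedged sketch with illustrative coefficients (not the fitted Glidcop Al-15 parameters):

```python
import math

def flow_stress(eps, eps_rate, T, C0=30.0, C2=300.0, C3=0.0028, C4=0.000115):
    """FCC-type Zerilli-Armstrong flow stress (MPa). Thermal softening and
    strain-rate hardening are coupled through the T * ln(eps_rate) term.
    eps: plastic strain, eps_rate: strain rate (1/s), T: temperature (K)."""
    return C0 + C2 * math.sqrt(eps) * math.exp(-C3 * T + C4 * T * math.log(eps_rate))

# Rate sensitivity (high-rate / quasi-static stress ratio) grows with temperature:
ratio_room = flow_stress(0.1, 1000.0, 293.0) / flow_stress(0.1, 0.001, 293.0)
ratio_hot = flow_stress(0.1, 1000.0, 800.0) / flow_stress(0.1, 0.001, 800.0)
```

With these placeholder coefficients the model reproduces the qualitative trend reported for Glidcop Al-15: modest rate sensitivity at room temperature, larger at high temperature.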

  4. Experimental Methodology for Measuring Combustion and Injection-Coupled Responses

    NASA Technical Reports Server (NTRS)

    Cavitt, Ryan C.; Frederick, Robert A.; Bazarov, Vladimir G.

    2006-01-01

    A Russian scaling methodology for liquid rocket engines utilizing a single, full scale element is reviewed. The scaling methodology exploits the supercritical phase of the full scale propellants to simplify scaling requirements. Many assumptions are utilized in the derivation of the scaling criteria. A test apparatus design is presented to implement the Russian methodology and consequently verify the assumptions. This test apparatus will allow researchers to assess the usefulness of the scaling procedures and possibly enhance the methodology. A matrix of the apparatus capabilities for a RD-170 injector is also presented. Several methods to enhance the methodology have been generated through the design process.

  5. Methodology for earthquake rupture rate estimates of fault networks: example for the western Corinth rift, Greece

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Lyon-Caen, Hélène; Boiselet, Aurélien

    2017-10-01

    Modeling the seismic potential of active faults is a fundamental step of probabilistic seismic hazard assessment (PSHA). An accurate estimation of the rate of earthquakes on the faults is necessary in order to obtain the probability of exceedance of a given ground motion. Most PSHA studies consider faults as independent structures and neglect the possibility of multiple faults or fault segments rupturing simultaneously (fault-to-fault, FtF, ruptures). The Uniform California Earthquake Rupture Forecast version 3 (UCERF-3) model takes this possibility into account by considering a system-level approach rather than an individual-fault-level approach, using geological, seismological and geodetic information to invert the earthquake rates. In many parts of the world, however, seismological and geodetic information along fault networks is not well constrained. There is therefore a need for a methodology relying on geological information alone to compute earthquake rates of the faults in the network. In the proposed methodology, a simple distance criterion is used to define FtF ruptures, and single-fault or FtF ruptures are treated as an aleatory uncertainty, similarly to UCERF-3. Rates of earthquakes on faults are then computed under two constraints: the magnitude-frequency distribution (MFD) of earthquakes in the fault system as a whole must follow an a priori chosen shape, and the rate of earthquakes on each fault is determined by the slip rate specific to each segment, given the possible FtF ruptures.
The modeled earthquake rates are then compared to the available independent data (geodetic, seismological and paleoseismological) in order to weight the different hypotheses explored in a logic tree. The methodology is tested on the western Corinth rift (WCR), Greece, where recent advances have been made in understanding the geological slip rates of the complex network of normal faults accommodating the ~15 mm yr-1 north-south extension. Modeling results show that geological, seismological and paleoseismological rates of earthquakes cannot be reconciled with only single-fault-rupture scenarios and require hypothesizing a large spectrum of possible FtF rupture sets. In order to fit the imposed regional Gutenberg-Richter (GR) MFD target, some of the slip along certain faults needs to be accommodated either by interseismic creep or by post-seismic processes. Furthermore, the computed individual-fault MFDs differ depending on the position of each fault in the system and the possible FtF ruptures associated with it. Finally, a comparison of modeled earthquake rupture rates with those deduced from regional and local earthquake catalog statistics and local paleoseismological data indicates a better fit with the FtF rupture set constructed with a distance criterion of 5 km rather than 3 km, suggesting a high connectivity of faults in the WCR fault system.
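    The moment-budget constraint behind the imposed Gutenberg-Richter target can be sketched as follows: rates per magnitude bin follow the GR shape and are scaled so the summed moment release matches the geological moment rate. This is a simplification of the paper's inversion, and all parameters are illustrative:

```python
def gr_rates(mmin, mmax, b, moment_rate, dm=0.1):
    """Annual rupture rates per magnitude bin for a truncated Gutenberg-Richter
    MFD, scaled so the summed seismic moment release matches a geological
    moment rate (N*m/yr). Uses M0 = 10**(1.5*M + 9.1) N*m per event."""
    mags = [mmin + dm * i for i in range(int(round((mmax - mmin) / dm)) + 1)]
    weights = [10 ** (-b * m) for m in mags]          # GR shape (unnormalised)
    moments = [10 ** (1.5 * m + 9.1) for m in mags]   # scalar moment per event
    scale = moment_rate / sum(w * mo for w, mo in zip(weights, moments))
    return mags, [scale * w for w in weights]

# Illustrative fault system: M 5.0-6.5, b = 1, moment budget 1e17 N*m/yr.
mags, rates = gr_rates(5.0, 6.5, 1.0, 1e17)
```

In the paper's system-level setting, the per-bin rates would additionally be distributed over the single-fault and FtF rupture sets so that each fault's slip-rate budget is honoured.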

  6. Automated Test-Form Generation

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…

  7. Estimation of rates-across-sites distributions in phylogenetic substitution models.

    PubMed

    Susko, Edward; Field, Chris; Blouin, Christian; Roger, Andrew J

    2003-10-01

    Previous work has shown that it is often essential to account for the variation in rates at different sites in phylogenetic models in order to avoid phylogenetic artifacts such as long branch attraction. In most current models, the gamma distribution is used for the rates-across-sites distributions and is implemented as an equal-probability discrete gamma. In this article, we introduce discrete distribution estimates with large numbers of equally spaced rate categories allowing us to investigate the appropriateness of the gamma model. With large numbers of rate categories, these discrete estimates are flexible enough to approximate the shape of almost any distribution. Likelihood ratio statistical tests and a nonparametric bootstrap confidence-bound estimation procedure based on the discrete estimates are presented that can be used to test the fit of a parametric family. We applied the methodology to several different protein data sets, and found that although the gamma model often provides a good parametric model for this type of data, rate estimates from an equal-probability discrete gamma model with a small number of categories will tend to underestimate the largest rates. In cases when the gamma model assumption is in doubt, rate estimates coming from the discrete rate distribution estimate with a large number of rate categories provide a robust alternative to gamma estimates. An alternative implementation of the gamma distribution is proposed that, for equal numbers of rate categories, is computationally more efficient during optimization than the standard gamma implementation and can provide more accurate estimates of site rates.

  8. The relationship between vulnerable attachment style, psychopathology, drug abuse, and retention in treatment among methadone maintenance treatment patients.

    PubMed

    Potik, David; Peles, Einat; Abramsohn, Yahli; Adelson, Miriam; Schreiber, Shaul

    2014-01-01

    The relationship between vulnerable attachment style, psychopathology, drug abuse, and retention in treatment among patients in methadone maintenance treatment (MMT) was examined using the Vulnerable Attachment Style Questionnaire (VASQ), the Symptom Checklist-90 (SCL-90), and drug abuse urine tests. After six years, retention in treatment and repeated urine test results were studied. Patients with a vulnerable attachment style (a high VASQ score) had higher rates of drug abuse and higher psychopathology levels compared to patients with a secure attachment style, especially on the interpersonal sensitivity, anxiety, hostility, phobic anxiety, and paranoid ideation scales. Drug abstinence at baseline was related to retention in treatment and to higher rates of drug abstinence after six years in MMT, whereas a vulnerable attachment style did not predict drug abstinence and retention in treatment. Clinical implications concerning treatment of drug-abusing populations and methodological issues concerning the VASQ's subscales are also discussed.

  9. Development and Validation of a Translation Test.

    ERIC Educational Resources Information Center

    Ghonsooly, Behzad

    1993-01-01

    Translation testing methodology has been criticized for its subjective character. No real strides have so far been made in developing an objective translation test. In this paper, certain detailed procedures including various phases of pretesting have been performed to achieve objectivity and scorability in translation testing methodology. In…

  10. Methodological variations and their effects on reported medication administration error rates.

    PubMed

    McLeod, Monsey Chan; Barber, Nick; Franklin, Bryony Dean

    2013-04-01

    Medication administration errors (MAEs) are a problem, yet methodological variation between studies presents a potential barrier to understanding how best to increase safety. Using the UK as a case-study, we systematically summarised methodological variations in MAE studies, and their effects on reported MAE rates. Nine healthcare databases were searched for quantitative observational MAE studies in UK hospitals. Methodological variations were analysed and meta-analysis of MAE rates performed using studies that used the same definitions. Odds ratios (OR) were calculated to compare MAE rates between intravenous (IV) and non-IV doses, and between paediatric and adult doses. We identified 16 unique studies reporting three MAE definitions, 44 MAE subcategories and four different denominators. Overall adult MAE rates were 5.6% of a total of 21 533 non-IV opportunities for error (OE) (95% CI 4.6% to 6.7%) and 35% of a total of 154 IV OEs (95% CI 2% to 68%). MAEs were five times more likely in IV than non-IV doses (pooled OR 5.1; 95% CI 3.5 to 7.5). Including timing errors of ±30 min increased the MAE rate from 27% to 69% of 320 IV doses in one study. Five studies were unclear as to whether the denominator included dose omissions; omissions accounted for 0%-13% of IV doses and 1.8%-5.1% of non-IV doses. Wide methodological variations exist even within one country, some with significant effects on reported MAE rates. We have made recommendations for future MAE studies; these may be applied both within and outside the UK.
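    The IV vs non-IV comparison above rests on a standard 2×2-table odds ratio with a log-scale Wald confidence interval; a minimal sketch (the counts below are illustrative, not the review's data):

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and 95% CI for a 2x2 table: a/b = errors/error-free doses in
    group 1, c/d = errors/error-free doses in group 2 (Wald log-scale CI)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Illustrative: 50 MAEs in 150 IV doses vs 100 MAEs in 1100 non-IV doses.
or_, lo, hi = odds_ratio(50, 100, 100, 1000)
```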

  11. Brittle materials at high-loading rates: an open area of research

    PubMed Central

    2017-01-01

    Brittle materials are extensively used in many civil and military applications involving high-strain-rate loadings such as: blasting or percussive drilling of rocks, ballistic impact against ceramic armour or transparent windshields, plastic explosives used to damage or destroy concrete structures, soft or hard impacts against concrete structures, and so on. In all of these applications, brittle materials are subjected to intense loadings characterized by medium to extremely high strain rates (a few tens to several tens of thousands per second), leading to extreme and/or specific damage modes such as multiple fragmentation, dynamic cracking, pore collapse, shearing, mode II fracturing and/or microplasticity mechanisms in the material. Additionally, brittle materials exhibit complex features such as a strong strain-rate sensitivity and confining-pressure sensitivity that justify greater research efforts to understand them. Currently, the most popular dynamic testing techniques used for this are based on split Hopkinson pressure bar methodologies and/or plate-impact testing methods. However, these methods have some critical limitations and drawbacks when used to investigate the behaviour of brittle materials at high loading rates. The present theme issue of Philosophical Transactions A provides an overview of the latest experimental methods and numerical tools that are currently being developed to investigate the behaviour of brittle materials at high loading rates. This article is part of the themed issue ‘Experimental testing and modelling of brittle materials at high strain rates’. PMID:27956517
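    In the split Hopkinson pressure bar methodology mentioned here, the specimen strain rate is commonly recovered from the reflected-wave strain via the one-wave relation rate = -2·c0·ε_r/L_s; a minimal sketch with illustrative values:

```python
def shpb_strain_rate(eps_reflected, c0, specimen_length):
    """Specimen strain-rate history (1/s) from the reflected strain pulse
    eps_r(t) in a split Hopkinson pressure bar (one-wave analysis).
    c0: bar wave speed (m/s), specimen_length: gauge length (m)."""
    return [-2.0 * c0 * er / specimen_length for er in eps_reflected]

# Steel bars (c0 ~ 5000 m/s), 5 mm specimen, 0.1% reflected compressive strain:
rates = shpb_strain_rate([-0.001], 5000.0, 0.005)
```

Integrating this rate over the pulse duration gives the specimen strain history; the transmitted pulse gives the stress.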

  12. Tuning the Mechanical and Antimicrobial Performance of a Cu-Based Metallic Glass Composite through Cooling Rate Control and Annealing.

    PubMed

    Villapún, Victor M; Esat, F; Bull, S; Dover, L G; González, S

    2017-05-06

    The influence of cooling rate on the wear and antimicrobial performance of a Cu₅₂Zr₄₁Al₇ (at. %) bulk metallic glass (BMG) composite was studied and the results compared to those of the annealed sample (850 °C for 48 h) and to pure copper. The aim of this basic research is to explore the potential use of the material in preventing the spread of infections. The cooling rate is controlled by changing the mould diameter (2 mm and 3 mm) upon suction casting and by controlling the mould temperature (chiller on and off). For the highest cooling rate conditions CuZr is formed, but CuZr₂ starts to crystallise as the cooling rate decreases, resulting in an increase in wear resistance and brittleness, as measured by scratch tests. A decrease in the cooling rate also increases the antimicrobial performance, as shown by different methodologies (European, American and Japanese standards). Annealing leads to the formation of new intermetallic phases (Cu₁₀Zr₇ and Cu₂ZrAl), resulting in maximum scratch hardness and antimicrobial performance. However, the annealed sample corrodes during the antimicrobial tests (within 1 h of contact with broth). The antibacterial activity of copper proved higher than that of any of the other materials tested, but copper exhibits very poor wear properties. Cu-rich BMG composites with optimised microstructure would be preferable for applications where the durability requirements are higher than the antimicrobial needs.

  13. Falsifiability is not optional.

    PubMed

    LeBel, Etienne P; Berger, Derek; Campbell, Lorne; Loving, Timothy J

    2017-08-01

    Finkel, Eastwick, and Reis (2016; FER2016) argued that the post-2011 methodological reform movement has focused narrowly on replicability, neglecting other essential goals of research. We agree that multiple scientific goals are essential but argue that a more fine-grained language, conceptualization, and approach to replication is needed to accomplish these goals. Replication is the general empirical mechanism for testing and falsifying theory. Sufficiently methodologically similar replications, also known as direct replications, test the basic existence of phenomena and ensure that cumulative progress is possible a priori. In contrast, increasingly methodologically dissimilar replications, also known as conceptual replications, test the relevance of auxiliary hypotheses (e.g., manipulation and measurement issues, contextual factors) required to productively investigate validity and generalizability. Without prioritizing replicability, a field is not empirically falsifiable. We also disagree with FER2016's position that "bigger samples are generally better, but . . . that very large samples could have the downside of commandeering resources that would have been better invested in other studies" (abstract). We identify problematic assumptions in FER2016's modifications of our original research-economic model, and present an improved model that quantifies when (and whether) it is reasonable to worry that increasing statistical power will engender potential trade-offs. Sufficiently powering studies (i.e., >80%) maximizes both research efficiency and confidence in the literature (research quality). Given that we agree with FER2016 on all key open-science points, we are eager to see the accelerated cumulative knowledge development of social psychological phenomena that such a sufficiently transparent, powered, and falsifiable approach will generate. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. 76 FR 52892 - Energy Conservation Program: Energy Conservation Standards for Fluorescent Lamp Ballasts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-24

    ... between the DOE test data and the data submitted by NEMA; describe the methodological changes DOE is... differences between test data obtained by DOE and test data submitted by NEMA; (3) describe the methodological...

  15. Methodological strategies in using home sleep apnea testing in research and practice.

    PubMed

    Miller, Jennifer N; Schulz, Paula; Pozehl, Bunny; Fiedler, Douglas; Fial, Alissa; Berger, Ann M

    2017-11-14

    Home sleep apnea testing (HSAT) has increased due to improvements in technology, accessibility, and changes in third-party reimbursement requirements. Research studies using HSAT have not consistently reported procedures and methodological challenges. This paper had two objectives: (1) summarize the literature on the use of HSAT in research on adults, and (2) identify methodological strategies for standardizing HSAT procedures and information in research and practice. The search strategy targeted studies of participants undergoing sleep testing for OSA using HSAT; MEDLINE via PubMed, CINAHL, and Embase were searched with the following terms: "polysomnography," "home," "level III," "obstructive sleep apnea," and "out of center testing." Research articles that met inclusion criteria (n = 34) inconsistently reported methods and methodological challenges in terms of: (a) participant sampling; (b) instrumentation issues; (c) clinical variables; (d) data processing; and (e) patient acceptability. Ten methodological strategies were identified for adoption when using HSAT in research and practice. Future studies need to address the methodological challenges summarized in this paper as well as identify and report consistent HSAT procedures and information.

  16. ChargeOut! : determining machine and capital equipment charge-out rates using discounted cash-flow analysis

    Treesearch

    E.M. (Ted) Bilek

    2007-01-01

    The model ChargeOut! was developed to determine charge-out rates or rates of return for machines and capital equipment. This paper introduces a costing methodology and applies it to a piece of capital equipment. Although designed for the forest industry, the methodology is readily transferable to other sectors. Based on discounted cash-flow analysis, ChargeOut!...

  17. Indicators of Student Flow Rates in Honduras: An Assessment of an Alternative Methodology, with Two Methodologies for Estimating Student Flow Rates. BRIDGES Research Report No. 6.

    ERIC Educational Resources Information Center

    Cuadra, Ernesto; Crouch, Luis

    Student promotion, repetition, and dropout rates constitute the basic data needed to forecast future enrollment and new resources. Information on student flow is significantly related to policy formulation aimed at improving internal efficiency, because dropping out and grade repetition increase per pupil cost, block access to eligible school-age…

  18. Testing for nonlinearity in non-stationary physiological time series.

    PubMed

    Guarín, Diego; Delgado, Edilson; Orozco, Álvaro

    2011-01-01

    Testing for nonlinearity is one of the most important preprocessing steps in nonlinear time series analysis. Typically, this is done by means of linear surrogate data methods. However, the validity of the results depends heavily on the stationarity of the time series, and since most physiological signals are non-stationary, it is easy to falsely detect nonlinearity using the linear surrogate data methods. In this document, we propose a methodology that extends the procedure for generating constrained surrogate time series in order to assess nonlinearity in non-stationary data. The method is based on band-phase-randomized surrogates, which consist (contrary to the linear surrogate data methods) in randomizing only a portion of the Fourier phases, in the high-frequency domain. Analysis of simulated time series showed that, in comparison to the linear surrogate data method, our method is able to discriminate between linear stationary, linear non-stationary, and nonlinear time series. Applying our methodology to heart rate variability (HRV) records of five healthy patients, we found that nonlinear correlations are present in these non-stationary physiological signals.
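    A minimal sketch of the band-phase-randomization idea: randomize only the Fourier phases above a cutoff bin, keep the magnitudes and the low-frequency (trend-carrying) phases, and enforce conjugate symmetry so the surrogate stays real-valued. This is a from-scratch illustration using a naive DFT, not the authors' implementation:

```python
import cmath
import math
import random

def dft(x):
    """Naive O(N^2) discrete Fourier transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Naive inverse DFT; returns the real part (input assumed conjugate-symmetric)."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

def band_phase_surrogate(x, f_cut, seed=0):
    """Randomize Fourier phases only for bins f_cut..N/2-1, preserving all
    magnitudes and the low-frequency phases that carry the non-stationary
    trend. Conjugate symmetry keeps the surrogate real."""
    rng = random.Random(seed)
    X = dft(x)
    N = len(x)
    for k in range(f_cut, N // 2):  # DC..f_cut-1 and the Nyquist bin untouched
        phi = rng.uniform(0, 2 * math.pi)
        mag = abs(X[k])
        X[k] = mag * cmath.exp(1j * phi)
        X[N - k] = mag * cmath.exp(-1j * phi)
    return idft(X)
```

Because only phases change, the surrogate shares the original power spectrum (and hence linear correlations) while destroying any nonlinear structure in the randomized band.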

  19. Quantification of Anti-Aggregation Activity of Chaperones: A Test-System Based on Dithiothreitol-Induced Aggregation of Bovine Serum Albumin

    PubMed Central

    Borzova, Vera A.; Markossian, Kira A.; Kara, Dmitriy A.; Chebotareva, Natalia A.; Makeeva, Valentina F.; Poliansky, Nikolay B.; Muranov, Konstantin O.; Kurganov, Boris I.

    2013-01-01

    The methodology for quantification of the anti-aggregation activity of protein and chemical chaperones has been elaborated. The applicability of this methodology was demonstrated using a test-system based on dithiothreitol-induced aggregation of bovine serum albumin at 45°C as an example. Methods for calculating the initial rate of bovine serum albumin aggregation (v_agg) have been discussed. The comparison of the dependences of v_agg on the concentrations of intact and cross-linked α-crystallin allowed us to conclude that the non-linear character of the dependence of v_agg on the concentration of intact α-crystallin was due to the dynamic mobility of the quaternary structure of α-crystallin and the polydispersity of the α-crystallin–target protein complexes. To characterize the anti-aggregation activity of the chemical chaperones (arginine, arginine ethyl ester, arginine amide and proline), the semi-saturation concentration [L]0.5 was used. Among the chemical chaperones studied, arginine ethyl ester and arginine amide reveal the highest anti-aggregation activity ([L]0.5 = 53 and 58 mM, respectively). PMID:24058554

  20. Cross Time-Frequency Analysis for Combining Information of Several Sources: Application to Estimation of Spontaneous Respiratory Rate from Photoplethysmography

    PubMed Central

    Peláez-Coca, M. D.; Orini, M.; Lázaro, J.; Bailón, R.; Gil, E.

    2013-01-01

    A methodology that combines information from several nonstationary biological signals is presented. This methodology is based on time-frequency coherence, that quantifies the similarity of two signals in the time-frequency domain. A cross time-frequency analysis method, based on quadratic time-frequency distribution, has been used for combining information of several nonstationary biomedical signals. In order to evaluate this methodology, the respiratory rate from the photoplethysmographic (PPG) signal is estimated. The respiration provokes simultaneous changes in the pulse interval, amplitude, and width of the PPG signal. This suggests that the combination of information from these sources will improve the accuracy of the estimation of the respiratory rate. Another target of this paper is to implement an algorithm which provides a robust estimation. Therefore, respiratory rate was estimated only in those intervals where the features extracted from the PPG signals are linearly coupled. In 38 spontaneous breathing subjects, among which 7 were characterized by a respiratory rate lower than 0.15 Hz, this methodology provided accurate estimates, with the median error {0.00; 0.98} mHz ({0.00; 0.31}%) and the interquartile range error {4.88; 6.59} mHz ({1.60; 1.92}%). The estimation error of the presented methodology was largely lower than the estimation error obtained without combining different PPG features related to respiration. PMID:24363777

  1. Direct access to dithiobenzoate RAFT agent fragmentation rate coefficients by ESR spin-trapping.

    PubMed

    Ranieri, Kayte; Delaittre, Guillaume; Barner-Kowollik, Christopher; Junkers, Thomas

    2014-12-01

    The β-scission rate coefficient of tert-butyl radicals fragmenting off the intermediate resulting from their addition to tert-butyl dithiobenzoate, a reversible addition-fragmentation chain transfer (RAFT) agent, is estimated as a function of temperature via the recently introduced electron spin resonance (ESR) spin-trapping methodology. The methodology is critically evaluated and found to be reliable. At 20 °C, a fragmentation rate coefficient close to 0.042 s⁻¹ is observed, whereas the activation parameters for the fragmentation reaction, determined for the first time, read E_A = 82 ± 13.3 kJ mol⁻¹ and A = (1.4 ± 0.25) × 10¹³ s⁻¹. The ESR spin-trapping methodology thus efficiently probes the stability of the RAFT adduct radical under conditions relevant to the pre-equilibrium of the RAFT process. It particularly indicates that stable RAFT adduct radicals are indeed formed in the early stages of RAFT polymerization, at least when dithiobenzoates are employed as controlling agents, as stipulated by the so-called slow-fragmentation theory. By design of the methodology, the obtained fragmentation rate coefficients represent an upper limit. The ESR spin-trapping methodology is thus a suitable tool for evaluating the fragmentation rate coefficients of a wide range of RAFT adduct radicals. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
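    The reported activation parameters imply the quoted room-temperature coefficient through the Arrhenius relation k = A·exp(-E_A/(R·T)); a quick consistency check using the paper's central values:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def fragmentation_rate(T_kelvin, A=1.4e13, Ea=82.0e3):
    """Arrhenius rate coefficient k = A * exp(-Ea / (R*T)), with the paper's
    central values A = 1.4e13 s^-1 and Ea = 82 kJ/mol."""
    return A * math.exp(-Ea / (R * T_kelvin))

k20 = fragmentation_rate(293.15)  # ~0.03 s^-1
```

The central values give roughly 0.03 s⁻¹ at 20 °C, consistent with the quoted 0.042 s⁻¹ within the stated uncertainties on A and E_A.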

  2. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  3. The Study of Productivity Measurement and Incentive Methodology (Phase III - Paper Test). Volume 1

    DTIC Science & Technology

    1986-03-14

    MFPMM makes it possible to measure explicitly, in terms of dollars, the profit impacts of these uncontrollable as well as controllable factors and to determine and...the rate of engineering changes increases. Production processes are becoming less reliant on direct labor as the primary factor in pro...

  4. Development of Fatigue and Crack Propagation Design and Analysis Methodology in a Corrosive Environment for Typical Mechanically-Fastened Joints. Volume 2. State-of-the-Art Assessment.

    DTIC Science & Technology

    1983-03-01

    [120] hypothesized a linear summation model to predict the corrosion-fatigue behavior above K_ISCC for a high-strength steel. The model considers the...[120] could satisfactorily predict the rates of corrosion-fatigue-crack growth for 18-Ni maraging steels tested in several gaseous and aqueous...NADC-83126-60 Vol. II 6. The corrosion-fatigue behavior of titanium alloys is very complex. Therefore, a better understanding of corrosion fatigue

  5. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires the use of a project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  6. 77 FR 59348 - Revisions to Page 700 of FERC Form No. 6

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-27

    .... The components of an oil pipeline's rate base are governed by the Trended Original Cost Methodology... ratemaking methodology to the Trended Original Cost methodology as adopted in Opinion 154-B. The SRB was to... trended original cost methodology divides the nominal return on equity component of the cost of service...

  7. CO2 Washout Testing of the REI and EM-ACES Space Suits

    NASA Technical Reports Server (NTRS)

    Mitchell, Kate; Norcross, Jason

    2011-01-01

    Requirements for using a space suit during ground testing include providing adequate carbon dioxide (CO2) washout for the suited subject. Acute CO2 exposure can lead to symptoms including headache, dyspnea, lethargy and eventually unconsciousness or even death. Symptoms depend on several factors including partial pressure of CO2 (ppCO2), duration of exposure, metabolic rate of the subject and physiological differences between subjects. The objective of this test was to characterize inspired oronasal ppCO2 in the Rear Entry I-Suit (REI) and the Enhanced Mobility Advanced Crew Escape Suit (EM-ACES) across a range of workloads and flow rates for which ground testing is nominally performed. Three subjects were tested in each suit. In all but one case, each subject performed the test twice to allow for comparison between tests. Suit pressure was maintained at 4.3 psid. Subjects wore the suit while resting, performing arm ergometry, and walking on a treadmill to generate metabolic workloads of approximately 500 to 3000 BTU/hr. Supply airflow was varied at 6, 5 and 4 actual cubic feet per minute (ACFM) at each workload. Subjects wore an oronasal mask with an open port in front of the mouth and were allowed to breathe freely. Oronasal ppCO2 was monitored real-time via gas analyzers with sampling tubes connected to the oronasal mask. Metabolic rate was calculated from the total CO2 production measured by an additional gas analyzer at the air outlet from the suit. Real-time metabolic rate was used to adjust the arm ergometer or treadmill workload to meet target metabolic rates. In both suits, inspired CO2 was primarily affected by the metabolic rate of the subject, with increased metabolic rate resulting in increased inspired ppCO2. Suit flow rate also affected inspired ppCO2, with decreased flow causing small increases in inspired ppCO2. The effect of flow was more evident at metabolic rates greater than or equal to 2000 BTU/hr. 
Results were consistent between suits, with the EM-ACES demonstrating slightly better CO2 washout than the REI suit, although the difference was not statistically significant. Regression equations were developed for each suit to predict the mean inspired ppCO2 as a function of metabolic rate and suit flow rate. This paper provides detailed descriptions of the test hardware, methodology and results, as well as implications for future ground testing in the REI and EM-ACES.
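A two-predictor regression of the kind described (mean inspired ppCO2 as a function of metabolic rate and suit flow) can be sketched with an ordinary least-squares fit. The calibration numbers below are invented for illustration only; they are not the test data or the published regression coefficients.

```python
import numpy as np

# Hypothetical calibration points, NOT the actual test data:
# (metabolic rate [BTU/hr], suit flow [ACFM]) -> inspired ppCO2 [mmHg]
met_rate = np.array([500, 1000, 2000, 3000, 500, 1000, 2000, 3000], dtype=float)
flow     = np.array([6, 6, 6, 6, 4, 4, 4, 4], dtype=float)
ppco2    = np.array([2.0, 3.1, 5.4, 8.0, 2.4, 3.7, 6.5, 9.6])

# Design matrix for the linear model ppCO2 ~ b0 + b1*met_rate + b2*flow
X = np.column_stack([np.ones_like(met_rate), met_rate, flow])
coef, *_ = np.linalg.lstsq(X, ppco2, rcond=None)

def predict_ppco2(met, acfm):
    """Predicted mean inspired ppCO2 [mmHg] for a given workload and flow."""
    return coef[0] + coef[1] * met + coef[2] * acfm

print(predict_ppco2(1500, 5))
```

Consistent with the abstract, the fitted coefficients make ppCO2 rise with metabolic rate (b1 > 0) and fall with supply flow (b2 < 0).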

  8. SRB ascent aerodynamic heating design criteria reduction study, volume 1

    NASA Technical Reports Server (NTRS)

    Crain, W. K.; Frost, C. L.; Engel, C. D.

    1989-01-01

An independent set of solid rocket booster (SRB) convective ascent design environments was produced to serve as a check on the Rockwell IVBC-3 environments used to design the ascent phase of flight. In addition, support was provided for lowering the design environments so that Thermal Protection System (TPS) material sized on conservative estimates could be removed, leading to a reduction in SRB refurbishment time and cost. Ascent convective heating rates and loads were generated at locations on the SRB where lowering the thermal environment would impact the TPS design. The ascent thermal environments are documented along with the wind tunnel/flight test data base used, as well as the trajectory and environment generation methodology. The methodology, together with environment summaries compared to the 1980 Design and Rockwell IVBC-3 Design Environments, is presented in this volume.

  9. A methodological analysis of chaplaincy research: 2000-2009.

    PubMed

    Galek, Kathleen; Flannelly, Kevin J; Jankowski, Katherine R B; Handzo, George F

    2011-01-01

This article presents a comprehensive review and analysis of quantitative research conducted in the United States on chaplaincy and closely related topics published between 2000 and 2009. A combined search strategy identified 49 quantitative studies in 13 journals. The analysis focuses on the methodological sophistication of the studies, compared to earlier research on chaplaincy and pastoral care. Cross-sectional surveys of convenience samples still dominate the field, but sample sizes have increased somewhat over the past three decades. Reporting of the validity and reliability of measures continues to be low, although reporting of response rates has improved. Improvements in the use of inferential statistics and statistical controls were also observed, compared to previous research. The authors conclude that more experimental research is needed on chaplaincy, along with an increased use of hypothesis testing, regardless of the research designs that are used.

  10. Methods for Measuring Specific Rates of Mercury Methylation and Degradation and Their Use in Determining Factors Controlling Net Rates of Mercury Methylation

    PubMed Central

    Ramlal, Patricia S.; Rudd, John W. M.; Hecky, Robert E.

    1986-01-01

A method was developed to estimate specific rates of demethylation of methyl mercury in aquatic samples by measuring the volatile 14C end products of 14CH3HgI demethylation. This method was used in conjunction with a 203Hg2+ radiochemical method which determines specific rates of mercury methylation. Together, these methods enabled us to examine some factors controlling the net rate of mercury methylation. The methodologies were field tested using lake sediment samples from a recently flooded reservoir in the Southern Indian Lake system, which had developed a mercury contamination problem in fish. Ratios of the specific rates of methylation/demethylation were calculated; the highest ratios occurred in the flooded shorelines of Southern Indian Lake. These results provide an explanation for the observed increases in the methyl mercury concentrations in fish after flooding. PMID:16346959

  11. Pervious concrete as one of the technologies to overcome puddles

    NASA Astrophysics Data System (ADS)

    Agung Putra Handana, M.; Karolina, Rahmi; Syahputra, Eko; Zulfikar

    2018-03-01

    Some construction waste has been utilized as a material in certain concrete compositions for engineering building materials. One source is concrete that has been discarded after laboratory testing, known as recycled concrete. The disposed concrete was crushed and sieved through 50, 37.5, 19, 9.5, and 4.75 mm sieves and subsequently used as coarse aggregate in the manufacture of pervious concrete, to be tested for compressive strength and water infiltration rate. The pervious concrete test specimens were cylinders of (15 x 30) cm and plates of (100 x 100 x 10) cm, with a design strength of fc' = 15 MPa at an age of 28 days. The research methodology consisted of wear testing, test specimen preparation, periodic maintenance, visual inspection, compressive strength testing, and measurement of the specimens' water infiltration rate (based on ASTM C1701). Specimens were treated by periodic spraying before testing. The Los Angeles wear test showed that the recycled aggregate has an average wear rate of 20.88% (based on SNI 03-2417-1991 for the Los Angeles test), and visual inspection of the specimens was satisfactory (based on SNI 03-0691-1996 on paving blocks), qualifying them for further testing. The largest compressive strength was found in pervious concrete with 9.5 mm graded aggregate, at 5.89 MPa, while the smallest, for the 50 mm gradation, was 2.15 MPa, about 28% of the compressive strength of pervious concrete in general (based on SNI 03-6805-2002). The fastest infiltration occurred in the 50 mm gradation, at 4.52 in/hr, and the slowest in the 9.5 mm gradation, at 2.068 in/hr, a 54.25% reduction from the 50 mm to the 9.5 mm gradation. This is in accordance with the purpose of pervious concrete: draining water to the layer below.

  12. Methodological Choices in Rating Speech Samples

    ERIC Educational Resources Information Center

    O'Brien, Mary Grantham

    2016-01-01

    Much pronunciation research critically relies upon listeners' judgments of speech samples, but researchers have rarely examined the impact of methodological choices. In the current study, 30 German native listeners and 42 German L2 learners (L1 English) rated speech samples produced by English-German L2 learners along three continua: accentedness,…

  13. 77 FR 52110 - Agency Response to Public Comments of Safety Measurement System Changes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-28

    ... University of Michigan Transportation Research Institute ( http://csa.fmcsa.dot.gov/Documents/Evaluation-of... the revised methodology have a 3.9% greater future crash rate and 3.6% greater future HM violation rate than those previously identified for intervention using the existing SMS methodology. Details...

  14. Assessing technical performance at diverse ambulatory care sites.

    PubMed

    Osterweis, M; Bryant, E

    1978-01-01

    The purpose of the large study reported here was to develop and test methods for assessing the quality of health care that would be broadly applicable to diverse ambulatory care organizations for periodic comparative review. Methodological features included the use of an age-sex stratified random sampling scheme, dependence on medical records as the source of data, a fixed study period year, use of Kessner's tracer methodology (including not only acute and chronic diseases but also screening and immunization rates as indicators), and a fixed tracer matrix at all test sites. This combination of methods proved more efficacious in estimating certain parameters for the total patient populations at each site (including utilization patterns, screening, and immunization rates) and the process of care for acute conditions than it did in examining the process of care for the selected chronic condition. It was found that the actual process of care at all three sites for the three acute conditions (streptococcal pharyngitis, urinary tract infection, and iron deficiency anemia) often differed from the expected process in terms of both diagnostic procedures and treatment. For hypertension, the chronic disease tracer, medical records were frequently a deficient data source from which to draw conclusions about the adequacy of treatment. Several aspects of the study methodology were found to be detrimental to between-site comparisons of the process of care for chronic disease management. The use of an age-sex stratified random sampling scheme resulted in the identification of too few cases of hypertension at some sites for analytic purposes, thereby necessitating supplementary sampling by diagnosis. The use of a fixed study period year resulted in an arbitrary starting point in the course of the disease. 
Furthermore, in light of the diverse sociodemographic characteristics of the patient populations, the use of a fixed matrix of tracer conditions for all test sites is questionable. The discussion centers on these and other problems encountered in attempting to compare technical performance within diverse ambulatory care organizations and provides some guidelines as to the utility of alternative methods for assessing the quality of health care.

  15. An appraisal of the utility or futility of ENT consultant postal questionnaires.

    PubMed

    Ryan, Stephen; Saunders, J; Clarke, E; Fenton, J E

    2013-03-01

    Despite an increase in ENT postal questionnaires, the quality of their methodology has been questioned (Ramphul et al. in J Laryngol Otol 119:175-178, 1). This retrospective study examined whether the quality and utility of such questionnaires published since 2005 has improved. Seventeen consultant postal questionnaires published between 2005 and 2012 were reviewed. Quality of questionnaires was assessed using a 30-point score based on compliance with 15 criteria previously established to evaluate postal questionnaire study design (Ramphul et al. in J Laryngol Otol 119:175-178, 1). Citation rates were used as an indicator of utility. The specific comments made in each citing paper were reviewed to determine whether the questionnaire findings had an impact on clinical practice and whether the citing comments were positive, negative or non-specific. Recurrent methodological flaws were identified in all questionnaires. The average score assigned was 44 %, versus 32 % previously reported (Ramphul et al. in J Laryngol Otol 119:175-178, 1) (P < 0.01, Student's t test). The low citation rate demonstrates poor utility for postal questionnaires. Citations were general non-specific referencing, with no clear indication that questionnaire findings positively impacted clinical practice. In conclusion, although the quality of ENT postal questionnaires has improved since the original study (Ramphul et al. in J Laryngol Otol 119:175-178, 1), important recurring methodological flaws still exist. The poor utility, based on low citation rates, also reflects the continued deficiencies in design quality. It is recommended that authors of questionnaire-based research ensure that guidelines for questionnaire design are adhered to in order to improve the validity of findings and hence the impact on clinical practice.

  16. The reporting characteristics and methodological quality of Cochrane reviews about health policy research.

    PubMed

    Xiu-xia, Li; Ya, Zheng; Yao-long, Chen; Ke-hu, Yang; Zong-jiu, Zhang

    2015-04-01

    The systematic review has increasingly become a popular tool for researching health policy. However, due to the complexity and diversity of health policy research, it has also encountered more challenges. We took the Cochrane reviews on health policy research as representative to provide a first examination of their epidemiological and descriptive characteristics, as well as the compliance of their methodological quality with AMSTAR. Ninety-nine reviews met the inclusion criteria: 73% concerned Implementation Strategies, 15% Financial Arrangements and 12% Governance Arrangements; the topics involved Public Health (34%), Theoretical Exploration (18%), Hospital Management (17%), Medical Insurance (12%), Pharmaceutical Policy (9%), Community Health (7%) and Rural Health (2%). Only 39% conducted meta-analysis, 49% reported being updates, and none was rated low in methodological quality. Our research reveals that the quantity and quality of the evidence should be improved, especially for Financial Arrangements and Governance Arrangements involving Rural Health, Health Care Reform and Health Equity. The reliability of AMSTAR also needs to be tested over a wider range of reviews in this field. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. Determining radiated sound power of building structures by means of laser Doppler vibrometry

    NASA Astrophysics Data System (ADS)

    Roozen, N. B.; Labelle, L.; Rychtáriková, M.; Glorieux, C.

    2015-06-01

    This paper introduces a methodology that makes use of laser Doppler vibrometry to assess the acoustic insulation performance of a building element. The sound power radiated by the surface of the element is numerically determined from the vibrational pattern, offering an alternative to classical microphone measurements. Compared to the latter, the proposed analysis is not sensitive to room acoustical effects. This allows the proposed methodology to be used at low frequencies, where the standardized microphone-based approach suffers from high uncertainty due to low acoustic modal density. Standardized measurements as well as laser Doppler vibrometry measurements and computations have been performed on two test panels, a light-weight wall and a gypsum block wall, and are compared and discussed in this paper. The proposed methodology offers an adequate solution for assessing the acoustic insulation of building elements at low frequencies. This is crucial in the framework of recent proposals to extend acoustic measurement standards and single-number sound insulation performance ratings down to 50 Hz.
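The idea of computing radiated sound power from a measured surface-velocity map can be illustrated with a minimal elementary-radiator sketch. This assumes each scan cell radiates with unit radiation efficiency (a high-frequency approximation); the paper itself determines the radiation numerically, so this is an illustrative stand-in, not the authors' method.

```python
import numpy as np

# Ambient properties of air
rho_air = 1.21   # density [kg/m^3]
c_air = 343.0    # speed of sound [m/s]

def radiated_power(v_rms, cell_area):
    """Estimate radiated sound power [W] from a 2-D array of RMS normal
    surface velocities [m/s], assuming unit radiation efficiency:
    W = rho * c * sum(|v|^2 * dS)."""
    return rho_air * c_air * cell_area * np.sum(np.abs(v_rms) ** 2)

# Example: a 10x10 LDV scan grid over a 1 m^2 panel, uniform 1 mm/s RMS
v = np.full((10, 10), 1e-3)
W = radiated_power(v, cell_area=0.01)
print(W)  # watts
```

At lower frequencies the actual radiation efficiency is below 1, so this simple sum is an upper bound; that is one reason the paper resorts to a numerical computation.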

  18. Changes in clinical trials methodology over time: a systematic review of six decades of research in psychopharmacology.

    PubMed

    Brunoni, André R; Tadini, Laura; Fregni, Felipe

    2010-03-03

    There have been many changes in clinical trials methodology since the introduction of lithium and the beginning of the modern era of psychopharmacology in 1949. The nature and importance of these changes have not been fully addressed to date. As methodological flaws in trials can lead to false-negative or false-positive results, the objective of our study was to evaluate the impact of methodological changes in psychopharmacology clinical research over the past 60 years. We performed a systematic review from 1949 to 2009 on the MEDLINE and Web of Science electronic databases, and a hand search of high-impact journals, for studies of seven major drugs (chlorpromazine, clozapine, risperidone, lithium, fluoxetine and lamotrigine). All controlled studies published within 100 months of the first trial were included. Ninety-one studies met our inclusion criteria. We analyzed the major changes in abstract reporting, study design, participants' assessment and enrollment, methodology and statistical analysis. Our results showed that the methodology of psychiatric clinical trials changed substantially, with quality gains in abstract reporting, results reporting, and statistical methodology. Recent trials make greater use of informed consent, washout periods, intention-to-treat analysis and parametric tests. Placebo use remains high and unchanged over time. Clinical trial quality of psychopharmacological studies has changed significantly in most of the aspects we analyzed. There was significant improvement in quality reporting and internal validity. These changes have increased study efficiency; however, there is room for improvement in some aspects such as rating scales, diagnostic criteria and better trial reporting. Therefore, despite the advancements observed, there are still several areas that can be improved in psychopharmacology clinical trials.

  19. Rates of climatic niche evolution are correlated with species richness in a large and ecologically diverse radiation of songbirds.

    PubMed

    Title, Pascal O; Burns, Kevin J

    2015-05-01

    By employing a recently inferred phylogeny and museum occurrence records, we examine the relationship of ecological niche evolution to diversification in the largest family of songbirds, the tanagers (Thraupidae). We test whether differences in species numbers in the major clades of tanagers can be explained by differences in the rate of climatic niche evolution. We develop a methodological pipeline to process and filter occurrence records. We find that, of the ecological variables examined, clade richness is higher in clades with a higher rate of climatic niche evolution, and that this rate is also greater for clades that occupy a greater extent of climatic space. Additionally, we find that more speciose clades contain species with narrower niche breadths, suggesting that clades in which species are more successful at diversifying across climatic gradients have greater potential for speciation or are more buffered from the risk of extinction. © 2015 John Wiley & Sons Ltd/CNRS.

  20. Fatigue life and crack growth prediction methodology

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.; Phillips, E. P.; Everett, R. A., Jr.

    1993-01-01

    The capabilities of a plasticity-induced crack-closure model and life-prediction code to predict fatigue crack growth and fatigue lives of metallic materials are reviewed. Crack-tip constraint factors, to account for three-dimensional effects, were selected to correlate large-crack growth rate data as a function of the effective-stress-intensity factor range (delta(K(sub eff))) under constant-amplitude loading. Some modifications to the delta(K(sub eff))-rate relations were needed in the near threshold regime to fit small-crack growth rate behavior and endurance limits. The model was then used to calculate small- and large-crack growth rates, and in some cases total fatigue lives, for several aluminum and titanium alloys under constant-amplitude, variable-amplitude, and spectrum loading. Fatigue lives were calculated using the crack growth relations and microstructural features like those that initiated cracks. Results from the tests and analyses agreed well.
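A fatigue-life calculation of the kind reviewed above can be sketched by numerically integrating a Paris-type crack-growth law in the effective stress-intensity range. The constants, geometry factor, and stress range below are hypothetical placeholders, not values from the paper or from FASTRAN-style closure modeling.

```python
import math

# Hypothetical crack-growth constants for da/dN = C * (dK_eff)^m
C, m = 1e-11, 3.0            # units: m/cycle with dK in MPa*sqrt(m)
delta_S = 100.0              # assumed effective stress range [MPa]
Y = 1.12                     # assumed geometry factor (edge crack, held constant)

def cycles_to_grow(a0, af, steps=100_000):
    """Integrate dN = da / (C * dK_eff^m) from crack length a0 to af [m]."""
    da = (af - a0) / steps
    n_cycles = 0.0
    a = a0
    for _ in range(steps):
        dK = Y * delta_S * math.sqrt(math.pi * a)   # dK_eff at current length
        n_cycles += da / (C * dK ** m)
        a += da
    return n_cycles

N = cycles_to_grow(a0=1e-4, af=1e-2)  # grow from 0.1 mm to 10 mm
print(int(N))
```

Because most of the life is spent while the crack is small, the result is dominated by the near-threshold behavior, which is why the paper's modifications to the delta(K(sub eff))-rate relation in that regime matter for total-life prediction.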

  1. Evaluation of criteria for developing traffic safety materials for Latinos.

    PubMed

    Streit-Kaplan, Erica L; Miara, Christine; Formica, Scott W; Gallagher, Susan Scavo

    2011-03-01

    This quantitative study assessed the validity of guidelines that identified four key characteristics of culturally appropriate Spanish-language traffic safety materials: language, translation, formative evaluation, and credible source material. From a sample of 190, the authors randomly selected 12 Spanish-language educational materials for analysis by 15 experts. Hypotheses included that the experts would rate materials with more of the key characteristics as more effective (likely to affect behavioral change) and rate materials originally developed in Spanish and those that utilized formative evaluation (e.g., pilot tests, focus groups) as more culturally appropriate. Although results revealed a weak association between the number of key characteristics in a material and the rating of its effectiveness, reviewers rated materials originally created in Spanish and those utilizing formative evaluation as significantly more culturally appropriate. The findings and methodology demonstrated important implications for developers and evaluators of any health-related materials for Spanish speakers and other population groups.

  2. Estimation of Apollo Lunar Dust Transport using Optical Extinction Measurements

    NASA Astrophysics Data System (ADS)

    Lane, John E.; Metzger, Philip T.

    2015-04-01

    A technique to estimate mass erosion rate of surface soil during landing of the Apollo Lunar Module (LM) and total mass ejected due to the rocket plume interaction is proposed and tested. The erosion rate is proportional to the product of the second moment of the lofted particle size distribution N(D), and third moment of the normalized soil size distribution S(D), divided by the integral of S(D)ṡD2/v(D), where D is particle diameter and v(D) is the vertical component of particle velocity. The second moment of N(D) is estimated by optical extinction analysis of the Apollo cockpit video. Because of the similarity between mass erosion rate of soil as measured by optical extinction and rainfall rate as measured by radar reflectivity, traditional NWS radar/rainfall correlation methodology can be applied to the lunar soil case where various S(D) models are assumed corresponding to specific lunar sites.
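Written out with the moments denoted $M_k[f]=\int f(D)\,D^k\,dD$, the proportionality stated in the abstract reads as follows (the proportionality constant and integration limits are omitted here):

```latex
\dot{m} \;\propto\; \frac{M_2[N]\; M_3[S]}{\displaystyle\int \frac{S(D)\,D^{2}}{v(D)}\,dD},
\qquad M_k[f] \equiv \int f(D)\,D^{k}\,dD
```

where $\dot{m}$ is the mass erosion rate, $N(D)$ the lofted particle size distribution, $S(D)$ the normalized soil size distribution, and $v(D)$ the vertical particle velocity, as defined in the abstract.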

  3. Modeling the Rapid Boil-Off of a Cryogenic Liquid When Injected into a Low Pressure Cavity

    NASA Technical Reports Server (NTRS)

    Lira, Eric

    2016-01-01

    Many launch vehicle cryogenic applications require modeling the injection of a cryogenic liquid into a low-pressure cavity. The difficulty of such analyses lies in accurately predicting the heat transfer coefficient between the cold liquid and a warm wall in a low-pressure environment. The heat transfer coefficient and the behavior of the liquid are highly dependent on the mass flow rate into the cavity, the cavity wall temperature and the cavity volume. Testing was performed to correlate the models built with the Thermal Desktop and SINDA/FLUINT thermal and fluids analysis software. This presentation describes a methodology to model the cryogenic process using SINDA/FLUINT, the cryogenic test setup, the test procedure, and how the model was correlated to match the test results.

  4. Source credibility and evidence format: examining the effectiveness of HIV/AIDS messages for young African Americans.

    PubMed

    Major, Lesa Hatley; Coleman, Renita

    2012-01-01

    Using experimental methodology, this study tests the effectiveness of HIV/AIDS prevention messages tailored specifically to college-aged African Americans. To test interaction effects, it crosses source role with evidence format. The authors used gain-framed and loss-framed information specific to young African Americans and HIV to test message effectiveness between statistical and emotional evidence formats, and, for the first time, a combined statistical/emotional format. It tests which source--physician or minister--young African Americans believe is more effective when delivering HIV/AIDS messages to young African Americans. By testing the interaction between source credibility and evidence format, this research expands knowledge on creating effective health messages in several major areas. Findings include a significant interaction between the role of physician and the combined statistical/emotional format. This message was rated as the most effective way to deliver HIV/AIDS prevention messages.

  5. Assessment of variations in wear test methodology.

    PubMed

    Gouvêa, Cresus V D; Weig, Karin; Filho, Thales R M; Barros, Renata N

    2010-01-01

    The properties of composite resin for dental fillings have been improved by development, but its weakness continues to be its wear strength. Several tests have been proposed to evaluate wear in composite resin materials. The aim of this study was to verify how polishing and the type of abrasive influence the wear rate of composite resin. The test was carried out on two groups, each employing an ormocer and a hybrid composite. In one group the composites were polished with abrasive paper; in the other group they were polished with the same abrasive paper plus 1 microm and 0.25 microm grit diamond pastes. A three-body wear test was performed using the metal sphere of the wear test machine, the composite and an abrasive. A diamond paste and an aluminum oxide dispersion were used as abrasives. Analysis of the results showed that there was no difference between polishing techniques, but revealed a difference between abrasives.

  6. The Impact of Explicit Teaching of Methodological Aspects of Physics on Scientistic Beliefs and Interest

    ERIC Educational Resources Information Center

    Korte, Stefan; Berger, Roland; Hänze, Martin

    2017-01-01

    We assessed the impact of teaching methodological aspects of physics on students' scientistic beliefs and subject interest in physics in a repeated-measurement design with a total of 142 students of upper secondary physics classes. Students gained knowledge of methodological aspects from the pre-test to the post-test and reported reduced…

  7. On Improving the Experiment Methodology in Pedagogical Research

    ERIC Educational Resources Information Center

    Horakova, Tereza; Houska, Milan

    2014-01-01

    The paper shows how the methodology for a pedagogical experiment can be improved through including the pre-research stage. If the experiment has the form of a test procedure, an improvement of methodology can be achieved using for example the methods of statistical and didactic analysis of tests which are traditionally used in other areas, i.e.…

  8. Fusion And Inference From Multiple And Massive Disparate Distributed Dynamic Data Sets

    DTIC Science & Technology

    2017-07-01

    principled methodology for two-sample graph testing; designed a provably almost-surely perfect vertex clustering algorithm for block model graphs; proved...Semi-Supervised Clustering Methodology...Robust Hypothesis Testing...dimensional Euclidean space allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed for

  9. 77 FR 6971 - Establishment of User Fees for Filovirus Testing of Nonhuman Primate Liver Samples

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-10

    ... (ELISA) or other appropriate methodology. Each specimen will be held for six months. After six months.../CDC's analysis of costs to the Government is based on the current methodology (ELISA) used to test NHP... different methodology or changes in the availability of ELISA reagents will affect the amount of the user...

  10. Instrumentation Methodology for Automobile Crash Testing

    DOT National Transportation Integrated Search

    1974-08-01

    Principal characteristics of existing data acquisition practices and instrumentation methodologies have been reviewed to identify differences which are responsible for difficulties in comparing and interpreting structural crash test data. Recommendat...

  11. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
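The conjoint use of an analytical failure model with parameter uncertainty can be illustrated with a small Monte Carlo sketch. The lognormal parameter distributions and the simple stress-versus-strength failure criterion below are illustrative assumptions standing in for the report's engineering models, not the PFA models themselves.

```python
import math
import random

random.seed(1)

def failure_probability(n=200_000):
    """Propagate parameter uncertainty through a toy analytical failure
    model (failure when applied stress exceeds material strength) and
    estimate the resulting failure probability by Monte Carlo sampling."""
    failures = 0
    for _ in range(n):
        # Uncertain analysis parameters, modeled as lognormal (assumed):
        strength = random.lognormvariate(mu=math.log(100.0), sigma=0.10)
        stress = random.lognormvariate(mu=math.log(70.0), sigma=0.15)
        if stress > strength:
            failures += 1
    return failures / n

print(failure_probability())
```

In the actual PFA methodology the sampled quantity would be a physics-based model of a failure mode (e.g. flaw propagation), and the resulting distribution would subsequently be updated with test and flight experience.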

  12. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  13. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  14. Hypothesis testing on the fractal structure of behavioral sequences: the Bayesian assessment of scaling methodology.

    PubMed

    Moscoso del Prado Martín, Fermín

    2013-12-01

    I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  15. A survey of deans and ADEA activities on dental licensure issues.

    PubMed

    Ranney, Richard R; Haden, N Karl; Weaver, Richard W; Valachovic, Richard W

    2003-10-01

    A written survey on issues in clinical testing for licensure was sent to the deans of all dental schools in the United States. Response rate was 89 percent. Results indicate that administrative leaders of the country's dental schools think that third-party evaluation of graduates is appropriate, but they do not have confidence in current clinical tests for licensure. More than nine out of ten respondents indicated that change was needed in current testing procedures, and 82 percent thought the tests as currently conducted were not valid for decision purposes. Regional differences existed among the responses, with the least dissatisfaction occurring in the West. The highest-rated and most frequently mentioned reason for dissatisfaction with clinical tests was the involvement of patients (human subjects) as currently practiced. Most respondents favored a national level for licensure tests, although the majority also approved of the recently enacted New York law that permits completion of a postgraduate year in an accredited program to substitute for clinical testing. Respondents indicated a belief that a national database on academic measures as compared to outcomes on clinical licensure tests would be useful, with overall grade point average or class rank as the favored academic measure. Informed by the recommendations of its representatives to the AADE-ADEA Innovative Testing and Educational Methodologies Committee and results of the survey of deans, ADEA is pursuing steps to foster change in the clinical licensure process.

  16. A Methodology to Monitor Airborne PM10 Dust Particles Using a Small Unmanned Aerial Vehicle

    PubMed Central

    Alvarado, Miguel; Gonzalez, Felipe; Erskine, Peter; Cliff, David; Heuff, Darlene

    2017-01-01

    Throughout the process of coal extraction from surface mines, gases and particles are emitted in the form of fugitive emissions by activities such as hauling, blasting and transportation. As these emissions are diffuse in nature, estimations based upon emission factors and dispersion/advection equations need to be measured directly from the atmosphere. This paper expands upon previous research undertaken to develop a relative methodology to monitor PM10 dust particles produced by mining activities making use of small unmanned aerial vehicles (UAVs). A module sensor using a laser particle counter (OPC-N2 from Alphasense, Great Notley, Essex, UK) was tested. An aerodynamic flow experiment was undertaken to determine the position and length of a sampling probe of the sensing module. Flight tests were conducted in order to demonstrate that the sensor provided data which could be used to calculate the emission rate of a source. Emission rates are a critical variable for further predictive dispersion estimates. First, data collected by the airborne module was verified using a 5.0 m tower in which a TSI DRX 8533 (reference dust monitoring device, TSI, Shoreview, MN, USA) and a duplicate of the module sensor were installed. Second, concentration values collected by the monitoring module attached to the UAV (airborne module) were compared against the tower measurements, obtaining a percentage error of 1.1%. Finally, emission rates from the source were calculated, with airborne data, obtaining errors as low as 1.2%. These errors are low and indicate that the readings collected with the airborne module are comparable to the TSI DRX and could be used to obtain specific emission factors from fugitive emissions for industrial activities. PMID:28216557

  17. Real time testing of intelligent relays for synchronous distributed generation islanding detection

    NASA Astrophysics Data System (ADS)

    Zhuang, Davy

    As electric power systems continue to grow to meet ever-increasing energy demand, their security, reliability, and sustainability requirements also become more stringent. The deployment of distributed energy resources (DER), including generation and storage, in conventional passive distribution feeders gives rise to integration problems involving protection and unintentional islanding. Islanding of distributed generators must be detected for safety reasons when they are disconnected or isolated from the main feeder, as islanded operation may create hazards to utility and third-party personnel and possibly damage the distribution system infrastructure, including the distributed generators themselves. This thesis compares several key performance indicators of a newly developed intelligent islanding detection relay against islanding detection devices currently used by the industry. The intelligent relay employs multivariable analysis and data mining methods to arrive at decision trees that contain both the protection handles and the settings. A test methodology is developed to assess the performance of these intelligent relays in a real-time simulation environment using a generic model based on a real-life distribution feeder. The methodology demonstrates the applicability and potential advantages of the intelligent relay by running a large number of tests reflecting a multitude of system operating conditions. The testing indicates that the intelligent relay often outperforms the frequency, voltage, and rate-of-change-of-frequency relays currently used for islanding detection, while respecting the islanding detection time constraints imposed by standing distributed generator interconnection guidelines.
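    The conventional rate-of-change-of-frequency (ROCOF) relays that the intelligent relay is benchmarked against amount to a simple threshold test on the frequency derivative. A minimal sketch follows; the threshold, hold count, and sample waveforms are illustrative, not values from the thesis:

```python
def rocof_trip(freq_hz, dt_s, threshold_hz_per_s=1.0, hold_samples=3):
    """Flag islanding when |df/dt| stays above the threshold for several
    consecutive samples (a common guard against nuisance trips)."""
    count = 0
    for f_prev, f_next in zip(freq_hz, freq_hz[1:]):
        if abs(f_next - f_prev) / dt_s > threshold_hz_per_s:
            count += 1
            if count >= hold_samples:
                return True
        else:
            count = 0
    return False

# Grid-connected: frequency essentially flat -> no trip.
# Islanded with a load/generation mismatch: frequency ramps -> trip.
connected = [60.0] * 50
islanded = [60.0 - 0.05 * i for i in range(50)]  # 2.5 Hz/s decay at dt = 20 ms
print(rocof_trip(connected, 0.02), rocof_trip(islanded, 0.02))  # False True
```

The well-known weakness of such relays, which motivates decision-tree approaches, is that a closely matched local load keeps frequency nearly constant after islanding, so this test never fires.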

  18. Parametric studies of stitching effectiveness for preventing substructure disbond

    NASA Technical Reports Server (NTRS)

    Flanagan, Gerry; Furrow, Keith

    1995-01-01

    A methodology is desired that will allow a designer to select appropriate amounts of through-thickness reinforcement needed to meet design requirements. The goal is to use a relatively simple analysis to minimize the amount of testing that needs to be performed, and to make test results from simple configurations applicable to more general structures. Using this methodology, one should be able to optimize the selection of stitching materials, the weight of the yarn, and the stitching density. The analysis approach is to treat substructure disbond as a crack propagation problem. In this approach, the stitches have little influence until a delamination begins to grow. Once the delamination reaches, or extends beyond, a stitch, the stitch serves to reduce the strain-energy-release rate (G) at the crack tip for a given applied load. The reduced G can then be compared to the unstitched material's toughness to predict the load required to further extend the crack. The current model treats the stitch as a simple spring which responds to displacements in the vertical (through-thickness) direction. In concept, this approach is similar to that proposed by other authors. Test results indicate that the model should be refined to include the shearing stiffness of the stitch. The strain-energy-release-rate calculations are performed using a code which uses interconnected higher-order plates to model built-up composite cross-sections. When plates are stacked vertically, the interfacial tractions between the plates can be computed. The plate differential equations are solved in closed form. The code, called SUBLAM, was developed as part of this effort; its solutions are one-dimensional. Because of this limitation, rows of stitches are treated as a two-dimensional sheet. The spring stiffness of a row of stitches can be estimated from the stitch material, weight, and density. As a practical and conservative approach, we can assume that the stitch is bonded until a crack passes the stitch location; after the crack passes, it is fully debonded. A series of tests was performed to exercise this methodology; the specimens incorporated an attached flange such that the sudden change in thickness initiated a delamination. The analysis was used to estimate the material's critical G from that of the unstitched specimens. With this data, a prediction was made for the load required to delaminate the stitched specimens. Using the methodology, design charts have been created for simplified geometries. These charts give stitch force, along with G(sub 1) and G(sub 2), as a function of the stitch spring stiffness. Using the charts, it should be possible to determine the stitch spring stiffness and strength required to reduce G to a desired level. From these parameters, the actual stitching material, weight, and density can be computed.
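    The smearing of a stitch row into an equivalent sheet suggests a simple axial-spring estimate of the foundation stiffness. The sketch below shows one plausible way to compute it; the aramid-like material values are entirely illustrative and not taken from the report:

```python
def stitch_foundation_stiffness(youngs_modulus_pa, yarn_area_m2,
                                stitch_length_m, stitches_per_m2):
    """Smeared through-thickness stiffness of a stitch field.

    Each stitch is idealized as an axial spring k = E*A/L acting in the
    vertical (through-thickness) direction; a field of stitches is then
    smeared into an equivalent elastic foundation of modulus k times the
    areal stitch density (units N/m per m^2, i.e. N/m^3).
    """
    k_single = youngs_modulus_pa * yarn_area_m2 / stitch_length_m  # N/m
    return k_single * stitches_per_m2

# Illustrative values: stiff yarn with E ~ 100 GPa, 0.05 mm^2 cross-section,
# 5 mm stitch length, one stitch per 5 mm x 5 mm cell.
k_sheet = stitch_foundation_stiffness(100e9, 0.05e-6, 5e-3, 1 / 0.005**2)
print(f"{k_sheet:.2e}")
```

With parameters like these in hand, the design charts described above can be entered directly: pick the foundation stiffness that brings G below the critical value, then back out yarn material, weight, and spacing.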

  19. Assessing importance and satisfaction judgments of intermodal work commuters with electronic survey methodology.

    DOT National Transportation Integrated Search

    2013-09-01

    Recent advances in multivariate methodology provide an opportunity to further the assessment of service offerings in public transportation for work commuting. We offer methodologies that are alternative to direct rating scale and have advantages in t...

  20. Screening tests for aphasia in patients with stroke: a systematic review.

    PubMed

    El Hachioui, Hanane; Visch-Brink, Evy G; de Lau, Lonneke M L; van de Sandt-Koenderman, Mieke W M E; Nouwens, Femke; Koudstaal, Peter J; Dippel, Diederik W J

    2017-02-01

    Aphasia has a large impact on the quality of life and adds significantly to the costs of stroke care. Early recognition of aphasia in stroke patients is important for prognostication and well-timed treatment planning. We aimed to identify available screening tests for differentiating between aphasic and non-aphasic stroke patients, and to evaluate test accuracy, reliability, and feasibility. We searched PubMed, EMbase, Web of Science, and PsycINFO for published studies on screening tests aimed at assessing aphasia in stroke patients. The reference lists of the selected articles were scanned, and several experts were contacted to detect additional references. Of each screening test, we estimated the sensitivity, specificity, likelihood ratio of a positive test, likelihood ratio of a negative test, and diagnostic odds ratio (DOR), and rated the degree of bias of the validation method. We included ten studies evaluating eight screening tests. There was a large variation across studies regarding sample size, patient characteristics, and reference tests used for validation. Many papers failed to report on the consecutiveness of patient inclusion, time between aphasia onset and administration of the screening test, and blinding. Of the three studies that were rated as having an intermediate or low risk of bias, the DOR was highest for the Language Screening Test and ScreeLing. Several screening tools for aphasia in stroke are available, but many tests have not been verified properly. Methodologically sound validation studies of aphasia screening tests are needed to determine their usefulness in clinical practice.
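    The accuracy measures estimated for each screening test (sensitivity, specificity, likelihood ratios, DOR) all follow mechanically from a 2x2 table of index-test results against the reference standard. A short sketch, with hypothetical counts for a hypothetical validation sample:

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, likelihood ratios, and diagnostic odds
    ratio from a 2x2 table (screening test vs. reference standard)."""
    sens = tp / (tp + fn)              # aphasic patients correctly flagged
    spec = tn / (tn + fp)              # non-aphasic patients correctly cleared
    lr_pos = sens / (1 - spec)         # likelihood ratio of a positive test
    lr_neg = (1 - sens) / spec         # likelihood ratio of a negative test
    dor = lr_pos / lr_neg              # equivalently (tp * tn) / (fp * fn)
    return sens, spec, lr_pos, lr_neg, dor

# Hypothetical sample: 40 aphasic and 60 non-aphasic stroke patients.
sens, spec, lrp, lrn, dor = diagnostic_stats(tp=36, fp=6, fn=4, tn=54)
print(round(sens, 2), round(spec, 2), round(dor, 1))  # 0.9 0.9 81.0
```

The DOR collapses both error types into one figure of merit, which is why the review uses it to rank tests validated on very differently composed samples.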

  1. Self-administered physical activity questionnaires for the elderly: a systematic review of measurement properties.

    PubMed

    Forsén, Lisa; Loland, Nina Waaler; Vuillemin, Anne; Chinapaw, Mai J M; van Poppel, Mireille N M; Mokkink, Lidwine B; van Mechelen, Willem; Terwee, Caroline B

    2010-07-01

    To systematically review and appraise studies examining self-administered physical activity questionnaires (PAQ) for the elderly. This article is one of a group of four articles in Sports Medicine on the content and measurement properties of PAQs. LITERATURE SEARCH METHODOLOGY: Searches in PubMed, EMBASE and SportDiscus (until May 2009) on self-administered PAQ. Inclusion criteria were as follows: (i) the study examined (at least one of) the measurement properties of a self-administered PAQ; (ii) the questionnaire aimed to measure physical activity (PA) in older people; (iii) the average age of the study population was >55 years; (iv) the article was written in English. We excluded PA interviews, diaries and studies that evaluated the measurement properties of a self-administered PAQ in a specific population, such as patients. We used a standard checklist (qualitative attributes and measurement properties of PA questionnaires [QAPAQ]) for appraising the measurement properties of PAQs. Eighteen articles on 13 PAQs were reviewed, including 16 reliability analyses and 25 validity analyses (of which 15 were on construct validity, seven on health/functioning associations, two on known-groups validity and one on responsiveness). Many studies suffered from methodological flaws, e.g. too small sample size or inadequate time interval between test and retest. Three PAQs received a positive rating on reliability: IPAQ-C (International Physical Activity Questionnaire-Chinese), intraclass correlation coefficient (ICC) > or = 0.81; WHI-PAQ (Women's Health Initiative-PAQ), ICC = 0.76; and PASE (Physical Activity Scale for the Elderly), Pearson correlation coefficient (r) = 0.84. However, PASE was negatively rated on reliability in another study (ICC = 0.65). 
One PAQ received a positive rating on construct validity: PASE against Mini-Logger (r > 0.52), but PASE was negatively rated in another study against accelerometer and another PAQ, Spearman correlation coefficient = 0.17 and 0.48, respectively. Three of the 13 PAQs were tested for health/functioning associations and all three were positively rated in some categories of PA in many studies (r > 0.30). Even though several studies showed an association between the tested PAQ and health/functioning variables, the knowledge about reliability and construct validity of self-administered PAQs for older adults is still scarce and more high-quality validation studies are needed.

  2. Biofidelity assessment of the 6-year-old ATDs in lateral impact.

    PubMed

    Yaek, J L; Li, Y; Lemanski, P J; Begeman, P C; Rouhana, S W; Cavanaugh, J M

    2016-07-03

    The objective of this study was to assess and compare the current lateral impact biofidelity of the shoulder, thorax, abdomen, and pelvis of the Q6, Q6s, and Hybrid III (HIII) 6-year-old anthropomorphic test devices (ATDs) through lateral impact testing. A series of lateral impact pendulum tests, vertical drop tests, and Wayne State University (WSU) sled tests was performed, based on the procedures detailed in ISO/TR 9790 (1999) and scaling to the 6-year-old using Irwin et al. (2002). The HIII used in this study was tested with the Ford-designed abdomen described in Rouhana (2006) and Elhagediab et al. (2006). The data collected from the 3 different ATDs were filtered using SAE J211 (SAE International 2003), aligned using the methodology described by Donnelly and Moorhouse (2012), and compared for each body region tested (shoulder, thorax, abdomen, and pelvis). The biofidelity performance in lateral impact for the 3 ATDs was assessed against the scaled biofidelity targets published in Irwin et al. (2002), the abdominal biofidelity target suggested in van Ratingen et al. (1997), and the biofidelity targets published in Rhule et al. (2013). Regional and overall biofidelity rankings for each of the 3 ATDs were performed using both the ISO 9790 biofidelity rating system (ISO/TR 9790 1999) and the NHTSA's external biofidelity ranking system (BRS; Rhule et al. 2013). The pelvises of all 3 6-year-old ATDs were rated as the least biofidelic of the 4 body regions tested, based on both the ISO and BRS biofidelity rating systems, followed by the shoulder and abdomen, respectively. The thorax of all 3 ATDs was rated as the most biofidelic body region using the aforementioned biofidelity rating systems. The HIII 6-year-old ATD was rated last in overall biofidelity of the 3 tested ATDs, based on both rating systems. The Q6s ATD was rated as having the best overall biofidelity using both rating systems. All 3 ATDs are more biofidelic in the thorax and abdomen than in the shoulder and pelvis, with the pelvis being the least biofidelic of all 4 tested body regions. None of the 3 tested 6-year-old ATDs had an overall ranking of 2.0 or less, based on the BRS ranking. Therefore, it is expected that none of the 3 ATDs would mechanically respond like a postmortem human subject (PMHS) in a lateral impact crash test based on this ranking system. With respect to the ISO biofidelity rating, the HIII dummy would be considered unsuitable and the Q-series dummies would be considered marginal for assessing side impact occupant protection.

  3. Performance Modeling of an Experimental Laser Propelled Lightcraft

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Chen, Yen-Sen; Liu, Jiwen; Myrabo, Leik N.; Mead, Franklin B., Jr.

    2000-01-01

    A computational plasma aerodynamics model is developed to study the performance of an experimental laser propelled lightcraft. The computational methodology is based on a time-accurate, three-dimensional, finite-difference, chemically reacting, unstructured grid, pressure-based formulation. The underlying physics are added and tested systematically using a building-block approach. The physics modeled include non-equilibrium thermodynamics, non-equilibrium air-plasma finite-rate kinetics, specular ray tracing, laser beam energy absorption and refraction by plasma, non-equilibrium plasma radiation, and plasma resonance. A series of transient computations are performed at several laser pulse energy levels and the simulated physics are discussed and compared with those of tests and literature. The predicted coupling coefficients for the lightcraft compared reasonably well with those of tests conducted on a pendulum apparatus.
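    The coupling coefficient measured on a pendulum apparatus is simply the impulse imparted to the craft per unit of laser pulse energy. A minimal sketch, using hypothetical numbers rather than test values from the paper:

```python
def coupling_coefficient(mass_kg, delta_v_m_per_s, pulse_energy_j):
    """Momentum coupling coefficient C_m = impulse / pulse energy (N*s/J).

    On a pendulum, the imparted impulse is m * delta_v, with delta_v
    inferred from the swing amplitude of the suspended lightcraft.
    """
    return mass_kg * delta_v_m_per_s / pulse_energy_j

# Hypothetical shot: a 50 g model kicked to 0.2 m/s by a 400 J pulse.
c_m = coupling_coefficient(0.05, 0.2, 400.0)
print(f"{c_m:.2e} N*s/J")
```

Comparing this measured C_m against the impulse integrated from the simulated pressure field is the natural validation step for a model like the one described.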

  4. On the characterization of the heterogeneous mechanical response of human brain tissue.

    PubMed

    Forte, Antonio E; Gentleman, Stephen M; Dini, Daniele

    2017-06-01

    The mechanical characterization of brain tissue is a complex task that scientists have tried to accomplish for over 50 years. The results in the literature often differ by orders of magnitude because of the lack of a standard testing protocol. Different testing conditions (including humidity, temperature, strain rate), the methodology adopted, and the variety of the species analysed are all potential sources of discrepancies in the measurements. In this work, we present a rigorous experimental investigation on the mechanical properties of human brain, covering both grey and white matter. The influence of testing conditions is also shown and thoroughly discussed. The material characterization performed is finally adopted to provide inputs to a mathematical formulation suitable for numerical simulations of brain deformation during surgical procedures.

  5. Machine cost analysis using the traditional machine-rate method and ChargeOut!

    Treesearch

    E. M. (Ted) Bilek

    2009-01-01

    Forestry operations require ever more use of expensive capital equipment. Mechanization is frequently necessary to perform cost-effective and safe operations. Increased capital should mean more sophisticated capital costing methodologies. However, the machine rate method, which is the costing methodology most frequently used, dates back to 1942. ChargeOut!, a recently...

  6. Time-discounting and tobacco smoking: a systematic review and network analysis.

    PubMed

    Barlow, Pepita; McKee, Martin; Reeves, Aaron; Galea, Gauden; Stuckler, David

    2017-06-01

    Tobacco smoking harms health, so why do people smoke and fail to quit? An explanation originating in behavioural economics suggests a role for time-discounting, which describes how the value of a reward, such as better health, decreases with delay to its receipt. A large number of studies test the relationship of time-discounting with tobacco outcomes but the temporal pattern of this relationship and its variation according to measurement methods remain unclear. We review the association between time-discounting and smoking across (i) the life course, from initiation to cessation, and (ii) diverse discount measures. We identified 69 relevant studies in Web of Science and PubMed. We synthesized findings across methodologies and evaluated discount measures, study quality and cross-disciplinary fertilization. In 44 out of 54 studies, smokers more greatly discounted the future than non-smokers and, in longitudinal studies, higher discounting predicted future smoking. Smokers with lower time-discount rates achieved higher quit rates. Findings were consistent across studies measuring discount rates using hypothetical monetary or cigarette reward scenarios. The methodological quality of the majority of studies was rated as 'moderate' and co-citation analysis revealed an isolation of economics journals and a dearth of studies in public health. There is moderate yet consistent evidence that high time-discounting is a risk factor for smoking and unsuccessful cessation. Policy scenarios assuming a flat rate of population discounting may inadequately capture smokers' perceptions of costs and benefits. © The Author 2016; Published by Oxford University Press on behalf of the International Epidemiological Association
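    Time-discounting in these studies is quantified with a discount function; the hyperbolic form V = A / (1 + kD) is common in the behavioural-economics literature the review draws on, though individual studies vary. The k values below are purely illustrative:

```python
def hyperbolic_value(amount, delay_days, k):
    """Present value of a delayed reward under hyperbolic discounting,
    V = A / (1 + k * D): larger k means steeper devaluation with delay."""
    return amount / (1.0 + k * delay_days)

# Two hypothetical individuals valuing a reward of 100 units delayed 30 days:
steep = hyperbolic_value(100.0, 30, k=0.10)    # high discounter ("smoker-like")
shallow = hyperbolic_value(100.0, 30, k=0.01)  # low discounter
print(steep, round(shallow, 1))  # 25.0 76.9
```

In the reviewed designs, k is typically estimated by finding the indifference point between an immediate and a delayed reward (monetary or cigarettes), and it is this estimated k that is compared between smokers and non-smokers.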

  7. Standardization of Rocket Engine Pulse Time Parameters

    NASA Technical Reports Server (NTRS)

    Larin, Max E.; Lumpkin, Forrest E.; Rauer, Scott J.

    2001-01-01

    Plumes of bipropellant thrusters are a source of contamination. Small bipropellant thrusters are often used for spacecraft attitude control and orbit correction. Such thrusters typically operate in a pulse mode, at various pulse lengths. Quantifying their contamination effects onto spacecraft external surfaces is especially important for long-term complex-geometry vehicles, e.g. the International Space Station. Plume contamination tests indicated the presence of liquid phase contaminant in the form of droplets. Their origin is attributed to incomplete combustion. Most of the liquid-phase contaminant is generated during the startup and shutdown (unsteady) periods of the thruster pulse. These periods are relatively short (typically 10-50 ms), and the amount of contaminant is determined by the thruster design (propellant valve response, combustion chamber size, thruster mass flow rate, film cooling percentage, dribble volume, etc.) and combustion process organization. The steady-state period of the pulse is characterized by much lower contamination rates, but may be lengthy enough to significantly contribute to the overall contamination effect. Because there was no standard methodology for thruster pulse time division, plume contamination tests were conducted at various pulse durations, and their results do not allow quantifying contaminant amounts from each portion of the pulse. At present, the ISS plume contamination model uses an assumption that all thrusters operate in a pulse mode with the pulse length being 100 ms. This assumption may lead to a large difference between the actual amounts of contaminant produced by the thruster and the model predictions. This paper suggests a way to standardize thruster startup and shutdown period definitions, and shows the usefulness of this approach to better quantify thruster plume contamination. 
Use of the suggested thruster pulse time-division technique will ensure methodological consistency of future thruster plume contamination test programs, and allow accounting for thruster pulse length when modeling plume contamination and erosion effects.
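    A standardized time division lends itself to a simple per-pulse contaminant budget: fixed deposits for the unsteady startup and shutdown periods, plus a rate term for whatever steady-state time remains. The function below is a sketch of that accounting only; all masses, rates, and period lengths are hypothetical:

```python
def pulse_contaminant_mass(pulse_s, m_startup_g, m_shutdown_g, steady_rate_g_s,
                           startup_s=0.02, shutdown_s=0.02):
    """Contaminant mass per pulse, with the pulse divided into startup,
    steady-state, and shutdown periods (illustrative parameterization)."""
    steady_time = max(0.0, pulse_s - startup_s - shutdown_s)
    return m_startup_g + m_shutdown_g + steady_rate_g_s * steady_time

# Hypothetical 100 ms pulse: 1 mg deposited per unsteady period,
# 5 mg/s during the remaining steady-state time.
m_pulse = pulse_contaminant_mass(0.1, 1e-3, 1e-3, 5e-3)
print(f"{m_pulse * 1e3:.2f} mg")
```

This makes the point of the abstract concrete: for short pulses the fixed startup/shutdown deposits dominate, so assuming a single 100 ms pulse length for all thrusters can badly misestimate the total.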

  8. An automated laboratory-scale methodology for the generation of sheared mammalian cell culture samples.

    PubMed

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-05-01

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing-scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot-scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10⁵ W/kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10⁵ W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of culture hold time, culture temperature and EDRs on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
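    A mean EDR for a capillary device can be estimated from an energy balance: if the pressure-drop work all dissipates within the capillary volume, then EDR = ΔP·Q / (ρ·V). The sketch below uses that generic balance with a hypothetical operating point; the geometry and pressure values are illustrative, not the paper's:

```python
import math

def capillary_edr(pressure_drop_pa, flow_m3_s, diameter_m, length_m,
                  density_kg_m3=1000.0):
    """Mean energy dissipation rate (W/kg) in a capillary.

    Energy balance: all pressure-drop work dissipates in the capillary
    volume V, so EDR = dP * Q / (rho * V).
    """
    volume = math.pi * diameter_m**2 / 4.0 * length_m
    return pressure_drop_pa * flow_m3_s / (density_kg_m3 * volume)

# Hypothetical operating point: 0.5 mm ID, 100 mm long capillary,
# 8.5 mL/s flow, 5 bar measured pressure drop.
edr = capillary_edr(5e5, 8.5e-6, 0.5e-3, 0.1)
print(f"{edr:.2e} W/kg")
```

Note the quadratic leverage: halving the internal diameter at fixed flow raises both ΔP and the per-mass dissipation sharply, which is why diameter and flow rate alone span the required EDR range.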

  9. 12 CFR 325.205 - Methodologies and practices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... procedures, that are designed to ensure that its stress test processes satisfy the requirements in this... POLICY CAPITAL MAINTENANCE Annual Stress Test § 325.205 Methodologies and practices. (a) Potential impact on capital. In conducting a stress test under this subpart, during each quarter of the planning...

  10. 12 CFR 325.205 - Methodologies and practices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... procedures, that are designed to ensure that its stress test processes satisfy the requirements in this... POLICY CAPITAL MAINTENANCE Annual Stress Test § 325.205 Methodologies and practices. (a) Potential impact on capital. In conducting a stress test under this subpart, during each quarter of the planning...

  11. Comparison of Meropenem MICs and Susceptibilities for Carbapenemase-Producing Klebsiella pneumoniae Isolates by Various Testing Methods▿

    PubMed Central

    Bulik, Catharine C.; Fauntleroy, Kathy A.; Jenkins, Stephen G.; Abuali, Mayssa; LaBombardi, Vincent J.; Nicolau, David P.; Kuti, Joseph L.

    2010-01-01

    We describe the levels of agreement between broth microdilution, Etest, Vitek 2, Sensititre, and MicroScan methods to accurately define the meropenem MIC and categorical interpretation of susceptibility against carbapenemase-producing Klebsiella pneumoniae (KPC). A total of 46 clinical K. pneumoniae isolates with KPC genotypes, all modified Hodge test and blaKPC positive, collected from two hospitals in NY were included. Results obtained by each method were compared with those from broth microdilution (the reference method), and agreement was assessed based on MICs and Clinical Laboratory Standards Institute (CLSI) interpretative criteria using 2010 susceptibility breakpoints. Based on broth microdilution, 0%, 2.2%, and 97.8% of the KPC isolates were classified as susceptible, intermediate, and resistant to meropenem, respectively. Results from MicroScan demonstrated the most agreement with those from broth microdilution, with 95.6% agreement based on the MIC and 2.2% classified as minor errors, and no major or very major errors. Etest demonstrated 82.6% agreement with broth microdilution MICs, a very major error rate of 2.2%, and a minor error rate of 2.2%. Vitek 2 MIC agreement was 30.4%, with a 23.9% very major error rate and a 39.1% minor error rate. Sensititre demonstrated MIC agreement for 26.1% of isolates, with a 3% very major error rate and a 26.1% minor error rate. Application of FDA breakpoints had little effect on minor error rates but increased very major error rates to 58.7% for Vitek 2 and Sensititre. Meropenem MIC results and categorical interpretations for carbapenemase-producing K. pneumoniae differ by methodology. Confirmation of testing results is encouraged when an accurate MIC is required for antibiotic dosing optimization. PMID:20484603
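    Essential (MIC-level) agreement and the categorical error classes used here can be computed mechanically from paired MICs. In the sketch below the breakpoints are illustrative placeholders, not the specific CLSI 2010 or FDA values applied in the study:

```python
import math

def category(mic, s_max=4, r_min=16):
    """S/I/R call from a MIC in ug/mL (breakpoints illustrative)."""
    return "S" if mic <= s_max else ("R" if mic >= r_min else "I")

def compare_mics(reference_mic, test_mic):
    """Essential agreement (within one doubling dilution of the reference
    method) and the categorical error class, if any."""
    essential = abs(math.log2(test_mic / reference_mic)) <= 1
    ref, tst = category(reference_mic), category(test_mic)
    if ref == tst:
        error = None
    elif ref == "R" and tst == "S":
        error = "very major"   # resistant isolate reported susceptible
    elif ref == "S" and tst == "R":
        error = "major"        # susceptible isolate reported resistant
    else:
        error = "minor"        # disagreement involving an intermediate call
    return essential, error

print(compare_mics(32, 32))  # (True, None)
print(compare_mics(32, 2))   # (False, 'very major')
print(compare_mics(8, 16))   # (True, 'minor')
```

The middle case illustrates why very major errors are the headline statistic: the isolate is resistant by the reference method but would be reported treatable.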

  12. Tracking variable sedimentation rates in orbitally forced paleoclimate proxy series

    NASA Astrophysics Data System (ADS)

    Li, M.; Kump, L. R.; Hinnov, L.

    2017-12-01

    This study addresses two fundamental issues in cyclostratigraphy: quantitative testing of orbital forcing in cyclic sedimentary sequences and tracking variable sedimentation rates. The methodology proposed here addresses these issues as an inverse problem, and estimates the product-moment correlation coefficient between the frequency spectra of orbital solutions and paleoclimate proxy series over a range of "test" sedimentation rates. It is inspired by the ASM method (1). The number of orbital parameters involved in the estimation is also considered. The method relies on the hypothesis that orbital forcing had a significant impact on the paleoclimate proxy variations; this hypothesis is itself tested. The null hypothesis of no astronomical forcing is evaluated using the Beta distribution, for which the shape parameters are estimated using a Monte Carlo simulation approach. We introduce a metric to estimate the most likely sedimentation rate using the product-moment correlation coefficient, H0 significance level, and the number of contributing orbital parameters, i.e., the CHO value. The CHO metric is applied with a sliding window to track variable sedimentation rates along the paleoclimate proxy series. Two forward models with uniform and variable sedimentation rates are evaluated to demonstrate the robustness of the method. The CHO method is applied to the classical Late Triassic Newark depth rank series; the estimated sedimentation rates match closely with previously published sedimentation rates and provide a more highly time-resolved estimate (2,3). References: (1) Meyers, S.R., Sageman, B.B., Amer. J. Sci., 307, 773-792, 2007; (2) Kent, D.V., Olsen, P.E., Muttoni, G., Earth-Sci. Rev. 166, 153-180, 2017; (3) Li, M., Zhang, Y., Huang, C., Ogg, J., Hinnov, L., Wang, Y., Zou, Z., Li, L., 2017. Earth Planet. Sci. Lett. doi:10.1016/j.epsl.2017.07.015

  13. Testing Methodology in the Student Learning Process

    ERIC Educational Resources Information Center

    Gorbunova, Tatiana N.

    2017-01-01

    The subject of the research is the construction of methodologies to evaluate student knowledge by testing. The author points to the importance of feedback about the level of mastery in the learning process. Testing is considered as a tool. The object of the study is to create test system models for defence practice problems. Special attention is paid…

  14. Tidal disruptions by rotating black holes: relativistic hydrodynamics with Newtonian codes

    NASA Astrophysics Data System (ADS)

    Tejeda, Emilio; Gafton, Emanuel; Rosswog, Stephan; Miller, John C.

    2017-08-01

    We propose an approximate approach for studying the relativistic regime of stellar tidal disruptions by rotating massive black holes. It combines an exact relativistic description of the hydrodynamical evolution of a test fluid in a fixed curved space-time with a Newtonian treatment of the fluid's self-gravity. Explicit expressions for the equations of motion are derived for Kerr space-time using two different coordinate systems. We implement the new methodology within an existing Newtonian smoothed particle hydrodynamics code and show that including the additional physics involves very little extra computational cost. We carefully explore the validity of the novel approach by first testing its ability to recover geodesic motion, and then by comparing the outcome of tidal disruption simulations against previous relativistic studies. We further compare simulations in Boyer-Lindquist and Kerr-Schild coordinates and conclude that our approach allows accurate simulation even of tidal disruption events where the star penetrates deeply inside the tidal radius of a rotating black hole. Finally, we use the new method to study the effect of the black hole spin on the morphology and fallback rate of the debris streams resulting from tidal disruptions, finding that while the spin has little effect on the fallback rate, it does imprint heavily on the stream morphology, and can even be a determining factor in the survival or disruption of the star itself. Our methodology is discussed in detail as a reference for future astrophysical applications.

  15. On the Determination of Magnesium Degradation Rates under Physiological Conditions

    PubMed Central

    Nidadavolu, Eshwara Phani Shubhakar; Feyerabend, Frank; Ebel, Thomas; Willumeit-Römer, Regine; Dahms, Michael

    2016-01-01

    The current physiological in vitro tests of Mg degradation follow the procedure stated in the ASTM standard. This standard, although useful in predicting the initial degradation behavior of an alloy, has limitations when interpreting that behavior over longer periods of immersion in cell culture media. This is an important limitation, as an alloy’s degradation is time dependent. Even if two different alloys show similar corrosion rates in a short-term experiment, their degradation characteristics might differ with increased immersion times. Furthermore, studies concerning Mg corrosion extrapolate the corrosion rate from a single time point measurement to the order of a year (mm/y), which might not be appropriate because of the time dependent degradation behavior. In this work, the above issues are addressed and a new methodology of performing long-term immersion tests to determine the degradation rates of Mg alloys is put forth. For this purpose, cast and extruded Mg-2Ag and powder pressed and sintered Mg-0.3Ca alloy systems were chosen. DMEM Glutamax +10% FBS (Fetal Bovine Serum) +1% Penicillin streptomycin was used as cell culture medium. The advantages of such a method in predicting the degradation rates in vivo deduced from in vitro experiments are discussed. PMID:28773749
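    The single-time-point extrapolation the record cautions against uses the standard mass-loss conversion to an annualised rate. The formula is the usual one from immersion-corrosion practice; the specimen values in the sketch are invented for illustration:

```python
# Converting one mass-loss measurement to an annualised corrosion rate (mm/y).
# W in grams, A in cm^2, t in hours, rho in g/cm^3; 8.76e4 is the standard
# unit-conversion constant for mm/y.

def corrosion_rate_mm_per_year(mass_loss_g, area_cm2, hours, density_g_cm3):
    """Annualised corrosion rate: 8.76e4 * W / (A * t * rho)."""
    return 8.76e4 * mass_loss_g / (area_cm2 * hours * density_g_cm3)

# Hypothetical Mg specimen (density ~1.74 g/cm^3): 5 mg lost from 2 cm^2 in 72 h.
rate = corrosion_rate_mm_per_year(0.005, 2.0, 72.0, 1.74)
# Two alloys with equal 72 h rates can still diverge at longer immersion times,
# which is why the record argues for long-term immersion series rather than a
# single-point extrapolation.
```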

  16. Mock jurors' use of error rates in DNA database trawls.

    PubMed

    Scurich, Nicholas; John, Richard S

    2013-12-01

    Forensic science is not infallible, as data collected by the Innocence Project have revealed. The rate at which errors occur in forensic DNA testing-the so-called "gold standard" of forensic science-is not currently known. This article presents a Bayesian analysis to demonstrate the profound impact that error rates have on the probative value of a DNA match. Empirical evidence on whether jurors are sensitive to this effect is equivocal: Studies have typically found they are not, while a recent, methodologically rigorous study found that they can be. This article presents the results of an experiment that examined this issue within the context of a database trawl case in which one DNA profile was tested against a multitude of profiles. The description of the database was manipulated (i.e., "medical" or "offender" database, or not specified) as was the rate of error (i.e., one-in-10 or one-in-1,000). Jury-eligible participants were nearly twice as likely to convict in the offender database condition as in the condition in which the database was not specified. The error rates did not affect verdicts. Both factors, however, affected the perception of the defendant's guilt in the expected direction, although the size of the effect was meager compared to Bayesian prescriptions. The results suggest that the disclosure of an offender database to jurors might constitute prejudicial evidence, and calls for proficiency testing in forensic science as well as training of jurors are echoed. (c) 2013 APA, all rights reserved
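    The Bayesian point can be reproduced in a few lines: once the false-positive error rate dwarfs the random match probability (RMP), the error rate, not the RMP, caps the probative value of a reported match. The likelihood-ratio form below is a common simplification, with an assumed one-in-a-billion RMP, not necessarily the article's exact model:

```python
# Effect of a false-positive error rate on the posterior probability that the
# defendant is the source, given a reported DNA match.

def posterior_prob(prior, rmp, error_rate):
    """P(source | reported match), treating a false positive (rate `error_rate`)
    and a coincidental match (probability `rmp`) as the innocent explanations."""
    lr = 1.0 / (rmp + error_rate)          # likelihood ratio of a reported match
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * lr
    return post_odds / (1.0 + post_odds)

rmp = 1e-9                                      # assumed random match probability
p_no_error = posterior_prob(0.1, rmp, 0.0)      # error-free ideal
p_1_in_1000 = posterior_prob(0.1, rmp, 1e-3)    # posterior capped by error rate
p_1_in_10 = posterior_prob(0.1, rmp, 0.1)       # error rate dominates entirely
```

    With a one-in-a-billion RMP, moving the error rate from one-in-1,000 to one-in-10 collapses the posterior from near certainty to roughly a coin flip, which is the "profound impact" the record describes.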

  17. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is presented, revealing a power relationship. General models describing the soft error rates across scaled product generations are presented. The analysis methodology may be applied to other scaled microelectronic products and their key parameters.

  18. An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits

    NASA Astrophysics Data System (ADS)

    Corliss, Walter F., II

    1989-03-01

    The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed, was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full custom VLSI chip, designed at NPS, to be tested with the NPS digital analysis system, the Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.

  19. Assessing the Medication Adherence App Marketplace From the Health Professional and Consumer Vantage Points.

    PubMed

    Dayer, Lindsey E; Shilling, Rebecca; Van Valkenburg, Madalyn; Martin, Bradley C; Gubbins, Paul O; Hadden, Kristie; Heldenbrand, Seth

    2017-04-19

    Nonadherence produces considerable health consequences and economic burden to patients and payers. One approach to improving medication nonadherence that has gained interest in recent years is the use of smartphone adherence apps. The development of smartphone adherence apps has increased rapidly since 2012; however, literature evaluating the clinical use and effectiveness of smartphone adherence apps to improve medication adherence is generally lacking. The aims of this study were to (1) provide an updated evaluation and comparison of medication adherence apps in the marketplace by assessing the features, functionality, and health literacy (HL) of the highest-ranking adherence apps and (2) indirectly measure the validity of our rating methodology by determining the relationship between our app evaluations and Web-based consumer ratings. Two independent reviewers assessed the features and functionality of all adherence apps identified based on developer claims, using a 4-domain rating tool. The same reviewers downloaded and tested the 100 highest-ranking apps, with an additional domain for assessment of HL. Pearson product correlations were estimated between the consumer ratings and our domain and total scores. A total of 824 adherence apps were identified; of these, 645 unique apps were evaluated after applying exclusion criteria. The median initial score based on descriptions was 14 (max of 68; range 0-60). As a result, the 100 highest-scoring unique apps underwent user testing. The median overall user-tested score was 31.5 (max of 73; range 0-60). The majority of the adherence apps that underwent user testing had a consumer rating score in their respective online marketplace. The mean consumer rating was 3.93 (SD 0.84). The total user-tested score was positively correlated with consumer ratings (r=.1969, P=.04). More adherence apps are available in the Web-based marketplace, and the quality of these apps varies considerably. Consumer ratings are positively but weakly correlated with user-testing scores, suggesting that our rating tool has some validity but that consumers and clinicians may assess adherence app quality differently. ©Lindsey E Dayer, Rebecca Shilling, Madalyn Van Valkenburg, Bradley C Martin, Paul O Gubbins, Kristie Hadden, Seth Heldenbrand. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 19.04.2017.
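    The validity check in this record is a Pearson product-moment correlation between rating-tool totals and marketplace consumer ratings. The paired scores below are invented for the sketch; the study itself reported r=.1969 over its 100 user-tested apps:

```python
# Pearson product-moment correlation from first principles, applied to
# hypothetical (tool score, consumer rating) pairs.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

tool_scores = [31, 45, 22, 50, 38, 27]       # hypothetical user-tested totals
consumer = [3.9, 4.2, 3.5, 4.6, 3.8, 4.0]    # hypothetical star ratings
r = pearson_r(tool_scores, consumer)
```

    A correlation of .1969 is positive but weak, which is why the record concludes only that the tool "has some validity" and that consumers and clinicians may judge app quality differently.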

  20. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    PubMed

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  1. Analysis of bubbles and crashes in the TRY/USD, TRY/EUR, TRY/JPY and TRY/CHF exchange rate within the scope of econophysics

    NASA Astrophysics Data System (ADS)

    Deviren, Bayram; Kocakaplan, Yusuf; Keskin, Mustafa; Balcılar, Mehmet; Özdemir, Zeynel Abidin; Ersoy, Ersan

    2014-09-01

    In this study, we analyze the Turkish Lira/US Dollar (TRY/USD), Turkish Lira/Euro (TRY/EUR), Turkish Lira/Japanese Yen (TRY/JPY) and Turkish Lira/Swiss Franc (TRY/CHF) exchange rates in the global financial crisis period to detect the bubbles and crashes in the TRY by using a mathematical methodology developed by Watanabe et al. (2007). The methodology defines the bubbles and crashes in financial market price fluctuations by considering an exponential fitting of the associated data. This methodology is applied to detect the bubbles and crashes in the TRY/USD, TRY/EUR, TRY/JPY and TRY/CHF exchange rates from January 1, 2005 to December 20, 2013. In this mathematical methodology, the whole period of bubbles and crashes can be determined purely from past data, and the start of a bubble or crash can be identified even before it bursts. In this way, the periods of bubbles and crashes in the TRY/USD, TRY/EUR, TRY/JPY and TRY/CHF are determined, and the beginning and end points of these periods are detected. The results show that the crashes in the TRY/CHF exchange rate commonly finish earlier than in the other exchange rates; hence it is probable that crashes in the other exchange rates will finish soon after the crash in the TRY/CHF exchange rate ends. We also find that the periods of crashes in the TRY/EUR and TRY/USD exchange rates last longer than those in the other exchange rates. This information can be used in risk management and/or for speculative gain.

  2. Ablation, Thermal Response, and Chemistry Program for Analysis of Thermal Protection Systems

    NASA Technical Reports Server (NTRS)

    Milos, Frank S.; Chen, Yih-Kanq

    2010-01-01

    In previous work, the authors documented the Multicomponent Ablation Thermochemistry (MAT) and Fully Implicit Ablation and Thermal response (FIAT) programs. In this work, key features from MAT and FIAT were combined to create the new Fully Implicit Ablation, Thermal response, and Chemistry (FIATC) program. FIATC is fully compatible with FIAT (version 2.5) but has expanded capabilities to compute the multispecies surface chemistry and ablation rate as part of the surface energy balance. This new methodology eliminates B' tables, provides blown species fractions as a function of time, and enables calculations that would otherwise be impractical (e.g. 4+ dimensional tables) such as pyrolysis and ablation with kinetic rates or unequal diffusion coefficients. Equations and solution procedures are presented, then representative calculations of equilibrium and finite-rate ablation in flight and ground-test environments are discussed.

  3. Uncertainty Evaluation of Computational Model Used to Support the Integrated Powerhead Demonstration Project

    NASA Technical Reports Server (NTRS)

    Steele, W. G.; Molder, K. J.; Hudson, S. T.; Vadasy, K. V.; Rieder, P. T.; Giel, T.

    2005-01-01

    NASA and the U.S. Air Force are working on a joint project to develop a new hydrogen-fueled, full-flow, staged combustion rocket engine. The initial testing and modeling work for the Integrated Powerhead Demonstrator (IPD) project is being performed by NASA Marshall and Stennis Space Centers. A key factor in the testing of this engine is the ability to predict and measure the transient fluid flow during engine start and shutdown phases of operation. A model built by NASA Marshall in the ROCket Engine Transient Simulation (ROCETS) program is used to predict transient engine fluid flows. The model is initially calibrated to data from previous tests on the Stennis E1 test stand. The model is then used to predict the next run. Data from this run can then be used to recalibrate the model, providing a tool to guide the test program in incremental steps to reduce the risk to the prototype engine. In this paper, such a model is defined as a calibrated model. This paper proposes a method to estimate the uncertainty of a model calibrated to a set of experimental test data. The method is similar to that used in the calibration of experiment instrumentation. For the IPD example used in this paper, the model uncertainty is determined for both LOX and LH flow rates using previous data. The model is then successfully used to predict another, similar test run within the uncertainty bounds. The paper summarizes the uncertainty methodology when a model is continually recalibrated with new test data. The methodology is general and can be applied to other calibrated models.

  4. Are validated outcome measures used in distal radial fractures truly valid?

    PubMed Central

    Nienhuis, R. W.; Bhandari, M.; Goslings, J. C.; Poolman, R. W.; Scholtes, V. A. B.

    2016-01-01

    Objectives Patient-reported outcome measures (PROMs) are often used to evaluate the outcome of treatment in patients with distal radial fractures. Which PROM to select is often based on assessment of measurement properties, such as validity and reliability. Measurement properties are assessed in clinimetric studies, and results are often reviewed without considering the methodological quality of these studies. Our aim was to systematically review the methodological quality of clinimetric studies that evaluated measurement properties of PROMs used in patients with distal radial fractures, and to make recommendations for the selection of PROMs based on the level of evidence of each individual measurement property. Methods A systematic literature search was performed in PubMed, EMbase, CINAHL and PsycINFO databases to identify relevant clinimetric studies. Two reviewers independently assessed the methodological quality of the studies on measurement properties, using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. Level of evidence (strong / moderate / limited / lacking) for each measurement property per PROM was determined by combining the methodological quality and the results of the different clinimetric studies. Results In all, 19 out of 1508 identified unique studies were included, in which 12 PROMs were rated. The Patient-Rated Wrist Evaluation (PRWE) and the Disabilities of the Arm, Shoulder and Hand (DASH) questionnaire were evaluated on the most measurement properties. The evidence for the PRWE is moderate that its reliability, validity (content and hypothesis testing), and responsiveness are good. The evidence is limited that its internal consistency and cross-cultural validity are good, and its measurement error is acceptable. There is no evidence for its structural and criterion validity. The evidence for the DASH is moderate that its responsiveness is good. The evidence is limited that its reliability and its validity on hypothesis testing are good. There is no evidence for the other measurement properties. Conclusion According to this systematic review, there is, at best, moderate evidence that the responsiveness of the PRWE and DASH are good, as are the reliability and validity of the PRWE. We recommend these PROMs in clinical studies in patients with distal radial fractures; however, more clinimetric studies of higher methodological quality are needed to adequately determine the other measurement properties. Cite this article: Dr Y. V. Kleinlugtenbelt. Are validated outcome measures used in distal radial fractures truly valid?: A critical assessment using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. Bone Joint Res 2016;5:153–161. DOI: 10.1302/2046-3758.54.2000462. PMID:27132246

  5. 78 FR 47217 - Proposed Supervisory Guidance on Implementing Dodd-Frank Act Company-Run Stress Tests for Banking...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-05

    ..., that are designed to ensure that its stress testing processes are effective in meeting the requirements... specific methodological practices. Consistent with this approach, this guidance sets general supervisory... use any specific methodological practices for their stress tests. Companies may use various practices...

  6. Assessing Personality and Mood With Adjective Check List Methodology: A Review

    ERIC Educational Resources Information Center

    Craig, Robert J.

    2005-01-01

    This article addresses the benefits and problems in using adjective check list methodology to assess personality. Recent developments in this assessment method are reviewed, emphasizing seminal adjective-based personality tests (Gough's Adjective Check List), mood tests (Lubin's Depressive Adjective Test, Multiple Affect Adjective Check List),…

  7. Predictive Methodology for Delamination Growth in Laminated Composites Part 1: Theoretical Development and Preliminary Experimental Results

    DOT National Transportation Integrated Search

    1998-04-01

    A methodology is presented for the prediction of delamination growth in laminated structures. The methodology is aimed at overcoming computational difficulties in the determination of energy release rate and mode mix. It also addresses the issue that...

  8. Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Lee, Y. -D.; Russell, D. A.; Orient, G. E.

    1999-01-01

    A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, delta J(sub eff), as the governing parameter. The methodology contains original and literature J and delta J solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.
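    The governing-parameter idea can be sketched as a Paris-type power law in delta J(sub eff), with a life prediction obtained by integrating the inverted growth rate over crack length. The coefficients and the delta J(a) dependence below are invented placeholders, not NASGRO material constants:

```python
# Power-law crack growth in the effective J-range, plus a numerical life
# integration N = integral of da / (da/dN) from initial to final crack length.

def crack_growth_rate(delta_j_eff, C=1e-6, m=1.5):
    """da/dN (mm/cycle) as a power law in delta J_eff (placeholder units)."""
    return C * delta_j_eff ** m

def cycles_to_grow(a0, af, delta_j_of_a, steps=10000):
    """Midpoint-rule integration of N = sum(da / (da/dN)) from a0 to af."""
    da = (af - a0) / steps
    a_values = (a0 + (i + 0.5) * da for i in range(steps))
    return sum(da / crack_growth_rate(delta_j_of_a(a)) for a in a_values)

# Toy geometry in which delta J_eff scales linearly with crack length a:
N = cycles_to_grow(1.0, 5.0, lambda a: 2.0 * a)
```

    In the actual methodology, the delta J(a) dependence comes from the geometry-specific J solutions, and the closure correction sets how much of the applied range is effective.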

  10. Tuning the Mechanical and Antimicrobial Performance of a Cu-Based Metallic Glass Composite through Cooling Rate Control and Annealing

    PubMed Central

    Villapun, Victor Manuel; Esat, Faye; Bull, Steve; Dover, Lynn George; Gonzalez, Sergio

    2017-01-01

    The influence of cooling rate on the wear and antimicrobial performance of a Cu52Zr41Al7 (at. %) bulk metallic glass (BMG) composite was studied and the results compared to those of the annealed sample (850 °C for 48 h) and to pure copper. The aim of this basic research is to explore the potential use of the material in preventing the spread of infections. The cooling rate is controlled by changing the mould diameter (2 mm and 3 mm) upon suction casting and controlling the mould temperature (chiller on and off). For the highest cooling rate conditions, CuZr is formed, but CuZr2 starts to crystallise as the cooling rate decreases, resulting in an increase in the wear resistance and brittleness, as measured by scratch tests. A decrease in the cooling rate also increases the antimicrobial performance, as shown by different methodologies (European, American and Japanese standards). Annealing leads to the formation of new intermetallic phases (Cu10Zr7 and Cu2ZrAl) resulting in maximum scratch hardness and antimicrobial performance. However, the annealed sample corrodes during the antimicrobial tests (within 1 h of contact with broth). The antibacterial activity of copper was proved to be higher than that of any of the other materials tested, but it exhibits very poor wear properties. Cu-rich BMG composites with optimised microstructure would be preferable for some applications where the durability requirements are higher than the antimicrobial needs. PMID:28772866

  11. Sex estimation standards for medieval and contemporary Croats

    PubMed Central

    Bašić, Željana; Kružić, Ivana; Jerković, Ivan; Anđelinović, Deny; Anđelinović, Šimun

    2017-01-01

    Aim To develop discriminant functions for sex estimation on a medieval Croatian population and test their application on a contemporary Croatian population. Methods From a total of 519 skeletons, we chose 84 excellently preserved adult skeletons free of antemortem and postmortem changes and took all standard measurements. Sex was estimated/determined using standard anthropological procedures and ancient DNA (amelogenin) analysis where the pelvis was insufficiently preserved or where morphological sex indicators were inconsistent. We explored which measurements showed sexual dimorphism and used them to develop univariate and multivariate discriminant functions for sex estimation. We included only those functions that reached an accuracy rate ≥80%. We tested the applicability of the developed functions on a modern Croatian sample (n = 37). Results Of the 69 standard skeletal measurements used in this study, 56 showed statistically significant sexual dimorphism (74.7%). We developed five univariate discriminant functions with classification rates of 80.6%-85.2% and seven multivariate discriminant functions with accuracy rates of 81.8%-93.0%. When tested on the modern population, the functions showed classification rates of 74.1%-100%, and ten of them reached the aimed accuracy rate. Females showed higher classification rates in the medieval population, whereas males were better classified in the modern population. Conclusion The developed discriminant functions are sufficiently accurate for reliable sex estimation in both the medieval Croatian population and modern Croatian samples and may be used in forensic settings. The methodological issues that emerged regarding the importance of considering external factors in the development and application of discriminant functions for sex estimation should be further explored. PMID:28613039
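    The univariate case the record describes can be sketched as a simple sectioning-point discriminant: derive a cut-off from group means on a training sample, then report the classification rate. The measurements below are invented, not the study's data:

```python
# Univariate sectioning-point discriminant for sex estimation, with the
# classification-rate accounting used to accept or reject a function.

def sectioning_point(values_f, values_m):
    """Midpoint between female and male means of one skeletal measurement."""
    mean_f = sum(values_f) / len(values_f)
    mean_m = sum(values_m) / len(values_m)
    return (mean_f + mean_m) / 2

def classification_rate(values_f, values_m, cut):
    """Fraction correctly classified when values above `cut` are called male."""
    correct = sum(v < cut for v in values_f) + sum(v >= cut for v in values_m)
    return correct / (len(values_f) + len(values_m))

train_f = [41.0, 42.5, 40.2, 43.1]   # hypothetical measurements (mm), females
train_m = [46.8, 48.0, 45.5, 47.2]   # hypothetical measurements (mm), males
cut = sectioning_point(train_f, train_m)
rate_train = classification_rate(train_f, train_m, cut)
# Applying the same cut to a second (e.g. modern) sample tests whether the
# standard transfers between populations, the question the record addresses.
```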

  12. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
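
    The core statistical move, modifying an analytically derived failure probability to reflect test or flight experience, can be sketched with a simple Bayesian conjugate update. This is only an illustration of that shape; the actual PFA procedure is more elaborate, and the prior and trial counts below are hypothetical.

```python
# Hedged sketch: treat the analytically derived failure probability as a
# Beta prior and update it with (hypothetical) test/flight outcomes.
def beta_update(alpha, beta, failures, successes):
    """Posterior Beta parameters after observing trial outcomes."""
    return alpha + failures, beta + successes

def beta_mean(alpha, beta):
    return alpha / (alpha + beta)

# Prior: engineering analysis suggests ~1% failure probability (Beta(1, 99)).
a0, b0 = 1.0, 99.0
# Hypothetical operating experience: 50 trials with no failures.
a1, b1 = beta_update(a0, b0, failures=0, successes=50)
posterior_p = beta_mean(a1, b1)   # risk estimate shrinks below the prior mean
```

    Failure-free experience tightens and lowers the failure-probability distribution, which is the qualitative behavior the PFA structure formalizes.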

  13. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  14. Advanced Technologies and Methodology for Automated Ultrasonic Testing Systems Quantification

    DOT National Transportation Integrated Search

    2011-04-29

    For automated ultrasonic testing (AUT) detection and sizing accuracy, this program developed a methodology for quantification of AUT systems, advancing and quantifying AUT systems' image-capture capabilities, quantifying the performance of multiple AUT...

  15. A new process-centered description tool to initiate meta-reporting methodology in healthcare - 7CARECAT™. Feasibility study in a post-anesthesia care unit.

    PubMed

    Cottet, P; d'Hollander, A; Cahana, A; Van Gessel, E; Tassaux, D

    2013-10-01

    In the healthcare domain, different analytic tools focused on accidents appeared to be poorly adapted to sub-accidental issues. Improving local management and intra-institutional communication with simpler methods, allowing rapid and uncomplicated meta-reporting, could be an attractive alternative. A process-centered structure derived from the industrial domain - DEPOSE(E) - was selected and modified for use in the healthcare domain. The seven exclusive meta-categories defined - Patient, Equipment, Process, Actor, Supplies, work Room, and Organization - constitute 7CARECAT™. A collection of 536 "improvement" reports from a tertiary hospital post-anesthesia care unit (PACU) was used, and four meta-categorization rules were edited prior to the analysis. Both the relevance of the meta-categories and of the rules were tested to build a meta-reporting methodology. The distribution of these categories was analyzed with a χ² test. Five hundred and ninety independent facts were collected out of the 536 reports. The frequencies of the categories are: Organization 44%, Actor 37%, Patient 11%, Process 3%, work Room 3%, Equipment 1% and Supplies 1%, with a p-value <0.005 (χ²). During the analysis, three more rules were edited. The reproducibility, tested randomly on 200 reports, showed a <2% error rate. This meta-reporting methodology, developed with the 7CARECAT™ structure and using a reduced number of operational rules, has successfully produced a stable and consistent classification of sub-accidental events voluntarily reported. This model represents a relevant tool to exchange meta-information important for local and transversal communication in healthcare institutions. It could be used as a promising tool to improve quality and risk management. Copyright © 2013. Published by Elsevier SAS.
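
    The reported χ² result (p < 0.005) tests whether the 590 facts distribute uniformly across the seven meta-categories. A self-contained sketch follows; the per-category counts are reconstructed approximately from the reported percentages, so treat them as illustrative rather than the study's exact data.

```python
# Chi-square goodness-of-fit against a uniform distribution (sketch).
# Counts are approximate reconstructions from the reported percentages.
counts = {"Organization": 260, "Actor": 218, "Patient": 65,
          "Process": 18, "work Room": 18, "Equipment": 6, "Supplies": 5}

def chi_square_uniform(observed):
    """Chi-square statistic against a uniform expected distribution."""
    n = sum(observed)
    expected = n / len(observed)
    return sum((o - expected) ** 2 / expected for o in observed)

stat = chi_square_uniform(list(counts.values()))
# The chi-square critical value for df = 6 at p = 0.005 is about 18.55;
# the statistic far exceeds it, consistent with the reported p < 0.005.
significant = stat > 18.55
```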

  16. Using existing case-mix methods to fund trauma cases.

    PubMed

    Monakova, Julia; Blais, Irene; Botz, Charles; Chechulin, Yuriy; Picciano, Gino; Basinski, Antoni

    2010-01-01

    Policymakers often face the need to increase funding for isolated patient subpopulations that are frequently heterogeneous, both clinically and in terms of resource consumption. This article presents a methodologic solution for testing the appropriateness of using existing grouping and weighting methodologies for funding subsets of patients in the scenario where a case-mix approach is preferable to a flat-rate based payment system. Using as an example the subpopulation of trauma cases of Ontario lead trauma hospitals, the statistical techniques of linear and nonlinear regression models, regression trees, and spline models were applied to examine the fit of the existing case-mix groups and reference weights for the trauma cases. The analyses demonstrated that for funding Ontario trauma cases, the existing case-mix systems can form the basis for rational and equitable hospital funding, decreasing the need to develop a different grouper for this subset of patients. This study confirmed that the Injury Severity Score is a poor predictor of costs for trauma patients. Although our analysis used the Canadian case-mix classification system and cost weights, the demonstrated concept of using existing case-mix systems to develop funding rates for specific subsets of patient populations may be applicable internationally.

  17. Experimental saltwater intrusion in coastal aquifers using automated image analysis: Applications to homogeneous aquifers

    NASA Astrophysics Data System (ADS)

    Robinson, G.; Ahmed, Ashraf A.; Hamill, G. A.

    2016-07-01

    This paper presents the applications of a novel methodology to quantify saltwater intrusion parameters in laboratory-scale experiments. The methodology uses an automated image analysis procedure, minimising manual inputs and the subsequent systematic errors that can be introduced. This allowed the quantification of the width of the mixing zone which is difficult to measure in experimental methods that are based on visual observations. Glass beads of different grain sizes were tested for both steady-state and transient conditions. The transient results showed good correlation between experimental and numerical intrusion rates. The experimental intrusion rates revealed that the saltwater wedge reached a steady state condition sooner while receding than advancing. The hydrodynamics of the experimental mixing zone exhibited similar traits; a greater increase in the width of the mixing zone was observed in the receding saltwater wedge, which indicates faster fluid velocities and higher dispersion. The angle of intrusion analysis revealed the formation of a volume of diluted saltwater at the toe position when the saltwater wedge is prompted to recede. In addition, results of different physical repeats of the experiment produced an average coefficient of variation less than 0.18 of the measured toe length and width of the mixing zone.

  18. A Support Vector Machine Approach for Truncated Fingerprint Image Detection from Sweeping Fingerprint Sensors

    PubMed Central

    Chen, Chi-Jim; Pai, Tun-Wen; Cheng, Mox

    2015-01-01

    A sweeping fingerprint sensor acquires a fingerprint on a row-by-row basis and assembles the rows through image reconstruction techniques. However, the reconstructed fingerprint image may appear truncated and distorted when the finger is swept across the sensor at a non-linear speed. If truncated fingerprint images were enrolled as reference targets and collected by an automated fingerprint identification system (AFIS), successful prediction rates for fingerprint matching applications would decrease significantly. In this paper, a novel and effective methodology with low computational time complexity was developed for detecting truncated fingerprints in real time. Several filtering rules were implemented to validate the existence of truncated fingerprints. In addition, a machine learning method, the support vector machine (SVM), based on the principle of structural risk minimization, was applied to reject pseudo-truncated fingerprints containing characteristics similar to truncated ones. The experimental results showed that an accuracy rate of 90.7% was achieved by successfully identifying truncated fingerprint images from testing images before AFIS enrollment procedures. The proposed effective and efficient methodology can be extensively applied to all existing fingerprint matching systems as a preliminary quality control prior to the construction of fingerprint templates. PMID:25835186
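
    The SVM rejection step can be sketched with a hand-rolled linear SVM (hinge loss minimized by sub-gradient descent), kept self-contained rather than using a library. The 2-D features and labels below are invented stand-ins for the paper's fingerprint quality descriptors.

```python
# Hand-rolled linear SVM (hinge loss + L2 penalty, sub-gradient descent).
# Features and labels are invented 2-D stand-ins, not the paper's data.
import random

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200, seed=0):
    """Return weights w and bias b minimizing hinge loss + L2 penalty."""
    rng = random.Random(seed)
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        idx = list(range(len(X)))
        rng.shuffle(idx)
        for i in idx:
            margin = y[i] * (w[0] * X[i][0] + w[1] * X[i][1] + b)
            if margin < 1:   # inside margin: push the separator outward
                w = [w[j] + lr * (y[i] * X[i][j] - lam * w[j]) for j in (0, 1)]
                b += lr * y[i]
            else:            # outside margin: only apply the L2 shrinkage
                w = [w[j] - lr * lam * w[j] for j in (0, 1)]
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Toy data: +1 = truncated fingerprint, -1 = intact.
X = [(2.0, 2.5), (2.2, 3.0), (2.8, 2.7), (0.5, 0.4), (0.8, 0.2), (0.3, 0.9)]
y = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(X, y)
train_acc = sum(predict(w, b, x) == t for x, t in zip(X, y)) / len(X)
```

    Structural risk minimization appears here as the L2 penalty term, which trades training error against margin width.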

  19. Destructive testing: dry drilling operations with the TruPro system to collect samples, in powder form, from two hulls containing wastes immobilized in a hydraulic binder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pombet, Denis; Desnoyers, Yvon; Charters, Grant

    2013-07-01

    The TruPro® process makes it possible to collect a significant number of samples to characterize radiological materials. This innovative and alternative technique is being tested for the ANDRA quality-control inspection of cemented packages. It proves to be quicker and more prolific than the current methodology. Using classical statistics and geostatistics approaches, the physical and radiological characteristics of two hulls containing wastes (sludges or concentrates) immobilized in a hydraulic binder are assessed in this paper. The waste homogeneity is also evaluated against the ANDRA criterion. Sensitivity to sample size (support effect), presence of extreme values, acceptable deviation rate, and minimum number of data are discussed. The final objectives are to check the homogeneity of the two characterized radwaste packages and also to validate and reinforce this alternative characterization methodology. (authors)

  20. Methods for estimating the labour force insured by the Ontario Workplace Safety and Insurance Board: 1990-2000.

    PubMed

    Smith, Peter M; Mustard, Cameron A; Payne, Jennifer I

    2004-01-01

    This paper presents a methodology for estimating the size and composition of the Ontario labour force eligible for coverage under the Ontario Workplace Safety & Insurance Act (WSIA). Using customized tabulations from Statistics Canada's Labour Force Survey (LFS), we made adjustments for self-employment, unemployment, part-time employment and employment in specific industrial sectors excluded from insurance coverage under the WSIA. Each adjustment to the LFS reduced the estimates of the insured labour force relative to the total Ontario labour force. These estimates were then developed for major occupational and industrial groups stratified by gender. Additional estimates created to test assumptions used in the methodology produced similar results. The methods described in this paper advance those previously used to estimate the insured labour force, providing researchers with a useful tool to describe trends in the rate of injury across differing occupational, industrial and gender groups in Ontario.

  1. Acoustic Emission Methodology to Evaluate the Fracture Toughness in Heat Treated AISI D2 Tool Steel

    NASA Astrophysics Data System (ADS)

    Mostafavi, Sajad; Fotouhi, Mohamad; Motasemi, Abed; Ahmadi, Mehdi; Sindi, Cevat Teymuri

    2012-10-01

    In this article, the fracture toughness behavior of tool steel was investigated using Acoustic Emission (AE) monitoring. Fracture toughness (KIC) values of a specific tool steel were determined by applying various approaches based on conventional AE parameters, such as Acoustic Emission Cumulative Count (AECC), Acoustic Emission Energy Rate (AEER), and the combination of mechanical characteristics and AE information called the sentry function. The critical fracture toughness values during crack propagation were obtained by means of the relationship between the integral of the sentry function and cumulative fracture toughness (KICUM). Specimens were selected from AISI D2 cold-work tool steel and were heat treated at four different tempering conditions (300, 450, 525, and 575 °C). The results achieved through the AE approaches were then compared with the methodology of compact-specimen testing according to ASTM standard E399. It was concluded that AE monitoring was an efficient method for investigating fracture characteristics.
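
    The sentry function is commonly defined in the AE literature as f(x) = ln(Es/Ea), the log ratio of stored strain energy to cumulative acoustic emission energy; its integral over displacement is then correlated with fracture toughness. A minimal sketch of that bookkeeping follows, with all numbers invented for illustration.

```python
# Sentry-function integral (sketch). All data values are hypothetical.
import math

displacement = [0.1, 0.2, 0.3, 0.4, 0.5]      # mm (hypothetical)
strain_energy = [0.5, 1.8, 3.9, 6.8, 10.0]    # J, stored mechanical energy
ae_energy = [0.01, 0.05, 0.2, 0.9, 3.0]       # J, cumulative AE energy

# Sentry function: log ratio of mechanical to acoustic energy. It drops as
# damage accumulates and AE energy grows faster than strain energy.
sentry = [math.log(es / ea) for es, ea in zip(strain_energy, ae_energy)]

def trapezoid(x, y):
    """Trapezoidal integral of y(x)."""
    return sum((x[i + 1] - x[i]) * (y[i] + y[i + 1]) / 2.0
               for i in range(len(x) - 1))

integral = trapezoid(displacement, sentry)   # feeds the K_IC correlation
```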

  2. The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale

    ERIC Educational Resources Information Center

    Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine

    2013-01-01

    Background: speechBITE (http://www.speechbite.com) is an online database established in order to help speech and language therapists gain faster access to relevant research that can be used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…

  3. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  4. From Study to Work: Methodological Challenges of a Graduate Destination Survey in the Western Cape, South Africa

    ERIC Educational Resources Information Center

    du Toit, Jacques; Kraak, Andre; Favish, Judy; Fletcher, Lizelle

    2014-01-01

    Current literature proposes several strategies for improving response rates to student evaluation surveys. Graduate destination surveys pose the difficulty of tracing graduates years later when their contact details may have changed. This article discusses the methodology of one such a survey to maximise response rates. Compiling a sample frame…

  5. A new hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.

  6. Modified Methodology for Projecting Coastal Louisiana Land Changes over the Next 50 Years

    USGS Publications Warehouse

    Hartley, Steve B.

    2009-01-01

    The coastal Louisiana landscape is continually undergoing geomorphologic changes (in particular, land loss); however, after the 2005 hurricane season, the changes were intensified because of Hurricanes Katrina and Rita. The amount of land loss caused by the 2005 hurricane season was 42 percent (562 km2) of the total land loss (1,329 km2) that was projected for the next 50 years in the Louisiana Coastal Area (LCA), Louisiana Ecosystem Restoration Study. The purpose of this study is to provide information on potential changes to coastal Louisiana by using a revised LCA study methodology. In the revised methodology, we used classified Landsat TM satellite imagery from 1990, 2001, 2004, and 2006 to calculate the 'background' or ambient land-water change rates but divided the Louisiana coastal area differently on the basis of (1) geographic regions ('subprovinces') and (2) specific homogeneous habitat types. Defining polygons by subprovinces (1, Pontchartrain Basin; 2, Barataria Basin; 3, Vermilion/Terrebonne Basins; and 4, the Chenier Plain area) allows for a specific erosion rate to be applied to that area. Further subdividing the provinces by habitat type allows for specific erosion rates for a particular vegetation type to be applied. Our modified methodology resulted in 24 polygons rather than the 183 that were used in the LCA study; further, actively managed areas and the CWPPRA areas were not masked out and dealt with separately, as they had been in the LCA study. This revised methodology assumes that erosion rates for habitat types by subprovince are under the influence of similar environmental conditions (sediment depletion, subsidence, and saltwater intrusion). Background change rates for three time periods (1990-2001, 1990-2004, and 1990-2006) were calculated by taking the difference in water or land over each time period and dividing it by the length of the interval. This calculation gives an annual change rate for each polygon per time period.
Change rates for each time period were then used to compute the projected change in each subprovince and habitat type over 50 years by using the same compound rate functions used in the LCA study. The resulting maps show projected land changes based on the revised methodology and inclusion of damage by Hurricanes Katrina and Rita. Comparison of projected land change values between the LCA study and this study shows that this revised methodology - that is, using a reduced polygon subset (reduced from 183 to 24) based on habitat type and subprovince - can be used as a quick projection of land loss.
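
    The compound-rate projection described above amounts to applying an annual fractional change rate to each polygon's land area over the 50-year horizon. A minimal sketch follows; the polygon area and rate are hypothetical, not values from the study.

```python
# Compound annual land-change projection (sketch). Values are hypothetical.
def project_area(area_km2, annual_rate, years):
    """Compound a constant annual land-change rate (negative = loss)."""
    return area_km2 * (1.0 + annual_rate) ** years

# Hypothetical polygon: 100 km2 losing 0.5% of its remaining land per year.
remaining = project_area(100.0, -0.005, 50)
loss = 100.0 - remaining   # projected 50-year land loss for this polygon
```

    Summing such per-polygon projections over the 24 subprovince/habitat polygons yields the coast-wide land-change map.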

  7. Comparison of disease prevalence in two populations in the presence of misclassification.

    PubMed

    Tang, Man-Lai; Qiu, Shi-Fang; Poon, Wai-Yin

    2012-11-01

    Comparing disease prevalence in two groups is an important topic in medical research, and prevalence rates are obtained by classifying subjects according to whether they have the disease. Either high-cost infallible gold-standard classifiers or low-cost fallible classifiers can be used to classify subjects. However, statistical analysis that is based on data sets with misclassifications leads to biased results. As a compromise between the two classification approaches, partially validated sets are often used in which all individuals are classified by fallible classifiers, and some of the individuals are validated by the accurate gold-standard classifiers. In this article, we develop several reliable test procedures and approximate sample size formulas for disease prevalence studies based on the difference between two disease prevalence rates with two independent partially validated series. Empirical studies show that (i) the Score test produces close-to-nominal type I error levels and is preferred in practice; and (ii) the sample size formula based on the Score test is also fairly accurate in terms of the empirical power and type I error rate, and is hence recommended. A real example from an aplastic anemia study is used to illustrate the proposed methodologies. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
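
    For orientation, the textbook score test for the difference of two proportions (the fully validated, misclassification-free case) can be sketched as below; the paper's procedures extend this to partially validated data. The counts are hypothetical.

```python
# Score (pooled) z test for H0: p1 = p2 (sketch, fully validated case).
import math

def score_test_two_proportions(x1, n1, x2, n2):
    """Return (z, two-sided p-value) for the pooled score test."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)          # MLE of p under the null
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))   # = 2 * (1 - Phi(|z|))
    return z, p_value

# Hypothetical counts: 30/200 diseased in group 1 vs 15/180 in group 2.
z, p = score_test_two_proportions(30, 200, 15, 180)
```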

  8. Evaluation of glucose controllers in virtual environment: methodology and sample application.

    PubMed

    Chassin, Ludovic J; Wilinska, Malgorzata E; Hovorka, Roman

    2004-11-01

    Adaptive systems to deliver medical treatment in humans are safety-critical systems and require particular care in both the testing and the evaluation phase, which are time-consuming, costly, and confounded by ethical issues. The objective of the present work is to develop a methodology to test glucose controllers of an artificial pancreas in a simulated (virtual) environment. A virtual environment comprising a model of the carbohydrate metabolism and models of the insulin pump and the glucose sensor is employed to simulate individual glucose excursions in subjects with type 1 diabetes. The performance of the control algorithm within the virtual environment is evaluated by considering treatment and operational scenarios. The developed methodology includes two dimensions: testing in relation to specific lifestyle conditions, i.e. fasting, post-prandial, and lifestyle (metabolic) disturbances; and testing in relation to various operating conditions, i.e. expected operating conditions, adverse operating conditions, and system failure. We define safety and efficacy criteria and describe the measures to be taken prior to clinical testing. The use of the methodology is exemplified by tuning and evaluating a model predictive glucose controller being developed for a wearable artificial pancreas, focused on fasting conditions. Our methodology to test glucose controllers in a virtual environment is instrumental in anticipating the results of real clinical tests for different physiological conditions and for different operating conditions. The thorough testing in the virtual environment reduces costs and speeds up the development process.

  9. Klamath Falls: High-Power Acoustic Well Stimulation Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Black, Brian

    Acoustic well stimulation (AWS) technology uses high-power sonic waves from specific frequency spectra in an attempt to stimulate production in a damaged or low-production wellbore. AWS technology is one of the most promising technologies in the oil and gas industry, but it has proven difficult for the industry to develop an effective downhole prototype. This collaboration between Klamath Falls Inc. and the Rocky Mountain Oilfield Testing Center (RMOTC) included a series of tests using high-power ultrasonic tools to stimulate oil and gas production. Phase I testing was designed and implemented to verify tool functionality, power requirements, and capacity of high-power AWS tools. The purpose of Phase II testing was to validate the production response of wells with marginal production rates to AWS stimulation and to capture and identify any changes in the downhole environment after tool deployment. This final report presents methodology and results.

  10. Author Response to Sabour (2018), "Comment on Hall et al. (2017), 'How to Choose Between Measures of Tinnitus Loudness for Clinical Research? A Report on the Reliability and Validity of an Investigator-Administered Test and a Patient-Reported Measure Using Baseline Data Collected in a Phase IIa Drug Trial'".

    PubMed

    Hall, Deborah A; Mehta, Rajnikant L; Fackrell, Kathryn

    2018-03-08

    The authors respond to a letter to the editor (Sabour, 2018) concerning the interpretation of validity in the context of evaluating treatment-related change in tinnitus loudness over time. The authors refer to several landmark methodological publications and an international standard concerning the validity of patient-reported outcome measurement instruments. The tinnitus loudness rating performed better against our reported acceptability criteria for (face and convergent) validity than did the tinnitus loudness matching test. It is important to distinguish between tests that evaluate the validity of measuring treatment-related change over time and tests that quantify the accuracy of diagnosing tinnitus as a case and non-case.

  11. Development and psychometric testing of a semantic differential scale of sexual attitude for the older person.

    PubMed

    Park, Hyojung; Shin, Sunhwa

    2015-12-01

    The purpose of this study was to develop and test a semantic differential scale of sexual attitudes for older people in Korea. The scale was based on items derived from a literature review and focus group interviews. A methodological study was used to test the reliability and validity of the instrument. A total of 368 older men and women were recruited to complete the semantic differential scale. Fifteen pairs of adjective ratings were extracted through factor analysis. Total variance explained was 63.40%. To test for construct validity, group comparisons were implemented. The total score of sexual attitudes showed significant differences depending on gender and availability of sexual activity. Cronbach's alpha coefficient for internal consistency was 0.96. The findings of this study demonstrate that the semantic differential scale of sexual attitude is a reliable and valid instrument. © 2015 Wiley Publishing Asia Pty Ltd.

  12. Optimization of Extraction Process for Antidiabetic and Antioxidant Activities of Kursi Wufarikun Ziyabit Using Response Surface Methodology and Quantitative Analysis of Main Components.

    PubMed

    Edirs, Salamet; Turak, Ablajan; Numonov, Sodik; Xin, Xuelei; Aisa, Haji Akber

    2017-01-01

    By using extraction yield, total polyphenolic content, antidiabetic activities (PTP-1B and α-glycosidase), and antioxidant activity (ABTS and DPPH) as indicator markers, the extraction conditions of the prescription Kursi Wufarikun Ziyabit (KWZ) were optimized by response surface methodology (RSM). Independent variables were ethanol concentration, extraction temperature, solid-to-solvent ratio, and extraction time. The result of RSM analysis showed that the four variables investigated have a significant effect (p < 0.05) on Y1, Y2, Y3, Y4, and Y5, with R² values of 0.9120, 0.9793, 0.9076, 0.9125, and 0.9709, respectively. Optimal conditions for the highest extraction yield of 39.28%, PTP-1B inhibition rate of 86.21%, α-glycosidase enzyme inhibition rate of 96.56%, and ABTS inhibition rate of 77.38% were derived at ethanol concentration 50.11%, extraction temperature 72.06°C, solid-to-solvent ratio 1:22.73 g/mL, and extraction time 2.93 h. On the basis of the total polyphenol content of 48.44% under these optimal conditions, the effective part of KWZ was characterized quantitatively via a UPLC method; 12 main components were identified by standard compounds, all of them showed good regression within the test ranges, and their total content was 11.18%.
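
    The core of RSM optimization is fitting a low-order polynomial to the measured responses and locating its stationary point. A one-factor sketch follows: a quadratic is fitted exactly through three design points and its vertex taken as the predicted optimum. The temperatures and yields are hypothetical, not the paper's data.

```python
# One-factor response-surface sketch: quadratic fit + vertex as optimum.
def quadratic_through(p0, p1, p2):
    """Coefficients (a, b, c) of y = a*x^2 + b*x + c through three points."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2**2 * (y0 - y1) + x1**2 * (y2 - y0) + x0**2 * (y1 - y2)) / denom
    c = (x1 * x2 * (x1 - x2) * y0 + x2 * x0 * (x2 - x0) * y1
         + x0 * x1 * (x0 - x1) * y2) / denom
    return a, b, c

# Hypothetical design points: extraction temperature (deg C) vs yield (%).
a, b, c = quadratic_through((50, 30.0), (70, 39.0), (90, 32.0))
optimum_temp = -b / (2 * a)   # vertex of the fitted parabola
```

    Real RSM designs fit a multi-factor quadratic with interaction terms by least squares, but the locate-the-vertex logic is the same.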

  13. Use of simulated experiments for material characterization of brittle materials subjected to high strain rate dynamic tension

    PubMed Central

    Saletti, Dominique

    2017-01-01

    Rapid progress in ultra-high-speed imaging has allowed material properties to be studied at high strain rates by applying full-field measurements and inverse identification methods. Nevertheless, the sensitivity of these techniques still requires a better understanding, since various extrinsic factors present during an actual experiment make it difficult to separate different sources of errors that can significantly affect the quality of the identified results. This study presents a methodology using simulated experiments to investigate the accuracy of the so-called spalling technique (used to study tensile properties of concrete subjected to high strain rates) by numerically simulating the entire identification process. The experimental technique uses the virtual fields method and the grid method. The methodology consists of reproducing the recording process of an ultra-high-speed camera by generating sequences of synthetically deformed images of a sample surface, which are then analysed using the standard tools. The investigation of the uncertainty of the identified parameters, such as Young's modulus along with the stress–strain constitutive response, is addressed by introducing the most significant user-dependent parameters (i.e. acquisition speed, camera dynamic range, grid sampling, blurring), proving that the used technique can be an effective tool for error investigation. This article is part of the themed issue ‘Experimental testing and modelling of brittle materials at high strain rates’. PMID:27956505

  14. The Long Exercise Test in Periodic Paralysis: A Bayesian Analysis.

    PubMed

    Simmons, Daniel B; Lanning, Julie; Cleland, James C; Puwanant, Araya; Twydell, Paul T; Griggs, Robert C; Tawil, Rabi; Logigian, Eric L

    2018-05-12

    The long exercise test (LET) is used to assess the diagnosis of periodic paralysis (PP), but LET methodology and normal "cut-off" values vary. To determine optimal LET methodology and cut-offs, we reviewed LET data (abductor digiti minimi (ADM) motor response amplitude, area) from 55 PP patients (32 genetically definite) and 125 controls. Receiver operating characteristic (ROC) curves were constructed and area-under-the-curve (AUC) calculated to compare 1) peak-to-nadir versus baseline-to-nadir methodologies, and 2) amplitude versus area decrements. Using Bayesian principles, optimal "cut-off" decrements that achieved 95% post-test probability of PP were calculated for various pre-test probabilities (PreTPs). AUC was highest for peak-to-nadir methodology and equal for amplitude and area decrements. For PreTP ≤50%, optimal decrement cut-offs (peak-to-nadir) were >40% (amplitude) or >50% (area). For confirmation of PP, our data endorse the diagnostic utility of peak-to-nadir LET methodology using 40% amplitude or 50% area decrement cut-offs for PreTPs ≤50%. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
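
    The peak-to-nadir decrement compared here is the percentage drop from the highest post-exercise response to its subsequent minimum, checked against a cut-off. A minimal sketch follows; the serial amplitudes (mV) are hypothetical.

```python
# Peak-to-nadir amplitude decrement for the long exercise test (sketch).
# Amplitude values are hypothetical, not patient data.
def decrement_pct(amplitudes):
    """Percentage decrement from the peak to the nadir that follows it."""
    peak = max(amplitudes)
    nadir = min(amplitudes[amplitudes.index(peak):])   # nadir after the peak
    return 100.0 * (peak - nadir) / peak

adm_amplitudes = [8.0, 9.5, 9.0, 7.5, 6.0, 5.0, 4.5]   # serial ADM CMAPs, mV
dec = decrement_pct(adm_amplitudes)
supports_pp = dec > 40.0   # amplitude cut-off proposed for PreTP <= 50%
```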

  15. Effect of olive mill wastewaters on the oxygen consumption by activated sludge microorganisms: an acute toxicity test method.

    PubMed

    Paixão, S M; Anselmo, A M

    2002-01-01

    The test for inhibition of oxygen consumption by activated sludge (ISO 8192-1986 (E)) was evaluated as a tool for assessing, the acute toxicity of olive mill wastewaters (OMW). According to the ISO test, information generated by this method may be helpful in estimating the effect of a test material on bacterial communities in the aquatic environment, especially in aerobic biological treatment systems. However, the lack of standardized bioassay methodology for effluents imposed that the test conditions were modified and adapted. The experiments were conducted in the presence or absence of an easily biodegradable carbon source (glucose) with different contact times (20 min and 24 h). The results obtained showed a remarkable stimulatory effect of this effluent to the activated sludge microorganisms. In fact, the oxygen uptake rate values increase with increasing effluent concentrations and contact times up to 0.98 microl O(2) h(-1) mg(-1) dry weight for a 100% OMW sample, 24 h contact time, with blanks exhibiting an oxygen uptake rate of ca. 1/10 of this value (0.07-0.10). It seems that the application of the ISO test as an acute toxicity test for effluents should be reconsidered, with convenient adaptation for its utilization as a method of estimating the effect on bacterial communities present in aerobic biological treatment systems. Copyright 2002 John Wiley & Sons, Ltd.

  16. Diagnostic accuracy of different caries risk assessment methods. A systematic review.

    PubMed

    Senneby, Anna; Mejàre, Ingegerd; Sahlin, Nils-Eric; Svensäter, Gunnel; Rohlin, Madeleine

    2015-12-01

    To evaluate the accuracy of different methods used to identify individuals at increased risk of developing dental coronal caries. Studies on the following methods were included: previous caries experience, tests using microbiota, buffering capacity, salivary flow rate, oral hygiene, dietary habits and sociodemographic variables. QUADAS-2 was used to assess risk of bias. Sensitivity, specificity, predictive values, and likelihood ratios (LR) were calculated. Quality of evidence based on ≥3 studies of a method was rated according to GRADE. PubMed, Cochrane Library, Web of Science and the reference lists of included publications were searched up to January 2015. Of 5776 identified articles, 18 were included. Assessment of study quality identified methodological limitations concerning study design, test technology and reporting. No study presented a low risk of bias in all domains. Three or more studies were found only for previous caries experience and salivary mutans streptococci, and the quality of evidence for these methods was low. Evidence regarding other methods was lacking. For previous caries experience, sensitivity ranged between 0.21 and 0.94 and specificity between 0.20 and 1. Tests using salivary mutans streptococci resulted in low sensitivity and high specificity. For children with primary teeth at baseline, the pooled LR for a positive test was 3 for previous caries experience and 4 for salivary mutans streptococci, given a threshold of ≥10(5) CFU/ml. Evidence on the validity of the analysed methods used for caries risk assessment is limited. As methodological quality was low, there is a need to improve study design. Low validity of the analysed methods may lead to patients at increased risk not being identified, whereas some are falsely identified as being at risk. As caries risk assessment guides individualized decisions on interventions and intervals for patient recall, improved performance based on best evidence is greatly needed. Copyright © 2015 Elsevier Ltd. All rights reserved.
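    The pooled positive LRs above translate into post-test probabilities through the odds form of Bayes' theorem. A minimal sketch; the 40% baseline risk is an illustrative assumption, not a figure from the review:

```python
def apply_lr(pre_test_p, lr):
    """Update a pre-test probability with a likelihood ratio (odds form of Bayes)."""
    odds = pre_test_p / (1.0 - pre_test_p) * lr
    return odds / (1.0 + odds)

# Pooled positive LRs reported for children with primary teeth:
# 3 for previous caries experience, 4 for salivary mutans streptococci.
for name, lr in [("previous caries experience", 3), ("mutans streptococci", 4)]:
    print(name, round(apply_lr(0.40, lr), 2))
# previous caries experience 0.67
# mutans streptococci 0.73
```

    An LR+ of 3-4 shifts a moderate baseline risk only modestly, which is one way to read the review's conclusion that validity of the analysed methods is limited.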

  17. Integrating Low-Cost Rapid Usability Testing into Agile System Development of Healthcare IT: A Methodological Perspective.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    The development of more usable and effective healthcare information systems has become a critical issue. In the software industry, methodologies such as agile and iterative development processes have emerged to produce more effective and usable systems. These approaches emphasize focusing on user needs and promote iterative and flexible development practices. Evaluation and testing of iterative agile development cycles is considered an important part of the agile methodology and of iterative processes for system design and re-design. However, the issue of how to effectively integrate usability testing methods into rapid and flexible agile design cycles has yet to be fully explored. In this paper we describe our application of an approach known as low-cost rapid usability testing as it has been applied within agile system development in healthcare. The advantages of the integrative approach are described, along with current methodological considerations.

  18. Person-centredness in the care of older adults: a systematic review of questionnaire-based scales and their measurement properties.

    PubMed

    Wilberforce, Mark; Challis, David; Davies, Linda; Kelly, Michael P; Roberts, Chris; Loynes, Nik

    2016-03-07

    Person-centredness is promoted as a central feature of the long-term care of older adults. Measures are needed to assist researchers, service planners and regulators in assessing this feature of quality. However, no systematic review exists to identify potential instruments and to provide a critical appraisal of their measurement properties. A systematic review of measures of person-centredness was undertaken. Inclusion criteria restricted references to multi-item instruments designed for older adult services, or otherwise with measurement properties tested in an older adult population. A two-stage critical appraisal was conducted. First, the methodological quality of included references was assessed using the COSMIN toolkit. Second, seven measurement properties were rated using widely-recognised thresholds of acceptability. These results were then synthesised to provide an overall appraisal of the strength of evidence for each measurement property for each instrument. Eleven measures tested in 22 references were included. Six instruments were designed principally for use in long-stay residential facilities, and four were for ambulatory hospital or clinic-based services. Only one measure was designed mainly for completion by users of home care services. No measure could be assessed across all seven measurement properties. Despite some instruments having promising measurement properties, this was consistently undermined by the poor methodological quality underpinning them. Testing of hypotheses to support construct validity was of particularly low quality, whilst measurement error was rarely assessed. Two measures were identified as having been the subject of the most rigorous testing. The review is unable to unequivocally recommend any measures of person-centredness for use in older adult care. Researchers are advised to improve methodological rigour when testing instruments. 
Efforts may be best focused on testing a narrower range of measurement properties but to a higher standard, and ensuring that translations to new languages are resisted until strong measurement properties are demonstrated in the original tongue. Limitations of the review include inevitable semantic and conceptual challenges involved in defining 'person-centredness'. The review protocol was registered with PROSPERO (ref: CRD42014005935).

  19. Computational Pollutant Environment Assessment from Propulsion-System Testing

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; McConnaughey, Paul; Chen, Yen-Sen; Warsi, Saif

    1996-01-01

    An asymptotic plume growth method based on a time-accurate three-dimensional computational fluid dynamics formulation has been developed to assess the exhaust-plume pollutant environment from a simulated RD-170 engine hot-fire test on the F1 Test Stand at Marshall Space Flight Center. Researchers have long known that rocket-engine hot firing has the potential for forming thermal nitric oxides, as well as producing carbon monoxide when hydrocarbon fuels are used. Because of the complex physics involved, most attempts to predict the pollutant emissions from ground-based engine testing have used simplified methods, which may grossly underpredict and/or overpredict the pollutant formations in a test environment. The objective of this work has been to develop a computational fluid dynamics-based methodology that replicates the underlying test-stand flow physics to accurately and efficiently assess pollutant emissions from ground-based rocket-engine testing. A nominal RD-170 engine hot-fire test was computed, and pertinent test-stand flow physics was captured. The predicted total emission rates compared reasonably well with those of the existing hydrocarbon engine hot-firing test data.

  20. Measuring the impact of methodological research: a framework and methods to identify evidence of impact.

    PubMed

    Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F

    2014-11-27

    Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations; and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally. 
We have used the information gathered in this study to adapt an existing framework for impact of clinical research for use in methodological research. Gathering evidence on research impact of methodological research from a variety of sources has enabled us to obtain multiple indicators and thus to demonstrate broad impacts of methodological research. The adapted framework developed can be applied to future methodological research and thus provides a tool for methodologists to better assess and report research impacts.

  1. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Volume 2, Part 2: Appendixes B, C, D and E

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    The derivation of the equations is presented, the rate control algorithm described, and simulation methodologies summarized. A set of dynamics equations that can be used recursively to calculate the forces and torques acting at the joints of an n-link manipulator, given the manipulator joint rates, is derived. The equations are valid for any n-link manipulator system with any kind of joints connected in any sequence. The equations of motion for the class of manipulators consisting of n rigid links interconnected by rotary joints are derived. A technique is outlined for reducing the system of equations to eliminate constraint torques. The linearized dynamics equations for an n-link manipulator system are derived. The general n-link linearized equations are then applied to a two-link configuration. The coordinated rate control algorithm used to compute individual joint rates when given end-effector rates is described. A short discussion of simulation methodologies is presented.
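    For the two-link configuration mentioned above, coordinated rate control reduces to inverting the 2x2 end-effector Jacobian at each instant to obtain joint rates from commanded end-effector rates. A minimal sketch of that step (link lengths and commanded rates are illustrative; the report's own equations are not reproduced here):

```python
import math

def two_link_joint_rates(theta1, theta2, l1, l2, xdot, ydot):
    """Coordinated rate control for a planar two-link arm:
    solve J(theta) * thetadot = [xdot, ydot] for the joint rates."""
    s1, c1 = math.sin(theta1), math.cos(theta1)
    s12, c12 = math.sin(theta1 + theta2), math.cos(theta1 + theta2)
    # Jacobian of the end-effector position with respect to the joint angles
    j11, j12 = -l1 * s1 - l2 * s12, -l2 * s12
    j21, j22 = l1 * c1 + l2 * c12, l2 * c12
    det = j11 * j22 - j12 * j21  # vanishes at theta2 = 0 or pi (singular)
    td1 = (j22 * xdot - j12 * ydot) / det
    td2 = (-j21 * xdot + j11 * ydot) / det
    return td1, td2

# Unit links, elbow at 90 degrees, commanded end-effector rate (0.1, 0) m/s
print(two_link_joint_rates(0.0, math.pi / 2, 1.0, 1.0, 0.1, 0.0))
```

    The determinant vanishes when the arm is fully extended or folded, where joint rates are undefined; a practical rate controller must detect and handle these singular configurations.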

  2. [Testing methods for the characterization of catheter balloons and lumina].

    PubMed

    Werner, C; Rössler, K; Deckert, F

    1995-10-01

    The present paper reports on the characterization of catheter balloons and lumina on the basis of such known parameters as residual volume, compliance, burst pressure and flow rate, with the aim of developing test methods and testing equipment as well as standards. These are becoming ever more important with the coming into force of the EC directive on medical products [7] and the law governing medical products in Germany [13], which requires manufacturers to specify the properties of their products. Our testing concept is based on a commercially available machine that subjects materials to alternating extension and compression forces over the long term, to which we added a special hydraulic module. Using multimedia technology, we achieved a real-time superimposition of the volume-diameter curve on the balloon. The function of the testing device and method is demonstrated on dilatation catheters. Our initial results reveal compatibility with the requirements of the 1% accuracy class. Use of this methodology for comparative testing of catheters and quality evaluation is recommended.

  3. An Alternative Methodology for Creating Parallel Test Forms Using the IRT Information Function.

    ERIC Educational Resources Information Center

    Ackerman, Terry A.

    The purpose of this paper is to report results on the development of a new computer-assisted methodology for creating parallel test forms using the item response theory (IRT) information function. Recently, several researchers have approached test construction from a mathematical programming perspective. However, these procedures require…

  4. 77 FR 7109 - Establishment of User Fees for Filovirus Testing of Nonhuman Primate Liver Samples

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-10

    ... assay (ELISA) or other appropriate methodology. Each specimen will be held for six months. After six... loss of the only commercially available antigen-detection ELISA filovirus testing facility. Currently... current methodology (ELISA) used to test NHP liver samples. This cost determines the amount of the user...

  5. Report on FY17 testing in support of integrated EPP-SMT design methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yanli; Jetter, Robert I.; Sham, T.-L.

    The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology, avoiding the separate evaluation of creep and fatigue damage and eliminating the requirement for stress classification in current methods, thus greatly simplifying the evaluation of elevated-temperature cyclic service. The purpose of this methodology is to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, thermomechanical testing continued in FY17. This report presents the recent test results for Type 1 SMT specimens on Alloy 617 with long hold times, pressurization SMT on Alloy 617, and two-bar thermal ratcheting test results on SS316H over the temperature range of 405 °C to 705 °C. Preliminary EPP strain range analyses of the two-bar tests are critically evaluated and compared with the experimental results.

  6. Interrelationship of Nondestructive Evaluation Methodologies Applied to Testing of Composite Overwrapped Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Leifeste, Mark R.

    2007-01-01

    Composite Overwrapped Pressure Vessels (COPVs) are commonly used in spacecraft for containment of pressurized gases and fluids, combining strength with weight savings. The energy stored is capable of causing extensive spacecraft damage and personal injury in the event of sudden failure. These apparently simple structures, composed of a media-impermeable metallic liner and a fiber/resin composite overwrap, are in fact complex structures with numerous material and structural phenomena interacting during pressurized use, requiring multiple, interrelated methodologies to monitor and understand the subtle changes critical to safe use. Testing of COPVs at NASA Johnson Space Center White Sands Test Facility (WSTF) has employed multiple in-situ, real-time nondestructive evaluation (NDE) methodologies as well as pre- and post-test comparative techniques to monitor changes in material and structural parameters during advanced pressurized testing. The use of NDE methodologies and their relationship to monitoring changes is discussed based on testing of real-world spacecraft COPVs. Lessons learned are used to present recommendations for testing, as well as a discussion of potential applications to vessel health monitoring in future applications.

  7. Solid particle erosion mechanisms of protective coatings for aerospace applications

    NASA Astrophysics Data System (ADS)

    Bousser, Etienne

    The main objective of this PhD project is to investigate the material loss mechanisms during Solid Particle Erosion (SPE) of hard protective coatings, including nanocomposite and nanostructured systems. In addition, because of the complex nature of SPE mechanisms, rigorous testing methodologies need to be employed and the effects of all testing parameters need to be fully understood. In this PhD project, the importance of testing methodology is addressed throughout in order to effectively study the SPE mechanisms of brittle materials and coatings. In the initial stage of this thesis, we studied the effect of the addition of silicon (Si) on the microstructure, mechanical properties and, more specifically, on the SPE resistance of thick CrN-based coatings. It was found that the addition of Si significantly improved the erosion resistance and that SPE correlated with the microhardness values, i.e. the coating with the highest microhardness also had the lowest erosion rate (ER). In fact, the ERs showed a much higher dependence on the surface hardness than what has been proposed for brittle erosion mechanisms. In the first article, we study the effects of the particle properties on the SPE behavior of six brittle bulk materials using glass and alumina powders. First, we apply a robust methodology to accurately characterize the elasto-plastic and fracture properties of the studied materials. We then correlate the measured ER to materials' parameters with the help of a morphological study and an analysis of the quasi-static elasto-plastic erosion models. Finally, in order to understand the effects of impact on the particles themselves and to support the energy dissipation-based model proposed here, we study the particle size distributions of the powders before and after erosion testing. 
It is shown that tests using both powders lead to a material loss mechanism related to lateral fracture, that the higher-than-predicted velocity exponents point towards a velocity-dependent damage accumulation mechanism correlated with the target yield pressure, and that damage accumulation effects are more pronounced for the softer glass powder because of kinetic energy dissipation through different means. In the second article, we study the erosion mechanisms for several hard coatings deposited by pulsed DC magnetron sputtering. We first validate a new methodology for the accurate measurement of volume loss, and we show the importance of optimizing the testing parameters in order to obtain results free from experimental artefacts. We then correlate the measured ERs to the material parameters measured by depth-sensing indentation. In order to understand the material loss mechanisms, we study three of the coating systems in greater detail with the help of fracture characterization and a morphological study of the eroded surfaces. Finally, we study the particle size distributions of the powders before and after erosion testing in an effort to understand the role of particle fracture. We demonstrate that the measured ERs of the coatings are strongly dependent on the target hardness and do not correlate with coating toughness. In fact, the material removal mechanism is found to occur through repeated ductile indentation and cutting of the surface by the impacting particles, and particle breakup is not pronounced enough to influence the results significantly. Studying SPE mechanisms of hard protective coating systems in detail has proven to be quite challenging in the past, given that conventional SPE testing is notoriously inaccurate due to its aggressive nature and its many methodological uncertainties. 
In the third article, we present a novel in situ real-time erosion testing methodology using a quartz crystal microbalance, developed in order to study the SPE process of hard protective coating systems. Using conventional mass loss SPE testing, we validate and discuss the advantages and challenges related to such a method. In addition, this time-resolved technique enables us to discuss some transient events present during SPE testing of hard coating systems leading to new insights into the erosion process. (Abstract shortened by UMI.)
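    A recurring quantity in SPE analysis is the velocity exponent n in the empirical power law ER = k*v^n, conventionally obtained by a least-squares fit in log-log space; the "higher than predicted velocity exponents" mentioned above refer to this fit. A minimal sketch with synthetic data (the values are illustrative, not measurements):

```python
import math

def velocity_exponent(velocities, erosion_rates):
    """Fit ER = k * v**n by least squares in log-log space and
    return the velocity exponent n (the slope of log ER vs log v)."""
    xs = [math.log(v) for v in velocities]
    ys = [math.log(er) for er in erosion_rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic data generated with n = 2.5 (illustrative, not measured values)
v = [40.0, 60.0, 80.0, 100.0]
er = [0.01 * vi ** 2.5 for vi in v]
print(round(velocity_exponent(v, er), 2))  # 2.5
```

    Exponents recovered this way that exceed the values predicted by quasi-static elasto-plastic models are what the abstract interprets as evidence of velocity-dependent damage accumulation.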

  8. Education Research: The challenge of incorporating formal research methodology training in a neurology residency.

    PubMed

    Leira, E C; Granner, M A; Torner, J C; Callison, R C; Adams, H P

    2008-05-13

    Physicians often do not have a good understanding of research methodology. Unfortunately, the mechanism to achieve this important competency in a busy neurology residency program remains unclear. We tested the value and degree of acceptance by neurology residents of a multimodal educational intervention that consisted of biweekly teaching sessions in place of an existing journal club, as a way to provide formal training in research and statistical techniques. We used a pre- and post-test design, with an educational intervention in between, using neurology residents at the University of Iowa as subjects. Each test had 40 questions on research methodology. The educational intervention consisted of a biweekly, structured, topic-centered, research methodology-oriented elective seminar following a year-long predefined curriculum. An exit survey was offered to gather residents' perceptions of the course. While a majority of residents agreed that the intervention enhanced their knowledge of research methodology, only 23% attended more than 40% of the sessions. There was no difference between pretest and post-test scores (p = 0.40). Our experience suggests that, in order to accomplish the Accreditation Council for Graduate Medical Education goals regarding increasing residents' competency in knowledge about research methodology, a major restructuring of the neurology residency curriculum with more intense formal training would be necessary.

  9. Development of fuel wear tests using the Cameron-Plint High-Frequency reciprocating machine. Interim report, March 1988-May 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanakia, M.D.; Cuellar, J.P.; Lestz, S.J.

    The objectives of this program were to develop laboratory bench fuel-wear test methodology using JP-8 and to evaluate the effects of additives to improve the load-carrying capacity of JP-8 for use in diesel-powered ground equipment. A laboratory test using the Cameron-Plint High-Frequency Reciprocating machine evaluated the effects of various chemical and physical parameters influencing the lubricity of distillate fuels. The test conditions were determined to be sufficient to eliminate the effect of fluid physical properties such as viscosity. It was shown that the differences in the intrinsic lubricity of the fuels were due to small amounts of chemical additives. Under such conditions, the test can be used as a screening tool to find additives for enhancement of JP-8 lubricity. The test has the potential to ascertain a minimum lubricity level for diesel-powered ground equipment if these requirements are verified with field performance data and determined to be different from the Air Force JP-8 specifications. The dimensionless wear coefficients of Reference No. 2 diesel fuel were shown to be an order of magnitude lower than those of the jet fuels. In all cases, the wear rates of jet fuels and isoparaffinic solvents were improved by the addition of a corrosion inhibitor or antiwear additive to match the lower wear rates of the diesel fuels. Although there was no measurable change in the viscosities of the jet fuel due to the additives, the wear rates changed by an order of magnitude.

  10. The methodology study of time accelerated irradiation of elastomers

    NASA Astrophysics Data System (ADS)

    Ito, Masayuki

    2005-07-01

    This article studied methods to shorten irradiation time by increasing the dose rate without changing the relationship between dose and the properties of the degraded samples. The samples were nine kinds of EPDM with different compounding formulas. The samples were exposed to different doses of Co-γ rays, with a maximum dose of 2 MGy. The reference condition, against which two short-time test conditions were compared, was irradiation at 0.33 kGy/h at room temperature. The two methods shown below were studied as time-accelerated irradiation conditions.

  11. Nutrient Stress Detection in Corn Using Neural Networks and AVIRIS Hyperspectral Imagery

    NASA Technical Reports Server (NTRS)

    Estep, Lee

    2001-01-01

    AVIRIS image cube data have been processed for the detection of nutrient stress in corn, both by known ratio-type algorithms and by trained neural networks. The USDA Shelton, NE, ARS Variable Rate Nitrogen Application (VRAT) experimental farm was the site used in the study. Upon application of ANOVA and Dunnett multiple comparison tests to the outcomes of both the neural network processing and the ratio-type algorithms, it was found that the neural network methodology provides a better overall capability to separate nutrient-stressed crops from in-field controls.

  12. Methodological Review of Intimate Partner Violence Prevention Research

    ERIC Educational Resources Information Center

    Murray, Christine E.; Graybeal, Jennifer

    2007-01-01

    The authors present a methodological review of empirical program evaluation research in the area of intimate partner violence prevention. The authors adapted and utilized criterion-based rating forms to standardize the evaluation of the methodological strengths and weaknesses of each study. The findings indicate that the limited amount of…

  13. 76 FR 72134 - Annual Charges for Use of Government Lands

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-22

    ... revise the methodology used to compute these annual charges. Under the proposed rule, the Commission would create a fee schedule based on the U.S. Bureau of Land Management's (BLM) methodology for calculating rental rates for linear rights of way. This methodology includes a land value per acre, an...

  14. 45 CFR 98.101 - Case Review Methodology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Case Review Methodology. 98.101 Section 98.101 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.101 Case Review Methodology. (a) Case Reviews and Sampling—In preparing...

  15. 45 CFR 98.101 - Case Review Methodology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Case Review Methodology. 98.101 Section 98.101 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.101 Case Review Methodology. (a) Case Reviews and Sampling—In preparing...

  16. 45 CFR 98.101 - Case Review Methodology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Case Review Methodology. 98.101 Section 98.101 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.101 Case Review Methodology. (a) Case Reviews and Sampling—In preparing...

  17. 45 CFR 98.101 - Case Review Methodology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Case Review Methodology. 98.101 Section 98.101 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.101 Case Review Methodology. (a) Case Reviews and Sampling—In preparing...

  18. 45 CFR 98.101 - Case Review Methodology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Case Review Methodology. 98.101 Section 98.101 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Error Rate Reporting § 98.101 Case Review Methodology. (a) Case Reviews and Sampling—In preparing...

  19. Altered gas-exchange at peak exercise in obese adolescents: implications for verification of effort during cardiopulmonary exercise testing.

    PubMed

    Marinus, Nastasia; Bervoets, Liene; Massa, Guy; Verboven, Kenneth; Stevens, An; Takken, Tim; Hansen, Dominique

    2017-12-01

    Cardiopulmonary exercise testing is advised ahead of exercise intervention in obese adolescents to assess medical safety of exercise and physical fitness. Optimal validity and reliability of test results are required to identify maximal exercise effort. As fat oxidation during exercise is disturbed in obese individuals, it remains an unresolved methodological issue whether the respiratory gas exchange ratio (RER) is a valid marker for maximal effort during exercise testing in this population. RER during maximal exercise testing (RERpeak) and RER trajectories were compared between obese and lean adolescents, and relationships between RERpeak, RER slope and subject characteristics (age, gender, Body Mass Index [BMI], Tanner stage, physical activity level) were explored. Thirty-four obese (BMI: 35.1±5.1 kg/m²) and 18 lean (BMI: 18.8±1.9 kg/m²) adolescents (aged 12-18 years) performed a maximal cardiopulmonary exercise test on a bicycle ergometer, with comparison of oxygen uptake (VO2), heart rate (HR), expiratory volume (VE), carbon dioxide output (VCO2), and cycling power output (W). RERpeak (1.09±0.06 vs. 1.14±0.06 in obese vs. lean adolescents, respectively) and RER slope (0.03±0.01 vs. 0.05±0.01 per 10% increase in VO2, in obese vs. lean adolescents, respectively) were significantly lower in obese adolescents and independently related to BMI (P<0.05). Adjusted for HRpeak and VEpeak, RERpeak and RER slope remained significantly lower in obese adolescents (P<0.05). RER trajectories (in relation to %VO2peak and %Wpeak) were significantly different between groups (P<0.001). RERpeak is significantly lowered in obese adolescents. This may have important methodological implications for cardiopulmonary exercise testing in this population.
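    The respiratory exchange ratio discussed above is simply the ratio of carbon dioxide output to oxygen uptake, RER = VCO2/VO2, computed at each sample of the incremental test. A minimal sketch using illustrative breath-averaged values, not the study's data:

```python
def rer_peak(vo2, vco2):
    """Respiratory exchange ratio RER = VCO2/VO2 at each sample;
    return the value at peak exercise (last sample of an incremental
    test) along with the full trajectory."""
    trajectory = [c / o for o, c in zip(vo2, vco2)]
    return trajectory[-1], trajectory

# Illustrative breath-averaged values (L/min), not the study's data
vo2  = [1.0, 1.6, 2.2, 2.6]
vco2 = [0.85, 1.45, 2.3, 2.9]
peak, trajectory = rer_peak(vo2, vco2)
print(round(peak, 2))  # 1.12
```

    A common verification criterion treats RERpeak above roughly 1.0-1.1 as evidence of maximal effort; the study's point is that such fixed thresholds may misclassify obese adolescents, whose RERpeak is systematically lower.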

  20. An investigation of the direct-drive method of susceptibility testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonn, R.H.

    1992-07-01

    The Naval Surface Weapons Laboratory has constructed a small electrical subsystem for the purpose of evaluating electrical upset from various electromagnetic sources. The subsystem consists of three boxes, two of which are intended to be illuminated by electromagnetic waves. The two illuminated boxes are connected by two unshielded cable bundles. The goal of the Navy test series is to expose the subsystem to electromagnetic illumination from several different types of excitation, document upset levels, and compare the results. Before its arrival at Sandia National Laboratories (SNL), the system was illuminated in a mode-stirred chamber and in an anechoic chamber. This effort was a continuation of that test program. The Sandia tests involved the test methodology referred to as bulk current injection (BCI). Because this is a poorly shielded, multiple-aperture system, the method was not expected to compare closely with the other test methods. The test results show that the BCI test methodology is a useful test technique for a subset of limited-aperture systems; that the methodology will produce incorrect answers when used improperly on complex systems; and that the methodology can produce accurate answers on simple systems with a well-controlled electromagnetic topology. This is a preliminary study and the results should be interpreted carefully.

  1. Suggestions for Job and Curriculum Ladders in Health Center Ambulatory Care: A Pilot Test of the Health Services Mobility Study Methodology.

    ERIC Educational Resources Information Center

    Gilpatrick, Eleanor

    This report contains the results of a pilot test which represents the first complete field test of methodological work begun in October 1967 under a Federal grant for the purpose of job analysis in the health services. This 4-year Health Services Mobility Study permitted basic research, field testing, practical application, and policy involvement…

  2. Concurrent measurement of "real-world" stress and arousal in individuals with psychosis: assessing the feasibility and validity of a novel methodology.

    PubMed

    Kimhy, David; Delespaul, Philippe; Ahn, Hongshik; Cai, Shengnan; Shikhman, Marina; Lieberman, Jeffrey A; Malaspina, Dolores; Sloan, Richard P

    2010-11-01

    Psychosis has been repeatedly suggested to be affected by increases in stress and arousal. However, there is a dearth of evidence supporting the temporal link between stress, arousal, and psychosis during "real-world" functioning. This paucity of evidence may stem from limitations of current research methodologies. Our aim was to test the feasibility and validity of a novel methodology designed to measure concurrent stress and arousal in individuals with psychosis during "real-world" daily functioning. Twenty patients with psychosis completed a 36-hour ambulatory assessment of stress and arousal. We used the experience sampling method with palm computers to assess stress (10 times per day, 10 AM → 10 PM) along with concurrent ambulatory measurement of cardiac autonomic regulation using a Holter monitor. The clocks of the palm computer and Holter monitor were synchronized, allowing the temporal linking of the stress and arousal data. We used power spectral analysis to determine the parasympathetic contributions to autonomic regulation and sympathovagal balance during the 5 minutes before and after each experience sample. Patients completed 79% of the experience samples (75% with valid concurrent arousal data). Momentary increases in stress were inversely correlated with concurrent parasympathetic activity (ρ = -.27, P < .0001) and positively correlated with sympathovagal balance (ρ = .19, P = .0008). Stress and heart rate were not significantly related (ρ = -.05, P = .3875). The findings support the feasibility and validity of our methodology in individuals with psychosis. The methodology offers a novel way to study, at high time resolution, the concurrent "real-world" interactions between stress, arousal, and psychosis. The authors discuss the methodology's potential applications and future research directions.
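    The inverse momentary stress-parasympathetic relationship reported above is a Spearman rank correlation. A self-contained sketch with synthetic data (all variable names and values below are illustrative, not the study's) shows the computation, ignoring rank ties for simplicity:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the rank-transformed
    data (rank ties ignored for simplicity)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(1)
stress = rng.normal(4.0, 1.5, 200)                         # momentary stress ratings
hf_power = 100.0 - 6.0 * stress + rng.normal(0, 10, 200)   # parasympathetic proxy
print(spearman_rho(stress, hf_power) < 0)  # higher stress, lower HF power
```

    In the study itself, each 5-minute spectral estimate of autonomic activity was paired with its temporally matched experience sample before correlating.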

  3. Test-treatment RCTs are susceptible to bias: a review of the methodological quality of randomized trials that evaluate diagnostic tests.

    PubMed

    Ferrante di Ruffano, Lavinia; Dinnes, Jacqueline; Sitch, Alice J; Hyde, Chris; Deeks, Jonathan J

    2017-02-24

    There is a growing recognition of the need to expand our evidence base for the clinical effectiveness of diagnostic tests. Many international bodies are calling for diagnostic randomized controlled trials to provide the most rigorous evidence of impact on patient health. Although these so-called test-treatment RCTs are very challenging to undertake due to their methodological complexity, they have not been subjected to a systematic appraisal of their methodological quality. The extent to which these trials may be producing biased results therefore remains unknown. We set out to address this issue by conducting a methodological review of published test-treatment trials to determine how often they implement adequate methods to limit bias and safeguard the validity of results. We ascertained all test-treatment RCTs published 2004-2007 and indexed in CENTRAL, including RCTs which randomized patients to diagnostic tests and measured patient outcomes after treatment. Tests used for screening, monitoring or prognosis were excluded. We assessed adequacy of sequence generation, allocation concealment and intention-to-treat, appropriateness of primary analyses, blinding and reporting of power calculations, and extracted study characteristics including the primary outcome. One hundred three trials compared 105 control with 119 experimental interventions and reported 150 primary outcomes. Randomization and allocation concealment were adequate in 57% and 37% of trials, respectively. Blinding was uncommon (patients 5%, clinicians 4%, outcome assessors 21%), as was an adequate intention-to-treat analysis (29%). Overall, 101 of 103 trials (98%) were at risk of bias, as judged using standard Cochrane criteria. Test-treatment trials are particularly susceptible to attrition, inadequate primary analyses, lack of blinding and under-powering. These weaknesses pose much greater methodological and practical challenges to conducting reliable RCT evaluations of test-treatment strategies than of standard treatment interventions. We suggest a cautious approach that first examines whether a test-treatment intervention can accommodate the methodological safeguards necessary to minimize bias, and highlight that test-treatment RCTs require different methods to ensure reliability than standard treatment trials. Please see the companion paper to this article: http://bmcmedresmethodol.biomedcentral.com/articles/10.1186/s12874-016-0286-0.

  4. Comparative evaluation of liquid-liquid extraction, solid-phase extraction and solid-phase microextraction for the gas chromatography-mass spectrometry determination of multiclass priority organic contaminants in wastewater.

    PubMed

    Robles-Molina, José; Gilbert-López, Bienvenida; García-Reyes, Juan F; Molina-Díaz, Antonio

    2013-12-15

    The European Water Framework Directive (WFD) 2000/60/EC establishes guidelines to control the pollution of surface water by drawing up a list of priority substances that pose a significant risk to or via the aquatic environment. In this article, the analytical performance of three different sample preparation methodologies for the determination of multiclass organic contaminants, including priority compounds from the WFD, in wastewater samples by gas chromatography-mass spectrometry was evaluated. The methodologies tested were: (a) liquid-liquid extraction (LLE) with n-hexane; (b) solid-phase extraction (SPE) with C18 cartridges and elution with ethyl acetate:dichloromethane (1:1 (v/v)); and (c) headspace solid-phase microextraction (HS-SPME) using two different fibers: polyacrylate and polydimethylsiloxane/carboxen/divinylbenzene. Identification and confirmation of the 57 compounds included in the study (comprising polycyclic aromatic hydrocarbons (PAHs), pesticides and other contaminants) were accomplished using gas chromatography tandem mass spectrometry (GC-MS/MS) with a triple quadrupole instrument operated in multiple reaction monitoring (MRM) mode. Three MS/MS transitions were selected for unambiguous confirmation of the target chemicals. The advantages and pitfalls of each method are discussed. For both the LLE and SPE procedures, the method was validated at two concentration levels (15 and 150 ng L(-1)), obtaining recovery rates in the range 70-120% for most of the target compounds. In terms of analyte coverage, results with HS-SPME were not satisfactory, since 14 of the compounds tested were not properly recovered and the overall performance was worse than that of the other two methods. The LLE, SPE and HS-SPME (using polyacrylate fiber) procedures also showed good linearity and precision. With any of the three methodologies tested, the limits of quantitation obtained for most of the detected compounds were in the low nanogram per liter range. © 2013 Elsevier B.V. All rights reserved.

  5. Evaluating Cross-National Metrics of Tertiary Graduation Rates for OECD Countries: A Case for Increasing Methodological Congruence and Data Comparability

    ERIC Educational Resources Information Center

    Heuser, Brian L.; Drake, Timothy A.; Owens, Taya L.

    2013-01-01

    By examining the different methods and processes by which national data gathering agencies compile and submit their findings to the Organization for Economic Cooperation and Development (OECD), the authors (1) assess the methodological challenges of accurately reporting tertiary completion and graduation rates cross-nationally; and (2) examine the…

  6. Developpement d'une plateforme de simulation et d'un pilote automatique - Application aux Cessna Citation X et Hawker 800XP

    NASA Astrophysics Data System (ADS)

    Ghazi, Georges

    This report presents several methodologies for the design of tools intended for the analysis of the stability and control of a business aircraft. First, a generic flight dynamics model was developed to predict the behavior of the aircraft following control-surface inputs or external disturbances. Different categories of wind were included in the simulation module to generate various scenarios and assess the efficiency of the autopilot. Besides being realistic, the flight model accounts for the variation of mass parameters with fuel consumption. A comparison with a certified Level D simulator from the company CAE Inc. validated this first stage with an acceptable success rate. With the dynamics validated, the next stage addressed stability around a flight condition. A static analysis was first carried out to find the trim conditions inside the flight envelope. Two linearization algorithms then generated the state-space models that approximate the decoupled (longitudinal and lateral) dynamics of the aircraft. To test the validity of the linear models, 1,500 comparisons with the nonlinear dynamics were performed, with a 100% success rate. The stability study highlighted the need for control systems, first to improve the performance of the aircraft and then to control its different axes. A methodology coupling a modern control technique (LQR) with a genetic algorithm is presented. This methodology yielded optimal controllers that satisfy a large number of specifications. Besides performing well, the controllers must be robust to uncertainties due to mass variation; an analysis of robustness using the theory of guardian maps was therefore applied to the uncertain dynamics. However, because one region of the flight envelope is too sensitive, some of these analyses are biased. Nevertheless, validation against the nonlinear dynamics demonstrated the robustness of the controllers over the entire flight envelope. Finally, the last stage of this project concerned the control laws for the autopilot. Once again, the proposed methodology is based on the combination of flight mechanics equations, control theory and a metaheuristic optimization method. Four detailed test scenarios illustrate the efficiency and robustness of the complete autopilot.
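    The LQR component of the methodology above can be illustrated generically. The sketch below computes a discrete-time LQR gain by iterating the Riccati recursion; the double-integrator plant and unit weights are stand-ins for illustration, not the Cessna Citation X or Hawker 800XP dynamics, and the thesis's genetic-algorithm weight tuning is omitted:

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Discrete-time LQR gain K via fixed-point iteration of the Riccati
    equation: P <- Q + A'P(A - BK), with K = (R + B'PB)^-1 B'PA."""
    P = Q.copy()
    for _ in range(iters):
        BtP = B.T @ P
        K = np.linalg.solve(R + BtP @ B, BtP @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Double-integrator plant discretized with a 0.1 s time step (illustrative only)
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
K = dlqr(A, B, Q=np.eye(2), R=np.array([[1.0]]))
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
print(np.all(np.abs(closed_loop_eigs) < 1.0))  # stable closed loop
```

    In a setting like the thesis's, a metaheuristic would then search over the Q and R weights until the resulting closed-loop response meets the handling-quality specifications.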

  7. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.

  8. Supplement to a Methodology for Succession Planning for Technical Experts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirk, Bernadette Lugue; Cain, Ronald A.; Agreda, Carla L.

    This report complements A Methodology for Succession Planning for Technical Experts (Ron Cain, Shaheen Dewji, Carla Agreda, Bernadette Kirk, July 2017), which describes a draft methodology for identifying and evaluating the loss of key technical skills at nuclear operations facilities. This report targets the methodology for identifying critical skills, and the methodology is tested through interviews with selected subject matter experts.

  9. Designing and testing regenerative pulp treatment strategies: modeling the transdentinal transport mechanisms

    PubMed Central

    Passos, Agathoklis D.; Mouza, Aikaterini A.; Paras, Spiros V.; Gogos, Christos; Tziafas, Dimitrios

    2015-01-01

    The need for simulation models to thoroughly test the inflammatory effects of dental materials and dentinogenic effects of specific signaling molecules has been well recognized in current dental research. The development of a model that simulates the transdentinal flow and the mass transfer mechanisms is of prime importance in terms of achieving the objectives of developing more effective treatment modalities in restorative dentistry. The present protocol study is part of an ongoing investigation on the development of a methodology that can calculate the transport rate of selected molecules inside a typical dentinal tubule. The transport rate of biological molecules has been investigated using a validated CFD code. In that framework we propose a simple algorithm that, given the type of molecules of the therapeutic agent and the maximum acceptable time for the drug concentration to attain a required value at the pulpal side of the tubules, can estimate the initial concentration to be imposed. PMID:26441676

  10. Optimization of enzyme complexes for efficient hydrolysis of corn stover to produce glucose.

    PubMed

    Yu, Xiaoxiao; Liu, Yan; Meng, Jiatong; Cheng, Qiyue; Zhang, Zaixiao; Cui, Yuxiao; Liu, Jiajing; Teng, Lirong; Lu, Jiahui; Meng, Qingfan; Ren, Xiaodong

    2015-05-01

    Hydrolysis of cellulose to glucose is the critical step in converting lignocellulose into industrial chemicals. To improve the conversion rate of corn stover cellulose to glucose, a cocktail of cellulase with other auxiliary enzymes and chemicals was studied in this work. Single-factor tests and Response Surface Methodology (RSM) were applied to optimize the enzyme mixture, targeting maximum glucose release from corn stover. In the single-factor tests, the glucan-to-glucose conversion rate reached its highest levels when cellulase was supplemented separately with 1.7 μl Tween-80/g cellulose, 300 μg β-glucosidase/g cellulose, 400 μg pectinase/g cellulose, or 0.75 mg/ml sodium thiosulphate. To further improve the glucan conversion, β-glucosidase, pectinase and sodium thiosulphate were selected for the next optimization step with RSM. The maximum yield increase was 45.8%, obtained at 377 μg/g cellulose Novozyme 188, 171 μg/g cellulose pectinase and 1 mg/ml sodium thiosulphate.

  11. Quantification of effective exoelectrogens by most probable number (MPN) in a microbial fuel cell.

    PubMed

    Heidrich, Elizabeth S; Curtis, Thomas P; Woodcock, Stephen; Dolfing, Jan

    2016-10-01

    The objective of this work was to quantify the number of exoelectrogens in wastewater capable of producing current in a microbial fuel cell by adapting the classical most probable number (MPN) methodology, using current production as the end point. Inoculating a series of microbial fuel cells with various dilutions of domestic wastewater and with acetate as test substrate yielded an apparent number of exoelectrogens of 17 per ml. Using current as a proxy for activity, the apparent exoelectrogen growth rate was 0.03h(-1). With starch or wastewater as more complex test substrates, similar apparent growth rates were obtained, but the apparent MPN-based numbers of exoelectrogens in wastewater were significantly lower, probably because, in contrast to acetate, complex substrates require complex food chains to deliver the electrons to the electrodes. Consequently, the apparent MPN is a function of the combined probabilities of members of the food chain being present. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
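    The classical MPN estimate adapted above (with current production as the positive/negative endpoint) is the maximum-likelihood solution of a serial-dilution model. A generic sketch (not the authors' code; the tube counts in the example are hypothetical) solves the likelihood equation by bisection:

```python
import math

def mpn_per_ml(dilutions):
    """Most probable number of organisms per ml by maximum likelihood.
    dilutions: list of (volume_ml, n_tubes, n_positive). Assumes at least one
    positive and at least one negative tube overall."""
    def score(lam):  # derivative of the log-likelihood w.r.t. density lam
        s = 0.0
        for v, n, g in dilutions:
            if g:
                s += g * v * math.exp(-lam * v) / (1.0 - math.exp(-lam * v))
            s -= (n - g) * v
        return s
    lo, hi = 1e-9, 1e9
    for _ in range(200):  # geometric bisection across many decades
        mid = math.sqrt(lo * hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return mid

# Single dilution, 3 of 5 tubes positive at 1 ml: closed form is ln(5/2)
print(round(mpn_per_ml([(1.0, 5, 3)]), 3))
```

    For a single dilution this reduces to the closed form -ln(fraction of negative tubes)/volume, which the example reproduces.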

  12. Real-time management of an urban groundwater well field threatened by pollution.

    PubMed

    Bauser, Gero; Franssen, Harrie-Jan Hendricks; Kaiser, Hans-Peter; Kuhlmann, Ulrich; Stauffer, Fritz; Kinzelbach, Wolfgang

    2010-09-01

    We present an optimal real-time control approach for the management of drinking water well fields. The methodology is applied to the Hardhof field in the city of Zurich, Switzerland, which is threatened by diffuse pollution. The risk of attracting pollutants is higher if the pumping rate is increased and can be reduced by increasing artificial recharge (AR) or by adaptive allocation of the AR. The method was first tested in offline simulations with a three-dimensional finite element variably saturated subsurface flow model for the period January 2004-August 2005. The simulations revealed that (1) optimal control results were more effective than the historical control results and (2) the spatial distribution of AR should be different from the historical one. Next, the methodology was extended to a real-time control method based on the Ensemble Kalman Filter method, using 87 online groundwater head measurements, and tested at the site. The real-time control of the well field resulted in a decrease of the electrical conductivity of the water at critical measurement points which indicates a reduced inflow of water originating from contaminated sites. It can be concluded that the simulation and the application confirm the feasibility of the real-time control concept.
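    The Ensemble Kalman Filter analysis step at the heart of the real-time method above can be sketched generically (the state and observation dimensions below are toy values, not the 87-sensor Hardhof model):

```python
import numpy as np

def enkf_update(X, y, H, r, rng):
    """One EnKF analysis step with perturbed observations.
    X: (n_state, n_ens) forecast ensemble, y: (n_obs,) measurement vector,
    H: (n_obs, n_state) observation operator, r: observation error variance."""
    n_obs, n_ens = H.shape[0], X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)                 # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                             # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + r * np.eye(n_obs))  # Kalman gain
    Y = y[:, None] + rng.normal(0.0, np.sqrt(r), (n_obs, n_ens))  # perturbed obs
    return X + K @ (Y - H @ X)

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, (2, 100))   # forecast ensemble: 2 states, 100 members
H = np.array([[1.0, 0.0]])           # observe the first state only
y = np.array([5.0])
Xa = enkf_update(X, y, H, 0.1, rng)
print(abs(Xa[0].mean() - 5.0) < abs(X[0].mean() - 5.0))  # pulled toward the data
```

    In the application described above, the ensemble of groundwater-flow model states would be updated in the same way each time the online head measurements arrive.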

  13. Prediction of driving capacity after traumatic brain injury: a systematic review.

    PubMed

    Ortoleva, Claudia; Brugger, Camille; Van der Linden, Martial; Walder, Bernhard

    2012-01-01

    To review the current evidence on predictors for the ability to return to driving after traumatic brain injury. Systematic searches were conducted in MEDLINE, PsycINFO, EMBASE, and CINAHL up to March 1, 2010. Studies were rigorously rated for their methodological content and quality and standardized data were extracted from eligible studies. We screened 2341 articles, of which 7 satisfied our inclusion criteria. Five studies were of limited quality because of undefined, unrepresentative samples and/or absence of blinding. Studies mentioned 38 candidate predictors and tested 37. The candidate predictors most frequently mentioned were "selective attention" and "divided attention" in 4/7 studies, and "executive functions" and "processing speed," both in 3/7 studies. No association with driving was observed for 19 candidate predictors. Eighteen candidate predictors from 3 domains were associated with driving capacity: patient and trauma characteristics, neuropsychological assessments, and general assessments; 10 candidate predictors were tested in only one study and 8 in more than one study. The results of associations were contradictory for all but one: time between trauma and driving evaluation. There is no sound basis at present for predicting driving capacity after traumatic brain injury because most studies have methodological limitations.

  14. Testing CREATE at Community Colleges: An Examination of Faculty Perspectives and Diverse Student Gains

    PubMed Central

    Kenyon, Kristy L.; Onorato, Morgan E.; Gottesman, Alan J.; Hoque, Jamila; Hoskins, Sally G.

    2016-01-01

    CREATE (Consider, Read, Elucidate the hypotheses, Analyze and interpret the data, and Think of the next Experiment) is an innovative pedagogy for teaching science through the intensive analysis of scientific literature. Initiated at the City College of New York, a minority-serving institution, and regionally expanded in the New York/New Jersey/Pennsylvania area, this methodology has had multiple positive impacts on faculty and students in science, technology, engineering, and mathematics courses. To determine whether the CREATE strategy is effective at the community college (2-yr) level, we prepared 2-yr faculty to use CREATE methodologies and investigated CREATE implementation at community colleges in seven regions of the United States. We used outside evaluation combined with pre/postcourse assessments of students to test related hypotheses: 1) workshop-trained 2-yr faculty teach effectively with the CREATE strategy in their first attempt, and 2) 2-yr students in CREATE courses make cognitive and affective gains during their CREATE quarter or semester. Community college students demonstrated positive shifts in experimental design and critical-thinking ability concurrent with gains in attitudes/self-rated learning and maturation of epistemological beliefs about science. PMID:26931399

  15. Locating the Places People Meet New Sexual Partners in a Southern US City to Inform HIV/STI Prevention and Testing Efforts

    PubMed Central

    Khan, Maria R.; Tisdale, Caroline; Norcott, Kathy; Duncan, Jesse; Kaplan, Andrew M.; Weir, Sharon S.

    2012-01-01

    Places where people meet new sex partners can be venues for the delivery of individual and environmental interventions that aim to reduce transmission of HIV and other sexually transmitted infections (STI). Using the Priorities for Local AIDS Control Efforts (PLACE) methodology, we identified and characterized venues where people in a southeastern US city with a high prevalence of both HIV and STI go to meet new sexual partners. A total of 123 community informants identified 143 public, private and commercial venues where people meet sex partners. Condoms were available at 14% of the venues, although 48% of venue representatives expressed a willingness to host HIV prevention efforts. Interviews with 373 people (229 men, 144 women) socializing at a random sample of 54 venues found high rates of HIV risk behaviors, including concurrent sexual partnerships, transactional sex and illicit substance abuse. Risk behaviors were more common at certain venue types, including some that may be overlooked by public health outreach efforts. The systematic methodology was successful in locating venues where risky encounters are established and revealed opportunities for targeted HIV prevention and testing programs as well as research. PMID:20614175

  16. Development of smart textiles with embedded fiber optic chemical sensors

    NASA Astrophysics Data System (ADS)

    Khalil, Saif E.; Yuan, Jianming; El-Sherif, Mahmoud A.

    2004-03-01

    Smart textiles are defined as textiles capable of monitoring their own health conditions or structural behavior, as well as sensing external environmental conditions, and they appear to be a future focus of the textile industry. As technology advances, textiles are proving more useful and practical for potential advanced technologies. The majority of textiles are used in the clothing industry, which gave rise to the idea of smart clothes for various applications; examples include medical trauma assessment, monitoring of medical patients (heart and respiration rates), and environmental monitoring for public safety officials. Fiber optics have played a major role in the development of smart textiles, as they have in smart structures in general. The integration of optical fibers into textile structures (knitted, woven, and non-woven) is presented, and the proper methodology for the manufacturing of smart textiles is defined. Samples of fabrics with integrated optical fibers were processed and tested for optical signal transmission, in order to investigate the effect of textile production procedures on optical fiber performance. The tests proved the effectiveness of the developed methodology for integrating optical fibers without changing their optical performance or structural integrity.

  17. PCR-based Methodologies Used to Detect and Differentiate the Burkholderia pseudomallei complex: B. pseudomallei, B. mallei, and B. thailandensis.

    PubMed

    Lowe, Woan; March, Jordon K; Bunnell, Annette J; O'Neill, Kim L; Robison, Richard A

    2014-01-01

    Methods for the rapid detection and differentiation of the Burkholderia pseudomallei complex comprising B. pseudomallei, B. mallei, and B. thailandensis, have been the topic of recent research due to the high degree of phenotypic and genotypic similarities of these species. B. pseudomallei and B. mallei are recognized by the CDC as tier 1 select agents. The high mortality rates of glanders and melioidosis, their potential use as bioweapons, and their low infectious dose, necessitate the need for rapid and accurate detection methods. Although B. thailandensis is generally avirulent in mammals, this species displays very similar phenotypic characteristics to that of B. pseudomallei. Optimal identification of these species remains problematic, due to the difficulty in developing a sensitive, selective, and accurate assay. The development of PCR technologies has revolutionized diagnostic testing and these detection methods have become popular due to their speed, sensitivity, and accuracy. The purpose of this review is to provide a comprehensive overview and evaluation of the advancements in PCR-based detection and differentiation methodologies for the B. pseudomallei complex, and examine their potential uses in diagnostic and environmental testing.

  18. Psychometric evaluation of commonly used game-specific skills tests in rugby: A systematic review

    PubMed Central

    Oorschot, Sander; Chiwaridzo, Matthew; CM Smits-Engelsman, Bouwien

    2017-01-01

    Objectives To (1) give an overview of commonly used game-specific skills tests in rugby and (2) evaluate the available psychometric information on these tests. Methods The databases PubMed, MEDLINE, CINAHL and Africa Wide Information were systematically searched for articles published between January 1995 and March 2017. First, commonly used game-specific skills tests were identified. Second, the available psychometrics of these tests were evaluated and the methodological quality of the studies was assessed using the Consensus-based Standards for the selection of health Measurement Instruments checklist. Studies included in the first step had to report detailed information on the construct and testing procedure of at least one game-specific skill; studies included in the second step additionally had to report at least one psychometric property evaluating reliability, validity or responsiveness. Results 287 articles were identified in the first step, of which 30 met the inclusion criteria; 64 articles were identified in the second step, of which 10 were included. Reactive agility, tackling and simulated rugby games were the most commonly used tests. All 10 studies reporting psychometrics reported reliability outcomes, revealing mainly strong evidence; however, all of these studies scored poor or fair on methodological quality. Four studies reported validity outcomes, indicating mainly moderate evidence, but all had fair methodological quality. Conclusion Game-specific skills tests showed mainly strong evidence of reliability and validity, but the underlying studies lacked methodological quality. Reactive agility seems to be a promising domain, but the specific tests need further development. Future studies of high methodological quality are required in order to develop valid and reliable test batteries for rugby talent identification. Trial registration number PROSPERO CRD42015029747. PMID:29259812

  19. Women's self-rated attraction to male faces does not correspond with physiological arousal.

    PubMed

    Hagerman, S; Woolard, Z; Anderson, K; Tatler, B W; Moore, F R

    2017-10-19

    There has been little work to determine whether attractiveness ratings of faces correspond to sexual or more general attraction. We tested whether a measure of women's physiological arousal (pupil diameter change) was correlated with ratings of men's facial attractiveness. In Study 1, women rated the faces of men for whom we also measured salivary testosterone. They rated each face for attractiveness, and for desirability for friendship and long- and short-term romantic relationships. Pupil diameter change was not related to subjective ratings of attractiveness, but was positively correlated with the men's testosterone. In Study 2 we compared women's pupil diameter change in response to the faces of men with high versus low testosterone, as well as in response to non-facial images pre-rated as either sexually arousing or threatening. Pupil dilation was not affected by testosterone, and increased relatively more in response to sexually arousing than threatening images. We conclude that self-rated preferences may not provide a straightforward and direct assessment of sexual attraction. We argue that future work should identify the constructs that are tapped via attractiveness ratings of faces, and support the development of methodology which assesses objective sexual attraction.

  20. Rating of Dynamic Coefficient for Simple Beam Bridge Design on High-Speed Railways

    NASA Astrophysics Data System (ADS)

    Diachenko, Leonid; Benin, Andrey; Smirnov, Vladimir; Diachenko, Anastasia

    2018-06-01

    The aim of this work is to improve the methodology for the dynamic analysis of simple beam spans under the impact of high-speed trains. Mathematical simulation using numerical and analytical methods of structural mechanics is employed in the research. The article analyses the parameters of the effect of high-speed trains on simple beam bridge spans and suggests a technique for determining the dynamic coefficient applied to the live load. The reliability of the proposed methodology is confirmed by the results of numerical simulation of high-speed train passage over spans at different speeds. The proposed algorithm for dynamic computation is based on the connection between the maximum acceleration of the span in the resonant mode of vibration and the main factors of the stress-strain state. The methodology yields both maximum and minimum values of the main internal forces in the structure, which makes endurance checks possible. It is noted that the dynamic increments for the components of the stress-strain state (bending moments, shear force and vertical deflections) differ; this necessitates a differentiated approach to the evaluation of dynamic coefficients when performing design verification for limit states of groups I and II. Practical importance: the methodology for determining the dynamic coefficients allows the dynamic computation and determination of the main internal forces in simple beam spans without numerical simulation and direct dynamic analysis, which significantly reduces design labour costs.

  1. Towards Intelligent Interpretation of Low Strain Pile Integrity Testing Results Using Machine Learning Techniques.

    PubMed

    Cui, De-Mi; Yan, Weizhong; Wang, Xiao-Quan; Lu, Lie-Min

    2017-10-25

    Low strain pile integrity testing (LSPIT), due to its simplicity and low cost, is one of the most popular NDE methods used in pile foundation construction. While performing LSPIT in the field is generally quite simple and quick, determining the integrity of the test piles by analyzing and interpreting the test signals (reflectograms) is still a manual process performed only by experienced experts. For foundation construction sites where the number of piles to be tested is large, it may take days before the expert can finish interpreting all of the piles and deliver the integrity assessment report. Techniques that can automate test signal interpretation, thus shortening LSPIT's turnaround time, are of great business value and in great demand. Motivated by this need, we develop in this paper a computer-aided reflectogram interpretation (CARI) methodology that can interpret a large number of LSPIT signals quickly and consistently. The methodology, built on advanced signal processing and machine learning technologies, can be used to assist experts in performing both qualitative and quantitative interpretation of LSPIT signals. Specifically, the methodology can ease the experts' interpretation burden by screening all test piles quickly and identifying a small number of suspect piles for manual, in-depth interpretation. We demonstrate the methodology's effectiveness using LSPIT signals collected from a number of real-world pile construction sites. The proposed methodology can potentially enhance LSPIT, making it even more efficient and effective for quality control of deep foundation construction.

  2. Data mining of text as a tool in authorship attribution

    NASA Astrophysics Data System (ADS)

    Visa, Ari J. E.; Toivonen, Jarmo; Autio, Sami; Maekinen, Jarno; Back, Barbro; Vanharanta, Hannu

    2001-03-01

    It is common for text documents to be characterized and classified by keywords assigned by their authors. Visa et al. have developed a new methodology based on prototype matching. The prototype is an interesting document, or part of an interesting extracted text, which is matched against the document database of the monitored document flow. The new methodology is capable of extracting the meaning of a document to a certain degree. Our claim is that the methodology is also capable of authenticating authorship. To verify this claim, two tests were designed. The test hypothesis was that the words and the word order in sentences could authenticate the author. In the first test, three authors were selected: William Shakespeare, Edgar Allan Poe, and George Bernard Shaw. Three texts from each author were examined; each text was used in turn as a prototype, and the two nearest matches to the prototype were noted. The second test used the Reuters-21578 financial news database, examining a group of 25 short financial news reports from five different authors. Our new methodology and the interesting results of the two tests are reported in this paper. In the first test, all cases were successful for Shakespeare and Poe; for Shaw, one text was confused with Poe. In the second test, the Reuters-21578 financial news items were attributed to their authors relatively well. The conclusion is that our text mining methodology seems capable of authorship attribution.

  3. Acceleration-based methodology to assess the blast mitigation performance of explosive ordnance disposal helmets

    NASA Astrophysics Data System (ADS)

    Dionne, J. P.; Levine, J.; Makris, A.

    2018-01-01

    To design the next generation of blast mitigation helmets that offer increasing levels of protection against explosive devices, manufacturers must be able to rely on appropriate test methodologies and human surrogates that will differentiate the performance of various helmet solutions and ensure user safety. Ideally, such test methodologies and associated injury thresholds should be based on widely accepted injury criteria relevant within the context of blast. Unfortunately, even though significant research has taken place over the last decade in the area of blast neurotrauma, there is currently no agreement on injury mechanisms for blast-induced traumatic brain injury. In the absence of such widely accepted test methods and injury criteria, the current study presents a specific blast test methodology focused on explosive ordnance disposal protective equipment, using the readily available Hybrid III mannequin originally developed for the automotive industry. The questionable applicability of the associated brain injury criteria (based on both linear and rotational head acceleration) is discussed in the context of blast. Test results encompassing a large number of blast configurations and personal protective equipment are presented, emphasizing the possibility of developing useful correlations between blast parameters, such as the scaled distance, and mannequin engineering measurements (head acceleration). Suggestions are put forward for a practical standardized blast testing methodology that takes into account limitations in the applicability of acceleration-based injury criteria as well as the inherent variability in blast testing results.
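The "scaled distance" referred to above is the standard Hopkinson-Cranz blast scaling parameter, Z = R / W^(1/3), where R is the standoff distance and W the charge mass. A minimal sketch (the study's correlations with head acceleration are not reproduced here; the example values are hypothetical):

```python
def scaled_distance(standoff_m: float, charge_kg: float) -> float:
    """Hopkinson-Cranz scaled distance Z = R / W**(1/3), in m/kg^(1/3).

    This is the blast parameter the abstract correlates with
    mannequin head-acceleration measurements.
    """
    return standoff_m / charge_kg ** (1.0 / 3.0)

# Example: a 4 m standoff from an 8 kg charge gives Z = 2 m/kg^(1/3).
z = scaled_distance(4.0, 8.0)
```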

  4. Does Maltreatment Beget Maltreatment? A Systematic Review of the Intergenerational Literature

    PubMed Central

    Thornberry, Terence P.; Knight, Kelly E.; Lovegrove, Peter J.

    2014-01-01

    In this paper, we critically review the literature testing the cycle of maltreatment hypothesis which posits continuity in maltreatment across adjacent generations. That is, we examine whether a history of maltreatment victimization is a significant risk factor for the later perpetration of maltreatment. We begin by establishing 11 methodological criteria that studies testing this hypothesis should meet. They include such basic standards as using representative samples, valid and reliable measures, prospective designs, and different reporters for each generation. We identify 47 studies that investigated this issue and then evaluate them with regard to the 11 methodological criteria. Overall, most of these studies report findings consistent with the cycle of maltreatment hypothesis. Unfortunately, at the same time, few of them satisfy the basic methodological criteria that we established; indeed, even the stronger studies in this area only meet about half of them. Moreover, the methodologically stronger studies present mixed support for the hypothesis. As a result, the positive association often reported in the literature appears to be based largely on the methodologically weaker designs. Based on our systematic methodological review, we conclude that this small and methodologically weak body of literature does not provide a definitive test of the cycle of maltreatment hypothesis. We conclude that it is imperative to develop more robust and methodologically adequate assessments of this hypothesis to more accurately inform the development of prevention and treatment programs. PMID:22673145

  5. Characterization of human passive muscles for impact loads using genetic algorithm and inverse finite element methods.

    PubMed

    Chawla, A; Mukherjee, S; Karthikeyan, B

    2009-02-01

    The objective of this study is to identify the dynamic material properties of human passive muscle tissues at the strain rates relevant to automobile crashes. A novel methodology involving a genetic algorithm (GA) and the finite element method is implemented to estimate the material parameters by inverse mapping of the impact test data. Isolated unconfined impact tests at average strain rates ranging from 136 s^-1 to 262 s^-1 are performed on muscle tissues. Passive muscle tissues are modelled as an isotropic, linear, viscoelastic material using the three-element Zener model available in the PAMCRASH(TM) explicit finite element software. In the GA-based identification process, fitness values are calculated by comparing the estimated finite element forces with the measured experimental forces. Linear viscoelastic material parameters (bulk modulus, short-term shear modulus and long-term shear modulus) are thus identified at strain rates of 136 s^-1, 183 s^-1 and 262 s^-1 for modelling muscles. The optimal parameters extracted in this study are comparable with parameters reported in the literature. Bulk modulus and short-term shear modulus are found to be more influential than long-term shear modulus in predicting the stress-strain response at the considered strain rates. Variations within the sets of parameters identified at different strain rates indicate the need for a new or improved material model capable of capturing the strain-rate dependency of passive muscle response with a single set of material parameters over a wide range of strain rates.
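The inverse-identification loop described above can be sketched in miniature. The sketch below is an assumption-laden stand-in: a closed-form one-dimensional Zener (standard linear solid) stress response under constant strain rate replaces the PAM-CRASH finite element solve, SciPy's differential evolution stands in for the GA, and all parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import differential_evolution

def zener_stress(t, g_inf, g0, tau, strain_rate):
    # Stress under constant strain rate for a three-element Zener solid
    # with relaxation modulus G(t) = g_inf + (g0 - g_inf) * exp(-t/tau):
    # sigma(t) = rate * [g_inf*t + (g0 - g_inf)*tau*(1 - exp(-t/tau))]
    return strain_rate * (g_inf * t
                          + (g0 - g_inf) * tau * (1.0 - np.exp(-t / tau)))

# Synthetic "experiment" standing in for the impact test data.
t = np.linspace(1e-4, 5e-3, 50)      # 5 ms loading window
true = (5.0e3, 40.0e3, 1.0e-3)       # g_inf [Pa], g0 [Pa], tau [s]
rate = 136.0                         # strain rate, s^-1
measured = zener_stress(t, *true, rate)

def fitness(params):
    # GA-style fitness: squared mismatch between model and "measured" data.
    return float(np.sum((zener_stress(t, *params, rate) - measured) ** 2))

bounds = [(1e3, 1e4), (1e4, 1e5), (1e-4, 1e-2)]
result = differential_evolution(fitness, bounds, seed=1)
g_inf_hat, g0_hat, tau_hat = result.x
```

With noise-free synthetic data the evolutionary search recovers the generating parameters; on real test data the fitness would compare simulated and measured force histories instead.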

  6. Effect of mechanical properties on erosion resistance of ductile materials

    NASA Astrophysics Data System (ADS)

    Levin, Boris Feliksovih

    Solid particle erosion (SPE) resistance of ductile Fe-, Ni-, and Co-based alloys as well as commercially pure Ni and Cu was studied. A model for the SPE behavior of ductile materials is presented. The model incorporates the mechanical properties of the materials at the deformation conditions associated with the SPE process, as well as the evolution of these properties during erosion-induced deformation. An erosion parameter was formulated based on consideration of the energy loss during erosion, and incorporates the material's hardness and toughness at high strain rates. The erosion model predicts that materials combining high hardness and toughness can exhibit good erosion resistance. To measure the mechanical properties of the materials, high strain rate compression tests using the Hopkinson bar technique were conducted at strain rates similar to those occurring during erosion. From these tests, failure strength and strain during erosion were estimated and used to calculate the toughness of the materials. The proposed erosion parameter shows good correlation with experimentally measured erosion rates for all tested materials. To analyze subsurface deformation during erosion, microhardness and nanoindentation tests were performed on cross-sections of the eroded materials, and the size of the plastically deformed zone and the increase in material hardness due to erosion were determined. A nanoindentation method was developed to estimate the restitution coefficient within plastically deformed regions of the eroded samples, which provides a measure of the rebounding ability of a material during particle impact. An increase in hardness near the eroded surface led to an increase in the restitution coefficient. Also, the strain rates imposed below the eroded surface were comparable to those measured during high strain-rate compression tests (10^3-10^4 s^-1). A new parameter, the "area under the microhardness curve", was developed that represents the ability of a material to absorb impact energy. 
By incorporating this parameter into a new erosion model, good correlation was observed with experimentally measured erosion rates. An increase in area under the microhardness curve led to an increase in erosion resistance. It was shown that an increase in hardness below the eroded surface occurs mainly due to the strain-rate hardening effect. Strain-rate sensitivities of tested materials were estimated from the nanoindentation tests and showed a decrease with an increase in materials hardness. Also, materials combining high hardness and strain-rate sensitivity may offer good erosion resistance. A methodology is presented to determine the proper mechanical properties to incorporate into the erosion parameter based on the physical model of the erosion mechanism in ductile materials.

  7. [The methodological assessment and qualitative evaluation of psychometric performance tests based on the example of modern tests that assess reading and spelling skills].

    PubMed

    Galuschka, Katharina; Rothe, Josefine; Schulte-Körne, Gerd

    2015-09-01

    This article looks at a means of objectively evaluating the quality of psychometric tests. The approach enables users to evaluate psychometric tests based on their methodological characteristics in order to decide which instrument should be used; reading and spelling assessment tools serve as examples. The paper also provides a review of German psychometric tests for the assessment of reading and spelling skills. This method facilitates the identification of psychometric tests of high methodological quality which can be used for the assessment of reading and spelling skills. Reading performance should ideally be assessed with the following instruments: ELFE 1-6, LGVT 6-12, LESEN 6-7, LESEN 8-9, or WLLP-R. The tests to be used for the evaluation of spelling skills are DERET 1-2+, DERET 3-4+, WRT 1+, WRT 2+, WRT 3+, WRT 4+ or HSP 1-10.

  8. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Xiao-Ying; Yao, Juan; He, Hua

    2012-01-01

    Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements designed to improve its application of Health Code Numbers (HCNs) and to employ weighting factors to reduce over-conservatism.

  9. Development and testing of methodology for evaluating the performance of multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Polotzky, Anthony S.; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1990-01-01

    The development of a controller performance evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems is described. The equations used to obtain the open-loop plant, controller transfer matrices, and return-difference matrices are given. Results of applying the CPE methodology to evaluate MIMO digital flutter suppression systems being tested on an active flexible wing wind-tunnel model are presented to demonstrate the CPE capability.

  10. System testing of a production Ada (trademark) project: The GRODY study

    NASA Technical Reports Server (NTRS)

    Seigle, Jeffrey; Esker, Linda; Shi, Ying-Liang

    1990-01-01

    The use of the Ada language and of design methodologies that utilize its features has a strong impact on all phases of the software development project lifecycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in the parallel development of two flight dynamics systems in FORTRAN and Ada. The teams found some qualitative differences between the system test phases of the two projects. Although the planning and conduct of system testing were not generally affected by the use of Ada, solving the problems found in system testing was generally facilitated by Ada constructs and the design methodology. Most problems found in system testing were due not to difficulty with the language or methodology but to lack of experience with the application.

  11. Cassini's Test Methodology for Flight Software Verification and Operations

    NASA Technical Reports Server (NTRS)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) and its development has been an ongoing effort, from the design, development and finally operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  12. A Test Method for Monitoring Modulus Changes during Durability Tests on Building Joint Sealants

    Treesearch

    Christopher C. White; Donald L. Hunston; Kar Tean Tan; Gregory T. Schueneman

    2012-01-01

    The durability of building joint sealants is generally assessed using a descriptive methodology involving visual inspection of exposed specimens for defects. It is widely known that this methodology has inherent limitations, including that the results are qualitative. A new test method is proposed that provides more fundamental and quantitative information about...

  13. Integrated HTA-FMEA/FMECA methodology for the evaluation of robotic system in urology and general surgery.

    PubMed

    Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea

    2016-11-14

    This study proposes and tests an integrated methodology combining Health Technology Assessment (HTA) with Failure Modes, Effects and Criticality Analysis (FMEA/FMECA) for the assessment of specific aspects of robotic surgery involving safety, process and technology. The integrated methodology applies specific techniques from HTA together with the most typical reliability-engineering models, such as FMEA/FMECA. The study also included on-site data collection and interviews with medical personnel. The total number of robotic procedures included in the analysis was 44: 28 in urology and 16 in general surgery. The main outcomes concern the comparative evaluation of robotic, laparoscopic and open surgery; risk analysis and mitigation interventions come from the FMECA application. The small sample size available for the study represents an important bias, especially for the reliability of the clinical outcomes. Despite this, the study seems to confirm the better trend in surgical times for robotics compared with the open technique, as well as confirming the clinical benefits of robotics in urology. The situation is more complex in general surgery, where the only directly measured clinical benefit of robotics is a lower blood transfusion rate.

  14. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    PubMed

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process's non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme for model selection from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and the true partition, respectively. Extensions to general history-dependent point processes are discussed.
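For a piecewise-constant rate template, the generalized likelihood ratio statistic can be written down directly: with n_j events in segment j of length Δ_j, the segment MLE is λ̂_j = n_j/Δ_j, and the statistic against a constant-rate null λ̂_0 = n/T is 2 Σ_j n_j log(λ̂_j/λ̂_0). A minimal sketch (the paper's basis-function templates, multiple-testing scheme and dynamic program are not reproduced; the event times are hypothetical):

```python
import math

def glr_statistic(event_times, breakpoints, horizon):
    """GLR statistic for a piecewise-constant-rate Poisson template
    against a constant-rate null on the window [0, horizon].

    Each segment's rate MLE is (event count) / (segment length);
    the statistic is 2 * sum_j n_j * log(lambda_j / lambda_0).
    """
    edges = [0.0] + sorted(breakpoints) + [horizon]
    n = len(event_times)
    lam0 = n / horizon                        # constant-rate MLE
    stat = 0.0
    for lo, hi in zip(edges, edges[1:]):
        nj = sum(1 for t in event_times if lo <= t < hi)
        if nj:                                # 0 * log(0) -> 0 by convention
            stat += nj * math.log((nj / (hi - lo)) / lam0)
    return 2.0 * stat

# Events bunched in the first half of [0, 10]; a template split at
# t = 5 separates the two rate regimes.
events = [0.5, 1.1, 1.8, 2.4, 3.0, 3.7, 4.2, 4.9, 6.5, 8.8]
stat = glr_statistic(events, [5.0], 10.0)
```

Larger values of the statistic favor the template over the constant-rate null; model selection across a template family would compare such statistics under a multiple-testing correction.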

  15. Pitfalls in the statistical examination and interpretation of the correspondence between physician and patient satisfaction ratings and their relevance for shared decision making research

    PubMed Central

    2011-01-01

    Background The correspondence of satisfaction ratings between physicians and patients can be assessed on different dimensions. One may examine whether they differ between the two groups or focus on measures of association or agreement. The aim of our study was to evaluate methodological difficulties in calculating the correspondence between patient and physician satisfaction ratings and to show the relevance for shared decision making research. Methods We utilised a structured tool for cardiovascular prevention (arriba™) in a pragmatic cluster-randomised controlled trial. Correspondence between patient and physician satisfaction ratings after individual primary care consultations was assessed using the Patient Participation Scale (PPS). We used the Wilcoxon signed-rank test, the marginal homogeneity test, Kendall's tau-b, weighted kappa, percentage of agreement, and the Bland-Altman method to measure differences, associations, and agreement between physicians and patients. Results Statistical measures signal large differences between patient and physician satisfaction ratings with more favourable ratings provided by patients and a low correspondence regardless of group allocation. Closer examination of the raw data revealed a high ceiling effect of satisfaction ratings and only slight disagreement regarding the distributions of differences between physicians' and patients' ratings. Conclusions Traditional statistical measures of association and agreement are not able to capture a clinically relevant appreciation of the physician-patient relationship by both parties in skewed satisfaction ratings. Only the Bland-Altman method for assessing agreement augmented by bar charts of differences was able to indicate this. Trial registration ISRCTN: ISRCT71348772 PMID:21592337
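Of the measures compared above, the Bland-Altman method is the one the authors found able to capture the agreement structure; it reduces to the mean difference (bias) and the 95% limits of agreement. A minimal sketch with hypothetical paired ratings (the PPS data themselves are not reproduced):

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired ratings.

    Returns the mean difference (bias) and the 95% limits of
    agreement, bias +/- 1.96 * SD of the paired differences.
    """
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired satisfaction ratings (physician, patient), 1-5 scale.
physician = [4, 4, 3, 5, 4, 3, 4, 5]
patient   = [5, 4, 4, 5, 5, 4, 4, 5]
bias, (lower, upper) = bland_altman(physician, patient)
```

A negative bias with narrow limits, as here, is the pattern the abstract describes: patients rate more favourably, but the spread of disagreement is small even when correlation-based measures look poor on skewed data.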

  16. Comparing 2 methods of assessing 30-day readmissions: what is the impact on hospital profiling in the veterans health administration?

    PubMed

    Mull, Hillary J; Chen, Qi; O'Brien, William J; Shwartz, Michael; Borzecki, Ann M; Hanchate, Amresh; Rosen, Amy K

    2013-07-01

    The Centers for Medicare and Medicaid Services' (CMS) all-cause readmission measure and the 3M Health Information System Division Potentially Preventable Readmissions (PPR) measure are both used for public reporting. These 2 methods have not been directly compared in terms of how they identify high-performing and low-performing hospitals. To examine how consistently the CMS and PPR methods identify performance outliers, and explore how the PPR preventability component impacts hospital readmission rates, public reporting on CMS' Hospital Compare website, and pay-for-performance under CMS' Hospital Readmission Reduction Program for 3 conditions (acute myocardial infarction, heart failure, and pneumonia). We applied the CMS all-cause model and the PPR software to VA administrative data to calculate 30-day observed FY08-10 VA hospital readmission rates and hospital profiles. We then tested the effect of preventability on hospital readmission rates and outlier identification for reporting and pay-for-performance by replacing the dependent variable in the CMS all-cause model (Yes/No readmission) with the dichotomous PPR outcome (Yes/No preventable readmission). The CMS and PPR methods had moderate correlations in readmission rates for each condition. After controlling for all methodological differences but preventability, correlations increased to >90%. The assessment of preventability yielded different outlier results for public reporting in 7% of hospitals; for 30% of hospitals there would be an impact on Hospital Readmission Reduction Program reimbursement rates. Despite uncertainty over which readmission measure is superior in evaluating hospital performance, we confirmed that there are differences in CMS-generated and PPR-generated hospital profiles for reporting and pay-for-performance, because of methodological differences and the PPR's preventability component.

  17. Diagnosing Conceptions about the Epistemology of Science: Contributions of a Quantitative Assessment Methodology

    ERIC Educational Resources Information Center

    Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa

    2016-01-01

    This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…

  18. Pigeons exhibit higher accuracy for chosen memory tests than for forced memory tests in duration matching-to-sample.

    PubMed

    Adams, Allison; Santi, Angelo

    2011-03-01

    Following training to match 2- and 8-sec durations of feederlight to red and green comparisons with a 0-sec baseline delay, pigeons were allowed to choose to take a memory test or to escape the memory test. The effects of sample omission, increases in retention interval, and variation in trial spacing on selection of the escape option and accuracy were studied. During initial testing, escaping the test did not increase as the task became more difficult, and there was no difference in accuracy between chosen and forced memory tests. However, with extended training, accuracy for chosen tests was significantly greater than for forced tests. In addition, two pigeons exhibited higher accuracy on chosen tests than on forced tests at the short retention interval and greater escape rates at the long retention interval. These results have not been obtained in previous studies with pigeons when the choice to take the test or to escape the test is given before test stimuli are presented. It appears that task-specific methodological factors may determine whether a particular species will exhibit the two behavioral effects that were initially proposed as potentially indicative of metacognition.

  19. Predictive aging results in radiation environments

    NASA Astrophysics Data System (ADS)

    Gillen, Kenneth T.; Clough, Roger L.

    1993-06-01

    We have previously derived a time-temperature-dose rate superposition methodology which, when applicable, can be used to predict polymer degradation versus dose rate, temperature and exposure time. This methodology provides predictive capability at the low dose rates and long time periods appropriate, for instance, to ambient nuclear power plant environments. The methodology was successfully applied to several polymeric cable materials and then verified for two of the materials by comparing the model predictions with 12-year, low-dose-rate aging data on these materials from a nuclear environment. In this paper, we provide a more detailed discussion of the methodology and apply it to data obtained on a number of additional nuclear power plant cable insulation (a hypalon, a silicone rubber and two ethylene-tetrafluoroethylenes) and jacket (a hypalon) materials. We then show that the predicted low-dose-rate results for our materials are in excellent agreement with long-term (7-9 year) low-dose-rate results recently obtained for the same material types actually aged under nuclear power plant conditions. Based on a combination of the modelling and the long-term results, we find indications of reasonably similar degradation responses among several different commercial formulations for each of the following "generic" materials: hypalon, ethylene-tetrafluoroethylene, silicone rubber and PVC. If such "generic" behavior can be further substantiated through modelling and long-term results on additional formulations, predictions of cable life for other commercial materials of the same generic types would be greatly facilitated.
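Superposition methodologies of this kind typically rest on an Arrhenius shift factor that maps aging data taken at one temperature onto a reference condition. As a hedged illustration only (the paper's dose-rate shifts and material activation energies are not reproduced; the values below are hypothetical), the standard shift factor is:

```python
import math

GAS_CONSTANT = 8.314  # J/(mol*K)

def arrhenius_shift(temp_k, ref_temp_k, activation_j_mol):
    """Arrhenius shift factor a_T for superposing degradation data
    taken at temp_k onto the reference temperature ref_temp_k.

    a_T > 1 means degradation at temp_k runs faster than at the
    reference, so times/doses are compressed by a_T when shifted.
    """
    return math.exp(-(activation_j_mol / GAS_CONSTANT)
                    * (1.0 / temp_k - 1.0 / ref_temp_k))

# Hypothetical example: with a 100 kJ/mol activation energy, aging at
# 100 C runs roughly two orders of magnitude faster than at 50 C.
a_t = arrhenius_shift(373.15, 323.15, 1.0e5)
```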

  20. In situ Measurements of Dissolved Gas Dynamics and Root Uptake in the Wetland Rhizosphere

    NASA Astrophysics Data System (ADS)

    Reid, Matthew; Jaffe, Peter

    2013-04-01

    Anaerobic wetland soils are important natural sources of various atmospheric trace gases that are detrimental to the environment, including methane (CH4), nitrous oxide, elemental mercury (Hg°), and halomethanes. The balance between production and uptake in soils depends, in part, on mass transfer within the soil and between soil and the atmosphere. Observed volatilization rates of trace gases are highly variable and poorly described by models, however, so there is a clear need for new process measurements to clarify the rates of these transport mechanisms. Here we present results from mesocosm push-pull tests intended to quantify transport processes of dissolved gases in wetland sediments, with a focus on uptake by wetland plant roots and partitioning into trapped gas bubbles. This technique uses a suite of nonreactive volatile tracers to pinpoint transport mechanisms without the confounding influence of biochemical transformations. Mass balance approaches are used to determine transport kinetics, and a new analytical method to interpret dissolved gas push-pull test data is presented and compared to traditional analytical techniques. Results confirm the key role of vegetation in dramatically enhancing removal rates of dissolved gases from wetland soils. Root uptake is shown to be diffusion-limited and relative root uptake rates are modeled as an empirical function of molecular size. We use the porewater removal rates measured here to estimate potential volatilization fluxes of CH4, methyl chloride, and Hg° from wetlands vegetated with Typha latifolia and Scirpus acutus. The implementation of this new push-pull test methodology to field settings will be discussed.

  1. Differing antidepressant maintenance methodologies.

    PubMed

    Safer, Daniel J

    2017-10-01

    The principal evidence that antidepressant medication (ADM) is an effective maintenance treatment for adults with major depressive disorder (MDD) comes from placebo substitution trials. These trials enter responders from ADM efficacy trials into randomized, double-blind placebo-controlled (RDBPC) effectiveness trials to measure the rate of MDD relapse over time. However, other randomized maintenance trial methodologies merit consideration and comparison. A systematic review of ADM randomized maintenance trials included research reports from multiple databases. Relapse rate was the main effectiveness outcome assessed. Five ADM randomized maintenance methodologies for MDD responders are described and compared for outcome. These effectiveness trials include: placebo substitution, ADM/placebo extension, ADM extension, ADM vs. psychotherapy, and treatment as usual. The placebo substitution trials, in which responders were abruptly switched to placebo, resulted in unusually high (46%) rates of relapse over 6-12 months, twice the rate under continuing ADM. These trials were characterized by selective screening, high attrition, anxious anticipation of a switch to placebo, and a risk of drug withdrawal symptoms. Selectively screened ADM efficacy responders who entered 4-12 month extension trials experienced relapse rates averaging ~10% with a low attrition rate. Non-industry-sponsored randomized trials of adults with multiple prior MDD episodes who were treated with ADM maintenance for 1-2 years experienced relapse rates averaging 40%. Placebo substitution trial methodology represents only one approach to assessing ADM maintenance. Antidepressant maintenance research for adults with MDD should be evaluated for industry sponsorship, attrition, the impact of the switch to placebo, and major relapse differences in MDD subpopulations. Copyright © 2017. Published by Elsevier Inc.

  2. 17 CFR 39.5 - Review of swaps for Commission determination on clearing requirement.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... publicly; (vi) Risk management procedures, including measurement and monitoring of credit exposures, initial and variation margin methodology, methodologies for stress testing and back testing, settlement procedures, and default management procedures; (vii) Applicable rules, manuals, policies, or procedures...

  3. Measurement properties of existing clinical assessment methods evaluating scapular positioning and function. A systematic review.

    PubMed

    Larsen, Camilla Marie; Juul-Kristensen, Birgit; Lund, Hans; Søgaard, Karen

    2014-10-01

    The aims were to compile a schematic overview of clinical scapular assessment methods and critically appraise the methodological quality of the involved studies. A systematic, computer-assisted literature search using Medline, CINAHL, SportDiscus and EMBASE was performed from inception to October 2013. Reference lists in articles were also screened for publications. From 50 articles, 54 method names were identified and categorized into three groups: (1) Static positioning assessment (n = 19); (2) Semi-dynamic (n = 13); and (3) Dynamic functional assessment (n = 22). Fifteen studies were excluded from evaluation due to no or few clinimetric results, leaving 35 studies for evaluation. Graded according to the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist, the methodological quality in the reliability and validity domains was "fair" (57%) to "poor" (43%), with only one study rated as "good". The reliability domain was most often investigated. Few of the assessment methods in the included studies that had "fair" or "good" measurement property ratings demonstrated acceptable results for both reliability and validity. We found a substantially larger number of clinical scapular assessment methods than previously reported. Using the COSMIN checklist, the methodological quality of the included measurement properties in the reliability and validity domains was in general "fair" to "poor". None were examined for all three domains: (1) reliability; (2) validity; and (3) responsiveness. Observational evaluation systems and assessment of scapular upward rotation seem suitably evidence-based for clinical use. Future studies should test and improve the clinimetric properties, especially diagnostic accuracy and responsiveness, to increase utility for clinical practice.

  4. Application of Heart Rate Variability in Diagnosis and Prognosis of Individuals with Diabetes Mellitus: Systematic Review.

    PubMed

    França da Silva, Anne Kastelianne; Penachini da Costa de Rezende Barbosa, Marianne; Marques Vanderlei, Franciele; Destro Christofaro, Diego Giuliano; Marques Vanderlei, Luiz Carlos

    2016-05-01

    The use of heart rate variability as a tool capable of discriminating individuals with diabetes mellitus is still little explored, as its use has been limited to comparing those with and without the disease. Thus, the purpose of this study was to verify the use of heart rate variability as a tool for diagnostic and prognostic evaluation in persons with diabetes and to identify whether there are cutoff points generated from the use of this tool in these individuals. A search was conducted in the electronic databases MEDLINE, Cochrane Library, Web of Science, EMBASE, and LILACS starting from the oldest records until January 2015, by means of descriptors related to the target condition, evaluated tool, and evaluation method. All the studies were evaluated for methodological quality using the QUADAS-2 instrument. Eight studies were selected. In general, the studies showed that heart rate variability is useful to discriminate cardiac autonomic neuropathy in persons with diabetes, and the sample entropy, SD1/SD2 indices, SDANN, HF, and slope of TFC have better discriminatory power to detect autonomic dysfunction, with sensitivity and specificity values ranging from 72% to 100% and 71% to 97%, respectively. Although there are methodological differences in the indices used, in general this tool demonstrated good sensitivity and specificity and can be used as an additional and/or complementary tool to the conventional autonomic tests, in order to obtain a safer and more effective diagnosis, contributing to better risk stratification of these patients. © 2016 Wiley Periodicals, Inc.
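    The SD1/SD2 Poincaré descriptors named above can be computed directly from a beat-to-beat RR-interval series using the standard plot-axis formulas. A minimal sketch with made-up RR values (not patient data from the reviewed studies):

```python
import math

def poincare_sd1_sd2(rr_ms):
    """Poincare plot descriptors from successive RR intervals (milliseconds).

    SD1 = std of (RR[n+1] - RR[n]) / sqrt(2)  -- short-term variability
    SD2 = std of (RR[n+1] + RR[n]) / sqrt(2)  -- longer-term variability
    """
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    sums = [b + a for a, b in zip(rr_ms, rr_ms[1:])]

    def sample_std(x):
        m = sum(x) / len(x)
        return math.sqrt(sum((v - m) ** 2 for v in x) / (len(x) - 1))

    return sample_std(diffs) / math.sqrt(2), sample_std(sums) / math.sqrt(2)

# Illustrative RR series only:
rr = [800, 810, 815, 812, 820, 830, 825, 840]
sd1, sd2 = poincare_sd1_sd2(rr)
ratio = sd1 / sd2
```

    The SD1/SD2 ratio is among the indices the reviewed studies found discriminative for autonomic dysfunction; cutoff values would have to come from the individual studies, not from this sketch.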

  5. Monitoring receipt of seasonal influenza vaccines with BRFSS and NHIS data: challenges and solutions.

    PubMed

    Burger, Andrew E; Reither, Eric N

    2014-06-30

    Despite the availability of vaccines that mitigate the health risks associated with seasonal influenza, most individuals in the U.S. remain unvaccinated. Monitoring vaccination uptake for seasonal influenza, especially among disadvantaged or high-risk groups, is therefore an important public health activity. The Behavioral Risk Factor Surveillance System (BRFSS) - the largest telephone-based health surveillance system in the world - is an important resource in monitoring population health trends, including influenza vaccination. However, due to limitations in the question that measures influenza vaccination status, difficulties arise in estimating seasonal vaccination rates. Although researchers have proposed various methodologies to address this issue, no systematic review of these methodologies exists. By subjecting these methods to tests of sensitivity and specificity, we identify their strengths and weaknesses and advance a new method for estimating national and state-level vaccination rates with BRFSS data. To ensure that our findings are not anomalous to the BRFSS, we also analyze data from the National Health Interview Survey (NHIS). For both studies, we find that restricting the sample to interviews conducted between January and September offers the best balance of sensitivity (>90% on average), specificity (>90% on average), and statistical power (retention of 92.2% of vaccinations from the target flu season) over other proposed methods. We conclude that including survey participants from these months provides a simple and effective way to estimate seasonal influenza vaccination rates with BRFSS and NHIS data, and we discuss potential ways to better estimate vaccination rates in future epidemiologic surveys. Copyright © 2014 Elsevier Ltd. All rights reserved.
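    The sensitivity/specificity balance described above reduces to simple proportions once survey-based classifications are cross-tabulated against true vaccination status for the target flu season. A minimal sketch with hypothetical counts (not actual BRFSS or NHIS data):

```python
def sensitivity_specificity(tp, fp, tn, fn):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 2x2 counts: method classification vs. true seasonal status.
sens, spec = sensitivity_specificity(tp=920, fp=60, tn=940, fn=80)
# Both 0.92 and 0.94 here, i.e. above the >90% averages reported for the
# January-September restriction method.
```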

  6. Development of an Accelerated Methodology to Study Degradation of Materials in Supercritical Water for Application in High Temperature Power Plants

    NASA Astrophysics Data System (ADS)

    Rodriguez, David

    The decreasing supply of fossil fuel sources, coupled with the increasing concentration of greenhouse gases, has created enormous pressure to maximize the efficiency of power generation. Increasing the outlet temperature of these power plants will result in an increase in operating efficiency. By employing supercritical water as the coolant in thermal power plants (nuclear reactors and coal power plants), the plant efficiency can be increased to 50%, compared to traditional reactors, which currently operate at 33%. The goal of this dissertation is to establish techniques to characterize the mechanical properties and corrosion behavior of materials exposed to supercritical water. Traditionally, these tests have been long-term exposure tests spanning months. The specific goal of this dissertation is to develop a methodology for accelerated estimation of corrosion rates in supercritical water that can be used as a screening tool to select materials for long-term testing. In this study, traditional methods were used to understand the degradation of materials in supercritical water and establish a point of comparison to the first electrochemical studies performed in supercritical water. Materials studied included austenitic steels (stainless steel 304, stainless steel 316 and Nitronic 50) and nickel-based alloys (Inconel 625 and 718). Surface chemistry of the oxide layer was characterized using scanning electron microscopy, X-ray diffraction, FT-IR, Raman and X-ray photoelectron spectroscopies. Stainless steel 304 was subjected to constant tensile load creep tests in water at a pressure of 27 MPa and at temperatures of 200 °C, 315 °C and supercritical water at 450 °C for 24 hours. It was determined that the creep rate for stainless steel 304 exposed to supercritical water would be unacceptable for use in service. It was observed that the formation of hematite was favored at subcritical temperatures, while magnetite was formed in the supercritical region. 
Corrosion of stainless steel 316, Nitronic 50, Inconel 625 and Inconel 718 exposed to supercritical water at 530 °C and ultra-supercritical water at 600 °C was studied as a function of exposure time. When exposed to supercritical water, Nitronic 50 and stainless steel 316 were observed to have similar mass gains; however, stainless steel 316 was found to gain less mass than Nitronic 50 in exposure tests performed in ultra-supercritical water. Stainless steel 316 developed surface films primarily composed of iron oxides, while the surface of Nitronic 50 contained a mixture of iron, chromium and manganese oxides. Inconel 625 and 718 samples were exposed to these temperatures for 24, 96, and 200 hours. Inconel 718 exhibited greater mass gain than Inconel 625 for all temperatures and exposure times. For the first time, corrosion rates in supercritical water were determined using electrochemical techniques. The corrosion rates of stainless steel 316, Nitronic 50, Inconel 625 and Inconel 718 were estimated in supercritical and ultra-supercritical water using electrochemical impedance spectroscopy and electrochemical frequency modulation. For all conditions tested, the corrosion rates obtained from electrochemical testing followed similar trends to the long-term gravimetric results. As a screening tool, this protocol can potentially reduce the time required for corrosion rate studies from thousands of hours to 24 hours.
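    As a sketch of how electrochemical data can substitute for long gravimetric exposures, a polarization resistance obtained from EIS is conventionally converted to a corrosion rate via the Stern-Geary relation and Faraday's law (the ASTM G102 route). The Tafel slopes, equivalent weight, density, and Rp value below are illustrative assumptions, not values from this dissertation:

```python
import math

def corrosion_rate_mm_per_year(rp_ohm_cm2, beta_a=0.12, beta_c=0.12,
                               eq_weight=25.5, density=7.98):
    """Corrosion rate from polarization resistance (Stern-Geary / ASTM G102).

    rp_ohm_cm2      : polarization resistance from EIS, ohm*cm^2
    beta_a, beta_c  : anodic/cathodic Tafel slopes, V/decade (assumed)
    eq_weight       : equivalent weight, g/equiv (~25.5 assumed for 316 SS)
    density         : g/cm^3 (~7.98 assumed for 316 SS)
    """
    b = (beta_a * beta_c) / (2.303 * (beta_a + beta_c))  # Stern-Geary constant, V
    i_corr = b / rp_ohm_cm2 * 1e6                        # corrosion current, uA/cm^2
    # ASTM G102: CR (mm/yr) = 3.27e-3 * i_corr(uA/cm^2) * EW / density
    return 3.27e-3 * i_corr * eq_weight / density

# Hypothetical Rp of 50 kohm*cm^2 gives a rate on the order of microns/year:
rate = corrosion_rate_mm_per_year(rp_ohm_cm2=5.0e4)
```

    The dissertation's own analysis also used electrochemical frequency modulation, which estimates the Tafel slopes from the measurement itself rather than assuming them as this sketch does.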

  7. COBRA-WC pretest predictions and post-test analysis of the FOTA temperature distribution during FFTF natural-circulation transients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khan, E.U.; George, T.L.; Rector, D.R.

    The natural circulation tests of the Fast Flux Test Facility (FFTF) demonstrated a safe and stable transition from forced convection to natural convection and showed that natural convection may adequately remove decay heat from the reactor core. The COBRA-WC computer code was developed by the Pacific Northwest Laboratory (PNL) to account for buoyancy-induced coolant flow redistribution and interassembly heat transfer, effects that become important in mitigating temperature gradients and reducing reactor core temperatures when coolant flow rate in the core is low. This report presents work sponsored by the US Department of Energy (DOE) with the objective of checking the validity of COBRA-WC during the first 220 seconds (sec) of the FFTF natural-circulation (plant-startup) tests using recorded data from two instrumented Fuel Open Test Assemblies (FOTAs). Comparison of COBRA-WC predictions with the FOTA data is a part of the final confirmation of the COBRA-WC methodology for core natural-convection analysis.

  8. Gender and sexual orientation differences in cognition across adulthood: age is kinder to women than to men regardless of sexual orientation.

    PubMed

    Maylor, Elizabeth A; Reimers, Stian; Choi, Jean; Collaer, Marcia L; Peters, Michael; Silverman, Irwin

    2007-04-01

    Despite some evidence of greater age-related deterioration of the brain in males than in females, gender differences in rates of cognitive aging have proved inconsistent. The present study employed web-based methodology to collect data from people aged 20-65 years (109,612 men; 88,509 women). As expected, men outperformed women on tests of mental rotation and line angle judgment, whereas women outperformed men on tests of category fluency and object location memory. Performance on all tests declined with age but significantly more so for men than for women. Heterosexuals of each gender generally outperformed bisexuals and homosexuals on tests where that gender was superior; however, there were no clear interactions between age and sexual orientation for either gender. At least for these particular tests from young adulthood to retirement, age is kinder to women than to men, but treats heterosexuals, bisexuals, and homosexuals just the same.

  9. Study on improving the turbidity measurement of the absolute coagulation rate constant.

    PubMed

    Sun, Zhiwei; Liu, Jie; Xu, Shenghua

    2006-05-23

    The existing theories dealing with the evaluation of the absolute coagulation rate constant by turbidity measurement were experimentally tested for suspensions of different particle sizes (radius a) at incident wavelengths (λ) ranging from near-infrared to ultraviolet light. When the size parameter α = 2πa/λ > 3, the rate constant data from previous theories for fixed-sized particles show significant inconsistencies at different light wavelengths. We attribute this problem to the imperfection of these theories in describing the light scattering from doublets through their evaluation of the extinction cross section. The evaluation of the rate constants by all previous theories becomes untenable as the size parameter increases, which limits the applicable range of the turbidity measurement. By using the T-matrix method, we present a robust solution for evaluating the extinction cross section of doublets formed in the aggregation. Our experiments show that this new approach is effective in extending the applicability range of the turbidity methodology and increasing measurement accuracy.
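    The size-parameter criterion is a one-line computation. The sketch below, with assumed particle and wavelength values, shows how a fixed particle size crosses into the problematic α > 3 regime as the probing wavelength shortens:

```python
import math

def size_parameter(radius_nm, wavelength_nm):
    """Dimensionless size parameter alpha = 2*pi*a / lambda."""
    return 2 * math.pi * radius_nm / wavelength_nm

# Assumed 500 nm particle radius, probed in the near-infrared vs. ultraviolet:
alpha_ir = size_parameter(500, 800)  # ~3.9, already past the alpha > 3 threshold
alpha_uv = size_parameter(500, 350)  # ~9.0, deep in the problematic regime
```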

  10. Statistical testing and power analysis for brain-wide association study.

    PubMed

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, the multiple correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on the Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons; thus, it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depressive disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.
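    For contrast with the paper's random-field approach, the two baseline corrections it is compared against (Bonferroni for FWER, Benjamini-Hochberg for FDR) can be sketched in a few lines; the p-values below are illustrative, not from the study:

```python
def bonferroni(pvals, alpha=0.05):
    """Reject H0 where p < alpha/m (controls the family-wise error rate)."""
    m = len(pvals)
    return [p < alpha / m for p in pvals]

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure (controls the false discovery rate)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0  # largest rank i such that p_(i) <= (i/m) * alpha
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        reject[i] = rank <= k  # reject all hypotheses up to rank k
    return reject

pvals = [0.001, 0.008, 0.012, 0.041, 0.27, 0.60]
bonf = bonferroni(pvals)        # only p < 0.05/6 survive
bh = benjamini_hochberg(pvals)  # step-up threshold admits one more here
```

    BH is less conservative than Bonferroni, which is the trade-off the paper's spatially informed method aims to improve on in both directions.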

  11. A generalized methodology to characterize composite materials for pyrolysis models

    NASA Astrophysics Data System (ADS)

    McKinnon, Mark B.

    The predictive capabilities of computational fire models have improved in recent years such that models have become an integral part of many research efforts. Models improve the understanding of the fire risk of materials and may decrease the number of expensive experiments required to assess the fire hazard of a specific material or designed space. A critical component of a predictive fire model is the pyrolysis sub-model that provides a mathematical representation of the rate of gaseous fuel production from condensed phase fuels given a heat flux incident to the material surface. The modern, comprehensive pyrolysis sub-models that are common today require the definition of many model parameters to accurately represent the physical description of materials that are ubiquitous in the built environment. Coupled with the increase in the number of parameters required to accurately represent the pyrolysis of materials is the increasing prevalence in the built environment of engineered composite materials that have never been measured or modeled. The motivation behind this project is to develop a systematic, generalized methodology to determine the requisite parameters to generate pyrolysis models with predictive capabilities for layered composite materials that are common in industrial and commercial applications. This methodology has been applied to four common composites in this work that exhibit a range of material structures and component materials. The methodology utilizes a multi-scale experimental approach in which each test is designed to isolate and determine a specific subset of the parameters required to define a material in the model. Data collected in simultaneous thermogravimetry and differential scanning calorimetry experiments were analyzed to determine the reaction kinetics, thermodynamic properties, and energetics of decomposition for each component of the composite. 
Data collected in microscale combustion calorimetry experiments were analyzed to determine the heats of complete combustion of the volatiles produced in each reaction. Inverse analyses were conducted on sample temperature data collected in bench-scale tests to determine the thermal transport parameters of each component through degradation. Simulations of quasi-one-dimensional bench-scale gasification tests generated from the resultant models using the ThermaKin modeling environment were compared to experimental data to independently validate the models.
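    Reaction kinetics extracted from thermogravimetric data of the kind described above are commonly parameterized with an Arrhenius-type rate expression. A sketch with a hypothetical kinetic triplet (the pre-exponential factor, activation energy, and reaction order below are not values fitted in this work):

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def decomposition_rate(temp_k, pre_exp, e_act, conversion=0.0, order=1.0):
    """Arrhenius-type rate: d(conversion)/dt = A * exp(-E/(R*T)) * (1 - conversion)^n."""
    return pre_exp * math.exp(-e_act / (R * temp_k)) * (1.0 - conversion) ** order

# Hypothetical parameters for one decomposition reaction of one component:
rate_600 = decomposition_rate(600.0, pre_exp=1e12, e_act=1.5e5)
rate_700 = decomposition_rate(700.0, pre_exp=1e12, e_act=1.5e5)
```

    A pyrolysis model such as the ThermaKin setup described above would sum several such reactions, each with its own kinetic triplet and heat of reaction, for every component of the composite.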

  12. Evaluation of deconvolution modelling applied to numerical combustion

    NASA Astrophysics Data System (ADS)

    Mehl, Cédric; Idier, Jérôme; Fiorina, Benoît

    2018-01-01

    A possible modelling approach in the large eddy simulation (LES) of reactive flows is to deconvolve resolved scalars. Indeed, by inverting the LES filter, scalars such as mass fractions are reconstructed. This information can be used to close budget terms of filtered species balance equations, such as the filtered reaction rate. Being ill-posed in the mathematical sense, the problem is very sensitive to any numerical perturbation. The objective of the present study is to assess the ability of this kind of methodology to capture the chemical structure of premixed flames. For that purpose, three deconvolution methods are tested on a one-dimensional filtered laminar premixed flame configuration: the approximate deconvolution method based on Van Cittert iterative deconvolution, a Taylor decomposition-based method, and the regularised deconvolution method based on the minimisation of a quadratic criterion. These methods are then extended to the reconstruction of subgrid scale profiles. Two methodologies are proposed: the first one relies on subgrid scale interpolation of deconvolved profiles and the second uses parametric functions to describe small scales. Conducted tests analyse the ability of the method to capture the chemical filtered flame structure and front propagation speed. Results show that the deconvolution model should include information about small scales in order to regularise the filter inversion. A priori and a posteriori tests showed that the filtered flame propagation speed and structure cannot be captured if the filter size is too large.

  13. [Susceptibility of Aedes aegypti to DDT, deltamethrin, and lambda-cyhalothrin in Colombia].

    PubMed

    Santacoloma Varón, Liliana; Chaves Córdoba, Bernardo; Brochero, Helena Luisa

    2010-01-01

    To assess the susceptibility status of 13 natural populations of Aedes aegypti (collected from sites in Colombia where dengue is a serious public health problem) to the pyrethroids deltamethrin and lambda-cyhalothrin, and to the organochlorine DDT, and to identify any biochemical mechanisms associated with resistance. Immature forms of the vector were collected from natural breeding spots at each site and then raised under controlled conditions. Using the F2 generation, bioassays were performed using the World Health Organization's 1981 methodology (impregnated paper) and the United States Centers for Disease Control and Prevention's 1998 methodology (impregnated bottles). In populations where mortality rates were consistent with decreased susceptibility, levels of nonspecific esterases (NSE), mixed-function oxidases (MFO), and acetylcholinesterase (AChE) were measured using colorimetric tests. All of the mosquito populations that were tested showed resistance to the organochlorine DDT. In the case of the pyrethroids, widespread resistance to lambda-cyhalothrin was found, but not to deltamethrin. Assessment of the biochemical resistance mechanisms showed that 7 of the 11 populations had elevated NSE, and one population, increased MFO. Physiological cross-resistance between DDT and lambda-cyhalothrin in the A. aegypti populations tested was ruled out. Physiological resistance to lambda-cyhalothrin appears to be associated with increased NSE. The differences in susceptibility levels and enzyme values among the populations were associated with genetic variations and the chemicals in use locally.

  14. Measurements of Mode I Interlaminar Properties of Carbon Fiber Reinforced Polymers Using Digital Image Correlation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merzkirch, Matthias; Ahure Powell, Louise; Foecke, Tim

    Numerical models based on cohesive zones are usually used to model and simulate the mechanical behavior of laminated carbon fiber reinforced polymers (CFRP) in automotive and aerospace applications and require different interlaminar properties. This work focuses on determining the interlaminar fracture toughness (G IC) under Mode I loading of a double cantilever beam (DCB) specimen of unidirectional CFRP, serving as prototypical material. The novelty of this investigation is the improvement of the testing methodology by introducing digital image correlation (DIC) as an extensometer, a tool that allows for crack growth measurement, phenomenological visualization and quantification of various material responses to Mode I loading. Multiple methodologies from different international standards and other common techniques are compared for the determination of the evolution of G IC as crack resistance curves (R-curves). The primarily metrological sources of uncertainty, in contrast to material-specific uncertainties, are discussed through a simple sensitivity analysis. Additionally, the current work offers a detailed insight into the constraints and assumptions to allow exploration of different methods for the determination of material properties using the DIC measured data. The main aim is an improvement of the measurement technique and an increase in the reliability of measured data during static testing, in advance of future rate-dependent testing for crashworthiness simulations.

  15. Measurements of Mode I Interlaminar Properties of Carbon Fiber Reinforced Polymers Using Digital Image Correlation

    DOE PAGES

    Merzkirch, Matthias; Ahure Powell, Louise; Foecke, Tim

    2017-07-01

    Numerical models based on cohesive zones are usually used to model and simulate the mechanical behavior of laminated carbon fiber reinforced polymers (CFRP) in automotive and aerospace applications and require different interlaminar properties. This work focuses on determining the interlaminar fracture toughness (G IC) under Mode I loading of a double cantilever beam (DCB) specimen of unidirectional CFRP, serving as prototypical material. The novelty of this investigation is the improvement of the testing methodology by introducing digital image correlation (DIC) as an extensometer, a tool that allows for crack growth measurement, phenomenological visualization and quantification of various material responses to Mode I loading. Multiple methodologies from different international standards and other common techniques are compared for the determination of the evolution of G IC as crack resistance curves (R-curves). The primarily metrological sources of uncertainty, in contrast to material-specific uncertainties, are discussed through a simple sensitivity analysis. Additionally, the current work offers a detailed insight into the constraints and assumptions to allow exploration of different methods for the determination of material properties using the DIC measured data. The main aim is an improvement of the measurement technique and an increase in the reliability of measured data during static testing, in advance of future rate-dependent testing for crashworthiness simulations.

  16. A Randomized Study of How Physicians Interpret Research Funding Disclosures

    PubMed Central

    Kesselheim, Aaron S.; Robertson, Christopher T.; Myers, Jessica A.; Rose, Susannah L.; Gillet, Victoria; Ross, Kathryn M.; Glynn, Robert J.; Joffe, Steven; Avorn, Jerry

    2012-01-01

    BACKGROUND The effects of clinical-trial funding on the interpretation of trial results are poorly understood. We examined how such support affects physicians’ reactions to trials with a high, medium, or low level of methodologic rigor. METHODS We presented 503 board-certified internists with abstracts that we designed describing clinical trials of three hypothetical drugs. The trials had high, medium, or low methodologic rigor, and each report included one of three support disclosures: funding from a pharmaceutical company, NIH funding, or none. For both factors studied (rigor and funding), one of the three possible variations was randomly selected for inclusion in the abstracts. Follow-up questions assessed the physicians’ impressions of the trials’ rigor, their confidence in the results, and their willingness to prescribe the drugs. RESULTS The 269 respondents (53.5% response rate) perceived the level of study rigor accurately. Physicians reported that they would be less willing to prescribe drugs tested in low-rigor trials than those tested in medium-rigor trials (odds ratio, 0.64; 95% confidence interval [CI], 0.46 to 0.89; P = 0.008) and would be more willing to prescribe drugs tested in high-rigor trials than those tested in medium-rigor trials (odds ratio, 3.07; 95% CI, 2.18 to 4.32; P<0.001). Disclosure of industry funding, as compared with no disclosure of funding, led physicians to downgrade the rigor of a trial (odds ratio, 0.63; 95% CI, 0.46 to 0.87; P = 0.006), their confidence in the results (odds ratio, 0.71; 95% CI, 0.51 to 0.98; P = 0.04), and their willingness to prescribe the hypothetical drugs (odds ratio, 0.68; 95% CI, 0.49 to 0.94; P = 0.02). Physicians were half as willing to prescribe drugs studied in industry-funded trials as they were to prescribe drugs studied in NIH-funded trials (odds ratio, 0.52; 95% CI, 0.37 to 0.71; P<0.001). These effects were consistent across all levels of methodologic rigor. 
CONCLUSIONS Physicians discriminate among trials of varying degrees of rigor, but industry sponsorship negatively influences their perception of methodologic quality and reduces their willingness to believe and act on trial findings, independently of the trial’s quality. These effects may influence the translation of clinical research into practice. PMID:22992075
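    The odds ratios reported above come from cross-classified responses; a Wald-type estimate with a 95% confidence interval can be computed from a 2×2 table as sketched below. The counts are hypothetical, chosen only so the point estimate lands near the study's industry-vs.-NIH odds ratio of 0.52-0.68, and are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio (a*d)/(b*c) with a Wald 95% CI computed on the log scale.

    a, b: group 1 with/without the outcome; c, d: group 2 with/without it.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: willing/unwilling to prescribe, industry- vs. NIH-funded:
or_, lo, hi = odds_ratio_ci(a=60, b=75, c=80, d=55)
# OR = 0.55 with a CI entirely below 1, the same direction as the reported effect.
```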

  17. A Brief Introduction to Q Methodology

    ERIC Educational Resources Information Center

    Yang, Yang

    2016-01-01

    Q methodology is a method to systematically study subjective matters such as thoughts and beliefs on any given topic. Q methodology can be used for both theory building and theory testing. The purpose of this paper was to give a brief overview of Q methodology to readers with various backgrounds. This paper discussed several advantages of Q…

  18. Rotary-wing flight test methods used for the evaluation of night vision devices

    NASA Astrophysics Data System (ADS)

    Haworth, Loran A.; Blanken, Christopher J.; Szoboszlay, Zoltan P.

    2001-08-01

    The U.S. Army Aviation mission includes flying helicopters at low altitude, at night, and in adverse weather. Night Vision Devices (NVDs) are used to supplement the pilot's visual cues for night flying. As the military requirement to conduct night helicopter operations has increased, the impact of helicopter flight operations with NVD technology in the Degraded Visual Environment (DVE) became increasingly important to quantify. Aeronautical Design Standard-33 (ADS-33) was introduced to update rotorcraft handling qualities requirements and to quantify the impact of the NVDs in the DVE. As reported in this paper, flight test methodology in ADS-33 has been used by the handling qualities community to measure the impact of NVDs on task performance in the DVE. This paper provides the background and rationale behind the development of ADS-33 flight test methodology for handling qualities in the DVE, as well as the test methodology developed for human factor assessment of NVDs in the DVE. Lessons learned, shortcomings, and recommendations for NVD flight test methodology are also provided.

  19. Human papillomavirus detection with genotyping by the cobas and Aptima assays: Significant differences in HPV 16 detection?

    PubMed

    Chorny, Joseph A; Frye, Teresa C; Fisher, Beth L; Remmers, Carol L

    2018-03-23

    The primary high-risk human papillomavirus (hrHPV) assays in the United States are the cobas (Roche) and the Aptima (Hologic). The cobas assay detects hrHPV by DNA analysis, while the Aptima detects messenger RNA (mRNA) oncogenic transcripts. As the Aptima assay identifies oncogenic expression, it should have a lower rate of hrHPV and genotype detection. The Kaiser Permanente Regional Reference Laboratory in Denver, Colorado changed its hrHPV assay from the cobas to the Aptima assay. The rates of hrHPV detection and genotyping were compared over successive six-month periods. The overall hrHPV detection rates by the two platforms were similar (9.5% versus 9.1%) and not statistically different. For genotyping, the HPV 16 rate by the cobas was 1.6% and by the Aptima it was 1.1%. These differences were statistically significant, with the Aptima detecting nearly one-third fewer HPV 16 infections. With HPV 18 and HPV 18/45, there was a slightly higher detection rate of HPV 18/45 by the Aptima platform (0.5% versus 0.9%), and this was statistically significant. While HPV 16 represents a low percentage of hrHPV infections, it was detected significantly less often by the Aptima assay compared to the cobas assay. This has been previously reported, although not highlighted. Given the test methodologies, one would expect the Aptima to detect less HPV 16. This difference appears to be mainly due to a significantly increased number of non-oncogenic HPV 16 infections detected by the cobas test, as there were no differences in HPV 16 detection rates in the high-grade squamous intraepithelial lesions, indicating that the two tests have similar sensitivities for oncogenic HPV 16. © 2018 Wiley Periodicals, Inc.
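    A rate difference like the 1.6% vs. 1.1% above is typically assessed with a pooled two-proportion z-test. The sample sizes below are hypothetical, chosen only to reproduce those rates, and are not the laboratory's actual test volumes:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled proportion under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical volumes giving 1.6% (cobas) vs. 1.1% (Aptima) HPV 16 positivity:
z = two_proportion_z(x1=160, n1=10_000, x2=110, n2=10_000)
# |z| > 1.96 at these (assumed) volumes, consistent with a significant difference.
```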

  20. Proceedings of the Annual Mechanics of Composites Review (12th) Held in Wright-Patterson AFB, Ohio on 16-17 October 1987

    DTIC Science & Technology

    1988-01-01

    ignored but the Volkersen model is extended to include adherend deformations will be discussed. STATISTICAL METHODOLOGY FOR DESIGN ALLOWABLES [15-17...structure. In the certification methodology , the development test program and the calculation of composite design allowables is orchestrated to support...Development of design methodology of thick composites and their test methods. (b) Role of interface in emerging composite systems. *CONTRACTS IMPROVED DAMAGE

  1. Evaluation of culture- and PCR-based detection methods for Escherichia coli O157:H7 in inoculated ground beef.

    PubMed

    Arthur, Terrance M; Bosilevac, Joseph M; Nou, Xiangwu; Koohmaraie, Mohammad

    2005-08-01

    Currently, several beef processors employ test-and-hold systems for increased quality control of ground beef. In such programs, each lot of product must be tested and found negative for Escherichia coli O157:H7 prior to release of the product into commerce. Optimization of three testing attributes (detection time, specificity, and sensitivity) is critical to the success of such strategies. Because ground beef is a highly perishable product, the testing methodology used must be as rapid as possible. The test also must have a low false-positive result rate so that product is not needlessly discarded. False-negative results cannot be tolerated because they would allow contaminated product to be released and potentially cause disease. In this study, two culture-based and three PCR-based methods for detecting E. coli O157:H7 in ground beef were compared for their abilities to meet the above criteria. Ground beef samples were individually spiked with five genetically distinct strains of E. coli O157:H7 at concentrations of 17 and 1.7 CFU/65 g and then subjected to the various testing methodologies. There was no difference (P > 0.05) in the abilities of the PCR-based methods to detect E. coli O157:H7 inoculated in ground beef at 1.7 CFU/65 g. The culture-based systems detected more positive samples than did the PCR-based systems, but the detection times (21 to 48 h) were at least 9 h longer than those for the PCR-based methods (7.5 to 12 h). Ground beef samples were also spiked with potentially cross-reactive strains. The PCR-based systems that employed an immunomagnetic separation step prior to detection produced fewer false-positive results.

  2. A new proposal for randomized start design to investigate disease-modifying therapies for Alzheimer disease.

    PubMed

    Zhang, Richard Y; Leon, Andrew C; Chuang-Stein, Christy; Romano, Steven J

    2011-02-01

    The increasing prevalence of Alzheimer disease (AD) and lack of effective agents to attenuate progression have accelerated research and development of disease modifying (DM) therapies. The traditional parallel group design and single time point analysis used in the support of past AD drug approvals address symptomatic benefit over relatively short treatment durations. More recent trials investigating disease modification are by necessity longer in duration and require larger sample sizes. Nevertheless, trial design and analysis remain mostly unchanged and may not be adequate to meet the objective of demonstrating disease modification. Randomized start design (RSD) has been proposed as an option to study DM effects, but its application in AD trials may have been hampered by certain methodological challenges. To address the methodological issues that have impeded more extensive use of RSD in AD trials, and to encourage other researchers to develop novel design and analysis methodologies to better ascertain DM effects for the next generation of AD therapies, we propose a stepwise testing procedure to evaluate potential DM effects of novel AD therapies. The Alzheimer Disease Assessment Scale-Cognitive Subscale (ADAS-cog) is used for illustration. We propose to test three hypotheses in a stepwise sequence. The three tests pertain to treatment difference at two separate time points and a difference in the rate of change. Estimation is facilitated by the Mixed-effects Model for Repeated Measures approach. The required sample size is estimated using Monte Carlo simulations and by modeling ADAS-cog data from prior longitudinal AD studies. The greatest advantage of the RSD proposed in this article is its ability to critically address the question of a DM effect.
The AD trial using the new approach would be longer (12-month placebo period plus 12-month delay-start period; total 24-month duration) and require more subjects (about 1000 subjects per arm for the non-inferiority margin chosen in the illustration). It would also require additional evaluations to estimate the rate of ADAS-cog change toward the end of the trial. A regulatory claim of disease modification for any compound will likely require additional verification of a drug's effect on a validated biomarker of Alzheimer's pathology. Incorporation of the RSD in AD trials is feasible. With proper trial setup and statistical procedures, this design could support the detection of a disease-modifying effect. In our opinion, a two-phase RSD with a stepwise hypothesis testing procedure could be a reasonable option for future studies.
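The Monte Carlo sample-size reasoning described above can be sketched crudely as follows. The slopes, standard deviation, and simple two-sample test used here are illustrative assumptions, not values or methods from the trial proposal (which uses MMRM on ADAS-cog data).

```python
import random
import statistics

def simulate_power(n_per_arm, change_drug, change_placebo, sd,
                   n_sims=500, z_crit=1.96):
    """Crude Monte Carlo power estimate for a difference in mean
    24-month ADAS-cog change between two arms. Each simulated trial
    draws per-subject changes and applies a two-sample z-like test."""
    hits = 0
    for _ in range(n_sims):
        drug = [random.gauss(change_drug, sd) for _ in range(n_per_arm)]
        plac = [random.gauss(change_placebo, sd) for _ in range(n_per_arm)]
        diff = statistics.mean(plac) - statistics.mean(drug)  # benefit = less worsening
        se = ((statistics.variance(drug) + statistics.variance(plac)) / n_per_arm) ** 0.5
        if diff / se > z_crit:
            hits += 1
    return hits / n_sims

random.seed(0)
# ~1000 subjects per arm, as in the abstract; effect size is hypothetical.
power = simulate_power(n_per_arm=1000, change_drug=4.0, change_placebo=5.0, sd=8.0)
```

Under these assumed numbers, roughly 1000 subjects per arm yields power in the neighborhood of 80%, which is the usual design target.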

  3. Integrated vehicle-based safety systems heavy truck field operational test, methodology and results report.

    DOT National Transportation Integrated Search

    2010-12-01

    "This document presents the methodology and results from the heavy-truck field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michiga...

  4. Integrated vehicle-based safety systems light-vehicle field operational test, methodology and results report.

    DOT National Transportation Integrated Search

    2010-12-01

    "This document presents the methodology and results from the light-vehicle field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michi...

  5. 77 FR 47077 - Statement of Organization, Functions, and Delegations of Authority; Office of Planning, Research...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-07

    ...; surveys, research and evaluation methodologies; demonstration testing and model development; synthesis and..., policy and program analysis; surveys, research and evaluation methodologies; demonstration testing and... Organization, Functions, and Delegations of Authority; Office of Planning, Research and Evaluation AGENCY...

  6. 12 CFR 252.155 - Methodologies and practices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... SYSTEM (CONTINUED) ENHANCED PRUDENTIAL STANDARDS (REGULATION YY) Company-Run Stress Test Requirements for....155 Methodologies and practices. (a) Potential impact on capital. In conducting a stress test under...) Losses, pre-provision net revenue, provision for loan and lease losses, and net income; and (2) The...

  7. 12 CFR 252.155 - Methodologies and practices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... SYSTEM (CONTINUED) ENHANCED PRUDENTIAL STANDARDS (REGULATION YY) Company-Run Stress Test Requirements for....155 Methodologies and practices. (a) Potential impact on capital. In conducting a stress test under...) Losses, pre-provision net revenue, provision for loan and lease losses, and net income; and (2) The...

  8. 42 CFR 413.312 - Methodology for calculating rates.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospectively Determined Payment Rates for Low-Volume Skilled Nursing Facilities, for Cost Reporting Periods Beginning...

  9. 42 CFR 413.312 - Methodology for calculating rates.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospectively Determined Payment Rates for Low-Volume Skilled Nursing Facilities, for Cost Reporting Periods Beginning...

  10. A Multisite, Randomized Controlled Clinical Trial of Computerized Cognitive Remediation Therapy for Schizophrenia.

    PubMed

    Gomar, Jesús J; Valls, Elia; Radua, Joaquim; Mareca, Celia; Tristany, Josep; del Olmo, Francisco; Rebolleda-Gil, Carlos; Jañez-Álvarez, María; de Álvaro, Francisco J; Ovejero, María R; Llorente, Ana; Teixidó, Cristina; Donaire, Ana M; García-Laredo, Eduardo; Lazcanoiturburu, Andrea; Granell, Luis; Mozo, Cristina de Pablo; Pérez-Hernández, Mónica; Moreno-Alcázar, Ana; Pomarol-Clotet, Edith; McKenna, Peter J

    2015-11-01

    The effectiveness of cognitive remediation therapy (CRT) for the neuropsychological deficits seen in schizophrenia is supported by meta-analysis. However, a recent methodologically rigorous trial had negative findings. In this study, 130 chronic schizophrenic patients were randomly assigned to computerized CRT, an active computerized control condition (CC), or treatment as usual (TAU). Primary outcome measures were 2 ecologically valid batteries of executive function and memory, rated under blind conditions; other executive and memory tests and a measure of overall cognitive function were also employed. Carer ratings of executive and memory failures in daily life were obtained before and after treatment. Computerized CRT was found to produce improvement on the training tasks, but this did not transfer to gains on the primary outcome measures and most other neuropsychological tests in comparison to either CC or TAU conditions. Nor did the intervention result in benefits on carer ratings of daily life cognitive failures. According to this study, computerized CRT is not effective in schizophrenia. The use of both active and passive CCs suggests that the nature of the control group is not an important factor influencing results. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center.

  11. [Rating scales based on the phenomenological and structural approach].

    PubMed

    Schiltz, L

    2006-01-01

    A current tendency in clinical psychology research is to use an integrated quantitative and qualitative methodology. This approach is especially suited to the study of therapeutic intervention, where the researcher is himself part of the situation he is investigating. As for research tools, the combination of the semi-structured clinical interview, psychometric scales, and projective tests has proved pertinent for describing the multidimensional and fluctuating reality of the therapeutic relationship and the changes it induces in both partners. In arts-therapy research, investigation of the artistic production or free expression of participants may complement the psychometric and projective tools; the concept of an "expressive test" is currently used to characterise this method. In this context, the development of rating scales based on the phenomenological and structural, or holistic, approach allows us to link qualitative analysis with quantification, and thus to use inferential statistics, provided that we remain at the nominal or ordinal level of measurement. We explain the principle of construction of these rating scales and illustrate our practice with examples drawn from studies we conducted in clinical psychology.

  12. Effects of surface chemistry on hot corrosion life

    NASA Technical Reports Server (NTRS)

    Fryxell, R. E.; Leese, G. E.

    1985-01-01

    This program has its primary objective: the development of hot corrosion life prediction methodology based on a combination of laboratory test data and evaluation of field service turbine components which show evidence of hot corrosion. The laboratory program comprises burner rig testing by TRW. A summary of results is given for two series of burner rig tests. The life prediction methodology parameters to be appraised in a final campaign of burner rig tests are outlined.

  13. The significance of some methodological effects on filtration and ingestion rates of the rotifer Brachionus plicatilis

    NASA Astrophysics Data System (ADS)

    Schlosser, H. J.; Anger, K.

    1982-06-01

    Filtration rate (F) and ingestion rate (I) were measured in the rotifer Brachionus plicatilis feeding on the flagellate Dunaliella sp. and on yeast cells (Saccharomyces cerevisiae). 60-min experiments in rotating bottles served as a standard for testing methodological effects on levels of F and I. A lack of rotation reduced F values by 40 %, and a rise in temperature from 18 °C to 23.5 °C increased them by 42 %. Ingestion rates increased significantly up to a particle (yeast) concentration of ca. 600-800 cells · µl⁻¹; then they remained constant, whereas filtration rates decreased beyond this threshold. Rotifer density (up to 1000 ind · ml⁻¹) and previous starvation (up to 40 h) did not significantly influence food uptake rates. The duration of the experiment proved to have the most significant effect on F and I values: in 240-min experiments, these values were on average more than 90 % lower than in 15-min experiments. From this finding it is concluded that ingestion rates obtained from short-term experiments (60 min or less) cannot be used in energy budgets, because they severely overestimate the actual long-term feeding capacity of the rotifers. At the lower end of the particle size spectrum (2 to 3 µm) there are not only food cells, but apparently also contaminating faecal particles. Their number increased with increasing duration of experiments and led to an underestimation of F and I. Elemental analyses of rotifers and their food suggest that B. plicatilis can ingest up to 0.6 mJ or ca. 14 % of its own body carbon within 15 min. The long-term average was estimated as 3.4 mJ · ind⁻¹ · d⁻¹ or ca. 75 % of body carbon · d⁻¹.
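Filtration (clearance) and ingestion rates of the kind measured here are conventionally derived from particle depletion using the exponential-decline model F = V·ln(C0/Ct)/(N·t). The sketch below uses that standard model; the abstract does not state the authors' exact computation, and all numeric inputs and units are illustrative assumptions.

```python
import math

def filtration_rate(c0, ct, volume_ul, n_animals, hours):
    """Clearance (filtration) rate per animal, in µl·ind⁻¹·h⁻¹, from the
    decline in particle concentration c0 -> ct over the incubation:
    F = V · ln(c0/ct) / (N · t)."""
    return volume_ul * math.log(c0 / ct) / (n_animals * hours)

def ingestion_rate(c0, ct, volume_ul, n_animals, hours):
    """Ingestion rate = F × time-averaged food concentration, where the
    time average of an exponential decline is (c0 - ct) / ln(c0/ct)."""
    f = filtration_rate(c0, ct, volume_ul, n_animals, hours)
    c_mean = (c0 - ct) / math.log(c0 / ct)
    return f * c_mean  # cells · ind⁻¹ · h⁻¹

# Hypothetical incubation: 10 ml bottle, 100 rotifers, 1 h,
# yeast declining from 700 to 600 cells · µl⁻¹.
F = filtration_rate(c0=700.0, ct=600.0, volume_ul=10000.0, n_animals=100, hours=1.0)
I = ingestion_rate(c0=700.0, ct=600.0, volume_ul=10000.0, n_animals=100, hours=1.0)
```

Note that with this model the ingestion rate reduces to V·(C0 − Ct)/(N·t), i.e. cells removed per animal per hour, which is why long incubations with faecal-particle contamination bias both F and I downward.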

  14. NASA Handbook for Spacecraft Structural Dynamics Testing

    NASA Technical Reports Server (NTRS)

    Kern, Dennis L.; Scharton, Terry D.

    2005-01-01

    Recent advances in the area of structural dynamics and vibrations, in both methodology and capability, have the potential to make spacecraft system testing more effective from technical, cost, schedule, and hardware safety points of view. However, application of these advanced test methods varies widely among the NASA Centers and their contractors. Identification and refinement of the best of these test methodologies and implementation approaches has been an objective of efforts by the Jet Propulsion Laboratory on behalf of the NASA Office of the Chief Engineer. But to develop the most appropriate overall test program for a flight project from the selection of advanced methodologies, as well as conventional test methods, spacecraft project managers and their technical staffs will need overall guidance and technical rationale. Thus, the Chief Engineer's Office has recently tasked JPL to prepare a NASA Handbook for Spacecraft Structural Dynamics Testing. An outline of the proposed handbook, with a synopsis of each section, has been developed and is presented herein. Comments on the proposed handbook are solicited from the spacecraft structural dynamics testing community.

  15. NASA Handbook for Spacecraft Structural Dynamics Testing

    NASA Technical Reports Server (NTRS)

    Kern, Dennis L.; Scharton, Terry D.

    2004-01-01

    Recent advances in the area of structural dynamics and vibrations, in both methodology and capability, have the potential to make spacecraft system testing more effective from technical, cost, schedule, and hardware safety points of view. However, application of these advanced test methods varies widely among the NASA Centers and their contractors. Identification and refinement of the best of these test methodologies and implementation approaches has been an objective of efforts by the Jet Propulsion Laboratory on behalf of the NASA Office of the Chief Engineer. But to develop the most appropriate overall test program for a flight project from the selection of advanced methodologies, as well as conventional test methods, spacecraft project managers and their technical staffs will need overall guidance and technical rationale. Thus, the Chief Engineer's Office has recently tasked JPL to prepare a NASA Handbook for Spacecraft Structural Dynamics Testing. An outline of the proposed handbook, with a synopsis of each section, has been developed and is presented herein. Comments on the proposed handbook are solicited from the spacecraft structural dynamics testing community.

  16. Refinement of a methodology for siting maintenance area headquarters.

    DOT National Transportation Integrated Search

    1986-01-01

    Prior to this study, a methodology that generates travel time, or isochronal, contours around area headquarters or the housing bases of maintenance crews was developed. The methodology was then pilot tested for the Charlottesville Residency, and was ...

  17. Toward the Long-Term Scientific Study of Encounter Group Phenomena: I. Methodological Considerations.

    ERIC Educational Resources Information Center

    Diamond, Michael Jay; Shapiro, Jerrold Lee

    This paper proposes a model for the long-term scientific study of encounter, T-, and sensitivity groups. The authors see the need for overcoming major methodological and design inadequacies of such research. They discuss major methodological flaws in group outcome research as including: (1) lack of adequate base rate or pretraining measures; (2)…

  18. Parametric evaluation of the cost effectiveness of Shuttle payload vibroacoustic test plans

    NASA Technical Reports Server (NTRS)

    Stahle, C. V.; Gongloff, H. R.; Keegan, W. B.; Young, J. P.

    1978-01-01

    Consideration is given to alternate vibroacoustic test plans for sortie and free flyer Shuttle payloads. Statistical decision models for nine test plans provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology is a major step toward the development of a useful tool for the quantitative tailoring of vibroacoustic test programs to sortie and free flyer payloads. A broader application of the methodology is now possible by the use of the OCTAVE computer code.

  19. Proof test methodology for composites

    NASA Technical Reports Server (NTRS)

    Wu, Edward M.; Bell, David K.

    1992-01-01

    The special requirements for proof test of composites are identified based on the underlying failure process of composites. Two proof test methods are developed to eliminate the inevitable weak fiber sites without also causing flaw clustering which weakens the post-proof-test composite. Significant reliability enhancement by these proof test methods has been experimentally demonstrated for composite strength and composite life in tension. This basic proof test methodology is relevant to the certification and acceptance of critical composite structures. It can also be applied to the manufacturing process development to achieve zero-reject for very large composite structures.

  20. Multiobjective Optimization of Atmospheric Plasma Spray Process Parameters to Deposit Yttria-Stabilized Zirconia Coatings Using Response Surface Methodology

    NASA Astrophysics Data System (ADS)

    Ramachandran, C. S.; Balasubramanian, V.; Ananthapadmanabhan, P. V.

    2011-03-01

    Atmospheric plasma spraying is used extensively to make thermal barrier coatings of 7-8% yttria-stabilized zirconia powders. The main problem faced in the manufacture of yttria-stabilized zirconia coatings by the atmospheric plasma spraying process is the selection of the optimum combination of input variables for achieving the required qualities of coating. This problem can be solved by the development of empirical relationships between the process parameters (input power, primary gas flow rate, stand-off distance, powder feed rate, and carrier gas flow rate) and the coating quality characteristics (deposition efficiency, tensile bond strength, lap shear bond strength, porosity, and hardness) through effective and strategic planning and the execution of experiments by response surface methodology. This article highlights the use of response surface methodology by designing a five-factor five-level central composite rotatable design matrix with full replication for planning, conduction, execution, and development of empirical relationships. Further, response surface methodology was used for the selection of optimum process parameters to achieve desired quality of yttria-stabilized zirconia coating deposits.
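The core step in response surface methodology is fitting a second-order polynomial to the measured responses and locating its stationary point. A minimal one-factor sketch is shown below; the power levels and deposition-efficiency values are made up for illustration (the actual study fits five factors from a central composite design).

```python
import numpy as np

# Hypothetical single-factor slice: deposition efficiency (%) versus
# plasma input power (kW). These data are illustrative only.
power_kw = np.array([25.0, 30.0, 35.0, 40.0, 45.0])
dep_eff = np.array([55.0, 63.0, 67.0, 66.0, 60.0])

# Quadratic response surface y = b0 + b1*x + b2*x^2, fitted by least squares.
X = np.column_stack([np.ones_like(power_kw), power_kw, power_kw**2])
b0, b1, b2 = np.linalg.lstsq(X, dep_eff, rcond=None)[0]

# Stationary point of the fitted surface; it is a maximum when b2 < 0.
x_opt = -b1 / (2.0 * b2)
```

In the full five-factor case the same idea applies with cross-product terms included, and the "optimum combination of input variables" is found by maximizing a desirability function over all fitted responses simultaneously.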
