Sample records for conditions measurement methodology

  1. Methodology for measurement of diesel particle size distributions from a city bus working in real traffic conditions

    NASA Astrophysics Data System (ADS)

    Armas, O.; Gómez, A.; Mata, C.

    2011-10-01

    The study of particulate matter (PM) and nitrogen oxide emissions of diesel engines is nowadays a necessary step towards pollutant emission reduction. For a complete evaluation of PM emissions and their size characterization, one of the most challenging goals is to adapt the available techniques and data acquisition procedures to the measurement, and to propose a methodology for the interpretation of instantaneous particle size distributions (PSD) of combustion-derived particles produced by a vehicle during real driving conditions. In this work, PSD from the exhaust gas of a city bus operated in real driving conditions with passengers have been measured. For the study, the bus was equipped with a rotating disk diluter coupled to an air supply thermal conditioner (with an evaporating tube), the latter being connected to a TSI Engine Exhaust Particle Sizer spectrometer. The main objective of this work has been to propose an alternative procedure for evaluating the influence of several transient sequences on the PSD emitted by a city bus used in real driving conditions with passengers. The transitions studied were those derived from the combination of four possible sequences or categories during real driving conditions: idle, acceleration, deceleration with fuel consumption and deceleration without fuel consumption. The analysis methodology used in this work proved to be a useful tool for a better understanding of the phenomena related to the determination of PSD emitted by a city bus during real driving conditions with passengers.
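
    A minimal sketch of the kind of sequence labeling the abstract describes: each time sample is assigned to one of the four driving categories from speed and fuel-rate signals. The signal names and threshold values are hypothetical illustrations, not taken from the paper.

    ```python
    import numpy as np

    def classify_sequences(speed_ms, fuel_rate, dt=1.0, accel_thr=0.1, idle_speed=0.5):
        """Label each sample with one of the four driving categories from the paper."""
        accel = np.gradient(speed_ms, dt)  # numerical acceleration, m/s^2
        labels = []
        for v, a, f in zip(speed_ms, accel, fuel_rate):
            if v < idle_speed:
                labels.append("idle")
            elif a >= accel_thr:
                labels.append("acceleration")
            elif f > 0.0:
                labels.append("deceleration with fuel consumption")
            else:
                labels.append("deceleration without fuel consumption")
        return labels
    ```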

  2. Methodological Quality of National Guidelines for Pediatric Inpatient Conditions

    PubMed Central

    Hester, Gabrielle; Nelson, Katherine; Mahant, Sanjay; Eresuma, Emily; Keren, Ron; Srivastava, Rajendu

    2014-01-01

    Background Guidelines help inform standardization of care for quality improvement (QI). The Pediatric Research in Inpatient Settings (PRIS) network published a prioritization list of inpatient conditions with high prevalence, cost, and variation in resource utilization across children’s hospitals. The methodological quality of guidelines for priority conditions is unknown. Objective To rate the methodological quality of national guidelines for 20 priority pediatric inpatient conditions. Design We searched sources including PubMed for national guidelines published 2002–2012. Guidelines specific to one organism, test or treatment, or institution were excluded. Guidelines were rated by two raters using a validated tool (AGREE II) with an overall rating on a 7-point scale (7 = highest). Inter-rater reliability was measured with a weighted kappa coefficient. Results 17 guidelines met inclusion criteria for 13 conditions; 7 conditions yielded no relevant national guidelines. The highest methodological quality guidelines were for asthma, tonsillectomy, and bronchiolitis (mean overall ratings 7, 6.5 and 6.5, respectively); the lowest were for sickle cell disease (2 guidelines) and dental caries (mean overall ratings 4, 3.5, and 3, respectively). The overall weighted kappa was 0.83 (95% confidence interval 0.78–0.87). Conclusions We identified a group of moderate to high methodological quality national guidelines for priority pediatric inpatient conditions. Hospitals should consider these guidelines to inform QI initiatives. PMID:24677729
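
    For context, inter-rater agreement of the kind reported here can be computed with a weighted kappa; a short sketch using scikit-learn, with invented ratings for 17 guidelines (the paper's AGREE II data are not reproduced):

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical overall ratings (1-7 scale) from two raters for 17 guidelines
    rater_a = [7, 6, 7, 6, 5, 4, 3, 5, 6, 2, 4, 3, 6, 7, 5, 4, 6]
    rater_b = [7, 7, 6, 6, 5, 4, 4, 5, 6, 2, 3, 3, 6, 7, 5, 5, 6]

    kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
    print(f"weighted kappa = {kappa:.2f}")
    ```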

  3. Site-conditions map for Portugal based on VS measurements: methodology and final model

    NASA Astrophysics Data System (ADS)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, Carlos

    2017-04-01

    In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies involved in characterizing the Vs structure at the sites in the database include seismic refraction, multichannel analysis of surface waves and refraction microtremor. Invasive measurements were performed at selected locations in order to compare the Vs profiles obtained from invasive and non-invasive techniques. In general there was good agreement between the Vs30 values obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50,000 and 1:500,000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is based on a three-step process that includes defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling and therefore a declustering algorithm was applied. The final model includes three geological units: 1) igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations. The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and …
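
    Vs30, the quantity underlying the site-condition model, is the time-averaged shear-wave velocity over the top 30 m of a profile: Vs30 = 30 / Σ(h_i / v_i). A minimal sketch with a made-up layered profile:

    ```python
    def vs30(thicknesses_m, velocities_ms):
        """Time-averaged shear-wave velocity over the top 30 m: 30 / sum(h_i / v_i)."""
        depth, travel_time = 0.0, 0.0
        for h, v in zip(thicknesses_m, velocities_ms):
            h_used = min(h, 30.0 - depth)   # truncate the layer at 30 m depth
            travel_time += h_used / v
            depth += h_used
            if depth >= 30.0:
                break
        if depth < 30.0:
            raise ValueError("profile shallower than 30 m")
        return 30.0 / travel_time

    # Hypothetical profile: 5 m at 180 m/s, 10 m at 360 m/s, 25 m at 760 m/s
    print(vs30([5.0, 10.0, 25.0], [180.0, 360.0, 760.0]))  # ~398 m/s
    ```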

  4. A novelty detection diagnostic methodology for gearboxes operating under fluctuating operating conditions using probabilistic techniques

    NASA Astrophysics Data System (ADS)

    Schmidt, S.; Heyns, P. S.; de Villiers, J. P.

    2018-02-01

    In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox, are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and, based on the operating condition, to detect changes in the machine condition. Information from optimised machine and operating condition hidden Markov models is statistically combined to generate a discrepancy signal which is post-processed to infer the condition of the gearbox. The discrepancy signal is processed and combined with statistical methods for automatic fault detection and localisation and to perform fault trending over time. The proposed methodology is validated on experimental data, and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic methodology.
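
    A heavily simplified sketch of the central idea, scoring monitoring data against a model trained only on healthy-gearbox features: hmmlearn's GaussianHMM stands in for the paper's optimised hidden Markov models, and all data here are synthetic placeholders.

    ```python
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(0)
    X_healthy = rng.normal(0.0, 1.0, size=(500, 3))    # placeholder healthy features
    X_monitored = rng.normal(0.3, 1.2, size=(100, 3))  # placeholder monitoring data

    model = GaussianHMM(n_components=4, covariance_type="diag", n_iter=50)
    model.fit(X_healthy)

    # Negative log-likelihood per window acts as a crude discrepancy signal:
    # large values flag departures from the healthy condition
    window = 10
    discrepancy = [-model.score(X_monitored[i:i + window])
                   for i in range(0, len(X_monitored) - window + 1, window)]
    ```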

  5. National working conditions surveys in Latin America: comparison of methodological characteristics

    PubMed Central

    Merino-Salazar, Pamela; Artazcoz, Lucía; Campos-Serna, Javier; Gimeno, David; Benavides, Fernando G.

    2015-01-01

    Background: High-quality and comparable data to monitor working conditions and health in Latin America are not currently available. In 2007, multiple Latin American countries started implementing national working conditions surveys. However, little is known about their methodological characteristics. Objective: To identify commonalities and differences in the methodologies of working conditions surveys (WCSs) conducted in Latin America through 2013. Methods: The study critically examined WCSs in Latin America between 2007 and 2013. Sampling design, data collection, and questionnaire content were compared. Results: Two types of surveys were identified: (1) surveys covering the entire working population and administered at the respondent's home and (2) surveys administered at the workplace. There was considerable overlap in the topics covered by the dimensions of employment and working conditions measured, but less overlap in terms of health outcomes, prevention resources, and activities. Conclusions: Although WCSs from Latin America are similar, there was heterogeneity across surveyed populations and location of the interview. Reducing differences in surveys between countries will increase comparability and allow for a more comprehensive understanding of occupational health in the region. PMID:26079314

  6. National working conditions surveys in Latin America: comparison of methodological characteristics.

    PubMed

    Merino-Salazar, Pamela; Artazcoz, Lucía; Campos-Serna, Javier; Gimeno, David; Benavides, Fernando G

    2015-01-01

    High-quality and comparable data to monitor working conditions and health in Latin America are not currently available. In 2007, multiple Latin American countries started implementing national working conditions surveys. However, little is known about their methodological characteristics. To identify commonalities and differences in the methodologies of working conditions surveys (WCSs) conducted in Latin America through 2013. The study critically examined WCSs in Latin America between 2007 and 2013. Sampling design, data collection, and questionnaire content were compared. Two types of surveys were identified: (1) surveys covering the entire working population and administered at the respondent's home and (2) surveys administered at the workplace. There was considerable overlap in the topics covered by the dimensions of employment and working conditions measured, but less overlap in terms of health outcomes, prevention resources, and activities. Although WCSs from Latin America are similar, there was heterogeneity across surveyed populations and location of the interview. Reducing differences in surveys between countries will increase comparability and allow for a more comprehensive understanding of occupational health in the region.

  7. A methodology for TLD postal dosimetry audit of high-energy radiotherapy photon beams in non-reference conditions.

    PubMed

    Izewska, Joanna; Georg, Dietmar; Bera, Pranabes; Thwaites, David; Arib, Mehenna; Saravi, Margarita; Sergieva, Katia; Li, Kaibao; Yip, Fernando Garcia; Mahant, Ashok Kumar; Bulski, Wojciech

    2007-07-01

    A strategy for national TLD audit programmes has been developed by the International Atomic Energy Agency (IAEA). It involves progression through three sequential dosimetry audit steps. The first step audits are for the beam output in reference conditions for high-energy photon beams. The second step audits are for the dose in reference and non-reference conditions on the beam axis for photon and electron beams. The third step audits involve measurements of the dose in reference and non-reference conditions off-axis for open and wedged symmetric and asymmetric fields for photon beams. Through a co-ordinated research project the IAEA developed the methodology to extend the scope of national TLD auditing activities to more complex audit measurements for regular fields. Based on the IAEA standard TLD holder for high-energy photon beams, a TLD holder was developed with a horizontal arm to enable measurements 5 cm off the central axis. Basic correction factors were determined for the holder in the energy range between Co-60 and 25 MV photon beams. New procedures were developed for the TLD irradiation in hospitals. The off-axis measurement methodology for photon beams was tested in a multi-national pilot study. The statistical distribution of dosimetric parameters (off-axis ratios for open and wedge beam profiles, output factors, wedge transmission factors) checked in 146 measurements was 0.999 ± 0.012. The methodology of TLD audits in non-reference conditions with a modified IAEA TLD holder has been shown to be feasible.

  8. A Wavelet-Based Methodology for Grinding Wheel Condition Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, T. W.; Ting, C.F.; Qu, Jun

    2007-01-01

    Grinding wheel surface condition changes as more material is removed. This paper presents a wavelet-based methodology for grinding wheel condition monitoring based on acoustic emission (AE) signals. Grinding experiments in creep feed mode were conducted to grind alumina specimens with a resinoid-bonded diamond wheel using two different conditions. During the experiments, AE signals were collected when the wheel was 'sharp' and when the wheel was 'dull'. Discriminant features were then extracted from each raw AE signal segment using the discrete wavelet decomposition procedure. An adaptive genetic clustering algorithm was finally applied to the extracted features in order to distinguish different states of grinding wheel condition. The test results indicate that the proposed methodology can achieve 97% clustering accuracy for the high material removal rate condition, 86.7% for the low material removal rate condition, and 76.7% for the combined grinding conditions if the base wavelet, the decomposition level, and the GA parameters are properly selected.
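
    A small sketch of the feature-extraction step using PyWavelets: each AE segment is decomposed and its normalized sub-band energies serve as discriminant features. The paper's adaptive genetic clustering algorithm is replaced here by a plain k-means for illustration, and the segments are synthetic.

    ```python
    import numpy as np
    import pywt
    from sklearn.cluster import KMeans

    def wavelet_features(segment, wavelet="db4", level=4):
        """Normalized energy of each wavelet sub-band of one AE signal segment."""
        coeffs = pywt.wavedec(segment, wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        return energies / energies.sum()

    # Synthetic stand-ins for 'sharp' vs 'dull' AE segments
    rng = np.random.default_rng(0)
    segments = [rng.normal(scale=s, size=1024) for s in (1.0,) * 10 + (3.0,) * 10]
    features = np.array([wavelet_features(s) for s in segments])
    states = KMeans(n_clusters=2, n_init=10).fit_predict(features)
    ```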

  9. Revisiting the Schönbein ozone measurement methodology

    NASA Astrophysics Data System (ADS)

    Ramírez-González, Ignacio A.; Añel, Juan A.; Saiz-López, Alfonso; García-Feal, Orlando; Cid, Antonio; Mejuto, Juan Carlos; Gimeno, Luis

    2017-04-01

    Through the 19th century the Schönbein method gained considerable popularity as an easy way to measure tropospheric ozone. Traditionally it has been considered that Schönbein measurements are not accurate enough to be useful. Detractors of this method argue that it is sensitive to meteorological conditions, the most important influence being relative humidity. As a consequence, the data obtained by this method have usually been discarded. Here we revisit this method, taking into account that values measured during the 19th century were taken using different measurement papers. We explore several concentrations of starch and potassium iodide, the basis for this measurement method. Our results are compared with previous ones in the literature. The validity of the Schönbein methodology is discussed, taking into account humidity and other meteorological variables.

  10. Development of the methodology of exhaust emissions measurement under RDE (Real Driving Emissions) conditions for non-road mobile machinery (NRMM) vehicles

    NASA Astrophysics Data System (ADS)

    Merkisz, J.; Lijewski, P.; Fuc, P.; Siedlecki, M.; Ziolkowski, A.

    2016-09-01

    The paper analyzes the exhaust emissions from farm vehicles based on research performed under field conditions (RDE) according to the NTE procedure. This analysis has shown that it is hard to meet the NTE requirements under field conditions (engine operation in the NTE zone for at least 30 seconds). Due to the very high variability of the engine conditions, the share of valid NTE windows in the field test is small throughout the entire test. For this reason, a modification of the measurement and exhaust emissions calculation methodology has been proposed for farm vehicles of the NRMM group. A test has been developed composed of the following phases: trip to the operation site (paved roads) and field operations (including u-turns and maneuvering). The range of the operation time share in individual test phases has been determined. A change in the method of calculating the real exhaust emissions has also been implemented in relation to the NTE procedure.
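
    The NTE validity rule discussed here (at least 30 s of continuous operation inside the NTE zone) can be checked mechanically; a sketch assuming a 1 Hz boolean in-zone signal:

    ```python
    def valid_nte_windows(in_nte, dt=1.0, min_duration=30.0):
        """Return durations of contiguous runs inside the NTE zone >= min_duration."""
        runs, count = [], 0
        for flag in in_nte:
            if flag:
                count += 1
            else:
                if count * dt >= min_duration:
                    runs.append(count * dt)
                count = 0
        if count * dt >= min_duration:   # close a run that reaches the end of the test
            runs.append(count * dt)
        return runs

    # Hypothetical signal: 40 s in zone, 10 s out, 20 s in (too short to count)
    print(valid_nte_windows([True] * 40 + [False] * 10 + [True] * 20))  # [40.0]
    ```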

  11. Structural health monitoring methodology for aircraft condition-based maintenance

    NASA Astrophysics Data System (ADS)

    Saniger, Jordi; Reithler, Livier; Guedra-Degeorges, Didier; Takeda, Nobuo; Dupuis, Jean Pierre

    2001-06-01

    Reducing maintenance costs while keeping a constant level of safety is a major issue for Air Forces and airlines. The long-term perspective is to implement condition-based maintenance to guarantee a constant safety level while decreasing maintenance costs. For this purpose, the development of a generalized Structural Health Monitoring System (SHMS) is needed. The objective of such a system is to localize the damages and to assess their severity, with enough accuracy to allow low-cost corrective actions. The present paper describes an SHMS based on acoustic emission technology. This choice was driven by its reliability and wide use in the aerospace industry. The described SHMS uses a new learning methodology which relies on the generation of artificial acoustic emission events on the structure and an acoustic emission sensor network. The calibrated acoustic emission events picked up by the sensors constitute the knowledge set that the system relies on. With this methodology, the anisotropy of composite structures is taken into account, thus avoiding the major cause of errors of classical localization methods. Moreover, it is adaptive to different structures as it does not rely on any particular model but on measured data. The acquired data are processed and the event's location and corrected amplitude are computed. The methodology has been demonstrated, and experimental tests on elementary samples presented a degree of accuracy of 1 cm.

  12. Measurements of Gluconeogenesis and Glycogenolysis: A Methodological Review

    PubMed Central

    Chung, Stephanie T.; Chacko, Shaji K.; Sunehag, Agneta L.

    2015-01-01

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to determine gluconeogenesis is by measuring the incorporation of deuterium from the body water pool into newly formed glucose. However, several techniques using radioactive and stable-labeled isotopes have been used to quantitate the contribution and regulation of gluconeogenesis in humans. Each method has its advantages, methodological assumptions, and set of propagated errors. In this review, we examine the strengths and weaknesses of the most commonly used stable isotopes methods to measure gluconeogenesis in vivo. We discuss the advantages and limitations of each method and summarize the applicability of these measurements in understanding normal and pathophysiological conditions. PMID:26604176

  13. Rapid development of xylanase assay conditions using Taguchi methodology.

    PubMed

    Prasad Uday, Uma Shankar; Bandyopadhyay, Tarun Kanti; Bhunia, Biswanath

    2016-11-01

    The present investigation is mainly concerned with the rapid development of extracellular xylanase assay conditions by using Taguchi methodology. The extracellular xylanase was produced from Aspergillus niger (KP874102.1), a new strain isolated from a soil sample of the Baramura forest, Tripura West, India. Four physical parameters, including temperature, pH, buffer concentration and incubation time, were considered as key factors for xylanase activity and were optimized using Taguchi robust design methodology for enhanced xylanase activity. The main effects, interaction effects and optimal levels of the process factors were determined using the signal-to-noise (S/N) ratio. The Taguchi method recommends the use of the S/N ratio to measure quality characteristics. Based on analysis of the S/N ratio, optimal levels of the process factors were determined. Analysis of variance (ANOVA) was performed to evaluate statistically significant process factors. ANOVA results showed that temperature contributed the maximum impact (62.58%) on xylanase activity, followed by pH (22.69%), buffer concentration (9.55%) and incubation time (5.16%). Predicted results showed that enhanced xylanase activity (81.47%) can be achieved with pH 2, temperature 50°C, buffer concentration 50 mM and incubation time 10 min.
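
    The larger-is-better S/N ratio used in Taguchi analysis for a response like enzyme activity is S/N = -10 log10(mean(1/y^2)); a minimal sketch with invented replicate activities:

    ```python
    import numpy as np

    def sn_larger_is_better(y):
        """Taguchi signal-to-noise ratio for a 'larger is better' response."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(1.0 / y ** 2))

    # Hypothetical xylanase activities from replicated runs at one factor setting
    print(sn_larger_is_better([78.2, 80.1, 81.5]))  # S/N in dB
    ```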

  14. Methodology for building confidence measures

    NASA Astrophysics Data System (ADS)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on the establishment of links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
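
    One standard way to combine reliability measures of multiple independent sources, consistent with step (i), is a noisy-OR; this is an illustrative assumption, not necessarily the exact combination rule of the tool described.

    ```python
    from math import prod

    def combined_reliability(reliabilities):
        """Noisy-OR: probability that at least one independent source is correct."""
        return 1.0 - prod(1.0 - r for r in reliabilities)

    # Three hypothetical sources with reliabilities 0.7, 0.6 and 0.5
    print(combined_reliability([0.7, 0.6, 0.5]))  # -> 0.94
    ```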

  15. Measurements of Gluconeogenesis and Glycogenolysis: A Methodological Review.

    PubMed

    Chung, Stephanie T; Chacko, Shaji K; Sunehag, Agneta L; Haymond, Morey W

    2015-12-01

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to determine gluconeogenesis is by measuring the incorporation of deuterium from the body water pool into newly formed glucose. However, several techniques using radioactive and stable-labeled isotopes have been used to quantitate the contribution and regulation of gluconeogenesis in humans. Each method has its advantages, methodological assumptions, and set of propagated errors. In this review, we examine the strengths and weaknesses of the most commonly used stable isotopes methods to measure gluconeogenesis in vivo. We discuss the advantages and limitations of each method and summarize the applicability of these measurements in understanding normal and pathophysiological conditions. © 2015 by the American Diabetes Association. Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered.

  16. Methodological Issues in Measuring the Development of Character

    ERIC Educational Resources Information Center

    Card, Noel A.

    2017-01-01

    In this article I provide an overview of the methodological issues involved in measuring constructs relevant to character development and education. I begin with a nontechnical overview of the 3 fundamental psychometric properties of measurement: reliability, validity, and equivalence. Developing and evaluating measures to ensure evidence of all 3…

  17. Measurement-based auralization methodology for the assessment of noise mitigation measures

    NASA Astrophysics Data System (ADS)

    Thomas, Pieter; Wei, Weigang; Van Renterghem, Timothy; Botteldooren, Dick

    2016-09-01

    The effect of noise mitigation measures is generally expressed by noise levels only, neglecting the listener's perception. In this study, an auralization methodology is proposed that enables an auditory preview of noise abatement measures for road traffic noise, based on the direction-dependent attenuation of a priori recordings made with a dedicated 32-channel spherical microphone array. This measurement-based auralization has the advantage that all non-road-traffic sounds that create the listening context are present. The potential of this auralization methodology is evaluated through the assessment of the effect of an L-shaped mound. The angular insertion loss of the mound is estimated by using the ISO 9613-2 propagation model, the Pierce barrier diffraction model and the Harmonoise point-to-point model. The realism of the auralization technique is evaluated by listening tests, indicating that listeners had great difficulty in differentiating between a posteriori recordings and auralized samples, which shows the validity of the approaches followed.

  18. An Inverse Problem for a Class of Conditional Probability Measure-Dependent Evolution Equations

    PubMed Central

    Mirzaev, Inom; Byrne, Erin C.; Bortz, David M.

    2016-01-01

    We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by Partial Differential Equation (PDE) models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach. PMID:28316360
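
    Schematically, the least squares formulation described here seeks the conditional probability measure that minimizes the misfit between model output and data; the notation below is assumed for illustration, not copied from the paper.

    ```latex
    \hat{P} = \operatorname*{arg\,min}_{P \in \mathcal{P}} \; J(P),
    \qquad
    J(P) = \sum_{i=1}^{n} \bigl( u(t_i; P) - \hat{u}_i \bigr)^2,
    ```

    where \(u(t; P)\) is the solution of the size-structured population model under the post-fragmentation conditional probability measure \(P\), \(\hat{u}_i\) are the observations, and \(\mathcal{P}\) is a set of admissible probability measures endowed with the Prohorov metric.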

  19. Development and Attestation of Gamma-Ray Measurement Methodologies for use by Rostekhnadzor Inspectors in the Russian Federation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Sanders

    2006-09-01

    Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as for use by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi-group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies for determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper will discuss the development and revision of these methodologies, the metrological characteristics of the final methodologies, as well as the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation.

  20. Development of a Valid and Reliable Knee Articular Cartilage Condition-Specific Study Methodological Quality Score.

    PubMed

    Harris, Joshua D; Erickson, Brandon J; Cvetanovich, Gregory L; Abrams, Geoffrey D; McCormick, Frank M; Gupta, Anil K; Verma, Nikhil N; Bach, Bernard R; Cole, Brian J

    2014-02-01

    Condition-specific questionnaires are important components in evaluation of outcomes of surgical interventions. No condition-specific study methodological quality questionnaire exists for evaluation of outcomes of articular cartilage surgery in the knee. To develop a reliable and valid knee articular cartilage-specific study methodological quality questionnaire. Cross-sectional study. A stepwise, a priori-designed framework was created for development of a novel questionnaire. Relevant items to the topic were identified and extracted from a recent systematic review of 194 investigations of knee articular cartilage surgery. In addition, relevant items from existing generic study methodological quality questionnaires were identified. Items for a preliminary questionnaire were generated. Redundant and irrelevant items were eliminated, and acceptable items modified. The instrument was pretested and items weighed. The instrument, the MARK score (Methodological quality of ARticular cartilage studies of the Knee), was tested for validity (criterion validity) and reliability (inter- and intraobserver). A 19-item, 3-domain MARK score was developed. The 100-point scale score demonstrated face validity (focus group of 8 orthopaedic surgeons) and criterion validity (strong correlation to Cochrane Quality Assessment score and Modified Coleman Methodology Score). Interobserver reliability for the overall score was good (intraclass correlation coefficient [ICC], 0.842), and for all individual items of the MARK score, acceptable to perfect (ICC, 0.70-1.000). Intraobserver reliability ICC assessed over a 3-week interval was strong for 2 reviewers (≥0.90). The MARK score is a valid and reliable knee articular cartilage condition-specific study methodological quality instrument. This condition-specific questionnaire may be used to evaluate the quality of studies reporting outcomes of articular cartilage surgery in the knee.

  1. Methodological aspects of EEG and body dynamics measurements during motion

    PubMed Central

    Reis, Pedro M. R.; Hebenstreit, Felix; Gabsteiger, Florian; von Tscharner, Vinzenz; Lochmann, Matthias

    2014-01-01

    EEG involves the recording, analysis, and interpretation of voltages recorded on the human scalp which originate from brain gray matter. EEG is one of the most popular methods of studying and understanding the processes that underlie behavior. This is so because EEG is relatively cheap, easy to wear, lightweight and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements, that are performed in response to the environment. However, there are methodological difficulties which can occur when recording EEG during movement, such as movement artifacts. Thus, most studies about the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that emerged in order to measure body and brain dynamics during motion. These descriptions cover suggestions on how to avoid and reduce motion artifacts, hardware, software and techniques for synchronously recording EEG, EMG, kinematics, kinetics, and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps and methods for determining real/custom electrode positions. In the end we conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks. PMID:24715858

  2. Damage detection methodology under variable load conditions based on strain field pattern recognition using FBGs, nonlinear principal component analysis, and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham

    2018-01-01

    Structural health monitoring consists of using sensors integrated within structures together with algorithms to perform load monitoring, damage detection, damage location, damage size and severity assessment, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions different from damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear-T² damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS and detecting six different damages induced in a cumulative way. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% for a …
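
    A linear-PCA stand-in for the damage-index part of the methodology (the paper uses hierarchical nonlinear PCA): the Q index is the squared residual of a strain vector after projection onto the baseline subspace. Data shapes and values below are invented.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    X_baseline = rng.normal(size=(200, 32))  # pristine-condition strain vectors
    X_new = rng.normal(size=(10, 32))        # vectors to be checked against baseline

    pca = PCA(n_components=5).fit(X_baseline)

    def q_index(X, model):
        """Q (squared prediction error): residual after PCA projection."""
        X_hat = model.inverse_transform(model.transform(X))
        return np.sum((X - X_hat) ** 2, axis=1)

    print(q_index(X_new, pca))  # large Q suggests departure from the baseline
    ```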

  3. Calibration methodology for proportional counters applied to yield measurements of a neutron burst.

    PubMed

    Tarifeño-Saldivia, Ariel; Mayer, Roberto E; Pavez, Cristian; Soto, Leopoldo

    2014-01-01

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by standard nuclear electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. The implementation of the methodology for the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
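
    The core of such a model is the mapping from integrated charge to a number of detected events, N = Q / q̄, with q̄ the mean charge per single-neutron pulse obtained from the pulse-mode calibration; a bare-bones sketch with hypothetical numbers (the paper's full statistical model also propagates the variance):

    ```python
    def neutron_events_from_charge(total_charge_c, mean_charge_per_event_c):
        """Estimate detected events from integrated charge: N = Q_total / q_mean."""
        return total_charge_c / mean_charge_per_event_c

    # Hypothetical: 2.4e-7 C accumulated, 4.0e-12 C per single-neutron event
    print(f"{neutron_events_from_charge(2.4e-7, 4.0e-12):.0f} detected events")
    ```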

  4. Modeling of the effect of freezer conditions on the hardness of ice cream using response surface methodology.

    PubMed

    Inoue, K; Ochi, H; Habara, K; Taketsuka, M; Saito, H; Ichihashi, N; Iwatsuki, K

    2009-12-01

    The effect of conventional continuous freezer parameters [mix flow (L/h), overrun (%), drawing temperature (°C), cylinder pressure (kPa), and dasher speed (rpm)] on the hardness of ice cream under varying measured temperatures (−5, −10, and −15 °C) was investigated systematically using response surface methodology (central composite face-centered design), and the relationships were expressed as statistical models. The range (maximum and minimum values) of each freezer parameter was set according to the actual capability of the conventional freezer and applicability to the manufacturing process. Hardness was measured using a penetrometer. These models showed that overrun and drawing temperature had significant effects on hardness. The models can be used to optimize freezer conditions to make ice cream of the least possible hardness under the highest overrun (120%) and a drawing temperature of approximately −5.5 °C (slightly warmer than the lowest drawing temperature of −6.5 °C) within the range of this study. With reference to the structural elements of the ice cream, we suggest that the volume of overrun and ice crystal content, ice crystal size, and fat globule destabilization affect the hardness of ice cream. In addition, the combination of a simple instrumental parameter and response surface methodology allows us to show the relation between freezer conditions and one of the most important properties, hardness, visually and quantitatively on the practical level.
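
    The statistical models referred to are second-order response surfaces. A minimal two-factor sketch fitted by least squares; the coded levels and hardness values are invented, and the real design had five factors.

    ```python
    import numpy as np

    # Hypothetical coded settings (x1 = overrun, x2 = drawing temperature)
    # and measured hardness from a small central composite design
    x1 = np.array([-1, 1, -1, 1, 0, 0, -1, 1, 0])
    x2 = np.array([-1, -1, 1, 1, -1, 1, 0, 0, 0])
    y = np.array([410, 300, 520, 390, 350, 480, 430, 330, 400])  # hardness

    # Quadratic model: y = b0 + b1 x1 + b2 x2 + b12 x1 x2 + b11 x1^2 + b22 x2^2
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    print(dict(zip(["b0", "b1", "b2", "b12", "b11", "b22"], coef.round(1))))
    ```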

  5. Measures of outdoor play and independent mobility in children and youth: A methodological review.

    PubMed

    Bates, Bree; Stone, Michelle R

    2015-09-01

    Declines in children's outdoor play have been documented globally, which are partly due to heightened restrictions around children's independent mobility. Literature on outdoor play and children's independent mobility is increasing, yet no paper has summarized the various methodological approaches used. A methodological review could highlight most commonly used measures and comprehensive research designs that could result in more standardized methodological approaches. Methodological review. A standardized protocol guided a methodological review of published research on measures of outdoor play and children's independent mobility in children and youth (0-18 years). Online searches of 8 electronic databases were conducted and studies included if they contained a subjective/objective measure of outdoor play or children's independent mobility. References of included articles were scanned to identify additional articles. Twenty-four studies were included on outdoor play, and twenty-three on children's independent mobility. Study designs were diverse. Common objective measures included accelerometry, global positioning systems and direct observation; questionnaires, surveys and interviews were common subjective measures. Focus groups, activity logs, monitoring sheets, travel/activity diaries, behavioral maps and guided tours were also utilized. Questionnaires were used most frequently, yet few studies used the same questionnaire. Five studies employed comprehensive, mixed-methods designs. Outdoor play and children's independent mobility have been measured using a wide variety of techniques, with only a few studies using similar methodologies. A standardized methodological approach does not exist. Future researchers should consider including both objective measures (accelerometry and global positioning systems) and subjective measures (questionnaires, activity logs, interviews), as more comprehensive designs will enhance understanding of each multidimensional construct

  6. Application of MetaRail railway noise measurement methodology: comparison of three track systems

    NASA Astrophysics Data System (ADS)

    Kalivoda, M.; Kudrna, M.; Presle, G.

    2003-10-01

    Within the fourth RTD Framework Programme, the European Union supported a research project dealing with the improvement of railway noise (emission) measurement methodologies. This project, called MetaRail, proposed a number of procedures and methods to decrease systematic measurement errors and to increase reproducibility. In 1999 the Austrian Federal Railways installed 1000 m of test track to explore the long-term behaviour of three different ballast track systems. This test included track stability, rail forces and ballast forces, as well as vibration transmission and noise emission. The noise study was carried out using the experience and methods developed within MetaRail. This includes rail roughness measurements as well as measurements of vertical railhead, sleeper and ballast vibration in parallel with the noise emission measurement with a single microphone at a distance of 7.5 m from the track. Using a test train with block- and disc-braked vehicles helped to control operational conditions and indicated the influence of different wheel roughnesses. It has been shown that the parallel recording of several vibration signals together with the noise signal makes it possible to evaluate the contributions of car body, sleeper, track and wheel sources to the overall noise emission. It must be stressed that this method is not as sharply focused as a microphone array; however, it is far easier to apply and thus cheaper. Within this study, noise emission was allocated to the different elements to answer questions such as whether the sleeper eigenfrequency is transmitted into the rail.

  7. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    The dissertation comprises three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. The two theories are both applied to Force Concept Inventory data obtained from students enrolled at The Ohio State University. Effort was made to examine the similarity and difference between the two theories, and the possible explanation for the difference. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on measures of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity. The current popular measures of association fail under some extremely unbalanced conditions. However, the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages, and special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed. Typical misunderstandings and misusage of EFA are explored. The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills. …
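
    The failure of association measures under extreme imbalance, mentioned for binary data, is easy to demonstrate with the phi coefficient (Pearson's correlation for two binary items) computed from a 2×2 table; the counts below are invented.

    ```python
    from math import sqrt

    def phi_coefficient(table):
        """Pearson correlation of two binary items from their 2x2 contingency table."""
        (a, b), (c, d) = table
        return (a * d - b * c) / sqrt((a + b) * (c + d) * (a + c) * (b + d))

    print(phi_coefficient([[50, 5], [5, 40]]))  # balanced table: strong association
    print(phi_coefficient([[98, 1], [1, 0]]))   # extreme imbalance: phi collapses
    ```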

  8. Responsive Surface Methodology Optimizes Extraction Conditions of Industrial by-products, Camellia japonica Seed Cake

    PubMed Central

    Kim, Jae Kyeom; Lim, Ho-Jeong; Kim, Mi-So; Choi, Soo Jung; Kim, Mi-Jeong; Kim, Cho Rong; Shin, Dong-Hoon; Shin, Eui-Cheol

    2016-01-01

    Background: The central nervous system is easily damaged by oxidative stress due to high oxygen consumption and poor defensive capacity. Hence, multiple studies have demonstrated that inhibiting oxidative stress-induced damage, through an antioxidant-rich diet, might be a reasonable approach to prevent neurodegenerative disease. Objective: In the present study, response surface methodology was utilized to optimize the extraction of neuro-protective constituents of Camellia japonica byproducts. Materials and Methods: Rat pheochromocytoma cells were used to evaluate the protective potential of Camellia japonica byproducts. Results: Optimum conditions were 33.84 min, 75.24%, and 75.82°C for time, ethanol concentration and temperature. Further, we demonstrated that major organic acid contents were significantly impacted by the extraction conditions, which may explain the varying magnitude of protective potential between fractions. Conclusions: Given the paucity of information regarding defatted C. japonica seed cake and its health-promoting potential, our results herein provide interesting preliminary data for utilization of this byproduct from oil processing in both academic and industrial applications. SUMMARY: Neuro-protective potential of C. japonica seed cake on cell viability was affected by extraction conditions. Extraction conditions effectively influenced the active constituents of C. japonica seed cake. Biological activity of C. japonica seed cake was optimized by response surface methodology. Abbreviations used: GC-MS: Gas chromatography-mass spectrometer; MTT: 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide; PC12 cells: Pheochromocytoma; RSM: Response surface methodology. PMID:27601847

  9. [Pharmacological treatment conciliation methodology in patients with multiple conditions].

    PubMed

    Alfaro-Lara, Eva Rocío; Vega-Coca, María Dolores; Galván-Banqueri, Mercedes; Nieto-Martín, María Dolores; Pérez-Guerrero, Concepción; Santos-Ramos, Bernardo

    2014-02-01

    To carry out a bibliographic review in order to identify the different methodologies used along the reconciliation process of drug therapy applicable to polypathological patients. We performed a literature review. Data sources: the bibliographic review (February 2012) included the following databases: PubMed, EMBASE, CINAHL, PsycINFO and the Spanish Medical Index (IME). The different methodologies identified in those databases to measure the reconciliation process in polypathological patients, or otherwise elderly or polymedicated patients, were studied. Study selection: two hundred and seventy-three articles were retrieved, of which 25 were selected. Data extraction: specifically, the level of care, the sources of information, the use of registration forms, the established time, the medical professional in charge, and the registered variables such as reconciliation errors. Most studies performed reconciliation when the patient was admitted to the hospital and after the patient's discharge. The main sources of information to be highlighted are the interview and the medical history of the patient. An established time is not explicitly stated in most of them, nor is a registration form used. The main professional in charge is the clinical pharmacologist. Apart from home medication, habits of self-medication and phytotherapy are also identified. Common reconciliation errors vary from the omission of drugs to different drug-drug interactions. There is large heterogeneity in the methodologies used for reconciliation. No work has been done on the specific figure of the polypathological patient, who precisely requires a standardized methodology due to their complexity and susceptibility to reconciliation errors. Copyright © 2012 Elsevier España, S.L. All rights reserved.

  10. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    PubMed

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  11. Causality analysis in business performance measurement system using system dynamics methodology

    NASA Astrophysics Data System (ADS)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, the Granger causality test, on the 45 data points. However, the insufficiency of well-established causality models was found, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations of insufficient historical data. The Delphi method was selected and conducted to obtain consensus on the existence of causality among the 15 selected experts by utilizing 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. The existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer modeling and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, and its robustness and validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using SD methodology, where very limited work has been done.
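
    The econometric step described, testing Granger causality between pairs of scorecard measures, can be run with statsmodels; the series below are synthetic stand-ins for the study's 45 data points.

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(2)
    satisfaction = rng.normal(size=60).cumsum()                 # hypothetical driver
    revenue = np.roll(satisfaction, 2) + rng.normal(0, 1, 60)   # lags driver by 2 steps

    # Tests whether the second column Granger-causes the first, at lags 1..3
    data = np.column_stack([revenue, satisfaction])
    results = grangercausalitytests(data, maxlag=3)
    ```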

  12. Comparison of Methodologies of Activation Barrier Measurements for Reactions with Deactivation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Zhenhua; Yan, Binhang; Zhang, Li

    In this work, methodologies of activation barrier measurements for reactions with deactivation were theoretically analyzed. Reforming of ethane with CO2 was introduced as an example of a reaction with deactivation to experimentally evaluate these methodologies. Both the theoretical and experimental results showed that, due to catalyst deactivation, the conventional method would inevitably lead to a much lower activation barrier, compared to the intrinsic value, even though heat and mass transport limitations were excluded. In this work, an optimal method was identified in order to provide a reliable and efficient activation barrier measurement for reactions with deactivation.

  13. Comparison of Methodologies of Activation Barrier Measurements for Reactions with Deactivation

    DOE PAGES

    Xie, Zhenhua; Yan, Binhang; Zhang, Li; ...

    2017-01-25

    In this work, methodologies of activation barrier measurements for reactions with deactivation were theoretically analyzed. Reforming of ethane with CO2 was introduced as an example of a reaction with deactivation to experimentally evaluate these methodologies. Both the theoretical and experimental results showed that, due to catalyst deactivation, the conventional method would inevitably lead to a much lower activation barrier, compared to the intrinsic value, even though heat and mass transport limitations were excluded. In this work, an optimal method was identified in order to provide a reliable and efficient activation barrier measurement for reactions with deactivation.
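
    The bias the authors analyze can be reproduced in a few lines: fitting the Arrhenius relation ln k = ln A - Ea/(RT) to rates measured while the catalyst progressively deactivates yields an apparent barrier below the intrinsic one. All values below are invented for illustration.

    ```python
    import numpy as np

    R = 8.314  # J/(mol K)

    def fit_arrhenius(temps_k, rates):
        """Fit ln k = ln A - Ea/(R T); return apparent Ea in kJ/mol."""
        slope, _ = np.polyfit(1.0 / np.asarray(temps_k), np.log(rates), 1)
        return -slope * R / 1000.0

    temps = np.array([823.0, 848.0, 873.0, 898.0])
    k_true = 1e6 * np.exp(-120e3 / (R * temps))   # intrinsic Ea = 120 kJ/mol
    activity = np.array([1.0, 0.9, 0.8, 0.7])     # deactivation during the sequence

    print(fit_arrhenius(temps, k_true))             # ~120 kJ/mol (intrinsic)
    print(fit_arrhenius(temps, k_true * activity))  # ~91 kJ/mol, biased low
    ```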

  14. Geometric scaling of artificial hair sensors for flow measurement under different conditions

    NASA Astrophysics Data System (ADS)

    Su, Weihua; Reich, Gregory W.

    2017-03-01

    Artificial hair sensors (AHSs) have been developed for prediction of the local flow speed and aerodynamic force around an airfoil and subsequent application in vibration control of the airfoil. Usually, a specific sensor design is only sensitive to the flow speeds within its operating flow measurement region. This paper aims to expand this flow measurement concept of using AHSs to different flow speed conditions by properly sizing the parameters of the sensors, including the dimensions of the artificial hair, capillary, and carbon nanotubes (CNTs) that make up the sensor design, based on a baseline sensor design and its working flow condition. In doing so, the glass fiber hair is modeled as a cantilever beam with an elastic foundation, subject to the distributed aerodynamic drag over the length of the hair. Hair length and diameter, capillary depth, and CNT height are scaled by keeping the maximum compressive strain of the CNTs constant for different sensors under different speed conditions. Numerical studies demonstrate the feasibility of the geometric scaling methodology by designing AHSs for aircraft with different dimensions and flight conditions, starting from the same baseline sensor. Finally, the operating bandwidth of the scaled sensors is explored.

  15. Don't fear 'fear conditioning': Methodological considerations for the design and analysis of studies on human fear acquisition, extinction, and return of fear.

    PubMed

    Lonsdorf, Tina B; Menz, Mareike M; Andreatta, Marta; Fullana, Miguel A; Golkar, Armita; Haaker, Jan; Heitland, Ivo; Hermann, Andrea; Kuhn, Manuel; Kruse, Onno; Meir Drexler, Shira; Meulders, Ann; Nees, Frauke; Pittig, Andre; Richter, Jan; Römer, Sonja; Shiban, Youssef; Schmitz, Anja; Straube, Benjamin; Vervliet, Bram; Wendt, Julia; Baas, Johanna M P; Merz, Christian J

    2017-06-01

    The so-called 'replicability crisis' has sparked methodological discussions in many areas of science in general, and in psychology in particular. This has led to recent endeavours to promote the transparency, rigour, and ultimately, replicability of research. Originating from this zeitgeist, the challenge to discuss critical issues on terminology, design, methods, and analysis considerations in fear conditioning research is taken up by this work, which involved representatives from fourteen of the major human fear conditioning laboratories in Europe. This compendium is intended to provide a basis for the development of a common procedural and terminology framework for the field of human fear conditioning. Whenever possible, we give general recommendations. When this is not feasible, we provide evidence-based guidance for methodological decisions on study design, outcome measures, and analyses. Importantly, this work is also intended to raise awareness and initiate discussions on crucial questions with respect to data collection, processing, statistical analyses, the impact of subtle procedural changes, and data reporting specifically tailored to the research on fear conditioning. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Optimization of hydrolysis conditions for bovine plasma protein using response surface methodology.

    PubMed

    Seo, Hyun-Woo; Jung, Eun-Young; Go, Gwang-Woong; Kim, Gap-Don; Joo, Seon-Tea; Yang, Han-Sul

    2015-10-15

    The purpose of this study was to establish optimal conditions for the hydrolysis of bovine plasma protein. Response surface methodology was used to model and optimize responses [degree of hydrolysis (DH), 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical-scavenging activity and Fe(2+)-chelating activity]. Hydrolysis conditions, such as hydrolysis temperature (46.6-63.4 °C), hydrolysis time (98-502 min), and hydrolysis pH (6.32-9.68), were selected as the main processing conditions in the hydrolysis of bovine plasma protein. Optimal conditions for maximum DH (%), DPPH radical-scavenging activity (%) and Fe(2+)-chelating activity (%) of the hydrolyzed bovine plasma protein were established. We discovered the following conditions for optimal hydrolysis of bovine plasma: pH of 7.82-8.32, temperature of 54.1 °C, and time of 338.4-398.4 min. We consequently succeeded in hydrolyzing bovine plasma protein under these conditions and confirmed the various desirable properties of optimal hydrolysis. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Novel optoelectronic methodology for testing of MOEMS

    NASA Astrophysics Data System (ADS)

    Pryputniewicz, Ryszard J.; Furlong, Cosme

    2003-01-01

    Continued demands for delivery of high-performance micro-optoelectromechanical systems (MOEMS) place unprecedented requirements on the methods used in their development and operation. Metrology is a major and inseparable part of these methods, and optoelectronic methodology is an essential field of metrology. Due to its scalability, optoelectronic methodology is particularly suitable for testing of MOEMS, where measurements must be made with ever-increasing accuracy and precision. This was particularly evident during the last few years, characterized by miniaturization of devices, when requirements for measurements rapidly increased as the emerging technologies introduced new products, especially optical MEMS. In this paper, a novel optoelectronic methodology for testing of MOEMS is described and its applications are illustrated with representative examples. These examples demonstrate the capability to measure submicron deformations of various components of the micromirror device under operating conditions, and show the viability of the optoelectronic methodology for testing of MOEMS.

  18. Current methodological approaches in conditioned pain modulation assessment in pediatrics

    PubMed Central

    Hwang, Philippe S; Ma, My-Linh; Spiegelberg, Nora; Ferland, Catherine E

    2017-01-01

    Conditioned pain modulation (CPM) paradigms have been used in various studies with healthy and non-healthy adult populations in an attempt to elucidate the mechanisms of pain processing. However, only a few studies so far have applied CPM in pediatric populations. Studies finding associations with chronic pain conditions suggest that deficiencies in underlying descending pain pathways may play an important role in the development and persistence of pain early in life. Twelve studies that examine solely pediatric populations were identified using a PubMed search, and these are reviewed with regard to demographics studied, methodological approaches, and conclusions reached. This review aimed to provide both clinicians and researchers with a brief overview of the current state of research regarding the use of CPM in children and adolescents, both healthy individuals and clinical patients. The implications of CPM in experimental and clinical settings and its potential to aid in refining considerations to individualize treatment of pediatric pain syndromes will be discussed. PMID:29263694

  19. High-frequency measurements of aeolian saltation flux: Field-based methodology and applications

    NASA Astrophysics Data System (ADS)

    Martin, Raleigh L.; Kok, Jasper F.; Hugenholtz, Chris H.; Barchyn, Thomas E.; Chamecki, Marcelo; Ellis, Jean T.

    2018-02-01

    Aeolian transport of sand and dust is driven by turbulent winds that fluctuate over a broad range of temporal and spatial scales. However, commonly used aeolian transport models do not explicitly account for such fluctuations, likely contributing to substantial discrepancies between models and measurements. Underlying this problem is the absence of accurate sand flux measurements at the short time scales at which wind speed fluctuates. Here, we draw on extensive field measurements of aeolian saltation to develop a methodology for generating high-frequency (up to 25 Hz) time series of total (vertically-integrated) saltation flux, namely by calibrating high-frequency (HF) particle counts to low-frequency (LF) flux measurements. The methodology follows four steps: (1) fit exponential curves to vertical profiles of saltation flux from LF saltation traps, (2) determine empirical calibration factors through comparison of LF exponential fits to HF number counts over concurrent time intervals, (3) apply these calibration factors to subsamples of the saltation count time series to obtain HF height-specific saltation fluxes, and (4) aggregate the calibrated HF height-specific saltation fluxes into estimates of total saltation fluxes. When coupled to high-frequency measurements of wind velocity, this methodology offers new opportunities for understanding how aeolian saltation dynamics respond to variability in driving winds over time scales from tens of milliseconds to days.
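    The four calibration steps above map directly onto a short numerical routine. The sketch below is a minimal illustration of the published steps, not the authors' code; all trap heights, counter heights, count rates, and height bands are invented placeholders.

```python
# Minimal sketch of the four-step HF saltation-flux calibration described above.
import numpy as np
from scipy.optimize import curve_fit

# (1) Fit an exponential profile to low-frequency (LF) trap fluxes q(z)
z_traps = np.array([0.05, 0.10, 0.20, 0.35, 0.50])        # trap heights, m
q_traps = np.array([9.1, 6.0, 2.7, 1.0, 0.45])            # flux, g m^-2 s^-1
expfit = lambda z, q0, zq: q0 * np.exp(-z / zq)
(q0, zq), _ = curve_fit(expfit, z_traps, q_traps, p0=(10.0, 0.1))

# (2) Calibration factor per HF counter: LF-fit flux at the counter height
# divided by the counter's mean count rate over the concurrent interval
z_counters = np.array([0.07, 0.15, 0.30])                  # counter heights, m
mean_counts = np.array([220.0, 120.0, 35.0])               # counts per second
cal = expfit(z_counters, q0, zq) / mean_counts             # flux per count

# (3) Apply the factors to a high-frequency (25 Hz) count time series
hf_counts = np.random.poisson(lam=mean_counts / 25.0, size=(2500, 3))
hf_flux = hf_counts * 25.0 * cal                           # height-specific flux

# (4) Aggregate into total (vertically integrated) flux, weighting each
# counter by the height band it represents
dz = np.array([0.11, 0.115, 0.175])                        # height bands, m
total_flux = (hf_flux * dz).sum(axis=1)                    # g m^-1 s^-1 at 25 Hz
print(total_flux[:5])
```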

  20. Methodologies for measuring travelers' risk perception of infectious diseases: A systematic review.

    PubMed

    Sridhar, Shruti; Régner, Isabelle; Brouqui, Philippe; Gautret, Philippe

    2016-01-01

    Numerous studies in the past have stressed the importance of travelers' psychology and perception in the implementation of preventive measures. The aim of this systematic review was to identify the methodologies used in studies reporting on travelers' risk perception of infectious diseases. A systematic search for relevant literature was conducted according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. There were 39 studies identified. In 35 of 39 studies, the methodology used was that of a knowledge, attitude and practice (KAP) survey based on questionnaires. One study used a combination of questionnaires and a visual psychometric measuring instrument called the 'pictorial representation of illness and self-measurement' or PRISM. One study used a self-representation model (SRM) method. Two studies measured psychosocial factors. Valuable information was obtained from KAP surveys showing an overall lack of knowledge among travelers about the most frequent travel-associated infections and associated preventive measures. This methodological approach, however, is mainly descriptive, addressing knowledge, attitudes, and practices separately and lacking an examination of the interrelationships between these three components. Another limitation of the KAP method is underestimating psychosocial variables that have proved influential in health-related behaviors, including perceived benefits and costs of preventive measures, perceived social pressure, perceived personal control, unrealistic optimism and risk propensity. Future risk perception studies in travel medicine should consider psychosocial variables with inferential and multivariate statistical analyses. The use of implicit measurements of attitudes could also provide new insights in the field of travelers' risk perception of travel-associated infectious diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Select Methodology for Validating Advanced Satellite Measurement Systems

    NASA Technical Reports Server (NTRS)

    Larar, Allen M.; Zhou, Daniel K.; Liu, Xi; Smith, William L.

    2008-01-01

    Advanced satellite sensors are tasked with improving global measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring capability, and environmental change detection. Measurement system validation is crucial to achieving this goal and maximizing research and operational utility of resultant data. Field campaigns including satellite under-flights with well calibrated FTS sensors aboard high-altitude aircraft are an essential part of the validation task. This presentation focuses on an overview of validation methodology developed for assessment of high spectral resolution infrared systems, and includes results of preliminary studies performed to investigate the performance of the Infrared Atmospheric Sounding Interferometer (IASI) instrument aboard the MetOp-A satellite.

  2. Non-pharmacological sleep interventions for youth with chronic health conditions: a critical review of the methodological quality of the evidence.

    PubMed

    Brown, Cary A; Kuo, Melissa; Phillips, Leah; Berry, Robyn; Tan, Maria

    2013-07-01

    Restorative sleep is clearly linked with well-being in youth with chronic health conditions. This review addresses the methodological quality of non-pharmacological sleep intervention (NPSI) research for youth with chronic health conditions. The Guidelines for Critical Review (GCR) and the Effective Public Health Practice Project Quality Assessment Tool (EPHPP) were used in the review. The search yielded 31 behavioural and 10 non-behavioural NPSI for review. Most studies had fewer than 10 participants. Autism spectrum disorders, attention deficit/hyperactivity disorders, Down syndrome, intellectual disabilities, and visual impairments were the conditions that most studies focused upon. The global EPHPP scores indicated most reviewed studies were of weak quality. Only 7 studies were rated as moderate; none were strong. Studies rated as weak quality frequently had recruitment issues; non-blinded participants/parents and/or researchers; and used outcome measures without sound psychometric properties. Little conclusive evidence exists for NPSIs in this population. However, NPSIs are widely used and these preliminary studies demonstrate promising outcomes. There have not been any published reports of negative outcomes that would preclude application of the different NPSIs on a case-by-case basis guided by clinical judgement. These findings support the need for more rigorous, applied research. • Methodological Quality of Sleep Research • Disordered sleep (DS) in youth with chronic health conditions is pervasive and is important to rehabilitation therapists because DS contributes to significant functional problems across psychological, physical and emotional domains. • Rehabilitation therapists and other healthcare providers receive little education about disordered sleep and are largely unaware of the range of assessment and non-pharmacological intervention strategies that exist. An evidence-based website of pediatric sleep resources can be found at http

  3. A methodology for hard/soft information fusion in the condition monitoring of aircraft

    NASA Astrophysics Data System (ADS)

    Bernardo, Joseph T.

    2013-05-01

    Condition-based maintenance (CBM) refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. Traditionally, CBM involves equipping machinery with electronic sensors that continuously monitor components and collect data for analysis. The addition of the multisensory capability of human cognitive functions (i.e., sensemaking, problem detection, planning, adaptation, coordination, naturalistic decision making) to traditional CBM may create a fuller picture of machinery condition. Cognitive systems engineering techniques provide an opportunity to utilize a dynamic resource: people acting as soft sensors. The literature is extensive on techniques to fuse data from electronic sensors, but little work exists on fusing data from humans with that from electronic sensors (i.e., hard/soft fusion). The purpose of my research is to explore, observe, investigate, analyze, and evaluate the fusion of pilot and maintainer knowledge, experiences, and sensory perceptions with digital maintenance resources. Hard/soft information fusion has the potential to increase problem detection capability, improve flight safety, and increase mission readiness. This proposed project consists of the creation of a methodology that is based upon the Living Laboratories framework, a research methodology that is built upon cognitive engineering principles. This study performs a critical assessment of the concept, which will support development of activities to demonstrate hard/soft information fusion in operationally relevant scenarios of aircraft maintenance. It consists of fieldwork and knowledge elicitation to inform a simulation and a prototype.

  4. Phenomenological methodology for assessing the influence of flow conditions on the acoustic response of exhaust aftertreatment systems

    NASA Astrophysics Data System (ADS)

    Torregrosa, A. J.; Arnau, F. J.; Piqueras, P.; Sanchis, E. J.; Tartoussi, H.

    2017-05-01

    The increasing limits of standards on aerosol and gaseous emissions from internal combustion engines have led to the progressive inclusion of different exhaust aftertreatment systems (EATS) as a part of the powertrain. Regulated emissions are generally abated making use of devices based on monolithic structures with different chemical functions. As a side effect, wave transmission across the device is affected and so is the boundary at the exhaust line inlet, so that the design of the latter is in turn affected. While some models are available for the prediction of these effects, the geometrical complexity of many devices still makes it necessary in many cases to rely on experimental measurements, which cannot cover all the diversity of flow conditions under which these devices operate. To overcome this limitation, a phenomenological methodology is proposed in this work that allows for the sound extrapolation of experimental results to flow conditions different from those used in the measurements. The transfer matrix is obtained from tests in an impulse rig for different excitation amplitudes and mean flows. The experimental coefficients of the transmission matrix of the device are fitted to Fourier series. This allows treating the influence of the flow conditions on the acoustic response, which is manifested in changes in the characteristic periods, separately from the specific properties of every device. In order to provide predictive capabilities to the method, the Fourier series approach is coupled to a gas dynamics model able to account for the sensitivity of propagation velocity to variations in the flow conditions.
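    The fitting step described here, expressing measured transmission-matrix coefficients as truncated Fourier series, can be illustrated with a generic least-squares fit. The coefficient data, assumed period, and harmonic count below are placeholders; the actual method ties the characteristic periods to the flow conditions, which this simplified stand-in does not capture.

```python
# Generic least-squares fit of a measured transfer-matrix coefficient to a
# truncated Fourier series, as a stand-in for the fitting step described above.
import numpy as np

f = np.linspace(20, 1600, 200)                 # excitation frequencies, Hz
T11 = np.cos(2*np.pi*f/800) + 0.1*np.random.randn(f.size)  # synthetic coefficient

period = 1600.0                                # assumed characteristic period
n_harm = 5
cols = [np.ones_like(f)]
for k in range(1, n_harm + 1):
    cols += [np.cos(2*np.pi*k*f/period), np.sin(2*np.pi*k*f/period)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, T11, rcond=None)  # Fourier coefficients

T11_fit = A @ coef                              # smooth representation for reuse
```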

  5. [Feasibility of prismatic correction of microesotropia using the measuring and correcting methodology by H.-J. Haase].

    PubMed

    Kromeier, M; Kommerell, G

    2006-01-01

    The "Measuring and Correcting Methodology after H.-J. Haase" is based on the assumption that a minute deviation from the orthovergence position (fixation disparity) indicates a difficulty to overcome a larger "vergence angle of rest". Objective recordings have, however, revealed that the subjective tests applied in the "Measuring and Correcting Methodology after H.-J. Haase" can mislead to the assumption of a fixation disparity, although both eyes are aligned exactly to the fixation point. How do patients with an inconspicuously small, yet objectively verified strabismus react to the "Measuring and Correcting Methodology by H.-J. Haase"? Eight patients with a microesotropia between 0.5 and 3 degrees were subjected to the "Measuring and Correcting Methodology after H.-J. Haase. In all 8 patients, the prisms determined with the Cross-, Pointer- and Rectangle Tests increased the angle of squint, without reaching a full correction: the original angle prevailed. In the Stereobalance Test, prisms did not reduce the 100 % preponderance of the non-squinting eye. The stereoscopic threshold was between 36 and 1170 arcsec in 7 out of the 8 subjects, and above 4000 arcsec in 1 subject. (1) In all 8 patients, prisms determined with the "Measuring and Correcting Methodology by H.-J. Haase" increased the angle of strabismus, without reaching bifoveal vision. This uniform result suggests that primary microesotropia cannot be corrected with the "Measuring and Correcting Methodology after H.-J. Haase" (2) A lacking contribution of the strabismic eye to the recognition of a lateral offset between stereo objects, as determined with the Stereobalance Test, does not imply a lack of binocular stereopsis.

  6. Defining Multiple Chronic Conditions for Quality Measurement.

    PubMed

    Drye, Elizabeth E; Altaf, Faseeha K; Lipska, Kasia J; Spatz, Erica S; Montague, Julia A; Bao, Haikun; Parzynski, Craig S; Ross, Joseph S; Bernheim, Susannah M; Krumholz, Harlan M; Lin, Zhenqiu

    2018-02-01

    Patients with multiple chronic conditions (MCCs) are a critical but undefined group for quality measurement. We present a generally applicable systematic approach to defining an MCC cohort of Medicare fee-for-service beneficiaries that we developed for a national quality measure, risk-standardized rates of unplanned admissions for Accountable Care Organizations. To define the MCC cohort we: (1) identified potential chronic conditions; (2) set criteria for cohort conditions based on the MCC framework and the measure concept; (3) applied the criteria informed by empirical analysis, experts, and the public; (4) described "broader" and "narrower" cohorts; and (5) selected the final cohort with stakeholder input. Subjects were patients with chronic conditions. Participants included 21.8 million Medicare fee-for-service beneficiaries in 2012 aged 65 years and above with ≥1 of 27 Medicare Chronic Condition Warehouse condition(s). In total, 10 chronic conditions were identified based on our criteria; 8 of these 10 were associated with notably increased admission risk when co-occurring. A broader cohort (2+ of the 8 conditions) included 4.9 million beneficiaries (23% of total cohort) with an admission rate of 70 per 100 person-years. It captured 53% of total admissions. The narrower cohort (3+ conditions) had 2.2 million beneficiaries (10%) with 100 admissions per 100 person-years and captured 32% of admissions. Most stakeholders viewed the broader cohort as best aligned with the measure concept. By systematically narrowing chronic conditions to those most relevant to the outcome and incorporating stakeholder input, we defined an MCC admission measure cohort supported by stakeholders. This approach can be used as a model for other MCC outcome measures.
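    The cohort definition reduces to counting qualifying chronic conditions per beneficiary and thresholding. A minimal sketch follows; the condition names and flags are invented for illustration, since the abstract does not list the eight retained conditions.

```python
# Sketch of the cohort-selection logic: keep beneficiaries with at least
# N of the retained chronic conditions. Data and condition names are invented.
import pandas as pd

conditions = ["chf", "copd", "ckd", "diabetes", "dementia",
              "depression", "afib", "stroke"]           # hypothetical 8 conditions

df = pd.DataFrame({
    "bene_id":    [1, 2, 3, 4],
    "chf":        [1, 0, 1, 0],
    "copd":       [1, 1, 0, 0],
    "ckd":        [0, 1, 1, 0],
    "diabetes":   [0, 0, 1, 1],
    "dementia":   [0, 0, 0, 0],
    "depression": [0, 0, 1, 0],
    "afib":       [0, 0, 0, 0],
    "stroke":     [0, 0, 0, 0],
})

df["n_conditions"] = df[conditions].sum(axis=1)
broader = df[df["n_conditions"] >= 2]   # "broader" cohort (2+ conditions)
narrower = df[df["n_conditions"] >= 3]  # "narrower" cohort (3+ conditions)
print(len(broader), len(narrower))
```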

  7. The Role of Measuring in the Learning of Environmental Occupations and Some Aspects of Environmental Methodology

    ERIC Educational Resources Information Center

    István, Lüko

    2016-01-01

    Methodology is a neglected area of pedagogy, and within it the methodology of environmental specialist teaching is the least cultivated part. In this article I attempt to present the environmental methodology developed in one part of the University of West Environmental Education workshop, based on measurement-centred experiential learning as an…

  8. Quantum Measurement and Initial Conditions

    NASA Astrophysics Data System (ADS)

    Stoica, Ovidiu Cristinel

    2016-03-01

    Quantum measurement finds the observed system in a collapsed state, rather than in the state predicted by the Schrödinger equation. Yet there is a relatively widespread opinion that the wavefunction collapse can be explained by unitary evolution (for instance in the decoherence approach, if we take into account the environment). In this article, a mathematical result is proven that severely restricts the initial conditions for which measurements have definite outcomes, if pure unitary evolution is assumed. This no-go theorem remains true even if we take the environment into account. The result does not forbid a unitary description of the measurement process; it only shows that such a description is possible only for very restricted initial conditions. The existence of such restrictions of the initial conditions can be understood in the four-dimensional block universe perspective, as a requirement of global self-consistency of the solutions of the Schrödinger equation.

  9. Methodology of Blade Unsteady Pressure Measurement in the NASA Transonic Flutter Cascade

    NASA Technical Reports Server (NTRS)

    Lepicovsky, J.; McFarland, E. R.; Capece, V. R.; Jett, T. A.; Senyitko, R. G.

    2002-01-01

    In this report the methodology adopted to measure unsteady pressures on blade surfaces in the NASA Transonic Flutter Cascade under conditions of simulated blade flutter is described. The previous work done in this cascade reported that the oscillating cascade produced waves, which for some interblade phase angles reflected off the wind tunnel walls back into the cascade, interfered with the cascade unsteady aerodynamics, and contaminated the acquired data. To alleviate the problems with data contamination due to the back wall interference, a method of influence coefficients was selected for the future unsteady work in this cascade. In this approach only one blade in the cascade is oscillated at a time. The majority of the report is concerned with the experimental technique used and the experimental data generated in the facility. The report presents a list of all test conditions for the small amplitude of blade oscillations, and shows examples of some of the results achieved. The report does not discuss data analysis procedures like ensemble averaging, frequency analysis, and unsteady blade loading diagrams reconstructed using the influence coefficient method. Finally, the report presents the lessons learned from this phase of the experimental effort, and suggests the improvements and directions of the experimental work for tests to be carried out for large oscillation amplitudes.

  10. Thermotactile perception thresholds measurement conditions.

    PubMed

    Maeda, Setsuo; Sakakibara, Hisataka

    2002-10-01

    The purpose of this paper is to investigate the effects of posture, push force and rate of temperature change on thermotactile thresholds and to clarify suitable measuring conditions for Japanese people. Thermotactile (warm and cold) thresholds on the right middle finger were measured with an HVLab thermal aesthesiometer. Subjects were eight healthy male Japanese students. The effects of posture in measurement were examined in the posture of a straight hand and forearm placed on a support, the same posture without a support, and the fingers and hand flexed at the wrist with the elbow placed on a desk. The finger push force applied to the applicator of the thermal aesthesiometer was controlled at 0.5, 1.0, 2.0 and 3.0 N. The applicator temperature was changed at rates of 0.5, 1.0, 1.5, 2.0 and 2.5 degrees C/s. After each measurement, subjects were asked about comfort under the measuring conditions. Three series of experiments were conducted on different days to evaluate repeatability. Repeated measures ANOVA showed that warm thresholds were affected by the push force and the rate of temperature change and that cold thresholds were influenced by posture and push force. The comfort assessment indicated that the measurement posture of a straight hand and forearm laid on a support was the most comfortable for the subjects. Relatively high repeatability was obtained under measurement conditions of a 1.0 degrees C/s temperature change rate and a 0.5 N push force. Measurement posture, push force and rate of temperature change can affect the thermal threshold. Judging from the repeatability, a push force of 0.5 N and a temperature change rate of 1.0 degrees C/s in the posture with the straight hand and forearm laid on a support are recommended for warm and cold threshold measurements.

  11. Optimization of Extraction Conditions for Phenolic Acids from the Leaves of Melissa officinalis L. Using Response Surface Methodology

    PubMed Central

    Yoo, Guijae; Lee, Il Kyun; Park, Seonju; Kim, Nanyoung; Park, Jun Hyung; Kim, Seung Hyun

    2018-01-01

    Background: Melissa officinalis L. is a well-known medicinal plant from the family Lamiaceae, which is distributed throughout the Eastern Mediterranean region and Western Asia. Objective: In this study, response surface methodology (RSM) was utilized to optimize the extraction conditions for bioactive compounds from the leaves of M. officinalis L. Materials and Methods: A Box–Behnken design (BBD) was utilized to evaluate the effects of three independent variables, namely extraction temperature (°C), methanol concentration (%), and solvent-to-material ratio (mL/g) on the responses of the contents of caffeic acid and rosmarinic acid. Results: Regression analysis showed a good fit of the experimental data. The optimal condition was obtained at extraction temperature 80.53°C, methanol concentration 29.89%, and solvent-to-material ratio 30 mL/g. Conclusion: These results indicate the suitability of the model employed and the successful application of RSM in optimizing the extraction conditions. This study may be useful for standardizing production quality, including improving the efficiency of large-scale extraction systems. SUMMARY The optimum conditions for the extraction of major phenolic acids from the leaves of Melissa officinalis L. were determined using response surface methodology. A Box–Behnken design was utilized to evaluate the effects of three independent variables. A quadratic polynomial model provided a satisfactory description of the experimental data. The optimized condition for simultaneous maximum contents of caffeic acid and rosmarinic acid was determined. Abbreviations used: RSM: Response surface methodology, BBD: Box–Behnken design, CA: Caffeic acid, RA: Rosmarinic acid, HPLC: High-performance liquid chromatography. PMID:29720824

  12. New methodology for adjusting rotating shadowband irradiometer measurements

    NASA Astrophysics Data System (ADS)

    Vignola, Frank; Peterson, Josh; Wilbert, Stefan; Blanc, Philippe; Geuder, Norbert; Kern, Chris

    2017-06-01

    A new method is developed for correcting systematic errors found in rotating shadowband irradiometer (RSI) measurements. Since the responsivity of photodiode-based pyranometers typically utilized for RSI sensors is dependent upon the wavelength of the incident radiation, and the spectral distribution of the incident radiation is different for the Direct Normal Irradiance and the Diffuse Horizontal Irradiance, spectral effects have to be considered. These cause the most problematic errors when applying currently available correction functions to RSI measurements. Hence, direct normal and diffuse contributions are analyzed and modeled separately. An additional advantage of this methodology is that it provides a prescription for how to modify the adjustment algorithms to locations with different atmospheric characteristics from the location where the calibration and adjustment algorithms were developed. A summary of results and areas for future efforts are then discussed.
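    The core idea, correcting the direct and diffuse components separately before recombining them, can be sketched as follows. The correction functions, coefficients, and readings below are invented placeholders; only the component-wise structure and the standard recombination GHI = DHI + DNI·cos(θz) are assumed.

```python
# Sketch of component-wise adjustment: apply separate (hypothetical) correction
# functions to the direct and diffuse signals, then recombine.
import numpy as np

def corr_direct(dni_raw, airmass):
    # hypothetical spectral correction for the direct component
    return dni_raw * (1.00 + 0.02 * (airmass - 1.5))

def corr_diffuse(dhi_raw):
    # hypothetical spectral correction for the diffuse component
    return dhi_raw * 0.97

dni_raw, dhi_raw = 820.0, 95.0      # W/m^2, uncorrected RSI readings
sza = np.deg2rad(35.0)              # solar zenith angle
airmass = 1.0 / np.cos(sza)

dni = corr_direct(dni_raw, airmass)
dhi = corr_diffuse(dhi_raw)
ghi = dhi + dni * np.cos(sza)       # recombined global horizontal irradiance
print(round(dni, 1), round(dhi, 1), round(ghi, 1))
```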

  13. Methodology of modeling and measuring computer architectures for plasma simulations

    NASA Technical Reports Server (NTRS)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.

  14. A manufacturing error measurement methodology for a rotary vector reducer cycloidal gear based on a gear measuring center

    NASA Astrophysics Data System (ADS)

    Li, Tianxing; Zhou, Junxiang; Deng, Xiaozhong; Li, Jubo; Xing, Chunrong; Su, Jianxin; Wang, Huiliang

    2018-07-01

    The manufacturing error of a cycloidal gear is the key factor affecting the transmission accuracy of a robot rotary vector (RV) reducer. A methodology is proposed to realize the digitized measurement and data processing of the cycloidal gear manufacturing error based on the gear measuring center, which can quickly and accurately measure and evaluate the manufacturing error of the cycloidal gear by using both whole tooth profile measurement and single tooth profile measurement. By analyzing the particularity of the cycloidal profile and its effect on the actual meshing characteristics of the RV transmission, the cycloid profile measurement strategy is planned, and the theoretical profile model and error measurement model of cycloid-pin gear transmission are established. Through digital processing technology, the theoretical trajectory of the probe and the normal vector of the measured point are calculated. By means of precision measurement principles and error compensation theory, a mathematical model for the accurate calculation and data processing of the manufacturing error is constructed, and the actual manufacturing error of the cycloidal gear is obtained by an iterative optimization solution. Finally, the measurement experiment of the cycloidal gear tooth profile is carried out on the gear measuring center and the HEXAGON coordinate measuring machine, respectively. The measurement results verify the correctness and validity of the measurement theory and method. This methodology will provide the basis for the accurate evaluation and the effective control of manufacturing precision of the cycloidal gear in a robot RV reducer.

  15. Does Methodological Guidance Produce Consistency? A Review of Methodological Consistency in Breast Cancer Utility Value Measurement in NICE Single Technology Appraisals.

    PubMed

    Rose, Micah; Rice, Stephen; Craig, Dawn

    2018-06-01

    Since 2004, National Institute for Health and Care Excellence (NICE) methodological guidance for technology appraisals has emphasised a strong preference for using the validated EuroQol 5-Dimensions (EQ-5D) quality-of-life instrument, measuring patient health status from patients or carers, and using the general public's preference-based valuation of different health states when assessing health benefits in economic evaluations. The aim of this study was to review all NICE single technology appraisals (STAs) for breast cancer treatments to explore consistency in the use of utility scores in light of NICE methodological guidance. A review of all published breast cancer STAs was undertaken using all publicly available STA documents for each included assessment. Utility scores were assessed for consistency with NICE-preferred methods and original data sources. Furthermore, academic assessment group work undertaken during the STA process was examined to evaluate the emphasis of NICE-preferred quality-of-life measurement methods. Twelve breast cancer STAs were identified, and many STAs used evidence that did not follow NICE's preferred utility score measurement methods. Recent STA submissions show companies using EQ-5D and mapping. Academic assessment groups rarely emphasized NICE-preferred methods, and queries about preferred methods were rare. While there appears to be a trend in recent STA submissions towards following NICE methodological guidance, historically STA guidance in breast cancer has generally not used NICE's preferred methods. Future STAs in breast cancer and reviews of older guidance should ensure that utility measurement methods are consistent with the NICE reference case to help produce consistent, equitable decision making.

  16. [Factors conditioning primary care services utilization. Empirical evidence and methodological inconsistencies].

    PubMed

    Sáez, M

    2003-01-01

    In Spain, the degree and characteristics of primary care services utilization have been the subject of analysis since at least the 1980s. One of the main reasons for this interest is to assess the extent to which utilization matches primary care needs. In fact, the provision of an adequate health service for those who most need it is a generally accepted priority. The evidence shows that individual characteristics, mainly health status, are the factors most closely related to primary care utilization. Other personal characteristics, such as gender and age, could act as modulators of health care need. Some family and/or cultural variables, as well as factors related to the health care professional and institutions, could explain some of the observed variability in primary care services utilization. Socioeconomic variables, such as income, reveal a paradox. From an aggregate perspective, income is the main determinant of utilization as well as of health care expenditure. When data are analyzed for individuals, however, income is not related to primary health utilization. The situation is controversial, with methodological implications and, above all, consequences for the assessment of the efficiency in primary care utilization. Review of the literature reveals certain methodological inconsistencies that could at least partly explain the disparity of the empirical results. Among others, the following flaws can be highlighted: design problems, measurement errors, misspecification, and misleading statistical methods. Some solutions, among others, are quasi-experiments, the use of large administrative databases and of primary data sources (design problems); differentiation between types of utilization and between units of analysis other than consultations, and correction of measurement errors in the explanatory variables (measurement errors); consideration of relevant explanatory variables (misspecification); and the use of multilevel models (statistical methods).

  17. Characterization of gloss properties of differently treated polymer coating surfaces by surface clarity measurement methodology.

    PubMed

    Gruber, Dieter P; Buder-Stroisznigg, Michael; Wallner, Gernot; Strauß, Bernhard; Jandel, Lothar; Lang, Reinhold W

    2012-07-10

    With one measurement configuration, existing gloss measurement methodologies are generally restricted to specific gloss levels. A newly developed image-analytical gloss parameter called "clarity" provides the possibility to describe the perceptual result of a broad range of different gloss levels with one setup. In order to analyze and finally monitor the perceived gloss of products, a fast and flexible method, suitable also for automated inspection, is in high demand. The clarity parameter is very fast to calculate and therefore usable for fast in-line surface inspection. Coated metal specimens were deformed by varying degrees and polished afterwards in order to study the clarity parameter regarding the quantification of varying surface gloss types and levels. In order to analyze the correlation with human gloss perception, a study was carried out in which experts were asked to assess gloss properties of a series of surface samples under standardized conditions. The study confirmed that clarity exhibits considerably better correlation with human perception than alternative gloss parameters.

  18. Dissociating contingency awareness and conditioned attitudes: evidence of contingency-unaware evaluative conditioning.

    PubMed

    Hütter, Mandy; Sweldens, Steven; Stahl, Christoph; Unkelbach, Christian; Klauer, Karl Christoph

    2012-08-01

    Whether human evaluative conditioning can occur without contingency awareness has been the subject of an intense and ongoing debate for decades, troubled by a wide array of methodological difficulties. Following recent methodological innovations, the available evidence currently points to the conclusion that evaluative conditioning effects do not occur without contingency awareness. In a simulation, we demonstrate, however, that these innovations are strongly biased toward the conclusion that evaluative conditioning requires contingency awareness, confounding the measurement of contingency memory with conditioned attitudes. We adopt a process-dissociation procedure to separate the memory and attitude components. In 4 studies, the attitude parameter is validated using existing attitudes and applied to probe for contingency-unaware evaluative conditioning. A fifth experiment incorporates a time-delay manipulation confirming the dissociability of the attitude and memory components. The results indicate that evaluative conditioning can produce attitudes without conscious awareness of the contingencies. Implications for theories of evaluative conditioning and associative learning are discussed. (PsycINFO Database Record (c) 2012 APA, all rights reserved).

  19. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews.

    PubMed

    Shea, Beverley J; Grimshaw, Jeremy M; Wells, George A; Boers, Maarten; Andersson, Neil; Hamel, Candyce; Porter, Ashley C; Tugwell, Peter; Moher, David; Bouter, Lex M

    2007-02-15

    Our objective was to develop an instrument to assess the methodological quality of systematic reviews, building upon previous tools, empirical evidence and expert consensus. A 37-item assessment tool was formed by combining 1) the enhanced Overview Quality Assessment Questionnaire (OQAQ), 2) a checklist created by Sacks, and 3) three additional items recently judged to be of methodological importance. This tool was applied to 99 paper-based and 52 electronic systematic reviews. Exploratory factor analysis was used to identify underlying components. The results were considered by methodological experts using a nominal group technique aimed at item reduction and design of an assessment tool with face and content validity. The factor analysis identified 11 components. From each component, one item was selected by the nominal group. The resulting instrument was judged to have face and content validity. A measurement tool for the 'assessment of multiple systematic reviews' (AMSTAR) was developed. The tool consists of 11 items and has good face and content validity for measuring the methodological quality of systematic reviews. Additional studies are needed with a focus on the reproducibility and construct validity of AMSTAR, before strong recommendations can be made on its use.

  20. A system and methodology for measuring volatile organic compounds produced by hydroponic lettuce in a controlled environment

    NASA Technical Reports Server (NTRS)

    Charron, C. S.; Cantliffe, D. J.; Wheeler, R. M.; Manukian, A.; Heath, R. R.

    1996-01-01

    A system and methodology were developed for the nondestructive qualitative and quantitative analysis of volatile emissions from hydroponically grown 'Waldmann's Green' leaf lettuce (Lactuca sativa L.). Photosynthetic photon flux (PPF), photoperiod, and temperature were automatically controlled and monitored in a growth chamber modified for the collection of plant volatiles. The lipoxygenase pathway products (Z)-3-hexenal, (Z)-3-hexenol, and (Z)-3-hexenyl acetate were emitted by lettuce plants after the transition from the light period to the dark period. The volatile collection system developed in this study enabled measurements of volatiles emitted by intact plants, from planting to harvest, under controlled environmental conditions.

  1. Telemetric measurement system of beehive environment conditions

    NASA Astrophysics Data System (ADS)

    Walendziuk, Wojciech; Sawicki, Aleksander

    2014-11-01

    This work presents a measurement system for beehive environmental conditions. The purpose of the device is to perform measurements of parameters such as ambient temperature, atmospheric pressure, internal temperature, humidity and sound level. The measured values were transferred to a MySQL database, located on an external server, using the GPRS protocol. A website presents the measurement data in the form of tables and graphs. The study also shows exemplary results of environmental condition measurements recorded in the beehive in an hourly cycle.
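    The logging chain described, sensor readings accumulated into a SQL database that feeds a website, can be sketched in a few lines. In the sketch below, sqlite3 stands in for the remote MySQL server and a random-number stub for the transducers so that the example stays self-contained; the table and column names are invented.

```python
# Minimal sketch of the logging path: read sensors, insert rows into a SQL
# table, query them back for the web charts. sqlite3 is a stand-in for MySQL.
import sqlite3, time, random

db = sqlite3.connect("beehive.db")
db.execute("""CREATE TABLE IF NOT EXISTS readings (
    ts REAL, ambient_c REAL, internal_c REAL,
    pressure_hpa REAL, humidity_pct REAL, sound_db REAL)""")

def read_sensors():
    # placeholder for the real transducer interface
    return (time.time(), random.gauss(18, 2), random.gauss(34, 1),
            random.gauss(1013, 3), random.gauss(60, 5), random.gauss(45, 4))

for _ in range(3):                       # hourly cycle in the real system
    db.execute("INSERT INTO readings VALUES (?,?,?,?,?,?)", read_sensors())
db.commit()

for row in db.execute("SELECT * FROM readings ORDER BY ts"):
    print(row)
```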

  2. Optimization of hull-less pumpkin seed roasting conditions using response surface methodology.

    PubMed

    Vujasinović, Vesna; Radočaj, Olga; Dimić, Etelka

    2012-05-01

    Response surface methodology (RSM) was applied to optimize hull-less pumpkin seed roasting conditions before seed pressing to maximize the biochemical composition and antioxidant capacity of the virgin pumpkin oils obtained using a hydraulic press. Hull-less pumpkin seeds were roasted for various lengths of time (30 to 70 min) at various roasting temperatures (90 to 130 °C), resulting in 9 different oil samples, while the responses were phospholipids content, total phenols content, α- and γ-tocopherols, and antioxidative activity [by 2,2-diphenyl-1-picrylhydrazyl (DPPH) free-radical assay]. Mathematical models have shown that roasting conditions influenced all dependent variables at P < 0.05. The higher roasting temperatures had a significant effect (P < 0.05) on phospholipids, phenols, and α-tocopherols contents, while longer roasting time had a significant effect (P < 0.05) on γ-tocopherol content and antioxidant capacity, among the samples prepared under different roasting conditions. The optimum conditions for roasting the hull-less pumpkin seeds were 120 °C for duration of 49 min, which resulted in these oil concentrations: phospholipids 0.29%, total phenols 23.06 mg/kg, α-tocopherol 5.74 mg/100 g, γ-tocopherol 24.41 mg/100 g, and an antioxidative activity (EC(50)) of 27.18 mg oil/mg DPPH. © 2012 Institute of Food Technologists®

  3. A methodology for using borehole temperature-depth profiles under ambient, single and cross-borehole pumping conditions to estimate fracture hydraulic properties

    NASA Astrophysics Data System (ADS)

    Klepikova, Maria V.; Le Borgne, Tanguy; Bour, Olivier; Davy, Philippe

    2011-09-01

    Temperature profiles in the subsurface are known to be sensitive to groundwater flow. Here we show that they are also strongly related to vertical flow in the boreholes themselves. Based on a numerical model of flow and heat transfer at the borehole scale, we propose a method to invert temperature measurements to derive borehole flow velocities. This method is applied to an experimental site in fractured crystalline rocks. Vertical flow velocities deduced from the inversion of temperature measurements are compared with direct heat-pulse flowmeter measurements, showing good agreement over two orders of magnitude. Applying this methodology under ambient, single and cross-borehole pumping conditions allows us to estimate fracture hydraulic head and local transmissivity, as well as inter-borehole fracture connectivity. Thus, these results provide new insights on how to include temperature profiles in inverse problems for estimating hydraulic fracture properties.

  4. [Basic questionnaire and methodological criteria for Surveys on Working Conditions, Employment, and Health in Latin America and the Caribbean].

    PubMed

    Benavides, Fernando G; Merino-Salazar, Pamela; Cornelio, Cecilia; Assunção, Ada Avila; Agudelo-Suárez, Andrés A; Amable, Marcelo; Artazcoz, Lucía; Astete, Jonh; Barraza, Douglas; Berhó, Fabián; Milián, Lino Carmenate; Delclòs, George; Funcasta, Lorena; Gerke, Johanna; Gimeno, David; Itatí-Iñiguez, María José; Lima, Eduardo de Paula; Martínez-Iñigo, David; Medeiros, Adriane Mesquita de; Orta, Lida; Pinilla, Javier; Rodrigo, Fernando; Rojas, Marianela; Sabastizagal, Iselle; Vallebuona, Clelia; Vermeylen, Greet; Villalobos, Gloria H; Vives, Alejandra

    2016-10-10

    This article aimed to present a basic questionnaire and minimum methodological criteria for consideration in future Surveys on Working Conditions, Employment, and Health in Latin America and the Caribbean. A virtual and face-to-face consensus process was conducted with participation by a group of international experts who used the surveys available up until 2013 as the point of departure for defining the proposal. The final questionnaire included 77 questions grouped in six dimensions: socio-demographic characteristics of workers and companies; employment conditions; working conditions; health status; resources and preventive activities; and family characteristics. The minimum methodological criteria feature the interviewee's home as the place for the interview and aspects related to the quality of the fieldwork. These results can help improve the comparability of future surveys in Latin America and the Caribbean, which would in turn help improve information on workers' health in the region.

  5. Experimental Methodology for Measuring Combustion and Injection-Coupled Responses

    NASA Technical Reports Server (NTRS)

    Cavitt, Ryan C.; Frederick, Robert A.; Bazarov, Vladimir G.

    2006-01-01

    A Russian scaling methodology for liquid rocket engines utilizing a single, full scale element is reviewed. The scaling methodology exploits the supercritical phase of the full scale propellants to simplify scaling requirements. Many assumptions are utilized in the derivation of the scaling criteria. A test apparatus design is presented to implement the Russian methodology and consequently verify the assumptions. This test apparatus will allow researchers to assess the usefulness of the scaling procedures and possibly enhance the methodology. A matrix of the apparatus capabilities for an RD-170 injector is also presented. Several methods to enhance the methodology have been generated through the design process.

  6. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    PubMed Central

    Pollard, Beth; Johnston, Marie; Dixon, Diane

    2007-01-01

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest) and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological development, the

  7. Laboratory measurements of gravel thermal properties. A methodology proposal

    NASA Astrophysics Data System (ADS)

    Cultrera, Matteo; Peron, Fabio; Bison, Paolo; Dalla Santa, Giorgia; Bertermann, David; Muller, Johannes; Bernardi, Adriana; Galgaro, Antonio

    2017-04-01

    Measuring gravel thermal properties at the laboratory level is quite challenging due to several technical and logistic issues, mainly connected to the sediment sizes and the variability of their mineralogical composition. Direct measurements of gravel thermal properties are usually not able to involve a representative volume of geological material; consequently, the thermal measurements performed produce widely dispersed and inconsistent results, due to the large interstitial voids and the poor physical contact with the measuring sensors. With the aim of directly providing the measurement of gravel thermal properties, a new methodology has been developed, and some results are already available for several gravel deposit samples from around Europe. A single guarded hot plate, a Taurus Instruments TLP 800, measured the gravel thermal properties. Some instrumental adjustments were necessary to adapt the measuring devices and to finalize the thermal measurements on gravels at the IUAV FISTEC laboratory (Environmental Technical Physics Laboratory of Venice University). This device usually provides thermal measurements according to ISO 8302, ASTM C177, EN 1946-2, EN 12664, EN 12667 and EN 12939 for building materials. A preliminary calibration has been performed comparing the outcomes obtained with the single guarded hot plate with those from a needle probe of a portable thermal conductivity meter (ISOMET). Standard sand (ISO 67:2009) is used as the reference material. This study is provided under the Cheap-GSHPs project, which has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement no. 657982.

  8. Methodology for extracting local constants from petroleum cracking flows

    DOEpatents

    Chang, Shen-Lin; Lottes, Steven A.; Zhou, Chenn Q.

    2000-01-01

    A methodology provides for the extraction of local chemical kinetic model constants for use in a reacting flow computational fluid dynamics (CFD) computer code with chemical kinetic computations to optimize the operating conditions or design of the system, including retrofit design improvements to existing systems. The coupled CFD and kinetics computer code is used in combination with data obtained from a matrix of experimental tests to extract the kinetic constants. Local fluid dynamic effects are implicitly included in the extracted local kinetic constants for each particular application system to which the methodology is applied. The extracted local kinetic model constants work well over a fairly broad range of operating conditions for specific and complex reaction sets in specific and complex reactor systems. While disclosed in terms of use in a Fluid Catalytic Cracking (FCC) riser, the inventive methodology has application in virtually any reaction set to extract constants for any particular application and reaction set formulation. The methodology includes the steps of: (1) selecting the test data sets for various conditions; (2) establishing the general trend of the parametric effect on the measured product yields; (3) calculating product yields for the selected test conditions using coupled computational fluid dynamics and chemical kinetics; (4) adjusting the local kinetic constants to match calculated product yields with experimental data; and (5) validating the determined set of local kinetic constants by comparing the calculated results with experimental data from additional test runs at different operating conditions.
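    Step (4), adjusting the kinetic constants until calculated yields match the measured ones, is in essence a nonlinear least-squares problem. The toy sketch below replaces the coupled CFD-kinetics code with a simple three-lump reaction model and invented "measurements"; it shows only the shape of the extraction step, not the patented methodology itself.

```python
# Toy stand-in for the constant-extraction step: adjust lumped kinetic
# constants so a simple reactor model reproduces measured product yields.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def yields(k, t_end):
    k1, k2, k3 = k   # feed->gasoline, feed->gas, gasoline->gas (illustrative)
    def rhs(t, y):
        feed, gasoline, gas = y
        return [-(k1 + k2) * feed,
                k1 * feed - k3 * gasoline,
                k2 * feed + k3 * gasoline]
    sol = solve_ivp(rhs, (0, t_end), [1.0, 0.0, 0.0], rtol=1e-8)
    return sol.y[:, -1]

# "Measured" yields at several residence times (invented test data)
t_tests = np.array([2.0, 4.0, 6.0])
y_meas = np.array([yields([0.30, 0.10, 0.05], t) for t in t_tests])
y_meas += 0.01 * np.random.randn(*y_meas.shape)

resid = lambda k: np.concatenate(
    [yields(k, t) - y_meas[i] for i, t in enumerate(t_tests)])
fit = least_squares(resid, x0=[0.2, 0.2, 0.02], bounds=(0, 2))
print("extracted constants:", fit.x)
```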

  9. Analytical Methodology Used To Assess/Refine Observatory Thermal Vacuum Test Conditions For the Landsat 8 Data Continuity Mission

    NASA Technical Reports Server (NTRS)

    Fantano, Louis

    2015-01-01

    The Landsat 8 Data Continuity Mission, which is part of the United States Geologic Survey (USGS), launched February 11, 2013. A Landsat environmental test requirement mandated that test conditions bound worst-case flight thermal environments. This paper describes a rigorous analytical methodology applied to assess and refine proposed thermal vacuum test conditions and the issues encountered attempting to satisfy this requirement.

  10. A combined linear optimisation methodology for water resources allocation in Alfeios River Basin (Greece) under uncertain and vague system conditions

    NASA Astrophysics Data System (ADS)

    Bekri, Eleni; Yannopoulos, Panayotis; Disse, Markus

    2013-04-01

    In the present study, a combined linear programming methodology, based on Li et al. (2010) and Bekri et al. (2012), is employed for optimizing water allocation under uncertain system conditions in the Alfeios River Basin, in Greece. The Alfeios River is a water resources system of great natural, ecological, social and economic importance for Western Greece, since it is the longest watercourse with the highest flow rate in the Peloponnisos region. Moreover, the river basin was exposed in the last decades to a plethora of environmental stresses (e.g. hydrogeological alterations, intensively irrigated agriculture, surface and groundwater overexploitation and infrastructure developments), resulting in the degradation of its quantitative and qualitative characteristics. As in most Mediterranean countries, water resource management in the Alfeios River Basin has been focused up to now on an essentially supply-driven approach. It is still characterized by a lack of effective operational strategies. Authority responsibility relationships are fragmented, and law enforcement and policy implementation are weak. The present regulated water allocation puzzle entails a mixture of hydropower generation, irrigation, drinking water supply and recreational activities. Under these conditions its water resources management is characterised by high uncertainty and by vague and imprecise data. The considered methodology has been developed in order to deal with uncertainties expressed as probability distributions and/or fuzzy boundary intervals, derived from associated α-cut levels. In this framework a set of deterministic submodels is studied through linear programming. The ad hoc water resources management and alternative management patterns in an Alfeios subbasin are analyzed and evaluated under various scenarios, using the above-mentioned methodology, aiming to promote a sustainable and equitable water management. Li, Y.P., Huang, G.H., and Nie, S.L. (2010), Planning water resources
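    The interval/α-cut treatment resolves into a family of deterministic LP submodels solved at the bounds of the uncertain inputs. A minimal sketch with invented users, benefits, and demands, using a single uncertain supply interval (one α-cut), is given below; it illustrates the two-submodel idea only, not the full combined methodology.

```python
# Solve a deterministic water-allocation LP at the lower and upper bounds
# of an uncertain supply interval. All numbers are invented placeholders.
from scipy.optimize import linprog

benefit = [-2.0, -1.5, -1.0]          # negated: hydropower, irrigation, supply
demand_max = [60.0, 80.0, 40.0]       # upper allocation targets

def allocate(total_supply):
    # maximise total benefit subject to the available supply
    res = linprog(c=benefit,
                  A_ub=[[1.0, 1.0, 1.0]], b_ub=[total_supply],
                  bounds=list(zip([0.0] * 3, demand_max)))
    return res.x

lower = allocate(100.0)   # pessimistic bound of the supply interval
upper = allocate(150.0)   # optimistic bound
print("allocation interval per user:", list(zip(lower, upper)))
```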

  11. How Much Can Non-industry Standard Measurement Methodologies Benefit Methane Reduction Programs?

    NASA Astrophysics Data System (ADS)

    Risk, D. A.; O'Connell, L.; Atherton, E.

    2017-12-01

    In recent years, energy sector methane emissions have been recorded in large part by applying modern non-industry-standard techniques. Industry may lack the regulatory flexibility to use such techniques, or in some cases may not understand the possible associated economic advantage. As progressive jurisdictions move from estimation and towards routine measurement, the research community should provide guidance to help regulators and companies measure more effectively, and economically if possible. In this study, we outline a modelling experiment in which we explore the integration of non-industry-standard measurement techniques as part of a generalized compliance measurement program. The study was not intended to be exhaustive, or to recommend particular combinations, but instead to explore the inter-relationships between methodologies, development types, and compliance practices. We first defined the role, applicable scale, detection limits, working distances, and approximate deployment cost of several measurement methodologies. We then considered a variety of development types differing mainly in footprint, density, and emissions "profile". Using a Monte Carlo approach, we evaluated the effect of these various factors on the cost and confidence of the compliance measurement program. We found that when added individually, some of the research techniques were indeed able to deliver an improvement in cost and/or confidence when used alongside industry-standard Optical Gas Imaging. When applied in combination, the ideal fraction of each measurement technique depended on development type, emission profile, and whether confidence or cost was more important. Results suggest that measurement cost and confidence could be improved if energy companies exploited a wider range of measurement techniques, and in a manner tailored to each development. In the short-term, combining clear scientific guidance with economic information could benefit immediate mitigation efforts over
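    The Monte Carlo design described here can be reproduced in miniature: draw a leak population, assign a fraction of sites to each measurement technique, and score each program by cost against the fraction of emissions detected. All detection limits, costs, and distributions below are invented for illustration and do not reflect the study's parameter values.

```python
# Monte Carlo sketch: score hypothetical measurement programs by mean cost
# and mean fraction of emissions detected.
import numpy as np

rng = np.random.default_rng(1)

def simulate(frac_truck, n_sites=500, n_runs=200):
    found, cost = [], []
    for _ in range(n_runs):
        leaks = rng.lognormal(mean=0.0, sigma=1.5, size=n_sites)  # kg/h per site
        truck = rng.random(n_sites) < frac_truck     # sites surveyed by vehicle
        # hypothetical detection limits: 0.05 kg/h for OGI, 0.5 kg/h for truck
        detected = np.where(truck, leaks > 0.5, leaks > 0.05)
        found.append(leaks[detected].sum() / leaks.sum())
        cost.append(np.where(truck, 100.0, 600.0).sum())  # $ per site surveyed
    return np.mean(found), np.mean(cost)

for frac in (0.0, 0.5, 1.0):
    conf, cost = simulate(frac)
    print(f"truck fraction {frac:.1f}: emissions found {conf:.2f}, cost ${cost:,.0f}")
```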

  12. Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?

    ERIC Educational Resources Information Center

    Brondani, Mario; He, Sarah

    2013-01-01

    Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…

  13. Conditioning Methodologies for DanceSport: Lessons from Gymnastics, Figure Skating, and Concert Dance Research.

    PubMed

    Outevsky, David; Martin, Blake Cw

    2015-12-01

    Dancesport, the competitive branch of ballroom dancing, places high physiological and psychological demands on its practitioners, but pedagogical resources in these areas for this dance form are limited. Dancesport competitors could benefit from strategies used in other aesthetic sports. In this review, we identify conditioning methodologies from gymnastics, figure skating, and contemporary, modern, and ballet dance forms that could have relevance and suitability for dancesport training, and propose several strategies for inclusion in the current dancesport curriculum. We reviewed articles derived from Google Scholar, PubMed, ScienceDirect, Taylor & Francis Online, and Web of Science search engines and databases, with publication dates from 1979 to 2013. The keywords included MeSH terms: dancing, gymnastics, physiology, energy metabolism, physical endurance, and range of motion. Out of 47 papers examined, 41 papers met the inclusion criteria (validity of scientific methods, topic relevance, transferability to dancesport, publication date). Quality and validity of the data were assessed by examining the methodologies in each study and comparing studies on similar populations as well as across time using the PRISMA 2009 checklist and flowchart. The relevant research suggests that macro-cycle periodization planning, aerobic and anaerobic conditioning, range of motion and muscular endurance training, and performance psychology methods have potential for adaptation for dancesport training. Dancesport coaches may help their students fulfill their ambitions as competitive athletes and dance artists by adapting the relevant performance enhancement strategies from gymnastics, figure skating, and concert dance forms presented in this paper.

  14. Financial Measures Project: Measuring Financial Conditions of Colleges and Universities, 1978 Working Conference.

    ERIC Educational Resources Information Center

    Coldren, Sharon L., Ed.

    Papers are presented from a 1978 working conference on measuring financial conditions of colleges and universities. Contents include the following: "The Federal Government's Interest in the Development of Financial Measures" by M. Chandler; "Improving the Conceptual Framework for Measuring Financial Condition Using Institutional…

  15. An innovative methodology for measurement of stress distribution of inflatable membrane structures

    NASA Astrophysics Data System (ADS)

    Zhao, Bing; Chen, Wujun; Hu, Jianhui; Chen, Jianwen; Qiu, Zhenyu; Zhou, Jinyu; Gao, Chengjun

    2016-02-01

    The inflatable membrane structure has been widely used in the fields of civil building, industrial building, airships, super pressure balloons and spacecraft. It is important to measure the stress distribution of the inflatable membrane structure because it influences the safety of the structural design. This paper presents an innovative methodology for the measurement and determination of the stress distribution of the inflatable membrane structure under different internal pressures, combining photogrammetry and the force-finding method. The shape of the inflatable membrane structure is maintained by pressurized air, and the internal pressure is controlled and measured by an automatic pressure control system. The 3D coordinates of the marking points pasted on the membrane surface are acquired from three photographs captured by three cameras, based on photogrammetry. After digitizing the markings on the photographs, the 3D curved surfaces are rebuilt. The continuous membrane surfaces are discretized into a quadrilateral mesh and simulated by membrane links to calculate the stress distributions using the force-finding method. The internal pressure is simplified to external node forces in the normal direction according to the contributory area of each node. Once the geometry x, the external force r and the topology C are obtained, the unknown force densities q in each link can be determined. Therefore, the stress distributions of the inflatable membrane structure can be calculated by combining the linear adjustment theory and the force density method, based on the force equilibrium of inflated internal pressure and membrane internal force, without considering the mechanical properties of the constitutive material. As the use of the inflatable membrane structure is attractive in the field of civil building, an ethylene-tetrafluoroethylene (ETFE) cushion is used with the measurement model to validate the proposed methodology. The comparisons between the…
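
    The force-finding step described above amounts to a linear inverse problem: at each node, the link forces q·(link vector) must balance the lumped pressure load, so the unknown force densities follow from a least-squares solve. A minimal sketch under a pin-jointed link idealization (all names hypothetical, not from the paper):

```python
import numpy as np

def solve_force_densities(coords, links, loads):
    """Estimate force densities q (one per membrane link) by least squares.

    coords: (n_nodes, 3) marker coordinates rebuilt from photogrammetry
    links:  list of (i, j) node-index pairs forming the mesh links
    loads:  (n_nodes, 3) external node forces (internal pressure lumped onto
            each node over its contributory area, acting along the normal)
    """
    n_nodes, n_links = coords.shape[0], len(links)
    A = np.zeros((3 * n_nodes, n_links))
    for m, (i, j) in enumerate(links):
        d = coords[i] - coords[j]        # link vector
        A[3*i:3*i + 3, m] = d            # equilibrium rows of node i
        A[3*j:3*j + 3, m] = -d           # equal and opposite at node j
    q, *_ = np.linalg.lstsq(A, loads.reshape(-1), rcond=None)
    return q   # link force = q * link length; stress follows from the section
```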

  16. Methodological considerations for measuring glucocorticoid metabolites in feathers

    PubMed Central

    Berk, Sara A.; McGettrick, Julie R.; Hansen, Warren K.; Breuner, Creagh W.

    2016-01-01

    In recent years, researchers have begun to use corticosteroid metabolites in feathers (fCORT) as a metric of stress physiology in birds. However, there remain substantial questions about how to measure fCORT most accurately. Notably, small samples contain artificially high amounts of fCORT per millimetre of feather (the small sample artefact). Furthermore, it appears that fCORT is correlated with circulating plasma corticosterone only when levels are artificially elevated by the use of corticosterone implants. Here, we used several approaches to address current methodological issues with the measurement of fCORT. First, we verified that the small sample artefact exists across species and feather types. Second, we attempted to correct for this effect by increasing the amount of methanol relative to the amount of feather during extraction. We consistently detected more fCORT per millimetre or per milligram of feather in small samples than in large samples even when we adjusted methanol:feather concentrations. We also used high-performance liquid chromatography to identify hormone metabolites present in feathers and measured the reactivity of these metabolites against the most commonly used antibody for measuring fCORT. We verified that our antibody is mainly identifying corticosterone (CORT) in feathers, but other metabolites have significant cross-reactivity. Lastly, we measured faecal glucocorticoid metabolites in house sparrows and correlated these measurements with corticosteroid metabolites deposited in concurrently grown feathers; we found no correlation between faecal glucocorticoid metabolites and fCORT. We suggest that researchers should be cautious in their interpretation of fCORT in wild birds and should seek alternative validation methods to examine species-specific relationships between environmental challenges and fCORT. PMID:27335650

  17. Evaluation of glucose controllers in virtual environment: methodology and sample application.

    PubMed

    Chassin, Ludovic J; Wilinska, Malgorzata E; Hovorka, Roman

    2004-11-01

    Adaptive systems to deliver medical treatment in humans are safety-critical systems and require particular care in both the testing and the evaluation phases, which are time-consuming, costly, and confounded by ethical issues. The objective of the present work is to develop a methodology to test glucose controllers of an artificial pancreas in a simulated (virtual) environment. A virtual environment comprising a model of the carbohydrate metabolism and models of the insulin pump and the glucose sensor is employed to simulate individual glucose excursions in subjects with type 1 diabetes. The performance of the control algorithm within the virtual environment is evaluated by considering treatment and operational scenarios. The developed methodology includes two dimensions: testing in relation to specific lifestyle conditions, i.e. fasting, post-prandial, and lifestyle (metabolic) disturbances; and testing in relation to various operating conditions, i.e. expected operating conditions, adverse operating conditions, and system failure. We define safety and efficacy criteria and describe the measures to be taken prior to clinical testing. The use of the methodology is exemplified by tuning and evaluating a model predictive glucose controller being developed for a wearable artificial pancreas, focused on fasting conditions. Our methodology to test glucose controllers in a virtual environment is instrumental in anticipating the results of real clinical tests for different physiological conditions and for different operating conditions. The thorough testing in the virtual environment reduces costs and speeds up the development process.

  18. A novel methodology for interpreting air quality measurements from urban streets using CFD modelling

    NASA Astrophysics Data System (ADS)

    Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming

    2011-09-01

    In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocity and direction. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios, by applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients being proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary, the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid understanding of long-term air quality measurements, and help assess the representativeness of monitoring locations for population…
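
    Step (iii) above is a frequency-weighted average of the per-scenario CFD fields. A minimal sketch of that combination step, assuming one normalised concentration field per wind-speed/direction class (names hypothetical):

```python
import numpy as np

def long_term_average(conc_fields, frequencies):
    """Combine CFD-calculated concentration fields into a long-term average.

    conc_fields: dict scenario -> 2D array of normalised concentrations at
                 receptor locations (one CFD run per wind class)
    frequencies: dict scenario -> hours that class was observed in the period
    Returns the frequency-weighted mean field: sum_i w_i * C_i.
    """
    scenarios = list(conc_fields)
    w = np.array([frequencies[s] for s in scenarios], float)
    w /= w.sum()                              # weights = relative frequency
    stack = np.stack([conc_fields[s] for s in scenarios])
    return np.tensordot(w, stack, axes=1)
```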

  19. Design and validation of a methodology using the International Classification of Diseases, 9th Revision, to identify secondary conditions in people with disabilities.

    PubMed

    Chan, Leighton; Shumway-Cook, Anne; Yorkston, Kathryn M; Ciol, Marcia A; Dudgeon, Brian J; Hoffman, Jeanne M

    2005-05-01

    To design and validate a methodology that identifies secondary conditions using International Classification of Diseases, 9th Revision (ICD-9) codes. Secondary conditions were identified through a literature search and a survey of Washington State physiatrists. These conditions were translated into ICD-9 codes, and this list was then validated against a national sample of Medicare survey respondents with differing levels of mobility and activities of daily living (ADL) disability. National survey. Participants (N=9731) in the 1999 Medicare Current Beneficiary Survey with no, mild, moderate, and severe mobility and ADL disability. Not applicable. Percentage of survey respondents with a secondary condition. The secondary conditions were grouped into 4 categories: medical, psychosocial, musculoskeletal, and dysphagia related (problems associated with difficulty in swallowing). Our literature search and survey of 26 physiatrists identified 64 secondary conditions, including depression, decubitus ulcers, and deconditioning. Overall, 70.4% of all survey respondents were treated for a secondary condition. We found a significant relation between increasing mobility as well as ADL disability and increasing numbers of secondary conditions (χ² test for trend, P < .001). This relation existed for all categories of secondary conditions: medical (χ² test for trend, P < .001), psychosocial (χ² test for trend, P < .001), musculoskeletal (χ² test for trend, P < .001), and dysphagia related (χ² test for trend, P < .001). We created a valid ICD-9-based methodology that identified secondary conditions in Medicare survey respondents and discriminated between people with different degrees of disability. This methodology will be useful for health services researchers who study the frequency and impact of secondary conditions.

  20. Review of PCR methodology.

    DOT National Transportation Integrated Search

    1998-03-01

    This study was conducted to review the Pavement Condition Rating (PCR) : methodology currently used by the Ohio DOT. The results of the literature search in this : connection indicated that many Highway agencies use a similar methodology to rate thei...

  1. Improvement of the Assignment Methodology of the Approach Embankment Design to Highway Structures in Difficult Conditions

    NASA Astrophysics Data System (ADS)

    Chistyy, Y.; Kuzakhmetova, E.; Fazilova, Z.; Tsukanova, O.

    2018-03-01

    Design issues concerning the junction of bridges and overpasses with approach embankments are studied. The reasons for the formation of deformations in the road structure are indicated. Measures to ensure the stability and to accelerate the settlement of a weak subgrade beneath the approach embankment are listed. The necessity of taking into account the man-made impact of the approach embankment on the subgrade behavior is demonstrated. Modern stabilizing agents to improve the properties of the soils used in the embankment and the subgrade are suggested. A clarified methodology for determining the active zone of compression in the subgrade under the load from the weight of the embankment is described. As an additional condition to the existing methodology for establishing the lower bound of the active zone of compression, it is proposed to account for the accuracy of the soil compressibility evaluation when determining settlement.

  2. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for the exploration of various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
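
    To make the idea concrete, the sketch below generates the address stream of a naive matrix-matrix multiplication directly from base virtual addresses and loop bounds (the minimal runtime information mentioned above) and replays it through a toy direct-mapped cache. The cache geometry and addresses are invented for illustration and do not reproduce the paper's simulator:

```python
# Trace-free cache sketch: derive the address stream of C = A*B from base
# addresses and loop bounds alone, then replay it through a direct-mapped
# cache model. All parameters are hypothetical.

LINE = 64        # cache line size in bytes
N_SETS = 512     # 32 KiB direct-mapped cache of 64-byte lines
WORD = 8         # double-precision element size

def addresses(n, base_a, base_b, base_c):
    """Address stream of the statement C[i][j] += A[i][k] * B[k][j]."""
    for i in range(n):
        for j in range(n):
            for k in range(n):
                yield base_a + (i * n + k) * WORD   # load A[i][k]
                yield base_b + (k * n + j) * WORD   # load B[k][j]
                yield base_c + (i * n + j) * WORD   # load/store C[i][j]

def miss_ratio(stream):
    tags = [None] * N_SETS
    hits = misses = 0
    for addr in stream:
        block = addr // LINE
        s, tag = block % N_SETS, block // N_SETS
        if tags[s] == tag:
            hits += 1
        else:
            tags[s] = tag
            misses += 1
    return misses / (hits + misses)

print(miss_ratio(addresses(64, 0x10000, 0x30000, 0x50000)))
```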

  3. Development and validation of a continuous measure of patient condition using the Electronic Medical Record.

    PubMed

    Rothman, Michael J; Rothman, Steven I; Beals, Joseph

    2013-10-01

    Patient condition is a key element in communication between clinicians. However, there is no generally accepted definition of patient condition that is independent of diagnosis and that spans acuity levels. We report the development and validation of a continuous measure of general patient condition that is independent of diagnosis, and that can be used for medical-surgical as well as critical care patients. A survey of Electronic Medical Record data identified common, frequently collected non-static candidate variables as the basis for a general, continuously updated patient condition score. We used a new methodology to estimate in-hospital risk associated with each of these variables. A risk function for each candidate input was computed by comparing the final pre-discharge measurements with 1-year post-discharge mortality. Step-wise logistic regression of the variables against 1-year mortality was used to determine the importance of each variable. The final set of selected variables consisted of 26 clinical measurements from four categories: nursing assessments, vital signs, laboratory results and cardiac rhythms. We then constructed a heuristic model quantifying patient condition (overall risk) by summing the single-variable risks. The model's validity was assessed against outcomes from 170,000 medical-surgical and critical care patients, using data from three US hospitals. Outcome validation across hospitals yields an area under the receiver operating characteristic curve (AUC) of ≥0.92 when separating hospice/deceased from all other discharge categories, an AUC of ≥0.93 when predicting 24-h mortality and an AUC of 0.62 when predicting 30-day readmissions. Correspondence with outcomes reflective of patient condition across the acuity spectrum indicates utility in both medical-surgical units and critical care units. The model output, which we call the Rothman Index, may provide clinicians with a longitudinal view of patient condition to help address known…
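
    The scoring heuristic described above is additive: each input variable maps to an excess-risk value, and the patient score sums the current risks. The sketch below illustrates only that structure; the risk tables are invented placeholders, not the published Rothman Index coefficients:

```python
# Invented per-variable risk functions: zero inside a "normal" band,
# a fixed excess-risk value outside it. Placeholders only.
RISK = {
    "heart_rate":  lambda x: 0.0 if 60 <= x <= 100 else 4.0,
    "sodium":      lambda x: 0.0 if 135 <= x <= 145 else 3.0,
    "food_assess": lambda met: 0.0 if met else 5.0,  # nursing assessment met?
}

def patient_condition(latest):
    """Sum single-variable risks over the most recent value of each input."""
    return sum(RISK[name](value) for name, value in latest.items())

print(patient_condition({"heart_rate": 112, "sodium": 138, "food_assess": True}))
```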

  4. The Influence of Measurement Methodology on the Accuracy of Electrical Waveform Distortion Analysis

    NASA Astrophysics Data System (ADS)

    Bartman, Jacek; Kwiatkowski, Bogdan

    2018-04-01

    The present paper reviews the documents that specify measurement methods for voltage waveform distortion. It also presents the measurement stages for waveform components that are uncommon in the classic fundamentals of electrical engineering and signal theory, including how groups and subgroups of harmonics and interharmonics are formed. Moreover, the paper discusses selected distortion factors of periodic waveforms and presents analyses comparing the values of these distortion indices. The measurements were carried out in cycle-by-cycle mode, and the measurement methodology used complies with the IEC 61000-4-7 standard. The studies showed significant discrepancies between the values of the analyzed parameters.
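
    For context, the IEC 61000-4-7 grouping idea referred to above can be sketched as follows: a DFT over a 10-cycle window (at 50 Hz) gives 5 Hz bins, and the harmonic subgroup for order n aggregates, in an rms sense, the harmonic bin and its two adjacent bins. A simplified illustration, with windowing and scaling reduced to essentials:

```python
import numpy as np

def harmonic_subgroups(signal, fs, f1=50.0, n_max=40):
    """Harmonic subgroups in the spirit of IEC 61000-4-7: DFT over a 10-cycle
    (200 ms at 50 Hz) window gives 5 Hz bins; subgroup n aggregates the
    harmonic bin and its two adjacent 5 Hz bins (root-sum-square)."""
    n_win = int(round(10 * fs / f1))          # samples in a 10-cycle window
    spec = np.fft.rfft(signal[:n_win]) / n_win
    amp = 2.0 * np.abs(spec)                  # single-sided amplitudes
    groups = {}
    for n in range(1, n_max + 1):
        k = 10 * n                            # bin of the n-th harmonic
        groups[n] = np.sqrt(np.sum(amp[k - 1:k + 2] ** 2))
    return groups

# Example: 50 Hz fundamental with a 6% fifth harmonic, sampled at 10 kHz.
fs = 10_000
t = np.arange(2 * fs) / fs
u = 230 * np.sqrt(2) * (np.sin(2*np.pi*50*t) + 0.06 * np.sin(2*np.pi*250*t))
sg = harmonic_subgroups(u, fs)
print(round(sg[5] / sg[1] * 100, 1), "% fifth-harmonic subgroup")
```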

  5. Decision-problem state analysis methodology

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1980-01-01

    A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but such resolutions appear to be one of the major accident factors cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of decomposing an accident into a scenario which may be used for review or for the development of a training simulation.

  6. Impact of volunteer-related and methodology-related factors on the reproducibility of brachial artery flow-mediated vasodilation: analysis of 672 individual repeated measurements.

    PubMed

    van Mil, Anke C C M; Greyling, Arno; Zock, Peter L; Geleijnse, Johanna M; Hopman, Maria T; Mensink, Ronald P; Reesink, Koen D; Green, Daniel J; Ghiadoni, Lorenzo; Thijssen, Dick H

    2016-09-01

    Brachial artery flow-mediated dilation (FMD) is a popular technique to examine endothelial function in humans. Identifying volunteer and methodological factors related to variation in FMD is important to improve measurement accuracy and applicability. Volunteer-related and methodology-related parameters were collected in 672 volunteers from eight affiliated centres worldwide who underwent repeated measures of FMD. All centres adopted contemporary expert-consensus guidelines for FMD assessment. After calculating the coefficient of variation (%) of the FMD for each individual, we constructed quartiles (n = 168 per quartile). Based on two regression models (volunteer-related factors and methodology-related factors), statistically significant components of these two models were added to a final regression model (calculated as β-coefficient and R). This allowed us to identify factors that independently contributed to the variation in FMD%. The median coefficient of variation was 17.5%, with healthy volunteers demonstrating a coefficient of variation of 9.3%. Regression models revealed age (β = 0.248, P < 0.001), hypertension (β = 0.104, P < 0.001), dyslipidemia (β = 0.331, P < 0.001), time between measurements (β = 0.318, P < 0.001), lab experience (β = -0.133, P < 0.001) and baseline FMD% (β = 0.082, P < 0.05) as contributors to the coefficient of variation. After including all significant factors in the final model, we found that time between measurements, hypertension, baseline FMD% and lab experience with FMD independently predicted brachial artery variability (total R = 0.202). Although FMD% showed good reproducibility, larger variation was observed in conditions with longer time between measurements, hypertension, less experience and lower baseline FMD%. Accounting for these factors may improve FMD% variability.
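
    The per-volunteer reproducibility statistic used above is a within-subject coefficient of variation computed from repeated FMD% measurements. A minimal sketch of that calculation for paired measurements (data invented):

```python
import numpy as np

def fmd_cv(fmd1, fmd2):
    """Within-subject coefficient of variation (%) for two repeated FMD%
    measurements, one pair per volunteer."""
    pairs = np.column_stack([fmd1, fmd2])
    sd = pairs.std(axis=1, ddof=1)
    mean = pairs.mean(axis=1)
    return 100.0 * sd / mean

cv = fmd_cv([5.1, 6.8, 4.0], [5.9, 6.1, 5.2])
print(np.round(cv, 1))   # per-volunteer CV%, ready for quartile grouping
```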

  7. A methodology for using borehole temperature-depth profiles under ambient, single and cross-borehole pumping conditions to estimate fracture hydraulic properties

    NASA Astrophysics Data System (ADS)

    Klepikova, M.; Le Borgne, T.; Bour, O.; Lavenant, N.

    2011-12-01

    …measurements is that temperature can be measured easily and very accurately, continuously in space and time. To test the methodology, we performed a field experiment at a crystalline-rock field site located in Ploemeur, Brittany (France). The site comprises three 100 m deep boreholes, located 6-10 m from each other. The experiment consisted of measuring the borehole temperature profiles under all possible pumping configurations; the pumping and monitoring wells were successively changed. The thermal response in the observation well induced by changes in pumping conditions is related to changes in vertical flow velocities and thus to the inter-borehole fracture connectivity. Based on this dataset, we propose a methodology to include temperature profiles in the inverse problem for characterizing the spatial distribution of fracture zone hydraulic properties.

  8. Methodology for the Calibration of the Data Acquisition with a Six-Degree-of-Freedom Acceleration Measurement Device

    DOT National Transportation Integrated Search

    1989-06-01

    This report describes a methodology for calibrating and gathering data with a six-degree-of-freedom acceleration measurement device that is intended to measure head acceleration of anthropomorphic dummies and human volunteers in automotive crash test...

  9. A new transmission methodology for quality assurance in radiotherapy based on radiochromic film measurements

    PubMed Central

    do Amaral, Leonardo L.; Pavoni, Juliana F.; Sampaio, Francisco; Netto, Thomaz Ghilardi

    2015-01-01

    Despite individual quality assurance (QA) being recommended for complex techniques in radiotherapy (RT) treatment, the possibility of errors in dose delivery during therapeutic application has been verified. Therefore, it is fundamentally important to conduct in vivo QA during treatment. This work presents an in vivo transmission quality control methodology, using radiochromic film (RCF) coupled to the linear accelerator (linac) accessory holder. This QA methodology compares the dose distribution measured by the film in the linac accessory holder with the dose distribution expected by the treatment planning software. The calculated dose distribution is obtained in the coronal and central plane of a phantom with the same dimensions as the acrylic support used for positioning the film, but at a source-to-detector distance (SDD) of 100 cm, as a result of transferring the IMRT plan in question with all the fields positioned with the gantry vertical, that is, perpendicular to the phantom. To validate this procedure, first a Monte Carlo simulation using the PENELOPE code was performed to evaluate the differences between the dose distributions measured by the film at SDDs of 56.8 cm and 100 cm. After that, several simple dose distribution tests were evaluated using the proposed methodology, and finally a study using IMRT treatments was done. In the Monte Carlo simulation, the mean percentage of points approved in the gamma analysis comparing the dose distributions acquired at the two SDDs was 99.92% ± 0.14%. In the simple dose distribution tests, the mean percentage of points approved in the gamma analysis was 99.85% ± 0.26%, and the mean percentage difference in the normalization point doses was -1.41%. The transmission methodology was approved in 24 of 25 IMRT test irradiations. Based on these results, it can be concluded that the proposed methodology using RCFs can be applied for in vivo QA in RT treatments. PACS numbers: 87.55.Qr, 87.55.km, 87.55.N-
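
    The abstract does not state the gamma criteria used, but a generic global gamma analysis of the kind referred to above can be sketched as follows (3%/3 mm assumed purely for illustration; a brute-force search, not an optimized implementation):

```python
import numpy as np

def gamma_pass_rate(dose_ref, dose_eval, pixel_mm, dd=0.03, dta_mm=3.0):
    """Global gamma analysis sketch for 2D dose maps: a reference point passes
    if some nearby evaluated point satisfies the combined dose-difference /
    distance-to-agreement criterion (gamma <= 1)."""
    norm = dose_ref.max()                        # global normalisation dose
    r = int(np.ceil(dta_mm / pixel_mm))          # search radius in pixels
    ny, nx = dose_ref.shape
    passed = total = 0
    for i in range(ny):
        for j in range(nx):
            best = np.inf
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if not (0 <= ii < ny and 0 <= jj < nx):
                        continue
                    dist2 = (di**2 + dj**2) * pixel_mm**2
                    diff2 = ((dose_eval[ii, jj] - dose_ref[i, j])
                             / (dd * norm))**2
                    best = min(best, dist2 / dta_mm**2 + diff2)
            total += 1
            passed += best <= 1.0
    return 100.0 * passed / total
```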

  10. An ultrasonic methodology for muscle cross section measurement to support space flight

    NASA Astrophysics Data System (ADS)

    Hatfield, Thomas R.; Klaus, David M.; Simske, Steven J.

    2004-09-01

    The number one priority for any manned space mission is the health and safety of its crew. The study of the short and long term physiological effects on humans is paramount to ensuring crew health and mission success. One of the challenges associated with studying the physiological effects of space flight on humans, such as loss of bone and muscle mass, has been that of readily attaining the data needed to characterize the changes. The small sampling size of astronauts, together with the fact that most physiological data collection tends to be rather tedious, continues to hinder elucidation of the underlying mechanisms responsible for the observed changes that occur in space. Better characterization of the muscle loss experienced by astronauts requires that new technologies be implemented. To this end, we have begun to validate a 360° ultrasonic scanning methodology for muscle measurements and have performed empirical sampling of a limb surrogate for comparison. Ultrasonic wave propagation was simulated using 144 stations of rotated arm and calf MRI images. These simulations were intended to provide a preliminary check of the scanning methodology and data analysis before its implementation with hardware. Pulse-echo waveforms were processed for each rotation station to characterize fat, muscle, bone, and limb boundary interfaces. The percentage errors between MRI reference values and calculated muscle areas, as determined from reflection points for calf and arm cross sections, were -2.179% and +2.129%, respectively. These successful simulations suggest that ultrasound pulse scanning can be used to effectively determine limb cross-sectional areas. Cross-sectional images of a limb surrogate were then used to simulate signal measurements at several rotation angles, with ultrasonic pulse-echo sampling performed experimentally at the same stations on the actual limb surrogate to corroborate the results. The objective of the surrogate sampling was to compare the signal…

  11. Using Reported Rates of Sexually Transmitted Diseases to Illustrate Potential Methodological Issues in the Measurement of Racial and Ethnic Disparities.

    PubMed

    Chesson, Harrell W; Patel, Chirag G; Gift, Thomas L; Bernstein, Kyle T; Aral, Sevgi O

    2017-09-01

    Racial disparities in the burden of sexually transmitted diseases (STDs) have been documented and described for decades. Similarly, methodological issues and limitations in the use of disparity measures to quantify disparities in health have also been well documented. The purpose of this study was to use historic STD surveillance data to illustrate four of the most well-known methodological issues associated with the use of disparity measures. We manually searched STD surveillance reports to find examples of racial/ethnic distributions of reported STDs that illustrate key methodological issues in the use of disparity measures. The disparity measures we calculated included the black-white rate ratio, the Index of Disparity (weighted and unweighted by subgroup population), and the Gini coefficient. The 4 examples we developed included illustrations of potential differences in relative and absolute disparity measures, potential differences in weighted and nonweighted disparity measures, the importance of the reference point when calculating disparities, and differences in disparity measures in the assessment of trends in disparities over time. For example, the gonorrhea rate increased for all minority groups (relative to whites) from 1992 to 1993, yet the Index of Disparity suggested that racial/ethnic disparities had decreased. Although imperfect, disparity measures can be useful to quantify racial/ethnic disparities in STDs, to assess trends in these disparities, and to inform interventions to reduce these disparities. Our study uses reported STD rates to illustrate potential methodological issues with these disparity measures and highlights key considerations when selecting disparity measures for quantifying disparities in STDs.
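
    For reference, the three disparity measures named above can be computed along the following lines; the exact weighting and reference-group conventions vary across the literature, so treat this as one plausible formulation rather than the authors' own:

```python
import numpy as np

def rate_ratio(rates, group, ref):
    """Simple relative disparity: rate in one group over the reference rate."""
    return rates[group] / rates[ref]

def index_of_disparity(rates, ref, weights=None):
    """Mean (optionally population-weighted) absolute deviation of group rates
    from a reference rate, expressed as a percentage of that reference."""
    groups = [g for g in rates if g != ref]
    dev = np.array([abs(rates[g] - rates[ref]) for g in groups], float)
    if weights is None:
        return 100.0 * dev.mean() / rates[ref]
    w = np.array([weights[g] for g in groups], float)
    return 100.0 * (dev * w).sum() / w.sum() / rates[ref]

def gini(rates, pops):
    """Gini coefficient of the case distribution across population subgroups."""
    r = np.array([rates[g] for g in rates], float)
    p = np.array([pops[g] for g in rates], float)
    order = np.argsort(r)                        # sort groups by rate
    pop_share = p[order] / p.sum()
    case_share = (r * p)[order] / (r * p).sum()
    cum_pop = np.concatenate([[0.0], np.cumsum(pop_share)])
    cum_case = np.concatenate([[0.0], np.cumsum(case_share)])
    return 1.0 - 2.0 * np.trapz(cum_case, cum_pop)  # area under Lorenz curve
```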

  12. Comprehensive Safety Analysis 2010 Safety Measurement System (SMS) Methodology, Version 2.1 Revised December 2010

    DOT National Transportation Integrated Search

    2010-12-01

    This report documents the Safety Measurement System (SMS) methodology developed to support the Comprehensive Safety Analysis 2010 (CSA 2010) Initiative for the Federal Motor Carrier Safety Administration (FMCSA). The SMS is one of the major tools for...

  13. Measuring user experience in digital gaming: theoretical and methodological issues

    NASA Astrophysics Data System (ADS)

    Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte

    2007-01-01

    There are innumerable concepts, terms and definitions for user experience. Few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to creating both theoretically and methodologically sound methods for quantification of the rich user experience in different digital environments. Our approach is based on the idea that the experience received from content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We present our two basic measurement frameworks, which have been developed and tested on a large data set (n=2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, the contents for specific viewing devices or viewing situations.

  14. Traction and film thickness measurements under starved elastohydrodynamic conditions

    NASA Technical Reports Server (NTRS)

    Wedeven, L. D.

    1974-01-01

    Traction measurements under starved elastohydrodynamic conditions were obtained for a point contact geometry. Simultaneous measurements of the film thickness and the locations of the inlet lubricant boundary were made optically. The thickness of a starved film for combined rolling and sliding conditions varies with the location of the inlet boundary in the same way found previously for pure rolling. A starved film was observed to possess greater traction than a flooded film for the same slide-roll ratio. For a given slide-roll ratio, a starved film simply increases the shear rate in the Hertz region. The maximum shear rate depends on the degree of starvation and has no theoretical limit. Traction measurements under starved conditions were compared with flooded conditions under equivalent shear rates in the Hertz region. When the shear rates in the Hertz region were low and the film severely starved, the measured tractions were found to be much lower than expected.

  15. The choice of boundary conditions and mesh for scaffolding FEM model on the basis of natural vibrations measurements

    NASA Astrophysics Data System (ADS)

    Cyniak, Patrycja; Błazik-Borowa, Ewa; Szer, Jacek; Lipecki, Tomasz; Szer, Iwona

    2018-01-01

    Scaffolding is a specific type of structure with high susceptibility to low-frequency vibrations. The numerical model of scaffolding presented in this paper contains real imperfections obtained from geodetic measurements of the actual structure. Boundary conditions were verified on the basis of measured free vibrations. For the verified model, a man walking on the penultimate working level was simulated as a time-varying dynamic load. The paper presents a procedure for choosing selected parameters of the scaffolding FEM model. The main aim of the analysis is the best possible representation of the real structure and correct modeling of a worker walking on the scaffolding. Different boundary conditions are considered because of their impact on structural vibrations. Natural vibrations obtained from FEM calculations are compared with free vibrations measured during in-situ tests. Structural accelerations caused by a walking human are then considered. The methodology of creating numerical models of scaffolding and the analysis of dynamic effects during human walking are starting points for further considerations of dynamic loads acting on such structures and the effects of these loads on the structure and on workers whose workplaces are situated on the scaffolding.

  16. A methodology to condition distorted acoustic emission signals to identify fracture timing from human cadaver spine impact tests.

    PubMed

    Arun, Mike W J; Yoganandan, Narayan; Stemper, Brian D; Pintar, Frank A

    2014-12-01

    While studies have used acoustic sensors to determine fracture initiation time in biomechanics, a systematic procedure for processing the acoustic signals has not been established. The objective of the study was to develop a methodology to condition distorted acoustic emission data using signal processing techniques to identify fracture initiation time. The methodology was developed from testing a human cadaver lumbar spine column. Acoustic sensors were glued to all vertebrae, high-rate impact loading was applied, load-time histories were recorded (load cell), and fracture was documented using CT. A compression fracture occurred at L1 while the other vertebrae remained intact. FFTs of the raw voltage-time traces were used to determine an optimum frequency range associated with high decibel levels. Signals were bandpass filtered in this range. A bursting pattern was found in the fractured vertebra, while signals from the other vertebrae were silent. The bursting time was associated with the time of fracture initiation. The force at fracture was determined using this time and the force-time data. The methodology is independent of parameters selected a priori, such as fixed voltage level(s) or bandpass frequencies, and of relying on the force-time signal alone; it allows determination of the force based on the time identified during signal processing. The methodology can be used for different body regions in cadaver experiments. Copyright © 2014 Elsevier Ltd. All rights reserved.
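
    The signal-processing chain described above (FFT-guided band selection, bandpass filtering, burst detection) can be sketched as follows, with the band and threshold treated as inputs rather than a priori constants; the function name and threshold rule are assumptions, not the paper's:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def fracture_time(trace, fs, band, thresh_ratio=5.0):
    """Bandpass the raw voltage-time trace in the high-energy band found from
    its FFT, take the analytic-signal envelope, and report the first time the
    envelope bursts above a multiple of the early quiet-interval level."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    envelope = np.abs(hilbert(sosfiltfilt(sos, trace)))
    # Baseline from the first 1% of samples, assumed to precede the impact.
    baseline = np.median(envelope[: max(1, len(envelope) // 100)])
    burst = np.nonzero(envelope > thresh_ratio * baseline)[0]
    # Seconds; None means no burst, i.e. no fracture signature at this sensor.
    return burst[0] / fs if burst.size else None
```

    The returned time can then be read against the load cell's force-time history to obtain the force at fracture.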

  17. Riser Difference Uncertainty Methodology Based on Tank AY-101 Wall Thickness Measurements with Application to Tank AN-107

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weier, Dennis R.; Anderson, Kevin K.; Berman, Herbert S.

    2005-03-10

    The DST Integrity Plan (RPP-7574, 2003, Double-Shell Tank Integrity Program Plan, Rev. 1A, CH2M HILL Hanford Group, Inc., Richland, Washington) requires the ultrasonic wall thickness measurement of two vertical scans of the tank primary wall while using a single riser location. The resulting measurements are then used in extreme value methodology to predict the minimum wall thickness expected for the entire tank. The representativeness of using a single riser in this manner to draw conclusions about the entire circumference of a tank has been questioned. The only data available with which to address the representativeness question comes from Tank AY-101, since only for that tank have multiple risers been used for such inspection. The purpose of this report is to (1) further characterize AY-101 riser differences (relative to prior work); (2) propose a methodology for incorporating a "riser difference" uncertainty for subsequent tanks for which only a single riser is used; and (3) specifically apply the methodology to measurements made from a single riser in Tank AN-107.

  18. Methodological Considerations for Hair Cortisol Measurements in Children

    PubMed Central

    Slominski, Radomir; Rovnaghi, Cynthia R.; Anand, Kanwaljeet J. S.

    2015-01-01

    Background: Hair cortisol levels are used increasingly as a measure of chronic stress in young children. We propose modifications to the current methods used for hair cortisol analysis to more accurately determine reference ranges for hair cortisol across different populations and age groups. Methods: The authors compared standard (finely cutting hair) vs. milled methods for hair processing (n=16), developed a 4-step extraction process for hair protein and cortisol (n=16), and compared liquid chromatography-mass spectrometry (LCMS) vs. ELISA assays for measuring hair cortisol (n=28). The extraction process included sequential incubations in methanol and acetone, repeated twice. Hair protein was measured via spectrophotometric ratios at 260/280 nm to indicate the hair dissolution state, using a BioTek® plate reader and dedicated software. Hair cortisol was measured using an ELISA assay kit. Individual (n=13) and pooled hair samples (n=12) with high, intermediate, and low cortisol values and the ELISA assay internal standards (n=3) were also evaluated by LCMS. Results: Milled and standard methods showed highly correlated hair cortisol (rs=0.951, p<0.0001) and protein values (rs=0.902, p=0.0002), although higher yields of cortisol and protein were obtained from the standard method in 13/16 and 14/16 samples, respectively (p<0.05). Four sequential extractions yielded additional amounts of protein (36.5%, 27.5%, 30.5%, 3.1%) and cortisol (45.4%, 31.1%, 15.1%, 0.04%) from hair samples. Cortisol values measured by LCMS and ELISA were correlated (rs=0.737; p<0.0001), although cortisol levels (median [IQR]) detected in the same samples by LCMS (38.7 [14.4, 136] ng/ml) were lower than by ELISA (172.2 [67.9, 1051] ng/ml). LCMS also detected cortisone, which comprised 13.4% (3.7%, 25.9%) of the steroids detected. Conclusion: Methodological studies suggest that finely cutting hair with sequential incubations in methanol and acetone, repeated twice, extracts greater yields of cortisol…

  19. A methodology for successfully producing global translations of patient reported outcome measures for use in multiple countries.

    PubMed

    Two, Rebecca; Verjee-Lorenz, Aneesa; Clayson, Darren; Dalal, Mehul; Grotzinger, Kelly; Younossi, Zobair M

    2010-01-01

    The production of accurate and culturally relevant translations of patient reported outcome (PRO) measures is essential for the success of international clinical trials. Although there are many reports in publication regarding the translation of PRO measures, the techniques used to produce single translations for use in multiple countries (global translations) are not well documented. This article addresses this apparent lack of documentation and presents the methodology used to create global translations of the Chronic Liver Disease Questionnaire-Hepatitis C Virus (CLDQ-HCV). The challenges of creating a translation for use in multiple countries are discussed, and the criteria for a global translation project explained. Based on a thorough translation and linguistic validation methodology including a concept elaboration, multiple forward translations, two back translations, reviews by in-country clinicians and the instrument developer, pilot testing in each target country and multiple sets of proofreading, the key concept of the global translation methodology is consistent international harmonization, achieved through the involvement of linguists from each target country at every stage of the process. This methodology enabled the successful resolution of the translation issues encountered, and resulted in consistent translations of the CLDQ-HCV that were linguistically and culturally appropriate for all target countries.

  20. Measurement of testosterone in human sexuality research: methodological considerations.

    PubMed

    van Anders, Sari M; Goldey, Katherine L; Bell, Sarah N

    2014-02-01

    Testosterone (T) and other androgens are incorporated into an increasingly wide array of human sexuality research, but there are a number of issues that can affect or confound research outcomes. This review addresses various methodological issues relevant to research design in human studies with T; unaddressed, these issues may introduce unwanted noise, error, or conceptual barriers to interpreting results. Topics covered are (1) social and demographic factors (gender and sex; sexual orientations and sexual diversity; social/familial connections and processes; social location variables), (2) biological rhythms (diurnal variation; seasonality; menstrual cycles; aging and menopause), (3) sample collection, handling, and storage (saliva vs. blood; sialogogues, saliva, and tubes; sampling frequency, timing, and context; shipping samples), (4) health, medical issues, and the body (hormonal contraceptives; medications and nicotine; health conditions and stress; body composition, weight, and exercise), and (5) incorporating multiple hormones. Detailing a comprehensive set of important issues and relevant empirical evidence, this review provides a starting point for best practices in human sexuality research with T and other androgens that may be especially useful for those new to hormone research.

  1. Translation and linguistic validation of the Pediatric Patient-Reported Outcomes Measurement Information System measures into simplified Chinese using cognitive interviewing methodology.

    PubMed

    Liu, Yanyan; Hinds, Pamela S; Wang, Jichuan; Correia, Helena; Du, Shizheng; Ding, Jian; Gao, Wen Jun; Yuan, Changrong

    2013-01-01

    The Pediatric Patient-Reported Outcomes Measurement Information System (PROMIS) measures were developed using modern measurement theory and tested in a variety of settings to assess the quality of life, function, and symptoms of children and adolescents experiencing a chronic illness and its treatment. Developed in English, this set of measures had not been translated into Chinese. The objective of this study was to develop the Chinese version of the Pediatric PROMIS measures (C-Ped-PROMIS), specifically 8 short forms, and to pretest the translated measures in children and adolescents through cognitive interviewing methodology. The C-Ped-PROMIS was developed following the standard Functional Assessment of Chronic Illness Therapy Translation Methodology. Bilingual teams from the United States and China reviewed the translation to develop a provisional version, which was then pretested through cognitive interviews with 10 native Chinese-speaking children aged 8 to 17 years in China. The translation was finalized by the bilingual teams. Most items, response options, and instructions were well understood by the children, and some revisions were made to address the participants' comments during the cognitive interviews. The results indicated that the C-Ped-PROMIS items were semantically and conceptually equivalent to the original. Children aged 8 to 17 years in China were able to comprehend these measures and express their experience and feelings about illness or their life. The C-Ped-PROMIS is available for psychometric validation. Future work will be directed at translating the rest of the item banks, calibrating them and creating a Chinese final version of the short forms.

  2. Advanced haptic sensor for measuring human skin conditions

    NASA Astrophysics Data System (ADS)

    Tsuchimi, Daisuke; Okuyama, Takeshi; Tanaka, Mami

    2009-12-01

    This paper is concerned with the development of a tactile sensor using PVDF (polyvinylidene fluoride) film as the sensory receptor to evaluate the softness, smoothness, and stickiness of human skin. Tactile sense is among the most important senses of the human body, along with eyesight, and we can examine skin condition quickly using this sense. However, its subjectivity and ambiguity make it difficult to quantify skin conditions. Therefore, a measurement device which can evaluate skin conditions easily and objectively is in demand among dermatologists, cosmetics companies, and others. In this paper, an advanced haptic sensor system that can measure multiple aspects of skin condition on various parts of the human body is developed. The applications of the sensor system to evaluating the softness, smoothness, and stickiness of skin are investigated through two experiments.

  3. Conditional Probabilities and Collapse in Quantum Measurements

    NASA Astrophysics Data System (ADS)

    Laura, Roberto; Vanni, Leonardo

    2008-09-01

    We show that including both the system and the apparatus in the quantum description of the measurement process, and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same that would be obtained using the postulate of collapse.
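
    In standard notation, the post-measurement statistical operator this description points to is the Lüders (conditional-probability) state. The formulas below are textbook background added for orientation, not equations quoted from the paper:

```latex
% State of the system after the first measurement yields result a_k
% (P_{a_k}: projector onto the eigenspace of a_k; Lueders rule):
\rho_{a_k} \;=\; \frac{P_{a_k}\,\rho\,P_{a_k}}{\operatorname{Tr}\!\big(P_{a_k}\,\rho\big)}
% Probability distribution for a consecutive measurement with projectors Q_{b_j}:
\Pr\!\big(b_j \mid a_k\big) \;=\; \operatorname{Tr}\!\big(Q_{b_j}\,\rho_{a_k}\big)
```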

  4. Methodology for application of field rainfall simulator to revise C-factor database for conditions of the Czech Republic

    NASA Astrophysics Data System (ADS)

    Neumann, Martin; Dostál, Tomáš; Krása, Josef; Kavka, Petr; Davidová, Tereza; Brant, Václav; Kroulík, Milan; Mistr, Martin; Novotný, Ivan

    2016-04-01

    The presentation will introduce a methodology for determining the crop and cover management factor (C-factor) for the universal soil loss equation (USLE) using a field rainfall simulator. The aim of the project is to determine the C-factor value for the different phenophases of the main crops of the Central European region, while also taking into account the different agrotechnical methods. By using the field rainfall simulator, it is possible to perform the measurements in specific phenophases, which is otherwise difficult to execute due to the variability and randomness of natural rainfall. Due to the number of measurements needed, two identical simulators will be used, operated by two independent teams, with a coordinated methodology. The methodology will mainly specify the length of simulation, the rainfall intensity, and the sampling technique. The presentation includes a more detailed account of the methods selected. Due to the wide range of variable crops and soils, it is not possible to execute the measurements for all possible combinations. We therefore decided to perform the measurements for previously selected combinations of soils, crops and agrotechnologies that are the most common in the Czech Republic. During the experiments, the volume of the surface runoff and the amount of sediment will be measured in their temporal distribution, as well as several other important parameters. The key values of the 3D matrix of combinations of crop, agrotechnique and soil will be determined experimentally. The remaining values will be determined by interpolation or by model analogy. There are several methods used for C-factor calculation from measured experimental data. Some of these are not suitable given the type of data gathered. The presentation will discuss the benefits and drawbacks of these methods, as well as the final design of the method used. The problems concerning the selection of a relevant measurement method as well as the final…

  5. Health Systems and Their Assessment: A Methodological Proposal of the Synthetic Outcome Measure

    PubMed Central

    Romaniuk, Piotr; Kaczmarek, Krzysztof; Syrkiewicz-Świtała, Magdalena; Holecki, Tomasz; Szromek, Adam R.

    2018-01-01

    The effectiveness of health systems is an area of constant interest for public health researchers and practitioners. The varied approach to effectiveness itself has resulted in numerous methodological proposals related to its measurement. The limitations of the currently used methods lead to a constant search for better tools for the assessment of health systems. This article shows the possibilities of using the health system synthetic outcome measure (SOM) for this purpose. It is an original tool using 41 indicators referring to the epidemiological situation, health behaviors, and factors related to the health-care system, which allows a relatively quick and easy assessment of the health system in terms of its effectiveness. Constructing the measure of health system functioning in this way allowed its presentation in a dynamic perspective, i.e., assessing not only the health system itself at a given moment in time but also changes in the value of the effectiveness measures. In order to demonstrate the cognitive value of the SOM, an analysis of the effectiveness of health systems in 21 countries of Central and Eastern Europe during the transformation period was carried out. The mean SOM values calculated on the basis of the component measures allowed the countries to be differentiated in terms of the effectiveness of their health systems. Considering the whole period, a similar level of health system effects can be observed in Slovenia, Croatia, Czech Republic, Slovakia, Poland, Macedonia, and Albania. In the middle group, Hungary, Romania, Latvia, Lithuania, Georgia, Estonia, Bulgaria, Belarus, and Armenia were found. The third group, weakest in terms of achieved effects, was formed by the health systems of countries like Ukraine, Moldova, and Russia. The presented method allows for the analysis of health system outcomes from a comparative angle, eliminating the arbitrariness of pinpointing a model solution as a potential reference point in the assessment of the systems…

  6. Innovative methodologies and technologies for thermal energy release measurement.

    NASA Astrophysics Data System (ADS)

    Marotta, Enrica; Peluso, Rosario; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Chiodini, Giovanni; Mangiacapra, Annarita; Petrillo, Zaccaria; Sansivero, Fabio; Vilardo, Giuseppe; Marfe, Barbara

    2016-04-01

    Volcanoes exchange heat, gases and other fluids between the interior of the Earth and its atmosphere, influencing processes both at the surface and above it. This work is devoted to improving the knowledge of the parameters that control the anomalies in heat flux and chemical species emissions associated with the diffuse degassing processes of volcanic and hydrothermal zones. We are studying and developing innovative medium-range remote sensing technologies to measure the variations through time of heat flux and chemical emissions, in order to improve the definition of the activity state of a volcano and allow a better assessment of the related hazard and risk mitigation. The current methodologies used to measure heat flux (i.e. CO2 flux or temperature gradient) are either inefficient or ineffective, and are unable to detect short- to medium-term (days to months) variation trends in the heat flux. Remote sensing of these parameters will allow for measurements faster than the already accredited methods; it will therefore be both more effective and more efficient in an emergency, and it can be used for quick routine monitoring. We are currently developing a method based on drone-borne IR cameras to measure the ground surface temperature which, in a purely conductive regime, is directly correlated to the shallow temperature gradient. The use of flying drones will allow a quick mapping of areas with thermal anomalies and a measurement of their temperature at distances on the order of hundreds of meters. Further development of remote sensing will be done through the use, on flying drones, of multispectral and/or hyperspectral sensors and UV scanners, in order to be able to detect the amount of chemical species released into the atmosphere.

  7. The Self-Concept. Volume 1, A Review of Methodological Considerations and Measuring Instruments. Revised Edition.

    ERIC Educational Resources Information Center

    Wylie, Ruth C.

    This volume of the revised edition describes and evaluates measurement methods, research designs, and procedures which have been or might appropriately be used in self-concept research. Working from the perspective that self-concept or phenomenal personality theories can be scientifically investigated, methodological flaws and questionable…

  8. Conditions of viscosity measurement for detecting irradiated peppers

    NASA Astrophysics Data System (ADS)

    Hayashi, Toru; Todoriki, Setsuko; Okadome, Hiroshi; Kohyama, Kaoru

    1995-04-01

    The viscosity of gelatinized suspensions of black and white peppers decreased with radiation dose. The viscosity was influenced by the gelatinization and viscosity measurement conditions; the difference between unirradiated and irradiated pepper was larger at higher gelatinization pH and temperature. A viscosity parameter normalized to the starch content of the pepper sample and to the viscosity of a 5% corn starch suspension eliminated the influence of viscosity measurement conditions such as viscometer type, shear rate and temperature.

  11. Methodological evaluation and comparison of five urinary albumin measurements.

    PubMed

    Liu, Rui; Li, Gang; Cui, Xiao-Fan; Zhang, Dong-Ling; Yang, Qing-Hong; Mu, Xiao-Yan; Pan, Wen-Jie

    2011-01-01

    Microalbuminuria is an indicator of kidney damage and a risk factor for the progression of kidney disease, cardiovascular disease, and other conditions. Therefore, accurate and precise measurement of urinary albumin is critical. However, there are no reference measurement procedures or reference materials for urinary albumin. Nephelometry, turbidimetry, the colloidal gold method, radioimmunoassay, and chemiluminescence immunoassay were subjected to methodological evaluation based on imprecision, recovery rate, linearity, haemoglobin interference rate, and a verified reference interval. We then tested 40 urine samples from diabetic patients with each method and compared the results between assays. The results indicate that nephelometry has the best analytical performance of the five methods, with an average intra-assay coefficient of variation (CV) of 2.6%, an average inter-assay CV of 1.7%, a mean recovery of 99.6%, linearity of R=1.00 from 2 to 250 mg/l, and an interference rate of <10% at haemoglobin concentrations of <1.82 g/l. The correlation (r) between assays ranged from 0.701 to 0.982, and Bland-Altman plots indicated that the assays provided significantly different results from one another. Nephelometry is the clinical urinary albumin method with the best analytical performance in our study. © 2011 Wiley-Liss, Inc.
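
    The between-assay comparison described above rests on standard Bland-Altman statistics; a minimal sketch, using hypothetical albumin values for two assays, is given below:

        import numpy as np

        def bland_altman(a, b):
            """Bland-Altman statistics for two methods measuring the same samples:
            mean bias and 95 % limits of agreement (bias +/- 1.96 SD of differences)."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            diff = a - b
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

        # Example: urinary albumin (mg/l) for 5 samples by two hypothetical assays
        nephelometry = [12.1, 45.3, 88.0, 150.2, 240.5]
        turbidimetry = [13.0, 47.1, 90.4, 155.0, 249.8]
        bias, loa = bland_altman(nephelometry, turbidimetry)
        print(f"bias = {bias:.2f} mg/l, limits of agreement = {loa}")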

  12. Scalar mixing and strain dynamics methodologies for PIV/LIF measurements of vortex ring flows

    NASA Astrophysics Data System (ADS)

    Bouremel, Yann; Ducci, Andrea

    2017-01-01

    Fluid mixing operations are central to nearly all chemical, petrochemical, and pharmaceutical industries, whether related to biphasic blending in polymerisation processes, cell suspension for biopharmaceutical production, or fractionation of complex oil mixtures. This work aims at providing a fundamental understanding of the mixing and stretching dynamics occurring in a reactor in the presence of a vortical structure; the vortex ring was selected as a flow paradigm of the vortices commonly encountered in stirred and shaken reactors under laminar flow conditions. High-resolution laser-induced fluorescence and particle image velocimetry measurements were carried out to fully resolve the dissipative scales of the flow and provide a complete data set for assessing macro- and micro-mixing characteristics. The analysis builds upon the Lamb-Oseen vortex work of Meunier and Villermaux ["How vortices mix," J. Fluid Mech. 476, 213-222 (2003)] and the engulfment model of Baldyga and Bourne ["Simplification of micromixing calculations. I. Derivation and application of new model," Chem. Eng. J. 42, 83-92 (1989); "Simplification of micromixing calculations. II. New applications," ibid. 42, 93-101 (1989)], which are valid for diffusion-free conditions, and a comparison is made between three methodologies for assessing mixing characteristics. The first method is commonly used in macro-mixing studies and is based on a control-area analysis estimating the variation in time of the concentration standard deviation, while the other two are formulated to provide insight into local segregation dynamics, using either an iso-concentration approach or an iso-concentration-gradient approach to take diffusion into account.

  13. Measuring College Students' Alcohol Consumption in Natural Drinking Environments: Field Methodologies for Bars and Parties

    ERIC Educational Resources Information Center

    Clapp, John D.; Holmes, Megan R.; Reed, Mark B.; Shillington, Audrey M.; Freisthler, Bridget; Lange, James E.

    2007-01-01

    In recent years researchers have paid substantial attention to the issue of college students' alcohol use. One limitation of the current literature is an overreliance on retrospective, self-report survey data. This article presents field methodologies for measuring college students' alcohol consumption in natural drinking environments.…

  14. Body temperature as a conditional response measure for pavlovian fear conditioning.

    PubMed

    Godsil, B P; Quinn, J J; Fanselow, M S

    2000-01-01

    On each of six days, rats were exposed to two contexts: they received an electric shock in one context and nothing in the other. Rats were later tested in each environment without shock. The rats froze and defecated more often in the shock-paired environment, and they also exhibited a significantly larger elevation in rectal temperature in that environment. The rats thus discriminated between the two contexts, and we suggest that the elevation in temperature is a consequence of associative learning. Body temperature can therefore be used as a conditional response measure in Pavlovian fear conditioning experiments that use footshock as the unconditional stimulus.

  15. Muscle dysmorphia: methodological issues, implications for research.

    PubMed

    Suffolk, Mark T; Dovey, Terence M; Goodwin, Huw; Meyer, Caroline

    2013-01-01

    Muscle dysmorphia is a male-dominated, body-image-related psychological condition. Despite continued investigation, contention surrounds the nosological status of this disorder. The aim of this article was to review the muscle dysmorphia literature and provide a qualitative account of methodological issues that may inhibit our understanding. Key issues relating to non-standardized participant groups, measuring instruments, and terminology were identified as potentially inhibiting symptom coherence and diagnostic reliability. New measuring instruments validated with clinical samples, carefully described participant groups, standardized terminology, and a greater emphasis on prospective longitudinal research with specific subgroups of the weight-training community would be of value to the field.

  16. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist.

    PubMed

    Terwee, Caroline B; Mokkink, Lidwine B; Knol, Dirk L; Ostelo, Raymond W J G; Bouter, Lex M; de Vet, Henrica C W

    2012-05-01

    The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. The scoring system was developed based on discussions among experts and testing of the scoring system on 46 articles from a systematic review. Four response options were defined for each COSMIN item (excellent, good, fair, and poor). A quality score per measurement property is obtained by taking the lowest rating of any item in a box ("worst score counts"). Specific criteria for excellent, good, fair, and poor quality for each COSMIN item are described. In defining the criteria, the "worst score counts" algorithm was taken into consideration. This means that only fatal flaws were defined as poor quality. The scores of the 46 articles show how the scoring system can be used to provide an overview of the methodological quality of studies included in a systematic review of measurement properties. Based on experience in testing this scoring system on 46 articles, the COSMIN checklist with the proposed scoring system seems to be a useful tool for assessing the methodological quality of studies included in systematic reviews of measurement properties.
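
    The "worst score counts" algorithm is simple to state in code; the sketch below is an illustration only, using the four response options named in the text:

        # Ratings use an ordinal scale; "worst score counts" takes the minimum.
        ORDER = {"poor": 0, "fair": 1, "good": 2, "excellent": 3}

        def box_quality(item_ratings):
            """Quality score for one COSMIN box: the lowest rating of any item."""
            return min(item_ratings, key=lambda r: ORDER[r])

        # Example: one box with mostly excellent items but one fair item
        print(box_quality(["excellent", "good", "excellent", "fair"]))  # -> "fair"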

  17. Extended Axiomatic Conjoint Measurement: A Solution to a Methodological Problem in Studying Fertility-Related Behaviors.

    ERIC Educational Resources Information Center

    Nickerson, Carol A.; McClelland, Gary H.

    1988-01-01

    A methodology is developed based on axiomatic conjoint measurement to accompany a fertility decision-making model. The usefulness of the model is then demonstrated via an application to a study of contraceptive choice (N=100 male and female family-planning clinic clients). Finally, the validity of the model is evaluated. (TJH)

  18. Condition Assessment Methodology for Spillways

    DTIC Science & Technology

    2008-06-01

  19. Review of PCR methodology : executive summary.

    DOT National Transportation Integrated Search

    1998-01-01

    This study was conducted to review the Pavement Condition Rating (PCR) methodology currently used by the Ohio DOT. The results of the literature search in this connection indicated that many highway agencies use a similar methodology to rate thei...

  20. Absolute Radiation Measurements in Earth and Mars Entry Conditions

    NASA Technical Reports Server (NTRS)

    Cruden, Brett A.

    2014-01-01

    This paper reports on the measurement of radiative heating for shock-heated flows which simulate conditions for Mars and Earth entries. Radiation measurements are made in NASA Ames' Electric Arc Shock Tube at velocities of 3-15 km/s in mixtures of N2/O2 and CO2/N2/Ar. The technique and limitations of the measurement are summarized in some detail. The absolute measurements are discussed with regard to spectral features, radiative magnitude and spatiotemporal trends. Via analysis of spectra it is possible to extract properties such as electron density and rotational, vibrational and electronic temperatures. Relaxation behind the shock is analyzed to determine how these properties relax to equilibrium, and the results are used to validate and refine kinetic models. It is found that, for some conditions, some of these values diverge from equilibrium, indicating a lack of similarity between the shock tube and free-flight conditions. Possible reasons for this are discussed.

  1. Dielectric Barrier Discharge (DBD) Plasma Actuators Thrust-Measurement Methodology Incorporating New Anti-Thrust Hypothesis

    NASA Technical Reports Server (NTRS)

    Ashpis, David E.; Laun, Matthew C.

    2014-01-01

    We discuss thrust measurements of Dielectric Barrier Discharge (DBD) plasma actuators, devices used for aerodynamic active flow control. After a review of our experience with conventional thrust measurement and the significant non-repeatability of its results, we devised a suspended-actuator test setup and now present a methodology of thrust measurement with decreased uncertainty. The methodology consists of frequency scans at constant voltages: the frequency is increased step-wise from several Hz to a maximum of several kHz, then decreased back down to the start frequency. This sequence is performed first at the highest voltage of interest, then repeated at lower voltages. The data in the descending frequency direction are more consistent and are selected for reporting. Sample results show a strong dependence of thrust on humidity, which also affects the consistency and fluctuations of the measurements. We also observed negative values of thrust, or "anti-thrust", at low frequencies between 4 Hz and 64 Hz. The anti-thrust is proportional to the mean-squared voltage and is frequency independent. Departures from the parabolic anti-thrust curve are correlated with the appearance of visible plasma discharges. We propose the anti-thrust hypothesis: the measured thrust is the sum of plasma thrust and anti-thrust, where the anti-thrust exists at all frequencies and voltages and depends on the actuator geometry and materials and on the test installation. It enables the separation of the plasma thrust from the measured total thrust, allowing more meaningful comparisons between actuators at different installations and laboratories. The dependence on test installation was validated by surrounding the actuator with a large-diameter, grounded, metal sleeve.
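
    A minimal sketch of how such a hypothesis enables the separation is given below; the plasma-free low-frequency cutoff and the least-squares fit are assumptions of this sketch, not the authors' procedure:

        import numpy as np

        def separate_plasma_thrust(voltage_rms, freq, thrust, f_low=64.0):
            """Illustrative use of the anti-thrust hypothesis: fit the
            frequency-independent anti-thrust T_anti = -k * V_rms**2 on
            low-frequency points (assumed plasma-free), then recover
            T_plasma = T_measured - T_anti at every operating point."""
            voltage_rms, freq, thrust = map(np.asarray, (voltage_rms, freq, thrust))
            low = freq <= f_low
            # least-squares slope of the parabolic anti-thrust curve
            k = -np.sum(thrust[low] * voltage_rms[low]**2) / np.sum(voltage_rms[low]**4)
            return thrust + k * voltage_rms**2   # estimated plasma thrust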

  2. Development of a methodological tool for the assessment of the hydromorphological conditions of lakes in Europe

    NASA Astrophysics Data System (ADS)

    Gay, Aurore; Argillier, Christine; Reynaud, Nathalie; Nicolas, Delphine; Baudoin, Jean-Marc

    2017-04-01

    The assessment of the ecological status of surface waters, considering the biological, physico-chemical and hydromorphological conditions, is required by the European Water Framework Directive (WFD). While research efforts have concentrated on rivers, lakes have received less attention. Nevertheless, because they act as receptacles of inland waters, provide habitats for an important biodiversity and support numerous services (water supply, recreational activities, hydroelectricity), assessing the ecological quality of lakes is crucial for their protection. Still, this task remains challenging, especially for the hydromorphological compartment. Indeed, while promising tools already exist to assess the biological and physico-chemical status of lakes, our comprehension of the impact of hydromorphological impairments on overall ecosystem functioning remains poor, and existing tools to assess such impacts often focus only on morphological aspects, in a qualitative rather than quantitative way. In this context, our study aims at providing stakeholders with a methodology to quantitatively assess the hydrological and morphological quality of lakes in Europe. The developed methodology, the LAKe HYdromorphological Conditions tool (LAKHYC tool), is based on our current knowledge of the functioning of lakes and on pre-existing works (e.g., Rowan et al., 2012; Rinaldi et al., 2013). The LAKHYC tool integrates the six parameters required by the WFD, each one being assessed by at least three descriptors that are calculated as Ecological Quality Ratios, i.e., as the deviation from a reference condition. The originality of the present method lies in the fact that specific reference conditions are defined for each descriptor. In this way, we avoid using a predetermined set of lakes considered to be unimpacted by human activities, which often corresponds to natural lakes in specific areas (e.g., mountains) and does not represent the diversity
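
    A minimal sketch of a descriptor expressed as an Ecological Quality Ratio follows, assuming a simple linear scaling between a worst case and the descriptor-specific reference; the exact formulation used by the LAKHYC tool is not specified here:

        def ecological_quality_ratio(observed, reference, worst=0.0):
            """Descriptor as an Ecological Quality Ratio: the deviation of the
            observed value from its descriptor-specific reference condition,
            scaled to [0, 1] (1 = reference state)."""
            eqr = (observed - worst) / (reference - worst)
            return max(0.0, min(1.0, eqr))

        # Example: hypothetical shoreline-naturalness descriptor
        print(ecological_quality_ratio(observed=0.62, reference=0.95))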

  3. Effects of Testing Conditions on Self-Concept Measurement

    ERIC Educational Resources Information Center

    Chandler, Theodore A.; And Others

    1976-01-01

    Many self-concept measures employ several different scales to which the subject responds in a set order at one sitting. This study examined effects of different testing conditions. The Index of Adjustment and Values (IAV) was administered to 191 graduate students under two different sequences and two time delay conditions. Results indicate…

  4. A new methodology for vibration error compensation of optical encoders.

    PubMed

    Lopez, Jesus; Artes, Mariano

    2012-01-01

    Optical encoders are sensors based on grating interference patterns. Tolerances inherent to the manufacturing process can induce errors in position accuracy as the measurement signals depart from ideal conditions. If the encoder is working under vibration, the oscillating movement of the scanning head is registered by the encoder system as a displacement, introducing an error into the counter that adds to the graduation, system and installation errors. Performance can be improved by techniques that compensate the error through processing of the measurement signals. In this work a new "ad hoc" methodology is presented to compensate the error of the encoder when it is working under the influence of vibration. The methodology is based on fitting techniques applied to the Lissajous figure of the deteriorated measurement signals and the use of a look-up table, resulting in a compensation procedure that yields higher sensor accuracy.
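
    As a rough illustration of the idea (not the authors' algorithm), the sketch below estimates the offsets and amplitude mismatch of the two quadrature signals from their Lissajous figure and returns a corrected phase:

        import numpy as np

        def correct_quadrature(u, v):
            """Minimal sketch of Lissajous-based compensation: estimate the DC
            offsets and amplitude mismatch of the quadrature signals, normalise
            them, and return the corrected phase (position within one grating
            period).  A full implementation would also fit the phase error
            between channels and use a look-up table for residual errors."""
            u, v = np.asarray(u, float), np.asarray(v, float)
            u0, v0 = u.mean(), v.mean()              # DC offsets
            au = np.sqrt(2) * (u - u0).std()         # amplitude estimates
            av = np.sqrt(2) * (v - v0).std()
            return np.arctan2((v - v0) / av, (u - u0) / au)  # phase in radians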

  5. A New Methodology for Vibration Error Compensation of Optical Encoders

    PubMed Central

    Lopez, Jesus; Artes, Mariano

    2012-01-01

    Optical encoders are sensors based on grating interference patterns. Tolerances inherent to the manufacturing process can induce errors in position accuracy as the measurement signals depart from ideal conditions. If the encoder is working under vibration, the oscillating movement of the scanning head is registered by the encoder system as a displacement, introducing an error into the counter that adds to the graduation, system and installation errors. Performance can be improved by techniques that compensate the error through processing of the measurement signals. In this work a new “ad hoc” methodology is presented to compensate the error of the encoder when it is working under the influence of vibration. The methodology is based on fitting techniques applied to the Lissajous figure of the deteriorated measurement signals and the use of a look-up table, resulting in a compensation procedure that yields higher sensor accuracy. PMID:22666067

  6. Extreme Sea Conditions in Shallow Water: Estimation based on in-situ measurements

    NASA Astrophysics Data System (ADS)

    Le Crom, Izan; Saulnier, Jean-Baptiste

    2013-04-01

    The design of marine renewable energy devices and components is based, among other factors, on the assessment of the extreme environmental conditions (winds, currents, waves, and water level) that must be combined together in order to evaluate the maximal loads on a floating or fixed structure, and on the anchoring system, over a given return period. Measuring devices are generally deployed at sea over relatively short durations (a few months to a few years), particularly for describing the water free-surface elevation, so extrapolation methods based on hindcast data (and therefore on wave simulation models) have to be used. How to combine the action of the different loads (winds and waves, for instance) in a realistic way, and which correlation of return periods should be used, are highly topical issues. However, the assessment of the extreme condition itself remains a crucial, sensitive, and not fully solved task. Above all in shallow water, the extreme wave height, Hmax, is the most significant contribution in the dimensioning process of marine renewable energy devices. As a case study, existing methodologies for deep water have been applied to SEMREV, the French marine energy test site. The interest of this study, especially at this location, goes beyond the simple application to the deployment of SEMREV's wave energy converters and floating wind turbines, as it could also be extended to the Banc de Guérande offshore wind farm planned close by and, more generally, to pipes and communication cables, for which the same problem recurs. The paper first presents the existing measurements (wave and wind on site), the prediction chain developed via wave models and the extrapolation methods applied to hindcast data, and then tries to formulate recommendations for improving this assessment in shallow water.
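
    For illustration, a common extrapolation step is to fit an extreme-value distribution to hindcast annual maxima and read off a return level. The sketch below, with hypothetical data, assumes a GEV fit and ignores depth-limited breaking, which caps Hmax in very shallow water:

        import numpy as np
        from scipy.stats import genextreme

        def return_level(annual_maxima, return_period_years):
            """Fit a GEV distribution to annual maxima of significant wave height
            (e.g. from hindcast data) and return the value exceeded on average
            once per return period."""
            params = genextreme.fit(np.asarray(annual_maxima, float))
            return genextreme.ppf(1.0 - 1.0 / return_period_years, *params)

        # Hypothetical annual maxima of Hs (m) at a shallow-water site
        hs_max = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 5.3, 4.2, 4.9, 4.5]
        print(f"50-year Hs ~ {return_level(hs_max, 50):.1f} m")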

  7. Optimization of Synthesis Conditions of Carbon Nanotubes via Ultrasonic-Assisted Floating Catalyst Deposition Using Response Surface Methodology

    PubMed Central

    Mohammadian, Narges; Ghoreishi, Seyyed M.; Hafeziyeh, Samira; Saeidi, Samrand; Dionysiou, Dionysios D.

    2018-01-01

    The growing use of carbon nanotubes (CNTs) in a plethora of applications has motivated the investigation of CNT synthesis by new methods. In this study, an ultrasonic-assisted chemical vapor deposition (CVD) method was employed to synthesize CNTs. The difficulty of controlling cluster size and achieving uniform distribution, the major problem in previous methods, was addressed by using an ultrasonic bath and dissolving the ferrocene in xylene outside the reactor. The operating conditions were optimized using a rotatable central composite design (CCD), and response surface methodology (RSM) was used to analyze the experiments; the statistical design was very effective in that it decreased the number of experiments needed to reach the optimum conditions. Synthesis of CNTs was studied as a function of three independent parameters: hydrogen flow rate (120-280 cm3/min), catalyst concentration (2-6 wt %), and synthesis temperature (800-1200 °C). Optimum conditions for the synthesis of CNTs were found to be 3.78 wt % catalyst concentration, 184 cm3/min hydrogen flow rate, and 976 °C synthesis temperature. Under these conditions, the Raman spectrum shows high IG/ID values, indicating high-quality CNTs. PMID:29747451
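
    For illustration, the RSM step amounts to fitting a full second-order polynomial to the CCD runs and then maximising the fitted surface; a minimal least-squares sketch (not the authors' software) follows:

        import numpy as np
        from itertools import combinations

        def fit_quadratic_response(X, y):
            """Illustrative RSM step: fit the full second-order model
            y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)
            by least squares to CCD runs X (n_runs x n_factors) and a response y
            (e.g. CNT quality).  The optimum is then located by maximising the
            fitted surface, e.g. with scipy.optimize.minimize on its negative."""
            X, y = np.asarray(X, float), np.asarray(y, float)
            n = X.shape[1]
            cols = [np.ones(len(X))]
            cols += [X[:, i] for i in range(n)]                      # linear terms
            cols += [X[:, i] ** 2 for i in range(n)]                 # quadratic terms
            cols += [X[:, i] * X[:, j] for i, j in combinations(range(n), 2)]
            A = np.column_stack(cols)
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coef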

  8. Dissociating Contingency Awareness and Conditioned Attitudes: Evidence of Contingency-Unaware Evaluative Conditioning

    ERIC Educational Resources Information Center

    Hutter, Mandy; Sweldens, Steven; Stahl, Christoph; Unkelbach, Christian; Klauer, Karl Christoph

    2012-01-01

    Whether human evaluative conditioning can occur without contingency awareness has been the subject of an intense and ongoing debate for decades, troubled by a wide array of methodological difficulties. Following recent methodological innovations, the available evidence currently points to the conclusion that evaluative conditioning effects do not…

  9. Development of a system to measure local measurement conditions around textile electrodes.

    PubMed

    Kim, Saim; Oliveira, Joana; Roethlingshoefer, Lisa; Leonhard, Steffen

    2010-01-01

    The three main factors influencing the interface between a textile electrode and the skin are temperature, contact pressure and relative humidity. This paper presents first results from a prototype that measures these local measurement conditions around textile electrodes. The wearable prototype is a data acquisition system based on a microcontroller with a flexible sensor sleeve. Validation measurements included variation of ambient temperature, contact pressure and sleeve material. Results show a good correlation with data found in the literature.

  10. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    PubMed

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology is based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor, followed by measurement of the luminescence of the bacteria. The assays were conducted with contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided precise control of the reaction conditions, an asset for the analysis of a large number of samples. The developed methodology was applied to evaluating the impact of a set of ionic liquids (ILs) on V. fischeri, and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed a higher impact on V. fischeri, evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested on more classes of compounds and used as an alternative to microplate-based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
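
    EC50 values such as those discussed here are typically obtained from a dose-response fit; a minimal sketch with a two-parameter Hill curve and hypothetical data follows (the actual fitting model used in the study is not specified):

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(c, ec50, n):
            """Fraction of V. fischeri luminescence inhibited at concentration c."""
            return c**n / (ec50**n + c**n)

        def fit_ec50(conc, inhibition):
            """Fit a two-parameter Hill curve to inhibition data (0..1) and return
            the EC50, the concentration giving 50 % luminescence inhibition."""
            popt, _ = curve_fit(hill, conc, inhibition, p0=[np.median(conc), 1.0])
            return popt[0]

        # Hypothetical IL dilution series (mg/l) and measured inhibition fractions
        print(fit_ec50([1, 5, 10, 50, 100, 500], [0.02, 0.10, 0.22, 0.55, 0.70, 0.95]))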

  11. Practical methodological guide for hydrometric inter-laboratory organisation

    NASA Astrophysics Data System (ADS)

    Besson, David; Bertrand, Xavier

    2015-04-01

    Discharge measurements performed by the French governmental hydrometry teams feed a national database. These data support general knowledge of river flows, flood forecasting, low-water surveys, statistical flow calculations, regulatory flow control and many other uses. Regularly checking measurement quality and better quantifying its accuracy is therefore an absolute need. The practice of inter-laboratory comparison in hydrometry has particularly developed during the last decade. Indeed, a discharge measurement cannot easily be linked to a standard, so controlling the accuracy of on-site measurements is very difficult, and inter-laboratory comparison is a practical solution to this issue. However, it needs common rules in order to ease its practice and legitimize its results. To that end, the French governmental hydrometry teams produced a practical methodological guide for organising hydrometric inter-laboratory comparisons, intended for the community of hydrometers, with a view to harmonising inter-laboratory comparison practices across instruments (ADCP, current meter on wading rod or from a gauging van, tracer dilution, surface speed) and flow ranges (flood, low water), and to ensuring that results are formalised and archived. The guide is grounded in the experience of the governmental teams and their partners, following existing approaches (especially the Doppler group). It is designed to validate the compliance of measurements and to identify outliers, whether of hardware, methodological, environmental or human origin. Inter-laboratory comparison provides the means to verify the compliance of the instruments (devices + methods + operators) and provides methods to determine an experimental uncertainty of the tested measurement method, valid only for the site and the measurement conditions; it does not address the calibration or periodic monitoring of the instruments themselves. After some conceptual definitions, the guide describes the different stages of an

  12. Radioactive waste disposal fees-Methodology for calculation

    NASA Astrophysics Data System (ADS)

    Bemš, Július; Králík, Tomáš; Kubančák, Ján; Vašíček, Jiří; Starý, Oldřich

    2014-11-01

    This paper summarizes the methodological approach used to calculate the fee for low- and intermediate-level radioactive waste disposal and for spent fuel disposal. The methodology is based on simulating the cash flows related to the operation of the waste disposal system. The paper demonstrates the application of the methodology under the conditions of the Czech Republic.
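
    As a toy illustration of the cash-flow principle only (not the actual Czech model), a levelized fee can be computed so that discounted fee income covers the discounted system costs:

        def levelized_fee(costs, volumes, rate):
            """Minimal cash-flow sketch: the fee per unit of waste that makes the
            present value of fee income cover the present value of disposal costs.
            costs[t] and volumes[t] are yearly system costs and waste amounts."""
            pv_costs = sum(c / (1 + rate)**t for t, c in enumerate(costs))
            pv_volumes = sum(v / (1 + rate)**t for t, v in enumerate(volumes))
            return pv_costs / pv_volumes

        # Hypothetical 4-year example: costs in M€, volumes in m3 of conditioned waste
        print(levelized_fee([10, 12, 12, 30], [500, 520, 510, 480], rate=0.03))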

  13. Drosophila Courtship Conditioning As a Measure of Learning and Memory.

    PubMed

    Koemans, Tom S; Oppitz, Cornelia; Donders, Rogier A T; van Bokhoven, Hans; Schenck, Annette; Keleman, Krystyna; Kramer, Jamie M

    2017-06-05

    Many insights into the molecular mechanisms underlying learning and memory have been elucidated through the use of simple behavioral assays in model organisms such as the fruit fly, Drosophila melanogaster. Drosophila is useful for understanding the basic neurobiology underlying cognitive deficits resulting from mutations in genes associated with human cognitive disorders, such as intellectual disability (ID) and autism. This work describes a methodology for testing learning and memory using a classic paradigm in Drosophila known as courtship conditioning. Male flies court females using a distinct pattern of easily recognizable behaviors. Premated females are not receptive to mating and will reject the male's copulation attempts. In response to this rejection, male flies reduce their courtship behavior. This learned reduction in courtship behavior is measured over time, serving as an indicator of learning and memory. The basic numerical output of this assay is the courtship index (CI), which is defined as the percentage of time that a male spends courting during a 10 min interval. The learning index (LI) is the relative reduction of CI in flies that have been exposed to a premated female compared to naïve flies with no previous social encounters. For the statistical comparison of LIs between genotypes, a randomization test with bootstrapping is used. To illustrate how the assay can be used to address the role of a gene relating to learning and memory, the pan-neuronal knockdown of Dihydroxyacetone phosphate acyltransferase (Dhap-at) was characterized here. The human ortholog of Dhap-at, glyceronephosphate O-acyltransferase (GNPT), is involved in rhizomelic chondrodysplasia punctata type 2, an autosomal-recessive syndrome characterized by severe ID. Using the courtship conditioning assay, it was determined that Dhap-at is required for long-term memory, but not for short-term memory. This result serves as a basis for further investigation of the underlying molecular
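
    The indices lend themselves to a compact implementation. The sketch below computes the LI and a simplified permutation test for a single genotype; the published analysis compares LIs between genotypes with a randomization test and bootstrapping, which this sketch does not reproduce:

        import numpy as np

        rng = np.random.default_rng(0)

        def learning_index(ci_naive, ci_trained):
            """LI = relative reduction of the courtship index after training."""
            return (np.mean(ci_naive) - np.mean(ci_trained)) / np.mean(ci_naive)

        def randomization_p(ci_naive, ci_trained, n_perm=10000):
            """Permutation test: how often does shuffling naive/trained labels
            produce an LI at least as large as the observed one?"""
            observed = learning_index(ci_naive, ci_trained)
            pooled = np.concatenate([ci_naive, ci_trained])
            n = len(ci_naive)
            count = 0
            for _ in range(n_perm):
                rng.shuffle(pooled)
                count += learning_index(pooled[:n], pooled[n:]) >= observed
            return count / n_perm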

  14. ASSESSING THE CONDITION OF SOUTH CAROLINA'S ESTUARIES: A NEW APPROACH INVOLVING INTEGRATED MEASURES OF CONDITION

    EPA Science Inventory

    The South Carolina Estuarine and Coastal Assessment Program (SCECAP) was initiated in 1999 to assess the condition of the state's coastal habitats using multiple measures of water quality, sediment quality, and biological condition. Sampling has subsequently been expanded to incl...

  15. A methodology for obtaining on-orbit SI-traceable spectral radiance measurements in the thermal infrared

    NASA Astrophysics Data System (ADS)

    Dykema, John A.; Anderson, James G.

    2006-06-01

    A methodology to achieve spectral thermal radiance measurements from space with demonstrable on-orbit traceability to the International System of Units (SI) is described. This technique results in measurements of infrared spectral radiance R(ν̃), with spectral index ν̃ in cm⁻¹, with a relative combined uncertainty u_c[R(ν̃)] of 0.0015 (k = 1) for the average mid-infrared radiance emitted by the Earth. This combined uncertainty, expressed in brightness temperature units, is equivalent to ±0.1 K at 250 K at 750 cm⁻¹. This measurement goal is achieved by utilizing a new method for infrared scale realization combined with an instrument design optimized to minimize component uncertainties and admit tests of radiometric performance. The SI traceability of the instrument scale is established by evaluation against source-based and detector-based infrared scales in defined laboratory protocols before launch. A novel strategy is executed to ensure fidelity of on-orbit calibration to the pre-launch scale. This strategy for on-orbit validation relies on the overdetermination of instrument calibration. The pre-launch calibration against scales derived from physically independent paths to the base SI units provides the foundation for a critical analysis of the overdetermined on-orbit calibration to establish an SI-traceable estimate of the combined measurement uncertainty. Redundant calibration sources and built-in diagnostic tests to assess component measurement uncertainties verify the SI traceability of the instrument calibration over the mission lifetime. This measurement strategy can be realized by a practical instrument, a prototype Fourier-transform spectrometer under development for deployment on a small satellite. The measurement record resulting from the methodology described here meets the observational requirements for climate monitoring and climate model testing and improvement.
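
    The quoted equivalence between a 0.0015 relative radiance uncertainty and ±0.1 K at 250 K and 750 cm⁻¹ follows from inverting the Planck function; a short numerical check in Python:

        import numpy as np

        H, C, K = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # SI constants

        def planck_radiance(T, wavenumber_cm):
            """Planck spectral radiance, W m-2 sr-1 per m-1, at T kelvin."""
            nu = wavenumber_cm * 100.0                  # cm-1 -> m-1
            return 2 * H * C**2 * nu**3 / np.expm1(H * C * nu / (K * T))

        def brightness_temperature(radiance, wavenumber_cm):
            """Invert the Planck function for the brightness temperature (K)."""
            nu = wavenumber_cm * 100.0
            c1, c2 = 2 * H * C**2, H * C / K
            return c2 * nu / np.log1p(c1 * nu**3 / radiance)

        # A 0.15 % radiance error at 750 cm-1 maps to roughly 0.1 K near 250 K:
        L = planck_radiance(250.0, 750.0)
        print(brightness_temperature(1.0015 * L, 750.0) - 250.0)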

  16. Exhaled nitric oxide measurements in the first 2 years of life: methodological issues, clinical and epidemiological applications

    PubMed Central

    Gabriele, Carmelo; de Benedictis, Fernando M; de Jongste, Johan C

    2009-01-01

    Fractional exhaled nitric oxide (FeNO) is a useful tool to diagnose and monitor eosinophilic bronchial inflammation in asthmatic children and adults. In children younger than 2 years of age FeNO has been successfully measured both with the tidal breathing and with the single breath techniques. However, there are a number of methodological issues that need to be addressed in order to increase the reproducibility of the FeNO measurements within and between infants. Indeed, a standardized method to measure FeNO in the first 2 years of life would be extremely useful in order to meaningfully interpret FeNO values in this age group. Several factors related to the measurement conditions have been found to influence FeNO, such as expiratory flow, ambient NO and nasal contamination. Furthermore, the exposure to pre- and postnatal risk factors for respiratory morbidity has been shown to influence FeNO values. Therefore, these factors should always be assessed and their association with FeNO values in the specific study population should be evaluated and, eventually, controlled for. There is evidence consistently suggesting that FeNO is increased in infants with family history of atopy/atopic diseases and in infants with recurrent wheezing. These findings could support the hypothesis that eosinophilic bronchial inflammation is present at an early stage in those infants at increased risk of developing persistent respiratory symptoms and asthma. Furthermore, it has been shown that FeNO measurements could represent a useful tool to assess bronchial inflammation in other airways diseases, such as primary ciliary dyskinesia, bronchopulmonary dysplasia and cystic fibrosis. Further studies are needed in order to improve the reproducibility of the measurements, and large prospective studies are warranted in order to evaluate whether FeNO values measured in the first years of life can predict the future development of asthma or other respiratory diseases. PMID:19712438

  17. The methodological quality of diagnostic test accuracy studies for musculoskeletal conditions can be improved.

    PubMed

    Henschke, Nicholas; Keuerleber, Julia; Ferreira, Manuela; Maher, Christopher G; Verhagen, Arianne P

    2014-04-01

    To provide an overview of reporting and methodological quality in diagnostic test accuracy (DTA) studies in the musculoskeletal field and evaluate the use of the QUality Assessment of Diagnostic Accuracy Studies (QUADAS) checklist. A literature review identified all systematic reviews that evaluated the accuracy of clinical tests to diagnose musculoskeletal conditions and used the QUADAS checklist. Two authors screened all identified reviews and extracted data on the target condition, index tests, reference standard, included studies, and QUADAS items. A descriptive analysis of the QUADAS checklist was performed, along with Rasch analysis to examine the construct validity and internal reliability. A total of 19 systematic reviews were included, which provided data on individual items of the QUADAS checklist for 392 DTA studies. In the musculoskeletal field, uninterpretable or intermediate test results are commonly not reported, with 175 (45%) studies scoring "no" to this item. The proportion of studies fulfilling certain items varied from 22% (item 11) to 91% (item 3). The interrater reliability of the QUADAS checklist was good and Rasch analysis showed excellent construct validity and internal consistency. This overview identified areas where the reporting and performance of diagnostic studies within the musculoskeletal field can be improved. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Aircraft and ground vehicle friction measurements obtained under winter runway conditions

    NASA Technical Reports Server (NTRS)

    Yager, Thomas J.

    1989-01-01

    Tests with specially instrumented NASA B-737 and B-727 aircraft, together with several different ground friction measuring devices, have been conducted for a variety of runway surface types and wetness conditions. This effort is part of the Joint FAA/NASA Aircraft/Ground Vehicle Runway Friction Program, aimed at obtaining a better understanding of aircraft ground handling performance under adverse weather conditions and at defining relationships between aircraft and ground vehicle tire friction measurements. Aircraft braking performance on dry, wet, snow- and ice-covered runway conditions is discussed, together with ground vehicle friction data obtained under similar runway conditions. For the wet, compacted-snow- and ice-covered runway conditions, the relationship between ground vehicle and aircraft friction data is identified. The influence of major test parameters on friction measurements, such as speed, test tire characteristics, and surface contaminant type, is discussed. The test results indicate that the use of properly maintained and calibrated ground vehicles for monitoring runway friction conditions should be encouraged, particularly under adverse weather conditions.

  19. Harman Measurements for Thermoelectric Materials and Modules under Non-Adiabatic Conditions

    NASA Astrophysics Data System (ADS)

    Roh, Im-Jun; Lee, Yun Goo; Kang, Min-Su; Lee, Jae-Uk; Baek, Seung-Hyub; Kim, Seong Keun; Ju, Byeong-Kwon; Hyun, Dow-Bin; Kim, Jin-Sang; Kwon, Beomjin

    2016-12-01

    The accuracy of the Harman measurement largely depends on the heat transfer between the sample and its surroundings, the so-called parasitic thermal effects (PTEs). As with material evaluations, measurements of thermoelectric modules (TEMs) are also affected by PTEs, especially when measuring under atmospheric conditions. Here, we study correction methods for Harman measurements with systematically varied samples (both bulk materials and TEMs) under various conditions. Among the several PTEs, the heat transfer via the electric wires is critical. We therefore estimate the thermal conductance of the electric wires and correct the measured properties for a given sample shape and measuring temperature. The PTEs are responsible for the underestimation of the TEM properties, especially under atmospheric conditions (10-35%). This study will be useful for accurately characterizing the thermoelectric properties of materials and modules.
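
    A minimal sketch of a parallel-conductance correction of the kind described follows; the form of the correction is an assumption of this sketch, not the authors' exact formula:

        def corrected_zt(zt_measured, k_sample, k_wires):
            """Illustrative Harman-method correction: a parasitic thermal path
            (e.g. the electric wires) in parallel with the sample carries part of
            the Peltier heat, shrinking dT and underestimating zT.  Treating the
            paths as parallel conductances gives, as an assumed first-order form,
                zT_true ~ zT_measured * (1 + K_wires / K_sample),
            with k_sample and k_wires thermal conductances in W/K."""
            return zt_measured * (1.0 + k_wires / k_sample)

        # Example: a 10 % parasitic conductance implies a 10 % underestimate
        print(corrected_zt(0.80, k_sample=1.0e-3, k_wires=1.0e-4))  # -> 0.88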

  20. Harman Measurements for Thermoelectric Materials and Modules under Non-Adiabatic Conditions

    PubMed Central

    Roh, Im-Jun; Lee, Yun Goo; Kang, Min-Su; Lee, Jae-Uk; Baek, Seung-Hyub; Kim, Seong Keun; Ju, Byeong-Kwon; Hyun, Dow-Bin; Kim, Jin-Sang; Kwon, Beomjin

    2016-01-01

    The accuracy of the Harman measurement largely depends on the heat transfer between the sample and its surroundings, the so-called parasitic thermal effects (PTEs). As with material evaluations, measurements of thermoelectric modules (TEMs) are also affected by PTEs, especially when measuring under atmospheric conditions. Here, we study correction methods for Harman measurements with systematically varied samples (both bulk materials and TEMs) under various conditions. Among the several PTEs, the heat transfer via the electric wires is critical. We therefore estimate the thermal conductance of the electric wires and correct the measured properties for a given sample shape and measuring temperature. The PTEs are responsible for the underestimation of the TEM properties, especially under atmospheric conditions (10–35%). This study will be useful for accurately characterizing the thermoelectric properties of materials and modules. PMID:27966622

  1. Current psychometric and methodological issues in the measurement of overgeneral autobiographical memory.

    PubMed

    Griffith, James W; Sumner, Jennifer A; Raes, Filip; Barnhofer, Thorsten; Debeer, Elise; Hermans, Dirk

    2012-12-01

    Autobiographical memory is a multifaceted construct that is related to psychopathology and other difficulties in functioning. Across many studies, a variety of methods have been used to study autobiographical memory. The relationship between overgeneral autobiographical memory (OGM) and psychopathology has been of particular interest, and many studies of this cognitive phenomenon rely on the Autobiographical Memory Test (AMT) to assess it. In this paper, we examine several methodological approaches to studying autobiographical memory, and focus primarily on methodological and psychometric considerations in OGM research. We pay particular attention to what is known about the reliability, validity, and methodological variations of the AMT. The AMT has adequate psychometric properties, but there is great variability in methodology across studies that use it. Methodological recommendations and suggestions for future studies are presented. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Measuring persistence: A literature review focusing on methodological issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, A.K.; Brown, M.A.; Trumble, D.

    1995-03-01

    This literature review was conducted as part of a larger project to produce a handbook on the measurement of persistence. The past decade has marked the development of the concept of persistence and a growing recognition that the long-term impacts of demand-side management (DSM) programs warrant careful assessment. Although increasing attention has been paid to the topic of persistence, no clear consensus has emerged either about its definition or about the methods most appropriate for its measurement and analysis. This project strives to fill that gap by reviewing the goals, terminology, and methods of past persistence studies. It was conducted from the perspective of a utility that seeks to acquire demand-side resources and is interested in their long-term durability; it was not conducted from the perspective of the individual consumer. Over 30 persistence studies, articles, and protocols were examined for this report. The review begins by discussing the underpinnings of persistence studies: namely, the definitions of persistence and the purposes of persistence studies. Then it describes issues relevant to both the collection and analysis of data on the persistence of energy and demand savings. Findings from persistence studies also are summarized. Throughout the review, four studies are used repeatedly to illustrate different methodological and analytical approaches to persistence so that readers can track the data collection, data analysis, and findings of a set of comprehensive studies that represent alternative approaches.

  3. Thermophysical Properties Measurement of High-Temperature Liquids Under Microgravity Conditions in Controlled Atmospheric Conditions

    NASA Technical Reports Server (NTRS)

    Watanabe, Masahito; Ozawa, Shumpei; Mizuno, Akotoshi; Hibiya, Taketoshi; Kawauchi, Hiroya; Murai, Kentaro; Takahashi, Suguru

    2012-01-01

    Microgravity conditions offer advantages for measuring the surface tension and viscosity of metallic liquids by the oscillating drop method with an electromagnetic levitation (EML) device. We are therefore preparing thermophysical property measurements using the Materials-Science Laboratory ElectroMagnetic-Levitator (MSL-EML) facilities on the International Space Station (ISS). Recently, it has been recognized that the dependence of surface tension on oxygen partial pressure (Po2) must be considered for industrial application of surface tension values; an effect of Po2 on surface tension would also change the viscosity apparent from the damped-oscillation model. Therefore, surface tension and viscosity must be measured simultaneously under the same atmospheric conditions. Moreover, the effect of the electromagnetic force (EMF) on the surface oscillations must be clarified in order to obtain the ideal surface oscillation, because the EMF acts as an external force on the oscillating liquid droplets, and an excessive EMF makes the apparent viscosity values large. Using the parabolic flight levitation experimental facilities (PFLEX), our group systematically investigated the effects of Po2 and external EMF on the surface oscillation of levitated liquid droplets, in preparation for precise measurements of the surface tension and viscosity of high-temperature liquids in future ISS experiments. We observed the surface oscillations of levitated liquid alloys using PFLEX in flight experiments on board a Gulfstream II (G-II) airplane operated by DAS, under controlled Po2 and suitable EMF conditions. In these experiments we obtained density, viscosity and surface tension values of liquid Cu; we discuss their agreement with previously reported data, as well as the differences in surface oscillations observed as the EMF conditions change.

  4. Methodological Quality Assessment of Meta-Analyses of Hyperthyroidism Treatment.

    PubMed

    Qin, Yahong; Yao, Liang; Shao, Feifei; Yang, Kehu; Tian, Limin

    2018-01-01

    Hyperthyroidism is a common condition associated with increased morbidity and mortality. A number of meta-analyses (MAs) have assessed therapeutic measures for hyperthyroidism, including antithyroid drugs, surgery, and radioiodine; however, their methodological quality has not been evaluated. This study evaluated the methodological quality and summarized the evidence obtained from MAs of hyperthyroidism treatments with radioiodine, antithyroid drugs, and surgery. We searched the PubMed, EMBASE, Cochrane Library, Web of Science, and Chinese Biomedical Literature Database databases. Two investigators independently assessed the meta-analysis titles and abstracts for inclusion. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. A total of 26 MAs fulfilled the inclusion criteria. Based on the AMSTAR scores, the average methodological quality was 8.31, with large variability ranging from 4 to 11. The methodological quality of English-language meta-analyses was better than that of Chinese-language meta-analyses, and Cochrane reviews had better methodological quality than non-Cochrane reviews owing to better study selection and data extraction, the inclusion of unpublished studies, and better reporting of study characteristics. The authors did not report conflicts of interest in 53.8% of the meta-analyses, and 19.2% did not report the harmful effects of treatment. Publication bias was not assessed in 38.5% of the meta-analyses, and 19.2% did not report the follow-up time. This large-scale assessment of the methodological quality of meta-analyses of hyperthyroidism treatment highlighted methodological strengths and weaknesses. Consideration of scientific quality when formulating conclusions should be made explicit. Future meta-analyses should improve the reporting of conflicts of interest. © Georg Thieme Verlag KG Stuttgart · New York.

  5. Optimization of phenolics and flavonoids extraction conditions of Curcuma Zedoaria leaves using response surface methodology.

    PubMed

    Azahar, Nur Fauwizah; Gani, Siti Salwa Abd; Mohd Mokhtar, Nor Fadzillah

    2017-10-02

    This study focused on maximizing the extraction yield of total phenolics and flavonoids from Curcuma zedoaria leaves as a function of time (80-120 min), temperature (60-80 °C) and ethanol concentration (70-90 v/v%). The data were subjected to response surface methodology (RSM); the polynomial equations for all models were significant, showed no lack of fit, and presented adjusted determination coefficients (R²) above 99%, proving their suitability for prediction purposes. Using a desirability function, the optimum operating conditions for a higher extraction of phenolics and flavonoids were found to be 75 °C, 92 min of extraction time and a 90:10 ethanol concentration ratio. Under these optimal conditions, the experimental values for total phenolics and flavonoids of Curcuma zedoaria leaves were 125.75 ± 0.17 mg of gallic acid equivalents and 6.12 ± 0.23 mg quercetin/g of extract, which agreed closely with the predicted values. The leaves of Curcuma zedoaria can thus be considered to have strong antioxidative ability and may be used in various cosmeceutical or medicinal applications.
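
    For illustration, the desirability step combines the individual response desirabilities into a geometric mean that is maximised over the fitted surfaces. In the sketch below the scaling ranges are placeholders, not values from the study:

        import numpy as np

        def desirability(y, lo, hi):
            """Larger-is-better Derringer-Suich desirability: 0 below lo, 1 above hi."""
            return float(np.clip((y - lo) / (hi - lo), 0.0, 1.0))

        def overall_desirability(phenolics, flavonoids):
            """Geometric mean of the individual desirabilities; the optimum
            operating conditions maximise this value over the fitted surfaces."""
            d1 = desirability(phenolics, lo=80.0, hi=130.0)   # mg GAE/g extract
            d2 = desirability(flavonoids, lo=3.0, hi=6.5)     # mg quercetin/g extract
            return (d1 * d2) ** 0.5

        print(overall_desirability(125.75, 6.12))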

  6. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  7. Noncontacting measurement technologies for space propulsion condition monitoring

    NASA Technical Reports Server (NTRS)

    Randall, M. R.; Barkhoudarian, S.; Collins, J. J.; Schwartzbart, A.

    1987-01-01

    This paper describes four noncontacting measurement technologies that can be used in a turbopump condition monitoring system. The isotope wear analyzer, fiberoptic deflectometer, brushless torque-meter, and fiberoptic pyrometer can be used to monitor component wear, bearing degradation, instantaneous shaft torque, and turbine blade cracking, respectively. A complete turbopump condition monitoring system including these four technologies could predict remaining component life, thus reducing engine operating costs and increasing reliability.

  8. Device for mass measurement under zero-gravity conditions.

    PubMed

    Sarychev, V A; Sazonov, V V; Zlatorunsky, A S; Khlopina, S F; Egorov, A D; Somov, V I

    1980-06-01

    This paper investigates the properties of a mass-meter, i.e., a device for determining the mass of a cosmonaut's body under zero-gravity conditions. Estimates of the accuracy of mass measurement by this device are given, and the results of measuring the body masses of cosmonauts on the Salyut 5 and 6 orbital stations are presented.
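
    Such devices typically exploit the period of a spring-mass oscillator, which remains well defined in weightlessness. The sketch below illustrates the principle only, with hypothetical device constants, and is not the Salyut instrument's actual calibration:

        import math

        def body_mass(period_s, spring_k, tare_mass):
            """Inertial mass from the period of a linear oscillator,
            T = 2*pi*sqrt(m/k), inverted for m.  spring_k (N/m) and tare_mass
            (oscillating platform mass, kg) are device constants set by calibration."""
            return spring_k * (period_s / (2.0 * math.pi)) ** 2 - tare_mass

        # Example: T = 1.8 s on a k = 1000 N/m platform with 12 kg tare mass
        print(body_mass(1.8, spring_k=1000.0, tare_mass=12.0))  # ~70 kg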

  9. Measuring the impact of medicines regulatory interventions - Systematic review and methodological considerations.

    PubMed

    Goedecke, Thomas; Morales, Daniel R; Pacurariu, Alexandra; Kurz, Xavier

    2018-03-01

    Evaluating the public health impact of regulatory interventions is important but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non-European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. From 1246 screened articles, 229 were eligible for full-text review and 153 articles in English language were included in the descriptive analysis. Over a third of articles studied analgesics and antidepressants. Interventions most frequently evaluated are regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences like switching therapies or spill-over effects were rarely evaluated. Two-thirds used before-after time series and 15.7% before-after cross-sectional study designs. Various analytical approaches were applied including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Whilst impact evaluation of pharmacovigilance and product-specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure robust methodologies are applied and systematic dissemination of results occurs. © 2017 The Authors. British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of
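
    Interrupted time series regression, the analytical approach most frequently applied in the reviewed studies, can be sketched as a segmented least-squares fit with level-change and trend-change terms; the data and variable names below are hypothetical:

        import numpy as np

        def its_fit(y, intervention_idx):
            """Segmented (interrupted time series) regression:
            y = b0 + b1*t + b2*post + b3*(t - t0)*post + e,
            giving a level change (b2) and a slope change (b3) at the intervention."""
            t = np.arange(len(y), dtype=float)
            post = (t >= intervention_idx).astype(float)
            X = np.column_stack([np.ones_like(t), t, post, (t - intervention_idx) * post])
            coef, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
            return dict(zip(["level", "trend", "level_change", "trend_change"], coef))

        # Hypothetical monthly prescribing rates before/after a safety communication
        y = [100, 101, 103, 104, 106, 107, 95, 94, 92, 91, 89, 88]
        print(its_fit(y, intervention_idx=6))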

  10. Conditioning of FRF measurements for use with frequency based substructuring

    NASA Astrophysics Data System (ADS)

    Nicgorski, Dana; Avitabile, Peter

    2010-02-01

    Frequency based substructuring approaches have been used for the generation of system models from component data. While numerical models show successful results, there have been many difficulties with actual measurements in many instances. Previous work has identified some of these typical problems using simulated data to incorporate specific measurement difficulties commonly observed along with approaches to overcome some of these difficulties. This paper presents the results using actual measured data for a laboratory structure subjected to both analytical and experimental studies. Various commonly used approaches are shown to illustrate some of the difficulties with measured data. A new approach to better condition the measured functions and purge commonly found data measurement contaminants is utilized to provide dramatically improved results. Several cases are explored to show the difficulties commonly observed as well as the improved conditioning of the measured data to obtain acceptable results.

  11. Methodology, Methods, and Metrics for Testing and Evaluating Augmented Cognition Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.

    The augmented cognition research community seeks cognitive neuroscience-based solutions to improve warfighter performance by applying and managing mitigation strategies to reduce workload and improve the throughput and quality of decisions. The focus of augmented cognition mitigation research is to define, demonstrate, and exploit neuroscience and behavioral measures that support inferences about the warfighter's cognitive state that prescribe the nature and timing of mitigation. A research challenge is to develop valid evaluation methodologies, metrics and measures to assess the impact of augmented cognition mitigations. Two considerations are external validity, which is the extent to which the results apply to operational contexts; and internal validity, which reflects the reliability of performance measures and the conclusions based on analysis of results. The scientific rigor of the research methodology employed in conducting empirical investigations largely affects the validity of the findings. External validity requirements also compel us to demonstrate operational significance of mitigations. Thus it is important to demonstrate effectiveness of mitigations under specific conditions. This chapter reviews some cognitive science and methodological considerations in designing augmented cognition research studies and associated human performance metrics and analysis methods to assess the impact of augmented cognition mitigations.

  12. Intercomparison of magnetic field measurements near MV/LV transformer substations: methodological planning and results.

    PubMed

    Violanti, S; Fraschetta, M; Adda, S; Caputo, E

    2009-12-01

    Within the framework of the Environmental Agencies system's activities, coordinated by ISPRA (the superior institute for environmental protection and research), an intercomparison of measurements was designed and carried out in order to examine measurement problems in depth and to evaluate the magnetic field at power frequencies. These measurements were taken near medium-voltage/low-voltage (MV/LV) transformer substations. The project was developed with the contribution of several experts from different Regional Agencies. In three regions, substations with specific international-standard characteristics were chosen, and a measurement and data analysis protocol was then arranged. Data analysis showed a good level of coherence among the results obtained by the different laboratories. However, a range of problems emerged, both during the preparation of the protocol and the definition of the data analysis procedure, and during the execution of the measurements and the reprocessing of the data, because of the spatial and temporal variability of the magnetic field. These problems are of particular interest in determining a correct measurement methodology whose purpose is comparison with exposure limits, attention values and quality targets.

  13. Low-frequency noise measurements: applications, methodologies and instrumentation

    NASA Astrophysics Data System (ADS)

    Ciofi, Carmine; Neri, Bruno

    2003-05-01

    Low frequency noise measurements (f < 10 Hz) are a powerful tool for investigating the quality and reliability of electron devices and materials. In most cases, however, the application of this technique is made quite difficult both by external interferences (temperature fluctuations, EMI, mechanical vibrations, etc.) and by the high flicker noise level of commercial instrumentation. In this paper the most notable results we have obtained by using low frequency noise measurements to characterize the reliability of VLSI metallic interconnections and thin oxides are summarized. Moreover, we discuss the effects of the several sources of noise and interference which reduce the sensitivity of the measurement chain. In particular, we demonstrate that with proper design, dedicated instrumentation can be built which allows a considerable reduction of the overall background noise. Examples are given with reference to voltage and transresistance amplifiers (both AC and DC coupled), programmable biasing systems (both current and voltage sources), thermal stabilization systems and data acquisition systems. Finally, we discuss methods which may allow, under proper conditions, accurate measurement of noise levels well below the background noise of the input preamplifiers coupled to the device under test. As the systems we discuss are of moderate complexity and employ components readily available on the market, we trust that this paper may also serve as a simple guideline for anyone interested in exploiting very low frequency noise measurements by building their own instrumentation.
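    Editorial note: measuring below the preamplifier background, as claimed in the final sentence, is commonly realized by cross-correlating two amplifiers connected in parallel to the same device; the abstract does not name the authors' exact scheme, so the following Python sketch only illustrates the generic cross-spectrum technique with synthetic data (all values assumed). Because the two channels' own noise is uncorrelated, the averaged cross-spectrum converges toward the common device spectrum at roughly one over the square root of the number of averaged segments.

        import numpy as np
        from scipy.signal import csd, welch

        rng = np.random.default_rng(0)
        fs, n = 1000.0, 2**20
        dut = rng.normal(0.0, 1.0, n)            # common noise from the device under test
        ch1 = dut + rng.normal(0.0, 3.0, n)      # amplifier 1 adds its own background
        ch2 = dut + rng.normal(0.0, 3.0, n)      # amplifier 2 adds independent background

        f, pxy = csd(ch1, ch2, fs=fs, nperseg=4096)   # averaged cross-spectral density
        _, p11 = welch(ch1, fs=fs, nperseg=4096)      # single-channel PSD for comparison

        # |Pxy| converges toward the DUT spectrum (~0.002 here), an order of
        # magnitude below each channel's own background (~0.02).
        print(np.median(np.abs(pxy)), np.median(p11))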

  14. Setting the light conditions for measuring root transparency for age-at-death estimation methods.

    PubMed

    Adserias-Garriga, Joe; Nogué-Navarro, Laia; Zapico, Sara C; Ubelaker, Douglas H

    2018-03-01

    Age-at-death estimation is one of the main goals in forensic identification, being an essential parameter for determining the biological profile and narrowing the possibilities of identification in cases involving missing persons and unidentified bodies. The study of dental tissues has long been considered a proper tool for age estimation, with several age estimation methods based on them. Dental age estimation methods can be divided into three categories: tooth formation and development, post-formation changes, and histological changes. While tooth formation and growth changes are relevant for fetuses and infants, once dental and skeletal growth is complete, post-formation or biochemical changes can be applied. Lamendin et al. in J Forensic Sci 37:1373-1379, (1992) developed an adult age estimation method based on root transparency and periodontal recession. The regression formula demonstrated its accuracy for individuals aged 40 to 70 years. Later, Prince and Ubelaker in J Forensic Sci 47(1):107-116, (2002) evaluated the effects of ancestry and sex and incorporated root height into the equation, developing four new regression formulas for males and females of African and European ancestry. Even though root transparency is a key element in the method, the conditions for measuring it have not been established. The aim of the present study is to determine the light conditions, measured in lumens, that offer the greatest accuracy when applying the Lamendin et al. method as modified by Prince and Ubelaker. The results must also be taken into account in the application of other age estimation methodologies that use root transparency to estimate age-at-death.
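    Editorial note: for orientation, the Lamendin et al. regression is commonly cited as A = 0.18·P + 0.42·T + 25.53, with P (periodontosis height) and T (transparency height) expressed as percentages of root height. The sketch below encodes that commonly cited form; the coefficients are quoted from the secondary literature, not re-derived here.

        def lamendin_age(periodontosis_mm, transparency_mm, root_height_mm):
            """Age-at-death (years) from the Lamendin et al. (1992) regression
            as commonly cited: A = 0.18*P + 0.42*T + 25.53, where P and T are
            periodontosis and transparency heights as percentages of root height."""
            p = periodontosis_mm * 100.0 / root_height_mm
            t = transparency_mm * 100.0 / root_height_mm
            return 0.18 * p + 0.42 * t + 25.53

        # Example: 3 mm periodontosis, 6 mm transparency, 13 mm root height
        print(round(lamendin_age(3.0, 6.0, 13.0), 1))  # about 49 years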

  15. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed to provide a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and to validate the multidimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology applied to the multidimensional numerical model resulted in a net heat input of 240.3 W, a value 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.

  16. Performance measurement for people with multiple chronic conditions: conceptual model.

    PubMed

    Giovannetti, Erin R; Dy, Sydney; Leff, Bruce; Weston, Christine; Adams, Karen; Valuck, Tom B; Pittman, Aisha T; Blaum, Caroline S; McCann, Barbara A; Boyd, Cynthia M

    2013-10-01

    Improving quality of care for people with multiple chronic conditions (MCCs) requires performance measures reflecting the heterogeneity and scope of their care. Since most existing measures are disease specific, performance measures must be refined and new measures must be developed to address the complexity of care for those with MCCs. The objective was to describe development of the Performance Measurement for People with Multiple Chronic Conditions (PM-MCC) conceptual model, through framework development and a national stakeholder panel. We used reviews of existing conceptual frameworks of performance measurement, review of the literature on MCCs, input from experts in the multistakeholder Steering Committee, and public comment. The resulting model centers on patient and family goals and preferences for care, in the context of multiple care sites and providers, the type of care they are receiving, and the national priority domains for healthcare quality measurement. This model organizes measures into a comprehensive framework and identifies areas where measures are lacking. In this context, performance measures can be prioritized and implemented at different levels, in the context of patients' overall healthcare needs.

  17. Measurement of additional shear during sludge conditioning and dewatering.

    PubMed

    Ormeci, Banu; Ahmad, Ayaz

    2009-07-01

    Optimum polymer dose is influenced both by the polymer demand of the sludge and by the shear applied during conditioning. Sludge exposed to additional shear following conditioning will experience a decrease in cake solids concentration for the same polymer dose. It is therefore necessary to measure or quantify the additional shear in order to optimize conditioning and dewatering, and there is currently no direct or indirect method to achieve this. The main objective of this study was to develop a method based on torque rheology to measure the amount of shear that a sludge network experiences during conditioning and dewatering. Anaerobically digested sludge samples were exposed to increasing levels of mixing intensity and time, and the rheological characteristics of the samples were measured using a torque rheometer. Several rheological parameters were evaluated, including the peak torque and the totalized torque (area under the rheograms). The results of this study show that, at the optimum polymer dose, a linear relationship exists between the applied shear and the area under the rheograms, and this relationship can be used to estimate an unknown amount of shear that the sludge was exposed to. The method is useful not only as a research tool to study the effect of shear on dewatering but also as an optimization tool in a dewatering automation system based on torque rheology.
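    Editorial note: the reported linear shear-torque relationship implies a simple calibrate-then-invert estimator; the sketch below uses made-up calibration numbers purely to show the mechanics, not the study's data.

        import numpy as np

        # Hypothetical calibration at the optimum polymer dose: totalized torque
        # (area under the rheogram) measured after known levels of applied shear.
        applied_shear = np.array([0.0, 1.0, 2.0, 3.0, 4.0])      # arbitrary units
        total_torque = np.array([95.0, 82.0, 70.5, 58.0, 46.0])  # made-up values

        slope, intercept = np.polyfit(applied_shear, total_torque, 1)

        def estimate_shear(measured_torque):
            # Invert the fitted line to recover the unknown shear a sample
            # experienced before reaching the rheometer.
            return (measured_torque - intercept) / slope

        print(round(estimate_shear(64.0), 2))  # ~2.5 in the same arbitrary units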

  18. Using Geographic Information Systems to measure retail food environments: Discussion of methodological considerations and a proposed reporting checklist (Geo-FERN).

    PubMed

    Wilkins, Emma L; Morris, Michelle A; Radley, Duncan; Griffiths, Claire

    2017-03-01

    Geographic Information Systems (GIS) are widely used to measure retail food environments. However, the methods used are heterogeneous, limiting collation and interpretation of evidence. This problem is amplified by unclear and incomplete reporting of methods. This discussion (i) identifies common dimensions of methodological diversity across GIS-based food environment research (data sources, data extraction methods, food outlet construct definitions, geocoding methods, and access metrics), (ii) reviews the impact of different methodological choices, and (iii) highlights areas where reporting is insufficient. On the basis of this discussion, the Geo-FERN reporting checklist is proposed to support methodological reporting and interpretation. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Guidelines for reporting methodological challenges and evaluating potential bias in dementia research

    PubMed Central

    Weuve, Jennifer; Proust-Lima, Cécile; Power, Melinda C.; Gross, Alden L.; Hofer, Scott M.; Thiébaut, Rodolphe; Chêne, Geneviève; Glymour, M. Maria; Dufouil, Carole

    2015-01-01

    Clinical and population research on dementia and related neurologic conditions, including Alzheimer’s disease, faces several unique methodological challenges. Progress to identify preventive and therapeutic strategies rests on valid and rigorous analytic approaches, but the research literature reflects little consensus on “best practices.” We present findings from a large scientific working group on research methods for clinical and population studies of dementia, which identified five categories of methodological challenges as follows: (1) attrition/sample selection, including selective survival; (2) measurement, including uncertainty in diagnostic criteria, measurement error in neuropsychological assessments, and practice or retest effects; (3) specification of longitudinal models when participants are followed for months, years, or even decades; (4) time-varying measurements; and (5) high-dimensional data. We explain why each challenge is important in dementia research and how it could compromise the translation of research findings into effective prevention or care strategies. We advance a checklist of potential sources of bias that should be routinely addressed when reporting dementia research. PMID:26397878

  20. Methodology for the analysis of pollutant emissions from a city bus

    NASA Astrophysics Data System (ADS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As the test circuit, a passenger transportation line in a Spanish city was used. Different ways of processing and representing the data were studied and, derived from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz data recording. The proposed methodology allows the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool for isolating the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences make the highest contribution to total emissions, whereas deceleration sequences make the lowest.
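    Editorial note: the category analysis can be reproduced by labeling each 1 Hz sample; the sketch below is one plausible labeling rule into idle, acceleration, and deceleration with/without fuel consumption. The thresholds and the four-way split are illustrative assumptions, since the abstract does not give the authors' exact criteria. Per-category mean emissions then follow from grouping the 1 Hz pollutant records by these labels.

        import numpy as np

        def classify_sequences(speed_kmh, fuel_rate, v_idle=2.0, a_eps=0.1):
            """Label each 1 Hz sample as one of four driving categories.
            Thresholds are illustrative; non-accelerating moving samples fall
            into the deceleration categories, matching the four-way split."""
            accel = np.gradient(speed_kmh)  # km/h per second at 1 Hz sampling
            labels = []
            for v, a, q in zip(speed_kmh, accel, fuel_rate):
                if v < v_idle:
                    labels.append("idle")
                elif a > a_eps:
                    labels.append("acceleration")
                elif q > 0:
                    labels.append("deceleration_with_fuel")
                else:
                    labels.append("deceleration_without_fuel")
            return labels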

  1. Selecting Models for Measuring Change When True Experimental Conditions Do Not Exist.

    ERIC Educational Resources Information Center

    Fortune, Jim C.; Hutson, Barbara A.

    1984-01-01

    Measuring change when true experimental conditions do not exist is a difficult process. This article reviews the artifacts of change measurement in evaluations and quasi-experimental designs, delineates considerations in choosing a model to measure change under nonideal conditions, and suggests ways to organize models to facilitate selection.…

  2. Development of plant condition measurement - The Jimah Model

    NASA Astrophysics Data System (ADS)

    Evans, Roy F.; Syuhaimi, Mohd; Mazli, Mohammad; Kamarudin, Nurliyana; Maniza Othman, Faiz

    2012-05-01

    The Jimah Model is an information management model. The model has been designed to facilitate analysis of machine condition by integrating diagnostic data with quantitative and qualitative information. The model treats data as a single strand of information, metaphorically a 'genome' of data. The 'genome' is structured to be representative of plant function and identifies the condition of selected components (or genes) in each machine. To date in industry, computer aided work processes used with traditional industrial practices have been unable to consistently deliver a standard of information suitable for holistic evaluation of machine condition and change. Significantly, the re-engineered site strategies necessary for implementation of this "data genome concept" have resulted in enhanced knowledge and management of plant condition. In large plants with high initial equipment cost and subsequent high maintenance costs, accurate measurement of major component condition becomes central to whole-of-life management and replacement decisions. A case study following implementation of the model at a major power station site in Malaysia (Jimah) shows that modeling of plant condition and wear (in real time) can be made a practical reality.

  3. Adaptability of laser diffraction measurement technique in soil physics methodology

    NASA Astrophysics Data System (ADS)

    Barna, Gyöngyi; Szabó, József; Rajkai, Kálmán; Bakacsi, Zsófia; Koós, Sándor; László, Péter; Hauk, Gabriella; Makó, András

    2016-04-01

    There are intentions all around the world to harmonize soil particle size distribution (PSD) data obtained by laser diffractometer measurements (LDM) with those of sedimentation techniques (pipette or hydrometer methods). Unfortunately, depending on the applied methodology (e.g. type of pre-treatment, kind of dispersant, etc.), PSDs from the sedimentation methods (which follow different standards) are dissimilar and can hardly be harmonized with each other. A need therefore arose to build up a database containing PSD values measured by the pipette method according to the Hungarian standard (MSZ-08. 0205: 1978) and by LDM according to a widespread and widely used procedure. In this publication the first results of statistical analysis of the new and growing PSD database are presented: 204 soil samples measured with the pipette method and LDM (Malvern Mastersizer 2000, HydroG dispersion unit) were compared. Applying the usual size limits to the LDM, the clay fraction was highly underestimated and the silt fraction overestimated compared to the pipette method. Consequently, soil texture classes determined from the LDM measurements differ significantly from the results of the pipette method. Based on previous surveys, and in order to optimize the correspondence between the two datasets, the clay/silt boundary for LDM was changed. Comparing the PSDs of the pipette method to those of the LDM, the modified size limits gave higher similarities for the clay and silt fractions. Extending the upper size limit of the clay fraction from 0.002 to 0.0066 mm, and thus changing the lower size limit of the silt fraction, makes the pipette method and LDM more easily comparable. Higher correlations were also found between clay content and water vapor adsorption and specific surface area with the modified limit. Texture classes were also found to be less dissimilar. The difference between the results of the two kinds of PSD measurement methods could be further reduced knowing other routinely analyzed soil parameters

  4. Measuring the impact of medicines regulatory interventions – Systematic review and methodological considerations

    PubMed Central

    Morales, Daniel R.; Pacurariu, Alexandra; Kurz, Xavier

    2017-01-01

    Aims Evaluating the public health impact of regulatory interventions is important but there is currently no common methodological approach to guide this evaluation. This systematic review provides a descriptive overview of the analytical methods for impact research. Methods We searched MEDLINE and EMBASE for articles with an empirical analysis evaluating the impact of European Union or non‐European Union regulatory actions to safeguard public health published until March 2017. References from systematic reviews and articles from other known sources were added. Regulatory interventions, data sources, outcomes of interest, methodology and key findings were extracted. Results From 1246 screened articles, 229 were eligible for full‐text review and 153 articles in English language were included in the descriptive analysis. Over a third of articles studied analgesics and antidepressants. Interventions most frequently evaluated are regulatory safety communications (28.8%), black box warnings (23.5%) and direct healthcare professional communications (10.5%); 55% of studies measured changes in drug utilization patterns, 27% evaluated health outcomes, and 18% targeted knowledge, behaviour or changes in clinical practice. Unintended consequences like switching therapies or spill‐over effects were rarely evaluated. Two‐thirds used before–after time series and 15.7% before–after cross‐sectional study designs. Various analytical approaches were applied including interrupted time series regression (31.4%), simple descriptive analysis (28.8%) and descriptive analysis with significance tests (23.5%). Conclusion Whilst impact evaluation of pharmacovigilance and product‐specific regulatory interventions is increasing, the marked heterogeneity in study conduct and reporting highlights the need for scientific guidance to ensure robust methodologies are applied and systematic dissemination of results occurs. PMID:29105853

  5. Preliminary comparative assessment of PM10 hourly measurement results from new monitoring stations type using stochastic and exploratory methodology and models

    NASA Astrophysics Data System (ADS)

    Czechowski, Piotr Oskar; Owczarek, Tomasz; Badyda, Artur; Majewski, Grzegorz; Rogulski, Mariusz; Ogrodnik, Paweł

    2018-01-01

    The paper presents key issues from the preliminary stage of a proposed extended equivalence assessment of measurement results for new portable devices: the comparability of hourly PM10 concentration series with reference station measurement results using statistical methods. The article presents technical aspects of the new portable meters. The emphasis is placed on assessing the comparability of the results using a methodology of stochastic and exploratory methods. The concept is based on the observation that simple comparability of the result series in the time domain is insufficient; the comparison of regularity should be done in three complementary fields of statistical modeling: time, frequency and space. The proposal is based on model results for five annual series of measurements from the new mobile devices and the WIOS (Provincial Environmental Protection Inspectorate) reference station located in the city of Nowy Sacz. The obtained results indicate both the completeness of the comparison methodology and the high correspondence of the new devices' measurement results with the reference.
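    Editorial note: the time- and frequency-domain parts of the proposed three-field comparison can be illustrated with standard tools; the sketch below (synthetic data, assumed hourly sampling) computes Pearson correlation and magnitude-squared coherence between a portable and a reference series. Coherence near 1 at the daily frequency indicates shared diurnal structure.

        import numpy as np
        from scipy.signal import coherence

        def compare_series(portable, reference, fs=1.0):
            """Time-domain agreement (Pearson r) and frequency-domain agreement
            (magnitude-squared coherence) for two aligned hourly series."""
            r = np.corrcoef(portable, reference)[0, 1]
            f, cxy = coherence(portable, reference, fs=fs, nperseg=256)
            return r, f, cxy

        # Synthetic example: both series share a daily cycle (period 24 h).
        t = np.arange(24.0 * 365)
        common = 30 + 10 * np.sin(2 * np.pi * t / 24)
        rng = np.random.default_rng(1)
        r, f, cxy = compare_series(common + rng.normal(0, 5, t.size),
                                   common + rng.normal(0, 5, t.size))
        daily = np.argmin(np.abs(f - 1 / 24))
        print(round(r, 2), round(cxy[daily], 2))  # high coherence at 1/24 h^-1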

  6. Validation of the PROMIS® measures of self-efficacy for managing chronic conditions.

    PubMed

    Gruber-Baldini, Ann L; Velozo, Craig; Romero, Sergio; Shulman, Lisa M

    2017-07-01

    The Patient-Reported Outcomes Measurement Information System® (PROMIS®) was designed to develop, validate, and standardize item banks to measure key domains of physical, mental, and social health in chronic conditions. This paper reports the calibration and validation testing of the PROMIS Self-Efficacy for Managing Chronic Conditions measures. The PROMIS Self-Efficacy for Managing Chronic Conditions item banks comprise five domains, Self-Efficacy for Managing: Daily Activities, Symptoms, Medications and Treatments, Emotions, and Social Interactions. Banks were calibrated in 1087 subjects from two data sources: 837 patients with chronic neurologic conditions (epilepsy, multiple sclerosis, neuropathy, Parkinson disease, and stroke) and 250 subjects from an online Internet sample of adults with general chronic conditions. Scores were compared with one legacy scale, the Self-Efficacy for Managing Chronic Disease 6-Item scale (SEMCD6), and five PROMIS short forms: Global Health (Physical and Mental), Physical Function, Fatigue, Depression, and Anxiety. The sample was 57% female, mean age = 53.8 (SD = 14.7), 76% white, 21% African American, 6% Hispanic, and 76% with greater than high school education. Full-item banks were created for each domain. All measures had good internal consistency and correlated well with SEMCD6 (r = 0.56-0.75). Significant correlations were seen between the Self-Efficacy measures and other PROMIS short forms (r > 0.38). The newly developed PROMIS Self-Efficacy for Managing Chronic Conditions measures include five domains of self-efficacy that were calibrated across diverse chronic conditions and show good internal consistency and cross-sectional validity.

  7. A Novel Instrument and Methodology for the In-Situ Measurement of the Stress in Thin Films

    NASA Technical Reports Server (NTRS)

    Broadway, David M.; Omokanwaye, Mayowa O.; Ramsey, Brian D.

    2014-01-01

    We introduce a novel methodology for the in-situ measurement of mechanical stress during thin film growth utilizing a highly sensitive non-contact variation of the classic spherometer. By exploiting the known spherical deformation of the substrate, the value of the stress-induced curvature is inferred from measurement of only one point on the substrate's surface, the sagitta. From the known curvature the stress can be calculated using the well-known Stoney equation. Based on this methodology, a stress sensor has been designed which is simple, highly sensitive, compact, and low cost. As a result of its compact nature, the sensor can be mounted in any orientation to accommodate a given deposition geometry without the need for extensive modification to an existing deposition system. The technique employs a double-side polished substrate that offers good specular reflectivity and is isotropic in its mechanical properties, such as <111> oriented crystalline silicon or amorphous soda lime glass. The measurement of the displacement of the uncoated side during deposition is performed with a high resolution (i.e. 5 nm), commercially available, inexpensive fiber optic sensor which can be used in both high vacuum and high temperature environments (i.e. 10^-7 Torr and 480 °C, respectively). A key attribute of this instrument lies in its potential to achieve sensitivity that rivals other measurement techniques such as the micro cantilever method but, due to the comparatively larger substrate area, offers a more robust and practical alternative for subsequent measurement of additional characteristics of the film that might be correlated to film stress. We present measurement results for nickel films deposited by magnetron sputtering which show good qualitative agreement with the known behavior of polycrystalline films previously reported by Hoffman.
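    Editorial note: for reference, the two relations the method rests on are the shallow spherical cap approximation R ≈ r²/(2s) and the Stoney equation σf = Es·ts²/(6(1 − νs)·tf·R). The sketch below chains them; the material constants are illustrative values for silicon (assumed, not taken from the paper).

        def film_stress_from_sag(sag_m, aperture_radius_m, t_sub_m, t_film_m,
                                 e_sub_pa=1.69e11, nu_sub=0.26):
            """Stoney-equation film stress from one sagitta measurement.
            R ~ r^2 / (2 s) for a shallow spherical cap, then
            sigma_f = E_s t_s^2 / (6 (1 - nu_s) t_f R).
            Defaults are rough values for <111> silicon (assumed)."""
            radius = aperture_radius_m**2 / (2.0 * sag_m)
            return e_sub_pa * t_sub_m**2 / (6.0 * (1.0 - nu_sub) * t_film_m * radius)

        # Example: 50 nm sag over a 25 mm radius, 0.5 mm substrate, 200 nm film
        print(film_stress_from_sag(50e-9, 25e-3, 0.5e-3, 200e-9) / 1e6, "MPa")  # ~7.6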

  8. Conditional Standard Errors of Measurement for Scale Scores.

    ERIC Educational Resources Information Center

    Kolen, Michael J.; And Others

    1992-01-01

    A procedure is described for estimating the reliability and conditional standard errors of measurement of scale scores incorporating the discrete transformation of raw scores to scale scores. The method is illustrated using a strong true score model, and practical applications are described. (SLD)
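    Editorial note: one concrete version of this computation assumes a binomial strong true score model: conditional on a true proportion-correct, the raw score is binomial, and the conditional SEM of the scale score is the conditional standard deviation of the transformed score. The sketch below, with a made-up raw-to-scale table, is illustrative and not the authors' exact estimator.

        import numpy as np
        from scipy.stats import binom

        def csem_scale(p_true, n_items, scale_of_raw):
            """Conditional SEM of scale scores given true proportion-correct,
            under a binomial error model; scale_of_raw[k] is the scale score
            assigned to raw score k."""
            x = np.arange(n_items + 1)
            probs = binom.pmf(x, n_items, p_true)
            s = np.asarray(scale_of_raw, dtype=float)
            mean_s = probs @ s
            return float(np.sqrt(probs @ (s - mean_s) ** 2))

        # Made-up 10-item test with a nonlinear raw-to-scale conversion table
        table = [round(100 * (k / 10) ** 0.7) for k in range(11)]
        print(round(csem_scale(0.6, 10, table), 1))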

  9. Optimizing spray drying conditions of sour cherry juice based on physicochemical properties, using response surface methodology (RSM).

    PubMed

    Moghaddam, Arasb Dabbagh; Pero, Milad; Askari, Gholam Reza

    2017-01-01

    In this study, the effects of the main spray drying conditions, inlet air temperature (100-140 °C), maltodextrin concentration (MDC: 30-60%), and aspiration rate (AR: 30-50%), on the physicochemical properties of sour cherry powder, such as moisture content (MC), hygroscopicity, water solubility index (WSI), and bulk density, were investigated. This investigation was carried out by employing response surface methodology, and the process conditions were optimized using this technique. The MC of the powder was negatively related to the linear effects of the MDC and inlet air temperature (IT) and directly related to the AR. Hygroscopicity of the powder was significantly influenced by the MDC: increasing the MDC in the juice decreased the hygroscopicity of the powder. MDC and inlet temperature had a positive effect, but the AR had a negative effect, on the WSI of the powder. MDC and inlet temperature negatively affected the bulk density of the powder; increasing either variable decreased it. The optimization procedure revealed that the following conditions resulted in a powder with maximum solubility and minimum hygroscopicity: MDC = 60%, IT = 134 °C, and AR = 30%, with a desirability of 0.875.
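    Editorial note: the generic RSM loop, fitting a second-order polynomial to the design points and then optimizing over the factor box, can be sketched as follows. The data below are synthetic placeholders, not the study's measurements, and the factor ranges merely mirror those quoted above.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(2)
        # Synthetic design points: columns are MDC (%), inlet temp (C), AR (%)
        X = rng.uniform([30, 100, 30], [60, 140, 50], size=(20, 3))
        y = (80 - 0.01 * (X[:, 0] - 55) ** 2 - 0.02 * (X[:, 1] - 130) ** 2
             - 0.05 * (X[:, 2] - 35) ** 2 + rng.normal(0, 0.5, 20))  # e.g. WSI

        def design_matrix(X):
            x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
            return np.column_stack([np.ones(len(X)), x1, x2, x3, x1 * x2,
                                    x1 * x3, x2 * x3, x1**2, x2**2, x3**2])

        beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

        def predicted(v):
            return float(design_matrix(np.atleast_2d(np.asarray(v))) @ beta)

        res = minimize(lambda v: -predicted(v), x0=[45.0, 120.0, 40.0],
                       bounds=[(30, 60), (100, 140), (30, 50)])
        print(res.x)  # fitted optimum of the response surface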

  10. ACCF/AHA methodology for the development of quality measures for cardiovascular technology: a report of the American College of Cardiology Foundation/American Heart Association Task Force on Performance Measures.

    PubMed

    Bonow, Robert O; Douglas, Pamela S; Buxton, Alfred E; Cohen, David J; Curtis, Jeptha P; Delong, Elizabeth; Drozda, Joseph P; Ferguson, T Bruce; Heidenreich, Paul A; Hendel, Robert C; Masoudi, Frederick A; Peterson, Eric D; Taylor, Allen J

    2011-09-27

    Consistent with the growing national focus on healthcare quality, the American College of Cardiology Foundation (ACCF) and the American Heart Association (AHA) have taken a leadership role over the past decade in developing measures of the quality of cardiovascular care by convening a joint ACCF/AHA Task Force on Performance Measures. The Task Force is charged with identifying the clinical topics appropriate for the development of performance measures and with assembling writing committees composed of clinical and methodological experts in collaboration with appropriate subspecialty societies. The Task Force has also created methodology documents that offer guidance in the development of process, outcome, composite, and efficiency measures. Cardiovascular performance measures using existing ACCF/AHA methodology are based on Class I or Class III guidelines recommendations, usually with Level A evidence. These performance measures, based on evidence-based ACCF/AHA guidelines, remain the most rigorous quality measures for both internal quality improvement and public reporting. However, many of the tools for diagnosis and treatment of cardiovascular disease involve advanced technologies, such as cardiac imaging, for which there are often no underlying guideline documents. Because these technologies affect the quality of cardiovascular care and also have the potential to contribute to cardiovascular health expenditures, there is a need for more critical assessment of the use of technology, including the development of quality and performance measures in areas in which guideline recommendations are absent. The evaluation of quality in the use of cardiovascular technologies requires consideration of multiple parameters that differ from other healthcare processes. The present document describes methodology for development of 2 new classes of quality measures in these situations, appropriate use measures and structure/safety measures. Appropriate use measures are based on

  11. Prediction of work metabolism from heart rate measurements in forest work: some practical methodological issues.

    PubMed

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Auger, Isabelle; Leone, Mario

    2015-01-01

    Individual heart rate (HR) to workload relationships were determined using 93 submaximal step-tests administered to 26 healthy participants attending physical activities in a university training centre (laboratory study) and 41 experienced forest workers (field study). Predicted maximum aerobic capacity (MAC) was compared to MAC measured in a maximal treadmill test (laboratory study) to test the effect of two age-predicted maximum HR equations (220 - age and 207 - 0.7 × age) and two clothing insulation levels (0.4 and 0.91 clo) during the step-test. Work metabolism (WM) estimated from forest work HR was compared against concurrent work V̇O2 measurements while taking into account the HR thermal component. Results show that MAC and WM can be accurately predicted from work HR measurements and the simple regression models developed in this study (1% group mean prediction bias and up to 25% expected prediction bias for a single individual). Neither clothing insulation nor the choice of age-predicted maximum HR equation had an impact on predicted MAC. Practitioner summary: This study sheds light on four practical methodological issues faced by practitioners regarding the use of HR methodology to assess WM in actual work environments. More specifically, the effects of wearing work clothes and of using two different maximum HR prediction equations on the ability of a submaximal step-test to assess MAC are examined, as well as the accuracy of using an individual's step-test HR to workload relationship to predict WM from HR data collected during actual work in the presence of thermal stress.
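    Editorial note: the MAC prediction described here amounts to extrapolating the individual's submaximal HR-VO2 line to an age-predicted maximum HR; the sketch below shows that mechanic with hypothetical step-test data (the regression coefficients of the actual study are not reproduced).

        import numpy as np

        def predict_mac(step_hr, step_vo2, age, formula="207-0.7age"):
            """Extrapolate the individual HR-VO2 regression line to an
            age-predicted maximum HR to estimate maximum aerobic capacity."""
            hr_max = 207 - 0.7 * age if formula == "207-0.7age" else 220 - age
            slope, intercept = np.polyfit(step_hr, step_vo2, 1)
            return slope * hr_max + intercept

        # Hypothetical three-stage step-test: HR in bpm, VO2 in mL/kg/min
        hr = np.array([105.0, 125.0, 145.0])
        vo2 = np.array([15.0, 22.0, 29.0])
        print(round(predict_mac(hr, vo2, age=40), 1))  # ~40.9 mL/kg/min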

  12. Optimization of fermentation conditions for 1,3-propanediol production by marine Klebsiella pneumoniae HSL4 using response surface methodology

    NASA Astrophysics Data System (ADS)

    Li, Lili; Zhou, Sheng; Ji, Huasong; Gao, Ren; Qin, Qiwei

    2014-09-01

    The industrially important organic compound 1,3-propanediol (1,3-PDO) is mainly used as a building block for the production of various polymers. In the present study, a response surface methodology protocol was followed to determine and optimize fermentation conditions for the maximum production of 1,3-PDO using marine-derived Klebsiella pneumoniae HSL4. Four nutritional supplements together with three independent culture conditions were optimized as follows: 29.3 g/L glycerol, 8.0 g/L K2HPO4, 7.6 g/L (NH4)2SO4, 3.0 g/L KH2PO4, pH 7.1, cultivation at 35°C for 12 h. Under the optimal conditions, a maximum 1,3-PDO concentration of 14.5 g/L, a productivity of 1.21 g/(L·h) and a glycerol conversion of 0.49 g/g were obtained. In comparison with the control conditions, fermentation under the optimized conditions achieved an increase of 38.8% in 1,3-PDO concentration, 39.0% in productivity and 25.7% in glycerol conversion in flasks. This enhancement was further confirmed when the fermentation was conducted in a 5-L fermentor. The optimized fermentation conditions could be an important basis for developing low-cost, large-scale methods for industrial production of 1,3-PDO in the future.

  13. Associated phoria and the measuring and correcting methodology after H.-J. Haase (MKH).

    PubMed

    Brautaset, R L; Jennings, J A

    2001-09-01

    The test charts included in the Polatest, designed by H.-J. Haase and manufactured by Zeiss, are used in Germany, Switzerland and Scandinavia for prism correction of 'associated phoria'. From clinical experience with the Polatest, Haase developed a motor and sensory theory of the different stages of decompensation of 'associated phoria' and a strategy for its prismatic correction: the MKH (Measuring and Correcting Methodology after H.-J. Haase). The theory challenges many accepted ideas about the plasticity of the visual system and the use of prisms in the treatment of sensory abnormalities. This article, the first full description in English, describes and critically discusses the MKH.

  14. Moving to Capture Children's Attention: Developing a Methodology for Measuring Visuomotor Attention.

    PubMed

    Hill, Liam J B; Coats, Rachel O; Mushtaq, Faisal; Williams, Justin H G; Aucott, Lorna S; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child's development. However, methodological limitations currently make large-scale assessment of children's attentional skill impractical, costly and lacking in ecological validity. Consequently, we developed a measure of 'Visual Motor Attention' (VMA), a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method's core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults' attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action).

  15. Comparing Classic and Interval Analytical Hierarchy Process Methodologies for Measuring Area-Level Deprivation to Analyze Health Inequalities

    PubMed Central

    Cabrera-Barona, Pablo

    2018-01-01

    Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas. PMID:29337915

  16. Comparing Classic and Interval Analytical Hierarchy Process Methodologies for Measuring Area-Level Deprivation to Analyze Health Inequalities.

    PubMed

    Cabrera-Barona, Pablo; Ghorbanzadeh, Omid

    2018-01-16

    Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas.

  17. A new method to assess Pavlovian conditioning of psychostimulant drug effects.

    PubMed

    Damianopoulos, E N; Carey, R J

    1994-07-01

    Experimental studies of psychoactive drugs by Pavlovian drug-conditioning methods, which originally began with investigations of drug-induced responses mediated by the autonomic nervous system, have now been expanded to include drug-induced response effects expressed as modulations of spontaneous motoric behaviors. In the latter application, however, equivalent behavioral response outcomes in post-treatment tests for conditioning can occur following a psychostimulant drug treatment through drug interference effects on habituation processes, drug-induced stress effects, and/or Pavlovian conditioning of the drug-induced motoric activation effect. Current methodologies for the study of Pavlovian conditioned drug effects and/or drug sensitization cannot distinguish among these possibilities. This methodological inadequacy was addressed by a modification of the conventional paired-unpaired treatment protocol. In the new protocol, the animal is sequentially placed into two test compartments, with the drug treatment administered in conjunction with placement into the second test compartment. This design permits differentiation of Pavlovian conditioned drug responses from non-conditioned drug effects through continuous measurement of the non-drug behavioral baseline in both the drug and non-drug control treatment groups, combined with multiple response measurements and post-treatment tests for conditioning at variable post-conditioning intervals. The present study details the use of the new modified Pavlovian protocol with repeated cocaine (10 mg/kg) treatment. A cocaine conditioned response at 1, 7, and 21 days post-conditioning was identified and distinguished from habituation and stress effects.

  18. Effects of foot orthotics on running economy: methodological considerations.

    PubMed

    Burke, Jeanmarie R; Papuga, M Owen

    2012-05-01

    The purpose of the study was to collect preliminary data to address methodological considerations that may impact subject-specific reactions to foot orthotics during running. Six endurance-trained recreational runners recruited from a chiropractic college campus wore their preferred running shoes and inserted either their custom-made orthotics during one testing session or their shoe-fitted insoles during the other testing session. Comfort perception was measured for each footwear condition. Measurements of oxygen consumption (VO2) at several moderate exercise intensities, chosen to mimic recreational running, generated an individual's economy-of-running line. Predicted running velocity at VO2max (vVO2max) was calculated as an index of endurance performance. Lower extremity muscle activity was recorded. Descriptive statistics, a repeated-measures analysis of variance model, and a paired t test were used to document any systematic changes in running economy, lower extremity muscle activities, and vVO2max within and across subjects as a function of footwear condition. Decreases in VO2 at several moderate exercise intensities (footwear effect: F(1,5) = 10.37, P = .023) and increases in vVO2max (t(5) = 4.20, P = .008) occurred in all 6 subjects while wearing their orthotic intervention vs their shoe-fitted insoles. There were no consistent changes in lower extremity muscle activity. The methodological decisions to use a sustained incremental exercise protocol at several moderate exercise intensities and to measure comfort perception of a custom-molded foot orthosis were effective at documenting systematic improvements in running economy among the 6 recreational runners tested. The development of a less physically demanding sustained exercise protocol is necessary to determine underlying neuromuscular mechanisms and/or the clinical effectiveness of orthotic interventions. Copyright © 2012 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.

  19. Integrated Response Time Evaluation Methodology for the Nuclear Safety Instrumentation System

    NASA Astrophysics Data System (ADS)

    Lee, Chang Jae; Yun, Jae Hee

    2017-06-01

    Safety analysis for a nuclear power plant establishes not only an analytical limit (AL) in terms of a measured or calculated variable but also an analytical response time (ART) required to complete protective action after the AL is reached. If the two constraints are met, the safety limit selected to maintain the integrity of the physical barriers used for preventing uncontrolled radioactivity release will not be exceeded during anticipated operational occurrences and postulated accidents. Setpoint determination methodologies have actively been developed to ensure that the protective action is initiated before the process conditions reach the AL. However, regarding the ART for a nuclear safety instrumentation system, an integrated evaluation methodology considering the whole design process has not been systematically studied. In order to assure the safety of nuclear power plants, this paper proposes a systematic and integrated response time evaluation methodology that covers safety analyses, system designs, response time analyses, and response time tests. This methodology is applied to the safety instrumentation systems of the Advanced Power Reactor 1400 and the Optimized Power Reactor 1000 nuclear power plants in South Korea, and the quantitative evaluation results are provided herein. The evaluation results using the proposed methodology demonstrate that the nuclear safety instrumentation systems fully satisfy the corresponding requirements of the ART.

  20. Methodology for testing infrared focal plane arrays in simulated nuclear radiation environments

    NASA Astrophysics Data System (ADS)

    Divita, E. L.; Mills, R. E.; Koch, T. L.; Gordon, M. J.; Wilcox, R. A.; Williams, R. E.

    1992-07-01

    This paper summarizes test methodology for focal plane array (FPA) testing that can be used for benign (clear) and radiation environments, and describes the use of custom dewars and integrated test equipment in an example environment. The test methodology, consistent with American Society for Testing Materials (ASTM) standards, is presented for the total accumulated gamma dose, transient dose rate, gamma flux, and neutron fluence environments. The merits and limitations of using Cobalt 60 for gamma environment simulations and of using various fast-neutron reactors and neutron sources for neutron simulations are presented. Test result examples are presented to demonstrate test data acquisition and FPA parameter performance under different measurement conditions and environmental simulations.

  1. A comprehensive methodology for the multidimensional and synchronic data collecting in soundscape.

    PubMed

    Kogan, Pablo; Turra, Bruno; Arenas, Jorge P; Hinalaf, María

    2017-02-15

    The soundscape paradigm comprises complex living systems where individuals interact moment-by-moment with one another and with the physical environment. Real environments provide promising conditions for revealing deep soundscape behavior, including the multiple components involved and their interrelations as a whole. However, measuring and analyzing the numerous simultaneous variables of soundscape represents a challenge that is not completely understood. This work proposes and applies a comprehensive methodology for multidimensional and synchronic data collection in soundscape. The soundscape variables were organized into three main entities: experienced environment, acoustic environment, and extra-acoustic environment, containing, in turn, subgroups of variables called components. The variables contained in these components were acquired through synchronic field techniques that include surveys, acoustic measurements, audio recordings, photography, and video. The proposed methodology was tested, optimized, and applied in diverse open environments, including squares, parks, fountains, university campuses, streets, and pedestrian areas. The systematization of this comprehensive methodology provides a framework for soundscape research, a support for urban and environmental management, and a preliminary procedure for standardization of soundscape data collection. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Speciated arsenic in air: measurement methodology and risk assessment considerations.

    PubMed

    Lewis, Ari S; Reid, Kim R; Pollock, Margaret C; Campleman, Sharan L

    2012-01-01

    Accurate measurement of arsenic (As) in air is critical to providing a more robust understanding of arsenic exposures and associated human health risks. Although there is extensive information available on total arsenic in air, less is known on the relative contribution of each arsenic species. To address this data gap, the authors conducted an in-depth review of available information on speciated arsenic in air. The evaluation included the type of species measured and the relative abundance, as well as an analysis of the limitations of current analytical methods. Despite inherent differences in the procedures, most techniques effectively separated arsenic species in the air samples. Common analytical techniques such as inductively coupled plasma mass spectrometry (ICP-MS) and/or hydride generation (HG)- or quartz furnace (GF)-atomic absorption spectrometry (AAS) were used for arsenic measurement in the extracts, and provided some of the most sensitive detection limits. The current analysis demonstrated that, despite limited comparability among studies due to differences in seasonal factors, study duration, sample collection methods, and analytical methods, research conducted to date is adequate to show that arsenic in air is mainly in the inorganic form. Reported average concentrations of As(III) and As(V) ranged up to 7.4 and 10.4 ng/m3, respectively, with As(V) being more prevalent than As(III) in most studies. Concentrations of the organic methylated arsenic compounds are negligible (in the pg/m3 range). However because of the variability in study methods and measurement methodology, the authors were unable to determine the variation in arsenic composition as a function of source or particulate matter (PM) fraction. In this work, the authors include the implications of arsenic speciation in air on potential exposure and risks. The authors conclude that it is important to synchronize sample collection, preparation, and analytical techniques in order to generate

  3. Validation of a CFD Methodology for Variable Speed Power Turbine Relevant Conditions

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.; Giel, Paul W.; McVetta, Ashlie B.

    2013-01-01

    Analysis tools are needed to investigate the aerodynamic performance of Variable-Speed Power Turbines (VSPT) for rotorcraft applications. The VSPT operates at low Reynolds numbers (transitional flow) and over a wide range of incidence. Previously, the capability of a published three-equation turbulence model to predict accurately the transition location for three-dimensional heat transfer problems was assessed. In this paper, the results of a post-diction exercise using a three-dimensional flow in a transonic linear cascade comprising VSPT blading are presented. The measured blade pressure distributions and exit total pressure and flow angles for two incidence angles corresponding to cruise (i = 5.8 deg) and takeoff (i = -36.7 deg) were used for this study. For the higher loading condition of cruise and the negative incidence condition of takeoff, overall agreement with data may be considered satisfactory, but areas of needed improvement are also indicated.

  4. Signal conditioning units for vibration measurement in HUMS

    NASA Astrophysics Data System (ADS)

    Wu, Kaizhi; Liu, Tingting; Yu, Zirong; Chen, Lijuan; Huang, Xinjie

    2018-03-01

    A signal conditioning unit for vibration measurement in HUMS is proposed in this paper. Because the vibration frequencies caused by different helicopter components differ, a two-stage amplifier and a programmable anti-aliasing filter are designed to meet the measurement needs of different types of helicopter. Vibration signals are first converted into measurable electrical signals using an ICP driver. A pre-amplifier and a programmable gain amplifier are then applied to amplify the weak electrical signals. In addition, the programmable anti-aliasing filter is used to remove noise interference. The unit was tested using a function signal generator and an oscilloscope. The experimental results demonstrate the effectiveness of the proposed method both quantitatively and qualitatively. The method presented in this paper can meet the measurement requirements of different types of helicopter.
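    Editorial note: the programmable anti-aliasing stage can be emulated in software for illustration; the sketch below assumes a selectable-cutoff Butterworth low-pass, which is an assumption about the unit's behavior rather than a description of its actual circuit.

        import numpy as np
        from scipy.signal import butter, sosfilt

        def anti_alias(x, fs_hz, cutoff_hz, order=8):
            """Programmable low-pass: attenuate content above the selected
            cutoff so it cannot alias into the measurement band."""
            sos = butter(order, cutoff_hz, btype="low", fs=fs_hz, output="sos")
            return sosfilt(sos, x)

        # Example: keep vibration below 2 kHz from a 50 kHz acquisition stream
        t = np.arange(0, 1.0, 1 / 50000.0)
        x = np.sin(2 * np.pi * 1200 * t) + 0.5 * np.sin(2 * np.pi * 9000 * t)
        y = anti_alias(x, 50000.0, 2000.0)  # 9 kHz component strongly attenuated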

  5. Three-dimensional sensing methodology combining stereo vision and phase-measuring profilometry based on dynamic programming

    NASA Astrophysics Data System (ADS)

    Lee, Hyunki; Kim, Min Young; Moon, Jeon Il

    2017-12-01

    Phase measuring profilometry and moiré methodology have been widely applied to the three-dimensional shape measurement of target objects because of their high measuring speed and accuracy. However, these methods suffer from an inherent limitation known as the correspondence, or 2π-ambiguity, problem. Although a sensing method combining well-known stereo vision with the phase measuring profilometry (PMP) technique has been developed to overcome this problem, it still requires definite improvement in sensing speed and measurement accuracy. We propose a dynamic programming-based stereo PMP method to acquire more reliable depth information in a relatively short time. The proposed method efficiently fuses information from two stereo sensors in terms of phase and intensity simultaneously, based on a newly defined cost function for dynamic programming. In addition, the important parameters are analyzed from the viewpoint of the 2π-ambiguity problem and measurement accuracy. To analyze the influence of important hardware and software parameters related to the measurement performance and to verify the method's efficiency, accuracy, and sensing speed, a series of experimental tests was performed with various objects and sensor configurations.
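    Editorial note: the 2π-ambiguity arises because N-step PMP recovers phase only modulo 2π; the sketch below shows the standard four-step computation (generic PMP, not the authors' stereo fusion), where the recovered phase wraps every 2π of true phase.

        import numpy as np

        def wrapped_phase(i1, i2, i3, i4):
            """Four-step phase shifting (frames at 0, 90, 180, 270 degrees).
            Returns phase wrapped to (-pi, pi]; absolute phase remains
            ambiguous by multiples of 2*pi until unwrapping/correspondence."""
            return np.arctan2(i4 - i2, i1 - i3)

        # Synthetic fringes for a linearly increasing true phase:
        true_phase = np.linspace(0, 6 * np.pi, 1000)
        shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
        frames = [100 + 50 * np.cos(true_phase + s) for s in shifts]
        phi = wrapped_phase(*frames)  # the 2*pi jumps exhibit the ambiguity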

  6. Expression and Secretion of Endostar Protein by Escherichia Coli: Optimization of Culture Conditions Using the Response Surface Methodology.

    PubMed

    Mohajeri, Abbas; Abdolalizadeh, Jalal; Pilehvar-Soltanahmadi, Younes; Kiafar, Farhad; Zarghami, Nosratollah

    2016-10-01

    Endostar, a drug used in the treatment of non-small cell lung cancer, is produced using an Escherichia coli expression system. Plackett-Burman design (PBD) and response surface methodology (RSM) are statistical tools for experimental design and optimization of biotechnological processes. This investigation aimed to predict and develop the optimal culture conditions and medium components for expression and secretion of endostar into the culture medium of E. coli. The synthetic endostar coding sequence was fused with the PhoA signal peptide. The nine factors involved in the production of the recombinant protein (postinduction temperature, cell density, rotation speed, postinduction time, and concentrations of glycerol, IPTG, peptone, glycine, and Triton X-100) were evaluated using PBD. Four significant factors were selected based on the PBD results for optimizing the culture conditions using RSM. Endostar was purified using cation exchange chromatography and size exclusion chromatography. The maximum level of endostar was obtained under the following conditions: 13.57 h postinduction time, 0.76% glycine, 0.7% Triton X-100, and 4.87% glycerol. Predicted levels of endostar were significantly correlated with experimental levels (R2 = 0.982, P = 0.00). The obtained results indicate that PBD and RSM are effective tools for optimizing the culture conditions and components for endostar production in E. coli. The most important factors in the enhancement of protein production are glycerol, glycine, and postinduction time.

  8. Improving inferior vena cava filter retrieval rates with the define, measure, analyze, improve, control methodology.

    PubMed

    Sutphin, Patrick D; Reis, Stephen P; McKune, Angie; Ravanzo, Maria; Kalva, Sanjeeva P; Pillai, Anil K

    2015-04-01

    To design a sustainable process to improve optional inferior vena cava (IVC) filter retrieval rates, based on the Define, Measure, Analyze, Improve, Control (DMAIC) methodology of the Six Sigma process improvement paradigm. The DMAIC methodology was employed to design and implement a quality improvement project to increase IVC filter retrieval rates at a tertiary academic hospital. Retrievable IVC filters were placed in 139 patients over a 2-year period. The baseline IVC filter retrieval rate (n = 51) was reviewed through a retrospective analysis, and two strategies were devised to improve the filter retrieval rate: (a) mailing of letters to clinicians and patients for patients who had filters placed within 8 months of implementation of the project (n = 43) and (b) prospective automated scheduling of a clinic visit at 4 weeks after filter placement for all new patients (n = 45). The effectiveness of these strategies was assessed by measuring the filter retrieval rates and the estimated increase in revenue to interventional radiology. IVC filter retrieval rates increased from a baseline of 8% to 40% with the mailing of letters and to 52% with the automated scheduling of a clinic visit 4 weeks after IVC filter placement. The estimated revenue per 100 IVC filters placed increased from $2,249 to $10,518 with the mailing of letters and to $17,022 with the automated scheduling of a clinic visit. Using the DMAIC methodology, a simple and sustainable quality improvement intervention was devised that markedly improved IVC filter retrieval rates in eligible patients. Copyright © 2015 SIR. Published by Elsevier Inc. All rights reserved.
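
    The Measure step of such a project reduces to simple rate arithmetic; the sketch below reproduces the reported percentages and revenue figures, with the retrieved counts being rounded estimates derived from the published rates rather than raw data from the paper.

```python
# Back-of-envelope check of the reported improvements: retrieval rate and
# estimated revenue per 100 filters at each DMAIC stage.
stages = [
    ("baseline",               51, 0.08,  2249),
    ("letters mailed",         43, 0.40, 10518),
    ("automated clinic visit", 45, 0.52, 17022),
]
for name, placed, rate, rev_per_100 in stages:
    retrieved = round(placed * rate)   # rounded estimate, not raw data
    print(f"{name}: ~{retrieved}/{placed} retrieved ({rate:.0%}), "
          f"est. revenue per 100 filters ${rev_per_100:,}")
```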

  9. Guidelines for reporting methodological challenges and evaluating potential bias in dementia research.

    PubMed

    Weuve, Jennifer; Proust-Lima, Cécile; Power, Melinda C; Gross, Alden L; Hofer, Scott M; Thiébaut, Rodolphe; Chêne, Geneviève; Glymour, M Maria; Dufouil, Carole

    2015-09-01

    Clinical and population research on dementia and related neurologic conditions, including Alzheimer's disease, faces several unique methodological challenges. Progress to identify preventive and therapeutic strategies rests on valid and rigorous analytic approaches, but the research literature reflects little consensus on "best practices." We present findings from a large scientific working group on research methods for clinical and population studies of dementia, which identified five categories of methodological challenges as follows: (1) attrition/sample selection, including selective survival; (2) measurement, including uncertainty in diagnostic criteria, measurement error in neuropsychological assessments, and practice or retest effects; (3) specification of longitudinal models when participants are followed for months, years, or even decades; (4) time-varying measurements; and (5) high-dimensional data. We explain why each challenge is important in dementia research and how it could compromise the translation of research findings into effective prevention or care strategies. We advance a checklist of potential sources of bias that should be routinely addressed when reporting dementia research. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Quantification of Greenhouse Gas Emission Rates from strong Point Sources by Airborne IPDA-Lidar Measurements: Methodology and Experimental Results

    NASA Astrophysics Data System (ADS)

    Ehret, G.; Amediek, A.; Wirth, M.; Fix, A.; Kiemle, C.; Quatrevalet, M.

    2016-12-01

    We report on a new method, and on its first demonstration, to quantify emission rates from strong greenhouse gas (GHG) point sources using airborne Integrated Path Differential Absorption (IPDA) lidar measurements. In order to build trust in the emission rates self-reported by countries, verification against independent monitoring systems is a prerequisite for checking the reported budgets. A significant fraction of the total anthropogenic emissions of CO2 and CH4 originates from localized strong point sources such as large energy production sites or landfills. Neither is monitored with sufficient accuracy by the current observation system. There is debate as to whether airborne remote sensing could fill this gap by inferring emission rates from budgeting or from Gaussian plume inversion approaches, whereby measurements of the GHG column abundance beneath the aircraft are used to constrain inverse models. In contrast to passive sensors, the use of an active instrument like CHARM-F for such emission verification measurements is new. CHARM-F is a new airborne IPDA lidar devised for the German research aircraft HALO for the simultaneous measurement of the column-integrated dry-air mixing ratios of CO2 and CH4, commonly denoted as XCO2 and XCH4, respectively. It has been successfully tested in a series of flights over Central Europe to assess its performance under various reflectivity conditions and in strongly varying topography such as the Alps. The analysis of a methane plume measured in the crosswind direction of a coal mine ventilation shaft revealed an instantaneous emission rate of 9.9 ± 1.7 kt CH4 yr-1. We discuss the methodology of our point source estimation approach and give an outlook on the CoMet field experiment scheduled in 2017, in which anthropogenic and natural GHG emissions will be measured by a combination of active and passive remote sensing instruments on research aircraft.
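
    A hedged sketch of the cross-wind mass-balance ("budgeting") idea: the emission rate is the wind speed multiplied by the integral of the column enhancement across the plume. The Gaussian enhancement profile, its amplitude, and the wind speed below are synthetic, chosen only so the result lands near the order of magnitude reported for the ventilation shaft.

```python
import numpy as np

y = np.linspace(-1000.0, 1000.0, 201)          # cross-wind distance [m]
# synthetic Gaussian column enhancement of CH4 [kg m^-2]
dcol = 1.7e-4 * np.exp(-0.5 * (y / 150.0) ** 2)
u = 5.0                                        # wind speed at plume height [m/s], assumed

# trapezoidal integration of the enhancement across the plume [kg m^-1]
integral = np.sum(0.5 * (dcol[1:] + dcol[:-1]) * np.diff(y))
Q = u * integral                               # emission rate [kg/s]
# ~10 kt/yr for these synthetic values, the same order as the shaft estimate above
print(f"Q = {Q:.3f} kg/s = {Q * 3.156e7 / 1e6:.1f} kt CH4/yr")
```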

  11. Assessing the Practical Equivalence of Conversions when Measurement Conditions Change

    ERIC Educational Resources Information Center

    Liu, Jinghua; Dorans, Neil J.

    2012-01-01

    At times, the same set of test questions is administered under different measurement conditions that might affect the psychometric properties of the test scores enough to warrant different score conversions for the different conditions. We propose a procedure for assessing the practical equivalence of conversions developed for the same set of test…

  12. 42 CFR 410.31 - Bone mass measurement: Conditions for coverage and frequency standards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 2 2011-10-01 2011-10-01 false Bone mass measurement: Conditions for coverage and... Medical and Other Health Services § 410.31 Bone mass measurement: Conditions for coverage and frequency... applies: Bone mass measurement means a radiologic, radioisotopic, or other procedure that meets the...

  13. 42 CFR 410.31 - Bone mass measurement: Conditions for coverage and frequency standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 2 2012-10-01 2012-10-01 false Bone mass measurement: Conditions for coverage and... Medical and Other Health Services § 410.31 Bone mass measurement: Conditions for coverage and frequency... applies: Bone mass measurement means a radiologic, radioisotopic, or other procedure that meets the...

  14. 42 CFR 410.31 - Bone mass measurement: Conditions for coverage and frequency standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 2 2014-10-01 2014-10-01 false Bone mass measurement: Conditions for coverage and... Medical and Other Health Services § 410.31 Bone mass measurement: Conditions for coverage and frequency... applies: Bone mass measurement means a radiologic, radioisotopic, or other procedure that meets the...

  15. 42 CFR 410.31 - Bone mass measurement: Conditions for coverage and frequency standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Bone mass measurement: Conditions for coverage and... Medical and Other Health Services § 410.31 Bone mass measurement: Conditions for coverage and frequency... applies: Bone mass measurement means a radiologic, radioisotopic, or other procedure that meets the...

  16. 42 CFR 410.31 - Bone mass measurement: Conditions for coverage and frequency standards.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 2 2013-10-01 2013-10-01 false Bone mass measurement: Conditions for coverage and... Medical and Other Health Services § 410.31 Bone mass measurement: Conditions for coverage and frequency... applies: Bone mass measurement means a radiologic, radioisotopic, or other procedure that meets the...

  17. Vital physical signals measurements using a webcam

    NASA Astrophysics Data System (ADS)

    Ouyang, Jianfei; Yan, Yonggang; Yao, Lifeng

    2013-10-01

    Non-contact, remote measurement of vital physical signals is important for reliable and comfortable physiological self-assessment. In this paper, we present a new video-based methodology for remote and fast measurement of vital physical signals such as cardiac pulse and breathing rate. A webcam is used to capture color video of a human face or wrist, and a photoplethysmography (PPG) technique is applied to measure the vital signals. A novel sequential blind signal extraction methodology is applied to the color video under normal lighting conditions, based on correlation analysis between the green trace and the source signals. The approach was also applied successfully under different illumination conditions, in which the target signal could still be extracted accurately. To assess its advantages, the measurement time was recorded for a large number of cases. The experimental results show that measuring the vital physical signals with the presented technique takes less than 30 seconds. The study indicates that the proposed approach is feasible for the PPG technique and provides a way to study the relationship between signals from different ROIs in future research.
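
    An illustrative sketch of the spectral end of such a pipeline (not the authors' exact blind-extraction method): band-pass filter the mean green-channel trace of a region of interest and read the pulse rate off the dominant spectral peak. The frame rate and the synthetic trace are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(0)
fs = 30.0                                   # assumed webcam frame rate [Hz]
t = np.arange(0, 30, 1 / fs)                # ~30 s of frames
# synthetic green-channel ROI trace: 72 bpm pulse + slow illumination drift + noise
green = (0.02 * np.sin(2 * np.pi * 1.2 * t)
         + 0.30 * np.sin(2 * np.pi * 0.05 * t)
         + 0.01 * rng.standard_normal(t.size))

# band-pass 0.7-4.0 Hz (42-240 bpm) to isolate the cardiac component
b, a = butter(3, [0.7 / (fs / 2), 4.0 / (fs / 2)], btype="band")
pulse = filtfilt(b, a, green)

spec = np.abs(np.fft.rfft(pulse)) ** 2
freqs = np.fft.rfftfreq(pulse.size, 1 / fs)
band = (freqs >= 0.7) & (freqs <= 4.0)
hr = 60 * freqs[band][np.argmax(spec[band])]
print(f"estimated pulse rate: {hr:.0f} bpm")   # ~72 bpm for this synthetic trace
```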

  18. The relationship between ground conditions and injury: what level of evidence do we have?

    PubMed

    Petrass, Lauren A; Twomey, Dara M

    2013-03-01

    To identify studies that address the relationship between ground conditions and injury in a sporting context, to evaluate current practice, and to provide recommendations for future studies that measure ground conditions and injury risk. Systematic review. A comprehensive search of electronic databases from the earliest records available until the end of 2011, supplemented by hand searching, was conducted to identify relevant studies. A classification scale was used to rate the methodological quality of studies. 79 potentially relevant articles were identified, and 27 met all inclusion criteria. They varied in methodological quality, with analytical observational studies the most common design, although four descriptive observational studies, considered to be of lower quality, were also identified. Only five studies objectively measured ground conditions, and of the studies that used subjective assessment, only one provided descriptors to explain their classifications. It appears that harder/drier grounds are associated with an increased injury risk, but the presence of major limitations necessitates cautious interpretation of many key findings. There is limited high-quality evidence of the relationship between injury risk and ground conditions. Further research with high-quality designs and measurement of ground conditions is required to draw more definitive conclusions regarding this relationship. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  19. Measurement properties of tools used to assess depression in adults with and without autism spectrum conditions: A systematic review.

    PubMed

    Cassidy, S A; Bradley, L; Bowen, E; Wigham, S; Rodgers, J

    2018-01-23

    Depression is the most commonly experienced mental health condition in adults with autism spectrum conditions (ASC). However, it is unclear what tools are currently being used to assess depression in ASC, or whether tools need to be adapted for this group. This systematic review therefore aimed to identify tools used to assess depression in adults with and without ASC, and then evaluate these tools for their appropriateness and measurement properties. Medline, PsychINFO and Web of Knowledge were searched for studies of depression in: (a) adults with ASC, without co-morbid intellectual disability; and (b) adults from the general population without co-morbid conditions. Articles examining the measurement properties of these tools were then searched for using a methodological filter in PubMed, and the quality of the evidence was evaluated using the COSMIN checklist. Twelve articles were identified which utilized three tools to assess depression in adults with ASC, but only one article which assessed the measurement properties of one of these tools was identified and thus evaluated. Sixty-four articles were identified which utilized five tools to assess depression in general population adults, and fourteen articles had assessed the measurement properties of these tools. Overall, two tools were found to be robust in their measurement properties in the general population-the Beck Depression Inventory (BDI-II), and the patient health questionnaire (PHQ-9). Crucially only one study was identified from the COSMIN search, which showed weak evidence in support of the measurement properties of the BDI-II in an ASC sample. Implications for effective measurement of depression in ASC are discussed. Autism Res 2018. © 2018 The Authors Autism Research published by International Society for Autism Research and Wiley Periodicals, Inc. Depression is the most common mental health problem experienced by adults with autism. However, the current study found very limited evidence

  20. Measurement in Learning Games Evolution: Review of Methodologies Used in Determining Effectiveness of "Math Snacks" Games and Animations

    ERIC Educational Resources Information Center

    Trujillo, Karen; Chamberlin, Barbara; Wiburg, Karin; Armstrong, Amanda

    2016-01-01

    This article captures the evolution of research goals and methodologies used to assess the effectiveness and impact of a set of mathematical educational games and animations for middle-school aged students. The researchers initially proposed using a mixed model research design of formative and summative measures, such as user-testing,…

  1. The COSMIN checklist for assessing the methodological quality of studies on measurement properties of health status measurement instruments: an international Delphi study.

    PubMed

    Mokkink, Lidwine B; Terwee, Caroline B; Patrick, Donald L; Alonso, Jordi; Stratford, Paul W; Knol, Dirk L; Bouter, Lex M; de Vet, Henrica C W

    2010-05-01

    The aim of the COSMIN study (COnsensus-based Standards for the selection of health status Measurement INstruments) was to develop a consensus-based checklist to evaluate the methodological quality of studies on measurement properties. We present the COSMIN checklist and the agreement of the panel on the items of the checklist. A four-round Delphi study was performed with international experts (psychologists, epidemiologists, statisticians and clinicians). Of the 91 invited experts, 57 agreed to participate (63%). Panel members were asked to rate their (dis)agreement with each proposal on a five-point scale. Consensus was considered to be reached when at least 67% of the panel members indicated 'agree' or 'strongly agree'. Consensus was reached on the inclusion of the following measurement properties: internal consistency, reliability, measurement error, content validity (including face validity), construct validity (including structural validity, hypotheses testing and cross-cultural validity), criterion validity, responsiveness, and interpretability. The latter was not considered a measurement property. The panel also reached consensus on how these properties should be assessed. The resulting COSMIN checklist could be useful when selecting a measurement instrument, peer-reviewing a manuscript, designing or reporting a study on measurement properties, or for educational purposes.

  2. Measuring domestic water use: a systematic review of methodologies that measure unmetered water use in low-income settings.

    PubMed

    Tamason, Charlotte C; Bessias, Sophia; Villada, Adriana; Tulsiani, Suhella M; Ensink, Jeroen H J; Gurley, Emily S; Mackie Jensen, Peter Kjaer

    2016-11-01

    To present a systematic review of methods for measuring domestic water use in settings where water meters cannot be used. We systematically searched EMBASE, PubMed, Water Intelligence Online, Water Engineering and Development Center, IEEExplore, Scielo, and Science Direct databases for articles that reported methodologies for measuring water use at the household level where water metering infrastructure was absent or incomplete. A narrative review explored similarities and differences between the included studies and provided recommendations for future research on water use. A total of 21 studies were included in the review. Methods ranged from single-day to 14-consecutive-day visits, and water use recall ranged from 12 h to 7 days. Data were collected using questionnaires, observations or both. Many studies only collected information on water that was carried into the household, and some failed to mention whether water was used outside the home. Water use in the selected studies was found to range from 2 to 113 l per capita per day. No standardised methods for measuring unmetered water use were found, which brings into question the validity and comparability of studies that have measured unmetered water use. In future studies, it will be essential to define all components that make up water use and determine how they will be measured. A pre-study that involves observations and direct measurements during water collection periods (these will have to be determined through questioning) should be used to determine optimal methods for obtaining water use information in a survey. Day-to-day and seasonal variation should be included. A study that investigates water use recall is warranted to further develop standardised methods to measure water use; in the meantime, water use recall should be limited to 24 h or fewer. © 2016 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  3. Standardized Assay Medium To Measure Lactococcus lactis Enzyme Activities while Mimicking Intracellular Conditions

    PubMed Central

    Goel, Anisha; Santos, Filipe; de Vos, Willem M.; Teusink, Bas

    2012-01-01

    Knowledge of how the activity of enzymes is affected under in vivo conditions is essential for analyzing their regulation and constructing models that yield an integrated understanding of cell behavior. Current kinetic parameters for Lactococcus lactis are scattered across different studies and were obtained under different assay conditions. Furthermore, assay conditions often diverge from conditions prevailing in the intracellular environment. To establish uniform assay conditions that resemble intracellular conditions, we analyzed the intracellular composition of anaerobic glucose-limited chemostat cultures of L. lactis subsp. cremoris MG 1363. Based on this, we designed a new assay medium for enzyme activity measurements of growing cells of L. lactis, mimicking as closely as practically possible its intracellular environment. Procedures were optimized to be carried out in 96-well plates, and the reproducibility and dynamic range were checked for all enzyme activity measurements. The effects of freezing and of the carryover of ammonium sulfate from the addition of coupling enzymes were also established. The activities of all 10 glycolytic and 4 fermentative enzymes were measured. Remarkably, most in vivo-like activities were lower than previously published values. Yet, the ratios of Vmax over the measured in vivo fluxes were above 1. With this work, we have developed and extensively validated standard protocols for enzyme activity measurements for L. lactis. PMID:22020503

  4. On the determination of χ(2) in thin films: a comparison of one-beam second-harmonic generation measurement methodologies

    PubMed Central

    Hermans, Artur; Kieninger, Clemens; Koskinen, Kalle; Wickberg, Andreas; Solano, Eduardo; Dendooven, Jolien; Kauranen, Martti; Clemmen, Stéphane; Wegener, Martin; Koos, Christian; Baets, Roel

    2017-01-01

    The determination of the second-order susceptibility (χ(2)) of thin film samples can be a delicate matter since well-established χ(2) measurement methodologies such as the Maker fringe technique are best suited for nonlinear materials with large thicknesses typically ranging from tens of microns to several millimeters. Here we compare two different second-harmonic generation setups and the corresponding measurement methodologies that are especially advantageous for thin film χ(2) characterization. This exercise allows for cross-checking the χ(2) obtained for identical samples and identifying the main sources of error for the respective techniques. The development of photonic integrated circuits makes nonlinear thin films of particular interest, since they can be processed into long waveguides to create efficient nonlinear devices. The investigated samples are ABC-type nanolaminates, which were reported recently by two different research groups. However, the subsequent analysis can be useful for all researchers active in the field of thin film χ(2) characterization. PMID:28317938

  5. The use of an optical method to evaluate prokaryotic oxygen consumption under high pressure condition

    NASA Astrophysics Data System (ADS)

    Garel, M.; Martini, S.; Lefèvre, D.; Tamburini, C.

    2016-02-01

    Heterotrophic prokaryotes are the main contributors to organic matter degradation in the ocean, and particularly in the deep ocean. At present, a classical way to evaluate the prokaryotic carbon demand (PCD) requires estimating both the prokaryotic heterotrophic production (PHP) and the prokaryotic respiration (PR). PHP measurements in deep-sea waters are relatively well documented, and the importance of maintaining in situ conditions (pressure and temperature) to avoid biasing the real deep-sea activities has been highlighted. However, no accurate methodology is available to measure PR directly in the dark ocean under in situ conditions (pressure and temperature). This study presents PR measurements under in situ conditions. High-pressure bottles have been fitted with a non-invasive sensor to measure prokaryotic oxygen consumption. The methodology is based on fluorescence quenching, in which molecular oxygen quenches the luminescence of a planar-optode oxygen sensor widely used in oceanography. First, the accuracy, detection limit, precision, and response time of the oxygen concentration measurements were investigated in relation to increasing hydrostatic pressure. Second, we present experiments performed on a natural prokaryotic consortium mixed with freshly collected particles to assess O2 consumption under increasing hydrostatic pressure (150 m depth per day). Finally, we discuss the first results of coupled PHP and PR measurements under in situ conditions (temperature and pressure) from mesopelagic and bathypelagic samples of the Atlantic Ocean (PAP site).

  6. Concurrent measurement of "real-world" stress and arousal in individuals with psychosis: assessing the feasibility and validity of a novel methodology.

    PubMed

    Kimhy, David; Delespaul, Philippe; Ahn, Hongshik; Cai, Shengnan; Shikhman, Marina; Lieberman, Jeffrey A; Malaspina, Dolores; Sloan, Richard P

    2010-11-01

    Psychosis has been repeatedly suggested to be affected by increases in stress and arousal. However, there is a dearth of evidence supporting the temporal link between stress, arousal, and psychosis during "real-world" functioning. This paucity of evidence may stem from limitations of current research methodologies. Our aim is to test the feasibility and validity of a novel methodology designed to measure concurrent stress and arousal in individuals with psychosis during "real-world" daily functioning. Twenty patients with psychosis completed a 36-hour ambulatory assessment of stress and arousal. We used the experience sampling method with palm computers to assess stress (10 times per day, 10 AM → 10 PM) along with concurrent ambulatory measurement of cardiac autonomic regulation using a Holter monitor. The clocks of the palm computer and Holter monitor were synchronized, allowing the temporal linking of the stress and arousal data. We used power spectral analysis to determine the parasympathetic contributions to autonomic regulation and sympathovagal balance during the 5 minutes before and after each experience sample. Patients completed 79% of the experience samples (75% with valid concurrent arousal data). Momentary increases in stress were inversely correlated with concurrent parasympathetic activity (ρ = -.27, P < .0001) and positively correlated with sympathovagal balance (ρ = .19, P = .0008). Stress and heart rate were not significantly related (ρ = -.05, P = .3875). The findings support the feasibility and validity of our methodology in individuals with psychosis. The methodology offers a novel way to study, in high time resolution, the concurrent "real-world" interactions between stress, arousal, and psychosis. The authors discuss the methodology's potential applications and future research directions.
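
    A sketch of the spectral step under common HRV conventions: high-frequency (0.15-0.40 Hz) power of the RR-interval series indexes parasympathetic activity, the LF/HF ratio indexes sympathovagal balance, and momentary stress ratings can then be rank-correlated with these indices. All signals below are synthetic.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

def lf_hf(rr_ms, fs=4.0):
    """LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) power of an RR series [ms]."""
    t_beats = np.cumsum(rr_ms) / 1000.0               # beat times [s]
    t_grid = np.arange(t_beats[0], t_beats[-1], 1 / fs)
    rr_even = np.interp(t_grid, t_beats, rr_ms)       # evenly resampled tachogram
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
    df = f[1] - f[0]
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df
    return lf, hf

rr = 800 + 50 * rng.standard_normal(400)              # synthetic ~5-min RR series
lf, hf = lf_hf(rr)
print(f"LF/HF (sympathovagal balance): {lf / hf:.2f}")

# toy stress ratings vs HF power, mimicking the reported inverse relation
stress = rng.integers(1, 8, size=30)
hf_toy = 1.0 - 0.05 * stress + 0.1 * rng.standard_normal(30)
rho, p = spearmanr(stress, hf_toy)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```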

  7. Rapid condition assessment of structural condition after a blast using state-space identification

    NASA Astrophysics Data System (ADS)

    Eskew, Edward; Jang, Shinae

    2015-04-01

    After a blast event, it is important to quickly quantify the structural damage for emergency operations. In order to improve the speed, accuracy, and efficiency of condition assessments after a blast, the authors have previously developed a methodology for rapid assessment of the structural condition of a building after a blast. The method involved determining a post-event equivalent stiffness matrix using vibration measurements and a finite element (FE) model. A structural model was built for the damaged structure based on the equivalent stiffness, and inter-story drifts from the blast are determined using numerical simulations, with forces determined from the blast parameters. The inter-story drifts are then compared to blast design conditions to assess the structure's damage. This method still involved engineering judgment in determining significant frequencies, which can lead to error, especially with noisy measurements. In an effort to improve accuracy and automate the process, this paper will look into a similar method of rapid condition assessment using subspace state-space identification. The accuracy of the method will be tested using a benchmark structural model, as well as experimental testing. The blast damage assessments will be validated using pressure-impulse (P-I) diagrams, which present the condition limits across blast parameters. Comparisons between P-I diagrams generated using the true system parameters and equivalent parameters will show the accuracy of the rapid condition-based blast assessments.
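
    A toy illustration of the equivalent-stiffness idea behind such assessments: for a lumped single-story model, an identified natural frequency f maps to stiffness through k = m(2πf)^2, so a drop in the identified frequency quantifies stiffness loss. The mass and frequencies are invented.

```python
import math

# Single-story lumped model: identified natural frequency -> equivalent stiffness.
m = 2.0e5                        # story mass [kg], assumed
f_before, f_after = 2.50, 2.10   # identified natural frequencies [Hz], invented

k = lambda f: m * (2 * math.pi * f) ** 2   # k = m * omega^2
loss = 1 - k(f_after) / k(f_before)
print(f"k: {k(f_before):.3g} -> {k(f_after):.3g} N/m "
      f"(equivalent stiffness loss {loss:.1%})")   # ~29% loss here
```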

  8. Optimization of conditions for probiotic curd formulation by Enterococcus faecium MTCC 5695 with probiotic properties using response surface methodology.

    PubMed

    Ramakrishnan, Vrinda; Goveas, Louella Concepta; Prakash, Maya; Halami, Prakash M; Narayan, Bhaskar

    2014-11-01

    Enterococcus faecium MTCC 5695, which possesses potential probiotic properties as well as enterocin-producing ability, was used as the starter culture. The effects of time (12-24 h) and inoculum level (3-7 % v/v) on cell growth, bacteriocin production, antioxidant property, titratable acidity and pH of curd were studied by response surface methodology (RSM). The optimized conditions were 26.48 h and 2.17 % v/v inoculum, and the second-order model was validated. Co-cultivation studies revealed that the formulated product had the ability to prevent the growth of foodborne pathogens that affect the keeping quality of the product during storage. The results indicated that application of E. faecium MTCC 5695 under the optimized conditions led to the formation of a highly consistent, well-set curd with bioactive and bioprotective properties. The formulated curd with potential probiotic attributes can be used as a therapeutic agent for the treatment of foodborne diseases like traveler's diarrhea and gastroenteritis, thereby helping to improve bowel health.

  9. Ethical and methodological issues in qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions: a critical review.

    PubMed

    Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika

    2017-01-01

    Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness.

  10. Ethical and methodological issues in qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions: a critical review

    PubMed Central

    Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika

    2017-01-01

    ABSTRACT Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness. PMID:28901217

  11. Optimization of physical conditions for the production of thermostable T1 lipase in Pichia guilliermondii strain SO using response surface methodology.

    PubMed

    Abu, Mary Ladidi; Nooh, Hisham Mohd; Oslan, Siti Nurbaya; Salleh, Abu Bakar

    2017-11-10

    Pichia guilliermondii was found capable of expressing recombinant thermostable T1 lipase without methanol, under the control of the methanol-dependent alcohol oxidase 1 promoter (AOXp 1). In this study, statistical approaches were employed for the screening and optimisation of physical conditions for T1 lipase production in P. guilliermondii. Screening of six physical conditions by Plackett-Burman Design identified pH, inoculum size and incubation time as exerting significant effects on lipase production. These three conditions were further optimised using the Box-Behnken Design of Response Surface Methodology, which predicted an optimum comprising pH 6, a 24-h incubation time and a 2 % inoculum size. A T1 lipase activity of 2.0 U/mL was produced, with a biomass of OD600 23.0. Optimisation by RSM yielded a 3-fold increase in T1 lipase over the medium before optimisation. These results prove that T1 lipase can be produced at a higher yield in P. guilliermondii.

  12. Phoretic and Radiometric Force Measurements on Microparticles in Microgravity Conditions

    NASA Technical Reports Server (NTRS)

    Davis, E. James

    1996-01-01

    Thermophoretic, diffusiophoretic and radiometric forces on microparticles are being measured over a wide range of gas phase and particle conditions using electrodynamic levitation of single particles to simulate microgravity conditions. The thermophoretic force, which arises when a particle exists in a gas having a temperature gradient, is measured by levitating an electrically charged particle between heated and cooled plates mounted in a vacuum chamber. The diffusiophoretic force arising from a concentration gradient in the gas phase is measured in a similar manner except that the heat exchangers are coated with liquids to establish a vapor concentration gradient. These phoretic forces and the radiation pressure force acting on a particle are measured directly in terms of the change in the dc field required to levitate the particle with and without the force applied. The apparatus developed for the research and the experimental techniques are discussed, and results obtained by thermophoresis experiments are presented. The determination of the momentum and energy accommodation coefficients associated with molecular collisions between gas molecules and particles, and the measurement of the interaction between electromagnetic radiation and small particles, are of particular interest.

  13. Regional Shelter Analysis Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country-specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
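
    A schematic of the core combination step, with invented numbers: the expected dose fraction is the population-weighted sum of 1/PF over building protection factors (equivalently, the effective PF is a population-weighted harmonic mean). The categories, weights, and PFs below are illustrative, not the report's values.

```python
# Illustrative building categories, population fractions, and protection factors.
pop_fraction = {"wood frame house": 0.5, "masonry apartment": 0.3,
                "large office/basement": 0.2}
protection_factor = {"wood frame house": 3, "masonry apartment": 10,
                     "large office/basement": 50}

# expected fraction of outdoor dose received = sum_i w_i / PF_i
dose_fraction = sum(w / protection_factor[b] for b, w in pop_fraction.items())
print(f"population receives ~{dose_fraction:.1%} of the outdoor dose "
      f"(effective PF ~ {1 / dose_fraction:.1f})")   # ~20.1%, PF ~5.0
```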

  14. Event- and interval-based measurement of stuttering: a review.

    PubMed

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). To provide a review related to the reproducibility of event-based and time-interval measurement, and to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on the agreement of time-interval measurement; in addition, to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed inter- and intra-judge values greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be
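
    For readers unfamiliar with the agreement statistics involved, here is a minimal example of two judges scoring the same 5-s intervals, with percent agreement and Cohen's kappa (one common chance-corrected index; the reviewed studies report several). The interval labels are invented.

```python
import numpy as np

judge_a = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0])  # 1 = stuttered interval
judge_b = np.array([1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 1, 0])

def cohens_kappa(a, b):
    po = np.mean(a == b)                          # observed agreement
    p_yes = np.mean(a) * np.mean(b)               # chance both say "stuttered"
    p_no = (1 - np.mean(a)) * (1 - np.mean(b))    # chance both say "fluent"
    pe = p_yes + p_no                             # expected chance agreement
    return (po - pe) / (1 - pe)

print(f"percent agreement: {np.mean(judge_a == judge_b):.0%}")   # 83%
print(f"Cohen's kappa: {cohens_kappa(judge_a, judge_b):.2f}")    # 0.67
```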

  15. The Harmonizing Outcome Measures for Eczema (HOME) roadmap: a methodological framework to develop core sets of outcome measurements in dermatology.

    PubMed

    Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C

    2015-01-01

    Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcomes Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the consented COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care.

  16. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    PubMed

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, the translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of

  17. Experimental methodology for turbocompressor in-duct noise evaluation based on beamforming wave decomposition

    NASA Astrophysics Data System (ADS)

    Torregrosa, A. J.; Broatch, A.; Margot, X.; García-Tíscar, J.

    2016-08-01

    An experimental methodology is proposed to assess the noise emission of centrifugal turbocompressors like those of automotive turbochargers. A step-by-step procedure is detailed, starting from the theoretical considerations of sound measurement in flow ducts and examining specific experimental setup guidelines and signal processing routines. Special care is taken regarding some limiting factors that adversely affect the measuring of sound intensity in ducts, namely calibration, sensor placement and frequency ranges and restrictions. In order to provide illustrative examples of the proposed techniques and results, the methodology has been applied to the acoustic evaluation of a small automotive turbocharger in a flow bench. Samples of raw pressure spectra, decomposed pressure waves, calibration results, accurate surge characterization and final compressor noise maps and estimated spectrograms are provided. The analysis of selected frequency bands successfully shows how different, known noise phenomena of particular interest such as mid-frequency "whoosh noise" and low-frequency surge onset are correlated with operating conditions of the turbocharger. Comparison against external inlet orifice intensity measurements shows good correlation and improvement with respect to alternative wave decomposition techniques.
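
    The classical two-sensor wave decomposition that such beamforming approaches generalize can be sketched as follows: per frequency bin, the two measured pressure spectra are solved for the forward- and backward-travelling components. The sensor spacing, sampling parameters, and stand-in spectra are assumptions, and the flow (Mach-number) correction and coherence weighting that a real turbocompressor rig requires are omitted.

```python
import numpy as np

c0 = 343.0                    # speed of sound [m/s]; no-flow assumption
x1, x2 = 0.0, 0.05            # sensor axial positions [m], assumed
fs, n = 25600, 4096

rng = np.random.default_rng(0)
# stand-ins for the measured complex pressure spectra at the two sensors
P1 = rng.standard_normal(n // 2 + 1) + 1j * rng.standard_normal(n // 2 + 1)
P2 = rng.standard_normal(n // 2 + 1) + 1j * rng.standard_normal(n // 2 + 1)

f = np.fft.rfftfreq(n, 1 / fs)
k = 2 * np.pi * f / c0        # acoustic wavenumber per bin

Pp = np.zeros_like(P1)        # forward-travelling component P+
Pm = np.zeros_like(P1)        # backward-travelling component P-
for i in range(1, len(f)):    # skip DC; the 2x2 system is singular at k = 0
    A = np.array([[np.exp(-1j * k[i] * x1), np.exp(1j * k[i] * x1)],
                  [np.exp(-1j * k[i] * x2), np.exp(1j * k[i] * x2)]])
    # ill-conditioned near k*(x2 - x1) = n*pi; arrays with >2 sensors relax this
    if abs(np.linalg.det(A)) > 1e-6:
        Pp[i], Pm[i] = np.linalg.solve(A, np.array([P1[i], P2[i]]))
```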

  18. 40 CFR 201.25 - Measurement location and weather conditions for measurement on receiving property of the noise of...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Measurement location and weather conditions for measurement on receiving property of the noise of retarders, car coupling, locomotive load... EQUIPMENT; INTERSTATE RAIL CARRIERS Measurement Criteria § 201.25 Measurement location and weather...

  19. 40 CFR 201.25 - Measurement location and weather conditions for measurement on receiving property of the noise of...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Measurement location and weather conditions for measurement on receiving property of the noise of retarders, car coupling, locomotive load... EQUIPMENT; INTERSTATE RAIL CARRIERS Measurement Criteria § 201.25 Measurement location and weather...

  20. 40 CFR 201.25 - Measurement location and weather conditions for measurement on receiving property of the noise of...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Measurement location and weather conditions for measurement on receiving property of the noise of retarders, car coupling, locomotive load... EQUIPMENT; INTERSTATE RAIL CARRIERS Measurement Criteria § 201.25 Measurement location and weather...

  1. 40 CFR 201.25 - Measurement location and weather conditions for measurement on receiving property of the noise of...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Measurement location and weather conditions for measurement on receiving property of the noise of retarders, car coupling, locomotive load... EQUIPMENT; INTERSTATE RAIL CARRIERS Measurement Criteria § 201.25 Measurement location and weather...

  2. The Role of Condition-Specific Preference-Based Measures in Health Technology Assessment.

    PubMed

    Rowen, Donna; Brazier, John; Ara, Roberta; Azzabi Zouraq, Ismail

    2017-12-01

    A condition-specific preference-based measure (CSPBM) is a measure of health-related quality of life (HRQOL) that is specific to a certain condition or disease and that can be used to obtain the quality adjustment weight of the quality-adjusted life-year (QALY) for use in economic models. This article provides an overview of the role and the development of CSPBMs, and presents a description of existing CSPBMs in the literature. The article also provides an overview of the psychometric properties of CSPBMs in comparison with generic preference-based measures (generic PBMs), and considers the advantages and disadvantages of CSPBMs in comparison with generic PBMs. CSPBMs typically include dimensions that are important for that condition but may not be important across all patient groups. There are a large number of CSPBMs across a wide range of conditions, and these vary from covering a wide range of dimensions to more symptomatic or uni-dimensional measures. Psychometric evidence is limited but suggests that CSPBMs offer an advantage in more accurate measurement of milder health states. The mean change and standard deviation can differ for CSPBMs and generic PBMs, and this may impact on incremental cost-effectiveness ratios. CSPBMs have a useful role in HTA where a generic PBM is not appropriate, sensitive or responsive. However, due to issues of comparability across different patient groups and interventions, their usage in health technology assessment is often limited to conditions where it is inappropriate to use a generic PBM or sensitivity analyses.

  3. Methodology for Assessing the Probability of Corrosion in Concrete Structures on the Basis of Half-Cell Potential and Concrete Resistivity Measurements

    PubMed Central

    2013-01-01

    In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly, the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential Ecorr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures. PMID:23766706

  4. Methodology for assessing the probability of corrosion in concrete structures on the basis of half-cell potential and concrete resistivity measurements.

    PubMed

    Sadowski, Lukasz

    2013-01-01

    In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly, the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential Ecorr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.
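
    An illustrative combination rule for a single grid point, using threshold bands commonly quoted in the literature (e.g., the ASTM C876 half-cell potential ranges and typical resistivity bands); the exact cut-offs used in the paper may differ.

```python
def corrosion_assessment(potential_mV_CSE, resistivity_kohm_cm):
    """Classify one grid point from half-cell potential and resistivity.
    Thresholds follow commonly quoted literature values, not the paper's."""
    if potential_mV_CSE > -200:
        p = "low probability (<10%)"
    elif potential_mV_CSE >= -350:
        p = "uncertain"
    else:
        p = "high probability (>90%)"
    if resistivity_kohm_cm > 20:
        r = "negligible expected rate"
    elif resistivity_kohm_cm >= 10:
        r = "low-to-moderate expected rate"
    else:
        r = "high expected rate"
    return p, r

# example grid point
print(corrosion_assessment(-380.0, 8.5))
# ('high probability (>90%)', 'high expected rate')
```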

  5. Anonymous indexing of health conditions for a similarity measure.

    PubMed

    Song, Insu; Marsh, Nigel V

    2012-07-01

    A health social network is an online information service which facilitates information sharing between closely related members of a community with the same or a similar health condition. Over the years, many automated recommender systems have been developed for social networking in order to help users find their communities of interest. For health social networking, the ideal source of information for measuring similarities of patients is the medical information of the patients. However, it is not desirable that such sensitive and private information be shared over the Internet. This is also true for many other security sensitive domains. A new information-sharing scheme is developed where each patient is represented as a small number of (possibly disjoint) d-words (discriminant words) and the d-words are used to measure similarities between patients without revealing sensitive personal information. The d-words are simple words like "food,'' and thus do not contain identifiable personal information. This makes our method an effective one-way hashing of patient assessments for a similarity measure. The d-words can be easily shared on the Internet to find peers who might have similar health conditions.
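
    A minimal sketch of the matching step: each patient is reduced to a small set of non-identifying d-words, and peers are ranked by set overlap. Jaccard similarity is used here as a stand-in; the paper's exact similarity measure may differ.

```python
def jaccard(a: set, b: set) -> float:
    """Set-overlap similarity between two d-word sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

# hypothetical d-word profiles; the words carry no identifiable information
patient_1 = {"food", "sleep", "fatigue"}
patient_2 = {"food", "fatigue", "exercise"}
patient_3 = {"mood", "memory"}

print(jaccard(patient_1, patient_2))  # 0.5 -> likely peers
print(jaccard(patient_1, patient_3))  # 0.0 -> unrelated conditions
```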

  6. On the methodology of Engineering Geodesy

    NASA Astrophysics Data System (ADS)

    Brunner, Fritz K.

    2007-09-01

    Textbooks on geodetic surveying usually describe a very small number of principles which should provide the foundation of geodetic surveying. Here, the author argues that an applied field, such as engineering geodesy, has a methodology as foundation rather than a few principles. Ten methodological elements (ME) are identified: (1) Point discretisation of natural surfaces and objects, (2) distinction between coordinate and observation domain, (3) definition of reference systems, (4) specification of unknown parameters and desired precisions, (5) geodetic network and observation design, (6) quality control of equipment, (7) quality control of measurements, (8) establishment of measurement models, (9) establishment of parameter estimation models, (10) quality control of results. Each ME consists of a suite of theoretical developments, geodetic techniques and calculation procedures, which will be discussed. This paper is to be considered a first attempt at identifying the specific elements of the methodology of engineering geodesy. A better understanding of this methodology could lead to an increased objectivity, to a transformation of subjective practical experiences into objective working methods, and consequently to a new structure for teaching this rather diverse subject.

  7. Efficient solution methodology for calibrating the hemodynamic model using functional Magnetic Resonance Imaging (fMRI) measurements.

    PubMed

    Zambri, Brian; Djellouli, Rabia; Laleg-Kirati, Taous-Meriem

    2015-08-01

    Our aim is to propose a numerical strategy for accurately and efficiently retrieving the biophysiological parameters, as well as the external stimulus characteristics, of the hemodynamic mathematical model that describes changes in blood flow and blood oxygenation during brain activation. The proposed method employs the TNM-CKF method developed in [1], but in a prediction/correction framework. We present numerical results using both real and synthetic functional Magnetic Resonance Imaging (fMRI) measurements to highlight the performance characteristics of this computational methodology.
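
    For orientation, the generic structure of a prediction/correction parameter update is sketched below in the standard extended-Kalman form; this is an illustration only and does not reproduce the TNM-CKF method of [1].

```python
import numpy as np

def predict_correct(theta, P, y_obs, h, H, Q, R):
    """One prediction/correction step for a parameter vector theta.
    h(theta): model output; H(theta): Jacobian of h. Standard extended-Kalman
    structure, shown only to illustrate the prediction/correction idea."""
    theta_pred, P_pred = theta, P + Q          # predict: quasi-static parameters
    Hm = H(theta_pred)
    S = Hm @ P_pred @ Hm.T + R                 # innovation covariance
    K = P_pred @ Hm.T @ np.linalg.inv(S)       # gain
    theta_new = theta_pred + K @ (y_obs - h(theta_pred))
    P_new = (np.eye(theta.size) - K @ Hm) @ P_pred
    return theta_new, P_new

# toy linear usage: estimate theta from y = A theta + noise
A = np.array([[1.0, 0.5], [0.2, 1.0]])
theta, P = np.zeros(2), np.eye(2)
y_obs = np.array([1.2, 0.9])
theta, P = predict_correct(theta, P, y_obs,
                           h=lambda th: A @ th, H=lambda th: A,
                           Q=1e-3 * np.eye(2), R=0.05 * np.eye(2))
print(theta)
```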

  8. Quality of life assessment in children: a review of conceptual and methodological issues in multidimensional health status measures.

    PubMed Central

    Pal, D K

    1996-01-01

    STUDY OBJECTIVE: To clarify concepts and methodological problems in existing multidimensional health status measures for children. DESIGN: Thematic review of instruments found by computerised and manual searches, 1979-95. SUBJECTS: Nine health status instruments. MAIN RESULTS: Many instruments did not satisfy criteria of being child centered or family focussed; few had sufficient psychometric properties for research or clinical use; underlying conceptual assumptions were rarely explicit. CONCLUSIONS: Quality of life measures should be viewed cautiously. Interdisciplinary discussion is required, as well as discussion with children and parents, to establish constructs that are truly useful. PMID:8882220

  9. Measuring systems of hard to get objects: problems with analysis of measurement results

    NASA Astrophysics Data System (ADS)

    Gilewska, Grazyna

    2005-02-01

    The problem of limited access to the metrological parameters of objects arises in many measurements, especially for biological objects, whose parameters are very often determined on the basis of indirect research. Random components predominate in the formation of measurement results when access to the measured object is very limited. Every measuring process is subject to conditions that limit the ways it can be modified (e.g., increasing the number of measurement repetitions to decrease the random limiting error). These may be temporal or financial limitations or, in the case of biological objects, a small sample volume, the influence of the measuring tool and observer on the object, or fatigue effects (e.g., in a patient). Taking these difficulties into consideration, the author worked out and verified the practical application of methods for the reduction of outlying observations and, subsequently, innovative methods for the elimination of measured data with excess variance, in order to decrease the standard deviation of the mean of the measured data with a limited amount of data and an accepted level of confidence. The elaborated methods were verified on the basis of knee-joint space width measurements obtained from radiographs. Measurements were carried out indirectly on digital images of the radiographs. The results of the examination confirmed the legitimacy of using the elaborated methodology and measurement procedures. Such a methodology is of special importance when standard scientific approaches do not bring the expected effects.
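
    A sketch of the outlier-reduction idea with a limited sample: iteratively discard observations whose deviation exceeds a coverage criterion, then report the reduced standard deviation of the mean. A simple 2-sigma rule and synthetic joint-space widths stand in for the author's specific rejection and excess-variance criteria.

```python
import numpy as np

# synthetic joint-space widths [mm] with one outlying observation
x = np.array([5.1, 5.2, 5.0, 5.3, 5.1, 6.4, 5.2, 5.0, 5.1, 5.2])

data = x.copy()
while True:
    m, s = data.mean(), data.std(ddof=1)
    keep = np.abs(data - m) <= 2 * s      # stand-in rejection criterion
    if keep.all():
        break
    data = data[keep]

print(f"n: {x.size} -> {data.size}")
print(f"std of mean: {x.std(ddof=1) / np.sqrt(x.size):.3f} -> "
      f"{data.std(ddof=1) / np.sqrt(data.size):.3f} mm")
```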

  10. Conditional Inference and Logic for Intelligent Systems: A Theory of Measure-Free Conditioning

    DTIC Science & Technology

    1991-08-01

    work in this direction was executed. 1 Bruno and Gilio (1985), inspired by DeFinetti's much earlier work, proposed an abbreviated algebra of measure...See also Section 1.5.) Based on DeFinetti's work, but independent of Bruno and Gilio, Darigelli and Scozzafava (1984) mentioned the lack of apparent...1.5 below. 1.5 Logical operations among conditional events Schay (1968), Bruno and Gilio (1985) and Calabrese (1987) contain developments 36 A Survey of

  11. Local conditional entropy in measure for covers with respect to a fixed partition

    NASA Astrophysics Data System (ADS)

    Romagnoli, Pierre-Paul

    2018-05-01

    In this paper we introduce two measure theoretical notions of conditional entropy for finite measurable covers conditioned to a finite measurable partition and prove that they are equal. Using this we state a local variational principle with respect to the notion of conditional entropy defined by Misiurewicz (1976 Stud. Math. 55 176–200) for the case of open covers. This in particular extends the work done in Romagnoli (2003 Ergod. Theor. Dynam. Syst. 23 1601–10), Glasner and Weiss (2006 Handbook of Dynamical Systems vol 1B (Amsterdam: Elsevier)) and Huang et al (2006 Ergod. Theor. Dynam. Syst. 26 219–45).

  12. Sources of particulate matter components in the Athabasca oil sands region: investigation through a comparison of trace element measurement methodologies

    NASA Astrophysics Data System (ADS)

    Phillips-Smith, Catherine; Jeong, Cheol-Heon; Healy, Robert M.; Dabek-Zlotorzynska, Ewa; Celo, Valbona; Brook, Jeffrey R.; Evans, Greg

    2017-08-01

    The province of Alberta, Canada, is home to three oil sands regions which, combined, contain the third largest deposit of oil in the world. Of these, the Athabasca oil sands region is the largest. As part of Environment and Climate Change Canada's program in support of the Joint Canada-Alberta Implementation Plan for Oil Sands Monitoring program, concentrations of trace elements in PM2.5 (particulate matter smaller than 2.5 µm in diameter) were measured through two campaigns that involved different methodologies: a long-term filter campaign and a short-term intensive campaign. In the long-term campaign, 24 h filter samples were collected once every 6 days over a 2-year period (December 2010-November 2012) at three air monitoring stations in the regional municipality of Wood Buffalo. For the intensive campaign (August 2013), hourly measurements were made with an online instrument at one air monitoring station; daily filter samples were also collected. The hourly and 24 h filter data were analyzed individually using positive matrix factorization. Seven emission sources of PM2.5 trace elements were thereby identified: two types of upgrader emissions, soil, haul road dust, biomass burning, and two sources of mixed origin. The upgrader emissions, soil, and haul road dust sources were identified by both methodologies; both methodologies also identified a mixed source, but these mixed sources exhibited more differences than similarities. The second upgrader emissions and biomass burning sources were resolved only by the hourly and filter methodologies, respectively. The similarity of the receptor modeling results from the two methodologies provided reassurance as to the identity of the sources. Overall, much of the PM2.5-related trace elements were found to be anthropogenic, or at least to be aerosolized through anthropogenic activities. These emissions may in part explain the previously reported higher levels of trace elements in snow, water, and biota samples collected
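    Positive matrix factorization minimizes an uncertainty-weighted residual; as a rough, unweighted stand-in, the sketch below factors a hypothetical sample-by-element matrix into seven non-negative factors with scikit-learn's NMF, mirroring the seven sources reported.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
# Hypothetical matrix: rows = hourly samples, columns = trace elements
X = rng.random((500, 20))

# PMF proper weights each residual by its measurement uncertainty; plain
# NMF is an unweighted simplification used here only for illustration.
model = NMF(n_components=7, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)   # factor contributions per sample
F = model.components_        # factor profiles (element signatures)
print(G.shape, F.shape)      # (500, 7) (7, 20)
```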

  13. Guidelines for measuring the physical, chemical, and biological condition of wilderness ecosystems

    Treesearch

    Douglas G Fox; J. Christopher Bernabo; Betsy Hood

    1987-01-01

    Guidelines include a large number of specific measures to characterize the existing condition of wilderness resources. Measures involve the atmospheric environment, water chemistry and biology, geology and soils, and flora. Where possible, measures are coordinated with existing long-term monitoring programs. Application of the measures will allow more effective...

  14. Optimization of the production conditions of the lipase produced by Bacillus cereus from rice flour through Plackett-Burman Design (PBD) and response surface methodology (RSM).

    PubMed

    Vasiee, Alireza; Behbahani, Behrooz Alizadeh; Yazdi, Farideh Tabatabaei; Moradi, Samira

    2016-12-01

    In this study, the screening of lipase-positive bacteria from rice flour was carried out by the Rhodamine B agar plate method. Bacillus cereus was identified by the 16S rDNA method. Screening of the appropriate variables and optimization of the lipase production were performed using Plackett-Burman design (PBD) and response surface methodology (RSM). Among the isolated bacteria, an aerobic Bacillus cereus strain was recognized as the best lipase-producing bacterium (177.3 ± 20 U/ml). Given the results, the optimal enzyme production conditions were achieved with a coriander seed extract (CSE)/yeast extract ratio of 16.9 w/w and olive oil (OO) and MgCl2 concentrations of 2.37 g/L and 24.23 mM, respectively. Under these conditions, the predicted lipase activity (LA) was 343 U/mL, close to the experimental value (324 U/mL), a 1.83-fold increase in LA compared with the non-optimized lipase. The kinetic parameters Vmax and Km for the lipase were measured as 0.367 μM/(min·mL) and 5.3 mM, respectively. The lipase-producing Bacillus cereus was isolated and RSM was used for the optimization of enzyme production. The CSE/yeast extract ratio of 16.9 w/w, OO concentration of 2.37 g/L, and MgCl2 concentration of 24.23 mM were found to be the optimal conditions for the enzyme production process. LA under the optimal production conditions was 1.83 times higher than under the non-optimal conditions. Ultimately, it can be concluded that the isolated B. cereus from rice flour is a proper source of lipase. Copyright © 2016 Elsevier Ltd. All rights reserved.
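    The study's design matrices are not reproduced here; the sketch below illustrates the core RSM step, fitting a second-order polynomial response surface to hypothetical design points and locating its optimum on a grid. Variable ranges and the response function are invented.

```python
import numpy as np

# Sketch: fit y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to hypothetical design data, then locate the optimum on a grid.
rng = np.random.default_rng(3)
x1 = rng.uniform(5, 25, 30)    # e.g. CSE/yeast-extract ratio (w/w)
x2 = rng.uniform(5, 40, 30)    # e.g. MgCl2 concentration (mM)
y = 300 - (x1 - 17)**2 - 0.5 * (x2 - 24)**2 + rng.normal(0, 5, 30)

A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

g1, g2 = np.meshgrid(np.linspace(5, 25, 200), np.linspace(5, 40, 200))
G = np.column_stack([np.ones(g1.size), g1.ravel(), g2.ravel(),
                     g1.ravel()**2, g2.ravel()**2, (g1 * g2).ravel()])
i = (G @ beta).argmax()
print(f"predicted optimum near x1={g1.ravel()[i]:.1f}, x2={g2.ravel()[i]:.1f}")
```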

  15. A new metric for measuring condition in large predatory sharks.

    PubMed

    Irschick, D J; Hammerschlag, N

    2014-09-01

    A simple metric (span condition analysis; SCA) is presented for quantifying the condition of sharks based on four measurements of body girth relative to body length. Data on 104 live sharks from four species that vary in body form, behaviour and habitat use (Carcharhinus leucas, Carcharhinus limbatus, Ginglymostoma cirratum and Galeocerdo cuvier) are given. Condition shows similar levels of variability among individuals within each species. Carcharhinus leucas showed a positive relationship between condition and body size, whereas the other three species showed no relationship. There was little evidence for strong differences in condition between males and females, although more male sharks are needed for some species (e.g. G. cuvier) to verify this finding. SCA is potentially viable for other large marine or terrestrial animals that are captured live and then released. © 2014 The Fisheries Society of the British Isles.
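    The SCA formula itself is not given in the abstract; one simple way to express girth-based condition, sketched below with hypothetical measurements, is the residual from a girth-on-length regression (positive residuals indicate stouter-than-expected animals).

```python
import numpy as np

# Hypothetical precaudal lengths (cm) and mean body girths (cm)
length = np.array([180, 195, 210, 225, 240, 255, 270])
girth = np.array([92, 100, 104, 115, 118, 130, 133])

# Condition as the residual of a linear girth-on-length fit:
# positive = stouter than expected for its length, negative = leaner.
slope, intercept = np.polyfit(length, girth, 1)
condition = girth - (slope * length + intercept)
print(np.round(condition, 1))
```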

  16. Measuring the impact of methodological research: a framework and methods to identify evidence of impact.

    PubMed

    Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F

    2014-11-27

    Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally. We have used the information

  17. Rat sperm motility analysis: methodologic considerations

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  18. Social cognition interventions for people with schizophrenia: a systematic review focussing on methodological quality and intervention modality.

    PubMed

    Grant, Nina; Lawrence, Megan; Preti, Antonio; Wykes, Til; Cella, Matteo

    2017-08-01

    People with a diagnosis of schizophrenia have significant social and functional difficulties. Social cognition has been found to influence these outcomes, and in recent years interventions targeting this domain have been developed. This paper reviews the existing literature on social cognition interventions for people with a diagnosis of schizophrenia, focussing on: i) comparing focussed (i.e. targeting only one social cognitive domain) and global interventions, and ii) the methodological quality of the studies. A systematic search was conducted on PubMed and PsycInfo. Studies were included if they were randomised controlled trials, participants had a diagnosis of schizophrenia or schizoaffective disorder, and the intervention targeted at least one of four social cognition domains (i.e. theory of mind, affect recognition, social perception and attribution bias). All papers were assessed for methodological quality. Information on the intervention, control condition, study methodology and the main findings from each study were extracted and critically summarised. Data from 32 studies fulfilled the inclusion criteria, considering a total of 1440 participants. Taking part in social cognition interventions produced significant improvements in theory of mind and affect recognition compared to both passive and active control conditions. Results were less clear for social perception and attributional bias. Focussed and global interventions had similar results on outcomes. Overall study methodological quality was modest. There was very limited evidence showing that social cognition interventions result in functional outcome improvement. The evidence considered suggests that social cognition interventions may be a valuable approach for people with a diagnosis of schizophrenia. However, evidence quality is limited by measure heterogeneity, modest study methodology and short follow-up periods. The findings point to a number of recommendations for future research, including measurement standardisation

  19. Optimization of Manufacturing Conditions for Improving Storage Stability of Coffee-Supplemented Milk Beverage Using Response Surface Methodology.

    PubMed

    Ahn, Sung-Il; Park, Jun-Hong; Kim, Jae-Hoon; Oh, Duk-Geun; Kim, Moojoong; Chung, Donghwa; Jhoo, Jin-Woo; Kim, Gur-Yoo

    2017-01-01

    This study aimed at optimizing the manufacturing conditions of a milk beverage supplemented with coffee, and monitoring its physicochemical and sensory properties during storage. Raw milk, skim milk powder, coffee extract, and emulsifiers were used to manufacture the beverage. Two sucrose fatty acid esters, F110 and F160, were identified as suitable emulsifiers. The optimum conditions for beverage manufacture, satisfying two conditions at the same time, were determined by response surface methodology (RSM) to be a primary homogenization speed of 5,000 rpm and a sucrose fatty acid emulsifier addition of 0.207%. The particle size and zeta-potential of the beverage under the optimum conditions were 190.1 nm and -25.94±0.06 mV, respectively. In a comparison between the F110 group (GF110) and the F160 group (GF160) during storage, all samples maintained their pH around 6.6 to 6.7, with no significant difference (p<0.05). In addition, GF110 showed significantly higher zeta-potential than GF160 (p<0.05). The particle sizes of GF110 and GF160 were initially approximately 190.1 and 223.1 nm, respectively. However, the size distribution of GF160 tended to increase during storage, and an increase in the particle size of GF160 was also observed in microphotographs taken during storage. The L* values gradually decreased within all groups, whereas the a* and b* values did not show significant variations (p<0.05). Compared with GF160, bitterness, floating cream, and rancid flavor were more pronounced in GF110. Based on the results obtained in the present study, it appears that the sucrose fatty acid ester F110 is a more suitable emulsifier for manufacturing this beverage than F160, and also contributes to extending product shelf-life.

  20. Optimization of Manufacturing Conditions for Improving Storage Stability of Coffee-Supplemented Milk Beverage Using Response Surface Methodology

    PubMed Central

    Kim, Jae-Hoon; Oh, Duk-Geun; Kim, Moojoong; Chung, Donghwa

    2017-01-01

    This study aimed at optimizing the manufacturing conditions of a milk beverage supplemented with coffee, and monitoring its physicochemical and sensory properties during storage. Raw milk, skim milk powder, coffee extract, and emulsifiers were used to manufacture the beverage. Two sucrose fatty acid esters, F110 and F160, were identified as suitable emulsifiers. The optimum conditions for beverage manufacture, satisfying two conditions at the same time, were determined by response surface methodology (RSM) to be a primary homogenization speed of 5,000 rpm and a sucrose fatty acid emulsifier addition of 0.207%. The particle size and zeta-potential of the beverage under the optimum conditions were 190.1 nm and -25.94±0.06 mV, respectively. In a comparison between the F110 group (GF110) and the F160 group (GF160) during storage, all samples maintained their pH around 6.6 to 6.7, with no significant difference (p<0.05). In addition, GF110 showed significantly higher zeta-potential than GF160 (p<0.05). The particle sizes of GF110 and GF160 were initially approximately 190.1 and 223.1 nm, respectively. However, the size distribution of GF160 tended to increase during storage, and an increase in the particle size of GF160 was also observed in microphotographs taken during storage. The L* values gradually decreased within all groups, whereas the a* and b* values did not show significant variations (p<0.05). Compared with GF160, bitterness, floating cream, and rancid flavor were more pronounced in GF110. Based on the results obtained in the present study, it appears that the sucrose fatty acid ester F110 is a more suitable emulsifier for manufacturing this beverage than F160, and also contributes to extending product shelf-life. PMID:28316475

  1. Conditioning a segmented stem profile model for two diameter measurements

    Treesearch

    Raymond L. Czaplewski; Joe P. Mcclure

    1988-01-01

    The stem profile model of Max and Burkhart (1976) is conditioned for dbh and a second upper-stem measurement. This model was applied to a loblolly pine data set using diameter outside bark at 5.3 m (i.e., the 17.3-foot height used for Girard form class) as the second upper-stem measurement, and then compared to the original, unconditioned model. Variance of residuals was reduced...
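    Conditioning a taper model on two measured diameters amounts to an equality-constrained least-squares fit; the sketch below solves the KKT (Lagrange-multiplier) system for a simple quadratic taper model and invented stem data, rather than the segmented Max-Burkhart form used in the paper.

```python
import numpy as np

# Sketch: fit d(h) = a + b*h + c*h^2 to hypothetical stem diameters,
# constrained to pass exactly through dbh (1.37 m) and the 5.3 m point.
h = np.array([0.5, 1.37, 3.0, 5.3, 8.0, 12.0, 16.0])     # heights (m)
d = np.array([34.0, 30.0, 27.5, 24.0, 20.5, 15.0, 9.0])  # diameters (cm)

X = np.column_stack([np.ones_like(h), h, h**2])
hc = np.array([1.37, 5.3])                 # constrained heights
C = np.column_stack([np.ones_like(hc), hc, hc**2])
dc = np.array([30.0, 24.0])                # measured dbh and upper diameter

# KKT system for: minimize ||X beta - d||^2  subject to  C beta = dc
K = np.block([[2 * X.T @ X, C.T],
              [C, np.zeros((2, 2))]])
rhs = np.concatenate([2 * X.T @ d, dc])
beta = np.linalg.solve(K, rhs)[:3]
print(np.round(C @ beta, 3))               # reproduces [30.0, 24.0] exactly
```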

  2. An evaluation of total starch and starch gelatinization methodologies in pelleted animal feed.

    PubMed

    Zhu, L; Jones, C; Guo, Q; Lewis, L; Stark, C R; Alavi, S

    2016-04-01

    The quantification of total starch content (TS) or degree of starch gelatinization (DG) in animal feed is always challenging because of the potential interference from other ingredients. In this study, the differences in TS or DG measurement in pelleted swine feed due to variations in analytical methodology were quantified. Pelleted swine feed was used to create 6 different diets manufactured with various processing conditions in a 2 × 3 factorial design (2 conditioning temperatures, 77 or 88°C, and 3 conditioning retention times, 15, 30, or 60 s). Samples at each processing stage (cold mash, hot mash, hot pelletized feed, and final cooled pelletized feed) were collected for each of the 6 treatments and analyzed for TS and DG. Two different methodologies were evaluated for TS determination (the AOAC International method 996.11 vs. the modified glucoamylase method) and DG determination (the modified glucoamylase method vs. differential scanning calorimetry [DSC]). For TS determination, the AOAC International method 996.11 measured lower TS values in cold pellets compared with the modified glucoamylase method. The AOAC International method resulted in lower TS in cold mash than cooled pelletized feed, whereas the modified glucoamylase method showed no significant differences in TS content before or after pelleting. For DG, the modified glucoamylase method demonstrated increased DG with each processing step. Furthermore, increasing the conditioning temperature and time resulted in a greater DG when evaluated by the modified glucoamylase method. However, results demonstrated that DSC is not suitable as a quantitative tool for determining DG in multicomponent animal feeds due to interferences from nonstarch transformations, such as protein denaturation.

  3. Moving to Capture Children’s Attention: Developing a Methodology for Measuring Visuomotor Attention

    PubMed Central

    Coats, Rachel O.; Mushtaq, Faisal; Williams, Justin H. G.; Aucott, Lorna S.; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child’s development. However, methodological limitations currently make large-scale assessment of children’s attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of ‘Visual Motor Attention’ (VMA)—a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method’s core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults’ attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action). PMID:27434198

  4. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  5. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  6. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  7. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  8. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  9. Methodology for dynamic biaxial tension testing of pregnant uterine tissue.

    PubMed

    Manoogian, Sarah; Mcnally, Craig; Calloway, Britt; Duma, Stefan

    2007-01-01

    Placental abruption accounts for 50% to 70% of fetal losses in motor vehicle crashes. Since automobile crashes are the leading cause of traumatic fetal injury mortality in the United States, research on this injury mechanism is important. Before research can adequately evaluate current and future restraint designs, a detailed model of the pregnant uterine tissues is necessary. The purpose of this study is to develop a methodology for testing the pregnant uterus in biaxial tension at a rate normally seen in a motor vehicle crash. Since the majority of previous biaxial work has established methods for quasi-static testing, this paper combines previous research and new methods to develop a custom-designed system to strain the tissue at a dynamic rate. Load cells and optical markers are used to calculate stress-strain curves along the two perpendicular loading axes. Results for this methodology include images of a loaded tissue specimen and a finite element verification of the optical strain measurement. The biaxial test system dynamically pulls the tissue to failure with synchronous motion of four tissue grips that are rigidly coupled to the tissue specimen. The test device models in situ loading conditions of the pregnant uterus and overcomes previous limitations of biaxial testing. A non-contact method of measuring strains, combined with data reduction to resolve the stresses in two directions, provides the information necessary to develop a three-dimensional constitutive model of the material. Moreover, future research can apply this method to other soft tissues with similar in situ loading conditions.

  10. Tolerance limits and methodologies for IMRT measurement-based verification QA: Recommendations of AAPM Task Group No. 218.

    PubMed

    Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A

    2018-04-01

    Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics is investigated. Measurement methods are reviewed and followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems is reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the γ tool accuracy and subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines and choice of tolerance limits for IMRT QA are made with focus on detecting differences between calculated and measured doses via the use of robust analysis methods and an in-depth understanding of IMRT verification metrics. The recommendations are intended to improve the IMRT QA process and establish consistent, and comparable IMRT QA
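    As a concrete illustration of the γ metric the report centers on, the sketch below computes a brute-force one-dimensional global γ with illustrative 3%/2 mm criteria on synthetic dose profiles; clinical implementations add interpolation, low-dose thresholds, and 2D/3D searches.

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, x, dd=0.03, dta=2.0):
    """Brute-force global gamma: dd is a fraction of the reference
    maximum, dta is in the same units as the positions x (e.g. mm)."""
    norm = dd * dose_ref.max()
    gam = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        term = ((x - xi) / dta) ** 2 + ((dose_eval - di) / norm) ** 2
        gam[i] = np.sqrt(term.min())
    return gam

x = np.arange(0.0, 100.0, 1.0)                   # positions (mm)
ref = np.exp(-0.5 * ((x - 50) / 15) ** 2)        # "measured" profile
ev = np.exp(-0.5 * ((x - 51) / 15) ** 2) * 1.01  # "calculated" profile
g = gamma_1d(ref, ev, x)
print(f"gamma pass rate (gamma <= 1): {100 * (g <= 1).mean():.1f}%")
```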

  11. Combining tracer flux ratio methodology with low-flying aircraft measurements to estimate dairy farm CH4 emissions

    NASA Astrophysics Data System (ADS)

    Daube, C.; Conley, S.; Faloona, I. C.; Yacovitch, T. I.; Roscioli, J. R.; Morris, M.; Curry, J.; Arndt, C.; Herndon, S. C.

    2017-12-01

    Livestock activity, enteric fermentation of feed and anaerobic digestion of waste, contributes significantly to the methane budget of the United States (EPA, 2016). Studies question the reported magnitude of these methane sources (Miller et al., 2013), calling for more detailed research of agricultural animals (Hristov, 2014). Tracer flux ratio is an attractive experimental method to bring to this problem because it does not rely on estimates of atmospheric dispersion. Data were collected during one week at two dairy farms in central California (June 2016). The farms differed in size, layout, head count, and general operation. The tracer flux ratio method involves releasing ethane on-site at a known flow rate to serve as a tracer gas. Downwind mixed enhancements in ethane (from the tracer) and methane (from the dairy) were measured, and their ratio used to infer the unknown methane emission rate from the farm. An instrumented van drove transects downwind of each farm on public roads while tracer gases were released on-site, employing the tracer flux ratio methodology to assess simultaneous methane and tracer gas plumes. Flying circles around each farm, a small instrumented aircraft made measurements to perform a mass balance evaluation of methane gas. In the course of these two different methane quantification techniques, we were able to validate yet a third method: tracer flux ratio measured via aircraft. Ground-based tracer release rates were applied to the aircraft-observed methane-to-ethane ratios, yielding whole-site methane emission rates. Never before has the tracer flux ratio method been executed with aircraft measurements. Estimates from this new application closely resemble results from the standard ground-based technique to within their respective uncertainties. Incorporating this new dimension to the tracer flux ratio methodology provides additional context for local plume dynamics and validation of both ground and flight-based data.
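    The core arithmetic of the tracer flux ratio method is simple: the unknown emission rate equals the known tracer release rate scaled by the ratio of downwind enhancements, converted from a molar to a mass basis. The numbers below are purely illustrative.

```python
# Tracer flux ratio arithmetic (illustrative numbers; enhancements are
# background-subtracted molar mixing ratios from downwind transects).
M_CH4, M_C2H6 = 16.04, 30.07    # molar masses, g/mol

q_tracer_kg_h = 2.0             # known ethane release rate (kg/h)
d_ch4_ppb = 120.0               # methane enhancement over background
d_c2h6_ppb = 15.0               # ethane enhancement over background

molar_ratio = d_ch4_ppb / d_c2h6_ppb
q_ch4_kg_h = q_tracer_kg_h * molar_ratio * (M_CH4 / M_C2H6)
print(f"inferred CH4 emission: {q_ch4_kg_h:.1f} kg/h")
```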

  12. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at the altitude for Mach 1, the altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.

  13. The application of conditioning paradigms in the measurement of pain

    PubMed Central

    Li, Jun-Xu

    2013-01-01

    Pain is a private experience that involves both sensory and emotional components. In animal studies, pain can only be inferred from responses, and therefore the measurement of reflexive responses has dominated the pain literature for nearly a century. It has been argued that although reflexive responses are important to unveil the sensory nature of pain in organisms, pain affect is equally important but largely ignored in pain studies, primarily due to the lack of validated animal models. One strategy to begin to understand pain affect is to use conditioning principles to indirectly reveal the affective condition of pain. This review critically analyzed several procedures that are thought to measure affective learning of pain. The procedures regarding the current knowledge, the applications, and their advantages and disadvantages in pain research are discussed. It is proposed that these procedures should be combined with traditional reflex-based pain measurements in future studies of pain, which could greatly benefit both the understanding of neural underpinnings of pain and preclinical assessment of novel analgesics. PMID:23500202

  14. The application of conditioning paradigms in the measurement of pain.

    PubMed

    Li, Jun-Xu

    2013-09-15

    Pain is a private experience that involves both sensory and emotional components. In animal studies, pain can only be inferred from responses, and therefore the measurement of reflexive responses has dominated the pain literature for nearly a century. It has been argued that although reflexive responses are important to unveil the sensory nature of pain in organisms, pain affect is equally important but largely ignored in pain studies, primarily due to the lack of validated animal models. One strategy to begin to understand pain affect is to use conditioning principles to indirectly reveal the affective condition of pain. This review critically analyzed several procedures that are thought to measure affective learning of pain. The procedures regarding the current knowledge, the applications, and their advantages and disadvantages in pain research are discussed. It is proposed that these procedures should be combined with traditional reflex-based pain measurements in future studies of pain, which could greatly benefit both the understanding of neural underpinnings of pain and preclinical assessment of novel analgesics. © 2013 Elsevier B.V. All rights reserved.

  15. Methodology issues in implementation science.

    PubMed

    Newhouse, Robin; Bobay, Kathleen; Dykes, Patricia C; Stevens, Kathleen R; Titler, Marita

    2013-04-01

    Putting evidence into practice at the point of care delivery requires an understanding of implementation strategies that work, in what context and how. To identify methodological issues in implementation science using 4 studies as cases and make recommendations for further methods development. Four cases are presented and methodological issues identified. For each issue raised, evidence on the state of the science is described. Issues in implementation science identified include diverse conceptual frameworks, potential weaknesses in pragmatic study designs, and the paucity of standard concepts and measurement. Recommendations to advance methods in implementation include developing a core set of implementation concepts and metrics, generating standards for implementation methods including pragmatic trials, mixed methods designs, complex interventions and measurement, and endorsing reporting standards for implementation studies.

  16. Measurement and meaning of salivary cortisol: a focus on health and disease in children.

    PubMed

    Jessop, David S; Turner-Cobb, Julie M

    2008-01-01

    Measurement of salivary cortisol can provide important information about hypothalamic-pituitary-adrenal (HPA) axis activity under normal conditions and in response to stress. However, there are many variables relating to the measurement of cortisol in saliva which may introduce error and therefore may render difficult the comparison and interpretation of data between, and within, laboratories. This review addresses the effects of gender, age, time and location of sampling, units of measurement, assay conditions and compliance with the protocol, all of which have the potential to impact upon the precision, accuracy and reliability of salivary cortisol measurements in the literature. Some of these factors are applicable to both adults and children, but the measurement of salivary cortisol in children introduces aspects of unique variability which demand special attention. The specific focus of this review is upon the somewhat neglected area of methodological variability of salivary cortisol measurement in children. In addition to these methodological issues, the review highlights the use of salivary cortisol measurements to provide information about HPA axis dysfunction associated with psycho- and patho-physiological conditions in children. Novel applications for salivary cortisol measurements in future research into HPA axis activity in children are also discussed.

  17. Methodology for modeling the devolatilization of refuse-derived fuel from thermogravimetric analysis of municipal solid waste components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fritsky, K.J.; Miller, D.L.; Cernansky, N.P.

    1994-09-01

    A methodology was introduced for modeling the devolatilization characteristics of refuse-derived fuel (RDF) in terms of temperature-dependent weight loss. The basic premise of the methodology is that RDF is modeled as a combination of select municipal solid waste (MSW) components. Kinetic parameters are derived for each component from thermogravimetric analyzer (TGA) data measured at a specific set of conditions. These experimentally derived parameters, along with user-derived parameters, are input into model equations to calculate thermograms for the components. The component thermograms are summed to create a composite thermogram that is an estimate of the devolatilization of the as-modeled RDF. The methodology has several attractive features as a thermal analysis tool for waste fuels. 7 refs., 10 figs., 3 tabs.
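    The paper's kinetic parameters are not reproduced in the abstract; the sketch below assumes first-order Arrhenius devolatilization per MSW component, integrates it over a constant heating rate, and mass-weights the component conversions into a composite thermogram. All parameter values are invented.

```python
import numpy as np

# Sketch: first-order Arrhenius devolatilization per MSW component,
# integrated over a constant heating rate, then mass-weighted into a
# composite RDF thermogram. Kinetic parameters below are invented.
R = 8.314            # gas constant, J/(mol K)
beta = 10.0 / 60.0   # heating rate, K/s
T = np.arange(400.0, 900.0, 1.0)   # temperature grid, K
dT = 1.0

def conversion(A, E):
    """Integrate d(alpha)/dT = (A/beta) exp(-E/RT) (1 - alpha) by Euler."""
    alpha = np.zeros_like(T)
    for i in range(1, T.size):
        rate = (A / beta) * np.exp(-E / (R * T[i - 1])) * (1 - alpha[i - 1])
        alpha[i] = min(alpha[i - 1] + rate * dT, 1.0)
    return alpha

components = {                     # (A [1/s], E [J/mol], mass fraction)
    "paper":   (1e9, 1.2e5, 0.5),
    "plastic": (1e13, 2.0e5, 0.3),
    "yard":    (1e8, 1.1e5, 0.2),
}
residual = np.ones_like(T)         # composite weight fraction remaining
for A, E, w in components.values():
    residual -= w * conversion(A, E)
print(f"residual weight fraction at {T[-1]:.0f} K: {residual[-1]:.2f}")
```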

  18. Probability distributions of continuous measurement results for conditioned quantum evolution

    NASA Astrophysics Data System (ADS)

    Franquet, A.; Nazarov, Yuli V.

    2017-02-01

    We address the statistics of continuous weak linear measurement on a few-state quantum system that is subject to a conditioned quantum evolution. For a conditioned evolution, both the initial and final states of the system are fixed: the latter is achieved by postselection at the end of the evolution. The statistics may drastically differ from the nonconditioned case, and the interference between initial and final states can be observed in the probability distributions of measurement outcomes as well as in the average values exceeding the conventional range of nonconditioned averages. We develop a proper formalism to compute the distributions of measurement outcomes, and evaluate and discuss the distributions in experimentally relevant setups. We demonstrate the manifestations of the interference between initial and final states in various regimes. We consider analytically simple examples of nontrivial probability distributions. We reveal peaks (or dips) at half-quantized values of the measurement outputs. We discuss in detail the case of zero overlap between initial and final states, demonstrating anomalously big average outputs and a sudden jump in the time-integrated output. We present and discuss the numerical evaluation of the probability distribution, aiming at extending the analytical results and describing a realistic experimental situation of a qubit in the regime of resonant fluorescence.

  19. Methodological rigor and citation frequency in patient compliance literature.

    PubMed Central

    Bruer, J T

    1982-01-01

    An exhaustive bibliography which assesses the methodological rigor of the patient compliance literature, and citation data from the Science Citation Index (SCI) are combined to determine if methodologically rigorous papers are used with greater frequency than substandard articles by compliance investigators. There are low, but statistically significant, correlations between methodological rigor and citation indicators for 138 patient compliance papers published in SCI source journals during 1975 and 1976. The correlation is not strong enough to warrant use of citation measures as indicators of rigor on a paper-by-paper basis. The data do suggest that citation measures might be developed as crude indicators of methodological rigor. There is no evidence that randomized trials are cited more frequently than studies that employ other experimental designs. PMID:7114334

  20. Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Lee, Y. -D.; Russell, D. A.; Orient, G. E.

    1999-01-01

    A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, delta J(sub eff) as the governing parameter. The methodology contains original and literature J and delta J solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.
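    The growth-law form is not spelled out in the abstract; a common choice consistent with delta J(sub eff) as the governing parameter is a Paris-type power law, sketched below with an elastic delta J estimate for an edge crack and invented constants.

```python
import numpy as np

# Sketch: Paris-type growth governed by dJ_eff, with an elastic estimate
# dJ = (dK)^2 / E and a closure factor U (dK_eff = U * dK).
# All constants below are invented, not the paper's calibrated values.
E = 200e9            # Young's modulus, Pa
C, m = 1e-10, 1.3    # da/dN = C * (dJ_eff)^m, with dJ_eff in J/m^2, a in m
U = 0.8              # crack-closure correction factor
dsigma = 120e6       # applied stress range, Pa

a, N = 1e-3, 0       # initial crack length (m), cycle counter
while a < 20e-3:
    dK = 1.12 * dsigma * np.sqrt(np.pi * a)   # edge-crack estimate
    dJ_eff = (U * dK) ** 2 / E                # J/m^2
    a += C * dJ_eff ** m                      # growth this cycle
    N += 1
print(f"cycles to grow 1 mm -> 20 mm: {N}")
```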

  1. Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Lee, Y.-D.; Russell, D. A.; Orient, G. E.

    1999-01-01

    A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, (Delta)J(sub eff), as the governing parameter. The methodology contains original and literature J and (Delta)J solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.

  2. Adaptation of EVIAVE methodology for monitoring and follow-up when evaluating the environmental impact of landfills

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arrieta, Gabriela, E-mail: tonina1903@hotmail.com; Requena, Ignacio, E-mail: requena@decsai.ugr.es; Toro, Javier, E-mail: jjtoroca@unal.edu.co

    Treatment and final disposal of Municipal Solid Waste can have a significant role in the generation of negative environmental impacts. As a prevention strategy, such activities are subjected to the process of Environmental Impact Assessment (EIA). Still, the follow-up of Environmental Management Plans or mitigation measures is limited, partly due to a lack of methodological approaches. In searching for possibilities, the University of Granada (Spain) developed a diagnostic methodology named EVIAVE, which allows one to quantify, by means of indexes, the environmental impact of landfills in view of their location and the conditions of exploitation. EVIAVE is applicable within the legal framework of the European Union and can be adapted to the environmental and legal conditions of other countries. This study entails its adaptation in Colombia, for the follow-up and control of the EIA process for landfills. Modifications involved inclusion of the environmental elements flora and fauna, and the evaluation of the environmental descriptors in agreement with the concept of vulnerability. The application of the modified EVIAVE in Colombian landfills allowed us to identify the elements affected by the operating conditions and maintenance. It may be concluded that this methodology is viable and effective for the follow-up and environmental control of EIA processes for landfills, and for analyzing the associated risks, as it takes into account related environmental threats and vulnerabilities. - Highlights: • A modified methodology is used to monitor and follow-up environmental impacts in landfills. • The improved methodology includes the vulnerability of flora and fauna to evaluate the environmental impact of landfills. • The methodology serves to identify and evaluate the sources of risk generated in the construction and siting of landfills. • Environmental vulnerability indicators improve effectiveness of the control and follow-up phases of landfill management.

  3. Investigation of Seepage Meter Measurements in Steady Flow and Wave Conditions.

    PubMed

    Russoniello, Christopher J; Michael, Holly A

    2015-01-01

    Water exchange between surface water and groundwater can modulate or generate ecologically important fluxes of solutes across the sediment-water interface. Seepage meters can directly measure fluid flux, but mechanical resistance and surface water dynamics may lead to inaccurate measurements. Tank experiments were conducted to determine effects of mechanical resistance on measurement efficiency and the occurrence of directional asymmetry that could lead to erroneous net flux measurements. Seepage meter efficiency was high (average of 93%) and consistent for inflow and outflow under steady flow conditions. Wave effects on seepage meter measurements were investigated in a wave flume. Seepage meter net flux measurements averaged 0.08 cm/h, greater than the expected net-zero flux, but significantly less than the theoretical wave-driven unidirectional discharge or recharge. Calculations of unidirectional flux from pressure measurements (Darcy flux) and theory matched well for a ratio of wavelength to water depth less than 5, but not when this ratio was greater. Both were higher than seepage meter measurements of unidirectional flux made with one-way valves. Discharge averaged 23% greater than recharge in both seepage meter measurements and Darcy calculations of unidirectional flux. Removal of the collection bag reduced this net discharge. The presence of a seepage meter reduced the amplitude of pressure signals at the bed and resulted in a nearly uniform pressure distribution beneath the seepage meter. These results show that seepage meters may provide accurate measurements of both discharge and recharge under steady flow conditions and illustrate the potential measurement errors associated with dynamic wave environments. © 2014, National Ground Water Association.

  4. Novel Methods for Optically Measuring Whitecaps Under Natural Wave Breaking Conditions in the Southern Ocean

    NASA Astrophysics Data System (ADS)

    Randolph, K. L.; Dierssen, H. M.; Cifuentes-Lorenzen, A.; Balch, W. M.; Monahan, E. C.; Zappa, C. J.; Drapeau, D.; Bowler, B.

    2016-02-01

    Breaking waves on the ocean surface mark areas of significant importance to air-sea flux estimates of gas, aerosols, and heat. Traditional methods of measuring whitecap coverage using digital photography can miss features that are small in size or do not show high enough contrast against the background. The geometry of the images collected captures the near-surface, bright manifestations of a whitecap feature and misses a portion of the bubble plume that is responsible for the production of sea salt aerosols and the transfer of lower-solubility gases. Here, a novel method for accurately measuring both the fractional coverage of whitecaps and the intensity and decay rate of whitecap events using above-water radiometry is presented. The methodology was developed using data collected during the austral summer in the Atlantic sector of the Southern Ocean under a large range of wind (speeds of 1 to 15 m s-1) and wave (significant wave heights of 2 to 8 m) conditions as part of the Southern Ocean Gas Exchange experiment. Whitecap metrics were retrieved by employing a magnitude threshold based on the interquartile range of the radiance or reflectance signal for a single channel (411 nm) after a baseline removal, determined using a moving minimum/maximum filter. Breaking intensity and decay rate metrics were produced from the integration of, and the exponential fit to, radiance or reflectance over the lifetime of the whitecap. When compared to fractional whitecap coverage measurements obtained from high-resolution digital images, radiometric estimates were consistently higher because they capture more of the decaying bubble plume area that is difficult to detect with photography. Radiometrically retrieved whitecap measurements are presented in the context of concurrently measured meteorological (e.g., wind speed) and oceanographic (e.g., wave) data. The optimal fit of the radiometrically estimated whitecap coverage to the instantaneous wind speed, determined using ordinary least
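    A minimal sketch of the described retrieval chain: a moving minimum/maximum (morphological opening) baseline removal on a single-channel signal, followed by an interquartile-range magnitude threshold. The radiance series below is synthetic, with three injected exponentially decaying events.

```python
import numpy as np
from scipy.ndimage import minimum_filter1d, maximum_filter1d

rng = np.random.default_rng(4)
# Synthetic 411 nm radiance series: slow baseline + noise + "whitecaps"
t = np.arange(6000)
sig = 1.0 + 0.1 * np.sin(t / 500) + 0.01 * rng.standard_normal(t.size)
for start in (1000, 2500, 4200):                 # inject decaying events
    sig[start:start + 120] += 0.5 * np.exp(-np.arange(120) / 30)

# Baseline: moving minimum then moving maximum (morphological opening)
base = maximum_filter1d(minimum_filter1d(sig, size=301), size=301)
resid = sig - base

# Magnitude threshold from the interquartile range of the residual
q1, q3 = np.percentile(resid, [25, 75])
thresh = q3 + 1.5 * (q3 - q1)
whitecap_fraction = (resid > thresh).mean()
print(f"whitecap fraction: {whitecap_fraction:.3%}")
```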

  5. Aircraft measurement of ozone turbulent flux in the atmospheric boundary layer

    NASA Astrophysics Data System (ADS)

    Affre, Ch.; Carrara, A.; Lefebre, F.; Druilhet, A.; Fontan, J.; Lopez, A.

    In May 1995, the "Chimie-Creil 95" experiment was undertaken in the north of France. The field data are first used to validate the methodology for airborne measurement of ozone flux. A number of methodological problems due to the location of the fast ozone sensor inside the airplane are furthermore discussed. The paper describes the instrumentation of the ARAT (Avion de Recherche Atmosphérique et de Télédétection), an atmospheric research and remote-sensing aircraft used to perform the airborne measurements, the area flown over, the meteorological conditions and the boundary layer stability conditions. These aircraft measurements are then used to determine ozone deposition velocity, and values are proposed for the aerodynamic and bulk transfer coefficients (ozone and momentum). The paper also establishes the relationship between the normalised standard deviations and the stability parameter (z/L) for ozone, temperature, humidity and vertical velocity. The laws obtained are then presented.

  6. Measurement of heat stress conditions at cow level and comparison to climate conditions at stationary locations inside a dairy barn.

    PubMed

    Schüller, Laura K; Heuwieser, Wolfgang

    2016-08-01

    The objectives of this study were to examine heat stress conditions at cow level and to investigate the relationship to the climate conditions at 5 different stationary locations inside a dairy barn. In addition, we compared the climate conditions at cow level between primiparous and multiparous cows for a period of 1 week after regrouping. The temperature-humidity index (THI) differed significantly between all stationary loggers. The lowest THI was measured at the window logger in the experimental stall and the highest THI was measured at the central logger in the experimental stall. The THI at the mobile cow loggers was 2·33 THI points higher than at the stationary loggers. Furthermore, the mean daily THI was higher at the mobile cow loggers than at the stationary loggers on all experimental days. The THI in the experimental pen was 0·44 THI points lower when the experimental cow group was located inside the milking parlour. The THI measured at the mobile cow loggers was 1·63 THI points higher when the experimental cow group was located inside the milking parlour. However, there was no significant difference in any climate variable between primiparous and multiparous cows. These results indicate that there is a wide range of climate conditions inside a dairy barn, and areas far from a fresh air supply in particular have an increased risk for the occurrence of heat stress conditions. Furthermore, the heat stress conditions are even more severe at cow level, and cows not only influence their climatic environment but also generate microclimates within different locations inside the barn. Therefore, climate conditions should be obtained at cow level to evaluate the heat stress conditions that dairy cows are actually exposed to.
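    For reference, one widely used THI formulation (an NRC-style equation combining dry-bulb temperature and relative humidity) is sketched below; the study's exact equation may differ.

```python
def thi(temp_c: float, rh_pct: float) -> float:
    """Temperature-humidity index, one widely used (NRC-style)
    formulation; the study's exact equation may differ."""
    return (1.8 * temp_c + 32) - (0.55 - 0.0055 * rh_pct) * (1.8 * temp_c - 26)

# Example: 28 degC at 60 % relative humidity
print(round(thi(28.0, 60.0), 1))   # ~77, above commonly cited alert levels
```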

  7. A Nursing Process Methodology.

    ERIC Educational Resources Information Center

    Ryan-Wenger, Nancy M.

    1990-01-01

    A nursing methodology developed by the faculty at The Ohio State University teaches nursing students problem-solving techniques applicable to any nursing situation. It also provides faculty and students with a basis for measuring students' progress and ability in applying the nursing process. (Author)

  8. Energy index decomposition methodology at the plant level

    NASA Astrophysics Data System (ADS)

    Kumphai, Wisit

    Scope and method of study. The dissertation explores the use of a high-level energy intensity index as a facility-level energy performance monitoring indicator, with the goal of developing a methodology for an economically based energy performance monitoring system that incorporates production information. The performance measure closely monitors energy usage, production quantity, and product mix and determines production efficiency as part of an ongoing process that would enable facility managers to keep track of, and in the future predict, when to perform a recommissioning process. The study focuses on index decomposition methodology and explores several high-level (industry, sector, and country level) energy utilization indexes, namely Additive Log Mean Divisia, Multiplicative Log Mean Divisia, and Additive Refined Laspeyres. One level of index decomposition is performed: the indexes are decomposed into intensity and product-mix effects. These indexes are tested on a flow shop brick manufacturing plant model in three different climates in the United States. The indexes obtained are analyzed by fitting an ARIMA model and testing for dependency between the two decomposed indexes. Findings and conclusions. The results concluded that the Additive Refined Laspeyres index decomposition methodology is suitable for use in a flow shop, non-air-conditioned production environment as an energy performance monitoring indicator. It is likely that this research can be further expanded into predicting when to perform a recommissioning process.
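    As an illustration of the additive Log Mean Divisia approach named above, the sketch below decomposes a change in aggregate energy intensity into product-mix and intensity effects using the logarithmic mean; the two-product plant data are invented.

```python
import numpy as np

def logmean(a, b):
    """Logarithmic mean L(a, b) = (a - b)/(ln a - ln b), with L(a, a) = a."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.where(np.isclose(a, b), a, (a - b) / (np.log(a) - np.log(b)))

# Hypothetical plant, two product lines: aggregate energy intensity
# I = sum_i s_i * e_i (s_i = product-mix share, e_i = line intensity).
s1, e1 = np.array([0.6, 0.4]), np.array([2.0, 3.0])   # period 1
s2, e2 = np.array([0.5, 0.5]), np.array([1.8, 3.1])   # period 2
I1, I2 = s1 * e1, s2 * e2

# Additive LMDI: the total change splits exactly into the two effects
L = logmean(I2, I1)
d_mix = (L * np.log(s2 / s1)).sum()
d_int = (L * np.log(e2 / e1)).sum()
print(f"{(I2 - I1).sum():+.3f} = mix {d_mix:+.3f} + intensity {d_int:+.3f}")
```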

  9. Effects of Specimen Collection Methodologies and Storage Conditions on the Short-Term Stability of Oral Microbiome Taxonomy.

    PubMed

    Luo, Ting; Srinivasan, Usha; Ramadugu, Kirtana; Shedden, Kerby A; Neiswanger, Katherine; Trumble, Erika; Li, Jiean J; McNeil, Daniel W; Crout, Richard J; Weyant, Robert J; Marazita, Mary L; Foxman, Betsy

    2016-09-15

    a study site, and shipping requirements. The research presented in this paper measures the effects of multiple storage parameters and collection methodologies on the measured ecology of the oral microbiome from healthy adults and children. These results will potentially enable investigators to conduct oral microbiome studies at maximal efficiency by guiding informed administrative decisions pertaining to the necessary field or clinical work. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  10. Effects of Specimen Collection Methodologies and Storage Conditions on the Short-Term Stability of Oral Microbiome Taxonomy

    PubMed Central

    Luo, Ting; Srinivasan, Usha; Ramadugu, Kirtana; Shedden, Kerby A.; Neiswanger, Katherine; Trumble, Erika; Li, Jiean J.; McNeil, Daniel W.; Crout, Richard J.; Weyant, Robert J.; Marazita, Mary L.

    2016-01-01

    , resources available at a study site, and shipping requirements. The research presented in this paper measures the effects of multiple storage parameters and collection methodologies on the measured ecology of the oral microbiome from healthy adults and children. These results will potentially enable investigators to conduct oral microbiome studies at maximal efficiency by guiding informed administrative decisions pertaining to the necessary field or clinical work. PMID:27371581

  11. Rater methodology for stroboscopy: a systematic review.

    PubMed

    Bonilha, Heather Shaw; Focht, Kendrea L; Martin-Harris, Bonnie

    2015-01-01

    Laryngeal endoscopy with stroboscopy (LES) remains the clinical gold standard for assessing vocal fold function. LES is used to evaluate the efficacy of voice treatments in research studies and clinical practice. LES as a voice treatment outcome tool is only as good as the clinician interpreting the recordings. Research using LES as a treatment outcome measure should be evaluated based on rater methodology and reliability. The purpose of this literature review was to evaluate the rater-related methodology from studies that use stroboscopic findings as voice treatment outcome measures. Systematic literature review. Computerized journal databases were searched for relevant articles using terms: stroboscopy and treatment. Eligible articles were categorized and evaluated for the use of rater-related methodology, reporting of number of raters, types of raters, blinding, and rater reliability. Of the 738 articles reviewed, 80 articles met inclusion criteria. More than one-third of the studies included in the review did not report the number of raters who participated in the study. Eleven studies reported results of rater reliability analysis with only two studies reporting good inter- and intrarater reliability. The comparability and use of results from treatment studies that use LES are limited by a lack of rigor in rater methodology and variable, mostly poor, inter- and intrarater reliability. To improve our ability to evaluate and use the findings from voice treatment studies that use LES features as outcome measures, greater consistency of reporting rater methodology characteristics across studies and improved rater reliability is needed. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  12. Measurement of Survival Time in Brachionus Rotifers: Synchronization of Maternal Conditions.

    PubMed

    Kaneko, Gen; Yoshinaga, Tatsuki; Gribble, Kristin E; Welch, David M; Ushio, Hideki

    2016-07-22

    Rotifers are microscopic, cosmopolitan zooplankton used as models in ecotoxicological and aging studies due to several advantages, such as their short lifespan, ease of culture, and parthenogenesis, which enables clonal culture. However, caution is required when measuring their survival time, as it is affected by maternal age and maternal feeding conditions. Here we provide a protocol for powerful and reproducible measurement of survival time in Brachionus rotifers following careful synchronization of culture conditions over several generations. Empirically, poor synchronization results in early mortality and a gradual decrease in survival rate, and thus in weak statistical power. Indeed, under such conditions, calorie restriction (CR) failed to significantly extend the lifespan of B. plicatilis, although CR-induced longevity has been demonstrated with well-synchronized rotifer samples in past and present studies. This protocol is probably also useful for other invertebrate models, including the fruit fly Drosophila melanogaster and the nematode Caenorhabditis elegans, because maternal age effects have also been reported in these species.

  13. Measurement of Survival Time in Brachionus Rotifers: Synchronization of Maternal Conditions

    PubMed Central

    Kaneko, Gen; Yoshinaga, Tatsuki; Gribble, Kristin E.; Welch, David M.; Ushio, Hideki

    2016-01-01

    Rotifers are microscopic, cosmopolitan zooplankton used as models in ecotoxicological and aging studies due to several advantages, such as their short lifespan, ease of culture, and parthenogenesis, which enables clonal culture. However, caution is required when measuring their survival time, as it is affected by maternal age and maternal feeding conditions. Here we provide a protocol for powerful and reproducible measurement of survival time in Brachionus rotifers following careful synchronization of culture conditions over several generations. Empirically, poor synchronization results in early mortality and a gradual decrease in survival rate, and thus in weak statistical power. Indeed, under such conditions, calorie restriction (CR) failed to significantly extend the lifespan of B. plicatilis, although CR-induced longevity has been demonstrated with well-synchronized rotifer samples in past and present studies. This protocol is probably also useful for other invertebrate models, including the fruit fly Drosophila melanogaster and the nematode Caenorhabditis elegans, because maternal age effects have also been reported in these species. PMID:27500471

  14. A review on human reinstatement studies: an overview and methodological challenges.

    PubMed

    Haaker, Jan; Golkar, Armita; Hermans, Dirk; Lonsdorf, Tina B

    2014-09-01

    In human research, studies of return of fear (ROF) phenomena, and reinstatement in particular, began only a decade ago and have recently become more widely used, e.g., as outcome measures for fear/extinction memory manipulations (e.g., reconsolidation). As reinstatement research in humans is still in its infancy, providing an overview of its stability and boundary conditions and summarizing methodological challenges is timely to foster fruitful future research. As a translational endeavor, clarifying the circumstances under which (experimental) reinstatement occurs may offer a first step toward understanding relapse as a clinical phenomenon and pave the way for the development of new pharmacological or behavioral ways to prevent ROF. The current state of research does not yet allow pinpointing these circumstances in detail, and we hope this review will aid the research field to advance in this direction. As an introduction, we begin with a synopsis of rodent work on reinstatement and theories that have been proposed to explain the findings. The review, however, mainly focuses on reinstatement in humans. We first describe details and variations of the experimental setup in reinstatement studies in humans and give a general overview of results. We continue with a compilation of possible experimental boundary conditions and end with the role of individual differences and behavioral and/or pharmacological manipulations. Furthermore, we compile important methodological and design details on the published studies in humans and end with open research questions and some important methodological and design recommendations as a guide for future research. © 2014 Haaker et al.; Published by Cold Spring Harbor Laboratory Press.

  15. A review on human reinstatement studies: an overview and methodological challenges

    PubMed Central

    Haaker, Jan; Golkar, Armita; Hermans, Dirk

    2014-01-01

    In human research, studies of return of fear (ROF) phenomena, and reinstatement in particular, began only a decade ago and have recently become more widely used, e.g., as outcome measures for fear/extinction memory manipulations (e.g., reconsolidation). As reinstatement research in humans is still in its infancy, providing an overview of its stability and boundary conditions and summarizing methodological challenges is timely to foster fruitful future research. As a translational endeavor, clarifying the circumstances under which (experimental) reinstatement occurs may offer a first step toward understanding relapse as a clinical phenomenon and pave the way for the development of new pharmacological or behavioral ways to prevent ROF. The current state of research does not yet allow pinpointing these circumstances in detail, and we hope this review will aid the research field to advance in this direction. As an introduction, we begin with a synopsis of rodent work on reinstatement and theories that have been proposed to explain the findings. The review, however, mainly focuses on reinstatement in humans. We first describe details and variations of the experimental setup in reinstatement studies in humans and give a general overview of results. We continue with a compilation of possible experimental boundary conditions and end with the role of individual differences and behavioral and/or pharmacological manipulations. Furthermore, we compile important methodological and design details on the published studies in humans and end with open research questions and some important methodological and design recommendations as a guide for future research. PMID:25128533

  16. LSU: The Library Space Utilization Methodology.

    ERIC Educational Resources Information Center

    Hall, Richard B.

    A computerized research technique for measuring the space utilization of public library facilities provides a behavioral activity and occupancy analysis for library planning purposes. The library space utilization (LSU) methodology demonstrates that significant information about the functional requirements of a library can be measured and…

  17. The exchangeability of self-reports and administrative health care resource use measurements: assessment of the methodological reporting quality.

    PubMed

    Noben, Cindy Yvonne; de Rijk, Angelique; Nijhuis, Frans; Kottner, Jan; Evers, Silvia

    2016-06-01

    To assess the exchangeability of self-reported and administrative health care resource use measurements for cost estimation. In a systematic review (NHS EED and MEDLINE), reviewers evaluated, in duplicate, the methodological reporting quality of studies comparing the validation evidence of instruments measuring health care resource use. The appraisal tool Methodological Reporting Quality (MeRQ) was developed by merging aspects from the Guidelines for Reporting Reliability and Agreement Studies and the Standards for Reporting Diagnostic Accuracy. Out of 173 studies, 35 full-text articles were assessed for eligibility. Sixteen articles were included in this study. In seven articles, more than 75% of the reporting criteria assessed by MeRQ were rated "good." Most studies scored at least "fair" on most of the reporting quality criteria. In the end, six studies scored "good" on the minimal criteria for reporting. Varying levels of agreement among the different data sources were found, with correlations ranging from 0.14 up to 0.93 and with occurrences of both random and systematic errors. The validation evidence of the small number of studies with adequate MeRQ cautiously supports the exchangeability of the self-reported and administrative resource use measurement methods. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Acoustic methodology review

    NASA Technical Reports Server (NTRS)

    Schlegel, R. G.

    1982-01-01

    It is important for industry and NASA to assess the status of acoustic design technology for predicting and controlling helicopter external noise in order for a meaningful research program to be formulated which will address this problem. The prediction methodologies available to the designer and the acoustic engineer are three-fold. First is what has been described as a first principle analysis. This analysis approach attempts to remove any empiricism from the analysis process and deals with a theoretical mechanism approach to predicting the noise. The second approach attempts to combine first principle methodology (when available) with empirical data to formulate source predictors which can be combined to predict vehicle levels. The third is an empirical analysis, which attempts to generalize measured trends into a vehicle noise prediction method. This paper will briefly address each.

  19. Instrumentation and methodology for quantifying GFP fluorescence in intact plant organs

    NASA Technical Reports Server (NTRS)

    Millwood, R. J.; Halfhill, M. D.; Harkins, D.; Russotti, R.; Stewart, C. N. Jr

    2003-01-01

    The General Fluorescence Plant Meter (GFP-Meter) is a portable spectrofluorometer that utilizes a fiber-optic cable and a leaf clip to gather spectrofluorescence data. In contrast to traditional analytical systems, this instrument allows for the rapid detection and fluorescence measurement of proteins under field conditions with no damage to plant tissue. Here we discuss the methodology of gathering and standardizing spectrofluorescence data from tobacco and canola plants expressing GFP. Furthermore, we demonstrate the accuracy and effectiveness of the GFP-Meter. We first compared GFP fluorescence measurements taken by the GFP-Meter to those taken by a standard laboratory-based spectrofluorometer, the FluoroMax-2. Spectrofluorescence measurements were taken from the same location on intact leaves. When these measurements were tested by simple linear regression analysis, we found a positive functional relationship between the instruments. Finally, to show that the GFP-Meter records accurate measurements over a span of time, we completed a time-course analysis of GFP fluorescence measurements. We found that only initial measurements were accurate; however, subsequent measurements could be used for qualitative purposes.
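
    The cross-instrument comparison described above comes down to a simple linear regression of one instrument's readings on the other's. A minimal sketch of that check, with fabricated fluorescence values standing in for the paired GFP-Meter/FluoroMax-2 readings:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical paired readings from the same leaf locations:
    # portable meter vs. laboratory spectrofluorometer (arbitrary units).
    portable = np.array([12.1, 18.4, 25.0, 31.7, 40.2, 47.9, 55.3])
    lab_bench = np.array([11.5, 19.2, 24.1, 33.0, 39.5, 49.1, 54.0])

    # Ordinary least-squares fit: lab_bench ~ slope * portable + intercept.
    fit = stats.linregress(portable, lab_bench)
    print(f"slope={fit.slope:.3f}, intercept={fit.intercept:.3f}, r^2={fit.rvalue**2:.3f}")
    # A slope near 1, an intercept near 0, and a high r^2 would support a
    # positive functional relationship between the two instruments.
    ```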

  20. An Assessment Methodology to Evaluate In-Flight Engine Health Management Effectiveness

    NASA Astrophysics Data System (ADS)

    Maggio, Gaspare; Belyeu, Rebecca; Pelaccio, Dennis G.

    2002-01-01

    flight effectiveness of candidate engine health management system concepts. A next generation engine health management system will be required to be both reliable and robust in terms of anomaly detection capability. The system must be able to operate successfully in the hostile, high-stress engine system environment. This implies that its system components, such as the instrumentation, process and control, and vehicle interface and support subsystems, must be highly reliable. Additionally, the system must be able to address a vast range of possible engine operation anomalies through a host of different types of measurements supported by a fast algorithm/architecture processing capability that can identify "true" (real) engine operation anomalies. False anomaly condition reports for such a system must be essentially eliminated. The accuracy of identifying only real anomaly conditions has been an issue with the Space Shuttle Main Engine (SSME) in the past. Much improvement in many of the technologies to address these areas is required. The objectives of this study were to identify and demonstrate a consistent assessment methodology that can evaluate the capability of next generation engine health management system concepts to respond in a correct, timely manner to alleviate an operational engine anomaly condition during flight. Science Applications International Corporation (SAIC), with support from NASA Marshall Space Flight Center, identified a probabilistic modeling approach, based on deterministic anomaly-time event assessment, that can be applied in the engine preliminary design stage to assess engine health management system concept effectiveness. Much discussion in this paper focuses on the formulation and application approach in performing this assessment. This includes detailed discussion of key modeling assumptions, the overall assessment methodology approach

  1. Integrating uniform design and response surface methodology to optimize thiacloprid suspension

    PubMed Central

    Li, Bei-xing; Wang, Wei-chang; Zhang, Xian-peng; Zhang, Da-xia; Mu, Wei; Liu, Feng

    2017-01-01

    A model 25% suspension concentrate (SC) of thiacloprid was adopted to evaluate an integrative approach of uniform design and response surface methodology. Tersperse2700, PE1601, xanthan gum and veegum were the four experimental factors, and the aqueous separation ratio and viscosity were the two dependent variables. Linear and quadratic polynomial models fitted by stepwise regression and partial least squares were adopted to test the fit of the experimental data. Verification tests revealed satisfactory agreement between the experimental and predicted data. For a formulation prepared under the proposed conditions, the measured values for the aqueous separation ratio and viscosity were 3.45% and 278.8 mPa·s, respectively, and the relative errors of the predicted values were 9.57% and 2.65%. Comprehensive benefits could also be obtained by appropriately adjusting the amount of certain adjuvants based on practical requirements. Integrating uniform design and response surface methodology is an effective strategy for optimizing SC formulas. PMID:28383036
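
    The response-surface step in studies like this one is a quadratic polynomial fit over the coded factors. A minimal sketch on placeholder data, using plain least squares where the authors used stepwise regression and partial least squares:

    ```python
    import numpy as np

    # Placeholder design matrix: 4 factors (Tersperse2700, PE1601,
    # xanthan gum, veegum) at coded levels from a uniform design.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(20, 4))
    # Placeholder response, e.g. aqueous separation ratio (%).
    y = 3.5 + 1.2 * X[:, 0] - 0.8 * X[:, 2] + 0.9 * X[:, 0] ** 2 + rng.normal(0, 0.1, 20)

    def quadratic_terms(X):
        """Expand factors into [1, x_i, x_i*x_j (i<j), x_i^2] columns."""
        n, k = X.shape
        cols = [np.ones(n)]
        cols += [X[:, i] for i in range(k)]
        cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
        cols += [X[:, i] ** 2 for i in range(k)]
        return np.column_stack(cols)

    Z = quadratic_terms(X)
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)   # least-squares coefficients
    resid = y - Z @ beta
    r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    print(f"fitted {Z.shape[1]} terms, R^2 = {r2:.3f}")
    ```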

  2. Investigation of optimal conditions for production of highly crystalline nanocellulose with increased yield via novel Cr(III)-catalyzed hydrolysis: Response surface methodology.

    PubMed

    Chen, You Wei; Lee, Hwei Voon; Abd Hamid, Sharifah Bee

    2017-12-15

    For the first time, a highly efficient Cr(NO3)3 catalysis system was proposed for optimizing the yield and crystallinity of the nanocellulose end product. A five-level, three-factor central composite design coupled with response surface methodology was employed to elucidate parameter interactions between three design factors, namely reaction temperature (x1), reaction time (x2) and concentration of Cr(NO3)3 (x3), over a broad range of process conditions, and to determine their effect on crystallinity index and product yield. The developed models predicted a maximum nanocellulose yield of 87% at optimum process conditions of 70.6°C, 1.48 h, and 0.48 M Cr(NO3)3. At these conditions, the obtained nanocellulose presented a high crystallinity index (75.3%) and a spider-web-like interconnected network morphology with an average width of 31.2 ± 14.3 nm. In addition, the yielded nanocellulose rendered a higher thermal stability than that of the original cellulosic source and is expected to be widely used as a reinforcing agent in bio-nanocomposite materials. Copyright © 2017 Elsevier Ltd. All rights reserved.
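
    The five-level, three-factor central composite design mentioned here has a standard construction in coded units: two-level factorial points, axial points at ±α, and replicated center points. A hedged sketch; the run counts and the rotatable choice of α are illustrative, not taken from the paper:

    ```python
    import itertools
    import numpy as np

    def central_composite(k, alpha=None, center_runs=6):
        """Coded-unit central composite design for k factors.

        alpha defaults to the rotatable choice (2**k)**0.25, giving
        five levels per factor: -alpha, -1, 0, +1, +alpha.
        """
        if alpha is None:
            alpha = (2 ** k) ** 0.25
        factorial = np.array(list(itertools.product([-1, 1], repeat=k)))
        axial = np.zeros((2 * k, k))
        for i in range(k):
            axial[2 * i, i] = -alpha
            axial[2 * i + 1, i] = alpha
        center = np.zeros((center_runs, k))
        return np.vstack([factorial, axial, center])

    # Three factors: temperature, time, Cr(NO3)3 concentration (coded units).
    design = central_composite(3)
    print(design.shape)  # (2**3 + 2*3 + 6, 3) = (20, 3) runs
    # Coded levels are then mapped back to real units, e.g. x1 = (T - 60) / 10.
    ```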

  3. Optimization on condition of epigallocatechin-3-gallate (EGCG) nanoliposomes by response surface methodology and cellular uptake studies in Caco-2 cells

    NASA Astrophysics Data System (ADS)

    Luo, Xiaobo; Guan, Rongfa; Chen, Xiaoqiang; Tao, Miao; Ma, Jieqing; Zhao, Jin

    2014-06-01

    The major component in green tea polyphenols, epigallocatechin-3-gallate (EGCG), has been demonstrated to prevent carcinogenesis. To improve the effectiveness of EGCG, liposomes were used as a carrier in this study. The reverse-phase evaporation method, combined with response surface methodology, is a simple, rapid, and beneficial approach for liposome preparation and optimization. The optimal preparation conditions were as follows: phosphatidylcholine-to-cholesterol ratio of 4.00, EGCG concentration of 4.88 mg/mL, Tween 80 concentration of 1.08 mg/mL, and rotary evaporation temperature of 34.51°C. Under these conditions, the experimental encapsulation efficiency and size of the EGCG nanoliposomes were 85.79% ± 1.65% and 180 nm ± 4 nm, close to the predicted values. The malondialdehyde value and the in vitro release test indicated that the prepared EGCG nanoliposomes were stable and suitable for more widespread application. Furthermore, compared with free EGCG, encapsulation of EGCG enhanced its inhibitory effect on tumor cell viability at higher concentrations.
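
    Once a response surface has been fitted, the "optimal preparation conditions" step is a constrained maximization of the fitted model. A sketch with an invented, concave quadratic model for encapsulation efficiency; the coefficients are placeholders, not the study's regression output:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical fitted quadratic model for encapsulation efficiency (%)
    # in coded units x = (lipid ratio, EGCG conc., Tween 80 conc., temperature).
    b0 = 84.0
    b1 = np.array([1.5, 0.8, -0.6, 0.4])       # linear effects
    B2 = -np.diag([2.0, 1.2, 0.9, 1.5])        # pure quadratic effects (concave)

    def neg_efficiency(x):
        return -(b0 + b1 @ x + x @ B2 @ x)     # negate: minimizing maximizes

    res = minimize(neg_efficiency, x0=np.zeros(4),
                   bounds=[(-2, 2)] * 4)       # stay within the design region
    print(f"optimum (coded units): {np.round(res.x, 3)}")
    print(f"predicted efficiency : {-res.fun:.2f}%")
    ```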

  4. Statistical optimization of ultraviolet irradiate conditions for vitamin D₂ synthesis in oyster mushrooms (Pleurotus ostreatus) using response surface methodology.

    PubMed

    Wu, Wei-Jie; Ahn, Byung-Yong

    2014-01-01

    Response surface methodology (RSM) was used to determine the optimum vitamin D2 synthesis conditions in oyster mushrooms (Pleurotus ostreatus). Ultraviolet B (UV-B) was selected as the most efficient irradiation source in a preliminary experiment, and three independent variables were studied: ambient temperature (25-45°C), exposure time (40-120 min), and irradiation intensity (0.6-1.2 W/m2). The statistical analysis indicated that, for the range studied, irradiation intensity was the most critical factor affecting vitamin D2 synthesis in oyster mushrooms. Under optimal conditions (ambient temperature of 28.16°C, UV-B intensity of 1.14 W/m2, and exposure time of 94.28 min), the experimental vitamin D2 content of 239.67 µg/g (dry weight) was in very good agreement with the predicted value of 245.49 µg/g, which verified the practicability of this strategy. Compared to fresh mushrooms, lyophilized mushroom powder can synthesize a remarkably higher level of vitamin D2 (498.10 µg/g) within a much shorter UV-B exposure time (10 min), and thus should receive attention from the food processing industry.

  5. Methodological Issues in Examining Measurement Equivalence in Patient Reported Outcomes Measures: Methods Overview to the Two-Part Series, "Measurement Equivalence of the Patient Reported Outcomes Measurement Information System® (PROMIS®) Short Forms".

    PubMed

    Teresi, Jeanne A; Jones, Richard N

    2016-01-01

    The purpose of this article is to introduce the methods used and challenges confronted by the authors of this two-part series of articles describing the results of analyses of measurement equivalence of the short form scales from the Patient Reported Outcomes Measurement Information System® (PROMIS®). Qualitative and quantitative approaches used to examine differential item functioning (DIF) are reviewed briefly. Qualitative methods focused on generation of DIF hypotheses. The basic quantitative approaches used all rely on a latent variable model, and examine parameters either derived directly from item response theory (IRT) or from structural equation models (SEM). A key methods focus of these articles is to describe state-of-the-art approaches to examination of measurement equivalence in eight domains: physical health, pain, fatigue, sleep, depression, anxiety, cognition, and social function. These articles represent the first time that DIF has been examined systematically in the PROMIS short form measures, particularly among ethnically diverse groups. This is also the first set of analyses to examine the performance of PROMIS short forms in patients with cancer. Latent variable model state-of-the-art methods for examining measurement equivalence are introduced briefly in this paper to orient readers to the approaches adopted in this set of papers. Several methodological challenges underlying (DIF-free) anchor item selection and model assumption violations are presented as a backdrop for the articles in this two-part series on measurement equivalence of PROMIS measures.
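
    Of the quantitative DIF approaches mentioned, the logistic-regression variant is the easiest to sketch: test whether group membership predicts an item response beyond a matching score. The data below are simulated dichotomous responses; the PROMIS analyses themselves used IRT- and SEM-based models:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 2000
    group = rng.integers(0, 2, n)             # 0 = reference, 1 = focal group
    theta = rng.normal(0, 1, n)               # latent trait
    # Simulate one item with uniform DIF: harder for the focal group.
    logit = 1.4 * theta - 0.3 - 0.5 * group
    item = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

    matching = theta                          # stand-in for the anchor/rest score
    X0 = sm.add_constant(np.column_stack([matching]))
    X1 = sm.add_constant(np.column_stack([matching, group]))
    m0 = sm.Logit(item, X0).fit(disp=False)   # matching score only
    m1 = sm.Logit(item, X1).fit(disp=False)   # matching score + group

    # Likelihood-ratio test: does group predict the item response beyond
    # the matching score? A significant statistic flags uniform DIF.
    lr = 2 * (m1.llf - m0.llf)
    print(f"LR chi-square (1 df) = {lr:.2f}")
    ```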

  6. A proposed standard methodology for estimating the wounding capacity of small calibre projectiles or other missiles.

    PubMed

    Berlin, R H; Janzon, B; Rybeck, B; Schantz, B; Seeman, T

    1982-01-01

    A standard methodology for estimating the energy transfer characteristics of small calibre bullets and other fast missiles is proposed, consisting of firings against targets made of soft soap. The target is evaluated by measuring the size of the permanent cavity remaining in it after the shot. The method is very simple to use and does not require access to any sophisticated measuring equipment. It can be applied under all circumstances, even under field conditions. Adequate methods of calibration to ensure good accuracy are suggested. The precision and limitations of the method are discussed.

  7. Methodological Challenges in Measuring Child Maltreatment

    ERIC Educational Resources Information Center

    Fallon, Barbara; Trocme, Nico; Fluke, John; MacLaurin, Bruce; Tonmyr, Lil; Yuan, Ying-Ying

    2010-01-01

    Objective: This article reviewed the different surveillance systems used to monitor the extent of reported child maltreatment in North America. Methods: Key measurement and definitional differences between the surveillance systems are detailed, as is their potential impact on the measurement of the rate of victimization. The infrastructure…

  8. Stabilizing Conditional Standard Errors of Measurement in Scale Score Transformations

    ERIC Educational Resources Information Center

    Moses, Tim; Kim, YoungKoung

    2017-01-01

    The focus of this article is on scale score transformations that can be used to stabilize conditional standard errors of measurement (CSEMs). Three transformations for stabilizing the estimated CSEMs are reviewed, including the traditional arcsine transformation, a recently developed general variance stabilization transformation, and a new method…
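
    The traditional arcsine transformation named here can be motivated in a few lines under a binomial error model; the derivation below is a generic sketch (the notation is mine, not the article's):

    ```latex
    % Raw score X ~ Binomial(n, p): the CSEM on the proportion scale,
    % \sqrt{p(1-p)/n}, depends on the examinee's true score p.
    % Apply the delta method to the transformation g(X) = \arcsin\sqrt{X/n}:
    \[
    \operatorname{Var}\bigl\{g(X)\bigr\}
      \approx \left(\frac{1}{2\sqrt{p(1-p)}}\right)^{\!2}\frac{p(1-p)}{n}
      = \frac{1}{4n},
    \]
    % which is free of p, so conditional standard errors of measurement
    % are (approximately) constant on the transformed scale.
    ```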

  9. Doctoral Training in Statistics, Measurement, and Methodology in Psychology: Replication and Extension of Aiken, West, Sechrest, and Reno's (1990) Survey of PhD Programs in North America

    ERIC Educational Resources Information Center

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…

  10. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed.
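
    The nested formulation that the unilevel method is benchmarked against can be sketched as a double loop: an outer design optimizer that calls an inner reliability analysis at every iterate. Everything below (limit state, distributions, cost) is a toy stand-in, with Monte Carlo in place of a formal reliability method:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    TARGET_POF = 0.0013  # target failure probability (~ reliability index 3)

    def failure_probability(d, n_samples=20000):
        """Inner loop: estimate P[g(d, X) < 0] by Monte Carlo for design d."""
        rng = np.random.default_rng(0)          # fixed seed -> smooth constraint
        x1 = rng.normal(d[0], 0.3, n_samples)   # random variables around design
        x2 = rng.normal(d[1], 0.3, n_samples)
        g = x1 ** 2 * x2 / 20.0 - 1.0           # toy limit-state function
        return np.mean(g < 0)

    def cost(d):
        return d[0] + d[1]                      # toy objective: sum of means

    # Outer loop: derivative-free optimizer with the reliability constraint
    # "failure probability must not exceed the target".
    res = minimize(cost, x0=np.array([5.0, 5.0]), method="COBYLA",
                   constraints=[{"type": "ineq",
                                 "fun": lambda d: TARGET_POF - failure_probability(d)}])
    print(f"design: {np.round(res.x, 3)}, cost: {res.fun:.3f}")
    ```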

  11. Measurement of availability and accessibility of food among youth: a systematic review of methodological studies.

    PubMed

    Gebremariam, Mekdes K; Vaqué-Crusellas, Cristina; Andersen, Lene F; Stok, F Marijn; Stelmach-Mardas, Marta; Brug, Johannes; Lien, Nanna

    2017-02-14

    Comprehensive and psychometrically tested measures of availability and accessibility of food are needed in order to explore availability and accessibility as determinants and predictors of dietary behaviors. The main aim of this systematic review was to update the evidence regarding the psychometric properties of measures of food availability and accessibility among youth. A secondary objective was to assess how availability and accessibility were conceptualized in the included studies. A systematic literature search was conducted using Medline, Embase, PsycINFO and Web of Science. Methodological studies published between January 2010 and March 2016 and reporting on at least one psychometric property of a measure of availability and/or accessibility of food among youth were included. Two reviewers independently extracted data and assessed study quality. Existing criteria were used to interpret reliability and validity parameters. A total of 20 studies were included. While 16 studies included measures of food availability, three included measures of both availability and accessibility; one study included a measure of accessibility only. Different conceptualizations of availability and accessibility were used across the studies. The measures aimed at assessing availability and/or accessibility in the home environment (n = 11), the school (n = 4), stores (n = 3), childcare/early care and education services (n = 2) and restaurants (n = 1). Most studies followed systematic steps in the development of the measures. The most common psychometrics tested for these measures were test-retest reliability and criterion validity. The majority of the measures had satisfactory evidence of reliability and/or validity. None of the included studies assessed the responsiveness of the measures. The review identified several measures of food availability or accessibility among youth with satisfactory evidence of reliability and/or validity. Findings indicate a need

  12. Patient empowerment in long-term conditions: development and preliminary testing of a new measure

    PubMed Central

    2013-01-01

    Background: Patient empowerment is viewed by policy makers and health care practitioners as a mechanism to help patients with long-term conditions better manage their health and achieve better outcomes. However, assessing the role of empowerment is dependent on effective measures of empowerment. Although many measures of empowerment exist, no measure has been developed specifically for patients with long-term conditions in the primary care setting. This study presents preliminary data on the development and validation of such a measure. Methods: We conducted two empirical studies. Study one was an interview study to understand empowerment from the perspective of patients living with long-term conditions. Qualitative analysis identified dimensions of empowerment, and the qualitative data were used to generate items relating to these dimensions. Study two was a cross-sectional postal study involving patients with different types of long-term conditions recruited from general practices. The survey was conducted to test and validate our new measure of empowerment. Factor analysis and regression were performed to test scale structure, internal consistency and construct validity. Results: Sixteen predominantly elderly patients with different types of long-term conditions described empowerment in terms of 5 dimensions (identity, knowledge and understanding, personal control, personal decision-making, and enabling other patients). One hundred and ninety-seven survey responses were received from mainly older white females, with relatively low levels of formal education, with the majority retired from paid work. Almost half of the sample reported cardiovascular, joint or diabetes long-term conditions. Factor analysis identified a three-factor solution (positive attitude and sense of control, knowledge and confidence in decision making and enabling others), although the structure lacked clarity. A total empowerment score across all items showed acceptable levels of internal

  13. Consistency of Self-Ratings Across Measurement Conditions and Test Administrations.

    ERIC Educational Resources Information Center

    Froberg, Debra G.

    1984-01-01

    In an experiment conducted with participants in a continuing education conference, the author examined: (1) the consistency of self-ratings of ability across four different measurement conditions; and (2) the relative leniency of retrospective pretest self-ratings compared to pretest self-ratings. Implications for evaluators are discussed. (Author/BS)

  14. Towards standardized testing methodologies for optical properties of components in concentrating solar thermal power plants

    NASA Astrophysics Data System (ADS)

    Sallaberry, Fabienne; Fernández-García, Aránzazu; Lüpfert, Eckhard; Morales, Angel; Vicente, Gema San; Sutter, Florian

    2017-06-01

    Precise knowledge of the optical properties of the components used in the solar field of concentrating solar thermal power plants is essential to ensure optimum power production. These properties are measured and evaluated by different techniques and equipment, in laboratory conditions and/or in the field. Standards for such measurements, and international consensus on the appropriate techniques, are in preparation. The reference materials used as standards for the calibration of the equipment are under discussion. This paper summarizes current testing methodologies and guidelines for the characterization of the optical properties of solar mirrors and absorbers.

  15. A Call for a New National Norming Methodology.

    ERIC Educational Resources Information Center

    Ligon, Glynn; Mangino, Evangelina

    Issues related to achieving adequate national norms are reviewed, and a new methodology is proposed that would work to provide a true measure of national achievement levels on an annual basis and would enable reporting results in current-year norms. Statistical methodology and technology could combine to create a national norming process that…

  16. Methodological Advances in the Study of Self-Concept.

    ERIC Educational Resources Information Center

    Schwartz, Terrence J.

    Critical review of previous techniques for the measurement of an individual's self-concept (SC) is a necessary prelude to the development of more adequate methodologies. This paper focuses on recent methodological innovations in the study of the self, namely, those derived from cognitive social psychology. A view of the self as a cognitive…

  17. Quantitative approach for optimizing e-beam condition of photoresist inspection and measurement

    NASA Astrophysics Data System (ADS)

    Lin, Chia-Jen; Teng, Chia-Hao; Cheng, Po-Chung; Sato, Yoshishige; Huang, Shang-Chieh; Chen, Chu-En; Maruyama, Kotaro; Yamazaki, Yuichiro

    2018-03-01

    Tight process margins in advanced semiconductor technology nodes are controlled by e-beam metrology and e-beam inspection systems based on scanning electron microscopy (SEM) images. With SEM, large-area images of high quality are required both to collect massive amounts of data for metrology and to detect defects over large areas for inspection. Although photoresist patterning is one of the critical processes in semiconductor device manufacturing, observing photoresist patterns in SEM images is troublesome, especially for large images. The charging effect caused by e-beam irradiation of the photoresist pattern degrades image quality, increases CD variation in metrology, and makes it difficult to continue defect inspection over long periods and large areas. In this study, we established a quantitative approach for optimizing the e-beam conditions with the "Die to Database" algorithm of NGR3500 on photoresist patterns to minimize the charging effect, and we enhanced measurement and inspection performance on photoresist patterns by using the optimized e-beam conditions. NGR3500 is a geometry verification system based on the "Die to Database" algorithm, which compares SEM images with design data [1]. By comparing SEM images and design data, key performance indicators (KPIs) of the SEM image such as "Sharpness", "S/N", "Gray level variation in FOV", and "Image shift" can be retrieved. These KPIs were analyzed under different e-beam conditions, which consist of "Landing Energy", "Probe Current", "Scanning Speed" and "Scanning Method", and the best e-beam condition could be achieved with maximum image quality, maximum scanning speed and minimum image shift. Through this quantitative approach to optimizing the e-beam conditions, we could observe the dependency of photoresist charging on the SEM conditions. By using the optimized e-beam conditions, measurements could be continued stably on photoresist patterns for over 24 hours. KPIs of the SEM image proved image quality during measurement and

  18. Modeling of the effect of freezer conditions on the principal constituent parameters of ice cream by using response surface methodology.

    PubMed

    Inoue, K; Ochi, H; Taketsuka, M; Saito, H; Sakurai, K; Ichihashi, N; Iwatsuki, K; Kokubo, S

    2008-05-01

    A systematic analysis was carried out by using response surface methodology to create a quantitative model of the synergistic effects of conditions in a continuous freezer [mix flow rate (L/h), overrun (%), cylinder pressure (kPa), drawing temperature (°C), and dasher speed (rpm)] on the principal constituent parameters of ice cream [rate of fat destabilization (%), mean air cell diameter (μm), and mean ice crystal diameter (μm)]. A central composite face-centered design was used for this study. Thirty-one combinations of the 5 above-mentioned freezer conditions were designed (including replicates at the center point), and ice cream samples were manufactured and examined in a continuous freezer under the selected conditions. The responses were the 3 variables given above. A quadratic model was constructed, with the freezer conditions as the independent variables and the ice cream characteristics as the dependent variables. The coefficients of determination (R²) were greater than 0.9 for all 3 responses, but Q², the index used here for the capability of the model for predicting future observed values of the responses, was negative for both the mean ice crystal diameter and the mean air cell diameter. Therefore, pruned models were constructed by removing terms that had contributed little to the prediction in the original model and by refitting the regression model. It was demonstrated that these pruned models provided good fits to the data in terms of R², Q², and ANOVA. The effects of freezer conditions were expressed quantitatively in terms of the 3 responses. The drawing temperature (°C) was found to have a greater effect on ice cream characteristics than any of the other factors.
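
    The pruning step, dropping low-contribution terms and refitting while tracking R² against a predictive index, can be sketched generically. Here Q² is computed by leave-one-out cross-validation on synthetic data rather than the freezer measurements, and the pruning threshold is arbitrary:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 31
    Z = np.column_stack([np.ones(n)] + [rng.uniform(-1, 1, n) for _ in range(5)])
    y = 2.0 + 1.5 * Z[:, 1] - 1.0 * Z[:, 2] + rng.normal(0, 0.3, n)  # sparse truth

    def fit_r2_q2(Z, y):
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        press = 0.0
        for i in range(len(y)):                     # leave-one-out PRESS
            keep = np.arange(len(y)) != i
            b, *_ = np.linalg.lstsq(Z[keep], y[keep], rcond=None)
            press += (y[i] - Z[i] @ b) ** 2
        ss_tot = ((y - y.mean()) ** 2).sum()
        return beta, 1 - resid @ resid / ss_tot, 1 - press / ss_tot

    beta, r2, q2 = fit_r2_q2(Z, y)
    # Prune terms whose coefficients are small relative to the largest effect,
    # then refit: R^2 drops slightly, but Q^2 (predictive power) should improve.
    keep_cols = np.abs(beta) > 0.2 * np.abs(beta[1:]).max()
    keep_cols[0] = True                             # always keep the intercept
    _, r2_p, q2_p = fit_r2_q2(Z[:, keep_cols], y)
    print(f"full:   R2={r2:.3f}  Q2={q2:.3f}")
    print(f"pruned: R2={r2_p:.3f}  Q2={q2_p:.3f}")
    ```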

  19. Structural acoustic control of plates with variable boundary conditions: design methodology.

    PubMed

    Sprofera, Joseph D; Cabell, Randolph H; Gibbs, Gary P; Clark, Robert L

    2007-07-01

    A method for optimizing a structural acoustic control system subject to variations in plate boundary conditions is provided. The assumed modes method is used to build a plate model with varying levels of rotational boundary stiffness to simulate the dynamics of a plate with uncertain edge conditions. A transducer placement scoring process, involving Hankel singular values, is combined with a genetic optimization routine to find spatial locations robust to boundary condition variation. Predicted frequency response characteristics are examined, and theoretically optimized results are discussed in relation to the range of boundary conditions investigated. Modeled results indicate that it is possible to minimize the impact of uncertain boundary conditions in active structural acoustic control by optimizing the placement of transducers with respect to those uncertainties.
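
    A drastically simplified version of the robust-placement idea: score each candidate location under several sampled boundary stiffnesses and keep the location whose worst-case score is best. A toy 1-D mode-shape family stands in for the plate model, and a plain modal-amplitude sum stands in for the Hankel-singular-value scoring:

    ```python
    import numpy as np

    # Candidate sensor locations along a 1-D structure (normalized 0..1).
    locations = np.linspace(0.05, 0.95, 19)
    modes = [1, 2, 3]

    def mode_shape(x, k, stiffness):
        # Toy family of mode shapes: boundary stiffness shifts the effective
        # wavelength (a crude stand-in for rotational boundary springs).
        shift = 0.1 / (1.0 + stiffness)
        return np.sin(k * np.pi * (x * (1 - shift) + shift / 2))

    def score(x, stiffness):
        # Simple observability proxy: sum of squared modal amplitudes.
        return sum(mode_shape(x, k, stiffness) ** 2 for k in modes)

    stiffness_samples = [0.0, 0.5, 2.0, 10.0]   # soft ... nearly clamped edges
    worst_case = [min(score(x, s) for s in stiffness_samples) for x in locations]
    best = locations[int(np.argmax(worst_case))]
    print(f"most robust location: x = {best:.2f}")
    ```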

  20. Doctoral training in statistics, measurement, and methodology in psychology: replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America.

    PubMed

    Aiken, Leona S; West, Stephen G; Millsap, Roger E

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory and not field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the modally 1-year introductory statistics course, leaving little room for advanced study. Curricular enhancements were noted in statistics and to a minor degree in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology. PsycINFO Database Record (c) 2008 APA, all rights reserved.

  1. Application of truss analysis for the quantification of changes in fish condition

    USGS Publications Warehouse

    Fitzgerald, Dean G.; Nanson, Jeffrey W.; Todd, Thomas N.; Davis, Bruce M.

    2002-01-01

    Conservation of skeletal structure and unique body ratios in fishes facilitated the development of truss analysis as a taxonomic tool to separate physically similar species. The methodology is predicated on the measurement of across-body distances from a sequential series of connected polygons. Changes in body shape or condition among members of the same species can be quantified with the same technique, and we conducted a feeding experiment using yellow perch (Perca flavescens) to examine the utility of this approach. Ration size was used as a surrogate for fish condition, with fish receiving either a high (3.0% body wt/d) or a low ration (0.5%). Sequentially over our 11-week experiment, replicate ration groups of fish were removed and photographed while control fish were repeatedly weighed and measured. Standard indices of condition (total lipids, weight-length ratios, Fulton's condition) were compared to truss measurements determined from digitized pictures of fish. Condition indices showed similarity between rations, while truss measures from the caudal region were important for quantifying changing body shape. These findings identify truss analysis as having use beyond traditional applications. It can potentially be used as a cheap, accurate, and precise descriptor of fish condition in the lab as shown here, and we hypothesize that it would be applicable in field studies.
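
    The truss measurements themselves are Euclidean distances between digitized landmarks, taken along the edges and diagonals of the connected polygons. A minimal sketch with hypothetical landmark coordinates and an illustrative edge list:

    ```python
    import numpy as np

    # Hypothetical digitized landmarks along a fish body (x, y in pixels),
    # ordered head to tail, alternating dorsal/ventral as in a truss network.
    landmarks = np.array([
        [10, 50], [60, 80], [60, 20], [120, 75],
        [120, 25], [180, 65], [180, 35], [230, 50],
    ])

    # Illustrative truss edges: consecutive quadrilaterals plus diagonals.
    edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (1, 4), (2, 3),
             (3, 4), (3, 5), (4, 6), (3, 6), (4, 5), (5, 6), (5, 7), (6, 7)]

    truss = {e: float(np.linalg.norm(landmarks[e[0]] - landmarks[e[1]]))
             for e in edges}
    for (i, j), d in truss.items():
        print(f"landmark {i} -> {j}: {d:.1f} px")
    # Distances are usually size-standardized (e.g., divided by standard
    # length) before comparing body shape across ration groups.
    ```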

  2. Methodological Issues in Examining Measurement Equivalence in Patient Reported Outcomes Measures: Methods Overview to the Two-Part Series, “Measurement Equivalence of the Patient Reported Outcomes Measurement Information System® (PROMIS®) Short Forms”

    PubMed Central

    Teresi, Jeanne A.; Jones, Richard N.

    2017-01-01

    The purpose of this article is to introduce the methods used and challenges confronted by the authors of this two-part series of articles describing the results of analyses of measurement equivalence of the short form scales from the Patient Reported Outcomes Measurement Information System® (PROMIS®). Qualitative and quantitative approaches used to examine differential item functioning (DIF) are reviewed briefly. Qualitative methods focused on generation of DIF hypotheses. The basic quantitative approaches used all rely on a latent variable model, and examine parameters either derived directly from item response theory (IRT) or from structural equation models (SEM). A key methods focus of these articles is to describe state-of-the-art approaches to examination of measurement equivalence in eight domains: physical health, pain, fatigue, sleep, depression, anxiety, cognition, and social function. These articles represent the first time that DIF has been examined systematically in the PROMIS short form measures, particularly among ethnically diverse groups. This is also the first set of analyses to examine the performance of PROMIS short forms in patients with cancer. Latent variable model state-of-the-art methods for examining measurement equivalence are introduced briefly in this paper to orient readers to the approaches adopted in this set of papers. Several methodological challenges underlying (DIF-free) anchor item selection and model assumption violations are presented as a backdrop for the articles in this two-part series on measurement equivalence of PROMIS measures. PMID:28983448

  3. Chronic condition self-management surveillance: what is and what should be measured?

    PubMed

    Ruiz, Sarah; Brady, Teresa J; Glasgow, Russell E; Birkel, Richard; Spafford, Michelle

    2014-06-19

    The rapid growth in chronic disease prevalence, in particular the prevalence of multiple chronic conditions, poses a significant and increasing burden on the health of Americans. Maximizing the use of proven self-management (SM) strategies is a core goal of the US Department of Health and Human Services. Yet, there is no systematic way to assess how much SM or self-management support (SMS) is occurring in the United States. The purpose of this project was to identify appropriate concepts or measures to incorporate into national SM and SMS surveillance. A multistep process was used to identify candidate concepts, assess existing measures, and select high-priority concepts for further development. A stakeholder survey, an environmental scan, subject matter expert feedback, and a stakeholder priority-setting exercise were all used to select the high-priority concepts for development. The stakeholder survey gathered feedback on 32 candidate concepts; 9 concepts were endorsed by more than 66% of respondents. The environmental scan indicated few existing measures that adequately reflected the candidate concepts, and those that were identified were generally specific to a defined condition and not gathered on a population basis. On the basis of the priority setting exercises and environmental scan, we selected 1 concept from each of 5 levels of behavioral influence for immediate development as an SM or SMS indicator. The absence of any available measures to assess SM or SMS across the population highlights the need to develop chronic condition SM surveillance that uses national surveys and other data sources to measure national progress in SM and SMS.

  4. Measuring reward with the conditioned place preference (CPP) paradigm: update of the last decade.

    PubMed

    Tzschentke, Thomas M

    2007-09-01

    Conditioned place preference (CPP) continues to be one of the most popular models to study the motivational effects of drugs and non-drug treatments in experimental animals. This is obvious from a steady year-to-year increase in the number of publications reporting the use of this model. Since the compilation of the preceding review in 1998, more than 1000 new studies using place conditioning have been published, and the aim of the present review is to provide an overview of these recent publications. There are a number of trends and developments that are obvious in the literature of the last decade. First, as more and more knockout and transgenic animals become available, place conditioning is increasingly used to assess the motivational effects of drugs or non-drug rewards in genetically modified animals. Second, there is a still small but growing literature on the use of place conditioning to study the motivational aspects of pain, a field of pre-clinical research that has so far received little attention, because of the lack of appropriate animal models. Third, place conditioning continues to be widely used to study tolerance and sensitization to the rewarding effects of drugs induced by pre-treatment regimens. Fourth, extinction/reinstatement procedures in place conditioning are becoming increasingly popular. This interesting approach is thought to model certain aspects of relapse to addictive behavior and has previously almost exclusively been studied in drug self-administration paradigms. It has now also become established in the place conditioning literature and provides an additional and technically easy approach to this important phenomenon. The enormous number of studies to be covered in this review prevented in-depth discussion of many methodological, pharmacological or neurobiological aspects; to a large extent, the presentation of data had to be limited to a short and condensed summary of the most relevant findings.

  5. School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.

    ERIC Educational Resources Information Center

    Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others

    1998-01-01

    Focuses on methodological issues involved in choosing instruments to monitor behavior, once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physical cooperation. Emerging school-based methodologies are discussed…

  6. Characterization of Melanogenesis Inhibitory Constituents of Morus alba Leaves and Optimization of Extraction Conditions Using Response Surface Methodology.

    PubMed

    Jeong, Ji Yeon; Liu, Qing; Kim, Seon Beom; Jo, Yang Hee; Mo, Eun Jin; Yang, Hyo Hee; Song, Dae Hye; Hwang, Bang Yeon; Lee, Mi Kyeong

    2015-05-14

    Melanin is a natural pigment that plays an important role in the protection of skin; however, hyperpigmentation caused by excessive levels of melanin is associated with several problems. Therefore, melanogenesis-inhibitory natural products have been developed by the cosmetic industry as skin medications. The leaves of Morus alba (Moraceae) have been reported to inhibit melanogenesis; therefore, characterization of the melanogenesis inhibitory constituents of M. alba leaves was attempted in this study. Twenty compounds, including eight benzofurans, ten flavonoids, one stilbenoid and one chalcone, were isolated from M. alba leaves, and these phenolic constituents were shown to significantly inhibit tyrosinase activity and melanin content in B16F10 melanoma cells. To maximize the melanogenesis inhibitory activity and active phenolic contents, optimized M. alba leaf extraction conditions were predicted using response surface methodology as a methanol concentration of 85.2%, an extraction temperature of 53.2°C, and an extraction time of 2 h. The tyrosinase inhibition and total phenolic content under optimal conditions were found to be 74.8% inhibition and 24.8 μg GAE/mg extract, which matched well with the predicted values of 75.0% inhibition and 23.8 μg GAE/mg extract. These results should provide useful information about melanogenesis inhibitory constituents and optimized extracts from M. alba leaves as cosmetic therapeutics to reduce skin hyperpigmentation.

  7. Methodological Choices in Peer Nomination Research

    ERIC Educational Resources Information Center

    Cillessen, Antonius H. N.; Marks, Peter E. L.

    2017-01-01

    Although peer nomination measures have been used by researchers for nearly a century, common methodological practices and rules of thumb (e.g., which variables to measure; use of limited vs. unlimited nomination methods) have continued to develop in recent decades. At the same time, other key aspects of the basic nomination procedure (e.g.,…

  8. Assessing the use of remotely sensed measurements for characterizing rangeland condition

    NASA Astrophysics Data System (ADS)

    Folker, Geoffrey P.

    There are over 233 million hectares (ha) of nonfederal grazing lands in the United States. Conventional field observation and sampling techniques are insufficient methods to monitor such large areas frequently enough to confidently quantify the biophysical state and assess rangeland condition over large geographic areas. In an attempt to enhance rangeland resource managers' abilities to monitor and assess these factors, remote sensing scientists and land resource managers have worked together to determine whether remotely sensed measurements can improve the ability to measure rangeland response to land management practices. The relationship between spectral reflectance patterns and plant species composition was investigated on six south-central Kansas ranches. Airborne multispectral color infrared images for 2002 through 2004 were collected at multiple times in the growing season over the study area. Concurrent with the image acquisition periods, ground cover estimates of plant species composition and biomass by growth form were collected. Correlation analysis was used to examine relationships among spectral and biophysical field measurements. Results indicate that heavily grazed sites exhibited the highest spectral vegetation index values. This was attributed to increases in low forage quality broadleaf forbs such as annual ragweed (Ambrosia artemisiifolia L.). Although higher vegetation index values have a positive correlation with overall above ground primary productivity, species composition may be the best indicator of healthy rangeland condition. A Weediness Index, which was found to be correlated with range condition, was also strongly linked to spectral reflectance patterns recorded in the airborne imagery.

  9. Multi-Population Invariance with Dichotomous Measures: Combining Multi-Group and MIMIC Methodologies in Evaluating the General Aptitude Test in the Arabic Language

    ERIC Educational Resources Information Center

    Sideridis, Georgios D.; Tsaousis, Ioannis; Al-harbi, Khaleel A.

    2015-01-01

    The purpose of the present study was to extend the model of measurement invariance by simultaneously estimating invariance across multiple populations in the dichotomous instrument case using multi-group confirmatory factor analytic and multiple indicator multiple causes (MIMIC) methodologies. Using the Arabic version of the General Aptitude Test…

  10. Methodological concerns for determining power output in the jump squat.

    PubMed

    Cormie, Prue; Deane, Russell; McBride, Jeffrey M

    2007-05-01

    The purpose of this study was to investigate the validity of power measurement techniques during the jump squat (JS) utilizing various combinations of force plate and linear position transducer (LPT) devices. Nine men with at least 6 months of prior resistance training experience participated in this acute investigation. One repetition maximums (1RM) in the squat were determined, followed by JS testing under 2 loading conditions (30% of 1RM [JS30] and 90% of 1RM [JS90]). Three different techniques were used simultaneously in data collection: (a) 1 linear position transducer (1-LPT); (b) 1 linear position transducer and a force plate (1-LPT + FP); and (c) 2 linear position transducers and a force plate (2-LPT + FP). Vertical velocity-, force-, and power-time curves were calculated for each lift using these methodologies and were compared. Peak force and peak power were overestimated by 1-LPT in both JS30 and JS90 compared with 2-LPT + FP and 1-LPT + FP (p < 0.05) in each condition. Peak vertical velocity determined by 2-LPT + FP was significantly lower than that determined by either 1-LPT or 1-LPT + FP in JS90. This investigation indicates that peak power and the timing of power output in the jump squat vary according to the measurement technique utilized. The 1-LPT methodology is not a valid means of determining power output in the jump squat. Furthermore, the 1-LPT + FP method may not accurately represent power output in free weight movements that involve a significant amount of horizontal motion.
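
    The techniques compared here differ mainly in where velocity and force come from: the 1-LPT method must estimate force from differentiated bar kinematics, while the force-plate methods measure it directly. A sketch of both power calculations on fabricated signals (mass, displacement profile, and noise level are all invented):

    ```python
    import numpy as np

    fs = 1000.0                                # sampling rate (Hz)
    t = np.arange(0, 1.0, 1 / fs)
    mass = 90.0                                # system mass, kg (assumed)
    g = 9.81

    # Fabricated bar displacement from the LPT (m): dip then drive upward.
    disp = 0.25 * np.sin(2 * np.pi * 1.0 * t - np.pi / 2) + 0.25
    vel = np.gradient(disp, 1 / fs)            # position -> velocity
    acc = np.gradient(vel, 1 / fs)             # velocity -> acceleration

    # 1-LPT method: force is *estimated* from bar kinematics alone.
    force_lpt = mass * (acc + g)
    power_lpt = force_lpt * vel

    # LPT + force plate method: vertical ground reaction force is measured
    # directly; here a noisy copy merely stands in for a measured GRF trace.
    force_fp = force_lpt + np.random.default_rng(4).normal(0, 20, t.size)
    power_fp = force_fp * vel                  # instantaneous power P = F * v

    print(f"peak power, 1-LPT   : {power_lpt.max():.0f} W")
    print(f"peak power, LPT + FP: {power_fp.max():.0f} W")
    ```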

  11. Human and methodological sources of variability in the measurement of urinary 8-oxo-7,8-dihydro-2'-deoxyguanosine.

    PubMed

    Barregard, Lars; Møller, Peter; Henriksen, Trine; Mistry, Vilas; Koppen, Gudrun; Rossner, Pavel; Sram, Radim J; Weimann, Allan; Poulsen, Henrik E; Nataf, Robert; Andreoli, Roberta; Manini, Paola; Marczylo, Tim; Lam, Patricia; Evans, Mark D; Kasai, Hiroshi; Kawai, Kazuaki; Li, Yun-Shan; Sakai, Kazuo; Singh, Rajinder; Teichert, Friederike; Farmer, Peter B; Rozalski, Rafal; Gackowski, Daniel; Siomek, Agnieszka; Saez, Guillermo T; Cerda, Concha; Broberg, Karin; Lindh, Christian; Hossain, Mohammad Bakhtiar; Haghdoost, Siamak; Hu, Chiung-Wen; Chao, Mu-Rong; Wu, Kuen-Yuh; Orhan, Hilmi; Senduran, Nilufer; Smith, Raymond J; Santella, Regina M; Su, Yali; Cortez, Czarina; Yeh, Susan; Olinski, Ryszard; Loft, Steffen; Cooke, Marcus S

    2013-06-20

    Urinary 8-oxo-7,8-dihydro-2'-deoxyguanosine (8-oxodG) is a widely used biomarker of oxidative stress. However, variability between chromatographic and ELISA methods hampers interpretation of data, and this variability may increase should urine composition differ between individuals, leading to assay interference. Furthermore, optimal urine sampling conditions are not well defined. We performed inter-laboratory comparisons of 8-oxodG measurement between mass spectrometric-, electrochemical- and ELISA-based methods, using common within-technique calibrants to analyze 8-oxodG-spiked phosphate-buffered saline and urine samples. We also investigated human subject- and sample collection-related variables, as potential sources of variability. Chromatographic assays showed high agreement across urines from different subjects, whereas ELISAs showed far more inter-laboratory variation and generally overestimated levels, compared to the chromatographic assays. Excretion rates in timed 'spot' samples showed strong correlations with 24 h excretion (the 'gold' standard) of urinary 8-oxodG (rp = 0.67-0.90), although the associations were weaker for 8-oxodG adjusted for creatinine or specific gravity (SG). The within-individual excretion of 8-oxodG varied only moderately between days (CV 17% for 24 h excretion and 20% for first void, creatinine-corrected samples). This is the first comprehensive study of both human and methodological factors influencing 8-oxodG measurement, providing key information for future studies with this important biomarker. ELISA variability is greater than chromatographic assay variability, and cannot determine absolute levels of 8-oxodG. Use of standardized calibrants greatly improves intra-technique agreement and, for the chromatographic assays, importantly allows integration of results for pooled analyses. If 24 h samples are not feasible, creatinine- or SG-adjusted first morning samples are recommended.

  12. Feasibility, strategy, methodology, and analysis of probe measurements in plasma under high gas pressure

    NASA Astrophysics Data System (ADS)

    Demidov, V. I.; Koepke, M. E.; Kurlyandskaya, I. P.; Malkov, M. A.

    2018-02-01

    This paper reviews existing theories for interpreting probe measurements of electron distribution functions (EDF) at high gas pressure, when collisions of electrons with atoms and/or molecules near the probe are pervasive. An explanation of whether or not the measurements are realizable and reliable, an enumeration of the most common sources of measurement error, and an outline of proper probe-experiment design elements that inherently limit or avoid error are presented. Additionally, we describe recently expanded plasma-condition compatibility for EDF measurement, including applications of large wall-probe plasma diagnostics. This summary of the authors' experiences, gained over decades of practicing and developing probe diagnostics, is intended to inform, guide, suggest, and detail the advantages and disadvantages of probe application in plasma research.

  13. Extension of laboratory-measured soil spectra to field conditions

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.; Weismiller, R. A.; Biehl, L. L.; Robinson, B. F.

    1982-01-01

    Spectral responses of two glaciated soils, Chalmers silty clay loam and Fincastle silt loam, formed under prairie grass and forest vegetation, respectively, were measured in the laboratory under controlled moisture equilibria and artificial illumination using an Exotech Model 20C spectroradiometer. The same spectroradiometer was used outdoors under solar illumination to obtain spectral responses from dry and moistened field plots, with and without corn residue cover, representing the two different soils. Results indicate that laboratory-measured spectra of moist soil are directly proportional to the spectral response of the same moist bare soil measured in the field over the 0.52 to 1.75 micrometer wavelength range. The magnitudes of difference in spectral response between identically treated Chalmers and Fincastle soils are greatest in the 0.6 to 0.8 micrometer transition region between the visible and near infrared, regardless of the field condition or laboratory preparation studied.

  14. Conditional clustering of temporal expression profiles

    PubMed Central

    Wang, Ling; Montano, Monty; Rarick, Matt; Sebastiani, Paola

    2008-01-01

    Background Many microarray experiments produce temporal profiles in different biological conditions, but common clustering techniques are not able to analyze the data conditional on the biological conditions. Results This article presents a novel technique to cluster data from time-course microarray experiments performed across several experimental conditions. Our algorithm uses polynomial models to describe the gene expression patterns over time, a full Bayesian approach with proper conjugate priors to make the algorithm invariant to linear transformations, and an iterative procedure to identify genes that have a common temporal expression profile across two or more experimental conditions, and genes that have a unique temporal profile in a specific condition. Conclusion We use simulated data to evaluate the effectiveness of this new algorithm in finding the correct number of clusters and in identifying genes with common and unique profiles. We also use the algorithm to characterize the response of human T cells to stimulation of antigen-receptor signaling, using gene expression temporal profiles measured in six different biological conditions, and we identify common and unique genes. These studies suggest that the methodology proposed here is useful for identifying and distinguishing uniquely stimulated genes from commonly stimulated genes in response to variable stimuli. Software for using this clustering method is available from the project home page. PMID:18334028
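    The full conjugate Bayesian machinery is beyond the scope of an abstract, but the core idea (summarize each gene's time course with polynomial coefficients per condition, then ask whether the profiles agree across conditions) can be sketched with an ordinary least-squares stand-in. This is a simplified frequentist illustration, not the authors' algorithm; the time points and expression values are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        timepoints = np.array([0.0, 1.0, 2.0, 4.0, 8.0])   # hours, hypothetical

        def fit_profile(expression, t, degree=2):
            """Least-squares polynomial summary of one gene's time course."""
            return np.polyfit(t, expression, degree)

        # Toy gene measured under two conditions sharing a common trend
        common = 1.0 + 0.5 * timepoints - 0.05 * timepoints ** 2
        cond_a = common + rng.normal(0, 0.1, timepoints.size)
        cond_b = common + rng.normal(0, 0.1, timepoints.size)

        coef_a = fit_profile(cond_a, timepoints)
        coef_b = fit_profile(cond_b, timepoints)

        # Close coefficient vectors suggest a "common" profile across
        # conditions; large differences flag a condition-specific gene.
        print(np.abs(coef_a - coef_b))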

  15. A Case Study of Measuring Process Risk for Early Insights into Software Safety

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.

    2011-01-01

    In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that 49-70% of the 154 hazardous conditions could be caused by software, or software was involved in preventing the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the hazard analysis process itself that may lead to software safety risk.

  16. Numerical abilities in fish: A methodological review.

    PubMed

    Agrillo, Christian; Miletto Petrazzini, Maria Elena; Bisazza, Angelo

    2017-08-01

    The ability to utilize numerical information can be adaptive in a number of ecological contexts including foraging, mating, parental care, and anti-predator strategies. Numerical abilities of mammals and birds have been studied both in natural conditions and in controlled laboratory conditions using a variety of approaches. During the last decade this ability was also investigated in some fish species. Here we review the main methods used to study this group, highlighting the strengths and weaknesses of each method. Fish have only been studied under laboratory conditions, and of the methods used with other species, only two have been systematically used in fish: spontaneous choice tests and discrimination learning procedures. In the former case, the choice between two options is observed in a biologically relevant situation and the degree of preference for the larger/smaller group is taken as a measure of the capacity to discriminate the two quantities (e.g., two shoals differing in number). In discrimination learning tasks, fish are trained to select the larger or the smaller of two sets of abstract objects, typically two-dimensional geometric figures, using food or social companions as reward. Beyond methodological differences, what emerges from the literature is a substantial similarity of the numerical abilities of fish with those of the other vertebrates studied. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Developing a spectroradiometer data uncertainty methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Josh; Vignola, Frank; Habte, Aron

    The proper calibration and measurement uncertainty of spectral data obtained from spectroradiometers is essential in accurately quantifying the output of photovoltaic (PV) devices. PV cells and modules are initially characterized using solar simulators, but field performance is evaluated using natural sunlight. Spectroradiometers are used to measure the spectrum of both these light sources in an effort to understand the spectral dependence of various PV output capabilities. These chains of characterization and measurement are traceable to National Metrology Institutes such as the National Institute of Standards and Technology, and therefore there is a need for a comprehensive uncertainty methodology to determine the accuracy of spectroradiometer data. In this paper, the uncertainties associated with the responsivity of a spectroradiometer are examined using the Guide to the Expression of Uncertainty in Measurement (GUM) protocols. This is first done for a generic spectroradiometer, and then, to illustrate the methodology, the calibration of a LI-COR 1800 spectroradiometer is performed. The reader should be aware that the implementation of this methodology will be specific to the spectroradiometer being analyzed and the experimental setup that is used. Depending on the characteristics of the spectroradiometer being evaluated, additional sources of uncertainty may need to be included, but the general GUM methodology is the same. Several sources of uncertainty are associated with the spectroradiometer responsivity. A major source of uncertainty associated with the LI-COR spectroradiometer is noise in the signal at wavelengths less than 400 nm. At wavelengths above 400 nm, the responsivity can vary drastically, and it depends on the wavelength of light, the temperature, the angle of incidence, and the azimuthal orientation of the sensor to the light source. As a result, the expanded uncertainties in the responsivity of the LI-COR spectroradiometer
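    The GUM arithmetic underlying such a budget is the root-sum-of-squares combination of independent standard uncertainty components, multiplied by a coverage factor for the expanded uncertainty. A generic sketch with hypothetical component values, not the paper's actual budget:

        import numpy as np

        def combined_standard_uncertainty(components):
            """GUM root-sum-of-squares combination of independent standard
            uncertainties (correlation terms omitted for brevity)."""
            u = np.asarray(components, dtype=float)
            return np.sqrt(np.sum(u ** 2))

        # Hypothetical relative uncertainty components (%) for a responsivity
        # calibration: reference lamp, signal noise, alignment, temperature.
        components = [0.8, 0.5, 0.3, 0.2]
        u_c = combined_standard_uncertainty(components)
        U = 2.0 * u_c   # expanded uncertainty, coverage factor k = 2 (~95 %)
        print(f"u_c = {u_c:.2f} %, U(k=2) = {U:.2f} %")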

  18. Developing a spectroradiometer data uncertainty methodology

    DOE PAGES

    Peterson, Josh; Vignola, Frank; Habte, Aron; ...

    2017-04-11

    The proper calibration and measurement uncertainty of spectral data obtained from spectroradiometers is essential in accurately quantifying the output of photovoltaic (PV) devices. PV cells and modules are initially characterized using solar simulators, but field performance is evaluated using natural sunlight. Spectroradiometers are used to measure the spectrum of both these light sources in an effort to understand the spectral dependence of various PV output capabilities. These chains of characterization and measurement are traceable to National Metrology Institutes such as the National Institute of Standards and Technology, and therefore there is a need for a comprehensive uncertainty methodology to determine the accuracy of spectroradiometer data. In this paper, the uncertainties associated with the responsivity of a spectroradiometer are examined using the Guide to the Expression of Uncertainty in Measurement (GUM) protocols. This is first done for a generic spectroradiometer, and then, to illustrate the methodology, the calibration of a LI-COR 1800 spectroradiometer is performed. The reader should be aware that the implementation of this methodology will be specific to the spectroradiometer being analyzed and the experimental setup that is used. Depending on the characteristics of the spectroradiometer being evaluated, additional sources of uncertainty may need to be included, but the general GUM methodology is the same. Several sources of uncertainty are associated with the spectroradiometer responsivity. A major source of uncertainty associated with the LI-COR spectroradiometer is noise in the signal at wavelengths less than 400 nm. At wavelengths above 400 nm, the responsivity can vary drastically, and it depends on the wavelength of light, the temperature, the angle of incidence, and the azimuthal orientation of the sensor to the light source. As a result, the expanded uncertainties in the responsivity of the LI-COR spectroradiometer

  19. Condition-specific quality of life questionnaires for caregivers of children with pediatric conditions: a systematic review.

    PubMed

    Chow, Maria Yui Kwan; Morrow, Angela M; Cooper Robbins, Spring Chenoa; Leask, Julie

    2013-10-01

    Childhood illness or disability can affect the quality of life (QoL) of the child's primary caregiver. Our aim was to identify condition-specific QoL questionnaires for caregivers of children, describe their content, and systematically review their psychometric properties. Medline, PsycInfo, Embase, CINAHL, and the Cochrane Library databases were searched from 1 January 1990 to 30 June 2011. Articles related to the development and measurement of caregiver QoL were screened to identify condition-specific questionnaires. The characteristics of the questionnaires were extracted, and their psychometric properties were evaluated using the consensus-based standards for the selection of health measurement instruments (COSMIN) checklist with a 4-point scale. We identified 25 condition-specific caregiver QoL questionnaires covering 16 conditions. Conditions included atopic dermatitis, asthma, diabetes, oro-facial disorders, and two acute illnesses. Questionnaires were developed predominantly in high-income countries. Questionnaires had the highest quality rating for content validity, followed by hypothesis testing. Methodological quality was satisfactory for criterion validity; fair for reliability and responsiveness; and poor for internal consistency and structural validity. The increasing number of questionnaires developed over time shows improved recognition of the importance of caregiver QoL. There is a paucity of QoL questionnaires for caregivers of otherwise healthy children suffering from physical injuries and acute conditions associated with significant caregiver burden. Cultural validation of existing and new questionnaires in lower-income countries is necessary. Data collected by condition-specific questionnaires can assist clinicians and health economists in estimating caregiver burden and the types of healthcare services caregivers require, and may be useful for healthcare administrators to evaluate interventions.

  20. Methodological considerations in service use assessment for children and youth with mental health conditions; issues for economic evaluation.

    PubMed

    Woolderink, M; Lynch, F L; van Asselt, A D I; Beecham, J; Evers, S M A A; Paulus, A T G; van Schayck, C P

    2015-05-01

    Economic evaluations are increasingly used in decision-making. Accurate measurement of service use is critical to economic evaluation. This qualitative study, based on expert interviews, aims to identify the best approaches to service use measurement for child mental health conditions, and to identify problems in current methods. Results suggest considerable agreement on the strengths (e.g., availability of accurate instruments to measure service use) and weaknesses (e.g., lack of unit prices for services outside the health sector) of alternative approaches to service use measurement. Experts also identified some unresolved problems, for example the lack of uniform definitions for some mental health services.

  1. Measuring fish body condition with or without parasites: does it matter?

    PubMed

    Lagrue, C; Poulin, R

    2015-10-01

    A fish body condition index was calculated twice for each individual fish, including or excluding parasite mass from fish body mass, and index values were compared to test the effects of parasite mass on measurement of body condition. Potential correlations between parasite load and the two alternative fish condition index values were tested to assess how parasite mass may influence the perception of the actual effects of parasitism on fish body condition. Helminth parasite mass was estimated in common bully Gobiomorphus cotidianus from four New Zealand lakes and used to assess the biasing effects of parasite mass on body condition indices. Results showed that the inclusion or exclusion of parasite mass from fish body mass in index calculations significantly influenced correlation patterns between parasite load and fish body condition indices. When parasite mass was included, there was a positive correlation between parasite load and fish body condition, seemingly indicating that fish in better condition supported higher parasite loads. When parasite mass was excluded, there was no correlation between parasite load and fish body condition, i.e. there was no detectable effect of helminth parasites on fish condition or fish condition on parasite load. Fish body condition tended to be overestimated when parasite mass was not accounted for; results showed a positive correlation between relative parasite mass and the degree to which individual fish condition was overestimated. Regardless of the actual effects of helminth parasites on fish condition, parasite mass contained within a fish should be taken into account when estimating fish condition. Parasite tissues are not host tissues and should not be included in fish mass when calculating a body condition index, especially when looking at potential effects of helminth infections on fish condition. © 2015 The Fisheries Society of the British Isles.
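    The abstract does not name the specific index used, but the biasing mechanism is easy to see with any mass-based index. A minimal sketch using Fulton's condition factor (K = 100·W/L³) as a stand-in, with hypothetical masses and lengths:

        def fulton_k(mass_g, length_cm):
            """Fulton's condition factor K = 100 * W / L^3 (a common index;
            the study's actual index is not specified in the abstract)."""
            return 100.0 * mass_g / length_cm ** 3

        fish_mass, parasite_mass, length = 8.4, 0.6, 7.5   # hypothetical values

        k_with = fulton_k(fish_mass, length)                     # parasites included
        k_without = fulton_k(fish_mass - parasite_mass, length)  # host tissue only
        print(f"K with parasites: {k_with:.3f}, without: {k_without:.3f}")
        # The gap grows with relative parasite load, which is how condition
        # gets overestimated when parasite mass is not accounted for.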

  2. The ethics of placebo-controlled trials: methodological justifications.

    PubMed

    Millum, Joseph; Grady, Christine

    2013-11-01

    The use of placebo controls in clinical trials remains controversial. Ethical analysis and international ethical guidance permit the use of placebo controls in randomized trials when scientifically indicated in four cases: (1) when there is no proven effective treatment for the condition under study; (2) when withholding treatment poses negligible risks to participants; (3) when there are compelling methodological reasons for using placebo, and withholding treatment does not pose a risk of serious harm to participants; and, more controversially, (4) when there are compelling methodological reasons for using placebo, and the research is intended to develop interventions that can be implemented in the population from which trial participants are drawn, and the trial does not require participants to forgo treatment they would otherwise receive. The concept of methodological reasons is essential to assessing the ethics of placebo controls in these controversial last two cases. This article sets out key considerations relevant to considering whether methodological reasons for a placebo control are compelling. © 2013.

  3. Enhanced methodology of focus control and monitoring on scanner tool

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.

    2017-03-01

    As technology nodes shrink from 14 nm to 7 nm, reliable tool-monitoring techniques become more critical for advanced semiconductor fabs to achieve high yield and quality. Tool-health monitoring methods involve periodic sampling of moderately processed test wafers to check for particles, defects, and tool stability. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool-health monitoring methods include running BaseLiner periodically to ensure proper tool stability. The focus measurement on YIELDSTAR, by real-time or library-based reconstruction of critical dimensions (CD) and side-wall angle (SWA), has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provide a common reference for scanner setup and user processes. To further improve metrology and matching performance, Diffraction-Based Focus (DBF) metrology, enabling accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring and control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimized dose crosstalk, dynamic precision, set-get residual, and lens-aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared with the previous BaseLiner methodology. Matching <2.4 nm across multiple NXT immersion scanners has been achieved with the new methodology of set baseline reference. This baseline technique, with either the conventional BaseLiner low-numerical-aperture mode (NA = 1.20) or the advanced-illumination high-NA mode (NA = 1.35), has also been evaluated to have consistent performance. This enhanced methodology of focus

  4. Evaluating indices that measure departure of current landscape composition from historical conditions

    Treesearch

    Robert E. Keane; Lisa Holsinger; Russell A. Parsons

    2011-01-01

    A measure of the degree of departure of a landscape from its range of historical conditions can provide a means for prioritizing and planning areas for restoration treatments. There are few statistics or indices that provide a quantitative context for measuring departure across landscapes. This study evaluated a set of five similarity indices commonly used in...

  5. A comparison of various approaches to evaluating erosion risks and designing erosion control measures

    NASA Astrophysics Data System (ADS)

    Kapicka, Jiri

    2015-04-01

    At present, the Czech Republic uses a single methodology to compute and compare erosion risks, which also includes a method for designing erosion control measures. It is based on the Universal Soil Loss Equation (USLE) and its result, the long-term average annual rate of erosion (G), and is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show that many problems and damages arise from local erosion episodes. The extent of these events and their impact depend on local precipitation, the current crop growth stage, and soil conditions. Such erosion events can damage agricultural land, municipal property, and hydrological infrastructure even in locations whose long-term average annual erosion rate is acceptable. An alternative way to compute and compare erosion risks is an episode-based approach. This paper presents a comparison of various approaches to computing erosion risks. The comparison was carried out for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events had been recorded. The study area is a simple agricultural parcel without barriers that could strongly influence water flow and sediment transport. The erosion-risk computations (for all methodologies) were based on laboratory analyses of soil samples taken from the study area. Results of the USLE and MUSLE methodologies and of the mathematical model Erosion 3D were compared. Differences in the spatial distribution of the places with the highest soil erosion were compared and discussed. A further part presents differences in the designed erosion control measures when the design is based on the different methodologies. The results show the variance in computed erosion risks obtained by the different methodologies. These variances can open a discussion about the different approaches to computing and evaluating erosion risks in such areas.
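    The USLE at the core of the Czech methodology is a simple factor product, A = R·K·LS·C·P; episode-based approaches differ precisely in replacing this long-term average with event-scale inputs. A sketch with hypothetical factor values:

        def usle_soil_loss(R, K, LS, C, P):
            """Universal Soil Loss Equation: long-term average annual soil
            loss A = R * K * LS * C * P (units depend on the factor system;
            SI-style factors give t/ha/yr)."""
            return R * K * LS * C * P

        # Hypothetical factor values for an agricultural parcel
        G = usle_soil_loss(R=40.0,   # rainfall erosivity
                           K=0.4,    # soil erodibility
                           LS=1.2,   # slope length/steepness
                           C=0.25,   # cover management
                           P=1.0)    # support practices (none)
        print(f"G ~ {G:.1f} t/ha/yr")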

  6. Temperature Distribution Measurement of The Wing Surface under Icing Conditions

    NASA Astrophysics Data System (ADS)

    Isokawa, Hiroshi; Miyazaki, Takeshi; Kimura, Shigeo; Sakaue, Hirotaka; Morita, Katsuaki; Japan Aerospace Exploration Agency Collaboration; Univ of Notre Dame Collaboration; Kanagawa Institute of Technology Collaboration; Univ of Electro-(UEC) Team, Comm

    2016-11-01

    De- or anti-icing systems are necessary for safe aircraft flight operations. Icing is a phenomenon caused by supercooled water droplets freezing on impact with an object. In-flight icing can change the wing cross-section, causing stall and, in the worst case, loss of the aircraft. It is therefore important to know the surface temperature of the wing for de- or anti-icing systems. In the aerospace field, temperature-sensitive paint (TSP) has been widely used for obtaining the surface temperature distribution on a test article; the luminescent image from the TSP can be related to the temperature distribution (the TSP measurement system). In an icing wind tunnel, we measured the surface temperature distribution of a wing model using the TSP measurement system. The effect of icing conditions on the TSP measurement system is discussed.

  7. Mathematical simulation of power conditioning systems. Volume 1: Simulation of elementary units. Report on simulation methodology

    NASA Technical Reports Server (NTRS)

    Prajous, R.; Mazankine, J.; Ippolito, J. C.

    1978-01-01

    Methods and algorithms used for the simulation of elementary power conditioning units (buck, boost, and buck-boost, as well as shunt PWM) are described. Definitions are given of similar converters and reduced parameters. The various parts of the simulation are dealt with: local stability, corrective networks, measurement of input-output impedances, and global stability. A simulation example is given.

  8. Measuring and modeling maize evapotranspiration under plastic film-mulching condition

    NASA Astrophysics Data System (ADS)

    Li, Sien; Kang, Shaozhong; Zhang, Lu; Ortega-Farias, Samuel; Li, Fusheng; Du, Taisheng; Tong, Ling; Wang, Sufen; Ingman, Mark; Guo, Weihua

    2013-10-01

    Plastic film-mulching techniques have been widely used over a variety of agricultural crops for saving water and improving yield. Accurate estimation of crop evapotranspiration (ET) under the film-mulching condition is critical for optimizing crop water management. After taking the mulching effect on soil evaporation (Es) into account, our study adjusted the original Shuttleworth-Wallace model (MSW) for estimating maize ET and Es under the film-mulching condition. Maize ET and Es, measured respectively by eddy covariance and micro-lysimeter methods during 2007 and 2008, were used to validate the performance of the Penman-Monteith (PM), the original Shuttleworth-Wallace (SW), and the MSW models in arid northwest China. Results indicate that all three models significantly overestimated ET during the initial crop stage in both years, which may be due to underestimation of the canopy resistance by the Jarvis model under the drought stress of that stage. For the entire experimental period, the SW model overestimated half-hourly maize ET by 17% compared with the eddy covariance method (ETEC) and overestimated daily Es by 241% compared with the micro-lysimeter measurements (EL), while the PM model only underestimated daily maize ET by 6%, and the MSW model only underestimated half-hourly maize ET by 2% and Es by 7% over the whole period. Thus the PM and MSW models significantly improved the accuracy relative to the original SW model and can be used to estimate ET and Es under the film-mulching condition.
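    For reference, the Penman-Monteith family of models reduces, in its FAO-56 reference form, to a single closed-form expression; the study's PM and MSW variants add crop- and mulch-specific resistances on top of this structure. A sketch of the standard FAO-56 form with hypothetical daily inputs, not the study's calibrated model:

        def fao56_penman_monteith(Rn, G, T, u2, es, ea, delta, gamma):
            """FAO-56 reference evapotranspiration (mm/day).

            Rn, G        : net radiation, soil heat flux [MJ m-2 day-1]
            T            : mean air temperature [deg C]
            u2           : wind speed at 2 m [m s-1]
            es, ea       : saturation / actual vapour pressure [kPa]
            delta, gamma : slope of the vapour-pressure curve and the
                           psychrometric constant [kPa per deg C]
            """
            num = (0.408 * delta * (Rn - G)
                   + gamma * (900.0 / (T + 273.0)) * u2 * (es - ea))
            den = delta + gamma * (1.0 + 0.34 * u2)
            return num / den

        # Hypothetical arid-site summer day (~6 mm/day expected)
        print(fao56_penman_monteith(Rn=15.0, G=1.0, T=25.0, u2=2.0,
                                    es=3.17, ea=1.20, delta=0.189, gamma=0.066))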

  9. Detection of Chamber Conditioning Through Optical Emission and Impedance Measurements

    NASA Technical Reports Server (NTRS)

    Cruden, Brett A.; Rao, M. V. V. S.; Sharma, Surendra P.; Meyyappan, Meyya

    2001-01-01

    During oxide etch processes, buildup of fluorocarbon residues on reactor sidewalls can cause run-to-run drift and will necessitate some time for conditioning and seasoning of the reactor. Though diagnostics can be applied to study and understand these phenomena, many of them are not practical for use in an industrial reactor. For instance, measurements of ion fluxes and energy by mass spectrometry show that the buildup of insulating fluorocarbon films on the reactor surface will cause a shift in both ion energy and current in an argon plasma. However, such a device cannot be easily integrated into a processing system. The shift in ion energy and flux will be accompanied by an increase in the capacitance of the plasma sheath. The shift in sheath capacitance can be easily measured by a common commercially available impedance probe placed on the inductive coil. A buildup of film on the chamber wall is expected to affect the production of fluorocarbon radicals, and thus the presence of such species in the optical emission spectrum of the plasma can be monitored as well. These two techniques are employed on a GEC (Gaseous Electronics Conference) Reference Cell to assess the validity of optical emission and impedance monitoring as a metric of chamber conditioning. These techniques are applied to experimental runs with CHF3 and CHF3/O2/Ar plasmas, with intermediate monitoring of pure argon plasmas as a reference case for chamber conditions.

  10. Improving Training in Methodology Enriches the Science of Psychology

    ERIC Educational Resources Information Center

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2009-01-01

    Replies to the comment "Ramifications of increased training in quantitative methodology" by Herbert Zimiles on the current authors' original article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America". The…

  11. Measuring alterations in oscillatory brain networks in schizophrenia with resting-state MEG: State-of-the-art and methodological challenges.

    PubMed

    Alamian, Golnoush; Hincapié, Ana-Sofía; Pascarella, Annalisa; Thiery, Thomas; Combrisson, Etienne; Saive, Anne-Lise; Martel, Véronique; Althukov, Dmitrii; Haesebaert, Frédéric; Jerbi, Karim

    2017-09-01

    Neuroimaging studies provide evidence of disturbed resting-state brain networks in schizophrenia (SZ). However, untangling the neuronal mechanisms that subserve these baseline alterations requires measurement of their electrophysiological underpinnings. This systematic review specifically investigates the contributions of resting-state magnetoencephalography (MEG) in elucidating abnormal neural organization in SZ patients. A systematic literature review of resting-state MEG studies in SZ was conducted. This literature is discussed in relation to findings from resting-state fMRI and EEG, as well as task-based MEG research in the SZ population. Importantly, methodological limitations are considered and recommendations to overcome current limitations are proposed. The resting-state MEG literature in SZ points towards altered local and long-range oscillatory network dynamics in various frequency bands. Critical methodological challenges with respect to experiment design, data collection, and analysis need to be taken into consideration. Spontaneous MEG data show that local and global neural organization is altered in SZ patients. MEG is a highly promising tool for filling knowledge gaps about the neurophysiology of SZ, but to reach its fullest potential, basic methodological challenges need to be overcome. MEG-based resting-state power and connectivity findings could be great assets to clinical and translational research in psychiatry, and in SZ in particular. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  12. Radiometric ratio characterization for low-to-mid CPV modules operating in variable irradiance conditions

    NASA Astrophysics Data System (ADS)

    Vorndran, Shelby; Russo, Juan; Zhang, Deming; Gordon, Michael; Kostuk, Raymond

    2012-10-01

    In this work, a concentrating photovoltaic (CPV) design methodology is proposed which aims to maximize system efficiency for a given irradiance condition. In this technique, the acceptance angle of the system is radiometrically matched to the angular spread of the site's average irradiance conditions using a simple geometric ratio. The optical efficiency of CPV systems from flat-plate to high-concentration is plotted for all irradiance conditions. Concentrator systems were measured outdoors in various irradiance conditions to test the methodology. This modeling technique is valuable at the design stage for determining the ideal level of concentration for a CPV module. It requires only two inputs: the acceptance-angle profile of the system and the site's average direct and diffuse irradiance fractions. The acceptance angle can be determined by ray tracing or by testing a fabricated prototype in the lab with a solar simulator. The average irradiance conditions can be found in the Typical Meteorological Year (TMY3) database. Additionally, the information gained from this technique can be used to determine tracking tolerance, quantify power loss during an isolated weather event, and perform more sophisticated analyses such as I-V curve simulation.
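    One way to see the radiometric-matching idea is as a weighting of the module's collection efficiency by the site's direct and diffuse fractions: a high-concentration optic collects the beam component within its acceptance angle but very little diffuse light. The sketch below is a deliberate simplification of the paper's geometric ratio, with all numbers hypothetical:

        def weighted_optical_efficiency(f_direct, eta_direct, eta_diffuse):
            """Site-weighted collection efficiency: the direct beam is
            collected at the concentrator's on-axis efficiency, diffuse
            light at its (much lower) wide-angle efficiency. A simplified
            stand-in for the radiometric ratio in the abstract."""
            f_diffuse = 1.0 - f_direct
            return f_direct * eta_direct + f_diffuse * eta_diffuse

        # Hypothetical sunny site (TMY3-style fractions) and module values
        print(weighted_optical_efficiency(f_direct=0.75,
                                          eta_direct=0.85,   # within acceptance
                                          eta_diffuse=0.10)) # outside acceptance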

  13. A new methodology for the measurement of the root canal curvature and its 3D modification after instrumentation.

    PubMed

    Christodoulou, Asterios; Mikrogeorgis, Georgios; Vouzara, Triantafillia; Papachristou, Konstantinos; Angelopoulos, Christos; Nikolaidis, Nikolaos; Pitas, Ioannis; Lyroudia, Kleoniki

    2018-02-15

    In this study, the three-dimensional (3D) modification of root canal curvature was measured after the application of the Reciproc instrumentation technique, using cone beam computed tomography (CBCT) imaging and a special algorithm developed for the 3D measurement of root canal curvature. Thirty extracted upper molars were selected and digital radiographs were taken of each tooth. Root curvature was measured using the Schneider method, and the roots were divided into three groups of 10 according to their curvature: Group 1 (0°-20°), Group 2 (21°-40°), Group 3 (41°-60°). CBCT imaging was applied to each tooth before and after instrumentation, and the data were examined using a specially developed CBCT image-analysis algorithm. Instrumentation with Reciproc decreased the curvature by 30.23% on average across all groups. The proposed methodology proved able to measure the curvature of the root canal and its 3D modification after instrumentation.
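    The Schneider measurement used for the initial grouping is the angle between the coronal canal axis and the line from the point of deviation to the apex. A 2-D sketch of that computation (the paper's algorithm extends curvature measurement to 3-D CBCT data; the coordinates here are hypothetical):

        import numpy as np

        def schneider_angle(coronal_pt, deviation_pt, apex_pt):
            """Schneider-style curvature: the angle, in degrees, between the
            coronal canal axis (coronal point -> deviation point) and the
            line from the deviation point to the apex. Points are 2-D image
            coordinates; a 2-D simplification of the paper's 3-D measure."""
            v1 = np.asarray(deviation_pt, float) - np.asarray(coronal_pt, float)
            v2 = np.asarray(apex_pt, float) - np.asarray(deviation_pt, float)
            cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
            return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

        # Hypothetical landmark coordinates from a digital radiograph (mm)
        print(f"{schneider_angle((0, 0), (0, 8), (3, 13)):.1f} degrees")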

  14. System for Measuring Conditional Amplitude, Phase, or Time Distributions of Pulsating Phenomena

    PubMed Central

    Van Brunt, Richard J.; Cernyar, Eric W.

    1992-01-01

    A detailed description is given of an electronic stochastic analyzer for use with direct “real-time” measurements of the conditional distributions needed for a complete stochastic characterization of pulsating phenomena that can be represented as random point processes. The measurement system described here is designed to reveal and quantify effects of pulse-to-pulse or phase-to-phase memory propagation. The unraveling of memory effects is required so that the physical basis for observed statistical properties of pulsating phenomena can be understood. The individual unique circuit components that comprise the system and the combinations of these components for various measurements, are thoroughly documented. The system has been applied to the measurement of pulsating partial discharges generated by applying alternating or constant voltage to a discharge gap. Examples are shown of data obtained for conditional and unconditional amplitude, time interval, and phase-of-occurrence distributions of partial-discharge pulses. The results unequivocally show the existence of significant memory effects as indicated, for example, by the observations that the most probable amplitudes and phases-of-occurrence of discharge pulses depend on the amplitudes and/or phases of the preceding pulses. Sources of error and fundamental limitations of the present measurement approach are analyzed. Possible extensions of the method are also discussed. PMID:28053450
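    The memory effects the analyzer is built to reveal can be illustrated offline: accumulate the distribution of pulse amplitudes conditional on the preceding amplitude and compare it with the unconditional distribution. A toy surrogate with an artificially correlated pulse train follows; this is invented data, not output from the instrument:

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy partial-discharge pulse train with memory: each amplitude is
        # correlated with the previous one (hypothetical AR(1) surrogate).
        n = 20000
        amp = np.empty(n)
        amp[0] = 1.0
        for i in range(1, n):
            amp[i] = 0.6 * amp[i - 1] + rng.gamma(2.0, 0.2)

        # Conditional distribution p(A_i | A_{i-1} in a band): the kind of
        # quantity the hardware analyzer accumulates in real time.
        prev, curr = amp[:-1], amp[1:]
        band = (prev > 1.0) & (prev < 1.2)
        hist, edges = np.histogram(curr[band], bins=30, density=True)
        peak = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
        print(f"most probable amplitude given previous in (1.0, 1.2): {peak:.2f}")
        print(f"unconditional mean: {curr.mean():.2f}, "
              f"conditional mean: {curr[band].mean():.2f}")
        # A difference between the two statistics is the signature of memory.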

  15. Measuring Value Added in Higher Education: A Proposed Methodology for Developing a Performance Indicator Based on the Economic Value Added to Graduates

    ERIC Educational Resources Information Center

    Rodgers, Timothy

    2007-01-01

    The 2003 UK higher education White Paper suggested that the sector needed to re-examine the potential of the value added concept. This paper describes a possible methodology for developing a performance indicator based on the economic value added to graduates. The paper examines how an entry-quality-adjusted measure of a graduate's…

  16. Relating the 2010 signalized intersection methodology to alternate approaches in the context of NYC conditions.

    DOT National Transportation Integrated Search

    2013-11-01

    The Highway Capacity Manual (HCM) has had a delay-based level of service methodology for signalized intersections since 1985. : The 2010 HCM has revised the method for calculating delay. This happened concurrent with such jurisdictions as NYC reviewi...

  17. Leaf wound induced ultraweak photon emission is suppressed under anoxic stress: Observations of Spathiphyllum under aerobic and anaerobic conditions using novel in vivo methodology.

    PubMed

    Oros, Carl L; Alves, Fabio

    2018-01-01

    Plants have evolved a variety of means to energetically sense and respond to abiotic and biotic environmental stress. Two typical photochemical signaling responses involve the emission of volatile organic compounds and light. The emission of certain leaf-wound volatiles and of light is dependent upon oxygen, which is required for the wound-induced lipoxygenase reactions that trigger the formation of fatty acid hydroperoxides, ultimately leading to photon emission by chlorophyll molecules. A low-noise photomultiplier with sensitivity in the visible spectrum (300-720 nm) was used to continuously measure long-duration ultraweak photon emission of dark-adapting whole Spathiphyllum leaves (in vivo). Leaves were mechanically wounded after two hours of dark adaptation in aerobic and anaerobic conditions. It was found that (1) nitrogen incubation did not affect the pre-wound basal photocounts; (2) wound-induced leaf biophoton emission was significantly suppressed under anoxic stress; and (3) the aerobic wound-induced emission spectrum observed was > 650 nm, implicating chlorophyll as the likely emitter. Limitations of the PMT photocathode's radiant sensitivity, however, prevented accurate analysis from 700-720 nm. Further examination of leaf-wounding photon-count profiles revealed that the pre-wounding basal state (aerobic and anoxic), the anoxic wounding state, and the post-wounding aerobic state statistics all approximate a Poisson distribution. It was additionally observed that aerobic wounding induces two distinct exponential decay events. These observations contribute to the body of plant wound-induced luminescence research and provide a novel methodology to measure this phenomenon in vivo.

  18. Innovative Methodologies for thermal Energy Release Measurement: case of La Solfatara volcano (Italy)

    NASA Astrophysics Data System (ADS)

    Marfe`, Barbara; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Marotta, Enrica; Peluso, Rosario

    2015-04-01

    This work is devoted to improving knowledge of the parameters that control the heat-flux anomalies associated with the diffuse degassing processes of volcanic and hydrothermal areas. The methodologies currently used to measure heat flux (i.e., CO2 flux or temperature gradient) are either inefficient or ineffective, and are unable to detect short- to medium-term (days to months) variation trends in the heat flux. A new method, based on the use of thermal imaging cameras, has been applied to estimate the heat flux and its time variations. This approach allows faster heat-flux measurement than the already accredited methods, improving the definition of the activity state of a volcano and allowing better assessment of the related hazard and risk mitigation. The idea is to extrapolate the heat flux from the ground surface temperature which, in a purely conductive regime, is directly correlated to the shallow temperature gradient. We use thermal imaging cameras at short distances (meters to hundreds of meters) to quickly map areas with thermal anomalies and measure their temperature. Preliminary studies have been carried out throughout the whole of the La Solfatara crater to investigate a possible correlation between the surface temperature and the shallow thermal gradient. We used a FLIR SC640 thermal camera and K-type thermocouples to make the two measurements at the same time. Results suggest a good correlation between the shallow temperature gradient ΔTs and the background-corrected surface temperature Ts, and although the campaigns took place over a period of a few years, this correlation seems to be stable over time. This is an encouraging result for the further development of a measurement method based only on the use of a short-range thermal imaging camera. Surveys with thermal cameras may be done manually, using a tripod to take thermal images of small contiguous areas and then joining

  19. Fluorescent nanosensors for intracellular measurements: synthesis, characterization, calibration, and measurement

    PubMed Central

    Desai, Arpan S.; Chauhan, Veeren M.; Johnston, Angus P. R.; Esler, Tim; Aylott, Jonathan W.

    2013-01-01

    Measurement of intracellular acidification is important for understanding fundamental biological pathways as well as developing effective therapeutic strategies. Fluorescent pH nanosensors are an enabling technology for real-time monitoring of intracellular acidification. The physicochemical characteristics of nanosensors can be engineered to target specific cellular compartments and respond to external stimuli. Therefore, nanosensors represent a versatile approach for probing biological pathways inside cells. The fundamental components of nanosensors comprise a pH-sensitive fluorophore (signal transducer) and a pH-insensitive reference fluorophore (internal standard) immobilized in an inert non-toxic matrix. The inert matrix prevents interference of cellular components with the sensing elements as well as minimizing potentially harmful effects of some fluorophores on cell function. Fluorescent nanosensors are synthesized using standard laboratory equipment and are detectable by non-invasive widely accessible imaging techniques. The outcomes of studies employing this technology are dependent on reliable methodology for performing measurements. In particular, special consideration must be given to conditions for sensor calibration, uptake conditions and parameters for image analysis. We describe procedures for: (1) synthesis and characterization of polyacrylamide and silica based nanosensors, (2) nanosensor calibration and (3) performing measurements using fluorescence microscopy. PMID:24474936

  20. Level-Set Methodology on Adaptive Octree Grids

    NASA Astrophysics Data System (ADS)

    Gibou, Frederic; Guittet, Arthur; Mirzadeh, Mohammad; Theillard, Maxime

    2017-11-01

    Numerical simulations of interfacial problems in fluids require a methodology capable of tracking surfaces that can undergo changes in topology and of imposing jump boundary conditions in a sharp manner. In this talk, we will discuss recent advances in the level-set framework, in particular one based on adaptive octree grids.

  1. Measurement of Workforce Readiness Competencies: Design of Prototype Measures.

    ERIC Educational Resources Information Center

    O'Neil, Harold F., Jr.; And Others

    A general methodology approach is suggested for measurement of workforce readiness competencies in the context of overall work by the National Center for Research on Evaluation, Standards, and Student Testing on the domain-independent measurement of workforce readiness skills. The methodology consists of 14 steps, from the initial selection of a…

  2. New methodology to reconstruct in 2-D the cuspal enamel of modern human lower molars.

    PubMed

    Modesto-Mata, Mario; García-Campos, Cecilia; Martín-Francés, Laura; Martínez de Pinillos, Marina; García-González, Rebeca; Quintino, Yuliet; Canals, Antoni; Lozano, Marina; Dean, M Christopher; Martinón-Torres, María; Bermúdez de Castro, José María

    2017-08-01

    In recent years, different methodologies have been developed to reconstruct worn teeth. In this article, we propose a new 2-D methodology to reconstruct the worn enamel of lower molars. Our main goals are to reconstruct molars with a high level of accuracy when measuring relevant histological variables, and to validate the methodology by calculating the errors associated with the measurements. The methodology is based on polynomial regression equations and has been validated using two different dental variables: cuspal enamel thickness and crown height of the protoconid. For the validation process, simulated worn modern human molars were employed. The errors associated with the measurements were also estimated by applying methodologies previously proposed by other authors. The mean percentage error estimated in reconstructed molars for these two variables, in comparison with their real values, is -2.17% for the cuspal enamel thickness of the protoconid and -3.18% for the crown height of the protoconid. This error significantly improves on the results of other methodologies, both in interobserver error and in measurement accuracy. The new methodology based on polynomial regressions can be confidently applied to the reconstruction of cuspal enamel of lower molars, as it improves the accuracy of the measurements and reduces interobserver error. The present study shows that it is important to validate all methodologies in order to know their associated errors. This new methodology can easily be exported to other modern human populations, the human fossil record, and forensic sciences. © 2017 Wiley Periodicals, Inc.
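    The general idea (fit a regression to the preserved enamel outline, then extrapolate across the worn span) can be sketched briefly. The published methodology rests on regression equations derived from a reference sample; the degree-3 fit and the coordinates below are arbitrary illustrations, not those equations:

        import numpy as np

        def reconstruct_cusp(profile_x, profile_y, x_missing, degree=3):
            """Fit a polynomial to the preserved (unworn) enamel outline and
            extrapolate it across the worn span. Degree 3 is an arbitrary
            choice for illustration, not the published value."""
            coeffs = np.polyfit(profile_x, profile_y, degree)
            return np.polyval(coeffs, x_missing)

        # Hypothetical outline coordinates (mm): preserved lateral enamel
        x_kept = np.array([0.0, 0.4, 0.8, 1.2, 3.0, 3.4, 3.8, 4.2])
        y_kept = np.array([0.0, 1.1, 1.9, 2.5, 2.6, 2.0, 1.2, 0.1])
        x_worn = np.linspace(1.6, 2.6, 5)        # worn cuspal region
        print(reconstruct_cusp(x_kept, y_kept, x_worn))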

  3. Thrust Measurement of Dielectric Barrier Discharge (DBD) Plasma Actuators: New Anti-Thrust Hypothesis, Frequency Sweeps Methodology, Humidity and Enclosure Effects

    NASA Technical Reports Server (NTRS)

    Ashpis, David E.; Laun, Matthew C.

    2014-01-01

    We discuss thrust measurements of Dielectric Barrier Discharge (DBD) plasma actuator devices used for aerodynamic active flow control. After a review of our experience with conventional thrust measurement and the significant non-repeatability of its results, we devised a suspended-actuator test setup and now present a methodology of thrust measurement with decreased uncertainty. The methodology consists of frequency scans at constant voltages. The procedure consists of increasing the frequency in a step-wise fashion from several Hz to the maximum frequency of several kHz, followed by a frequency decrease back down to the start frequency of several Hz. This sequence is performed first at the highest voltage of interest, then repeated at lower voltages. The data in the descending frequency direction are more consistent and were selected for reporting. Sample results show a strong dependence of thrust on humidity, which also affects the consistency and fluctuations of the measurements. We also observed negative values of thrust, or "anti-thrust", at low frequencies from 4 Hz up to 64 Hz. The anti-thrust is proportional to the mean-squared voltage and is frequency independent. Departures from the parabolic anti-thrust curve are correlated with the appearance of visible plasma discharges. We propose the anti-thrust hypothesis. It states that the measured thrust is a sum of plasma thrust and anti-thrust, and assumes that the anti-thrust exists at all frequencies and voltages. The anti-thrust depends on actuator geometry and materials and on the test installation. It enables the separation of the plasma thrust from the measured total thrust. This approach enables more meaningful comparisons between actuators at different installations and laboratories. The dependence on test installation was validated by surrounding the actuator with a grounded large-diameter metal sleeve. Strong dependence on humidity is also shown; the thrust significantly increased with decreasing humidity, e
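    Under the anti-thrust hypothesis, the plasma component is recovered by fitting the frequency-independent, voltage-squared anti-thrust term on low-frequency data (where no visible plasma forms) and subtracting it from the measured total. A sketch with hypothetical numbers:

        import numpy as np

        def plasma_thrust(total_thrust, voltage, antithrust_coeff):
            """Anti-thrust hypothesis: measured = plasma thrust - c * V^2,
            with the anti-thrust term present at all frequencies; adding
            the fitted c * V^2 back recovers the plasma component."""
            return total_thrust + antithrust_coeff * voltage ** 2

        # Estimate c from low-frequency scans (< ~64 Hz), where measured
        # thrust is pure anti-thrust (all values hypothetical).
        v_low = np.array([6.0, 8.0, 10.0, 12.0])          # kV
        thrust_low = np.array([-0.9, -1.6, -2.5, -3.6])   # mN (negative)
        c = -np.polyfit(v_low ** 2, thrust_low, 1)[0]     # slope of T vs V^2

        print(f"anti-thrust coefficient ~ {c:.4f} mN/kV^2")
        print(plasma_thrust(total_thrust=2.0, voltage=10.0, antithrust_coeff=c))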

  4. Comparison of fungal spores concentrations measured with wideband integrated bioaerosol sensor and Hirst methodology

    NASA Astrophysics Data System (ADS)

    Fernández-Rodríguez, S.; Tormo-Molina, R.; Lemonis, N.; Clot, B.; O'Connor, D. J.; Sodeau, John R.

    2018-02-01

    The aim of this work was to provide both a comparison of traditional and novel methodologies for airborne spore detection (the Hirst Burkard trap and the WIBS-4) and the first quantitative study of airborne fungal concentrations in Payerne (western Switzerland), as well as their relation to meteorological parameters. With the traditional method (Hirst trap and microscope analysis), sixty-three propagule types (spores, sporangia, and hyphae) were identified, and the average spore concentration measured over the full period was 4145 ± 263.0 spores/m3. Maximum values were reached on July 19th and August 6th. Twenty-six spore types reached average levels above 10 spores/m3. Airborne fungal propagules in Payerne showed a clear seasonal pattern, increasing from low values in early spring to maxima in summer. Daily average concentrations above 5000 spores/m3 were almost constant in summer from mid-June onwards. Weather parameters played a relevant role in determining the observed spore concentrations. Coniferous forest, dominant in the surroundings, may be a relevant source of airborne fungal propagules, as their distribution and the predominant wind directions are consistent with this origin. The two methodologies showed remarkably consistent patterns throughout the campaign, with a correlation coefficient of 0.9 (CI 0.76-0.96) between the two instruments (Hirst trap and WIBS-4) at daily resolution. This apparent co-linearity fell away at higher temporal resolution. However, at higher resolutions, once Cladosporium species were removed from the total fungal concentrations (Hirst trap), an increased correlation coefficient was again noted between the two instruments (R = 0.81, with confidence intervals of 0.74 and 0.86).

  5. Health economic assessment: a methodological primer.

    PubMed

    Simoens, Steven

    2009-12-01

    This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for those readers who wish a more advanced understanding of health economic assessments.

  6. Health Economic Assessment: A Methodological Primer

    PubMed Central

    Simoens, Steven

    2009-01-01

    This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for those readers who wish a more advanced understanding of health economic assessments. PMID:20049237

  7. Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process

    NASA Astrophysics Data System (ADS)

    Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.

    2015-08-01

    An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model, which was developed within the commercial finite element package ABAQUS™ (ABAQUS is a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library SciPy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to improvements in process productivity and product quality.
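    The optimization wiring described above can be sketched with SciPy: a surrogate stands in for the finite-element casting model, an inequality constraint encodes directional solidification, and the objective maximizes cooling rate. The surrogate function and all numbers are invented; only the structure mirrors the methodology:

        import numpy as np
        from scipy.optimize import minimize

        def surrogate_model(start_times):
            """Stand-in for the ABAQUS casting model: maps two cooling-channel
            start times (s) to a cooling rate and a directional-solidification
            margin. The real values come from finite-element results."""
            cooling_rate = 10.0 - 0.05 * np.sum((start_times - 20.0) ** 2)
            directional_margin = start_times[1] - start_times[0]  # hub first
            return cooling_rate, directional_margin

        def objective(start_times):
            rate, _ = surrogate_model(start_times)
            return -rate                      # maximize cooling rate

        constraints = [{"type": "ineq",       # enforce directional solidification
                        "fun": lambda s: surrogate_model(s)[1]}]

        result = minimize(objective, x0=np.array([15.0, 30.0]),
                          bounds=[(0.0, 60.0), (0.0, 60.0)],
                          constraints=constraints, method="SLSQP")
        print(result.x, -result.fun)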

  8. Influence of activated carbon characteristics on toluene and hexane adsorption: Application of surface response methodology

    NASA Astrophysics Data System (ADS)

    Izquierdo, Mª Teresa; de Yuso, Alicia Martínez; Valenciano, Raquel; Rubio, Begoña; Pino, Mª Rosa

    2013-01-01

    The objective of this study was to evaluate the toluene and hexane adsorption capacity of activated carbons prepared according to an experimental design, considering as variables the activation temperature, the impregnation ratio, and the activation time. Response surface methodology was applied to optimize the adsorption capacity of the carbons with respect to the preparation conditions that determine the physicochemical characteristics of the activated carbons. The preparation method produced activated carbons with surface areas and micropore volumes as high as 1128 m2/g and 0.52 cm3/g, respectively. The activated carbons also exhibit mesoporosity, with the percentage of microporosity ranging from 64.6% to 89.1%. The surface chemistry was characterized by TPD, FTIR, and acid-base titration; the techniques yielded different values for the surface groups, owing to the limitations of each technique, but similar trends across the activated carbons studied. The exhaustive characterization of the activated carbons shows that the measured surface area alone does not explain the adsorption capacity for either toluene or n-hexane, and neither does the surface chemistry alone. A compromise between physical and chemical characteristics can be obtained from appropriate activation conditions, and response surface methodology identifies the optimal activated carbon for maximizing adsorption capacity. Low activation temperatures and intermediate impregnation ratios lead to high toluene and n-hexane adsorption capacities depending on the activation time, which is a determining factor in maximizing toluene adsorption.
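    Response surface methodology of this kind fits a quadratic model over coded design variables and locates the optimum on the fitted surface. A self-contained sketch with a central composite design and simulated (not measured) toluene capacities:

        import numpy as np

        rng = np.random.default_rng(0)

        # Central composite design in coded variables: temperature (x1),
        # impregnation ratio (x2), activation time (x3).
        factorial = np.array([[i, j, k] for i in (-1, 1)
                              for j in (-1, 1) for k in (-1, 1)], float)
        a = 1.414
        axial = np.array([[a, 0, 0], [-a, 0, 0], [0, a, 0],
                          [0, -a, 0], [0, 0, a], [0, 0, -a]])
        center = np.zeros((2, 3))
        X = np.vstack([factorial, axial, center])

        def design_matrix(X):
            """Columns: 1, x1, x2, x3, x1^2, x2^2, x3^2, x1x2, x1x3, x2x3."""
            x1, x2, x3 = X.T
            return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                    x1**2, x2**2, x3**2,
                                    x1*x2, x1*x3, x2*x3])

        # Hypothetical toluene uptake (mg/g): a quadratic surface with an
        # interior optimum at low temperature / intermediate ratio, plus noise.
        true_beta = np.array([360., -15., 5., 8., -10., -20., -6., 4., 0., 3.])
        y = design_matrix(X) @ true_beta + rng.normal(0, 3.0, len(X))

        beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
        print(np.round(beta, 1))   # fitted coefficients approximate true_beta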

  9. Human and Methodological Sources of Variability in the Measurement of Urinary 8-Oxo-7,8-dihydro-2′-deoxyguanosine

    PubMed Central

    Møller, Peter; Henriksen, Trine; Mistry, Vilas; Koppen, Gudrun; Rossner, Pavel; Sram, Radim J.; Weimann, Allan; Poulsen, Henrik E.; Nataf, Robert; Andreoli, Roberta; Manini, Paola; Marczylo, Tim; Lam, Patricia; Evans, Mark D.; Kasai, Hiroshi; Kawai, Kazuaki; Li, Yun-Shan; Sakai, Kazuo; Singh, Rajinder; Teichert, Friederike; Farmer, Peter B.; Rozalski, Rafal; Gackowski, Daniel; Siomek, Agnieszka; Saez, Guillermo T.; Cerda, Concha; Broberg, Karin; Lindh, Christian; Hossain, Mohammad Bakhtiar; Haghdoost, Siamak; Hu, Chiung-Wen; Chao, Mu-Rong; Wu, Kuen-Yuh; Orhan, Hilmi; Senduran, Nilufer; Smith, Raymond J.; Santella, Regina M.; Su, Yali; Cortez, Czarina; Yeh, Susan; Olinski, Ryszard; Loft, Steffen

    2013-01-01

    Abstract Aims: Urinary 8-oxo-7,8-dihydro-2′-deoxyguanosine (8-oxodG) is a widely used biomarker of oxidative stress. However, variability between chromatographic and ELISA methods hampers interpretation of data, and this variability may increase should urine composition differ between individuals, leading to assay interference. Furthermore, optimal urine sampling conditions are not well defined. We performed inter-laboratory comparisons of 8-oxodG measurement between mass spectrometric-, electrochemical- and ELISA-based methods, using common within-technique calibrants to analyze 8-oxodG-spiked phosphate-buffered saline and urine samples. We also investigated human subject- and sample collection-related variables, as potential sources of variability. Results: Chromatographic assays showed high agreement across urines from different subjects, whereas ELISAs showed far more inter-laboratory variation and generally overestimated levels, compared to the chromatographic assays. Excretion rates in timed ‘spot’ samples showed strong correlations with 24 h excretion (the ‘gold’ standard) of urinary 8-oxodG (rp 0.67–0.90), although the associations were weaker for 8-oxodG adjusted for creatinine or specific gravity (SG). The within-individual excretion of 8-oxodG varied only moderately between days (CV 17% for 24 h excretion and 20% for first void, creatinine-corrected samples). Innovation: This is the first comprehensive study of both human and methodological factors influencing 8-oxodG measurement, providing key information for future studies with this important biomarker. Conclusion: ELISA variability is greater than chromatographic assay variability, and cannot determine absolute levels of 8-oxodG. Use of standardized calibrants greatly improves intra-technique agreement and, for the chromatographic assays, importantly allows integration of results for pooled analyses. If 24 h samples are not feasible, creatinine- or SG-adjusted first morning samples are recommended.

  10. Health economic evaluation: important principles and methodology.

    PubMed

    Rudmik, Luke; Drummond, Michael

    2013-06-01

    To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
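
    To make the modeling vocabulary above concrete, here is a minimal Markov cohort sketch computing discounted costs and quality-adjusted life years (QALYs) for a toy three-state disease process. The states, transition probabilities, utilities, costs and 3% discount rate are illustrative assumptions, not values from the article.

```python
# Toy Markov cohort model: cumulative discounted QALYs and costs
# for a hypothetical 3-state process (well / sick / dead).
import numpy as np

P = np.array([            # yearly transition probabilities (rows sum to 1)
    [0.85, 0.10, 0.05],   # from "well"
    [0.00, 0.80, 0.20],   # from "sick"
    [0.00, 0.00, 1.00],   # "dead" is absorbing
])
utility = np.array([0.95, 0.60, 0.0])    # QALY weight per state-year
cost    = np.array([500., 5000., 0.0])   # annual cost per state
r = 0.03                                 # annual discount rate

dist = np.array([1.0, 0.0, 0.0])         # cohort starts in "well"
qalys = costs = 0.0
for year in range(30):                   # 30 one-year cycles
    disc = 1.0 / (1.0 + r) ** year
    qalys += disc * dist @ utility
    costs += disc * dist @ cost
    dist = dist @ P                      # advance the cohort one cycle

print(f"discounted QALYs: {qalys:.2f}, discounted cost: {costs:.0f}")
```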

  11. Forest and rangeland ecosystem condition indicators: identifying national areas of opportunity using data envelopment analysis

    Treesearch

    John G. Hof; Curtis H. Flather; Tony J. Baltic; Rudy M. King

    2004-01-01

    This article reports the methodology and results of a data envelopment analysis (DEA) that attempts to identify areas in the country where there is maximum potential for improving the forest and rangeland condition, based on 12 indicator variables. This analysis differs from previous DEA studies in that the primary variables are measures of human activity and...
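
    For readers unfamiliar with DEA, the sketch below solves the classic input-oriented CCR efficiency score as a linear program with scipy. The toy inputs and outputs are invented and far simpler than the 12 indicator variables used in the article.

```python
# Input-oriented CCR DEA efficiency scores via linear programming.
# Toy data: one row per decision-making unit (DMU).
import numpy as np
from scipy.optimize import linprog

X = np.array([[2., 3.], [4., 2.], [3., 5.], [6., 6.]])  # inputs
Y = np.array([[1.], [2.], [2.], [3.]])                  # outputs

def ccr_efficiency(o):
    n, m = X.shape                                # n DMUs, m inputs
    c = np.r_[1.0, np.zeros(n)]                   # variables [theta, lambdas]
    # sum_j lambda_j x_ij - theta * x_io <= 0  (one row per input)
    A1 = np.hstack([-X[o].reshape(-1, 1), X.T])
    # -sum_j lambda_j y_rj <= -y_ro            (one row per output)
    A2 = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    res = linprog(c, A_ub=np.vstack([A1, A2]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun                                # theta in (0, 1]; 1 = efficient

for o in range(len(X)):
    print(f"unit {o}: efficiency {ccr_efficiency(o):.3f}")
```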

  12. Measuring Sound-Processor Threshold Levels for Pediatric Cochlear Implant Recipients Using Conditioned Play Audiometry via Telepractice

    PubMed Central

    Goehring, Jenny L.

    2017-01-01

    Purpose This study evaluated the use of telepractice for measuring cochlear implant (CI) behavioral threshold (T) levels in children using conditioned play audiometry (CPA). The goals were to determine whether (a) T levels measured via telepractice were not significantly different from those obtained in person, (b) response probability differed between remote and in-person conditions, and (c) the remote visit required more time than the in-person condition. Method An ABBA design (A, in-person; B, remote) was split across 2 visits. Nineteen children aged 2.6–7.1 years participated. T levels were measured using CPA for 3 electrodes per session. A “hit” rate was calculated to determine whether the likelihood of obtaining responses differed between conditions. Test time was compared across conditions. A questionnaire was administered to assess parent/caregiver attitudes about telepractice. Results Results indicated no significant difference in T levels between conditions. Hit rates were not significantly different between in-person and remote conditions (98% vs. 97%, respectively). Test time was similar between conditions. Questionnaire results revealed that 100% of caregivers would use telepractice for CI appointments either some or all of the time. Conclusion Telepractice is a viable option for routine pediatric programming appointments for children using CPA to set behavioral thresholds. PMID:28257529

  13. VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.

    PubMed

    Little, Todd D; Wang, Eugene W; Gorrall, Britt K

    2017-06-01

    This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.

  14. Fast methodology of analysing major steviol glycosides from Stevia rebaudiana leaves.

    PubMed

    Lorenzo, Cándida; Serrano-Díaz, Jéssica; Plaza, Miguel; Quintanilla, Carmen; Alonso, Gonzalo L

    2014-08-15

    The aim of this work is to propose an HPLC method for analysing major steviol glycosides as well as to optimise the extraction and clarification conditions for obtaining these compounds. Toward this aim, standards of stevioside and rebaudioside A with purities ⩾99.0%, commercial samples from different companies and Stevia rebaudiana Bertoni leaves from Paraguay supplied by Insobol, S.L., were used. The analytical method proposed is adequate in terms of selectivity, sensitivity and accuracy. Optimum extraction conditions and adequate clarification conditions have been established. Moreover, this methodology is safe and eco-friendly, as only water is used for extraction and solid-phase extraction, which requires solvents that are banned in the food industry to condition the cartridge and elute the steviol glycosides, is not needed. In addition, this methodology saves time, as the leaves are not ground and filtration is faster, and peak resolution is better because an HPLC method with gradient elution is used. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Innovative methodology for electrical conductivity measurements and metal partition in biosolid pellets

    NASA Astrophysics Data System (ADS)

    Jordan, Manuel Miguel; Rincón-Mora, Beatriz; Belén Almendro-Candel, María; Navarro-Pedreño, José; Gómez-Lucas, Ignacio; Bech, Jaume

    2017-04-01

    Use of biosolids to improve the nutrient content of a soil is a common practice. The obligation to restore abandoned mines and the correct application of biosolids are guaranteed by the legislation on waste management, biosolids and soil conservation (Jordán et al. 2008). The present research was conducted to determine electrical conductivity in dry wastes (pellets) using an innovative methodology (Camilla and Jordán, 2009). The study was also designed to examine the distribution of selected heavy metals in biosolid pellets and to relate the distribution patterns of these metals. In this context, heavy metal concentrations were studied in biosolid pellets under different pressures. Electrical conductivity measurements were taken in biosolid pellets under pressures on the order of 50 to 150 MPa and with currents of 10-15 A. Measurements of electrical conductivity and heavy metal content were taken for different areas (H1, H2, and H3). Total content of metals was determined following microwave digestion and analysed by ICP/MS. Triplicate portions were weighed in polycarbonate centrifuge tubes and sequentially extracted. The distribution of chemical forms of Cd, Ni, Cr, and Pb in the biosolids was studied using a sequential extraction procedure that fractionates the metal into soluble-exchangeable, specifically sorbed-carbonate bound, oxidizable, reducible, and residual forms. The residual, reducible, and carbonate-sorbed forms were dominant. Higher Cr and Ni contents were detected in pellets made with biosolids from the H3 area. The highest Cd and Ni values were detected in the H2 area. The trends of the conductivity curves were similar for the sludge from the isolation surface (H1) and for the mesophilous area (H2). In the case of the thermophilous area (H3), the electrical conductivity showed extremely high values. This behaviour was similar in the case of the Cr and Ni content. However, in the case of Cd and Pb, the highest values were detected in

  16. Conceptual and Methodological Issues in Research on Mindfulness and Meditation

    PubMed Central

    Davidson, Richard J.; Kaszniak, Alfred W.

    2015-01-01

    Both basic science and clinical research on mindfulness, meditation, and related constructs have dramatically increased in recent years. However, interpretation of these research results has been challenging. The present article addresses unique conceptual and methodological problems posed by research in this area. Included among the key topics are the role of first-person experience and how it can best be studied; the challenges posed by intervention research designs in which true double-blinding is not possible; the nature of control and comparison conditions for research that includes mindfulness or other meditation-based interventions; issues in the adequate description of mindfulness and related trainings and interventions; the question of how mindfulness can be measured; questions regarding what can and cannot be inferred from self-report measures; and considerations regarding the structure of study design and data analyses. Most of these topics are germane to both basic and clinical research studies and have important bearing on the future scientific understanding of mindfulness and meditation. PMID:26436310

  17. PIV measurements in a compact return diffuser under multi-conditions

    NASA Astrophysics Data System (ADS)

    Zhou, L.; Lu, W. G.; Shi, W. D.

    2013-12-01

    Due to the complex three-dimensional geometries of impellers and diffusers, their design is a delicate and difficult task. Slight changes can lead to significant changes in hydraulic performance and internal flow structure. Conversely, a grasp of the pump's internal flow pattern can inform improvements to pump design. The internal flow fields in a compact return diffuser have been investigated experimentally under multiple conditions. A special Particle Image Velocimetry (PIV) test rig was designed, and two-dimensional PIV measurements were successfully conducted in the diffuser mid-plane to capture the complex flow patterns. The analysis of the results focuses on the flow structure in the diffuser, especially under part-load conditions, where the vortex and recirculation flow patterns are captured and analysed accordingly. Under the design and over-load conditions, the flow fields in the diffuser are uniform; at the part-load flow rates, strong flow separation and back flow appear, and strong back flow is captured in one diffuser passage at 0.2Qdes.

  18. The Ocean Colour Climate Change Initiative: I. A Methodology for Assessing Atmospheric Correction Processors Based on In-Situ Measurements

    NASA Technical Reports Server (NTRS)

    Muller, Dagmar; Krasemann, Hajo; Brewin, Robert J. W.; Deschamps, Pierre-Yves; Doerffer, Roland; Fomferra, Norman; Franz, Bryan A.; Grant, Mike G.; Groom, Steve B.; Melin, Frederic; hide

    2015-01-01

    The Ocean Colour Climate Change Initiative intends to provide a long-term time series of ocean colour data and investigate the detectable climate impact. A reliable and stable atmospheric correction procedure is the basis for ocean colour products of the necessary high quality. In order to guarantee an objective selection from a set of four atmospheric correction processors, the common validation strategy of comparing in-situ and satellite-derived water-leaving reflectance spectra is extended by a ranking system. Statistical parameters such as root mean square error and bias, together with measures of goodness of fit, are transformed into relative scores that rate the quality of the algorithms under study. The sensitivity of these scores to the selected database has been assessed by a bootstrapping exercise, which allows identification of the uncertainty in the scoring results. Although the presented methodology is intended to be used in an algorithm selection process, this paper focusses on the scope of the methodology rather than the properties of the individual processors.
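
    A minimal sketch of the score-and-bootstrap idea, assuming a toy match-up database and a simple relative-RMSE scoring rule (the paper's actual score definitions are richer):

```python
# Rank processors by a relative score and bootstrap the match-up set
# to gauge the stability of the ranking. Data and score rule are toys.
import numpy as np

rng = np.random.default_rng(1)
n = 200
truth = rng.uniform(0.001, 0.02, n)             # in-situ reflectances
processors = {                                   # simulated satellite retrievals
    "A": truth + rng.normal(0, 0.0015, n),
    "B": truth * 1.08 + rng.normal(0, 0.0010, n),
    "C": truth + rng.normal(0.0005, 0.0020, n),
}

def scores(idx):
    rmse = {k: np.sqrt(np.mean((v[idx] - truth[idx]) ** 2))
            for k, v in processors.items()}
    best = min(rmse.values())
    return {k: best / r for k, r in rmse.items()}  # relative score in (0, 1]

wins = {k: 0 for k in processors}
for _ in range(1000):                            # bootstrap the database
    idx = rng.integers(0, n, n)
    s = scores(idx)
    wins[max(s, key=s.get)] += 1

print("fraction of bootstrap samples won:",
      {k: v / 1000 for k, v in wins.items()})
```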

  19. Measuring Sense of Community: A Methodological Interpretation of the Factor Structure Debate

    ERIC Educational Resources Information Center

    Peterson, N. Andrew; Speer, Paul W.; Hughey, Joseph

    2006-01-01

    Instability in the factor structure of the Sense of Community Index (SCI) was tested as a methodological artifact. Confirmatory factor analyses, tested with two data sets, supported neither the proposed one-factor nor the four-factor (needs fulfillment, group membership, influence, and emotional connection) SCI. Results demonstrated that the SCI…

  20. A combination of body condition measurements is more informative than conventional condition indices: temporal variation in body condition and corticosterone in brown tree snakes (Boiga irregularis).

    PubMed

    Waye, Heather L; Mason, Robert T

    2008-02-01

    The body condition index is a common method for quantifying the energy reserves of individual animals. Because good body condition is necessary for reproduction in many species, body condition indices can indicate the potential reproductive output of a population. Body condition is related to glucocorticoid production, in that low body condition is correlated with high concentrations of corticosterone in reptiles. We compared the body condition index and plasma corticosterone levels of brown tree snakes on Guam in 2003 to those collected in 1992/1993 to determine whether that population still showed the chronic stress and poor condition apparent in the earlier study. We also examined the relationship between fat mass, body condition and plasma corticosterone concentrations as indicators of the physiological condition of individuals in the population. Body condition was significantly higher in 2003 than in the earlier sample for mature male and female snakes, but not for juveniles. The significantly lower levels of corticosterone in all three groups in 2003 suggest that although juveniles did not have significantly improved energy stores they, along with the mature males and females, were no longer under chronic levels of stress. Although the wet season of 2002 was unusually rainy, low baseline levels of corticosterone measured in 2000 indicate that the improved body condition of snakes in 2003 is likely the result of long-term changes in prey populations rather than annual variation in response to environmental conditions.
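
    Body condition indices are computed in several ways; one common formulation (not necessarily the one used by the authors) takes the residuals of a log-mass on log-length regression, so that a positive residual indicates heavier-than-expected condition for a given length. A sketch on simulated data:

```python
# Residual-based body condition index on simulated snake data.
import numpy as np

rng = np.random.default_rng(0)
svl = rng.uniform(800, 1600, 50)                            # snout-vent length, mm
mass = 2e-8 * svl ** 3 * np.exp(rng.normal(0, 0.15, 50))    # grams, toy allometry

logL, logM = np.log(svl), np.log(mass)
slope, intercept = np.polyfit(logL, logM, 1)                # allometric regression
condition_index = logM - (intercept + slope * logL)         # residual index

print(f"allometric slope: {slope:.2f}")                     # ~3 for isometric growth
print(f"mean index (should be ~0): {condition_index.mean():.3f}")
```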

  1. Performance in wild ungulates: measuring population density and condition of individuals

    Treesearch

    John G. Kie

    1988-01-01

    Measures of performance in wild ungulates can include characteristics indicative of condition and health such as body weights, fat reserves, blood values, reproductive rates, and parasite loads. Performance may also be inferred from habitat-related factors, such as diet and nutritional intake. However, these parameters interact with population density to form a...

  2. The relation of functional visual acuity measurement methodology to tear functions and ocular surface status.

    PubMed

    Kaido, Minako; Ishida, Reiko; Dogru, Murat; Tsubota, Kazuo

    2011-09-01

    To investigate the relation of functional visual acuity (FVA) measurements with dry eye test parameters and to compare testing methods with and without blink suppression and anesthetic instillation. A prospective comparative case series. Thirty right eyes of 30 dry eye patients and 25 right eyes of 25 normal subjects seen at the Keio University School of Medicine, Department of Ophthalmology, were studied. FVA testing was performed using an FVA measurement system with two different approaches: one in which measurements were made under natural blinking conditions without topical anesthesia (FVA-N), and the other in which measurements were made under the blink suppression condition with topical anesthetic eye drops (FVA-BS). Tear function examinations, such as the Schirmer test and tear film break-up time, and fluorescein and Rose Bengal vital staining for ocular surface evaluation, were performed. The mean logMAR FVA-N scores and logMAR Landolt visual acuity scores were significantly lower in the dry eye subjects than in the healthy controls (p < 0.05), while there were no statistical differences between the logMAR FVA-BS scores of the dry eye subjects and those of the healthy controls. There was a significant correlation between the logMAR Landolt visual acuities and the logMAR FVA-N and logMAR FVA-BS scores. The FVA-N scores correlated significantly with tear quantity, tear stability and, especially, the ocular surface vital staining scores. FVA measurements performed under natural blinking significantly reflected the tear functions and ocular surface status of the eye and would appear to be a reliable method of FVA testing. FVA measurement is also an accurate predictor of dry eye status.

  3. Vibration condition measure instrument of motor using MEMS accelerometer

    NASA Astrophysics Data System (ADS)

    Chen, Jun

    2018-04-01

    In this work, a novel instrument for measuring the vibration condition of a motor using a digital micro accelerometer is proposed. In order to reduce the random noise in the data, a sensor model is established and a Kalman filter (KMF) is developed. From the filtered data, the maximum vibration displacement is calculated by an integration algorithm with the DC bias removed. A high-performance micro controller unit (MCU) is used to implement the controller, and the data are transmitted from the sensor to the controller over the IIC digital interface. The hardware circuits of the sensor and micro controller were designed and tested. Using the maximum-displacement formula and the FFT, high-precision results for displacement and frequency are obtained. Finally, the paper presents various experimental results to prove that this instrument is suitable for application in electrical motor vibration measurement.
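
    A rough sketch of the processing chain described above, on a simulated signal: Kalman filtering of noisy accelerometer data, DC-bias removal, double integration to displacement, and an FFT for the dominant frequency. The filter model, noise levels and signal parameters are illustrative assumptions, not the paper's design.

```python
# Accelerometer-to-displacement sketch: scalar Kalman filter,
# bias removal, double integration, FFT peak pick.
import numpy as np

fs = 1000.0                                    # sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)
true_acc = 3.0 * np.sin(2 * np.pi * 50 * t)    # 50 Hz vibration, m/s^2
acc = true_acc + 0.2 + np.random.default_rng(2).normal(0, 0.5, t.size)

# Scalar Kalman filter with a random-walk model for acceleration
q, r = 1e2, 0.5 ** 2                           # process / measurement variances
x, p = 0.0, 1.0
filt = np.empty_like(acc)
for i, z in enumerate(acc):
    p += q / fs                                # predict
    k = p / (p + r)                            # Kalman gain
    x += k * (z - x)                           # update with measurement z
    p *= (1 - k)
    filt[i] = x

filt -= filt.mean()                            # remove DC bias before integrating
vel = np.cumsum(filt) / fs
vel -= vel.mean()
disp = np.cumsum(vel) / fs
disp -= np.polyval(np.polyfit(t, disp, 1), t)  # crude integration-drift removal

spec = np.abs(np.fft.rfft(filt))
freq = np.fft.rfftfreq(filt.size, 1 / fs)
print(f"max displacement: {np.max(np.abs(disp)) * 1e3:.3f} mm, "
      f"dominant frequency: {freq[spec.argmax()]:.1f} Hz")
```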

  4. High Accuracy Temperature Measurements Using RTDs with Current Loop Conditioning

    NASA Technical Reports Server (NTRS)

    Hill, Gerald M.

    1997-01-01

    To measure temperatures with a greater degree of accuracy than is possible with thermocouples, RTDs (Resistive Temperature Detectors) are typically used. Calibration standards use specialized high-precision RTD probes with accuracies approaching 0.001 F. These are extremely delicate devices, and far too costly to be used in test facility instrumentation. Less costly sensors designed for aeronautical wind tunnel testing are available and can be readily adapted to probes, rakes, and test rigs. With proper signal conditioning of the sensor, temperature accuracies of 0.1 F are obtainable. For reasons that are explored in this paper, the Anderson current loop is the preferred method of signal conditioning. This scheme has been used in NASA Lewis Research Center's 9 x 15 Low Speed Wind Tunnel, and is detailed here.
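
    As background on the measurement itself: with current-loop conditioning the RTD resistance follows from R = V/I independent of lead resistance, and the standard Callendar-Van Dusen equation converts resistance to temperature. A minimal sketch for a PT100 with IEC 60751 coefficients; the example voltage and current readings are invented.

```python
# RTD resistance-to-temperature conversion (PT100, T >= 0 degC)
# via the Callendar-Van Dusen equation.
import math

R0 = 100.0          # ohms at 0 degC (PT100)
A = 3.9083e-3       # IEC 60751 coefficients
B = -5.775e-7

def rtd_temperature(R):
    """Solve R = R0 * (1 + A*T + B*T^2) for T (valid 0..850 degC)."""
    return (-A + math.sqrt(A * A - 4 * B * (1 - R / R0))) / (2 * B)

# Example: a current-loop measurement with 1.000 mA excitation
V, I = 0.1097346, 1.000e-3          # volts, amps (illustrative readings)
print(f"T = {rtd_temperature(V / I):.2f} degC")   # ~25.00 degC
```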

  5. Wastewater Sampling Methodologies and Flow Measurement Techniques.

    ERIC Educational Resources Information Center

    Harris, Daniel J.; Keffer, William J.

    This document provides a ready source of information about water/wastewater sampling activities using various commercial sampling and flow measurement devices. The report consolidates the findings and summarizes the activities, experiences, sampling methods, and field measurement techniques conducted by the Environmental Protection Agency (EPA),…

  6. Lifesource XL-18 pedometer for measuring steps under controlled and free-living conditions.

    PubMed

    Liu, Sam; Brooks, Dina; Thomas, Scott; Eysenbach, Gunther; Nolan, Robert Peter

    2015-01-01

    The primary aim was to examine the criterion and construct validity and test-retest reliability of the Lifesource XL-18 pedometer (A&D Medical, Toronto, ON, Canada) for measuring steps during controlled and free-living activities. The influence of body mass index, waist size and walking speed on the criterion validity of the XL-18 was also explored. Forty adults (35-74 years) performed a 6-min walk test in the controlled condition, and the criterion validity of the XL-18 was assessed by comparing it to steps counted manually. Thirty-five adults participated in the free-living condition, and the construct validity of the XL-18 was assessed by comparing it to the Yamax SW-200 (YAMAX Health & Sports, Inc., San Antonio, TX, USA). During the controlled condition, the XL-18 did not differ significantly from the criterion (P > 0.05) and no systematic error was found using Bland-Altman analysis. The accuracy of the XL-18 decreased with slower walking speed (P = 0.001). During the free-living condition, Bland-Altman analysis revealed that the XL-18 overestimated daily steps by 327 ± 118 steps compared with the Yamax (P = 0.004). However, the absolute percent error (APE) (6.5 ± 0.58%) was still within an acceptable range. The XL-18 did not differ statistically between pant pockets. The XL-18 is suitable for measuring steps in controlled and free-living conditions. However, caution may be required when interpreting steps recorded at slower speeds and in free-living conditions.
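
    A minimal sketch of the Bland-Altman computation used above (bias, 95% limits of agreement, and absolute percent error), on simulated step counts rather than the study's data:

```python
# Bland-Altman agreement statistics between a test device and a criterion.
import numpy as np

rng = np.random.default_rng(3)
criterion = rng.integers(4000, 12000, 35).astype(float)  # e.g. reference counts
test = criterion + rng.normal(300, 500, 35)              # e.g. pedometer readings

diff = test - criterion
bias = diff.mean()                       # mean difference (systematic error)
loa = 1.96 * diff.std(ddof=1)            # half-width of 95% limits of agreement
ape = np.mean(np.abs(diff) / criterion) * 100

print(f"bias: {bias:.0f} steps, 95% LoA: [{bias - loa:.0f}, {bias + loa:.0f}]")
print(f"APE: {ape:.1f}%")
```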

  7. The Impact of Indoor and Outdoor Radiometer Calibration on Solar Measurements: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, Aron; Sengupta, Manajit; Andreas, Afshin

    2016-07-01

    Accurate solar radiation data sets are critical to reducing the expenses associated with mitigating performance risk for solar energy conversion systems, and they help utility planners and grid system operators understand the impacts of solar resource variability. The accuracy of solar radiation measured by radiometers depends on the instrument performance specification, installation method, calibration procedure, measurement conditions, maintenance practices, location, and environmental conditions. This study addresses the effect of calibration methodologies and the resulting calibration responsivities provided by radiometric calibration service providers such as the National Renewable Energy Laboratory (NREL) and manufacturers of radiometers. Some of these radiometers are calibrated indoors, and some are calibrated outdoors. To establish or understand the differences in calibration methodology, we processed and analyzed field-measured data from these radiometers. This study investigates calibration responsivities provided by NREL's broadband outdoor radiometer calibration (BORCAL) and a few prominent manufacturers. The reference radiometer calibrations are traceable to the World Radiometric Reference. These different methods of calibration demonstrated 1% to 2% differences in solar irradiance measurement. Analyzing these values will ultimately assist in determining the uncertainties of the radiometer data and will assist in developing consensus on a standard for calibration.

  8. Putting the methodological brakes on claims to measure national happiness through Twitter: Methodological limitations in social media analytics.

    PubMed

    Jensen, Eric Allen

    2017-01-01

    With the rapid global proliferation of social media, there has been growing interest in using this existing source of easily accessible 'big data' to develop social science knowledge. However, amidst the big data gold rush, it is important that long-established principles of good social research are not ignored. This article critically evaluates Mitchell et al.'s (2013) study, 'The Geography of Happiness: Connecting Twitter Sentiment and Expression, Demographics, and Objective Characteristics of Place', demonstrating the importance of attending to key methodological issues associated with secondary data analysis.

  9. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. Nonlinear behavior that depends on stress, stress rate, temperature, number of load cycles, and time is observed in all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor-of-safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. It may be most useful in situations where the structures are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, the possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
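
    To illustrate the probabilistic idea in code: treat load and strength as random variables and estimate the probability of failure by Monte Carlo sampling, rather than relying on a single deterministic safety factor. The normal distributions below are toy assumptions, not values from the report.

```python
# Monte Carlo estimate of the probability of failure for a component
# whose applied stress and material strength are both uncertain.
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000
stress = rng.normal(300e6, 40e6, n)      # applied stress, Pa
strength = rng.normal(450e6, 30e6, n)    # material strength, Pa

p_fail = np.mean(stress >= strength)     # fraction of sampled failures
print(f"probability of failure: {p_fail:.2e}")          # ~1e-3 here
print(f"deterministic safety factor (means): {450e6 / 300e6:.2f}")
```

    The point of the sketch: a design with a seemingly comfortable mean safety factor of 1.5 can still carry a non-negligible failure probability once parameter scatter is accounted for.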

  10. Recurrence measure of conditional dependence and applications.

    PubMed

    Ramos, Antônio M T; Builes-Jaramillo, Alejandro; Poveda, Germán; Goswami, Bedartha; Macau, Elbert E N; Kurths, Jürgen; Marwan, Norbert

    2017-05-01

    Identifying causal relations from observational data sets has posed great challenges in data-driven causality inference studies. One of the successful approaches to detect direct coupling in the information theory framework is transfer entropy. However, the core of entropy-based tools lies in the probability estimation of the underlying variables. Here we propose a data-driven approach for causality inference that incorporates recurrence plot features into the framework of information theory. We define it as the recurrence measure of conditional dependence (RMCD), and we present some applications. The RMCD quantifies the causal dependence between two processes based on joint recurrence patterns between the past of the possible driver and the present of the potentially driven, excepting the contribution of the contemporaneous past of the driven variable. Finally, it can unveil the time scale of the influence of the sea-surface temperature of the Pacific Ocean on the precipitation in the Amazonia during recent major droughts.

  11. Recurrence measure of conditional dependence and applications

    NASA Astrophysics Data System (ADS)

    Ramos, Antônio M. T.; Builes-Jaramillo, Alejandro; Poveda, Germán; Goswami, Bedartha; Macau, Elbert E. N.; Kurths, Jürgen; Marwan, Norbert

    2017-05-01

    Identifying causal relations from observational data sets has posed great challenges in data-driven causality inference studies. One of the successful approaches to detect direct coupling in the information theory framework is transfer entropy. However, the core of entropy-based tools lies in the probability estimation of the underlying variables. Here we propose a data-driven approach for causality inference that incorporates recurrence plot features into the framework of information theory. We define it as the recurrence measure of conditional dependence (RMCD), and we present some applications. The RMCD quantifies the causal dependence between two processes based on joint recurrence patterns between the past of the possible driver and the present of the potentially driven, excepting the contribution of the contemporaneous past of the driven variable. Finally, it can unveil the time scale of the influence of the sea-surface temperature of the Pacific Ocean on the precipitation in the Amazonia during recent major droughts.

  12. EEG alpha spindle measures as indicators of driver fatigue under real traffic conditions.

    PubMed

    Simon, Michael; Schmidt, Eike A; Kincses, Wilhelm E; Fritzsche, Martin; Bruns, Andreas; Aufmuth, Claus; Bogdan, Martin; Rosenstiel, Wolfgang; Schrauf, Michael

    2011-06-01

    The purpose of this study is to show the effectiveness of EEG alpha spindles, defined by short narrowband bursts in the alpha band, as an objective measure for assessing driver fatigue under real driving conditions. An algorithm for the identification of alpha spindles is described. The performance of the algorithm is tested based on simulated data. The method is applied to real data recorded under real traffic conditions and compared with the performance of traditional EEG fatigue measures, i.e. alpha-band power. As a highly valid fatigue reference, the last 20 min of driving from participants who aborted the drive due to heavy fatigue were used in contrast to the initial 20 min of driving. Statistical analysis revealed significant increases from the first to the last driving section of several alpha spindle parameters and among all traditional EEG frequency bands, only of alpha-band power; with larger effect sizes for the alpha spindle based measures. An increased level of fatigue over the same time periods for drop-outs, as compared to participants who did not abort the drive, was observed only by means of alpha spindle parameters. EEG alpha spindle parameters increase both fatigue detection sensitivity and specificity as compared to EEG alpha-band power. It is demonstrated that alpha spindles are superior to EEG band power measures for assessing driver fatigue under real traffic conditions. Copyright © 2011 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
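
    For context, the traditional baseline measure that the spindle method is compared against, alpha-band power, can be computed from Welch's power spectral density. A sketch on a simulated EEG trace (the sample rate and signal composition are illustrative):

```python
# Alpha-band (8-13 Hz) power from Welch's PSD on a simulated EEG signal.
import numpy as np
from scipy.signal import welch

fs = 250.0                                    # EEG sample rate, Hz
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(5)
eeg = (rng.normal(0, 10, t.size)              # broadband background, uV
       + 8 * np.sin(2 * np.pi * 10 * t))      # 10 Hz alpha activity

f, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))   # 4 s segments
band = (f >= 8) & (f <= 13)
alpha_power = np.trapz(psd[band], f[band])        # integrate PSD over the band
print(f"alpha-band power: {alpha_power:.1f} uV^2")
```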

  13. Field testing and adaptation of a methodology to measure "in-stream" values in the Tongue River, northern Great Plains (NGP) region

    USGS Publications Warehouse

    Bovee, Ken D.; Gore, James A.; Silverman, Arnold J.

    1978-01-01

    A comprehensive, multi-component in-stream flow methodology was developed and field tested in the Tongue River in southeastern Montana. The methodology incorporates a sensitivity for the flow requirements of a wide variety of in-stream uses, and the flexibility to adjust flows to accommodate seasonal and sub-seasonal changes in the flow requirements for different areas. In addition, the methodology provides the means to accurately determine the magnitude of the water requirement for each in-stream use. The methodology can be a powerful water management tool in that it provides the flexibility and accuracy necessary in water use negotiations and evaluation of trade-offs. In contrast to most traditional methodologies, in-stream flow requirements were determined by additive independent methodologies developed for: 1) fisheries, including spawning, rearing, and food production; 2) sediment transport; 3) the mitigation of adverse impacts of ice; and 4) evapotranspiration losses. Since each flow requirement varied in importance throughout the year, the consideration of a single in-stream use as a basis for a flow recommendation is inadequate. The study shows that the base flow requirement for spawning shovelnose sturgeon was 13.0 m3/sec. During the same period of the year, the flow required to initiate the scour of sediment from pools is 18.0 m3/sec, with increased scour efficiency occurring at flows between 20.0 and 25.0 m3/sec. An over-winter flow of 2.83 m3/sec. would result in the loss of approximately 80% of the riffle areas to encroachment by surface ice. At the base flow for insect production, approximately 60% of the riffle area is lost to ice. Serious damage to the channel could be incurred from ice jams during the spring break-up period. A flow of 12.0 m3/sec. is recommended to alleviate this problem. Extensive ice jams would be expected at the base rearing and food production levels. The base rearing flow may be profoundly influenced by the loss of streamflow

  14. Characterizing wood-plastic composites via data-driven methodologies

    Treesearch

    John G. Michopoulos; John C. Hermanson; Robert Badaliance

    2007-01-01

    The recent increase of wood-plastic composite materials in various application areas has underlined the need for an efficient and robust methodology to characterize their nonlinear anisotropic constitutive behavior. In addition, the multiplicity of various loading conditions in structures utilizing these materials further increases the need for a characterization...

  15. Measurement and evaluation of percolation drainage systems capacity in real conditions

    NASA Astrophysics Data System (ADS)

    Markovic, G.; Zelenakova, M.

    2017-10-01

    The drainage system must ensure safe disposal of surface water without endangering buildings or the safety of people. Despite the common use of rainwater infiltration facilities, only limited data are available evaluating the long-term capacity of such systems, especially for underground infiltration facilities. This study presents experimental measurements and an evaluation of long-term infiltration efficiency in real conditions and emphasizes the importance of hydrogeological survey. The measurements of infiltration efficiency were applied to an existing percolation drainage system - infiltration shafts. The infiltration shafts were built in 2007, so they have been in drainage operation for more than 8 years. The study, started in 2011 and still ongoing, presents five years of infiltration-efficiency measurements for this infiltration facility.

  16. Measurement of Two-Phase Flow Characteristics Under Microgravity Conditions

    NASA Technical Reports Server (NTRS)

    Keshock, E. G.; Lin, C. S.; Edwards, L. G.; Knapp, J.; Harrison, M. E.; Xhang, X.

    1999-01-01

    This paper describes the technical approach and initial results of a test program for studying two-phase annular flow under the simulated microgravity conditions of KC-135 aircraft flights. A helical coil flow channel orientation was utilized in order to circumvent the restrictions normally associated with drop tower or aircraft flight tests with respect to two-phase flow, namely spatial restrictions preventing channel lengths of sufficient size to accurately measure pressure drops. Additionally, the helical coil geometry is of interest in itself, considering that operating in a microgravity environment vastly simplifies the two-phase flows occurring in coiled flow channels under 1-g conditions for virtually any orientation. Pressure drop measurements were made across four stainless steel coil test sections, having a range of inside tube diameters (0.95 to 1.9 cm), coil diameters (25 - 50 cm), and length-to-diameter ratios (380 - 720). High-speed video photographic flow observations were made in the transparent straight sections immediately preceding and following the coil test sections. A transparent coil of tygon tubing of 1.9 cm inside diameter was also used to obtain flow visualization information within the coil itself. Initial test data has been obtained from one set of KC-135 flight tests, along with benchmark ground tests. Preliminary results appear to indicate that accurate pressure drop data is obtainable using a helical coil geometry that may be related to straight channel flow behavior. Also, video photographic results appear to indicate that the observed slug-annular flow regime transitions agree quite reasonably with the Dukler microgravity map.

  17. A brief review of strength and ballistic assessment methodologies in sport.

    PubMed

    McMaster, Daniel Travis; Gill, Nicholas; Cronin, John; McGuigan, Michael

    2014-05-01

    An athletic profile should encompass the physiological, biomechanical, anthropometric and performance measures pertinent to the athlete's sport and discipline. The measurement systems and procedures used to create these profiles are constantly evolving and becoming more precise and practical. This is a review of strength and ballistic assessment methodologies used in sport, a critique of current maximum strength [one-repetition maximum (1RM) and isometric strength] and ballistic performance (bench throw and jump capabilities) assessments for the purpose of informing practitioners and evolving current assessment methodologies. The reliability of the various maximum strength and ballistic assessment methodologies was reported in the form of intra-class correlation coefficients (ICC) and coefficients of variation (%CV). Mean percent differences (Mdiff = [|Xmethod1 - Xmethod2| / (Xmethod1 + Xmethod2)] x 100) and effect sizes (ES = [Xmethod2 - Xmethod1] / SDmethod1) were used to assess the magnitude and spread of methodological differences for a given performance measure of the included studies. Studies were grouped and compared according to their respective performance measure and movement pattern. The various measurement systems (e.g., force plates, position transducers, accelerometers, jump mats, optical motion sensors and jump-and-reach apparatuses) and assessment procedures (i.e., warm-up strategies, loading schemes and rest periods) currently used to assess maximum isometric squat and mid-thigh pull strength (ICC > 0.95; CV < 2.0%), 1RM bench press, back squat and clean strength (ICC > 0.91; CV < 4.3%), and ballistic (vertical jump and bench throw) capabilities (ICC > 0.82; CV < 6.5%) were deemed highly reliable. The measurement systems and assessment procedures employed to assess maximum isometric strength (Mdiff = 2-71%; ES = 0.13-4.37), 1RM strength (Mdiff = 1-58%; ES = 0.01-5.43), vertical jump capabilities (Mdiff = 2-57%; ES
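
    Written out in code, the two comparison statistics defined above are straightforward; the example values are illustrative, not taken from the review.

```python
# The review's two between-method comparison statistics.

def mean_percent_difference(x1, x2):
    """Mdiff = |x1 - x2| / (x1 + x2) * 100, as defined above."""
    return abs(x1 - x2) / (x1 + x2) * 100

def effect_size(x1_mean, x2_mean, sd1):
    """ES = (method-2 mean - method-1 mean) / SD of method 1."""
    return (x2_mean - x1_mean) / sd1

# e.g. a 1RM back squat measured by two protocols (kg)
print(f"Mdiff = {mean_percent_difference(150.0, 160.0):.1f}%")
print(f"ES    = {effect_size(150.0, 160.0, 12.0):.2f}")
```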

  18. Measuring financial protection against catastrophic health expenditures: methodological challenges for global monitoring.

    PubMed

    Hsu, Justine; Flores, Gabriela; Evans, David; Mills, Anne; Hanson, Kara

    2018-05-31

    Monitoring financial protection against catastrophic health expenditures is important to understand how health financing arrangements in a country protect its population against the high costs associated with accessing health services. While catastrophic health expenditures are generally defined as occurring when household expenditures for health exceed a given threshold of household resources, there is no gold standard: several methods are applied to define both the threshold and household resources. These different approaches to constructing the indicator might give different pictures of a country's progress towards financial protection. For monitoring to effectively provide policy insight, it is critical to understand the sensitivity of measurement to these choices. This paper examines the impact of varying two methodological choices by analysing household expenditure data from a sample of 47 countries. We assess the sensitivity of cross-country comparisons to a range of thresholds by testing for restricted dominance. We further assess the sensitivity of comparisons to different methods for defining household resources (i.e. total expenditure, non-food expenditure and non-subsistence expenditure) by conducting correlation tests of country rankings. We found country rankings are robust to the choice of threshold in a tenth to a quarter of comparisons within the 5-85% threshold range, and this increases to half of comparisons if the threshold is restricted to 5-40%, following those commonly used in the literature. Furthermore, correlations of country rankings using different methods to define household resources were moderate to high; thus, this choice makes less difference from a measurement perspective than from an ethical perspective, as different definitions of available household resources reflect varying concerns for equity. Interpreting comparisons from global monitoring based on a single threshold should be done with caution as these may not provide reliable insight into
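
    A minimal sketch of the indicator construction on simulated micro-data: a household is flagged as facing catastrophic expenditure when health spending exceeds a threshold share of resources, and both the threshold and the resource definition can be varied, mirroring the two methodological choices examined in the paper.

```python
# Catastrophic health expenditure headcount under different thresholds
# and resource definitions. All micro-data below are simulated.
import numpy as np

rng = np.random.default_rng(6)
n = 10_000
total_exp = rng.lognormal(8, 0.5, n)                  # total household expenditure
food_exp = 0.4 * total_exp * rng.uniform(0.7, 1.3, n) # food component
health_exp = rng.lognormal(4, 1.2, n)                 # health spending

def catastrophic_rate(resources, threshold):
    """% of households whose health share of resources exceeds the threshold."""
    return np.mean(health_exp / resources > threshold) * 100

for threshold in (0.10, 0.25, 0.40):
    tot = catastrophic_rate(total_exp, threshold)
    nonfood = catastrophic_rate(total_exp - food_exp, threshold)
    print(f"threshold {threshold:.0%}: total-expenditure base {tot:.1f}%, "
          f"non-food base {nonfood:.1f}%")
```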

  19. How to Select a Questionnaire with a Good Methodological Quality?

    PubMed

    Paiva, Saul Martins; Perazzo, Matheus de França; Ortiz, Fernanda Ruffo; Pordeus, Isabela Almeida; Martins-Júnior, Paulo Antônio

    2018-01-01

    In the last decades, several instruments have been used to evaluate the impact of oral health problems on the oral health-related quality of life (OHRQoL) of individuals. However, some instruments lack thorough methodological validation or present conceptual differences that hinder comparisons between instruments. Thus, it can be difficult for clinicians and researchers to select a questionnaire that accurately reflects what is really meaningful to individuals. This short communication aimed to discuss the importance of using an appropriate checklist to select an instrument of good methodological quality. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist was developed to provide tools for evidence-based instrument selection. The COSMIN checklist comprises ten boxes that evaluate whether a study meets the standard for good methodological quality, and two additional boxes covering studies that use the Item Response Theory method and general requirements for generalization of results, resulting in four steps to be followed. Wide-ranging use of this checklist therefore requires at least some expertise in psychometrics or clinimetrics. The applications of COSMIN include ensuring the standardization of cross-cultural adaptations, enabling safer comparisons between measurement studies, and evaluating the methodological quality of systematic reviews of measurement properties. It can also be used by students when training in measurement properties, and by editors and reviewers when revising manuscripts on this topic. The popularization of the COSMIN checklist is therefore necessary to improve the selection and evaluation of health measurement instruments.

  20. Optimization of conditions for isolation of high quality chitin from shrimp processing raw byproducts using response surface methodology and its characterization.

    PubMed

    Nidheesh, T; Suresh, P V

    2015-06-01

    Chitin is one of the most abundant bioactive biopolymers on earth. It is commercially extracted from the crustacean shell byproducts of seafood processing by harsh thermochemical treatments. The extraction conditions and the source and pretreatment of the raw material significantly affect its quality and bioactivity. In this investigation, response surface methodology (RSM) was applied to optimize and evaluate the interaction of variables for the extraction of high quality chitin from shrimp processing raw byproducts. A concentration of HCl (%, v/v) of 4.5 (for wet) and 4.9 (for dry), a reaction time of 3 h, and a solid-liquid ratio of HCl (w/v) of 1:5.5 (for wet) and 1:7.9 (for dry), with two treatments, achieved >98 % demineralization of the shrimp byproduct. A concentration of NaOH of 3.6 % (w/v), a reaction time of 2.5 h, a temperature of 69.0 ± 1 °C, and a solid-liquid ratio of NaOH of 7.4 (w/v), with two treatments, accomplished >98 % deproteinization of the demineralized byproducts. Significant (p ≤ 0.05-0.001) interactive effects were observed between different variables. Chitin obtained under these conditions had a residual content (%, w/w) of ash <0.4 and protein <0.8, and the degree of N-acetylation was >93 % with a purity of >98 %. In conclusion, the conditions optimized by RSM can be applied for large-scale preparation of high quality chitin from raw shrimp byproduct.

  1. Teaching Research Methodology through Active Learning

    ERIC Educational Resources Information Center

    Lundahl, Brad W.

    2008-01-01

    To complement traditional learning activities in a masters-level research methodology course, social work students worked on a formal research project which involved: designing the study, constructing measures, selecting a sampling strategy, collecting data, reducing and analyzing data, and finally interpreting and communicating the results. The…

  2. A new methodology based on sensitivity analysis to simplify the recalibration of functional-structural plant models in new conditions.

    PubMed

    Mathieu, Amélie; Vidal, Tiphaine; Jullien, Alexandra; Wu, QiongLi; Chambon, Camille; Bayol, Benoit; Cournède, Paul-Henry

    2018-06-19

    Functional-structural plant models (FSPMs) describe explicitly the interactions between plants and their environment at organ to plant scale. However, the high level of description of the structure or model mechanisms makes this type of model very complex and hard to calibrate. A two-step methodology to facilitate the calibration process is proposed here. First, a global sensitivity analysis method was applied to the calibration loss function. It provided first-order and total-order sensitivity indexes that allow parameters to be ranked by importance in order to select the most influential ones. Second, the Akaike information criterion (AIC) was used to quantify the model's quality of fit after calibration with different combinations of selected parameters. The model with the lowest AIC gives the best combination of parameters to select. This methodology was validated by calibrating the model on an independent data set (same cultivar, another year) with the parameters selected in the second step. All the parameters were set to their nominal value; only the most influential ones were re-estimated. Sensitivity analysis applied to the calibration loss function is a relevant method to underline the most significant parameters in the estimation process. For the studied winter oilseed rape model, 11 out of 26 estimated parameters were selected. Then, the model could be recalibrated for a different data set by re-estimating only three parameters selected with the model selection method. Fitting only a small number of parameters dramatically increases the efficiency of recalibration, increases the robustness of the model and helps identify the principal sources of variation in varying environmental conditions. This innovative method still needs to be more widely validated but already gives interesting avenues to improve the calibration of FSPMs.
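
    A sketch of the two-step recipe, assuming the SALib package for the Sobol indices and a toy stand-in for the calibration loss; the parameter names, bounds and residual sums of squares are invented for illustration.

```python
# Step 1: rank parameters of a calibration loss by total-order Sobol index.
# Step 2: compare re-calibrations freeing different subsets via AIC.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

def loss(theta):                 # toy calibration loss; parameter "c" barely matters
    a, b, c = theta
    return (a - 1.0) ** 2 + 10 * (b - 2.0) ** 2 + 0.01 * (c - 0.5) ** 2

problem = {"num_vars": 3,
           "names": ["a", "b", "c"],
           "bounds": [[0, 2], [0, 4], [0, 1]]}

X = saltelli.sample(problem, 1024)                 # Saltelli sampling design
Y = np.array([loss(x) for x in X])
Si = sobol.analyze(problem, Y)                     # first- and total-order indices
ranking = sorted(zip(problem["names"], Si["ST"]), key=lambda p: -p[1])
print("total-order sensitivity ranking:", ranking)

def aic(rss, n, k):
    """AIC for a least-squares fit: n residuals, k free parameters."""
    return n * np.log(rss / n) + 2 * k

# e.g. compare freeing {a, b} vs {a, b, c} after re-calibration (toy RSS values)
print("AIC, 2 params:", aic(rss=4.2, n=100, k=2))
print("AIC, 3 params:", aic(rss=4.1, n=100, k=3))
```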

  3. An integrative research review of instruments measuring religious involvement: implications for nursing research with African Americans.

    PubMed

    Mokel, Melissa Jennifer; Shellman, Juliette M

    2013-01-01

    Many instruments that measure religious involvement often (a) contain unclear, poorly developed constructs; (b) lack methodological rigor in scale development; and (c) contain language and content culturally incongruent with the religious experiences of diverse ethnic groups. The primary aims of this review were to (a) synthesize the research on instruments designed to measure religious involvement, (b) evaluate the methodological quality of instruments that measure religious involvement, and (c) examine these instruments for conceptual congruency with African American religious involvement. An updated integrative research review method guided the process (Whittemore & Knafl, 2005). In total, 152 articles were reviewed and 23 articles retrieved. Only 3 retained instruments were developed under methodologically rigorous conditions. All 3 instruments were congruent with a conceptual model of African American religious involvement. The Fetzer Multidimensional Measure of Religious Involvement and Spirituality (FMMRS; Idler et al., 2003) was found to have favorable characteristics. Further examination and psychometric testing is warranted to determine its acceptability, readability, and cultural sensitivity in an African American population.

  4. Validation of a condition-specific measure for women having an abnormal screening mammography.

    PubMed

    Brodersen, John; Thorsen, Hanne; Kreiner, Svend

    2007-01-01

    The aim of this study is to assess the validity of a new condition-specific instrument measuring psychosocial consequences of abnormal screening mammography (PCQ-DK33). The draft version of the PCQ-DK33 was completed on two occasions by 184 women who had received an abnormal screening mammography and on one occasion by 240 women who had received a normal screening result. Item Response Theories and Classical Test Theories were used to analyze data. Construct validity, concurrent validity, known group validity, objectivity and reliability were established by item analysis examining the fit between item responses and Rasch models. Six dimensions covering anxiety, behavioral impact, sense of dejection, impact on sleep, breast examination, and sexuality were identified. One item belonging to the dejection dimension had uniform differential item functioning. Two items not fitting the Rasch models were retained because of high face validity. A sick leave item added useful information when measuring side effects and socioeconomic consequences of breast cancer screening. Five "poor items" were identified and should be deleted from the final instrument. Preliminary evidence for a valid and reliable condition-specific measure for women having an abnormal screening mammography was established. The measure includes 27 "good" items measuring different attributes of the same overall latent structure-the psychosocial consequences of abnormal screening mammography.

  5. Atmospheric conditions measured by a wireless sensor network on the local scale

    NASA Astrophysics Data System (ADS)

    Lengfeld, K.; Ament, F.

    2010-09-01

    Atmospheric conditions close to the surface, like temperature, wind speed and humidity, vary on small scales because of surface heterogeneities. Therefore, the traditional measuring approach of using a single, highly accurate station is of limited representativeness for a larger domain, because it is not able to capture these small-scale variabilities. However, both the variability and the domain averages are important information for the development and validation of atmospheric models and soil-vegetation-atmosphere-transfer (SVAT) schemes. Due to progress in microelectronics it is possible to construct networks of comparably cheap meteorological stations with moderate accuracy. Such a network provides data at high spatial and temporal resolution. EPFL Lausanne developed such a network, called SensorScope, consisting of low-cost autonomous stations. Each station observes air and surface temperature, humidity, wind direction and speed, incoming solar radiation, precipitation, soil moisture and soil temperature, and sends the data via radio communication to a base station. This base station forwards the collected data via GSM/GPRS to a central server. The first measuring campaign took place within the FLUXPAT project in August 2009. We deployed 15 stations as a twin transect near Jülich, Germany. To test the quality of the low-cost sensors, we compared two of them to more accurate reference systems. It turned out that although the network sensors are not highly accurate, the measurements are consistent. Consequently, an analysis of the pattern of atmospheric conditions is feasible. The transect is 2.3 km long and covers different types of vegetation and a small river. Therefore, we analyse the influence of different land surfaces and the distance to the river on meteorological conditions. For example, we found a difference in air temperature of 0.8°C between the station closest to and the station farthest from the river. The decreasing relative humidity with

  6. Sustainable Food Security Measurement: A Systemic Methodology

    NASA Astrophysics Data System (ADS)

    Findiastuti, W.; Singgih, M. L.; Anityasari, M.

    2017-04-01

    Sustainable food security measures how a region provides food for its people without endangering the environment. In Indonesia, it is legally measured in the Food Security and Vulnerability Atlas (FSVA). However, with regard to sustainable food security policy, the measurement has not encompassed the environmental aspect. This leads to a lack of environmental information for adjusting subsequent strategy. This study aimed to assess sustainable food security by encompassing both the food security and environmental aspects using systemic eco-efficiency. Given the existing indicator of cereal production level, total emissions were generated as the environmental indicator by constructing a Causal Loop Diagram (CLD). Then, a stock-flow diagram was used to develop a systemic simulation model. The model was demonstrated for five Indonesian provinces. The results showed a difference between the food security rankings with and without environmental assessment.

  7. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PCT) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  8. Cost-effectiveness methodology for computer systems selection

    NASA Technical Reports Server (NTRS)

    Vallone, A.; Bajaj, K. S.

    1980-01-01

    A new approach to the problem of selecting a computer system design has been developed. The purpose of this methodology is to identify a system design that is capable of fulfilling system objectives in the most economical way. The methodology characterizes each system design by the cost of the system life cycle and by the system's effectiveness in reaching objectives. Cost is measured by a 'system cost index' derived from an analysis of all expenditures and possible revenues over the system life cycle. Effectiveness is measured by a 'system utility index' obtained by combining the impact that each selection factor has on the system objectives, with each impact assessed through a 'utility curve'. A pre-established algorithm combines cost and utility and provides a ranking of the alternative system designs, from which the 'best' design is selected.

  9. Safety Assessment for a Surface Repository in the Chernobyl Exclusion Zone - Methodology for Assessing Disposal under Intervention Conditions - 13476

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haverkamp, B.; Krone, J.; Shybetskyi, I.

    The Radioactive Waste Disposal Facility (RWDF) Buryakovka was constructed in 1986 as part of the intervention measures after the accident at Chernobyl NPP (ChNPP). Today, RWDF Buryakovka is still being operated but its maximum capacity is nearly reached. Plans for enlargement of the facility have existed for more than 10 years but have not been implemented yet. In the framework of a European Commission project, DBE Technology GmbH prepared a safety analysis report of the facility in its current state (SAR) and a preliminary safety analysis report (PSAR) based on the planned enlargement. Due to its history, RWDF Buryakovka does not fully comply with today's best international practices and the latest Ukrainian regulations in this area. The most critical aspects are its inventory of long-lived radionuclides, and the non-existent multi-barrier waste confinement system. A significant part of the project was dedicated, therefore, to the development of a methodology for the safety assessment taking into consideration the facility's special situation, and to reaching an agreement with all stakeholders involved in the later review and approval procedure of the safety analysis reports. The main aspect of the agreed methodology was to analyze the safety not strictly based on regulatory requirements but on an assessment of the actual situation of the facility, including its location within the Exclusion Zone. For both safety analysis reports, SAR and PSAR, the assessment of the long-term safety led to results that were either within regulatory limits or within the limits allowing for a specific situational evaluation by the regulator. (authors)

  10. Intrarater reliability of measuring the patella position by ultrasonography in weight-bearing condition.

    PubMed

    Chen, Chia Lin; Lo, Chu Ling; Huang, Kai Chu; Huang, Chen Fu

    2017-10-01

    [Purpose] The aim of this study was to determine the intrarater reliability of using ultrasonography as a measurement tool to assess the patella position in a weight-bearing condition. [Subjects and Methods] Ten healthy adults participated in this study. Ultrasonography was used to assess the patella position during step down with the loading knee in flexion (0° and 20°). The distance between the patella and the lateral condyle was measured to represent the patella position on the condylar groove. Two measurements were obtained, on the first day and again one week later, by the same investigator. [Results] Excellent intrarater reliability, ranging from 0.83 to 0.93, was shown in both conditions. Standard errors of the measurements were 0.5 mm with the knee straight and 0.7 mm with the knee flexed at 20°. Minimal differences at 0° and 20° of knee flexion were 1.5 mm and 1.9 mm, respectively. [Conclusion] Ultrasonography is a reliable assessment tool for evaluating the positional changes of the patella in weight-bearing activities, and it can be easily used by practitioners in the clinical setting.
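
    The reported minimal differences are consistent with the common minimal-detectable-change formula MDC95 = 1.96 x sqrt(2) x SEM, assuming that is the definition the authors used (the abstract does not say). A quick check:

      # Hedged check: if MDC95 = 1.96 * sqrt(2) * SEM, the reported minimal
      # differences follow (small gaps plausibly reflect rounding of the SEMs).
      import math

      def mdc95(sem_mm):
          return 1.96 * math.sqrt(2) * sem_mm

      print(round(mdc95(0.5), 2))  # 1.39 mm vs reported 1.5 mm
      print(round(mdc95(0.7), 2))  # 1.94 mm vs reported 1.9 mm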

  11. Thermal-Diffusivity Measurements of Mexican Citrus Essential Oils Using Photoacoustic Methodology in the Transmission Configuration

    NASA Astrophysics Data System (ADS)

    Muñoz, G. A. López; González, R. F. López; López, J. A. Balderas; Martínez-Pérez, L.

    2011-05-01

    Photoacoustic methodology in the transmission configuration (PMTC) was used to study the thermophysical properties of Mexican citrus essential oils and their relation with composition, demonstrating the viability of photothermal techniques for quality control and for the authentication of oils and detection of their adulteration. Linear relations for the amplitude (on a semi-log scale) and phase, as functions of the sample's thickness, were obtained through a theoretical model fit to the experimental data for thermal-diffusivity measurements in Mexican orange, pink grapefruit, mandarin, and lime type A centrifuged essential oils, and in Mexican distilled lime essential oil. Gas chromatography for distilled lime essential oil and centrifuged lime essential oil type A is reported to complement the study. Experimental results showed close thermal-diffusivity values among the Mexican citrus essential oils obtained by centrifugation, but a significant difference between this physical property for distilled lime oil and the corresponding value obtained by centrifugation, which is due to the different chemical compositions resulting from the two extraction processes.
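
    A hedged sketch of how a thermal diffusivity follows from such linear relations: in the thermally thick regime the transmission-phase lag commonly varies with sample thickness L as -sqrt(pi*f/alpha)*L, so the slope of a phase-versus-thickness fit yields alpha. The regime assumption and the numbers below are illustrative, not the paper's data.

      # Recover a thermal diffusivity from the slope of phase vs thickness,
      # assuming the thermally thick transmission relation phase = -m*L with
      # m = sqrt(pi * f / alpha). Synthetic values typical of oils.
      import numpy as np

      f = 1.0                                   # modulation frequency, Hz
      alpha_true = 8e-8                         # m^2/s
      L = np.linspace(50e-6, 300e-6, 8)         # sample thicknesses, m
      phase = -np.sqrt(np.pi * f / alpha_true) * L

      m, _ = np.polyfit(L, phase, 1)            # fitted slope (negative)
      print("alpha ~", np.pi * f / m**2, "m^2/s")  # recovers ~8e-8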

  12. Cryogenic Insulation Standard Data and Methodologies Project

    NASA Technical Reports Server (NTRS)

    Summerfield, Burton; Thompson, Karen; Zeitlin, Nancy; Mullenix, Pamela; Fesmire, James; Swanger, Adam

    2015-01-01

    Extending some recent developments in the area of technical consensus standards for cryogenic thermal insulation systems, a preliminary inter-laboratory study of foam insulation materials was performed by NASA Kennedy Space Center and LeTourneau University. The initial focus was ambient-pressure cryogenic boiloff testing using the Cryostat-400 flat-plate instrument. Completion of a test facility at LETU has enabled direct, comparative testing, using identical cryostat instruments and methods, and the production of standard thermal data sets for a number of materials under sub-ambient conditions. The two sets of measurements were analyzed and indicate reasonable agreement between the two laboratories. Based on cryogenic boiloff calorimetry, new equipment and methods for testing thermal insulation systems have been successfully developed. These boiloff instruments (or cryostats) include both flat-plate and cylindrical models and are applicable to a wide range of materials under a wide range of test conditions. Test measurements are generally made at a large temperature difference (boundary temperatures of 293 K and 78 K are typical) and include the full vacuum pressure range. Results are generally reported as effective thermal conductivity (ke) and mean heat flux (q) through the insulation system. The new cryostat instruments provide an effective and reliable way to characterize the thermal performance of materials under sub-ambient conditions. Proven through thousands of tests of hundreds of material systems, they have supported a wide range of aerospace, industry, and research projects. Boiloff testing technology is not just for cryogenic testing but is a cost-effective, field-representative methodology to test any material or system for applications at sub-ambient temperatures. This technology, when adequately coupled with a technical standards basis, can provide a cost-effective, field-representative methodology to test any material or system

  13. Classicality condition on a system observable in a quantum measurement and a relative-entropy conservation law

    NASA Astrophysics Data System (ADS)

    Kuramochi, Yui; Ueda, Masahito

    2015-03-01

    We consider the information flow on a system observable X corresponding to a positive-operator-valued measure under a quantum measurement process Y described by a completely positive instrument, from the viewpoint of the relative entropy. We establish a sufficient condition for the relative-entropy conservation law, which states that the average decrease in the relative entropy of the system observable X equals the relative entropy of the measurement outcome of Y, i.e., the information gain due to measurement. This sufficient condition is interpreted as an assumption of classicality in the sense that there exists a sufficient statistic in a joint successive measurement of Y followed by X such that the probability distribution of the statistic coincides with that of a single measurement of X for the premeasurement state. We show that in the case when X is a discrete projection-valued measure and Y is discrete, the classicality condition is equivalent to the relative-entropy conservation for arbitrary states. The general theory on relative-entropy conservation is applied to typical quantum measurement models, namely, quantum nondemolition measurement, destructive sharp measurements on two-level systems, photon counting, quantum counting, and homodyne and heterodyne measurements. These examples, except for the nondemolition and photon-counting measurements, do not satisfy the known Shannon-entropy conservation law proposed by Ban [M. Ban, J. Phys. A: Math. Gen. 32, 1643 (1999), 10.1088/0305-4470/32/9/012], implying that our approach based on the relative entropy is applicable to a wider class of quantum measurements.
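
    In symbols, and with notation assumed here rather than taken from the paper, the conservation law described above can be written for a pair of states rho and sigma as

      \[
        D\left(p^{X}_{\rho}\,\middle\|\,p^{X}_{\sigma}\right)
        - \sum_{y} p^{Y}_{\rho}(y)\,
          D\left(p^{X}_{\rho_{y}}\,\middle\|\,p^{X}_{\sigma_{y}}\right)
        = D\left(p^{Y}_{\rho}\,\middle\|\,p^{Y}_{\sigma}\right),
      \]

    where p^X_rho denotes the outcome distribution of X in state rho, rho_y (sigma_y) the post-measurement state after outcome y of Y, and D the classical relative entropy: the left-hand side is the average decrease in the relative entropy of X, the right-hand side the information gain of Y.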

  14. A stochastic optimal feedforward and feedback control methodology for superagility

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim; Direskeneli, Haldun; Taylor, Deborah B.

    1992-01-01

    A new control design methodology is developed: Stochastic Optimal Feedforward and Feedback Technology (SOFFT). Traditional design techniques optimize a single cost function (which expresses the design objectives) to obtain both the feedforward and feedback control laws. This approach places conflicting demands on the control law, such as fast tracking versus noise attenuation/disturbance rejection. In the SOFFT approach, two cost functions are defined. The feedforward control law is designed to optimize one cost function, and the feedback control law optimizes the other. By separating the design objectives and decoupling the feedforward and feedback design processes, both objectives can be achieved fully. A new measure of command-tracking performance, Z-plots, is also developed. By analyzing these plots at off-nominal conditions, the sensitivity or robustness of the system in tracking commands can be predicted. Z-plots provide an important tool for designing robust control systems. The Variable-Gain SOFFT methodology was used to design a flight control system for the F/A-18 aircraft. It is shown that SOFFT can be used to expand the operating regime and provide greater performance (flying/handling qualities) throughout the extended flight regime. This work was performed under the NASA SBIR program. ICS plans to market the software developed as a new module in its commercial CACSD software package: ACET.
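
    A minimal sketch of the decoupling idea, not the SOFFT algorithm itself: below, the feedback gain is designed against a regulation cost (plain LQR) while the feedforward gains come from a separate steady-state tracking computation. The two-state plant and all numbers are assumptions for illustration.

      # Feedback and feedforward designed separately for x' = Ax + Bu, y = Cx.
      import numpy as np
      from scipy.linalg import solve_continuous_are

      A = np.array([[0.0, 1.0], [-2.0, -0.5]])
      B = np.array([[0.0], [1.0]])
      C = np.array([[1.0, 0.0]])

      # Feedback law u_fb = -K x from the regulation cost integral(x'Qx + u'Ru).
      Q, R = np.eye(2), np.array([[1.0]])
      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.solve(R, B.T @ P)

      # Feedforward gains (Nx, Nu) from the steady-state tracking conditions
      # A x_ss + B u_ss = 0 and C x_ss = r, solved independently of Q and R.
      M = np.block([[A, B], [C, np.zeros((1, 1))]])
      sol = np.linalg.solve(M, np.array([0.0, 0.0, 1.0]))
      Nx, Nu = sol[:2], sol[2:]       # u = Nu*r + K @ (Nx*r - x)
      print(K.round(3), Nx.round(3), Nu.round(3))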

  15. Scanning Laser Doppler Vibrometer Measurements Inside Helicopter Cabins in Running Conditions: Problems and Mock-up Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Revel, G. M.; Castellini, P.; Chiariotti, P.

    2010-05-28

    The present work deals with the analysis of problems and potentials of laser vibrometer measurements inside helicopter cabins in running conditions. The paper describes the results of a systematic measurement campaign performed on an Agusta A109MKII mock-up. The aim is to evaluate the applicability of the Scanning Laser Doppler Vibrometer (SLDV) for tests in simulated flying conditions and to understand how the performance of the technique is affected when the laser head is placed inside the cabin, thus being subjected to interfering inputs. First, a brief description of the test cases performed and the measuring set-ups used is given. Comparative tests between SLDV and accelerometers are presented, analyzing the achievable performance for the specific application. Results obtained measuring with the SLDV placed inside the helicopter cabin during operative excitation conditions are compared with those obtained with the laser lying outside the mock-up, the latter being considered as 'reference measurements'. Finally, in order to give an estimate of the uncertainty level on measured signals, a study linking the admitted percentage of noise content on vibrometer signals to laser head vibration levels is introduced.

  16. Thermoregulatory responses in exercising rats: methodological aspects and relevance to human physiology.

    PubMed

    Wanner, Samuel Penna; Prímola-Gomes, Thales Nicolau; Pires, Washington; Guimarães, Juliana Bohnen; Hudson, Alexandre Sérvulo Ribeiro; Kunstetter, Ana Cançado; Fonseca, Cletiana Gonçalves; Drummond, Lucas Rios; Damasceno, William Coutinho; Teixeira-Coelho, Francisco

    2015-01-01

    Rats are used worldwide in experiments that aim to investigate the physiological responses induced by a physical exercise session. Changes in body temperature regulation, which may affect both the performance and the health of exercising rats, are evident among these physiological responses. Despite the universal use of rats in biomedical research involving exercise, investigators often overlook important methodological issues that hamper the accurate measurement of clear thermoregulatory responses. Moreover, much debate exists regarding whether the outcome of rat experiments can be extrapolated to human physiology, including thermal physiology. Herein, we described the impact of different exercise intensities, durations and protocols and environmental conditions on running-induced thermoregulatory changes. We focused on treadmill running because this type of exercise allows for precise control of the exercise intensity and the measurement of autonomic thermoeffectors associated with heat production and loss. Some methodological issues regarding rat experiments, such as the sites for body temperature measurements and the time of day at which experiments are performed, were also discussed. In addition, we analyzed the influence of a high body surface area-to-mass ratio and limited evaporative cooling on the exercise-induced thermoregulatory responses of running rats and then compared these responses in rats to those observed in humans. Collectively, the data presented in this review represent a reference source for investigators interested in studying exercise thermoregulation in rats. In addition, the present data indicate that the thermoregulatory responses of exercising rats can be extrapolated, with some important limitations, to human thermal physiology.

  17. Thermoregulatory responses in exercising rats: methodological aspects and relevance to human physiology

    PubMed Central

    Wanner, Samuel Penna; Prímola-Gomes, Thales Nicolau; Pires, Washington; Guimarães, Juliana Bohnen; Hudson, Alexandre Sérvulo Ribeiro; Kunstetter, Ana Cançado; Fonseca, Cletiana Gonçalves; Drummond, Lucas Rios; Damasceno, William Coutinho; Teixeira-Coelho, Francisco

    2015-01-01

    Rats are used worldwide in experiments that aim to investigate the physiological responses induced by a physical exercise session. Changes in body temperature regulation, which may affect both the performance and the health of exercising rats, are evident among these physiological responses. Despite the universal use of rats in biomedical research involving exercise, investigators often overlook important methodological issues that hamper the accurate measurement of clear thermoregulatory responses. Moreover, much debate exists regarding whether the outcome of rat experiments can be extrapolated to human physiology, including thermal physiology. Herein, we described the impact of different exercise intensities, durations and protocols and environmental conditions on running-induced thermoregulatory changes. We focused on treadmill running because this type of exercise allows for precise control of the exercise intensity and the measurement of autonomic thermoeffectors associated with heat production and loss. Some methodological issues regarding rat experiments, such as the sites for body temperature measurements and the time of day at which experiments are performed, were also discussed. In addition, we analyzed the influence of a high body surface area-to-mass ratio and limited evaporative cooling on the exercise-induced thermoregulatory responses of running rats and then compared these responses in rats to those observed in humans. Collectively, the data presented in this review represent a reference source for investigators interested in studying exercise thermoregulation in rats. In addition, the present data indicate that the thermoregulatory responses of exercising rats can be extrapolated, with some important limitations, to human thermal physiology. PMID:27227066

  18. Reliability of intraventricular pressure measurement with fiberoptic or solid-state transducers: avoidance of a methodological error.

    PubMed

    Raabe, A; Stöckel, R; Hohrein, D; Schöche, J

    1998-01-01

    The failure of intraventricular pressure measurement in cases of catheter blockage or dislodgement is thought to be eliminated by using intraventricular microtransducers. We report on an avoidable methodological error that may affect the reliability of intraventricular pressure measurement with these devices. Intraventricular fiberoptic or solid-state devices were implanted in 43 patients considered to be at risk for developing catheter occlusion. Two different types were used, i.e., devices in which the transducer is placed inside the ventriculostomy catheter (Type A) and devices in which the transducer is integrated in the external surface of the catheter (Type B). Type A devices were used in 15 patients and Type B devices in 28 patients. Pressure recordings were checked at the bedside for the validity and reliability of the measurement. Of the 15 patients treated with Type A devices, no reliable pressure recordings could be obtained in the three patients in whom ventricular puncture was not successful. In 4 of the remaining 12 patients, periods of erroneous pressure readings were detected. After opening of cerebrospinal fluid drainage, all Type A devices failed to reflect the real intraventricular pressure. In patients treated with Type B devices, no erroneous pressure recordings were identified, irrespective of whether cerebrospinal fluid drainage was performed. Even when ventricular puncture failed, pressure measurement was correct each time. Transducers that are simply placed inside the ventriculostomy catheter require fluid coupling. They may fail either during cerebrospinal fluid drainage or when the catheter is blocked or placed within the parenchyma.

  19. Methodology for astronaut reconditioning research.

    PubMed

    Beard, David J; Cook, Jonathan A

    2017-01-01

    Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identifying which aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to terrestrial research on small populations, such as rare diseases and various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure the feasibility of research protocols. There is room for creative and hybrid methodology, but careful systematic observation is likely to be more achievable and fruitful than complex trial-based comparisons. Multi-space-agency collaboration will be critical to pool data from small groups of astronauts, with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherently small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Rational Design Methodology.

    DTIC Science & Technology

    1978-09-01

    This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a

  1. Using Indigenist and Indigenous methodologies to connect to deeper understandings of Aboriginal and Torres Strait Islander peoples' quality of life.

    PubMed

    Kite, Elaine; Davy, Carol

    2015-12-01

    The lack of a common description makes measuring the concept of quality of life (QoL) a challenge. Whether QoL incorporates broader social features or is attributed to health conditions, the diverse range of descriptions applied by various disciplines has resulted in a concept that is multidimensional and vague. The variety of theoretical conceptualisations of QoL confounds and confuses even the most astute. Measuring QoL in Aboriginal and Torres Strait Islander populations is even more challenging. Instruments commonly developed and used to measure QoL are often derived from research methodologies shaped by Western cultural perspectives. Often they are simply translated for use among culturally and linguistically diverse Aboriginal and Torres Strait Islander peoples. This has implications for Aboriginal and Torres Strait Islander populations whose perceptions of health are derived from within their specific cultures, value systems and ways of knowing and being. Interconnections and relationships between themselves, their communities, their environment and the natural and spiritual worlds are complex. The way in which their QoL is currently measured indicates that very little attention is given to the diversity of Aboriginal and Torres Strait Islander peoples' beliefs or the ways in which those beliefs shape or give structure and meaning to their health and their lives. The use of Indigenist or Indigenous methodologies in defining what Aboriginal and Torres Strait Islander peoples believe gives quality to their lives is imperative. These methodologies have the potential to increase the congruency between their perceptions of QoL and instruments to measure it.

  2. [Optimization of extraction technology from Paeoniae Radix Alba using response surface methodology].

    PubMed

    Jin, Lin; Zhao, Wan-shun; Guo, Qiao-sheng; Zhang, Wen-sheng; Ye, Zheng-liang

    2015-08-01

    To ensure the stability of the chemical components and convenience of operation, the ultrasound method was chosen for this investigation. With the total common peak area in the chromatograms set as the evaluation index, the influence of extraction time, ethanol concentration and liquid-to-solid ratio on the technology was studied using single-factor methodology, and the extraction technology of Paeoniae Radix Alba was optimized using response surface methodology. The results showed that the extraction results were most affected by ethanol concentration; liquid-to-solid ratio came second and extraction time third. The optimum ultrasonic-assisted extraction conditions were as follows: an ultrasonic extraction time of 20.06 min, an ethanol concentration in solvent of 72.04%, and a liquid-to-solid ratio of 53.38 mL · g(-1); the predicted value of the total common peak area was 2.1608 x 10(8). Under the optimized extraction conditions, the total common peak area was 2.1422 x 10(8), and the relative deviation between the measured and predicted values was 0.86%, so the optimized extraction technology for Paeoniae Radix Alba is suitable and feasible. Moreover, because it extracts more sufficiently and completely, the optimized extraction technology has advantages over the extraction method recorded in the monograph of Paeoniae Radix Alba in the Chinese Pharmacopoeia, which will enable more comprehensive assessment and utilization of this herb.
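
    A hedged sketch of the response-surface step common to this and the following records: fit a full quadratic model over the factors and locate the in-range optimum. The data below are synthetic placeholders with an optimum planted near the reported conditions, not the study's measurements.

      # Fit y = quadratic(time, ethanol %, liquid-to-solid ratio) by least
      # squares, then maximize the fitted surface within the factor bounds.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      X = rng.uniform([10, 50, 30], [30, 90, 70], size=(20, 3))

      def quad(x):
          t, e, r = x
          return np.array([1, t, e, r, t*t, e*e, r*r, t*e, t*r, e*r])

      # synthetic response with an interior optimum near (20, 72, 53)
      y = np.array([100 - (t - 20)**2 / 4 - (e - 72)**2 / 2 - (r - 53)**2 / 3
                    for t, e, r in X]) + rng.normal(0, 0.5, 20)

      beta, *_ = np.linalg.lstsq(np.array([quad(x) for x in X]), y, rcond=None)
      res = minimize(lambda x: -quad(x) @ beta, x0=[20.0, 70.0, 50.0],
                     bounds=[(10, 30), (50, 90), (30, 70)])
      print("estimated optimum (time, ethanol %, ratio):", res.x.round(2))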

  3. Optimization of hydrolysis conditions for the production of glucomanno-oligosaccharides from konjac using β-mannanase by response surface methodology.

    PubMed

    Chen, Junfan; Liu, Desheng; Shi, Bo; Wang, Hai; Cheng, Yongqiang; Zhang, Wenjing

    2013-03-01

    Glucomanno-oligosaccharides (GMO), usually produced from the hydrolysis of konjac tubers with a high content of glucomannan, have a positive effect on Bifidobacterium as well as a variety of other physiological activities. Response surface methodology (RSM) was employed to optimize the hydrolysis time, hydrolysis temperature, pH and enzyme-to-substrate ratio (E/S) to obtain a high GMO yield from konjac tubers. From the single-factor experiments, it was concluded that the change in direct reducing sugar (DRS) is consistent with that in total reducing sugar (TRS) but contrary to the degree of polymerization (DP). DRS was used as an indicator of the content of GMO in the RSM study. The optimum RSM operating conditions were: a reaction time of 3.4 h, a reaction temperature of 41.0°C, pH of 7.1 and E/S of 0.49. The results suggested that the enzymatic hydrolysis was enhanced by temperature, pH and incubation time. Model validation showed good agreement between the experimental results and the predicted responses. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Optimization of Culture Conditions for Enhanced Growth, Lipid and Docosahexaenoic Acid (DHA) Production of Aurantiochytrium SW1 by Response Surface Methodology.

    PubMed

    Nazir, Yusuf; Shuib, Shuwahida; Kalil, Mohd Sahaid; Song, Yuanda; Hamid, Aidil Abdul

    2018-06-11

    In this study, optimization of the growth, lipid and DHA production of Aurantiochytrium SW1 was carried out using response surface methodology (RSM), optimizing initial fructose concentration, agitation speed and monosodium glutamate (MSG) concentration. Central composite design was applied as the experimental design, and analysis of variance (ANOVA) was used to analyze the data. The ANOVA analysis revealed that the process, adequately represented by a quadratic model, was significant (p < 0.0001) for all the responses. All three factors were significant (p < 0.005) in influencing the biomass and lipid data, while only two factors (agitation speed and MSG) had a significant effect on DHA production (p < 0.005). The estimated optimal conditions for enhanced growth, lipid and DHA production were 70 g/L fructose, 250 rpm agitation speed and 10 g/L MSG. The quadratic model was then validated by applying the estimated optimum conditions, which confirmed the model's validity: 19.0 g/L biomass, 9.13 g/L lipid and 4.75 g/L of DHA were produced. The growth, lipid and DHA were 28, 36 and 35% higher, respectively, than those produced in the original medium prior to optimization.

  5. Measurements of elastic moduli of pharmaceutical compacts: a new methodology using double compaction on a compaction simulator.

    PubMed

    Mazel, Vincent; Busignies, Virginie; Diarra, Harona; Tchoreloff, Pierre

    2012-06-01

    The elastic properties of pharmaceutical powders play an important role during the compaction process. The elastic behavior can be represented by Young's modulus (E) and Poisson's ratio (ν). However, during compaction, the density of the powder bed changes, and the moduli must be determined as a function of the porosity. This study proposes a new methodology to determine E and ν as functions of the porosity using double compaction in an instrumented compaction simulator. Precompression is used to form the compact, and the elastic properties are measured during the beginning of the main compaction. By measuring the axial and radial pressure and the powder bed thickness, E and ν can be determined as functions of the porosity. Two excipients were studied, microcrystalline cellulose (MCC) and anhydrous calcium phosphate (aCP). The values of E measured are comparable to those obtained using the classical three-point bending test. Poisson's ratio was found to be close to 0.24 for aCP, with only small variations with the porosity, and to increase with decreasing porosity for MCC (0.23-0.38). The classical approximation of a value of 0.3 for ν of pharmaceutical powders should therefore be treated with caution. Copyright © 2012 Wiley Periodicals, Inc.
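
    A hedged reconstruction of how axial pressure, radial pressure and thickness can yield both moduli: if the powder bed in the die is treated as linearly elastic under uniaxial strain (no radial displacement), standard elasticity gives

      \[
        \frac{\sigma_r}{\sigma_a} = \frac{\nu}{1-\nu},
        \qquad
        E = \frac{\sigma_a\,(1+\nu)(1-2\nu)}{(1-\nu)\,\varepsilon_a},
      \]

    so the radial-to-axial pressure ratio fixes ν, and the axial strain ε_a (from the powder bed thickness) then fixes E. This is the textbook confined-compression relation, offered here only as an illustration; the paper's exact treatment may differ.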

  6. Issues Related to Measuring and Interpreting Objectively Measured Sedentary Behavior Data

    ERIC Educational Resources Information Center

    Janssen, Xanne; Cliff, Dylan P.

    2015-01-01

    The use of objective measures of sedentary behavior has increased over the past decade; however, as is the case for objectively measured physical activity, methodological decisions before and after data collection are likely to influence the outcomes. The aim of this article is to review the evidence on different methodological decisions made by…

  7. Optimization of the Expression Conditions of CGA-N46 in Bacillus subtilis DB1342(p-3N46) by Response Surface Methodology.

    PubMed

    Li, Rui-Fang; Wang, Bin; Liu, Shuai; Chen, Shi-Hua; Yu, Guang-Hai; Yang, Shuo-Ye; Huang, Liang; Yin, Yan-Li; Lu, Zhi-Fang

    2016-09-01

    CGA-N46 is a small antifungal-derived peptide consisting of the 31st-76th amino acids of the N-terminus of human chromogranin A. Polycistronic expression of recombinant CGA-N46 in Bacillus subtilis DB1342 was used to improve its production, but the yield of CGA-N46 was still low. In the present study, response surface methodology (RSM) was used to optimize the culture medium composition and growth conditions of the engineered strain B. subtilis DB1342(p-3N46) to further increase the CGA-N46 yield. The results of two-level factorial experiments indicated that dextrin and tryptone were the significant factors affecting CGA-N46 expression. Central composite design (CCD) was used to determine the ideal level of each significant factor. From the results of the CCD, the optimal medium composition was predicted to be dextrin 16.6 g/L, tryptone 19.2 g/L, KH2PO4·H2O 6 g/L, pH 6.5. The optimal culture process was inoculation of B. subtilis DB1342(p-3N46) seed culture into fresh culture medium at 5 % (v/v), followed by expression of CGA-N46 for 56 hours at 30 °C, induced by 2 % (v/v) sucrose after one hour of shaking culture. To test optimal CGA-N46 peptide expression, a yeast growth inhibition assay was employed, and it was found that under the optimal culture conditions CGA-N46 inhibited the growth of Candida albicans by 42.17 %, which is 30.86 % more than under the pre-optimization conditions. In summary, RSM can be used to optimize the expression conditions of CGA-N46 in the engineered strain B. subtilis DB1342(p-3N46).
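
    For readers unfamiliar with the CCD layout used above and in the preceding records, the sketch below generates the coded design points for three factors. The face-centered choice (axial points at ±1) is an assumption for illustration; rotatable designs instead place them at ±(2^3)^(1/4) ≈ 1.68.

      # Coded runs of a face-centered central composite design, k = 3 factors:
      # 8 factorial corners, 6 axial points, and replicated center points.
      import itertools
      import numpy as np

      factorial = np.array(list(itertools.product([-1, 1], repeat=3)))
      axial = np.array([sign * np.eye(3)[i]
                        for i in range(3) for sign in (1, -1)])
      center = np.zeros((3, 3))
      ccd = np.vstack([factorial, axial, center])
      print(ccd.shape)  # (17, 3) coded runs, mapped onto the real factor ranges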

  8. Electrical conductivity measurement of granulite under mid- to lower crustal pressure-temperature conditions

    NASA Astrophysics Data System (ADS)

    Fuji-ta, K.; Katsura, T.; Tainosho, Y.

    2004-04-01

    We have developed a technique to measure the electrical conductivity of crustal rocks that have relatively low conductivity and complicated mineral components, in order to compare with the results given by magneto-telluric (MT) measurements. A granulite formed at high temperature and pressure conditions was obtained from the Hidaka metamorphic belt (HMB) in Hokkaido, Japan. The granulite sample was ground and sintered under conditions similar to those of the mid- to lower crust. We observed a smooth and reversible change of conductivity with temperature up to about 900 K at 1 GPa. The results were consistent with the electrical conductivity structures suggested by the MT data analysis. Considering the pore-fluid conduction mechanism and the role of accessory minerals in the rock, the mechanisms of electrical conduction paths in dry or basic rocks should be reconsidered.
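
    Laboratory conductivity-temperature data of this kind are commonly summarized by an Arrhenius law, sigma = sigma0 * exp(-Ea / (k_B * T)); whether these authors used that form is not stated, so the fit below is a generic, hedged illustration on synthetic values.

      # Estimate an activation energy from log(sigma) vs 1/T.
      import numpy as np

      k_B = 8.617e-5                              # Boltzmann constant, eV/K
      T = np.linspace(500, 900, 9)                # temperature, K
      sigma = 1e2 * np.exp(-0.8 / (k_B * T))      # synthetic conductivity, S/m

      slope, intercept = np.polyfit(1.0 / T, np.log(sigma), 1)
      print("activation energy ~", round(-slope * k_B, 3), "eV")  # ~0.8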

  9. Archetype modeling methodology.

    PubMed

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Infrared Spectroscopy of Pollen Identifies Plant Species and Genus as Well as Environmental Conditions

    PubMed Central

    Zimmermann, Boris; Kohler, Achim

    2014-01-01

    Background: It is imperative to have reliable and timely methodologies for the analysis and monitoring of seed plants in order to determine climate-related plant processes. Moreover, the impact of the environment on plant fitness is predominantly based on studies of female functions, while the contribution of male gametophytes is mostly ignored due to missing data on pollen quality. We explored the use of infrared spectroscopy of pollen for an inexpensive and rapid characterization of plants. Methodology: The study was based on the measurement of pollen samples by two Fourier transform infrared techniques: single-reflectance attenuated total reflectance and transmission measurement of sample pellets. The experimental set, with a total of 813 samples, included five pollination seasons and 300 different plant species belonging to all principal spermatophyte clades (conifers, monocotyledons, eudicots, and magnoliids). Results: The spectroscopy-based methodology enables the detection of phylogenetic variations, including the separation of confamiliar and congeneric species. Furthermore, the methodology enables the measurement of phenotypic plasticity by the detection of inter-annual variations within populations. The spectral differences related to environment and taxonomy are interpreted biochemically, specifically as variations of pollen lipids, proteins, carbohydrates, and sporopollenins. The study shows large variations in the absolute content of nutrients for congeneric species pollinating under the same environmental conditions. Moreover, a clear correlation between the carbohydrate-to-protein ratio and pollination strategy has been detected. An infrared spectral database covering the biochemical variation among the range of species, climates and biogeographies will significantly improve comprehension of plant-environment interactions, including the impact of global climate change on plant communities. PMID:24748390

  11. Spray ignition measurements in a constant volume combustion vessel under engine-relevant conditions

    NASA Astrophysics Data System (ADS)

    Ramesh, Varun

    Pressure-based and optical diagnostics for ignition delay (ID) measurement of a diesel spray from a multi-hole nozzle were investigated in a constant volume combustion vessel (CVCV) at conditions similar to those in a conventional diesel engine at the start of injection (SOI). It was first hypothesized that, compared to an engine, the shorter ID in a CVCV was caused by NO, a byproduct of the premixed combustion used to generate the hot ambient in the vessel. The presence of a significant concentration of NO+NO2 was confirmed experimentally and by using a multi-zone model of the premixed combustion. Experiments measuring the effect of NO on ID were performed at conditions relevant to a conventional diesel engine. Depending on the temperature regime and the nature of the fuel, NO addition was found to advance or retard ignition. Constant-volume ignition simulations were capable of describing the observed trends; the magnitudes were different due to the physical processes involved in spray ignition, which were not modeled in the current study. The results of the study showed that ID is sensitive to low NO concentrations (<100 PPM) in the low-temperature regime. A second source of uncertainty in pressure-based ID measurement is the systematic error associated with the correction used to account for the speed of sound. Simultaneous measurements of volumetric OH chemiluminescence (OHC) and pressure during spray ignition found the OHC to closely resemble the pressure-based heat release rate for the full combustion duration. The start of OHC was always found to be shorter than the pressure-based ID by 100 ms for all fuels and conditions tested. Experiments were also conducted measuring the location and timing of high-temperature ignition and the steady-state lift-off length by high-speed imaging of OHC during spray ignition. The delay period calculated using the measured ignition location and the bulk average speed of sound was in agreement with the delay between OHC and the pressure-based ID. Results of the study show that start of OHC
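
    A hedged illustration of the speed-of-sound correction mentioned above: the pressure rise from ignition reaches the transducer only after an acoustic transit time t = L / c, with c ≈ sqrt(gamma * R * T) for an ideal gas. The geometry, gas properties and temperature below are illustrative, not the study's values.

      # Acoustic transit-time correction for pressure-based ignition delay.
      import math

      def acoustic_delay(distance_m, temperature_k, gamma=1.30, r_specific=288.0):
          c = math.sqrt(gamma * r_specific * temperature_k)   # speed of sound, m/s
          return distance_m / c

      # e.g. 8 cm from ignition site to transducer in ~900 K ambient gas
      print(round(acoustic_delay(0.08, 900) * 1e6, 1), "microseconds")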

  12. Effect of boundary conditions on measured water retention behavior within soils

    NASA Astrophysics Data System (ADS)

    Galindo-torres, S.; Scheuermann, A.; Pedroso, D.; Li, L.

    2013-12-01

    The Soil Water Characteristic Curve (SWCC) is a practical representation of the behavior of soil water that relates the suction (the difference between the air and water pressures) to the moisture content (water saturation). The SWCC is characterized by a hysteresis loop, which is thought to be unique in that any drainage-imbibition cycle lies within a main hysteresis loop limited by two different curves for drainage and imbibition. This 'uniqueness' is the main argument for considering the SWCC a material-intrinsic feature that characterizes the pore structure and its interaction with fluids. Models have been developed with the SWCC as input data to describe the evolution of the water saturation and the suction within soils; one example is the widely used Richards' equation [1]. In this work we present a series of numerical simulations to evaluate the 'unique' nature of the SWCC. The simulations involve the use of the Lattice Boltzmann Method (LBM) [2] within a regular soil, modelling the flow behavior of two immiscible fluids: wetting and non-wetting. The soil is packed within a cubic domain to resemble the experimental setups that are commonly used for measuring the SWCC [3]. The boundary conditions ensure that the non-wetting phase enters through one cubic face and the wetting phase enters through the opposite face, with no-flow boundary conditions on the remaining 4 cubic faces. The known features of the SWCC are inspected, including the presence of the common limit curves for different cycles involving varying suction limits. For this stage of simulations, the SWCC is indeed unique. Later, different boundary conditions are applied, with each of the two fluids injected from 3 opposing faces into the porous medium. The effect of this boundary condition change is a net flow direction different from that in the previous case. A striking result is observed when both SWCCs are compared and found to be noticeably different. Further analysis is

  13. Stomatal oscillations in olive trees: analysis and methodological implications.

    PubMed

    López-Bernal, Alvaro; García-Tejera, Omar; Testi, Luca; Orgaz, Francisco; Villalobos, Francisco J

    2018-04-01

    Stomatal oscillations have long been disregarded in the literature despite the fact that the phenomenon has been described for a variety of plant species. This study aims to characterize the occurrence of oscillations in olive trees (Olea europaea L.) under different growing conditions and its methodological implications. Three experiments with young potted olives and one with large field-grown trees were performed. Sap flow measurements were always used to monitor the occurrence of oscillations, with additional determinations of trunk diameter variations and leaf-level stomatal conductance, photosynthesis and water potential also conducted in some cases. Strong oscillations with periods of 30-60 min were generally observed for young trees, while large field trees rarely showed significant oscillations. Severe water stress led to the disappearance of oscillations, but moderate water deficits occasionally promoted them. Simultaneous oscillations were also found for leaf stomatal conductance, leaf photosynthesis and trunk diameter, with the former presenting the highest amplitudes. The strong oscillations found in young potted olive trees preclude the use of infrequent measurements of stomatal conductance and related variables to characterize differences between trees of different cultivars or subjected to different experimental treatments. Under these circumstances, our results suggest that reliable estimates could be obtained using measurement intervals below 15 min.

  14. Electrical Conductivity Measurement of Granulite Under Mid to Lower Crustal Pressure-Temperature Conditions

    NASA Astrophysics Data System (ADS)

    Fuji-Ta, K.; Katsura, T.; Tainosho, Y.

    2003-12-01

    We have developed a technique to measure the electrical conductivity of crustal rocks that have relatively low conductivity and complicated mineral components, in order to compare with the results given by Magneto-Telluric (MT) measurements. A granulite formed at high temperature and pressure conditions was obtained from the Hidaka Metamorphic Belt (HMB) in Hokkaido, Japan. The granulite sample was ground and sintered under conditions similar to those of the mid- to lower crust. We observed a smooth and reversible change of conductivity with temperature up to about 900 K at 1 GPa. Through qualitative and quantitative evaluations using Electron Probe Micro Analysis (EPMA), the microstructures of the sintered sample were inspected. This inspection is essential to confirm that the sample was not affected by chemical interaction of minerals. We also examined the role of accessory minerals in the rock, and conclude that the mechanisms of electrical conduction paths in 'dry' or 'basic' rocks should be reconsidered. Finally, the results from the electrical conductivity measurements were consistent with the electrical conductivity structures suggested by the earlier MT data analysis.

  15. A Methodology for Loading the Advanced Test Reactor Driver Core for Experiment Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cowherd, Wilson M.; Nielsen, Joseph W.; Choe, Dong O.

    In support of experiments in the ATR, a new methodology was devised for loading the ATR Driver Core. This methodology will replace the existing methodology used by the INL Neutronic Analysis group to analyze experiments. This paper presents the as-run analysis for ATR Cycle 152B, specifically comparing measured lobe powers with eigenvalue calculations.

  16. Importance of methodological standardization for the ektacytometric measures of red blood cell deformability in sickle cell anemia.

    PubMed

    Renoux, Céline; Parrow, Nermi; Faes, Camille; Joly, Philippe; Hardeman, Max; Tisdale, John; Levine, Mark; Garnier, Nathalie; Bertrand, Yves; Kebaili, Kamila; Cuzzubbo, Daniela; Cannas, Giovanna; Martin, Cyril; Connes, Philippe

    2016-01-01

    Red blood cell (RBC) deformability is severely decreased in patients with sickle cell anemia (SCA), and this decrease plays a role in the pathophysiology of the disease. However, investigation of RBC deformability in SCA patients demands careful methodological considerations. We assessed RBC deformability by ektacytometry (LORRCA MaxSis, Mechatronics, The Netherlands) in 6 healthy individuals and 49 SCA patients, and tested the effects of different heights of the RBC diffraction pattern, obtained by altering the camera gain of the LORRCA, on the resulting RBC deformability measurements, expressed as the elongation index (EI). The results indicate that the diffraction pattern of RBCs from control subjects adopts an elliptical shape under shear stress, whereas the pattern of RBCs from individuals with SCA adopts a diamond shape arising from the superposition of elliptical and circular patterns, the latter representing rigid RBCs. While the EI measures did not change with the variations of the RBC diffraction pattern height in the control subjects, we observed a decrease of EI with increasing RBC diffraction pattern height in the SCA group. The differences in SCA EI values measured at 5 Pa between the different diffraction pattern heights correlated with the percentage of hemoglobin S and the percentage of sickled RBCs observed by microscopy. Our study confirms that the camera gain or aperture of the ektacytometer should be used to standardize the size of the RBC diffraction pattern height when measuring RBC deformability in sickle cell patients, and underscores the potential clinical utility of this technique.
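
    For context, ektacytometers commonly report the elongation index from the long (A) and short (B) axes of the laser diffraction pattern as EI = (A - B) / (A + B); this definition is standard for LORRCA-type instruments, though the exact expression used in this study is taken here as an assumption. A minimal sketch:

      # Elongation index from the diffraction-pattern axes; values illustrative.
      def elongation_index(a_len, b_len):
          return (a_len - b_len) / (a_len + b_len)

      print(round(elongation_index(1.9, 1.0), 3))  # deformable RBCs: elliptical pattern
      print(round(elongation_index(1.1, 1.0), 3))  # rigid RBCs: near-circular pattern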

  17. Opening up openness: a theoretical sort following critical incidents methodology and a meta-analytic investigation of the trait family measures.

    PubMed

    Connelly, Brian S; Ones, Deniz S; Davies, Stacy E; Birkland, Adib

    2014-01-01

    Existing taxonomies of Openness's facet structure have produced widely divergent results, and there is limited comprehensive empirical evidence about how Openness-related scales on existing personality inventories align within the 5-factor framework. In Study 1, we used a critical incidents sorting methodology to identify 11 categories of Openness measures; in Study 2, we meta-analyzed the relationships of these categories with global markers of the Big Five traits (utilizing data from 106 samples with a total sample size of N = 35,886). Our results identified 4 true facets of Openness: aestheticism, openness to sensations, nontraditionalism, and introspection. Measures of these facets were unadulterated by variance from other Big Five traits. Many traits frequently conceptualized as facets of Openness (e.g., innovation/creativity, variety-seeking, and tolerance) emerged as trait compounds that, although related to Openness, are also dependent on other Big Five traits. We discuss how Openness should be conceptualized, measured, and studied in light of the empirically based, refined taxonomy emerging from this research.

  18. An International Marine-Atmospheric 222Rn Measurement Intercomparison in Bermuda Part I: NIST Calibration and Methodology for Standardized Sample Additions

    PubMed Central

    Collé, R.; Unterweger, M. P.; Hodge, P. A.; Hutchinson, J. M. R.

    1996-01-01

    As part of an international 222Rn measurement intercomparison conducted at Bermuda in October 1991, NIST provided standardized sample additions of known, but undisclosed (“blind”), 222Rn concentrations that could be related to U.S. national standards. The standardized sample additions were obtained with a calibrated 226Ra source and a specially designed manifold used to obtain well-known dilution factors from simultaneous flow-rate measurements. The additions were introduced over sampling periods of several hours (typically 4 h) into a common streamline on a sampling tower used by the participating laboratories for their measurements. The standardized 222Rn activity concentrations for the intercomparison ranged from approximately 2.5 Bq · m−3 to 35 Bq · m−3 (the lower end of this range approached concentration levels of ambient Bermudian air) and had overall uncertainties, approximating a 3 standard deviation uncertainty interval, of about 6 % to 13 %. This paper describes the calibration and methodology for the standardized sample additions. PMID:27805090
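
    The arithmetic behind such standard additions is simple: a calibrated source emanates 222Rn at a known activity rate, and dividing by the total sampled flow gives the spiked concentration. The sketch below is a hedged illustration with made-up numbers, not NIST's calibration procedure.

      # Spiked concentration from emanation rate and flow dilution.
      def spiked_concentration(emanation_rate_bq_per_s, total_flow_m3_per_s):
          return emanation_rate_bq_per_s / total_flow_m3_per_s   # Bq/m^3

      # e.g. 0.05 Bq/s diluted into a 2 L/s (0.002 m^3/s) streamline
      print(spiked_concentration(0.05, 0.002))   # 25.0 Bq/m^3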

  19. Conceptual and methodological issues in research on mindfulness and meditation.

    PubMed

    Davidson, Richard J; Kaszniak, Alfred W

    2015-10-01

    Both basic science and clinical research on mindfulness, meditation, and related constructs have dramatically increased in recent years. However, interpretation of these research results has been challenging. The present article addresses unique conceptual and methodological problems posed by research in this area. Included among the key topics are the role of first-person experience and how it can best be studied, the challenges posed by intervention research designs in which true double-blinding is not possible, the nature of control and comparison conditions for research that includes mindfulness or other meditation-based interventions, issues in the adequate description of mindfulness and related trainings and interventions, the question of how mindfulness can be measured, questions regarding what can and cannot be inferred from self-report measures, and considerations regarding the structure of study design and data analyses. Most of these topics are germane to both basic and clinical research studies and have important bearing on the future scientific understanding of mindfulness and meditation. (c) 2015 APA, all rights reserved.

  20. Event-related potential components as measures of aversive conditioning in humans.

    PubMed

    Bacigalupo, Felix; Luck, Steven J

    2018-04-01

    For more than 60 years, the gold standard for assessing aversive conditioning in humans has been the skin conductance response (SCR), which arises from the activation of the peripheral nervous system. Although the SCR has been proven useful, it has some properties that impact the kinds of questions it can be used to answer. In particular, the SCR is slow, reaching a peak 4-5 s after stimulus onset, and it decreases in amplitude after a few trials (habituation). The present study asked whether the late positive potential (LPP) of the ERP waveform could be a useful complementary method for assessing aversive conditioning in humans. The SCR and LPP were measured in an aversive conditioning paradigm consisting of three blocks in which one color was paired with a loud noise (CS+) and other colors were not paired with the noise (CS-). Participants also reported the perceived likelihood of being exposed to the noise for each color. Both SCR and LPP were significantly larger on CS+ trials than on CS- trials. However, SCR decreased steeply after the first conditioning block, whereas LPP and self-reports were stable over blocks. These results indicate that the LPP can be used to assess aversive conditioning and has several useful properties: (a) it is a direct response of the central nervous system, (b) it is fast, with an onset latency of 300 ms, (c) it does not habituate over time. © 2017 Society for Psychophysiological Research.

  1. Discriminant content validity: a quantitative methodology for assessing content of theory-based measures, with illustrative applications.

    PubMed

    Johnston, Marie; Dixon, Diane; Hart, Jo; Glidewell, Liz; Schröder, Carin; Pollard, Beth

    2014-05-01

    In studies involving theoretical constructs, it is important that measures have good content validity and that there is not contamination of measures by content from other constructs. While reliability and construct validity are routinely reported, to date, there has not been a satisfactory, transparent, and systematic method of assessing and reporting content validity. In this paper, we describe a methodology of discriminant content validity (DCV) and illustrate its application in three studies. Discriminant content validity involves six steps: construct definition, item selection, judge identification, judgement format, single-sample test of content validity, and assessment of discriminant items. In three studies, these steps were applied to a measure of illness perceptions (IPQ-R) and control cognitions. The IPQ-R performed well with most items being purely related to their target construct, although timeline and consequences had small problems. By contrast, the study of control cognitions identified problems in measuring constructs independently. In the final study, direct estimation response formats for theory of planned behaviour constructs were found to have as good DCV as Likert format. The DCV method allowed quantitative assessment of each item and can therefore inform the content validity of the measures assessed. The methods can be applied to assess content validity before or after collecting data to select the appropriate items to measure theoretical constructs. Further, the data reported for each item in Appendix S1 can be used in item or measure selection. Statement of contribution What is already known on this subject? There are agreed methods of assessing and reporting construct validity of measures of theoretical constructs, but not their content validity. Content validity is rarely reported in a systematic and transparent manner. What does this study add? The paper proposes discriminant content validity (DCV), a systematic and transparent method

  2. Integrated cost-effectiveness analysis of agri-environmental measures for water quality.

    PubMed

    Balana, Bedru B; Jackson-Blake, Leah; Martin-Ortega, Julia; Dunn, Sarah

    2015-09-15

    This paper presents an application of an integrated methodological approach for identifying cost-effective combinations of agri-environmental measures to achieve water quality targets. The methodological approach involves linking hydro-chemical modelling with the economic costs of mitigation measures. The utility of the approach was explored for the River Dee catchment in North East Scotland, examining the cost-effectiveness of mitigation measures for nitrogen (N) and phosphorus (P) pollutants. In-stream nitrate concentration was modelled using the STREAM-N model and phosphorus using the INCA-P model. Both models were first run for baseline conditions, and then their effectiveness for changes in land management was simulated. Costs were based on farm income foregone and capital and operational expenditures. The cost and effect data were integrated using 'Risk Solver Platform' optimization in Excel to produce the most cost-effective combination of measures by which target nutrient reductions could be attained at a minimum economic cost. The analysis identified different combinations of measures as most cost-effective for the two pollutants. An important aspect of this paper is the integration of model-based effectiveness estimates with the economic costs of measures for cost-effectiveness analysis of land and water management options. The methodological approach developed is not limited to the two pollutants and the selected agri-environmental measures considered in the paper; the approach can be adapted to the cost-effectiveness analysis of any catchment-scale environmental management options. Copyright © 2015 Elsevier Ltd. All rights reserved.
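
    The optimization step can be illustrated as a small linear program: choose adoption levels of candidate measures that meet the N and P reduction targets at minimum cost. The measures, costs, effectiveness values and targets below are hypothetical, and the study's 'Risk Solver Platform' formulation may differ.

      # Least-cost combination of measures meeting two nutrient targets.
      import numpy as np
      from scipy.optimize import linprog

      cost  = np.array([120.0, 80.0, 200.0])   # annualized cost per measure
      n_red = np.array([10.0, 4.0, 12.0])      # N reduction per measure, t/yr
      p_red = np.array([0.2, 0.5, 0.6])        # P reduction per measure, t/yr
      targets = np.array([14.0, 0.8])          # required N and P reductions

      # "reductions >= targets" encoded as "-reductions <= -targets"
      res = linprog(cost, A_ub=-np.vstack([n_red, p_red]), b_ub=-targets,
                    bounds=[(0, 1)] * 3)
      print(res.x.round(2), "minimum total cost:", round(res.fun, 1))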

  3. A Generalizable Methodology for Quantifying User Satisfaction

    NASA Astrophysics Data System (ADS)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction, and discuss how they can be exploited to improve user experience and optimize resource allocation.
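
    The survival-analysis core of such a methodology is an estimate of the probability that a session survives beyond time t, with sessions still open at the end of the trace treated as right-censored. Below is a plain Kaplan-Meier sketch; the paper's exact estimator and covariate models are not specified here.

      # Kaplan-Meier survival curve over session lengths; 0 marks censoring.
      import numpy as np

      def kaplan_meier(times, observed):
          order = np.argsort(times)
          times = np.asarray(times)[order]
          observed = np.asarray(observed)[order]
          at_risk, surv, curve = len(times), 1.0, []
          for t, ended in zip(times, observed):
              if ended:                      # observed session end at time t
                  surv *= 1.0 - 1.0 / at_risk
              at_risk -= 1                   # ended or censored: one fewer at risk
              curve.append((t, surv))
          return curve

      # session lengths in minutes; one 12-min and the 47-min session censored
      print(kaplan_meier([5, 12, 12, 30, 47], [1, 1, 0, 1, 0]))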

  4. Diagnosis of Insulation Condition of MV Switchgears by Application of Different Partial Discharge Measuring Methods and Sensors

    PubMed Central

    2018-01-01

    Partial discharge (PD) measurement provides valuable information for the condition assessment of the insulation status of high-voltage (HV) electrical installations. During the last three decades, several PD sensors and measuring techniques have been developed to perform accurate diagnostics when PD measurements are carried out on-site and on-line. For utilities, the most attractive characteristics of on-line measurements are that once the sensors are installed in the grid, the electrical service is uninterrupted, and that electrical systems are tested in real operating conditions. In medium-voltage (MV) and HV installations, one of the critical points where an insulation defect can occur is inside metal-clad switchgears (including the cable terminals connected to them). Thus, this kind of equipment is increasingly being monitored so that proper maintenance can be carried out based on its condition. This paper presents a study concerning the application of different electromagnetic measuring techniques (compliant with the IEC 62478 and IEC 60270 standards), together with the use of suitable sensors, which enable the evaluation of the insulation condition mainly in MV switchgears. The main scope is to give a general overview of the appropriate types of electromagnetic measuring methods and sensors to be applied, while considering the level of detail and accuracy in the diagnosis and the particular fail-safe requirements of the electrical installations where the switchgears are located. PMID:29495601

  5. Accurate measurement of junctional conductance between electrically coupled cells with dual whole-cell voltage-clamp under conditions of high series resistance.

    PubMed

    Hartveit, Espen; Veruki, Margaret Lin

    2010-03-15

    Accurate measurement of the junctional conductance (G(j)) between electrically coupled cells can provide important information about the functional properties of coupling. With the development of tight-seal, whole-cell recording, it became possible to use dual, single-electrode voltage-clamp recording from pairs of small cells to measure G(j). Experiments that require reduced perturbation of the intracellular environment can be performed with high-resistance pipettes or the perforated-patch technique, but an accompanying increase in series resistance (R(s)) compromises voltage-clamp control and reduces the accuracy of G(j) measurements. Here, we present a detailed analysis of methodologies available for accurate determination of steady-state G(j) and related parameters under conditions of high R(s), using continuous or discontinuous single-electrode voltage-clamp (CSEVC or DSEVC) amplifiers to quantify the parameters of different equivalent electrical circuit model cells. Both types of amplifiers can provide accurate measurements of G(j), with errors less than 5% for a wide range of R(s) and G(j) values. However, CSEVC amplifiers need to be combined with R(s)-compensation or mathematical correction for the effects of nonzero R(s) and finite membrane resistance (R(m)). R(s)-compensation is difficult for higher values of R(s) and leads to instability that can damage the recorded cells. Mathematical correction for R(s) and R(m) yields highly accurate results, but depends on accurate estimates of R(s) throughout an experiment. DSEVC amplifiers display very accurate measurements over a larger range of R(s) values than CSEVC amplifiers and have the advantage that knowledge of R(s) is unnecessary, suggesting that they are preferable for long-duration experiments and/or recordings with high R(s). Copyright (c) 2009 Elsevier B.V. All rights reserved.
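
    Under ideal clamp conditions (negligible series resistance), the junctional conductance follows from a voltage step in one cell and the evoked current change in the other; the R(s)/R(m) corrections discussed above require the full two-cell circuit equations and are omitted from this hedged sketch.

      # Ideal dual-voltage-clamp estimator: step cell 1, read current in cell 2.
      def junctional_conductance(delta_v1, delta_i2):
          # Gj = -dI2 / dV1 when both membrane potentials are perfectly clamped
          return -delta_i2 / delta_v1

      # 10 mV step in cell 1 evoking a -5 pA change in cell 2 -> 0.5 nS
      print(junctional_conductance(10e-3, -5e-12))   # 5e-10 S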

  6. Measuring sporadic gastrointestinal illness associated with drinking water - an overview of methodologies.

    PubMed

    Bylund, John; Toljander, Jonas; Lysén, Maria; Rasti, Niloofar; Engqvist, Jannes; Simonsson, Magnus

    2017-06-01

    There is an increasing awareness that drinking water contributes to sporadic gastrointestinal illness (GI) in high-income countries of the northern hemisphere. A literature search was conducted in order to review: (1) methods used for investigating the effects of public drinking water on GI; (2) evidence of a possible dose-response relationship between sporadic GI and drinking water consumption; and (3) associations between sporadic GI and factors affecting drinking water quality. Seventy-four articles were selected, and key findings and information gaps were identified. In-home intervention studies have only been conducted in areas using surface water sources, and intervention studies in communities supplied by ground water are therefore needed. Community-wide intervention studies may constitute a cost-effective alternative to in-home intervention studies. Proxy data that correlate with GI in the community can be used for detecting changes in the incidence of GI; they cannot, however, be used for measuring the prevalence of illness. Local conditions affecting water safety may vary greatly, making direct comparisons between studies difficult unless sufficient knowledge about these conditions is acquired. Drinking water in high-income countries contributes to endemic levels of GI, and there are public health benefits to further improvements of drinking water safety.

  7. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures, which in turn necessitates accurate knowledge of the thermal properties, boundary conditions and thermal interface conditions associated with the structural materials. The goal of this proposed multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify them to analyze complex structures; this can be thought of as a building-block approach. This strategy was intended to promote maximum usability of the resulting estimation procedure by NASA-LaRC researchers through the design of in-house experimentation procedures and through the use of existing general-purpose finite element software.

  8. Measurements of optical underwater turbulence under controlled conditions

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.; Gladysz, S.; Almeida de Sá Barros, R.; Matt, S.; Nootz, G. A.; Josset, D. B.; Hou, W.

    2016-05-01

    Laser beam propagation underwater is becoming an important research topic because of the high demand for its potential applications. Namely, the ability to image underwater at long distances is highly desired for scientific and military purposes, including submarine awareness, diver visibility, and mine detection. Optical communication in the ocean can provide covert data transmission with much higher rates than those available with acoustic techniques, and it is now desired for certain military and scientific applications that involve sending large quantities of data. Unfortunately, the underwater environment presents serious challenges for the propagation of laser beams. Even in clean ocean water, the extinction due to absorption and scattering theoretically limits the useful range to a few attenuation lengths. However, extending the laser light propagation range to the theoretical limit leads to significant beam distortions due to optical underwater turbulence. Experiments show that the magnitude of the distortions caused by water temperature and salinity fluctuations can significantly exceed the magnitude of the beam distortions due to atmospheric turbulence, even for relatively short propagation distances. We present direct measurements of optical underwater turbulence under the controlled conditions of a laboratory water tank using two separate techniques involving a wavefront sensor and an LED array. These independent approaches will enable the development of an underwater turbulence power spectrum model based directly on spatial-domain measurements and will lead to accurate predictions of underwater beam propagation.
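
    As a side note on the attenuation-length limit mentioned above, the short sketch below works through generic Beer-Lambert extinction; the extinction coefficient is a hypothetical illustration, not a value from the study.

        import math

        # Beer-Lambert extinction: I(z) = I0 * exp(-c * z), where the beam
        # attenuation coefficient c [1/m] combines absorption and scattering.
        # One attenuation length corresponds to z = 1/c.
        c = 0.05                    # hypothetical coefficient for clear water, 1/m
        one_length_m = 1.0 / c

        for n in range(1, 6):
            remaining = math.exp(-n)  # fraction left after n attenuation lengths
            print(f"{n} attenuation lengths ({n * one_length_m:.0f} m): "
                  f"{remaining:.1%} of the light remains")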

  9. A review of the current state-of-the-art methodology for handling bias and uncertainty in performing criticality safety evaluations. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Disney, R.K.

    1994-10-01

    The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.

  10. The influence of exogenous conditions on mobile measured gamma-ray spectrometry

    NASA Astrophysics Data System (ADS)

    Dierke, C.; Werban, U.; Dietrich, P.

    2012-12-01

    In the past, gamma-ray measurements have been used for geological surveys and exploration using airborne and borehole logging systems. For these applications, the relationships between the measured physical parameter - the concentration of the natural gamma emitters 40K, 238U and 232Th - and geological origin or sedimentary development are well described. Based on these applications and this knowledge, in combination with adapted sensor systems, gamma-ray measurements are used to derive soil parameters to create detailed soil maps, e.g., in digital soil mapping (DSM) and the monitoring of soils. Therefore, not only qualitative but also quantitative comparability is necessary. Grain size distribution, type of clay minerals and organic matter content are soil parameters which directly influence the gamma-ray emitter concentration. Additionally, the measured concentration is influenced by exogenous processes like soil moisture variation due to rain events, foggy weather conditions, or erosion and deposition of material. A time series of gamma-ray measurements was used to observe changes in gamma-ray concentration on a floodplain area in Central Germany. The study area is characterised by high variations in grain size distribution and the occurrence of flooding events. For the survey, we used a 4 l NaI(Tl) detector with GPS connection mounted on a sledge, which was towed across the field sites by a four-wheel vehicle. The comparison of data from different time steps shows similar structures, with minor variation between the data ranges and the shape of structures. However, the data measured at different soil moisture contents differ in absolute value. An average increase in soil moisture of 36% leads to a decrease of Th (by 20%), K (by 29%), and U (by 41%). These differences can be explained by the higher attenuation of radiation at higher soil moisture content. The different changes in nuclide concentration will also lead to varying ratios. We will present our experiences concerning

  11. Financial Measures Project. New Developments in Measuring Financial Conditions of Colleges and Universities: Papers Presented at a Working Conference.

    ERIC Educational Resources Information Center

    New York Community Trust, NY.

    Papers presented at a working conference on new developments in measuring financial conditions of colleges and universities included the following: "Using Financial Indicators for Public Policy Purposes," by George W. Bonham; "Conceptual Advances in Specifying Financial Indicators: Cash Flows in the Short and Long Run," by Hans…

  12. Methodological Issues in Mobile Computer-Supported Collaborative Learning (mCSCL): What Methods, What to Measure and When to Measure?

    ERIC Educational Resources Information Center

    Song, Yanjie

    2014-01-01

    This study aims to investigate (1) methods utilized in mobile computer-supported collaborative learning (mCSCL) research which focuses on studying, learning and collaboration mediated by mobile devices; (2) whether these methods have examined mCSCL effectively; (3) when the methods are administered; and (4) what methodological issues exist in…

  13. Light field and water clarity simulation of natural environments in laboratory conditions

    NASA Astrophysics Data System (ADS)

    Pe'eri, Shachak; Shwaery, Glenn

    2012-06-01

    Simulation of natural oceanic conditions in a laboratory setting is a challenging task, especially when that environment can be miles away. We present an attempt to replicate the solar radiation expected at different latitudes, with varying water clarity conditions up to 30 m in depth, using a 2.5 m deep engineering tank at the University of New Hampshire. The goals of the study were: 1) to configure an underwater light source that produced an irradiance spectrum similar to natural daylight with the sun at zenith and at 60° under clear atmospheric conditions, and 2) to monitor water clarity as a function of depth. Irradiance was measured using a spectroradiometer with a cosine receiver to analyze the output spectrum of the submersed lamps as a function of distance. In addition, an underwater reflection method was developed to measure the diffuse attenuation coefficient in real time. Two water clarity types were characterized: clear waters representing deep, open-ocean conditions, and murky waters representing littoral environments. Results showed good correlation between the irradiance measured at 400 nm to 600 nm and the natural daylight spectrum at 3 m from the light source, which can be considered the reference for water-surface conditions. Using these methodologies in a controlled laboratory setting, we are able to replicate illumination and water conditions to study the physical, chemical and biological processes on natural and man-made objects and/or systems in simulated, varied geographic locations and environments.

  14. Methodological assessment of skin and limb blood flows in the human forearm during thermal and baroreceptor provocations

    PubMed Central

    Brothers, R. Matthew; Wingo, Jonathan E.; Hubing, Kimberly A.

    2010-01-01

    Skin blood flow responses in the human forearm, assessed by three commonly used technologies—single-point laser-Doppler flowmetry, integrated laser-Doppler flowmetry, and laser-Doppler imaging—were compared in eight subjects during normothermic baseline, acute skin-surface cooling, and whole body heat stress (Δ internal temperature = 1.0 ± 0.2°C; P < 0.001). In addition, while normothermic and heat stressed, subjects were exposed to 30-mmHg lower-body negative pressure (LBNP). Skin blood flow was normalized to the maximum value obtained at each site during local heating to 42°C for at least 30 min. Furthermore, comparisons of forearm blood flow (FBF) measures obtained using venous occlusion plethysmography and Doppler ultrasound were made during the aforementioned perturbations. Relative to normothermic baseline, skin blood flow decreased during normothermia + LBNP (P < 0.05) and skin-surface cooling (P < 0.01) and increased during whole body heating (P < 0.001). Subsequent LBNP during whole body heating significantly decreased skin blood flow relative to control heat stress (P < 0.05). Importantly, for each of the aforementioned conditions, skin blood flow was similar between the three measurement devices (main effect of device: P > 0.05 for all conditions). Similarly, no differences were identified across all perturbations between FBF measures using plethysmography and Doppler ultrasound (P > 0.05 for all perturbations). These data indicate that, when normalized to maximum, assessments of skin blood flow in response to vasoconstrictor and dilator perturbations are similar regardless of methodology. Likewise, FBF responses to these perturbations are similar between two commonly used methodologies of limb blood flow assessment. PMID:20634360

  15. Methodological assessment of skin and limb blood flows in the human forearm during thermal and baroreceptor provocations.

    PubMed

    Brothers, R Matthew; Wingo, Jonathan E; Hubing, Kimberly A; Crandall, Craig G

    2010-09-01

    Skin blood flow responses in the human forearm, assessed by three commonly used technologies (single-point laser-Doppler flowmetry, integrated laser-Doppler flowmetry, and laser-Doppler imaging), were compared in eight subjects during normothermic baseline, acute skin-surface cooling, and whole body heat stress (Δ internal temperature = 1.0 ± 0.2°C; P < 0.001). In addition, while normothermic and heat stressed, subjects were exposed to 30-mmHg lower-body negative pressure (LBNP). Skin blood flow was normalized to the maximum value obtained at each site during local heating to 42°C for at least 30 min. Furthermore, comparisons of forearm blood flow (FBF) measures obtained using venous occlusion plethysmography and Doppler ultrasound were made during the aforementioned perturbations. Relative to normothermic baseline, skin blood flow decreased during normothermia + LBNP (P < 0.05) and skin-surface cooling (P < 0.01) and increased during whole body heating (P < 0.001). Subsequent LBNP during whole body heating significantly decreased skin blood flow relative to control heat stress (P < 0.05). Importantly, for each of the aforementioned conditions, skin blood flow was similar between the three measurement devices (main effect of device: P > 0.05 for all conditions). Similarly, no differences were identified across all perturbations between FBF measures using plethysmography and Doppler ultrasound (P > 0.05 for all perturbations). These data indicate that, when normalized to maximum, assessments of skin blood flow in response to vasoconstrictor and dilator perturbations are similar regardless of methodology. Likewise, FBF responses to these perturbations are similar between two commonly used methodologies of limb blood flow assessment.

  16. Just Research in Contentious Times: Widening the Methodological Imagination

    ERIC Educational Resources Information Center

    Fine, Michelle

    2017-01-01

    In this intensely powerful and personal new text, Michelle Fine widens the methodological imagination for students, educators, scholars, and researchers interested in crafting research with communities. Fine shares her struggles over the course of 30 years to translate research into policy and practice that can enhance the human condition and…

  17. A Clustering Methodology of Web Log Data for Learning Management Systems

    ERIC Educational Resources Information Center

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  18. Ramifications of increased training in quantitative methodology.

    PubMed

    Zimiles, Herbert

    2009-01-01

    Comments on the article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America" by Aiken, West, and Millsap. The current author asks three questions that are provoked by the comprehensive identification of gaps and deficiencies in the training of quantitative methodology that led Aiken, West, and Millsap to call for expanded graduate instruction resources and programs. This comment calls for greater attention to how advances and expansion in the training of quantitative analysis are influencing who chooses to study psychology and how and what will be studied. PsycINFO Database Record 2009 APA.

  19. Use of Taguchi methodology to enhance the yield of caffeine removal with growing cultures of Pseudomonas pseudoalcaligenes.

    PubMed

    Ashengroph, Morahem; Ababaf, Sajad

    2014-12-01

    Microbial caffeine removal is a green solution for the treatment of caffeinated products and agro-industrial effluents. We directed this investigation at optimizing a bio-decaffeination process with growing cultures of Pseudomonas pseudoalcaligenes through Taguchi methodology, a structured statistical approach that can lower variation in a process through design of experiments (DOE). Five parameters, i.e. initial fructose, tryptone, Zn(2+) ion and caffeine concentrations and also incubation time, were selected, and an L16 orthogonal array was applied to design experiments with four 4-level factors and one 3-level factor (4^4 × 3^1). Data analysis was performed using the statistical analysis of variance (ANOVA) method. Furthermore, the optimal conditions were determined by combining the optimal levels of the significant factors and were verified by a confirming experiment. Measurement of the residual caffeine concentration in the reaction mixture was performed using high-performance liquid chromatography (HPLC). Use of Taguchi methodology for the optimization of design parameters resulted in about 86.14% reduction of caffeine in 48 h of incubation when 5 g/l fructose, 3 mM Zn(2+) ion and 4.5 g/l of caffeine were present in the designed media. Under the optimized conditions, the yield of degradation of caffeine (4.5 g/l) by the native strain Pseudomonas pseudoalcaligenes TPS8 increased from 15.8% to 86.14%, which is 5.4-fold higher than the normal yield. According to the experimental results, Taguchi methodology provides a powerful approach for identifying the favorable parameters for caffeine removal using strain TPS8, which suggests that the approach also has potential application with similar strains to improve the yield of caffeine removal from caffeine-containing solutions.
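
    As an illustration of the kind of Taguchi-style main-effects analysis described (not the study's actual array or data), the sketch below computes larger-is-better signal-to-noise ratios and factor-level means for a small invented fragment of a design; the factor names and yields are hypothetical.

        import numpy as np

        # Hypothetical 4-run fragment of an orthogonal design: each key maps a
        # factor to its level in every run; y holds the measured caffeine-removal
        # yield (%) for two replicates per run. Invented for illustration.
        runs = {
            "fructose_g_per_l": [1, 1, 5, 5],
            "Zn_mM":            [0, 3, 0, 3],
        }
        y = np.array([[15.8, 22.4], [30.1, 28.9], [55.0, 60.2], [84.7, 86.1]])

        # Larger-is-better signal-to-noise ratio: S/N = -10*log10(mean(1/y^2))
        sn = -10 * np.log10(np.mean(1.0 / y**2, axis=1))

        # Main effect of a factor = mean S/N at each of its levels.
        for factor, levels in runs.items():
            for lvl in sorted(set(levels)):
                mask = np.array(levels) == lvl
                print(f"{factor} level {lvl}: mean S/N = {sn[mask].mean():.2f} dB")

    The level with the highest mean S/N for each significant factor would be combined into the predicted optimum and then checked with a confirming experiment, which mirrors the procedure the abstract outlines.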

  20. MODIFIED PATH METHODOLOGY FOR OBTAINING INTERVAL-SCALED POSTURAL ASSESSMENTS OF FARMWORKERS.

    PubMed

    Garrison, Emma B; Dropkin, Jonathan; Russell, Rebecca; Jenkins, Paul

    2018-01-29

    Agricultural workers perform tasks that frequently require awkward and extreme postures that are associated with musculoskeletal disorders (MSDs). The PATH (Posture, Activity, Tools, Handling) system currently provides a sound methodology for quantifying workers' exposure to these awkward postures on an ordinal scale of measurement, which places restrictions on the choice of analytic methods. This study reports a modification of the PATH methodology that instead captures these postures as degrees of flexion, an interval-scaled measurement. Rather than making live observations in the field, as in PATH, the postural assessments were performed on photographs using ImageJ photo analysis software. Capturing the postures in photographs permitted more careful measurement of the degrees of flexion. The current PATH methodology requires that the observer in the field be trained in the use of PATH, whereas the single photographer used in this modification requires only sufficient training to maintain the proper camera angle. Ultimately, these interval-scale measurements could be combined with other quantitative measures, such as those produced by electromyograms (EMGs), to provide more sophisticated estimates of future risk for MSDs. Further, these data can provide a baseline from which the effects of interventions designed to reduce hazardous postures can be calculated with greater precision. Copyright © by the American Society of Agricultural Engineers.
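
    To make the interval-scale idea concrete, here is a minimal sketch of computing a flexion angle in degrees from landmark coordinates digitized on a photograph. The landmark names and pixel coordinates are hypothetical, and this is a generic stand-in for the photo-analysis step, not the study's software.

        import math

        def flexion_angle(a, b, c):
            """Angle (degrees) at vertex b formed by points a-b-c, e.g. three
            body landmarks digitized from a photograph."""
            v1 = (a[0] - b[0], a[1] - b[1])
            v2 = (c[0] - b[0], c[1] - b[1])
            dot = v1[0]*v2[0] + v1[1]*v2[1]
            return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

        # Hypothetical pixel coordinates of shoulder, hip and knee in one photo:
        shoulder, hip, knee = (102, 40), (120, 160), (60, 230)
        trunk_flexion = 180.0 - flexion_angle(shoulder, hip, knee)
        print(f"trunk flexion = {trunk_flexion:.1f} degrees")  # interval-scaled, not ordinal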

  1. Assessing the impact of healthcare research: A systematic review of methodological frameworks

    PubMed Central

    Keeley, Thomas J.; Calvert, Melanie J.

    2017-01-01

    Background Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Methods and findings Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) ‘primary research-related impact’, (2) ‘influence on policy making’, (3) ‘health and health systems impact’, (4) ‘health-related and societal impact’, and (5) ‘broader economic impact’. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. Conclusions The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the

  2. Assessing the impact of healthcare research: A systematic review of methodological frameworks.

    PubMed

    Cruz Rivera, Samantha; Kyte, Derek G; Aiyegbusi, Olalekan Lee; Keeley, Thomas J; Calvert, Melanie J

    2017-08-01

    Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) 'primary research-related impact', (2) 'influence on policy making', (3) 'health and health systems impact', (4) 'health-related and societal impact', and (5) 'broader economic impact'. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the measurement of research impact and researchers may use to inform study

  3. [Studies on optimizing preparation technics of wumeitougu oral liquid by response surface methodology].

    PubMed

    Yu, Xiao-cui; Liu, Gao-feng; Wang, Xin

    2011-02-01

    To optimize the preparation process of wumeitougu oral liquid (WTOL) by response surface methodology. Based on single-factor tests, the number of extractions, the alcohol precipitation concentration and the pH value were selected as the three factors for a Box-Behnken central composite design. Response surface methodology was used to optimize the parameters of the preparation. Under the conditions of an extraction time of 1.5 h, 2.772 extractions, a relative density of 1.12, an alcohol precipitation concentration of 68.704%, and a pH value of 5.0, the theoretical maximum content of asperosaponin VI was 549.908 mg/L. Considering the practical situation, the conditions were amended to three extractions, an alcohol precipitation concentration of 69%, and a pH value of 5.0; the measured content of asperosaponin VI was then 548.63 mg/L, which was close to the theoretical value. The preparation process of WTOL optimized by response surface methodology is reasonable and feasible.
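
    For readers unfamiliar with the response-surface step, the following sketch fits a generic quadratic model to coded design points and solves for the stationary point; the design, factors and response values are invented for illustration and are not the study's data.

        import numpy as np

        # Hypothetical coded design (x1, x2 in [-1, 1]) and measured response y.
        X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                      [0, 0], [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], float)
        y = np.array([420., 510., 430., 505., 548., 551., 500., 530., 515., 512.])

        # Full quadratic model: y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
        x1, x2 = X[:, 0], X[:, 1]
        A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        b, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Stationary point of the fitted surface: solve grad(y) = 0.
        H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])  # Hessian of the quadratic
        g = np.array([b[1], b[2]])                          # linear coefficients
        x_opt = np.linalg.solve(H, -g)
        y_star = (b[0] + b[1]*x_opt[0] + b[2]*x_opt[1] + b[3]*x_opt[0]**2
                  + b[4]*x_opt[1]**2 + b[5]*x_opt[0]*x_opt[1])
        print("stationary point (coded units):", np.round(x_opt, 3))
        print(f"predicted response there: {y_star:.1f}")

    In practice the coded optimum is converted back to natural units and, as in the abstract, rounded to operationally practical settings before a confirmation run.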

  4. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and to quantify the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework, and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  5. Methodological challenges in international performance measurement using patient-level administrative data.

    PubMed

    Kiivet, Raul; Sund, Reijo; Linna, Miika; Silverman, Barbara; Pisarev, Heti; Friedman, Nurit

    2013-09-01

    We conducted this case study in order to test how health system performance can be compared using existing national administrative health databases containing individual-level data. In this comparative analysis we used national data sets from three countries - Estonia, Israel and Finland - to follow the medical history, treatment outcomes and resource use of patients with a chronic disease (diabetes) for 8 years after medical treatment was initiated. This study showed that several clinically important aspects of quality of care, as well as health policy issues of cost-effectiveness and efficiency of health systems, can be assessed using national administrative health data systems, provided that they collect person-level health service data. We developed a structured study protocol and detailed data specifications to generate standardized data sets in each country for long-term follow-up of an incident cohort of diabetic persons, as well as shared analysis programs to produce performance measures from the standardized data sets. This stepwise, decentralized approach and the use of anonymous person-level data allowed us to mitigate legal, ownership, confidentiality and privacy concerns and to create internationally comparable data with a level of detail seldom seen before. For example, our preliminary performance comparisons indicate that higher mortality among relatively young diabetes patients in Estonia may be related to considerably higher rates of cardiovascular complications and lower use of statins. Modern administrative person-level health service databases contain data sufficiently rich in detail to assess the performance of health systems in the management of chronic diseases. This paper presents and discusses the methodological challenges and the way the problems were solved or avoided to enhance the representativeness and comparability of the results. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  6. Reliability of the Kinetic Measures under Different Heel Conditions during Normal Walking

    ERIC Educational Resources Information Center

    Liu, Yuanlong; Wang, Yong Tai

    2004-01-01

    The purpose of this study was to determine and compare the reliability of three-dimensional reaction forces and impulses in walking with 3 different heel-shoe conditions. These results suggest that changing the height of the heels affects mainly the reliability of the ground reaction force and impulse measures on the medial and lateral dimension and not…

  7. Methodological challenges in measuring vaccine effectiveness using population cohorts in low resource settings

    PubMed Central

    King, C.; Beard, J.; Crampin, A.C.; Costello, A.; Mwansambo, C.; Cunliffe, N.A.; Heyderman, R.S.; French, N.; Bar-Zeev, N.

    2015-01-01

    Post-licensure, real-world evaluation of vaccine implementation is important for establishing evidence of vaccine effectiveness (VE) and programme impact, including indirect effects. Large cohort studies offer an important epidemiological approach for evaluating VE, but have inherent methodological challenges. Since March 2012, we have conducted an open prospective cohort study in two sites in rural Malawi to evaluate the post-introduction effectiveness of 13-valent pneumococcal conjugate vaccine (PCV13) against all-cause post-neonatal infant mortality and of monovalent rotavirus vaccine (RV1) against diarrhoea-related post-neonatal infant mortality. Our study sites cover a population of 500,000, with a baseline post-neonatal infant mortality of 25 per 1000 live births. We conducted a methodological review of cohort studies for vaccine effectiveness in developing-country settings, applied to our study context. Based on the published literature, we outline key considerations when defining the denominator (study population), the exposure (vaccination status) and outcome ascertainment (mortality and cause of death) in such studies. We assess various definitions in these three domains in terms of their impact on power, effect size and potential biases and their direction, using our cohort study for illustration. Based on this iterative process, we discuss the pros and cons of our final per-protocol analysis plan. Since no single set of definitions or analytical approach accounts for all possible biases, we propose sensitivity analyses to interrogate our assumptions and methodological decisions. In the poorest regions of the world, routine vital birth and death surveillance is frequently unavailable and the burden of disease and death is greatest. We conclude that, provided the balance between definitions and their overall assumed impact on estimated VE is acknowledged, such large-scale real-world cohort studies can provide crucial information to policymakers by providing
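
    As a minimal numerical illustration of a per-protocol VE estimate (the numbers below are invented, not the Malawi study's), effectiveness can be computed as one minus the incidence rate ratio:

        # Minimal sketch of a vaccine-effectiveness estimate from person-time
        # data. All numbers are hypothetical illustrations.
        cases_vacc, pt_vacc = 30, 12000.0      # deaths, child-years (vaccinated)
        cases_unvacc, pt_unvacc = 60, 9000.0   # deaths, child-years (unvaccinated)

        rate_v = cases_vacc / pt_vacc
        rate_u = cases_unvacc / pt_unvacc
        irr = rate_v / rate_u
        ve = 1.0 - irr
        print(f"incidence rate ratio = {irr:.2f}, VE = {ve:.1%}")

    Real analyses additionally adjust for confounding and apply the sensitivity analyses the abstract proposes, since how the denominator, exposure and outcome are defined changes both the numerators and the person-time.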

  8. Measurements in Transitional Boundary Layers Under High Free-Stream Turbulence and Strong Acceleration Conditions

    NASA Technical Reports Server (NTRS)

    Volino, Ralph J.; Simon, Terrence W.

    1995-01-01

    Measurements from transitional, heated boundary layers along a concave-curved test wall are presented and discussed. A boundary layer subject to low free-stream turbulence intensity (FSTI), which contains stationary streamwise (Gortler) vortices, is documented. The low FSTI measurements are followed by measurements in boundary layers subject to high (initially 8%) free-stream turbulence intensity and moderate to strong streamwise acceleration. Conditions were chosen to simulate those present on the downstream half of the pressure side of a gas turbine airfoil. Mean flow characteristics as well as turbulence statistics, including the turbulent shear stress, turbulent heat flux, and turbulent Prandtl number, are documented. A technique called "octant analysis" is introduced and applied to several cases from the literature as well as to data from the present study. Spectral analysis was applied to describe the effects of turbulence scales of different sizes during transition. To the authors' knowledge, this is the first detailed documentation of boundary layer transition under such high free-stream turbulence conditions.

  9. The Continued Salience of Methodological Issues for Measuring Psychiatric Disorders in International Surveys

    ERIC Educational Resources Information Center

    Tausig, Mark; Subedi, Janardan; Broughton, Christopher; Pokimica, Jelena; Huang, Yinmei; Santangelo, Susan L.

    2011-01-01

    We investigated the extent to which methodological concerns explicitly addressed by the designers of the World Mental Health Surveys persist in the results that were obtained using the WMH-CIDI instrument. We compared rates of endorsement of mental illness symptoms in the United States (very high) and Nepal (very low) as they were affected by…

  10. Optimization of Reflux Conditions for Total Flavonoid and Total Phenolic Extraction and Enhanced Antioxidant Capacity in Pandan (Pandanus amaryllifolius Roxb.) Using Response Surface Methodology

    PubMed Central

    Ghasemzadeh, Ali; Jaafar, Hawa Z. E.

    2014-01-01

    Response surface methodology was applied to optimization of the conditions for reflux extraction of Pandan (Pandanus amaryllifolius Roxb.) in order to achieve a high content of total flavonoids (TF) and total phenolics (TP), and a high antioxidant capacity (AC), in the extracts. A central composite experimental design with three factors and three levels was employed to consider the effects of the operating parameters, including the methanol concentration (MC, 40%–80%), extraction temperature (ET, 40–70°C), and liquid-to-solid ratio (LS ratio, 20–40 mL/g), on the properties of the extracts. Response surface plots showed that increasing these operating parameters significantly increased the responses. The TF content and AC were maximized when the extraction conditions (MC, ET, and LS ratio) were 78.8%, 69.5°C, and 32.4 mL/g, respectively, whereas the TP content was optimal when these variables were 75.1%, 70°C, and 31.8 mL/g, respectively. Under these optimum conditions, the experimental TF content, TP content, and AC were 1.78 mg/g DW, 6.601 mg/g DW, and 87.38%, respectively. The optimized model was validated by a comparison of the predicted and experimental values. The experimental values were found to be in agreement with the predicted values, indicating the suitability of the model for optimizing the conditions for the reflux extraction of Pandan. PMID:25147852

  11. Arterial Stiffness in Children: Pediatric Measurement and Considerations

    PubMed Central

    Savant, Jonathan D.; Furth, Susan L.; Meyers, Kevin E.C.

    2014-01-01

    Background Arterial stiffness is a natural consequence of aging, accelerated in certain chronic conditions, and predictive of cardiovascular events in adults. Emerging research suggests the importance of arterial stiffness in pediatric populations. Methods There are different indices of arterial stiffness. The present manuscript focuses on carotid-femoral pulse wave velocity and pulse wave analysis, although other methodologies are discussed. Also reviewed are specific measurement considerations for pediatric populations and the literature describing arterial stiffness in children with certain chronic conditions (primary hypertension, obesity, diabetes, chronic kidney disease, hypercholesterolemia, genetic syndromes involving vasculopathy, and solid organ transplant recipients). Conclusions The measurement of arterial stiffness in children is feasible and, under controlled conditions, can give accurate information about the underlying state of the arteries. This potentially adds valuable information about the functionality of the cardiovascular system in children with a variety of chronic diseases well beyond that of the brachial artery blood pressure. PMID:26587447
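
    As a concrete illustration of the primary index discussed, carotid-femoral pulse wave velocity is simply path length divided by pulse transit time; the sketch below uses hypothetical values (conventions for estimating the path length differ between protocols, so the numbers are illustrative only).

        # Minimal sketch of carotid-femoral pulse wave velocity (cfPWV):
        # PWV = path length / pulse transit time. Numbers are hypothetical.
        distance_m = 0.50        # estimated carotid-to-femoral path length (m)
        transit_time_s = 0.072   # foot-to-foot pulse transit time (s)

        pwv = distance_m / transit_time_s
        print(f"cfPWV = {pwv:.1f} m/s")  # higher values indicate stiffer arteries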

  12. New methodology for fast prediction of wheel wear evolution

    NASA Astrophysics Data System (ADS)

    Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.

    2017-07-01

    In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics with time. However, one of the principal drawbacks of the existing methodologies for calculating wear evolution is their computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps: the first is the substitution of calculations over the whole network by the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred; the second is the substitution of the dynamic calculation (time-integration calculations) by a quasi-static calculation (solving the quasi-static state of the vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction in computational cost to be obtained while maintaining an acceptable level of accuracy (errors of the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained in the case studies allow the conclusion that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.
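
    The abstract does not specify the wear law used; purely as an illustration of how a per-step wear update can be driven by precomputed (quasi-static) contact conditions, here is a generic Archard-type sketch with hypothetical values, not the authors' model.

        # Generic Archard-type wear update driven by fixed contact conditions.
        # Illustrative only; parameter values are hypothetical.
        k_wear = 3.0e-7        # wear coefficient, mm^3 per (N*m), hypothetical
        normal_force = 5.0e4   # wheel-rail normal force (N), hypothetical
        slip = 0.002           # sliding distance per metre rolled, hypothetical
        band_area_mm2 = 3.0e4  # running-band area over which wear is spread

        depth_mm = 0.0
        for step in range(1, 11):                  # 10 steps of 1000 km each
            sliding_m = slip * 1000.0 * 1000.0     # sliding distance per step (m)
            wear_volume_mm3 = k_wear * normal_force * sliding_m
            depth_mm += wear_volume_mm3 / band_area_mm2
            print(f"after {step * 1000} km: wear depth = {depth_mm:.4f} mm")

    The computational saving described in the abstract comes from evaluating the contact conditions only at characteristic points and reusing them across such update steps, instead of re-running a full dynamic simulation for every increment.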

  13. A methodology for investigating interdependencies between measured throughfall, meteorological variables and canopy structure on a small catchment.

    NASA Astrophysics Data System (ADS)

    Maurer, Thomas; Gustavos Trujillo Siliézar, Carlos; Oeser, Anne; Pohle, Ina; Hinz, Christoph

    2016-04-01

    In evolving initial landscapes, vegetation development depends on a variety of feedback effects. One of the less understood feedback loops is the interaction between throughfall and plant canopy development. The amount of throughfall is governed by the characteristics of the vegetation canopy, whereas vegetation pattern evolution may in turn depend on the spatio-temporal distribution of throughfall. Meteorological factors that may influence throughfall, while at the same time interacting with the canopy, are, e.g., wind speed, wind direction and rainfall intensity. Our objective is to investigate how throughfall, vegetation canopy and meteorological variables interact in an exemplary eco-hydrological system in its initial development phase, in which the canopy is very heterogeneous and rapidly changing. For that purpose, we developed a methodological approach combining field methods, raster image analysis and multivariate statistics. The research area for this study is the Hühnerwasser ('Chicken Creek') catchment in Lower Lusatia, Brandenburg, Germany, where after eight years of succession, the spatial distribution of plant species is highly heterogeneous, leading to increasingly differentiated throughfall patterns. The constructed 6-ha catchment offers ideal conditions for our study due to the rapidly changing vegetation structure and the availability of complementary monitoring data. Throughfall data were obtained from 50 tipping-bucket rain gauges, arranged in two transects covering the predominant vegetation types of the catchment (locust copses, dense sallow thorn bushes and reeds, base herbaceous and medium-rise small-reed vegetation, and open areas covered by moss and lichens) and connected via a wireless sensor network. The spatial configuration of the vegetation canopy for each measurement site was described via digital image analysis of hemispherical photographs of the canopy using the ArcGIS Spatial Analyst, GapLight and ImageJ software. Meteorological data
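
    As a simplified stand-in for what canopy-image tools of this kind compute, the sketch below estimates canopy openness as the fraction of sky pixels inside the circular field of view of a binarized hemispherical photograph; the thresholded demo image is synthetic, and this is not the GapLight or ImageJ workflow itself.

        import numpy as np

        def canopy_openness(binary_img):
            """binary_img: 2-D array, 1 = sky, 0 = canopy; square crop of the
            hemispherical image circle. Returns the sky fraction in the circle."""
            h, w = binary_img.shape
            yy, xx = np.mgrid[0:h, 0:w]
            r = min(h, w) / 2.0
            inside = (xx - w / 2.0)**2 + (yy - h / 2.0)**2 <= r**2
            return float(binary_img[inside].mean())

        rng = np.random.default_rng(0)
        demo = (rng.random((400, 400)) > 0.7).astype(np.uint8)  # fake thresholded photo
        print(f"canopy openness = {canopy_openness(demo):.1%}")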

  14. The Differential Effect of Attentional Condition on Subsequent Vocabulary Development

    ERIC Educational Resources Information Center

    Mohammed, Halah Abdulelah; Majid, Norazman Abdul; Abdullah, Tina

    2016-01-01

    This study addressed the effect of attentional condition on subsequent vocabulary development from a different perspective, addressing several potential methodological issues of previous research that has been based on the psycholinguistic notion of the second language learner as a limited-capacity processor. The…

  15. Measuring the human psychophysiological conditions without contact

    NASA Astrophysics Data System (ADS)

    Scalise, L.; Casacanditella, L.; Cosoli, G.

    2017-08-01

    Heart rate variability (HRV) analysis studies the variations of cardiac rhythm caused by autonomic regulation. HRV analysis can be applied to the study of the effects of mental or physical stressors on psychophysiological conditions. The present work is a pilot study performed on a 23-year-old healthy subject. The measurement of HRV was performed by means of two sensors: an electrocardiograph and a laser Doppler vibrometer (LDV), a non-contact device able to detect the skin vibrations related to cardiac activity. The study aims to evaluate the effects of a physical task on HRV parameters (in both the time and frequency domains), and consequently on autonomic regulation, and the capability of laser Doppler vibrometry to correctly detect the effects of stress on heart rate variability. The results show a significant reduction of HRV parameters caused by the execution of the physical task (variations of 25-40% for parameters in the time domain, and even higher in the frequency domain); this is consistent with the fact that stress reduces the capability of the organism to vary the heart rate (and, consequently, limits HRV). The LDV was able to correctly detect this phenomenon in the time domain, while the parameters in the frequency domain showed significant deviations with respect to the gold-standard technique (ECG). This may be due to movement artefacts that substantially modified the shape of the vibration signal measured by the LDV after the physical task was performed. In the future, in order to avoid this drawback, the LDV technique could be used to evaluate the effects of a mental task on HRV signals (i.e. the evaluation of mental stress).
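
    For reference, two of the standard time-domain HRV parameters mentioned here (SDNN and RMSSD) are straightforward to compute from a series of inter-beat (RR) intervals; the interval values in the sketch below are hypothetical.

        import numpy as np

        # Minimal sketch of two standard time-domain HRV parameters computed
        # from inter-beat (RR) intervals in milliseconds (values hypothetical).
        rr_ms = np.array([812, 798, 830, 845, 801, 779, 825, 840, 810, 795], float)

        sdnn = rr_ms.std(ddof=1)                     # SDNN: overall variability
        rmssd = np.sqrt(np.mean(np.diff(rr_ms)**2))  # RMSSD: beat-to-beat variability

        print(f"SDNN  = {sdnn:.1f} ms")
        print(f"RMSSD = {rmssd:.1f} ms")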

  16. Innovative methodology for intercomparison of radionuclide calibrators using short half-life in situ prepared radioactive sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveira, P. A.; Santos, J. A. M., E-mail: joao.santos@ipoporto.min-saude.pt; Serviço de Física Médica do Instituto Português de Oncologia do Porto Francisco Gentil, EPE, Porto

    2014-07-15

    Purpose: An original radionuclide calibrator method for activity determination is presented. The method could be used for intercomparison surveys for short half-life radioactive sources used in Nuclear Medicine, such as {sup 99m}Tc or most positron emission tomography radiopharmaceuticals. Methods: By evaluating the resulting net optical density (netOD) using a standardized scanning method for irradiated Gafchromic XRQA2 film, a comparison of the netOD measurement with a previously determined calibration curve can be made and the difference between the tested radionuclide calibrator and a radionuclide calibrator used as the reference device can be calculated. To estimate the total expected measurement uncertainties, a careful analysis of the methodology, for the case of {sup 99m}Tc, was performed: reproducibility determination, scanning conditions, and possible fadeout effects. Since every factor of the activity measurement procedure can influence the final result, the method also evaluates correct syringe positioning inside the radionuclide calibrator. Results: As an alternative to using a calibrated source sent to the surveyed site, which requires a relatively long half-life of the nuclide, or sending a portable calibrated radionuclide calibrator, the proposed method uses a source prepared in situ. An indirect activity determination is achieved by the irradiation of a radiochromic film using {sup 99m}Tc under strictly controlled conditions, and by calculation of the cumulated activity from the initial activity and the total irradiation time. The irradiated Gafchromic film and the irradiator, without the source, can then be sent to a National Metrology Institute for evaluation of the results. Conclusions: The methodology described in this paper was shown to have good potential for accurate (3%) radionuclide calibrator intercomparison studies for {sup 99m}Tc between Nuclear Medicine centers without source transfer, and can easily be adapted to other short half
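
    Two quantities in this record reduce to simple formulas: the net optical density of the scanned film and the cumulated activity of a decaying source over the irradiation window. The sketch below works through both with hypothetical pixel values and activities (the 6-hour {sup 99m}Tc half-life is a physical constant; everything else is invented for illustration).

        import math

        # Net optical density from scanned radiochromic film:
        # netOD = log10(PV_unexposed / PV_exposed), with PV = mean pixel value.
        pv_before = 41250.0   # mean pixel value of the unexposed scan, hypothetical
        pv_after = 28900.0    # mean pixel value after irradiation, hypothetical
        net_od = math.log10(pv_before / pv_after)
        print(f"netOD = {net_od:.4f}")

        # Cumulated activity (total decays) for a source irradiating the film
        # from t = 0 to T: integral of A0*exp(-lambda*t) dt.
        A0 = 500e6                    # initial activity (Bq), hypothetical
        half_life_s = 6.0 * 3600.0    # Tc-99m half-life, about 6 h
        lam = math.log(2) / half_life_s
        T = 2 * 3600.0                # irradiation time (s), hypothetical
        cumulated = A0 / lam * (1 - math.exp(-lam * T))
        print(f"cumulated activity = {cumulated:.3e} decays")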

  17. Measuring workload for tuberculosis service provision at primary care level: a methodology

    PubMed Central

    2012-01-01

    We developed and piloted a methodology to establish the TB-related workload at primary care level for clinical and laboratory staff. Workload is influenced by the activities to be implemented, the time needed to perform them, their frequency, and the patient load. Of particular importance are the patient pathway for diagnosis and treatment and the frequency of clinic visits. Using observation with checklists, clocking, interviews and review of registers allows the contribution of different factors to the workload to be assessed. PMID:22640406
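
    The workload logic described reduces to a simple product-and-sum; the sketch below makes it explicit with invented activity times and frequencies, not the study's measurements.

        # Staff time = sum over activities of
        #   (minutes per occurrence) x (occurrences per patient) x (patient load).
        # All numbers are hypothetical illustrations.
        activities = [
            # (name, minutes per occurrence, occurrences per patient per month)
            ("sputum smear microscopy", 25.0, 2.0),
            ("treatment support visit",  5.0, 20.0),
            ("register update",          3.0, 4.0),
        ]
        patients_per_month = 12

        total_min = sum(mins * freq * patients_per_month
                        for _, mins, freq in activities)
        print(f"TB workload = {total_min / 60:.1f} staff-hours per month")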

  18. Viability, Advantages and Design Methodologies of M-Learning Delivery

    ERIC Educational Resources Information Center

    Zabel, Todd W.

    2010-01-01

    The purpose of this study was to examine the viability and principal design methodologies of Mobile Learning models in developing regions. Demographic and market studies were utilized to determine the viability of M-Learning delivery as well as the best uses for such technologies and methods given the socioeconomic and political conditions within the…

  19. Motherhood, Migration and Methodology: Giving Voice to the "Other"

    ERIC Educational Resources Information Center

    De Souza, Ruth

    2004-01-01

    This paper discusses the need for multi-cultural methodologies that develop knowledge about the maternity experience of migrant women and that are attuned to women's maternity-related requirements under multi-cultural conditions. Little is known about the transition to parenthood for mothers in a new country, particularly when the country is New…

  20. The Measure of a Nation: The USDA and the Rise of Survey Methodology

    ERIC Educational Resources Information Center

    Mahoney, Kevin T.; Baker, David B.

    2007-01-01

    Survey research has played a major role in American social science. An outgrowth of efforts by the United States Department of Agriculture in the 1930s, the Division of Program Surveys (DPS) played an important role in the development of survey methodology. The DPS was headed by the ambitious and entrepreneurial Rensis Likert, populated by young…

  1. Mechanical measurement of hydrogen bonded host-guest systems under non-equilibrium, near-physiological conditions.

    PubMed

    Naranjo, Teresa; Cerrón, Fernando; Nieto-Ortega, Belén; Latorre, Alfonso; Somoza, Álvaro; Ibarra, Borja; Pérez, Emilio M

    2017-09-01

    Decades after the birth of supramolecular chemistry, there are many techniques to measure noncovalent interactions, such as hydrogen bonding, under equilibrium conditions. As ensembles of molecules rapidly lose coherence, we cannot extrapolate bulk data to single-molecule events under non-equilibrium conditions, which are more relevant to the dynamics of biological systems. We present a new method that exploits the high force resolution of optical tweezers to measure, at the single-molecule level, the mechanical strength of a hydrogen-bonded host-guest pair out of equilibrium and under near-physiological conditions. We utilize a DNA reporter to unambiguously isolate single binding events. The Hamilton receptor-cyanuric acid host-guest system is used as a test bed. The force required to dissociate the host-guest system is ∼17 pN and increases with the pulling rate, as expected for a system under non-equilibrium conditions. Blocking one of the hydrogen bonding sites results in a significant decrease of the force-to-break by 1-2 pN, pointing to the ability of the method to resolve subtle changes in the mechanical strength of the binding due to the individual H-bonding components. We believe the method will prove to be a versatile tool to address important questions in supramolecular chemistry.
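
    The loading-rate dependence noted here is commonly rationalized with the Bell-Evans picture, in which the most probable rupture force grows logarithmically with the loading rate. The sketch below evaluates that relation with hypothetical kinetic parameters; it is not a fit to the study's data.

        import math

        # Bell-Evans most probable rupture force:
        #   F* = (kBT / x) * ln(r * x / (koff * kBT))
        # where r is the loading rate, x the distance to the transition state,
        # and koff the zero-force off-rate. Parameter values are hypothetical.
        kBT = 4.11e-21      # thermal energy at ~298 K (J)
        x_beta = 1.0e-9     # distance to transition state (m), hypothetical
        koff = 0.05         # zero-force off-rate (1/s), hypothetical

        for rate_pN_per_s in (1.0, 10.0, 100.0):
            r = rate_pN_per_s * 1e-12   # loading rate in N/s
            f_star = (kBT / x_beta) * math.log(r * x_beta / (koff * kBT))
            print(f"loading rate {rate_pN_per_s:6.1f} pN/s -> "
                  f"F* = {f_star * 1e12:.1f} pN")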

  2. Benzene exposure in the petroleum distribution industry associated with leukemia in the United Kingdom: overview of the methodology of a case-control study.

    PubMed Central

    Rushton, L

    1996-01-01

    This paper describes the basic principles underlying the methodology for obtaining quantitative estimates of benzene exposure in the petroleum marketing and distribution industry. Work histories for 91 cases of leukemia and 364 matched controls (4 per case), identified from a cohort of oil distribution workers up to the end of 1992, were obtained, primarily from personnel records. Information on the distribution sites, more than 90% of which were closed at the time of data collection, was obtained from site visits and archive material. Industrial hygiene measurements made under known conditions were assembled for different tasks. Where measured data were not available, exposures were adjusted using variables known to influence exposure, such as temperature, technology, the percentage of benzene in the fuel handled, the products handled, the number of loads, and the job activity. A quantitative estimate of dermal contact and peak exposure was also made. PMID:9118922

  3. Increasing accuracy in the assessment of motion sickness: A construct methodology

    NASA Technical Reports Server (NTRS)

    Stout, Cynthia S.; Cowings, Patricia S.

    1993-01-01

    The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods used in the framework of a construct methodology are inadequate. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that, when applied, will probably resolve some of these problems are described in detail.

  4. A New Methodology for Open Pit Slope Design in Karst-Prone Ground Conditions Based on Integrated Stochastic-Limit Equilibrium Analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ke; Cao, Ping; Ma, Guowei; Fan, Wenchen; Meng, Jingjing; Li, Kaihui

    2016-07-01

    Using the Chengmenshan Copper Mine as a case study, a new methodology for open pit slope design in karst-prone ground conditions is presented, based on integrated stochastic-limit equilibrium analysis. The numerical modeling and optimization design procedure comprises collection of drill core data, karst cave stochastic model generation, SLIDE simulation and bisection-method optimization. Borehole investigations were performed, and the statistical results show that the length of the karst caves fits a negative exponential distribution model, whereas the length of carbonatite does not exactly follow any standard distribution. The inverse transform method and the acceptance-rejection method are used to reproduce the lengths of the karst caves and carbonatite, respectively. A code for karst cave stochastic model generation, named KCSMG, was developed. The stability of the rock slope with the karst cave stochastic model is analyzed by combining the KCSMG code and the SLIDE program. This approach is then applied to study the effect of the karst caves on the stability of the open pit slope, and a procedure to optimize the open pit slope angle is presented.
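
    The two sampling methods named here are standard; the sketch below shows each applied to synthetic length distributions (the distribution parameters are hypothetical, not the mine's drill-core statistics).

        import math
        import random

        def karst_length_inverse_transform(mean_len):
            """Negative-exponential lengths via inverse transform:
            L = -mean * ln(1 - U), with U uniform on [0, 1)."""
            u = random.random()
            return -mean_len * math.log(1.0 - u)

        def carbonatite_length_rejection(density, x_max, density_max):
            """Acceptance-rejection sampling from an arbitrary bounded
            (possibly unnormalized) density on [0, x_max]."""
            while True:
                x = random.uniform(0.0, x_max)
                if random.uniform(0.0, density_max) <= density(x):
                    return x

        # Hypothetical non-standard (triangular-shaped) density for carbonatite.
        tri_density = lambda x: max(0.0, 0.5 - abs(x - 2.0) / 8.0)

        random.seed(1)
        caves = [karst_length_inverse_transform(1.5) for _ in range(5)]
        carb = [carbonatite_length_rejection(tri_density, 6.0, 0.5) for _ in range(5)]
        print("karst cave lengths (m): ", [round(v, 2) for v in caves])
        print("carbonatite lengths (m):", [round(v, 2) for v in carb])

    Inverse transform is the natural choice when the fitted distribution has a closed-form inverse CDF (as the negative exponential does); acceptance-rejection covers the carbonatite case, where no standard distribution fits.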

  5. On the Ground or in the Air? A Methodological Experiment on Crop Residue Cover Measurement in Ethiopia

    NASA Astrophysics Data System (ADS)

    Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash

    2017-10-01

    Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field-level crop residue coverage for a given plot, each with its own implications for survey budget, implementation speed, and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods is compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. Results deliver a ranking of measurement options that can inform survey practitioners and researchers.
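
    For orientation, the benchmark line-transect method amounts to a point-intercept count: walk a transect, record at regular marks whether residue lies beneath the point, and take the hit fraction as the cover estimate. A minimal sketch with invented observations:

        # Line-transect (point-intercept) residue cover estimate.
        # 1 = residue under the mark, 0 = bare soil. Data are hypothetical.
        marks = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0]

        cover = sum(marks) / len(marks)
        print(f"estimated residue cover = {cover:.0%}")
        print("exceeds 30% categorical threshold:", cover > 0.30)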

  6. On the Ground or in the Air? A Methodological Experiment on Crop Residue Cover Measurement in Ethiopia.

    PubMed

    Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash

    2017-10-01

    Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field-level crop residue coverage for a given plot, each with its own implications for survey budget, implementation speed, and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods is compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. The results deliver a ranking of measurement options that can inform survey practitioners and researchers.

  7. Methodology for senior-proof guidelines: A practice example from the Netherlands.

    PubMed

    van Munster, Barbara C; Portielje, Johanna E A; Maier, Andrea B; Arends, Arend J; de Beer, Johannes J A

    2018-02-01

    Evidence-based guidelines constitute a foundation for medical decision making. It is often unclear whether recommendations in general guidelines also apply to older people. This study aimed to develop a methodology to increase the focus on older people in the development of guidelines. The methodology distinguishes 4 groups of older people: (1) relatively healthy older people; (2) older people with 1 additional specific (interfering) comorbid condition; (3) older people with multimorbidity; and (4) vulnerable older people. The required level of focus on older people may be determined by the prevalence of the disease or condition, the level of suffering, social relevance, and the expectation that a guideline may improve the quality of care. A specialist in geriatric medicine may be involved in the guideline process via participation, provision of feedback on drafts, or involvement in the analysis of problem areas. Regarding the patient perspective, it is advised to involve organisations for older people or informal carers in the inventory of problem areas, and additionally to perform literature research on patient values on the subject. If the guideline focuses on older people, then the relative importance of the various outcome measures for this target group needs to be explicitly stated. Search strategies for all 4 groups are suggested. For clinical studies that focus on the treatment of diseases that frequently occur in older people, a check should be made as to whether these studies produce the required evidence; this can be achieved by verifying that older people are sufficiently represented in the studies and that results for this age group are reported separately. © 2017 John Wiley & Sons, Ltd.

  8. The Statistical point of view of Quality: the Lean Six Sigma methodology

    PubMed Central

    Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto

    2015-01-01

    Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures; hence its use in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After a review of the methodology, the paper presents a brief clinical example of Lean Six Sigma used as a quality improvement method to reduce complications during and after lobectomies. Using Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could yield a statistically grounded measurement of surgical quality. PMID:25973253
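
The statistical core of Six Sigma is the conversion of an observed defect rate into a sigma level. The sketch below shows that standard calculation; the lobectomy numbers and the choice of five "opportunities" per case are hypothetical, not figures from the paper.

```python
from scipy.stats import norm

def sigma_level(defects, units, opportunities_per_unit, shift=1.5):
    """Convert observed defects to a short-term sigma level.

    DPMO = defects per million opportunities; the conventional 1.5-sigma
    shift converts the long-term defect rate to a short-term sigma level.
    """
    dpmo = defects / (units * opportunities_per_unit) * 1e6
    return dpmo, norm.ppf(1.0 - dpmo / 1e6) + shift

# Hypothetical example: 14 complications in 180 lobectomies, each case
# offering 5 monitored opportunities for a defect (e.g., process steps).
dpmo, sigma = sigma_level(defects=14, units=180, opportunities_per_unit=5)
print(f"DPMO = {dpmo:.0f}, sigma level = {sigma:.2f}")
```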

  9. The Statistical point of view of Quality: the Lean Six Sigma methodology.

    PubMed

    Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto

    2015-04-01

    Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures; hence its use in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After a review of the methodology, the paper presents a brief clinical example of Lean Six Sigma used as a quality improvement method to reduce complications during and after lobectomies. Using Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could yield a statistically grounded measurement of surgical quality.

  10. Simulations of the Richtmyer-Meshkov Instability with experimentally measured volumetric initial conditions

    NASA Astrophysics Data System (ADS)

    Ferguson, Kevin; Sewell, Everest; Krivets, Vitaliy; Greenough, Jeffrey; Jacobs, Jeffrey

    2016-11-01

    Initial conditions for the Richtmyer-Meshkov instability (RMI) are measured in three dimensions in the University of Arizona Vertical Shock Tube using a moving magnet galvanometer system. The resulting volumetric data are used as initial conditions for simulations of the RMI using the ARES code at Lawrence Livermore National Laboratory (LLNL). The heavy gas is sulfur hexafluoride (SF6), and the light gas is air. The perturbations are generated by harmonically oscillating the gases vertically using two loudspeakers mounted to the shock tube, which drive Faraday resonance and produce a random short-wavelength perturbation on the interface. Planar Mie scattering is used to illuminate the flow field through the addition of propylene glycol particles seeded in the heavy gas. An M = 1.2 shock impulsively accelerates the interface, initiating instability growth. Images of the initial condition and instability growth are captured at a rate of 6 kHz using high-speed cameras. Comparisons between experimental and simulation results, mixing diagnostics, and mixing zone growth are presented.
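
For context, the key dimensionless parameter governing RMI growth at a two-gas interface is the Atwood number. A minimal calculation for the air/SF6 pairing used here, with nominal densities at roughly room temperature and 1 atm (the experiment's exact conditions may differ):

```python
# Atwood number A = (rho_heavy - rho_light) / (rho_heavy + rho_light).
# Densities are nominal values near 25 degC and 1 atm.
rho_air = 1.18   # kg/m^3
rho_sf6 = 6.0    # kg/m^3 (SF6 is roughly 5x denser than air)

atwood = (rho_sf6 - rho_air) / (rho_sf6 + rho_air)
print(f"A = {atwood:.2f}")   # ~0.67, a strongly stratified interface
```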

  11. Methodological Gravitism

    ERIC Educational Resources Information Center

    Zaman, Muhammad

    2011-01-01

    In this paper the author presents the case of the exchange marriage system to delineate a model of methodological gravitism. Such a model is not a deviation from or alteration to the existing qualitative research approaches. I have adopted culturally specific methodology to investigate spouse selection in line with the Grounded Theory Method. This…

  12. Improving training in methodology enriches the science of psychology.

    PubMed

    Aiken, Leona S; West, Stephen G; Millsap, Roger E

    2009-01-01

    Replies to the comment "Ramifications of increased training in quantitative methodology" by Herbert Zimiles on the current authors' original article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America". The current authors state that in their recent article they reported the results of an extensive survey of quantitative training in all PhD programs in North America. They compared these results with those of a similar survey conducted 12 years earlier (Aiken, West, Sechrest, & Reno, 1990) and raised issues for the future methodological training of substantive and quantitative researchers in psychology. The authors then respond to Zimiles' three questions. PsycINFO Database Record © 2009 APA.

  13. Conditional standard errors of measurement for composite scores on the Wechsler Preschool and Primary Scale of Intelligence-Third Edition.

    PubMed

    Price, Larry R; Raju, Nambury; Lurie, Anna; Wilkins, Charles; Zhu, Jianjun

    2006-02-01

    A specific recommendation of the 1999 Standards for Educational and Psychological Testing by the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education is that test publishers report estimates of the conditional standard error of measurement (SEM). Procedures for calculating the conditional (score-level) SEM based on raw scores are well documented; however, few procedures have been developed for estimating the conditional SEM of subtest or composite scale scores resulting from a nonlinear transformation. Item response theory provided the psychometric foundation to derive the conditional standard errors of measurement and confidence intervals for composite scores on the Wechsler Preschool and Primary Scale of Intelligence-Third Edition.
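
The IRT foundation mentioned in this record reduces to a compact formula: test information is the sum of item informations, and the conditional SEM at ability theta is the reciprocal square root of that sum. The sketch below uses the 2PL model with hypothetical item parameters, not the WPPSI-III calibration.

```python
import numpy as np

def conditional_sem(theta, a, b, D=1.7):
    """Conditional SEM from 2PL item response theory.

    Item information: I_i(theta) = D^2 * a_i^2 * P_i * (1 - P_i);
    test information is the sum over items, and
    SEM(theta) = 1 / sqrt(I(theta)), so error varies across the score scale.
    Discriminations `a` and difficulties `b` here are hypothetical.
    """
    p = 1.0 / (1.0 + np.exp(-D * a * (theta - b)))
    info = np.sum(D**2 * a**2 * p * (1.0 - p))
    return 1.0 / np.sqrt(info)

a = np.array([1.2, 0.9, 1.5, 1.1, 0.8])    # hypothetical discriminations
b = np.array([-1.0, -0.3, 0.0, 0.6, 1.2])  # hypothetical difficulties
for theta in (-2.0, 0.0, 2.0):
    print(f"theta={theta:+.1f}  SEM={conditional_sem(theta, a, b):.3f}")
```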

  14. Automated landmark extraction for orthodontic measurement of faces using the 3-camera photogrammetry methodology.

    PubMed

    Deli, Roberto; Di Gioia, Eliana; Galantucci, Luigi Maria; Percoco, Gianluca

    2010-01-01

    To set up a three-dimensional photogrammetric scanning system for precise landmark measurement without any physical contact, using a low-cost, noninvasive digital photogrammetric solution, to support several needs in clinical orthodontics and/or surgical diagnosis. Thirty coded targets were applied directly onto the soft-tissue landmarks of the subject's face, and 3 simultaneous photographs were then acquired photogrammetrically under room lighting conditions. For comparison, a dummy head was digitized both with the photogrammetric technique and with the laser scanner Minolta Vivid 910i (Konica Minolta, Tokyo, Japan). The precision of the landmark measurements ranged between 0.017 and 0.029 mm. The system automatically measures the spatial position of face landmarks, from which distances and angles can be obtained. The facial measurements were compared with those obtained using laser scanning and a manual caliper. The adopted method gives higher precision than the others (0.022-mm mean value on points and 0.038-mm mean value on linear distances on a dummy head), is simple, and can easily be used as a standard routine. The study demonstrated the validity of photogrammetry for accurate digitization of human face landmarks. This research points out the potential of this low-cost photogrammetric approach for medical digitization.
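
Multi-camera photogrammetry of coded targets ultimately rests on triangulation: each calibrated camera contributes two linear constraints on the 3-D landmark, and the over-determined system is solved by least squares. A minimal linear (DLT) triangulation sketch, with hypothetical camera matrices rather than the paper's calibration:

```python
import numpy as np

def triangulate(points_2d, proj_matrices):
    """Linear (DLT) triangulation of one 3-D landmark from >= 2 views.

    Each observation (u, v) with 3x4 projection matrix P contributes two
    rows u*P[2]-P[0] and v*P[2]-P[1]; the homogeneous 3-D point is the
    right singular vector of the stacked system with smallest singular value.
    """
    rows = []
    for (u, v), P in zip(points_2d, proj_matrices):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]
    return X[:3] / X[3]

# Toy check with three hypothetical calibrated cameras: project a known
# point, then recover it from the three images.
X_true = np.array([0.1, -0.05, 2.0])
Ps, obs = [], []
for tx in (-0.3, 0.0, 0.3):                      # three camera positions
    K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
    Rt = np.hstack([np.eye(3), np.array([[tx], [0.0], [0.0]])])
    P = K @ Rt
    x = P @ np.append(X_true, 1.0)
    Ps.append(P)
    obs.append(x[:2] / x[2])
print(triangulate(obs, Ps))   # ~ [0.1, -0.05, 2.0]
```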

  15. Measures of upper limb function for people with neck pain: a systematic review of measurement and practical properties (protocol).

    PubMed

    Alreni, Ahmad Salah Eldin; Harrop, Deborah; Gumber, Anil; McLean, Sionnadh

    2015-04-07

    Upper limb disability is a common musculoskeletal condition frequently associated with neck pain. Recent literature has reported the need to utilise validated upper limb outcome measures in the assessment and management of patients with neck pain. However, there is a lack of clear guidance about the suitability of available measures, which may impede utilisation. This review will identify all available measures of upper limb function developed for use in neck pain patients and evaluate their measurement and practical properties in order to identify those measures that are most appropriate for use in clinical practice and research. The review will be performed in two phases. Phase one will identify all measures used to assess upper limb function for patients with neck pain. Phase two will identify all available studies of the measurement and practical properties of the identified instruments. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) will be used to evaluate the methodological quality of the included studies. To ensure methodological rigour, the findings of this review will be reported in accordance with the Preferred Reporting Items for Systematic Review and Meta-Analysis (PRISMA) guideline. Optimal management of patients with neck pain should incorporate upper limb rehabilitation. The findings of this study will assist clinicians who seek to utilise suitable and accurate measures to assess upper limb function for patients with neck pain. In addition, the findings of this study may suggest new research directions to support the development of upper limb outcome measures for patients with neck pain. PROSPERO CRD42015016624.

  16. Using QALYs in telehealth evaluations: a systematic review of methodology and transparency.

    PubMed

    Bergmo, Trine S

    2014-08-03

    The quality-adjusted life-year (QALY) is a recognised outcome measure in health economic evaluations. The QALY incorporates individual preferences and identifies health gains by combining mortality and morbidity into one single index number. A literature review was conducted to examine and discuss the use of QALYs to measure outcomes in telehealth evaluations. Evaluations were identified via a literature search in all relevant databases. Only economic evaluations measuring both costs and QALYs using primary patient-level data of two or more alternatives were included. A total of 17 economic evaluations estimating QALYs were identified. All evaluations used validated generic health-related quality of life (HRQoL) instruments to describe health states. They used accepted methods for transforming the quality scores into utility values. The methodology used varied between the evaluations. The evaluations used four different preference measures (EQ-5D, SF-6D, QWB and HUI3), and utility scores were elicited from the general population. Most studies reported the methodology used in calculating QALYs. The evaluations were less transparent in reporting utility weights at different time points and variability around utilities and QALYs. Few made adjustments for differences in baseline utilities. The QALYs gained in the reviewed evaluations varied from 0.001 to 0.118, implying a small but positive effect of the telehealth intervention on patients' health. The evaluations reported mixed cost-effectiveness results. The use of QALYs in telehealth evaluations has increased over the last few years. Different methodologies and utility measures have been used to calculate QALYs. A more harmonised methodology and utility measure is needed to ensure comparability across telehealth evaluations.
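
The standard QALY calculation the review scrutinises is the area under the utility curve, with an optional baseline adjustment (the step the review found was often omitted). A minimal sketch with hypothetical utilities on a 12-month follow-up:

```python
import numpy as np

def qalys(times_years, utilities, baseline_adjust=True):
    """QALYs as the area under the utility curve (trapezoidal rule).

    Utilities are on the 0 (dead) to 1 (full health) scale, e.g. from
    EQ-5D tariffs. With `baseline_adjust`, the baseline utility is
    subtracted so between-group differences are not driven by baseline
    imbalance -- the adjustment the review found was often missing.
    """
    q = np.trapz(utilities, times_years)
    if baseline_adjust:
        q -= utilities[0] * (times_years[-1] - times_years[0])
    return q

# Hypothetical 12-month follow-up with measurements at 0, 6 and 12 months.
t = np.array([0.0, 0.5, 1.0])
telehealth = np.array([0.70, 0.76, 0.78])
usual_care = np.array([0.70, 0.71, 0.72])
print(qalys(t, telehealth) - qalys(t, usual_care))  # incremental QALYs ~0.04
```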

  17. Removing technical variability in RNA-seq data using conditional quantile normalization.

    PubMed

    Hansen, Kasper D; Irizarry, Rafael A; Wu, Zhijin

    2012-04-01

    The ability to measure gene expression on a genome-wide scale is one of the most promising accomplishments in molecular biology. Microarrays, the technology that first permitted this, were riddled with problems due to unwanted sources of variability. Many of these problems are now mitigated, after a decade's worth of statistical methodology development. The recently developed RNA sequencing (RNA-seq) technology has generated much excitement in part due to claims of reduced variability in comparison to microarrays. However, we show that RNA-seq data demonstrate unwanted and obscuring variability similar to what was first observed in microarrays. In particular, we find guanine-cytosine content (GC-content) has a strong sample-specific effect on gene expression measurements that, if left uncorrected, leads to false positives in downstream results. We also report on commonly observed data distortions that demonstrate the need for data normalization. Here, we describe a statistical methodology that improves precision by 42% without loss of accuracy. Our resulting conditional quantile normalization algorithm combines robust generalized regression to remove systematic bias introduced by deterministic features such as GC-content and quantile normalization to correct for global distortions.
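
The published method (available as the Bioconductor package cqn) combines robust generalized regression on deterministic features with quantile normalization. The simplified sketch below captures the two-stage idea: remove a smooth, sample-specific GC-content effect (a plain cubic polynomial fit stands in for the robust regression), then quantile normalize across samples. All data and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def remove_gc_bias(log_expr, gc, degree=3):
    """Remove a smooth, sample-specific GC-content effect.

    A cubic polynomial fit per sample stands in for the robust
    generalized regression used by the published cqn method.
    """
    corrected = np.empty_like(log_expr)
    for j in range(log_expr.shape[1]):                 # one curve per sample
        coef = np.polyfit(gc, log_expr[:, j], degree)
        fit = np.polyval(coef, gc)
        corrected[:, j] = log_expr[:, j] - fit + fit.mean()
    return corrected

def quantile_normalize(mat):
    """Force every sample (column) onto the same empirical distribution."""
    ranks = np.argsort(np.argsort(mat, axis=0), axis=0)
    mean_sorted = np.sort(mat, axis=0).mean(axis=1)
    return mean_sorted[ranks]

# Hypothetical toy data: 2000 genes x 4 samples with sample-specific GC bias.
gc = rng.uniform(0.3, 0.7, size=2000)
true = rng.normal(5.0, 1.0, size=(2000, 1))
bias = np.outer(gc - 0.5, rng.uniform(-2.0, 2.0, size=4))  # differs by sample
log_expr = true + bias + rng.normal(0, 0.1, size=(2000, 4))

normalized = quantile_normalize(remove_gc_bias(log_expr, gc))
```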

  18. Public Relations Telephone Surveys: Avoiding Methodological Debacles.

    ERIC Educational Resources Information Center

    Stone, Gerald C.

    1996-01-01

    Reports that a study revealed a serious methodological flaw in interviewer bias in telephone surveys. States that most surveys, using standard detection measures, would not find the defect, but outcomes were so misleading that a campaign using the results would be doomed. Warns about practitioner telephone surveys; suggests special precautions if…

  19. Open path measurements of carbon dioxide and water vapor under foggy conditions - technical problems, approaches and effects on flux measurements and budget calculations

    NASA Astrophysics Data System (ADS)

    El-Madany, T.; Griessbaum, F.; Maneke, F.; Chu, H.-S.; Wu, C.-C.; Chang, S. C.; Hsia, Y.-J.; Juang, J.-Y.; Klemm, O.

    2010-07-01

    To estimate carbon dioxide or water vapor fluxes with the eddy covariance method, high-quality data sets are necessary. Under foggy conditions this is challenging, because open path measurements are influenced by water droplets that cross the measurement path as well as deposit on the windows of the optical path. For the LI-7500, the deposition of droplets on the window reduces the intensity of the infrared beam. To maintain the strength of the infrared beam under these conditions, the energy is increased; a measure of this increased energy is given by the AGC value (Automatic Gain Control). Up to an AGC threshold value of 70%, the data from the LI-7500 are assumed to be of good quality (personal communication with LICOR). Due to fog deposition on the windows, the AGC value rises above 70% and stays there until the fog disappears and the water on the windows evaporates. To improve data quality during foggy conditions, a blower system was developed that blows the deposited water droplets off the window. The system is triggered when the AGC value rises above 70%: a pneumatic jack lifts the blower system towards the LI-7500, the water droplets are blown off with compressed air, and once the AGC value drops below 70% the jack returns to its idle position. Using this technique showed that not only the fog droplets on the window cause significant measurement problems, but also the fog droplets inside the measurement path. Under conditions of very dense fog, the measured values of carbon dioxide can become unrealistically high and negative values of water vapor can be observed even when the AGC value is below 70%. The negative values can be explained by the scattering of the infrared beam on the fog droplets. It is assumed that different types of fog droplet spectra cause the various error patterns observed. For high quality flux measurements, not only the AGC threshold value of 70 % is important, but also the fluctuation
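
The AGC-based screening described here maps naturally onto a simple two-stage quality filter. A sketch with synthetic data; column names, units and plausibility limits are illustrative, not the instrument's file format:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Hypothetical 1-minute, 10 Hz open-path record; columns are illustrative.
n = 600
df = pd.DataFrame({
    "co2": rng.normal(15.5, 0.2, n),     # mmol m-3, hypothetical
    "h2o": rng.normal(500.0, 20.0, n),   # mmol m-3, hypothetical
    "agc": np.where(np.arange(n) % 100 < 80, 55.0, 87.5),  # fog episodes
})

AGC_MAX = 70.0   # threshold below which data are assumed good

good = df[df["agc"] <= AGC_MAX]
print(f"flagged: {(df['agc'] > AGC_MAX).mean():.0%} of samples")

# As the abstract stresses, AGC <= 70% alone is not sufficient in dense
# fog: droplets inside the open path can still bias the readings, so
# physical plausibility limits make a sensible second-stage filter.
good = good[(good["h2o"] > 0) & (good["co2"] > 0)]
```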

  20. Effective Wettability Measurements of CO2-Brine-Sandstone System at Different Reservoir Conditions

    NASA Astrophysics Data System (ADS)

    Al-Menhali, Ali; Krevor, Samuel

    2014-05-01

    The wetting properties of CO2-brine-rock systems will have a major impact on the management of CO2 injection processes. The wettability of a system controls the flow and trapping efficiency during the storage of CO2 in geological formations as well as the efficiency of enhanced oil recovery (EOR) operations. Despite its utility in EOR and the continued development of CCS, little is currently known about the wetting properties of the CO2-brine system on reservoir rocks, and no investigations have been performed assessing the impact of these properties on CO2 flooding for CO2 storage or EOR. The wetting properties of multiphase fluid systems in porous media have a major impact on multiphase flow properties such as capillary pressure and relative permeability. While recent studies have shown CO2 to generally act as a non-wetting phase in siliciclastic rocks, some observations report that the contact angle varies with pressure, temperature and water salinity. Additionally, there is a wide range of reported contact angles for this system, from strongly to weakly water-wet; in the case of some minerals, intermediate-wet contact angles have been observed. Uncertainty about the wetting properties of CO2-brine systems is currently one of the major unresolved issues in reservoir management of CO2 storage. In this study, we make semi-dynamic capillary pressure measurements of supercritical CO2 and brine at reservoir conditions to observe shifts in the wetting properties. We utilize a novel core analysis technique recently developed by Pini et al. (2012) to evaluate a core-scale effective contact angle. Carbon dioxide is injected at a constant flow rate into a core that is initially fully saturated with water, while maintaining a constant outlet pressure. In this scenario, the pressure drop across the core corresponds to the capillary pressure at the inlet face of the core. When compared with mercury intrusion capillary pressure measurements
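
The comparison with mercury intrusion data rests on Young-Laplace scaling: for the same pore radius, Pc = 2*sigma*cos(theta)/r, so the ratio of the reservoir-condition curve to the MICP curve at matched saturation yields a core-scale effective contact angle. A hedged sketch of that scaling step; the pressure and interfacial-tension values below are hypothetical, and the mercury constants are the conventional MICP assumptions:

```python
import numpy as np

def effective_contact_angle(pc_res, pc_hg, sigma_res,
                            sigma_hg=485e-3, theta_hg=140.0):
    """Effective contact angle from capillary pressure scaling.

    Young-Laplace gives Pc = 2*sigma*cos(theta)/r for the same pore radius,
    so the ratio of a reservoir-condition curve to the mercury-intrusion
    curve yields cos(theta_eff). Mercury surface tension (485 mN/m) and
    contact angle (140 deg) are conventional MICP values.
    """
    cos_theta = (pc_res / pc_hg) * sigma_hg * abs(np.cos(np.radians(theta_hg))) / sigma_res
    return np.degrees(np.arccos(cos_theta))

# Hypothetical point: 7 kPa CO2-brine Pc vs 90 kPa MICP Pc at the same
# saturation, with sigma(CO2-brine) = 35 mN/m at reservoir conditions.
print(effective_contact_angle(pc_res=7.0e3, pc_hg=90e3, sigma_res=35e-3))
# ~34 deg: a water-wet result, consistent with the siliciclastic literature.
```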

  1. Transposition of Francis turbine cavitation compliance at partial load to different operating conditions

    NASA Astrophysics Data System (ADS)

    Gomes, J.; Favrel, A.; Landry, C.; Nicolet, C.; Avellan, F.

    2017-04-01

    Francis turbines operating at part load conditions experience a swirling flow at the runner outlet, leading to the development of a precessing cavitation vortex rope in the draft tube. This cavitation vortex rope drastically changes the velocity of pressure waves traveling in the draft tube and may lead to resonance conditions in the hydraulic circuit. Since the wave speed is strongly related to the cavitation compliance, this research work presents a simple model to explain how the compliance is affected by variations in operating conditions and proposes a method to transpose its values. Even though the focus of this paper is on transpositions within the same turbine scale, the methodology is also expected to be tested for model-to-prototype transposition in the future. Comparisons between measurements and calculations show good agreement.

  2. METHOD FOR SIMULTANEOUS 90SR AND 137CS IN-VIVO MEASUREMENTS OF SMALL ANIMALS AND OTHER ENVIRONMENTAL MEDIA DEVELOPED FOR THE CONDITIONS OF THE CHERNOBYL EXCLUSION ZONE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farfan, E.; Jannik, T.

    To perform in vivo simultaneous measurements of the 90Sr and 137Cs content in the bodies of animals living in the Chernobyl Exclusion Zone (ChEZ), an appropriate method and equipment were developed and installed in a mobile gamma-beta spectrometry laboratory. This technique was designed for animals of relatively small sizes (up to 50 g). The 90Sr content is measured by a beta spectrometer with a 0.1 mm thick scintillation plastic detector. The spectrum processing takes into account the fact that the measured object is 'thick-layered' and contains a comparable quantity of 137Cs, which is a characteristic condition of the ChEZ. The 137Cs content is measured by a NaI scintillation detector that is part of the combined gamma-beta spectrometry system. For environmental research performed in the ChEZ, the advantages of this method and equipment (rapid measurements, capability to measure live animals directly in their habitat, and the capability of simultaneous 90Sr and 137Cs measurements) far outweigh the existing limitations (considerations must be made for background radiation and the animal size, skeletal shape and body mass). The accuracy of these in vivo measurements is shown to be consistent with standard spectrometric and radiochemical methods. Apart from the in vivo measurements, the proposed methodology, after a very simple upgrade that is also described in the article, works even more accurately with samples of other media, such as soil and plants.

  3. A methodology for the evaluation of the human-bioclimatic performance of open spaces

    NASA Astrophysics Data System (ADS)

    Charalampopoulos, Ioannis; Tsiros, Ioannis; Chronopoulou-Sereli, Aik.; Matzarakis, Andreas

    2017-05-01

    The purpose of this paper is to present a simple methodology to improve the evaluation of the human-biometeorological benefits of open spaces. It is based on two groups of new indices built on the well-known physiologically equivalent temperature (PET) index. This simple methodology, along with the accompanying indices, allows a qualitative and quantitative evaluation of the climatic behavior of the selected sites. The proposed methodology was applied in human-biometeorological research in the city of Athens, Greece. The results of this study are in line with those of other related studies, indicating the considerable influence of the sky view factor (SVF), the presence of vegetation and the building materials on human-biometeorological conditions. The proposed methodology may provide new insights in the decision-making process related to the best configuration of urban open spaces.
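
PET-based site comparison typically starts from classifying each hour of a PET time series into thermal-perception bands. The sketch below computes the share of hours per class using the commonly cited Matzarakis and Mayer scale for Western/Central Europe; it illustrates the spirit of such indices, not the paper's exact definitions, and the two site series are synthetic.

```python
import numpy as np

# PET thermal perception classes (Matzarakis & Mayer scale for
# Western/Central Europe); bounds in deg C.
PET_CLASSES = [(-np.inf, 4, "very cold"), (4, 8, "cold"), (8, 13, "cool"),
               (13, 18, "slightly cool"), (18, 23, "comfortable"),
               (23, 29, "slightly warm"), (29, 35, "warm"),
               (35, 41, "hot"), (41, np.inf, "very hot")]

def comfort_summary(pet_series):
    """Share of hours per thermal-perception class for one site."""
    pet = np.asarray(pet_series)
    return {label: np.mean((pet >= lo) & (pet < hi))
            for lo, hi, label in PET_CLASSES}

# Hypothetical hourly PET for two sites: one tree-shaded (low sky view
# factor), one exposed.
rng = np.random.default_rng(3)
shaded = rng.normal(24.0, 4.0, size=24 * 30)
exposed = rng.normal(31.0, 6.0, size=24 * 30)
for name, pet in (("shaded", shaded), ("exposed", exposed)):
    share = comfort_summary(pet)
    print(name, f"comfortable: {share['comfortable']:.0%}, "
          f"hot or worse: {share['hot'] + share['very hot']:.0%}")
```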

  4. Extending acoustic data measured with small-scale supersonic model jets to practical aircraft exhaust jets

    NASA Astrophysics Data System (ADS)

    Kuo, Ching-Wen

    2010-06-01

    Modern military aircraft jet engines are designed with variable geometry nozzles to provide optimum thrust at the different operating conditions within the flight envelope. However, acoustic measurements for such nozzles are scarce, due to the cost involved in making full-scale measurements and the lack of details about the exact geometry of these nozzles. Thus the present effort at The Pennsylvania State University and the NASA Glenn Research Center, in partnership with GE Aviation, aims to study and characterize the acoustic field produced by supersonic jets issuing from converging-diverging military-style nozzles. An equally important objective is to develop a scaling methodology for using data obtained from small- and moderate-scale experiments, such that the measured noise levels become independent of jet size. The experimental results presented in this thesis show reasonable agreement between small-scale and moderate-scale jet acoustic data, as well as between heated jets and heat-simulated ones. Once the scaling methodology is validated, it will be extended to using acoustic data measured with small-scale supersonic model jets to predict the most important components of full-scale engine noise. When comparing acoustic spectra measured with a microphone array set at different radial locations, the characteristics of the jet noise source distribution may induce subtle inaccuracies, depending on the jet operating conditions. A close look is taken at the details of the noise generation region in order to better understand the mismatch between spectra measured at various radial locations in the acoustic field. A processing methodology was developed to correct for the effect of the noise source distribution and efficiently compare near-field and far-field spectra with unprecedented accuracy. This technique then demonstrates that the noise levels measured in the physically restricted space of an anechoic chamber can be appropriately
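
The textbook similarity argument behind such scaling is that jet-noise spectra collapse on the Strouhal number St = f*D/U_j, while amplitude follows spherical spreading. A hedged sketch of that argument only, not the thesis's full procedure; all numbers are hypothetical:

```python
import numpy as np

def scale_spectrum(freq, spl, d_model, u_jet, d_full, r_model, r_full):
    """Scale a model-jet far-field spectrum to full scale.

    Frequency collapses on the Strouhal number St = f*D/U_j, so at matched
    jet velocity a model frequency maps to f_full = f_model * D_model/D_full;
    amplitude follows spherical spreading, SPL ~ -20*log10(r).
    """
    st = freq * d_model / u_jet                 # dimensionless frequency
    f_full = st * u_jet / d_full                # back to Hz at full scale
    spl_full = spl + 20.0 * np.log10(r_model / r_full)
    return f_full, spl_full

# Hypothetical numbers: 1-inch model nozzle scaled to a 0.6 m engine
# nozzle, spectra measured at 3 m and predicted at a 100 m sideline.
f = np.logspace(2, 5, 10)                            # 100 Hz - 100 kHz
spl = 110.0 - 10.0 * np.abs(np.log10(f / 3000.0))    # toy spectrum shape
print(scale_spectrum(f, spl, d_model=0.0254, u_jet=500.0,
                     d_full=0.6, r_model=3.0, r_full=100.0)[0][:3])
```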

  5. Epidemiology of multiple chronic conditions: an international perspective.

    PubMed

    Schellevis, François G

    2013-01-01

    The epidemiology of multimorbidity, or multiple chronic conditions (MCCs), is one of the research priority areas identified by the U.S. Department of Health and Human Services (HHS) in its Strategic Framework on MCCs. A conceptual model addressing methodological issues leading to a valid measurement of the prevalence rates of MCCs has been developed and applied in descriptive epidemiological studies. Comparing these results with those from prevalence studies performed earlier and in other countries is hampered by methodological limitations. Therefore, this paper aims to put the size and patterns of MCCs in the USA, as established within the HHS Strategic Framework on MCCs, in the perspective of findings on the prevalence of MCCs in other countries. General common trends can be observed: increasing prevalence rates with increasing age, and multimorbidity being the rule rather than the exception at old age. The most frequent combinations of chronic diseases include the most frequently occurring single chronic diseases. New descriptive epidemiological studies will probably not provide new results; therefore, future descriptive studies should focus on the prevalence rates of MCCs in subpopulations, statistical clustering of chronic conditions, and the development of the prevalence rates of MCCs over time. The finding of common trends also indicates the necessary transition to a next phase of MCC research, addressing the quality of care of patients with MCCs from an organizational perspective and with respect to the content of care. Journal of Comorbidity 2013;3:36-40.

  6. Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments

    DTIC Science & Technology

    2016-03-24

    NUWC-NPT Technical Report 12,186, 24 March 2016: "Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments," Anthony B… [Only report-documentation-page and table-of-contents fragments were extracted; listed sections include "Measurements and Experimental Apparatus" and "Sample Preparation."]

  7. Resistances to Scientific Knowledge Production of Comparative Measurements of Dropout and Completion in European Higher Education

    ERIC Educational Resources Information Center

    Carlhed, Carina

    2017-01-01

    The article is a critical sociological analysis of current transnational practices in creating comparable measurements of dropout and completion in higher education, and of the consequences for the conditions of scientific knowledge production on the topic. The analysis revolves around questions of epistemological, methodological and symbolic types…

  8. Conjugate gradient based projection - A new explicit methodology for frictional contact

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Li, Maocheng; Sha, Desong

    1993-01-01

    With special attention to applicability to parallel computation and vectorization, a new and effective explicit approach for linear complementarity formulations, involving a conjugate gradient based projection methodology, is proposed in this study for contact problems with Coulomb friction. The overall objective is to provide an explicit computational methodology for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementarity formulations stems from an established search direction which is projected onto a feasible region determined by the non-negative constraint condition; this direction is then applied within the Fletcher-Reeves conjugate gradient method, resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics and fast computational speed, and is relatively simple to implement for contact problems involving Coulomb friction.
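
A compact sketch of the core idea named here: for symmetric positive definite A, the linear complementarity problem w = Ax + b >= 0, x >= 0, x·w = 0 is equivalent to minimizing the quadratic 0.5 x'Ax + b'x over x >= 0, which a Fletcher-Reeves conjugate gradient recursion with projection onto the nonnegative orthant can solve. This is an illustration of the projected-CG idea under those assumptions, not the paper's full frictional-contact algorithm.

```python
import numpy as np

def projected_fr_cg(A, b, x0, tol=1e-10, max_iter=500):
    """Bound-constrained QP solved by projected Fletcher-Reeves CG.

    The search direction follows Fletcher-Reeves; each trial point is
    projected onto the feasible region x >= 0, and the recursion is
    restarted with steepest descent whenever the projection is active.
    """
    x = np.maximum(x0, 0.0)
    g = A @ x + b          # gradient of 0.5 x'Ax + b'x
    d = -g
    for _ in range(max_iter):
        alpha = (g @ g) / (d @ A @ d)               # exact step along d
        x_new = np.maximum(x + alpha * d, 0.0)      # projection step
        g_new = A @ x_new + b
        # Optimality: gradient ~ 0 on the free set, nonnegative where the
        # constraint is active (complementarity).
        kkt = np.where(x_new > 0, g_new, np.minimum(g_new, 0.0))
        if np.linalg.norm(kkt) < tol:
            return x_new
        if np.any((x + alpha * d) < 0):             # projection was active
            d = -g_new                              # restart: steepest descent
        else:
            beta = (g_new @ g_new) / (g @ g)        # Fletcher-Reeves update
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Toy LCP whose solution has some components pinned at the bound.
A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
b = np.array([-1.0, 2.0, -3.0])
x = projected_fr_cg(A, b, np.zeros(3))
print(x, A @ x + b)   # x >= 0 and complementarity with w = Ax + b
```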

  9. Measurement control workshop instructional materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibbs, Philip; Harvel, Charles; Clark, John

    2012-09-01

    An essential element in an effective nuclear materials control and accountability (MC&A) program is the measurement of the nuclear material as it is received, moved, processed and shipped. Quality measurement systems and methodologies determine the accuracy of the accountability values. Implementation of a measurement control program is essential to ensure that the measurement systems and methodologies perform as expected. A measurement control program also allows for a determination of the level of confidence in the accounting values.

  10. Differing antidepressant maintenance methodologies.

    PubMed

    Safer, Daniel J

    2017-10-01

    The principal evidence that antidepressant medication (ADM) is an effective maintenance treatment for adults with major depressive disorder (MDD) comes from placebo-substitution trials. These trials enter responders from ADM efficacy trials into randomized, double-blind placebo-controlled (RDBPC) effectiveness trials to measure the rate of MDD relapse over time. However, other randomized maintenance trial methodologies merit consideration and comparison. A systematic review of ADM randomized maintenance trials included research reports from multiple databases. Relapse rate was the main effectiveness outcome assessed. Five ADM randomized maintenance methodologies for MDD responders are described and compared for outcome. These effectiveness trials include: placebo substitution, ADM/placebo extension, ADM extension, ADM vs. psychotherapy, and treatment as usual. The placebo-substitution trials, for those abruptly switched to placebo, resulted in unusually high (46%) rates of relapse over 6-12 months, twice the continuing-ADM rate. These trials were characterized by selective screening, high attrition, anxious anticipation of a switch to placebo, and a risk of drug withdrawal symptoms. Selectively screened ADM efficacy responders who entered 4-12 month extension trials experienced relapse rates averaging ~10%, with a low attrition rate. Non-industry-sponsored randomized trials of adults with multiple prior MDD episodes who were treated with ADM maintenance for 1-2 years experienced relapse rates averaging 40%. Placebo-substitution trial methodology represents only one approach to assessing ADM maintenance. Antidepressant maintenance research for adults with MDD should be evaluated for industry sponsorship, attrition, the impact of the switch to placebo, and major relapse differences in MDD subpopulations. Copyright © 2017. Published by Elsevier Inc.

  11. An optimal baseline selection methodology for data-driven damage detection and temperature compensation in acousto-ultrasonics

    NASA Astrophysics Data System (ADS)

    Torres-Arredondo, M.-A.; Sierra-Pérez, Julián; Cabanes, Guénaël

    2016-05-01

    The process of measuring and analysing the data from a distributed sensor network all over a structural system in order to quantify its condition is known as structural health monitoring (SHM). For the design of a trustworthy health monitoring system, a vast amount of information regarding the inherent physical characteristics of the sources and their propagation and interaction across the structure is crucial. Moreover, any SHM system which is expected to transition to field operation must take into account the influence of environmental and operational changes, which cause modifications in the stiffness and damping of the structure and consequently modify its dynamic behaviour. On that account, special attention is paid in this paper to the development of an efficient SHM methodology where robust signal processing and pattern recognition techniques are integrated for the correct interpretation of complex ultrasonic waves within the context of damage detection and identification. The methodology is based on an acousto-ultrasonics technique where the discrete wavelet transform is evaluated for feature extraction and selection, linear principal component analysis for data-driven modelling, and self-organising maps for a two-level clustering under the principle of local density. In the end, the methodology is experimentally demonstrated, and the results show that all damage cases were detectable and identifiable.
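
The first two stages of the pipeline named here (discrete wavelet transform for features, linear PCA for a data-driven model) can be sketched compactly. The wavelet family, decomposition depth, energy features and synthetic waveforms below are illustrative choices, not the paper's configuration; the paper's final stage, self-organising-map clustering of the resulting scores, is not shown.

```python
import numpy as np
import pywt                      # PyWavelets
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)

def wavelet_energy_features(signal, wavelet="db4", level=4):
    """Relative energy per DWT sub-band as a compact damage-sensitive
    feature vector (one common choice for acousto-ultrasonic signals)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

# Hypothetical recordings: 60 baseline waveforms and 60 with a small
# shift in frequency content standing in for a damage state.
t = np.linspace(0, 1e-3, 2048)
baseline = [np.sin(2e5 * 2 * np.pi * t) + 0.3 * rng.normal(size=t.size)
            for _ in range(60)]
damaged = [np.sin(2.3e5 * 2 * np.pi * t) + 0.3 * rng.normal(size=t.size)
           for _ in range(60)]

X = np.array([wavelet_energy_features(s) for s in baseline + damaged])
scores = PCA(n_components=2).fit_transform(X)   # linear data-driven model
print(scores[:60].mean(axis=0), scores[60:].mean(axis=0))
```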

  12. Application of hybrid methodology to rotors in steady and maneuvering flight

    NASA Astrophysics Data System (ADS)

    Rajmohan, Nischint

    Helicopters are versatile flying machines with capabilities unparalleled by fixed-wing aircraft, such as hovering and performing vertical takeoff and landing at unprepared sites. This makes their use especially desirable in military and search-and-rescue operations. However, modern helicopters still suffer from high levels of noise and vibration caused by the physical phenomena occurring in the vicinity of the rotor blades. Improving rotorcraft design to reduce noise and vibration levels therefore requires an understanding of the underlying physical phenomena and accurate prediction of the resulting rotorcraft aeromechanics. The goal of this research is to study the aeromechanics of rotors in steady and maneuvering flight using a hybrid Computational Fluid Dynamics (CFD) methodology. The hybrid CFD methodology uses the Navier-Stokes equations to solve the flow near the blade surface, while the effect of the far wake is computed through a wake model. The hybrid CFD methodology is computationally efficient, and its wake modeling approach is nondissipative, making it an attractive tool for studying rotorcraft aeromechanics. Several enhancements were made to the CFD methodology, and it was coupled to a Computational Structural Dynamics (CSD) methodology to perform trimmed aeroelastic analysis of a rotor in forward flight. The coupling analyses, both loose and tight, were used to identify the key physical phenomena that affect rotors in different steady flight regimes. The modeling enhancements improved airload predictions for a variety of flight conditions. It was found that, for steady flight conditions, the tightly coupled method did not change the loads significantly compared to the loosely coupled method. The coupling methodology was extended to maneuvering flight analysis by enhancing the computational and structural models to handle non-periodic flight conditions and vehicle motions in time-accurate mode. The flight test

  13. The Deflection Plate Analyzer: A Technique for Space Plasma Measurements Under Highly Disturbed Conditions

    NASA Technical Reports Server (NTRS)

    Wright, Kenneth H., Jr.; Dutton, Ken; Martinez, Nelson; Smith, Dennis; Stone, Nobie H.

    2004-01-01

    A technique has been developed to measure the characteristics of space plasmas under highly disturbed conditions, e.g., non-Maxwellian plasmas with strong drifting populations and plasmas contaminated by spacecraft outgassing. The present method extends the capabilities of the Differential Ion Flux Probe (DIFP) to include a mass measurement that does not require either high voltage or contamination-sensitive devices such as channeltron electron multipliers or microchannel plates. This reduces the complexity and expense of instrument fabrication, testing, and integration of flight hardware as compared to classical mass analyzers. The new instrument design is called the Deflection Plate Analyzer (DPA) and can deconvolve multiple ion streams and analyze each stream for ion flux intensity (density), velocity (including direction of motion), mass, and temperature (or energy distribution). The basic functionality of the DPA is discussed. The performance characteristics of a flight instrument as built for an electrodynamic tether mission, the Propulsive Small Expendable Deployer System (ProSEDS), and the instrument's role in measuring key experimental conditions are also discussed.

  14. The Deflection Plate Analyzer: A Technique for Space Plasma Measurements Under Highly Disturbed Conditions

    NASA Technical Reports Server (NTRS)

    Wright, Kenneth H., Jr.; Dutton, Ken; Martinez, Nelson; Smith, Dennis; Stone, Nobie H.

    2003-01-01

    A technique has been developed to measure the characteristics of space plasmas under highly disturbed conditions, e.g., non-Maxwellian plasmas with strong drifting populations and plasmas contaminated by spacecraft outgassing. The present method extends the capabilities of the Differential Ion Flux Probe (DIFP) to include a mass measurement that does not require either high voltage or contamination-sensitive devices such as channeltron electron multipliers or microchannel plates. This reduces the complexity and expense of instrument fabrication, testing, and integration of flight hardware as compared to classical mass analyzers. The new instrument design is called the Deflection Plate Analyzer (DPA) and can deconvolve multiple ion streams and analyze each stream for ion flux intensity (density), velocity (including direction of motion), mass, and temperature (or energy distribution). The basic functionality of the DPA is discussed. The performance characteristics of a flight instrument as built for an electrodynamic tether mission, the Propulsive Small Expendable Deployer System (ProSEDS), and the instrument's role in measuring key experimental conditions are also discussed.

  15. Measuring Individual Differences in Decision Biases: Methodological Considerations

    PubMed Central

    Aczel, Balazs; Bago, Bence; Szollosi, Aba; Foldes, Andrei; Lukacs, Bence

    2015-01-01

    Individual differences in people's susceptibility to heuristics and biases (HB) are often measured by multiple-bias questionnaires consisting of one or a few items for each bias. This research approach relies on the assumptions that (1) different versions of a decision-bias task are interchangeable, as they measure the same cognitive failure; and (2) some combination of these tasks measures the same underlying construct. Based on these assumptions, in Study 1 we developed two versions of a new decision bias survey for which we modified 13 HB tasks to increase their comparability, construct validity, and the participants' motivation. The analysis of the responses (N = 1279) showed weak internal consistency within the surveys and a great level of discrepancy between the extracted patterns of the underlying factors. To explore these inconsistencies, in Study 2 we used three original examples of HB tasks for each of seven biases. We created three decision bias surveys by allocating one version of each HB task to each survey. The participants' responses (N = 527) showed a similar pattern as in Study 1, questioning the assumption that the different examples of the HB tasks are interchangeable and that they measure the same underlying construct. These results emphasize the need to understand the domain-specificity of cognitive biases as well as the effect of the wording of the cover story and the response mode on bias susceptibility before employing them in multiple-bias questionnaires. PMID:26635677
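
The canonical statistic behind an internal-consistency claim like this is Cronbach's alpha (the record does not name the study's exact statistic, so treat this as illustrative). A minimal sketch with hypothetical binary bias scores:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Hypothetical binary bias-susceptibility scores (1 = biased response)
# for 13 heuristics-and-biases tasks; weak inter-item correlation yields
# the kind of low alpha the study reports.
rng = np.random.default_rng(11)
scores = (rng.uniform(size=(1279, 13)) < 0.5).astype(int)
print(f"alpha = {cronbach_alpha(scores):.2f}")   # near 0 for independent items
```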

  16. Defining and Measuring Engagement and Learning in Science: Conceptual, Theoretical, Methodological, and Analytical Issues

    ERIC Educational Resources Information Center

    Azevedo, Roger

    2015-01-01

    Engagement is one of the most widely misused and overgeneralized constructs found in the educational, learning, instructional, and psychological sciences. The articles in this special issue represent a wide range of traditions and highlight several key conceptual, theoretical, methodological, and analytical issues related to defining and measuring…

  17. Methodological challenges in measuring vaccine effectiveness using population cohorts in low resource settings.

    PubMed

    King, C; Beard, J; Crampin, A C; Costello, A; Mwansambo, C; Cunliffe, N A; Heyderman, R S; French, N; Bar-Zeev, N

    2015-09-11

    Post-licensure real-world evaluation of vaccine implementation is important for establishing evidence of vaccine effectiveness (VE) and programme impact, including indirect effects. Large cohort studies offer an important epidemiological approach for evaluating VE, but have inherent methodological challenges. Since March 2012, we have conducted an open prospective cohort study in two sites in rural Malawi to evaluate the post-introduction effectiveness of 13-valent pneumococcal conjugate vaccine (PCV13) against all-cause post-neonatal infant mortality and of monovalent rotavirus vaccine (RV1) against diarrhoea-related post-neonatal infant mortality. Our study sites cover a population of 500,000, with a baseline post-neonatal infant mortality of 25 per 1000 live births. We conducted a methodological review of cohort studies for vaccine effectiveness in a developing country setting, applied to our study context. Based on the published literature, we outline key considerations when defining the denominator (study population), exposure (vaccination status) and outcome ascertainment (mortality and cause of death) of such studies. We assess various definitions in these three domains in terms of their impact on power, effect size and potential biases and their direction, using our cohort study for illustration. Based on this iterative process, we discuss the pros and cons of our final per-protocol analysis plan. Since no single set of definitions or analytical approach accounts for all possible biases, we propose sensitivity analyses to interrogate our assumptions and methodological decisions. In the poorest regions of the world, where routine vital birth and death surveillance are frequently unavailable and the burden of disease and death is greatest, we conclude that, provided the balance between definitions and their overall assumed impact on estimated VE is acknowledged, such large-scale real-world cohort studies can provide crucial information to policymakers by providing
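
The basic cohort estimator underlying such studies is VE = 1 - IRR, the complement of the incidence rate ratio between vaccinated and unvaccinated person-time. A simplified sketch with hypothetical counts; the study's per-protocol analysis additionally handles dose timing, age at vaccination and censoring:

```python
import numpy as np

def vaccine_effectiveness(cases_vax, py_vax, cases_unvax, py_unvax):
    """Cohort VE from person-time incidence rates: VE = 1 - IRR."""
    irr = (cases_vax / py_vax) / (cases_unvax / py_unvax)
    # Approximate 95% CI on log(IRR), assuming Poisson case counts.
    se = np.sqrt(1.0 / cases_vax + 1.0 / cases_unvax)
    lo, hi = np.exp(np.log(irr) - 1.96 * se), np.exp(np.log(irr) + 1.96 * se)
    return 1.0 - irr, (1.0 - hi, 1.0 - lo)

# Hypothetical numbers: post-neonatal infant deaths per child-years at risk.
ve, ci = vaccine_effectiveness(cases_vax=60, py_vax=4000,
                               cases_unvax=95, py_unvax=3800)
print(f"VE = {ve:.0%} (95% CI {ci[0]:.0%} to {ci[1]:.0%})")
```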

  18. Determining Faculty and Student Views: Applications of Q Methodology in Higher Education

    ERIC Educational Resources Information Center

    Ramlo, Susan

    2012-01-01

    William Stephenson specifically developed Q methodology, or Q, as a means of measuring subjectivity. Q has been used to determine perspectives/views in a wide variety of fields from marketing research to political science but less frequently in education. In higher education, the author has used Q methodology to determine views about a variety of…

  19. Methodological factors conducting research with incarcerated persons with diabetes.

    PubMed

    Reagan, Louise; Shelton, Deborah

    2016-02-01

    The aim of this study was to describe methodological issues specific to conducting research with incarcerated vulnerable populations who have diabetes. Much has been written about the ethical and logistical challenges of conducting research with vulnerable incarcerated populations. However, conducting research with incarcerated persons with diabetes is associated with additional issues related to research design, measurement, sampling and recruitment, and data collection procedures. A cross-sectional study examining the relationships of diabetes knowledge, illness representation and self-care behaviors with glycemic control in 124 incarcerated persons was conducted and serves as the basis for describing methodological factors in the conduct of research with an incarcerated population with diabetes. Within this incarcerated population with diabetes, methodological challenges included sampling bias due to gender inequity, recruitment of participants not using insulin, self-reported vision impairment, and a lack of standardized instruments, especially for measuring diabetes self-care. Clinical factors identified as potential barriers to study conduct included the risk of hypoglycemia due to insulin timing and other activities. Conducting research with incarcerated persons diagnosed with diabetes requires attention to a set of methodological concerns above and beyond the ethical and legal regulations for protecting the rights of this vulnerable population. To increase opportunities for conducting rigorous as well as facility- and patient-friendly research, researchers need to blend their knowledge of diabetes with an understanding of prison rules and routines. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Methodological Challenges in Studies Examining the Effects of Breakfast on Cognitive Performance and Appetite in Children and Adolescents.

    PubMed

    Adolphus, Katie; Bellissimo, Nick; Lawton, Clare L; Ford, Nikki A; Rains, Tia M; Totosy de Zepetnek, Julia; Dye, Louise

    2017-01-01

    Breakfast is purported to confer a number of benefits on diet quality, health, appetite regulation, and cognitive performance. However, new evidence has challenged the long-held belief that breakfast is the most important meal of the day. This review aims to provide a comprehensive discussion of the key methodological challenges and considerations in studies assessing the effect of breakfast on cognitive performance and appetite control, along with recommendations for future research. The review focuses on the myriad challenges involved in studying children and adolescents specifically. Key methodological challenges and considerations include study design and location, sampling and sample selection, the choice of objective cognitive tests, the choice of objective and subjective appetite measures, the merits of providing a fixed breakfast compared with ad libitum intake, the assessment and definition of habitual breakfast consumption, transparency of the treatment condition, the difficulty of isolating the direct effects of breakfast consumption, untangling acute and chronic effects, and the influence of confounding variables. These methodological challenges have hampered a clear substantiation of the potential positive effects of breakfast on cognition and appetite control and have contributed to the debate questioning the notion that breakfast is the most important meal of the day. © 2017 American Society for Nutrition.