Science.gov

Sample records for conditions measurement methodology

  1. Methodology for measuring exhaust aerosol size distributions using an engine test under transient operating conditions

    NASA Astrophysics Data System (ADS)

    Desantes, José María; Bermúdez, Vicente; Molina, Santiago; Linares, Waldemar G.

    2011-11-01

    A study on the sources of variability in the measurement of particle size distribution using a two-stage dilution system and an engine exhaust particle sizer was conducted to obtain a comprehensive and repeatable methodology that can be used to measure the particle size distribution of aerosols emitted by a light-duty diesel engine under transient operating conditions. The paper includes three experimental phases: an experimental validation of the measurement method; an evaluation of the influence of sampling factors, such as dilution system pre-conditioning; and a study of the effects of the dilution conditions, such as the dilution ratio and the dilution air temperature. An examination of the type and degree of influence of each studied factor is presented, recommendations for reducing variability are given and critical parameter values are identified to develop a highly reliable measurement methodology that could be applied to further studies on the effect of engine operating parameters on exhaust particle size distributions.
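
The two-stage dilution at the heart of the methodology implies a simple back-calculation from measured to raw-exhaust concentrations; a minimal sketch (the function name and the dilution ratios are illustrative, not values from the paper):

```python
def raw_exhaust_concentration(measured, primary_dr, secondary_dr):
    """Scale a diluted particle concentration (#/cm^3) back to raw exhaust.

    The total dilution ratio of a two-stage system is the product of the
    per-stage ratios, assuming no particle losses between stages.
    """
    return measured * primary_dr * secondary_dr

# hypothetical EEPS reading behind a 10:1 and a 6:1 dilution stage
print(raw_exhaust_concentration(1.0e5, 10, 6))  # prints 6000000.0
```

Since the study identifies the dilution ratio itself as a critical variability factor, in practice each stage's ratio would be measured rather than assumed.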

  2. Methodological Development on the Conditional Sampling Method: Application to NOx Fluxes Measured during the ESCOMPTE Campaign

    NASA Astrophysics Data System (ADS)

    Fotiadi, A.; Lohou, F.; Serça, D.; Druilhet, A.; Laville, P.; Bouchou, P.; Lopez, A.

    Surface fluxes of reactive nitrogen oxides (NOx = NO + NO2) are essential to quantify the net impact on the nitrogen and ozone budgets in the atmospheric boundary layer. To accurately establish their sources and sinks, specific measurement methods have to be developed that take into account the sensors' characteristics (e.g. time response). The most direct method to measure energy and gas fluxes is the Eddy Correlation (EC) method, based on the covariance between the vertical wind velocity (w) fluctuations and the scalar (X) fluctuations. The EC method requires fast-response sensors that are not available for many trace gases (such as NOx). The Relaxed Eddy Accumulation (REA), or conditional sampling, technique was proposed as an alternative to overcome this problem. A system for conditional sampling at the field scale was developed and applied to determine NOx fluxes in different Mediterranean ecosystems in the framework of the ESCOMPTE experimental campaign (June-July 2001). To ensure accuracy in the flux calculations, a methodological approach to data analysis has been developed. This approach is based on the statistical characteristics, internal structure and spectral analysis of turbulent functions. It allows us to establish data selection criteria related to homogeneity, stationarity and turbulence characterisation. These criteria, which concern statistical characteristics of w recorded in real time during the sampling period, have been related to existing stability conditions. Assuming similarity between the 'slow' scalar related to the REA method and the 'fast' scalars related to EC (e.g. H2O, CO2, O3), other criteria based on covariance convergence can be established as well to improve the quality of the REA measurements. Indeed, data analysis shows that the H2O, CO2 and O3 functions are highly correlated (correlation coefficient on the order of 0.9 in absolute value), which confirms the similarity assumption.
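
The EC and REA flux estimates described above can be sketched in a few lines; the sample data and the commonly quoted REA coefficient beta ≈ 0.6 are illustrative assumptions, not values from the study:

```python
from statistics import mean, pstdev

def ec_flux(w, c):
    """Eddy-covariance flux: the mean of w'c', i.e. the covariance
    between vertical wind velocity w and scalar concentration c."""
    wm, cm = mean(w), mean(c)
    return mean((wi - wm) * (ci - cm) for wi, ci in zip(w, c))

def rea_flux(w, c, beta=0.6):
    """Relaxed eddy accumulation: F = beta * sigma_w * (c_up - c_down),
    where c_up / c_down are the mean concentrations accumulated during
    updrafts (w > 0) and downdrafts (w < 0)."""
    up = [ci for wi, ci in zip(w, c) if wi > 0]
    down = [ci for wi, ci in zip(w, c) if wi < 0]
    return beta * pstdev(w) * (mean(up) - mean(down))

w = [0.5, -0.4, 0.3, -0.6, 0.2, -0.1]           # vertical wind (m/s)
c = [402.0, 398.0, 401.0, 397.5, 400.5, 399.0]  # scalar, e.g. CO2 (ppm)
print(ec_flux(w, c), rea_flux(w, c))
```

The covariance-convergence criterion mentioned in the abstract amounts to checking that `ec_flux` computed on a fast scalar stabilizes over the averaging period before trusting the REA result for the slow scalar.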

  3. Site-conditions map for Portugal based on VS measurements: methodology and final model

    NASA Astrophysics Data System (ADS)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, Carlos

    2017-04-01

    In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies used to characterize the Vs structure at the sites in the database include seismic refraction, multichannel analysis of surface waves and refraction microtremor. Invasive measurements were performed in selected locations in order to compare the Vs profiles obtained from invasive and non-invasive techniques. In general there was good agreement in the subsurface Vs30 structure obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50,000 and 1:500,000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is a three-step process: defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether the differences in the Vs30 distributions are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling, and therefore a declustering algorithm was applied. The final model includes three geological units: 1) igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations.
The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and
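
Vs30, the quantity underlying the map, is conventionally the time-averaged shear-wave velocity over the top 30 m, Vs30 = 30 / Σ(h_i/Vs_i); a minimal sketch with an invented velocity profile:

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m.

    layers: list of (thickness_m, vs_m_per_s) tuples from the surface
    down; the profile is clipped at 30 m, or the deepest layer is
    extended if the profile is shallower than 30 m.
    """
    travel_time, depth = 0.0, 0.0
    for h, v in layers:
        h = min(h, 30.0 - depth)          # clip at 30 m depth
        travel_time += h / v
        depth += h
        if depth >= 30.0:
            break
    if depth < 30.0:                       # extend the last layer
        travel_time += (30.0 - depth) / layers[-1][1]
    return 30.0 / travel_time

# hypothetical profile: 5 m soil over 10 m weathered rock over bedrock
print(round(vs30([(5, 200), (10, 500), (40, 1200)])))  # 522
```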

  4. Development and validation of a methodology for intracellular pH measurements of hybridoma cells under bioreactor culture conditions.

    PubMed

    Cherlet, M; Franck, P; Nabet, P; Marc, A

    1999-01-01

    The intracellular pH (pH(i)) is an important factor in the regulation of different cellular processes. It might therefore be used as a marker of the physiological state of cells cultivated in a bioreactor environment. We therefore developed and validated a methodology that permits a reproducible and reliable pH(i) measurement under such bioreactor culture conditions, in contrast to earlier reported measurements carried out on cells resuspended in buffers under nongrowth conditions. The hybridoma cells were sampled from the culture, stained with the pH-sensitive dye BCECF-AM (BCECF = 2',7'-bis-carboxyethyl-5,6-carboxyfluorescein), and analyzed by flow cytometry. Such a measurement is susceptible to changes in the cells between the moment of sampling and the final analysis on the flow cytometer. All intermediate steps were therefore studied in detail, either to determine the optimal conditions to be used or to characterize their influence on the final pH(i) value measured. Additional experiments were carried out showing that the measured pH(i) value is representative of the pH(i) the cells really possess in the culture at the moment of sampling.

  5. Health Measurement Scales: Methodological Issues

    PubMed Central

    Panagiotakos, Demosthenes

    2009-01-01

    Health scales or indices are composite tools aiming to measure a variety of clinical conditions, behaviors, attitudes and beliefs that are difficult to measure quantitatively. In recent years, these tools have been extensively used in cardiovascular disease prevention. The scales proposed so far have shown good ability in assessing individual characteristics, but moderate predictive ability in relation to the development of chronic diseases and various other health outcomes. In this review, methodological issues in the development of health scales are discussed: specifically, the selection of the appropriate number of components, the selection of classes for each component, the use of weights for scale components, and the role of intra- or inter-correlation between components. Based on the current literature, the use of components with a large number of classes, specific weights for each scale component, and a low-to-moderate inter-correlation between components is suggested in order to increase the diagnostic accuracy of the tool. PMID:20054421
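
As an illustration of the scale-construction choices discussed (component weights and class counts), a composite health score is ultimately a weighted sum of per-component class scores; the components and weights below are invented, not from any scale in the review:

```python
def composite_score(values, weights=None):
    """Weighted sum of component class scores for a composite health index.

    values:  per-component class scores (e.g. each scored 0..4)
    weights: optional per-component weights; defaults to equal weighting
    """
    if weights is None:
        weights = [1.0] * len(values)
    if len(weights) != len(values):
        raise ValueError("one weight per component")
    return sum(w * v for w, v in zip(weights, values))

# hypothetical 4-component diet score, one component weighted double
print(composite_score([3, 1, 2, 4], weights=[1, 1, 1, 2]))  # 14
```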

  6. Development of the methodology of exhaust emissions measurement under RDE (Real Driving Emissions) conditions for non-road mobile machinery (NRMM) vehicles

    NASA Astrophysics Data System (ADS)

    Merkisz, J.; Lijewski, P.; Fuc, P.; Siedlecki, M.; Ziolkowski, A.

    2016-09-01

    The paper analyzes the exhaust emissions from farm vehicles based on research performed under field conditions (RDE) according to the NTE procedure. This analysis has shown that it is hard to meet the NTE requirements under field conditions (engine operation in the NTE zone for at least 30 seconds). Due to the very high variability of the engine conditions, the share of valid NTE windows is small throughout the entire field test. For this reason, a modification of the measurement and exhaust emissions calculation methodology has been proposed for farm vehicles of the NRMM group. A test has been developed composed of the following phases: trip to the operation site (paved roads) and field operations (including u-turns and maneuvering). The range of the operation time share in individual test phases has been determined. A change in the method of calculating the real exhaust emissions has also been implemented in relation to the NTE procedure.
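
The NTE validity criterion quoted above (at least 30 s of continuous operation in the NTE zone) amounts to run-length filtering of a sampled in-zone flag; a sketch assuming a 1 Hz signal (the flag series is invented):

```python
def valid_nte_events(in_zone, min_duration_s=30, sample_rate_hz=1):
    """Count contiguous runs where the engine stays in the NTE zone
    for at least min_duration_s, given a boolean sample series."""
    needed = min_duration_s * sample_rate_hz
    count = run = 0
    for flag in in_zone:
        if flag:
            run += 1
        else:
            if run >= needed:
                count += 1
            run = 0
    if run >= needed:          # run still open at end of test
        count += 1
    return count

# 1 Hz flags: a 35 s run, a 10 s run, then a 30 s run -> 2 valid events
flags = [True] * 35 + [False] * 5 + [True] * 10 + [False] * 3 + [True] * 30
print(valid_nte_events(flags))  # 2
```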

  7. Methodology for determining CD-SEM measurement condition of sub-20nm resist patterns for 0.33NA EUV lithography

    NASA Astrophysics Data System (ADS)

    Okai, Nobuhiro; Lavigne, Erin; Hitomi, Keiichiro; Halle, Scott; Hotta, Shoji; Koshihara, Shunsuke; Tanaka, Junichi; Bailey, Todd

    2015-03-01

    A novel methodology was established for determining the optimum critical dimension scanning electron microscope (CD-SEM) measurement condition for sub-20 nm resist patterns in 0.33NA EUV lithography, yielding both small shrinkage and high precision. To investigate the dependency of resist shrinkage on pattern size and electron beam irradiation condition, the shrinkage of 18, 32, and 45 nm EUV resist patterns was measured over a wide range of beam conditions. A shrinkage trend similar to that of ArF resist patterns was observed for 32 and 45 nm, but the 18 nm pattern showed a different dependence on acceleration voltage. The conventional methodology developed for ArF resist patterns, which predicts shrinkage and precision using the Taguchi method, was applied to EUV resist patterns to examine its extendibility. Shrinkage predicted by the Taguchi method for the 32 and 45 nm patterns agreed with measurements. However, the prediction error increases considerably as the pattern size decreases from 32 to 18 nm, because there is a significant interaction between acceleration voltage and irradiated electron dose in the L18 array used in the Taguchi method. We therefore proposed a new method that consists of separate prediction procedures for shrinkage and precision, using a shrinkage curve and the Taguchi method, respectively. The new method was applied to the 18 nm EUV resist pattern, and an optimum measurement condition with a shrinkage of 1.5 nm and a precision of 0.12 nm was determined. Our new method is a versatile technique applicable not only to fine EUV resist patterns but also to ArF resist patterns.
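
A shrinkage-curve extrapolation of the kind the new method relies on can be illustrated with the common first-order model CD_n = CD_inf + A·r^n, where n is the number of SEM measurement frames; the model form and the numbers here are illustrative assumptions, not the paper's data:

```python
def extrapolate_cd0(cd1, cd2, cd3):
    """Extrapolate a resist CD to zero electron dose from CDs measured
    after 1, 2 and 3 SEM frames, assuming exponential shrinkage
    CD_n = CD_inf + A * r**n with 0 < r < 1 (three equally spaced
    points determine the model analytically)."""
    r = (cd3 - cd2) / (cd2 - cd1)       # common ratio of successive drops
    a_r = (cd2 - cd1) / (r - 1.0)       # equals A * r
    cd_inf = cd1 - a_r                  # fully shrunk CD
    return cd_inf + a_r / r             # CD at n = 0 (no shrinkage)

# synthetic data generated from CD_inf = 16.0, A = 2.0, r = 0.5
print(extrapolate_cd0(17.0, 16.5, 16.25))  # prints 18.0
```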

  8. The antioxidant activity of teas measured by the FRAP method adapted to the FIA system: optimising the conditions using the response surface methodology.

    PubMed

    Martins, Alessandro C; Bukman, Lais; Vargas, Alexandro M M; Barizão, Érica O; Moraes, Juliana C G; Visentainer, Jesuí V; Almeida, Vitor C

    2013-05-01

    This study proposes a FRAP assay adapted to a FIA system with a merging-zones configuration. The FIA system conditions were optimised with the response surface methodology using a central composite rotatable design. The parameters studied were the carrier flow rate, the lengths of the sample and reagent loops, and the reactor length. The conditions selected in accordance with the results were: a carrier flow rate of 1.00 ml/min, a loop length of 18.2 cm and a reaction coil length of 210.1 cm. The detection and quantification limits were, respectively, 28.6 and 86.8 μmol/l Fe(2+), and the precision was 1.27%. The proposed method had an analytical frequency of 30 samples/h and consumed about 95% less FRAP reagent. The FRAP assay adapted to the FIA system under the optimised conditions was used to determine the antioxidant activity of tea samples.
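
The detection and quantification limits quoted are consistent with the usual calibration-curve definitions (LOD = 3.3σ/S, LOQ = 10σ/S); a sketch with invented blank-noise and slope values, not the paper's data:

```python
def detection_limits(sigma_blank, slope):
    """IUPAC-style limits of detection and quantification for a linear
    calibration: LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where sigma is
    the standard deviation of the blank response and S is the
    calibration slope (response per concentration unit)."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

# hypothetical blank noise and calibration slope (absorbance units)
lod, loq = detection_limits(sigma_blank=0.03, slope=0.003)
print(lod, loq)
```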

  9. Methodological Quality of National Guidelines for Pediatric Inpatient Conditions

    PubMed Central

    Hester, Gabrielle; Nelson, Katherine; Mahant, Sanjay; Eresuma, Emily; Keren, Ron; Srivastava, Rajendu

    2014-01-01

    Background Guidelines help inform standardization of care for quality improvement (QI). The Pediatric Research in Inpatient Settings (PRIS) network published a prioritization list of inpatient conditions with high prevalence, cost, and variation in resource utilization across children’s hospitals. The methodological quality of guidelines for priority conditions is unknown. Objective To rate the methodological quality of national guidelines for 20 priority pediatric inpatient conditions. Design We searched sources including PubMed for national guidelines published 2002–2012. Guidelines specific to one organism, test or treatment, or institution were excluded. Guidelines were rated by two raters using a validated tool (AGREE II) with an overall rating on a 7-point scale (7 = highest). Inter-rater reliability was measured with a weighted kappa coefficient. Results 17 guidelines met inclusion criteria for 13 conditions; the remaining 7 conditions yielded no relevant national guidelines. The highest methodological quality guidelines were for asthma, tonsillectomy, and bronchiolitis (mean overall ratings 7, 6.5, and 6.5, respectively); the lowest were for sickle cell disease (2 guidelines) and dental caries (mean overall ratings 4, 3.5, and 3, respectively). The overall weighted kappa was 0.83 (95% confidence interval 0.78–0.87). Conclusions We identified a group of moderate to high methodological quality national guidelines for priority pediatric inpatient conditions. Hospitals should consider these guidelines to inform QI initiatives. PMID:24677729
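
The inter-rater reliability statistic reported (weighted kappa) can be computed from paired overall ratings; a minimal sketch with invented ratings on the 7-point scale (the abstract does not give the raw ratings):

```python
def weighted_kappa(r1, r2, k, quadratic=True):
    """Cohen's weighted kappa for two raters scoring on a 1..k scale.

    Disagreement weights are w(i, j) = (|i-j|/(k-1))**p with p = 2
    (quadratic) or p = 1 (linear); kappa = 1 - observed/expected
    weighted disagreement."""
    n = len(r1)
    p = 2 if quadratic else 1
    w = lambda i, j: (abs(i - j) / (k - 1)) ** p
    # observed disagreement over the paired ratings
    obs = sum(w(a, b) for a, b in zip(r1, r2)) / n
    # expected disagreement from the marginal rating distributions
    exp = sum(w(a, b) for a in r1 for b in r2) / (n * n)
    return 1.0 - obs / exp

a = [7, 6, 5, 3, 2, 7, 4]   # hypothetical rater 1 overall ratings
b = [6, 6, 5, 4, 2, 7, 4]   # hypothetical rater 2 overall ratings
print(weighted_kappa(a, b, k=7))
```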

  10. Methodological Comparison between a Novel Automatic Sampling System for Gas Chromatography versus Photoacoustic Spectroscopy for Measuring Greenhouse Gas Emissions under Field Conditions

    PubMed Central

    Schmithausen, Alexander J.; Trimborn, Manfred; Büscher, Wolfgang

    2016-01-01

    Trace gases such as nitrous oxide (N2O), methane (CH4), and carbon dioxide (CO2) are climate-related gases, and their emissions from agricultural livestock barns are not negligible. Conventional field measurement systems (Fourier transform infrared spectroscopy (FTIR); photoacoustic system (PAS)) are not sufficiently sensitive to N2O. Laser-based measurement systems are highly accurate, but they are very expensive to purchase and maintain. One cost-effective alternative is gas chromatography (GC) with electron capture detection (ECD), but this is not suitable for field applications because of the detector's radioactive source. Measuring samples collected automatically under field conditions in the laboratory at a later time presents many challenges. This study presents a sampling system designed to support laboratory analysis of N2O concentrations sampled under field conditions. Analyses were carried out using PAS in the field (online system) and GC in the laboratory (offline system). Both measurement systems showed a good correlation for CH4 and CO2 concentrations. Measured N2O concentrations were near the detection limit for PAS. GC achieved more reliable results for N2O in very low concentration ranges. PMID:27706101
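
The between-method agreement reported for CH4 and CO2 is typically summarized by a correlation coefficient on paired readings; a sketch with invented paired values (not the study's data):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between paired measurements from two methods."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# invented paired CH4 readings (ppm): PAS online vs GC offline
pas = [12.1, 15.4, 9.8, 20.2, 17.5]
gc  = [11.8, 15.9, 10.1, 19.7, 17.2]
print(round(pearson_r(pas, gc), 3))
```

For a genuine method comparison, a Bland-Altman analysis of the paired differences would normally accompany the correlation.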

  11. A new condition specific quality of life measure for the blind and the partially sighted in Sub-Saharan Africa, the IOTAQOL: methodological aspects of the development procedure.

    PubMed

    Leplège, Alain; Schemann, Jean François; Diakité, Bah; Touré, Ousmane; Ecosse, Emmanuel; Jaffré, Yannick; Dumestre, Gérard

    2006-10-01

    In Mali, blind and partially sighted people represent 1.2% of the population. Good-quality, low-cost ophthalmologic care is available but, unfortunately, insufficiently taken advantage of. To contribute to the analysis of this situation, a valid and reliable questionnaire was needed to take the patient's perspective into account. Because of face validity concerns, it was not possible to merely translate an existing questionnaire. We thus decided to develop a new questionnaire directly in one of the main languages of Mali, Bambara. This involved setting up a study team composed of social and health science specialists, the majority of whom were native Bambara speakers. The overall project consisted of iterating three main steps: (1) conceptual clarification and operationalization of the quality-of-life concept; (2) qualitative steps: qualitative interviews, focus groups and content analysis; (3) quantitative steps: statistical analysis of an initial try-out survey (143 participants) and a validation survey (420 participants). This approach yielded satisfactory results: the final version of the IOTAQOL has good psychometric properties. This interviewer-administered instrument can thus be used to measure health-related quality of life in Mali, and the methodology we used could serve as a basis for similar projects.

  12. A Wavelet-Based Methodology for Grinding Wheel Condition Monitoring

    SciTech Connect

    Liao, T. W.; Ting, C.F.; Qu, Jun; Blau, Peter Julian

    2007-01-01

    Grinding wheel surface condition changes as more material is removed. This paper presents a wavelet-based methodology for grinding wheel condition monitoring based on acoustic emission (AE) signals. Grinding experiments in creep feed mode were conducted to grind alumina specimens with a resinoid-bonded diamond wheel using two different conditions. During the experiments, AE signals were collected when the wheel was 'sharp' and when the wheel was 'dull'. Discriminant features were then extracted from each raw AE signal segment using the discrete wavelet decomposition procedure. An adaptive genetic clustering algorithm was finally applied to the extracted features in order to distinguish different states of grinding wheel condition. The test results indicate that the proposed methodology can achieve 97% clustering accuracy for the high material removal rate condition, 86.7% for the low material removal rate condition, and 76.7% for the combined grinding conditions if the base wavelet, the decomposition level, and the GA parameters are properly selected.
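
As a simplified illustration of the feature-extraction step (the paper selects its own base wavelet and decomposition level and then applies an adaptive genetic clustering algorithm), per-level detail energies from a Haar discrete wavelet transform of an AE segment look like this; the signal values are invented:

```python
from math import sqrt

def haar_step(x):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient lists."""
    approx = [(x[i] + x[i + 1]) / sqrt(2) for i in range(0, len(x) - 1, 2)]
    detail = [(x[i] - x[i + 1]) / sqrt(2) for i in range(0, len(x) - 1, 2)]
    return approx, detail

def wavelet_energy_features(x, levels=3):
    """Per-level detail-band energies of an AE signal segment, a common
    discriminant feature for sharp-vs-dull wheel classification."""
    feats = []
    for _ in range(levels):
        x, d = haar_step(x)
        feats.append(sum(c * c for c in d))
    return feats

signal = [0.1, 0.9, -0.2, 1.1, 0.0, 0.8, -0.1, 1.2]  # toy AE samples
print(wavelet_energy_features(signal))
```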

  13. Directional reflectance characterization facility and measurement methodology.

    PubMed

    McGuckin, B T; Haner, D A; Menzies, R T; Esproles, C; Brothers, A M

    1996-08-20

    A precision reflectance characterization facility, constructed specifically for the measurement of the bidirectional reflectance properties of Spectralon panels planned for use as in-flight calibrators on the NASA Multiangle Imaging Spectroradiometer (MISR) instrument, is described. The incident linearly polarized radiation is provided at three laser wavelengths: 442, 632.8, and 859.9 nm. Each beam is collimated when incident on the Spectralon. The illuminated area of the panel is viewed with a silicon photodetector that revolves around the panel (360°) on a 30-cm boom extending from a common rotational axis. The reflected radiance detector signal is ratioed with the signal from a reference detector to minimize the effect of amplitude instabilities in the laser sources. This and other measures adopted to reduce noise have resulted in a bidirectional reflection function (BRF) calibration facility with a measurement precision of ±0.002 at the 1σ confidence level. The Spectralon test piece panel is held in a computer-controlled three-axis rotational assembly capable of a full 360° rotation in the horizontal plane and 90° in the vertical. The angular positioning system has a repeatability and resolution of 0.001°. Design details and an outline of the measurement methodology are presented.
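
The ratioing scheme used to suppress laser amplitude instabilities divides each reflected-radiance reading by the simultaneous reference-detector reading, so that common-mode fluctuations cancel; a toy sketch with invented detector counts:

```python
def normalized_brf_signal(sample_counts, reference_counts):
    """Ratio each reflected-radiance detector reading to the reference
    detector reading from the same laser sampling instant, cancelling
    common-mode laser amplitude fluctuations."""
    return [s / r for s, r in zip(sample_counts, reference_counts)]

# the same 2% laser power drift appears in both channels and divides out
sample    = [1.000, 1.020, 0.980]   # reflected-beam detector (a.u.)
reference = [2.000, 2.040, 1.960]   # reference detector (a.u.)
print(normalized_brf_signal(sample, reference))  # [0.5, 0.5, 0.5]
```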

  14. Relative Hazard and Risk Measure Calculation Methodology

    SciTech Connect

    Stenner, Robert D.; Strenge, Dennis L.; Elder, Matthew S.; Andrews, William B.; Walton, Terry L.

    2003-09-15

    The RHRM equations, as represented in the methodology and code presented in this report, are primarily a collection of key factors normally used in risk assessment that are relevant to understanding the hazards and risks associated with projected mitigation, cleanup, and risk management activities. The RHRM code has broad application potential. For example, it can be used to compare one mitigation, cleanup, or risk management activity with another, instead of comparing each only to the fixed baseline. If the appropriate source term data are available, it can be used in its non-ratio form to estimate absolute values of the associated controlling hazards and risks. These estimated values of controlling hazards and risks can then be examined to help understand which mitigation, cleanup, or risk management activities are addressing the higher hazard conditions and risk reduction potential at a site. Graphics can be generated from these absolute controlling hazard and risk values to graphically compare these high hazard and risk reduction potential conditions. If the RHRM code is used in this manner, care must be taken to specifically define and qualify (e.g., identify which factors were considered and which ones tended to drive the hazard and risk estimates) the resultant absolute controlling hazard and risk values.
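
In its ratio form, the measure reduces to dividing an activity's controlling hazard by the fixed-baseline value; a trivial hedged sketch with invented numbers (the actual RHRM equations combine many more risk-assessment factors):

```python
def relative_hazard(activity_hazard, baseline_hazard):
    """Ratio form of a relative-hazard measure: the controlling hazard
    of a mitigation/cleanup alternative relative to the fixed baseline.
    Values below 1 indicate the alternative lowers the hazard."""
    return activity_hazard / baseline_hazard

# hypothetical controlling-hazard values (arbitrary consistent units)
print(relative_hazard(2.5, 10.0))  # 0.25
```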

  15. PIMM: A Performance Improvement Measurement Methodology

    SciTech Connect

    Not Available

    1994-05-15

    This report presents a Performance Improvement Measurement Methodology (PIMM) for measuring and reporting the mission performance of organizational elements of the U.S. Department of Energy to comply with the Chief Financial Officer's Act (CFOA) of 1990 and the Government Performance and Results Act (GPRA) of 1993. The PIMM is illustrated by application to the Morgantown Energy Technology Center (METC), a Research, Development and Demonstration (RD&D) field center of the Office of Fossil Energy, along with limited applications to the Strategic Petroleum Reserve Office and the Office of Fossil Energy. METC is now implementing the first year of a pilot project under GPRA using the PIMM. The PIMM process is applicable to all elements of the Department; organizations may customize measurements to their specific missions. The PIMM has four aspects: (1) an achievement measurement that applies to any organizational element, (2) key indicators that apply to institutional elements, (3) a risk reduction measurement that applies to all RD&D elements and to elements with long-term activities leading to risk-associated outcomes, and (4) a cost performance evaluation. Key indicators show how close the institution is to attaining long-range goals. Risk reduction analysis is especially relevant to RD&D. Product risk is defined as the chance that the product of new technology will not meet the requirements of the customer. RD&D is conducted to reduce technology risks to acceptable levels. The PIMM provides a profile to track risk reduction as RD&D proceeds. Cost performance evaluations provide a measurement of the expected costs of outcomes relative to their actual costs.

  16. Advances in phase measurement methodology for MOEMS testing

    NASA Astrophysics Data System (ADS)

    Pryputniewicz, Ryszard J.

    2010-04-01

    Advances in emerging technologies of microelectromechanical systems (MEMS) in general, and micro-opto-electro-mechanical systems (MOEMS) in particular, present some of the most challenging tasks in today's experimental mechanics. More specifically, development of these miniature devices requires sophisticated design, analysis, fabrication, testing, and characterization tools that have multiphysics and multiscale capabilities. One approach employed in the development of microsystems of current interest is based on hybridization of analytical, computational, and experimental solutions. In particular, the experimental solutions rely on recent advances in the optoelectronic laser interferometric microscope (OELIM) methodology based on phase measurements. This hybrid approach is applicable under static and dynamic conditions and, as such, facilitates measurements of the dynamic and thermomechanical behavior of individual components, their packages, and other complex material structures. In this paper, the phase measurement methodology is described and its applicability is illustrated using representative micro-samples, with emphasis on MOEMS. Shape, deformations, and motions of these samples are measured quantitatively in near real-time. Preliminary theoretical results correlate with the experimental results well within the criteria governed by the uncertainty analysis. Validated correlations will lead to the establishment of a "design by analysis" methodology for efficient and effective development of structures, which is becoming increasingly necessary, especially for the satisfaction of microsystem reliability assessment (MRA) requirements. All in all, representative results presented in this paper indicate that the phase measurement methodology is a viable tool for micro-scale measurements and, as such, is particularly useful for the development of microsystems.
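
Interferometric phase measurement of the kind OELIM relies on is commonly performed with phase-shifting algorithms; as an illustrative (not paper-specific) example, the standard four-step algorithm recovers the fringe phase from four intensity frames captured with 90° shifts:

```python
from math import atan2, cos, pi

def four_step_phase(i1, i2, i3, i4):
    """Four-step phase-shifting algorithm: recover the fringe phase phi
    from frames I_k = A + B*cos(phi + (k-1)*pi/2), k = 1..4, via
    phi = atan2(I4 - I2, I1 - I3)."""
    return atan2(i4 - i2, i1 - i3)

# synthetic frames for background A = 2, modulation B = 1, phi = 0.7 rad
phi = 0.7
frames = [2 + cos(phi + k * pi / 2) for k in range(4)]
print(four_step_phase(*frames))  # ~0.7 rad
```

Surface shape or deformation then follows from the unwrapped phase map scaled by the interferometer's wavelength-dependent sensitivity.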

  17. Relative Hazard and Risk Measure Calculation Methodology

    SciTech Connect

    Stenner, Robert D.; Strenge, Dennis L.; Elder, Matthew S.

    2004-03-20

    The relative hazard (RH) and risk measure (RM) methodology and computer code is a health risk-based tool designed to allow managers and environmental decision makers the opportunity to readily consider human health risks (i.e., public and worker risks) in their screening-level analysis of alternative cleanup strategies. Environmental management decisions involve consideration of costs, schedules, regulatory requirements, health hazards, and risks. The RH-RM tool is a risk-based environmental management decision tool that allows managers the ability to predict and track health hazards and risks over time as they change in relation to mitigation and cleanup actions. Analysis of the hazards and risks associated with planned mitigation and cleanup actions provides a baseline against which alternative strategies can be compared. This new tool allows managers to explore “what if scenarios,” to better understand the impact of alternative mitigation and cleanup actions (i.e., alternatives to the planned actions) on health hazards and risks. This new tool allows managers to screen alternatives on the basis of human health risk and compare the results with cost and other factors pertinent to the decision. Once an alternative or a narrow set of alternatives are selected, it will then be more cost-effective to perform the detailed risk analysis necessary for programmatic and regulatory acceptance of the selected alternative. The RH-RM code has been integrated into the PNNL developed Framework for Risk Analysis In Multimedia Environmental Systems (FRAMES) to allow the input and output data of the RH-RM code to be readily shared with the more comprehensive risk analysis models, such as the PNNL developed Multimedia Environmental Pollutant Assessment System (MEPAS) model.

  18. Revisiting the Schönbein ozone measurement methodology

    NASA Astrophysics Data System (ADS)

    Ramírez-González, Ignacio A.; Añel, Juan A.; Saiz-López, Alfonso; García-Feal, Orlando; Cid, Antonio; Mejuto, Juan Carlos; Gimeno, Luis

    2017-04-01

    Through the 19th century, the Schönbein method gained considerable popularity because of the ease with which it measures tropospheric ozone. Traditionally it has been considered that Schönbein measurements are not accurate enough to be useful. Detractors of the method argue that it is sensitive to meteorological conditions, the most important influence being relative humidity. As a consequence, data obtained by this method have usually been discarded. Here we revisit the method, taking into account that values measured during the 19th century were taken using different measurement papers. We explore several concentrations of starch and potassium iodide, the basis of the measurement method. Our results are compared with previous ones in the literature. The validity of the Schönbein methodology is discussed taking into account humidity and other meteorological variables.

  19. Measuring Self-Stability: A Methodological Note.

    ERIC Educational Resources Information Center

    Cheung, Tak-sing

    1981-01-01

    Reviews three major measures of the concept of self-stability: the discrepancy measure, the syndromatic measure, and the longitudinal measure. Assesses their relative strengths as well as weaknesses. Suggests that the longitudinal measure may be used to check the degree of social desirability effect of the syndromatic measure. (Author)

  20. Measurements of Gluconeogenesis and Glycogenolysis: A Methodological Review

    PubMed Central

    Chung, Stephanie T.; Chacko, Shaji K.; Sunehag, Agneta L.

    2015-01-01

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to determine gluconeogenesis is by measuring the incorporation of deuterium from the body water pool into newly formed glucose. However, several techniques using radioactive and stable-labeled isotopes have been used to quantitate the contribution and regulation of gluconeogenesis in humans. Each method has its advantages, methodological assumptions, and set of propagated errors. In this review, we examine the strengths and weaknesses of the most commonly used stable isotope methods to measure gluconeogenesis in vivo. We discuss the advantages and limitations of each method and summarize the applicability of these measurements in understanding normal and pathophysiological conditions. PMID:26604176
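
The deuterated-water method highlighted in the review quantifies fractional gluconeogenesis as the ratio of deuterium enrichment at glucose carbon 5 to the body-water enrichment; a minimal sketch with invented enrichment values (the units and numbers are assumptions for illustration):

```python
def fractional_gng(c5_enrichment, body_water_enrichment):
    """Deuterated-water (2H2O) method: the fraction of glucose production
    derived from gluconeogenesis equals the deuterium enrichment at
    glucose carbon 5 divided by the body-water enrichment."""
    return c5_enrichment / body_water_enrichment

def absolute_gng(fraction, glucose_ra):
    """Absolute gluconeogenesis = fractional GNG * total glucose rate of
    appearance (e.g. mg/kg/min from a separate tracer-dilution measurement)."""
    return fraction * glucose_ra

f = fractional_gng(0.15, 0.30)   # hypothetical enrichments (atom % excess)
print(f, absolute_gng(f, 2.0))   # 0.5 1.0
```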

  1. [Pharmacological treatment conciliation methodology in patients with multiple conditions].

    PubMed

    Alfaro-Lara, Eva Rocío; Vega-Coca, María Dolores; Galván-Banqueri, Mercedes; Nieto-Martín, María Dolores; Pérez-Guerrero, Concepción; Santos-Ramos, Bernardo

    2014-02-01

    To carry out a bibliographic review in order to identify the different methodologies used in the medication reconciliation process as applicable to polypathological patients. We performed a literature review. Data sources: the bibliographic search (February 2012) included the following databases: PubMed, EMBASE, CINAHL, PsycINFO and the Spanish Medical Index (IME). The different methodologies identified in those databases for measuring the reconciliation process in polypathological patients, or otherwise in elderly or polymedicated patients, were studied. Study selection: two hundred and seventy-three articles were retrieved, of which 25 were selected. Data extraction: specifically, the level of care, the sources of information, the use of registration forms, the established time frame, the professional in charge, and the registered variables such as reconciliation errors. Most studies were conducted when the patient was admitted to the hospital and after hospital discharge. The main sources of information were the patient interview and the medical history. An explicit time frame is not stated in most of them, nor is a registration form used. The main professional in charge is the clinical pharmacologist. Apart from home medication, habits of self-medication and phytotherapy use are also identified. Common reconciliation errors range from the omission of drugs to different forms of interaction with other medicinal products (drug interactions). There is large heterogeneity in the methodologies used for reconciliation. No work has addressed the specific figure of the polypathological patient, who precisely requires a standardized methodology because of his or her complexity and susceptibility to reconciliation errors. Copyright © 2012 Elsevier España, S.L. All rights reserved.

  2. Quantum Measurement and Initial Conditions

    NASA Astrophysics Data System (ADS)

    Stoica, Ovidiu Cristinel

    2016-03-01

Quantum measurement finds the observed system in a collapsed state, rather than in the state predicted by the Schrödinger equation. Yet there is a relatively widespread opinion that the wavefunction collapse can be explained by unitary evolution (for instance in the decoherence approach, if we take the environment into account). In this article, a mathematical result is proven which severely restricts the initial conditions for which measurements have definite outcomes, if pure unitary evolution is assumed. This no-go theorem remains true even if we take the environment into account. The result does not forbid a unitary description of the measurement process; it only shows that such a description is possible only for very restricted initial conditions. The existence of such restrictions on the initial conditions can be understood in the four-dimensional block-universe perspective, as a requirement of global self-consistency of the solutions of the Schrödinger equation.

  3. Methodological Challenges in Measuring Child Maltreatment

    ERIC Educational Resources Information Center

    Fallon, Barbara; Trocme, Nico; Fluke, John; MacLaurin, Bruce; Tonmyr, Lil; Yuan, Ying-Ying

    2010-01-01

Objective: This article reviewed the different surveillance systems used to monitor the extent of reported child maltreatment in North America. Methods: Key measurement and definitional differences between the surveillance systems are detailed, along with their potential impact on the measurement of the rate of victimization. The infrastructure…

  5. Wastewater Sampling Methodologies and Flow Measurement Techniques.

    ERIC Educational Resources Information Center

    Harris, Daniel J.; Keffer, William J.

    This document provides a ready source of information about water/wastewater sampling activities using various commercial sampling and flow measurement devices. The report consolidates the findings and summarizes the activities, experiences, sampling methods, and field measurement techniques conducted by the Environmental Protection Agency (EPA),…

  6. A methodological review of resilience measurement scales

    PubMed Central

    2011-01-01

Background The evaluation of interventions and policies designed to promote resilience, and research to understand its determinants and associations, require reliable and valid measures to ensure data quality. This paper systematically reviews the psychometric rigour of resilience measurement scales developed for use in general and clinical populations. Methods Eight electronic abstract databases and the internet were searched, and reference lists of all identified papers were hand searched. The focus was to identify peer-reviewed journal articles where resilience was a key focus and/or was assessed. Two authors independently extracted data and performed a quality assessment of the scale psychometric properties. Results Nineteen resilience measures were reviewed; four of these were refinements of the original measure. All the measures had some missing information regarding the psychometric properties. Overall, the Connor-Davidson Resilience Scale, the Resilience Scale for Adults and the Brief Resilience Scale received the best psychometric ratings. The conceptual and theoretical adequacy of a number of the scales was questionable. Conclusion We found no current 'gold standard' amongst the 15 measures of resilience. A number of the scales are in the early stages of development, and all require further validation work. Given increasing interest in resilience from major international funders, key policy makers and practitioners, researchers are urged to report relevant validation statistics when using the measures. PMID:21294858

  7. Experimental comparison of exchange bias measurement methodologies

    SciTech Connect

    Hovorka, Ondrej; Berger, Andreas; Friedman, Gary

    2007-05-01

    Measurements performed on all-ferromagnetic bilayer systems and supported by model calculation results are used to compare different exchange bias characterization methods. We demonstrate that the accuracy of the conventional two-point technique based on measuring the sum of the coercive fields depends on the symmetry properties of hysteresis loops. On the other hand, the recently proposed center of mass method yields results independent of the hysteresis loop type and coincides with the two-point measurement only if the loops are symmetric. Our experimental and simulation results clearly demonstrate a strong correlation between loop asymmetry and the difference between these methods.
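
The two estimators compared above can be sketched in a few lines. This is a hedged illustration under our own assumptions, not the authors' code: `two_point_bias` is the conventional midpoint of the two coercive fields, and `center_of_mass_bias` weights the field axis by the derivative of the average of the ascending and descending loop branches, one common formulation of the center-of-mass idea.

```python
def two_point_bias(h_c_minus, h_c_plus):
    """Conventional two-point estimate: midpoint of the two coercive fields."""
    return 0.5 * (h_c_minus + h_c_plus)

def center_of_mass_bias(fields, m_ascending, m_descending):
    """Center-of-mass estimate: field axis weighted by dM/dH of the
    average of the two hysteresis-loop branches."""
    m_avg = [0.5 * (u + d) for u, d in zip(m_ascending, m_descending)]
    num = den = 0.0
    for i in range(1, len(fields) - 1):
        # central-difference derivative of the averaged branch
        dmdh = (m_avg[i + 1] - m_avg[i - 1]) / (fields[i + 1] - fields[i - 1])
        num += fields[i] * dmdh
        den += dmdh
    return num / den
```

For a symmetric loop the two estimates coincide, which mirrors the abstract's conclusion that the two-point result agrees with the center-of-mass result only for symmetric loops.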

  8. Sustainable Food Security Measurement: A Systemic Methodology

    NASA Astrophysics Data System (ADS)

    Findiastuti, W.; Singgih, M. L.; Anityasari, M.

    2017-04-01

Sustainable food security measures how a region provides food for its people without endangering the environment. In Indonesia, food security is officially measured in the Food Security and Vulnerability Atlas (FSVA). However, with regard to sustainable food security policy, the measurement has not encompassed the environmental aspect, which leads to a lack of environmental information for adjusting the next strategy. This study aimed to assess sustainable food security by encompassing both the food security and environmental aspects using a systemic eco-efficiency approach. Given the existing indicator of cereal production level, total emissions were generated as an environmental indicator by constructing a Causal Loop Diagram (CLD). A stock-flow diagram was then used to develop a systemic simulation model. The model was demonstrated for five Indonesian provinces. The results showed a difference between the food security rankings with and without the environmental aspect assessment.
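
The stock-flow step described above can be sketched minimally. This is a hedged illustration with made-up rates and a linear-outflow assumption, not the authors' simulation model; the eco-efficiency ratio simply divides a value indicator by an environmental-burden indicator.

```python
def simulate_stock(initial, inflow, outflow_frac, steps):
    """Minimal stock-flow (system dynamics) integration:
    stock(t+1) = stock(t) + inflow - outflow_frac * stock(t).
    The linear outflow is an illustrative assumption."""
    stock, history = initial, [initial]
    for _ in range(steps):
        stock += inflow - outflow_frac * stock
        history.append(stock)
    return history

def eco_efficiency(production_value, total_emissions):
    """Eco-efficiency: value produced per unit of environmental burden."""
    return production_value / total_emissions
```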

  9. Methodology for high accuracy contact angle measurement.

    PubMed

    Kalantarian, A; David, R; Neumann, A W

    2009-12-15

A new version of axisymmetric drop shape analysis (ADSA) called ADSA-NA (ADSA-no apex) was developed for measuring interfacial properties for drop configurations without an apex. ADSA-NA facilitates contact angle measurements on drops with a capillary protruding into the drop. Thus a much simpler experimental setup may be used, not involving formation of a complete drop from below through a hole in the test surface. The contact angles of long-chained alkanes on a commercial fluoropolymer, Teflon AF 1600, were measured using the new method. A new numerical scheme was incorporated into the image processing to improve the location of the contact points of the liquid meniscus with the solid substrate to subpixel resolution. The images acquired in the experiments were also analyzed by a different drop shape technique called theoretical image fitting analysis-axisymmetric interfaces (TIFA-AI). The results were compared with literature values obtained by means of the standard ADSA for sessile drops with an apex. Comparison of the results from ADSA-NA with those from TIFA-AI and ADSA reveals that, with different numerical strategies and experimental setups, contact angles can be measured with an accuracy of better than 0.2 degrees. Contact angles and surface tensions measured from drops with no apex, i.e., by means of ADSA-NA and TIFA-AI, were considerably less scattered than those from complete drops with an apex. ADSA-NA was also used to explore sources of improvement in contact angle resolution. It was found that using an accurate value of surface tension as an input enhances the accuracy of contact angle measurements.
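
The idea of locating a contact point to subpixel resolution can be illustrated with a minimal sketch. This is our own assumption of the simplest possible scheme, not the paper's numerical method: find the pixel pair whose intensities bracket a threshold and interpolate linearly between them.

```python
def subpixel_edge(intensities, threshold):
    """Locate an edge along a 1-D intensity profile to subpixel resolution:
    find the first adjacent pixel pair bracketing the threshold and
    linearly interpolate the crossing position. Returns None if no edge."""
    for i in range(len(intensities) - 1):
        a, b = intensities[i], intensities[i + 1]
        if (a - threshold) * (b - threshold) <= 0 and a != b:
            return i + (threshold - a) / (b - a)
    return None
```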

  10. Evaluation of UT Wall Thickness Measurements and Measurement Methodology

    SciTech Connect

    Weier, Dennis R.; Pardini, Allan F.

    2007-10-01

    CH2M HILL has requested that PNNL examine the ultrasonic methodology utilized in the inspection of the Hanford double shell waste tanks. Specifically, PNNL is to evaluate the UT process variability and capability to detect changes in wall thickness and to document the UT operator's techniques and methodology in the determination of the reported minimum and average UT data and how it compares to the raw (unanalyzed) UT data.

  11. A Standardized Software Reliability Measurement Methodology

    DTIC Science & Technology

    1991-12-01

application areas: avionics; communications; command, control, communications, and intelligence; electronic warfare; and radar systems [81:6]. The study…reliability tools [29, 34, 61]. Goel states: Software reliability is a useful measure in planning and controlling resources during the development…of Goel to identify four classes of software reliability models: fault seeding models; input domain models; times between failure models; and…
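
Of the model classes attributed to Goel in this snippet, the times-between-failures family is easy to sketch. A hedged example with illustrative parameter values (not taken from the report), using the Goel-Okumoto NHPP mean value function m(t) = a(1 - e^(-bt)):

```python
import math

def goel_okumoto_mean_failures(t, a, b):
    """Expected cumulative failures by time t under the Goel-Okumoto NHPP:
    m(t) = a * (1 - exp(-b*t)), where a is the total expected fault count
    and b the per-fault detection rate (illustrative values only)."""
    return a * (1.0 - math.exp(-b * t))

def reliability(t, dt, a, b):
    """Probability of no failure in (t, t+dt]: exp(-(m(t+dt) - m(t)))."""
    return math.exp(-(goel_okumoto_mean_failures(t + dt, a, b)
                      - goel_okumoto_mean_failures(t, a, b)))
```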

  12. Toward a new methodology for measuring the threshold Shields number

    NASA Astrophysics Data System (ADS)

    Rousseau, Gauthier; Dhont, Blaise; Ancey, Christophe

    2016-04-01

    A number of bedload transport equations involve the threshold Shields number (corresponding to the threshold of incipient motion for particles resting on the streambed). Different methods have been developed for determining this threshold Shields number; they usually assume that the initial streambed is plane prior to sediment transport. Yet, there are many instances in real-world scenarios, in which the initial streambed is not free of bed forms. We are interested in developing a new methodology for determining the threshold of incipient motion in gravel-bed streams in which smooth bed forms (e.g., anti-dunes) develop. Experiments were conducted in a 10-cm wide, 2.5-m long flume, whose initial inclination was 3%. Flows were supercritical and fully turbulent. The flume was supplied with water and sediment at fixed rates. As bed forms developed and migrated, and sediment transport rates exhibited wide fluctuations, measurements had to be taken over long times (typically 10 hr). Using a high-speed camera, we recorded the instantaneous bed load transport rate at the outlet of the flume by taking top-view images. In parallel, we measured the evolution of the bed slope, water depth, and shear stress by filming through a lateral window of the flume. These measurements allowed for the estimation of the space and time-averaged slope, from which we deduced the space and time-averaged Shields number under incipient bed load transport conditions. In our experiments, the threshold Shields number was strongly dependent on streambed morphology. Experiments are under way to determine whether taking the space and time average of incipient motion experiments leads to a more robust definition of the threshold Shields number. If so, this new methodology will perform better than existing approaches at measuring the threshold Shields number.
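
The quantity being averaged can be made concrete. A hedged sketch under the standard wide-channel, depth-slope approximation for bed shear stress, with quartz-density gravel assumed (the constants and function names are ours, not values from these experiments):

```python
RHO_W = 1000.0   # water density, kg/m^3
RHO_S = 2650.0   # sediment density, kg/m^3 (assumed quartz gravel)
G = 9.81         # gravitational acceleration, m/s^2

def shields_number(depth_m, slope, d50_m):
    """Shields number theta = tau / ((rho_s - rho) * g * d50),
    with bed shear stress tau = rho * g * h * S (wide-channel form)."""
    tau = RHO_W * G * depth_m * slope
    return tau / ((RHO_S - RHO_W) * G * d50_m)

def averaged_shields(depths, slopes, d50_m):
    """Space/time-averaged estimate from paired depth and slope samples,
    mirroring the averaging over long runs described in the abstract."""
    samples = [shields_number(h, s, d50_m) for h, s in zip(depths, slopes)]
    return sum(samples) / len(samples)
```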

  13. Methodological aspects of EEG and body dynamics measurements during motion

    PubMed Central

    Reis, Pedro M. R.; Hebenstreit, Felix; Gabsteiger, Florian; von Tscharner, Vinzenz; Lochmann, Matthias

    2014-01-01

EEG involves the recording, analysis, and interpretation of voltages recorded on the human scalp that originate from brain gray matter. EEG is one of the most popular methods of studying and understanding the processes that underlie behavior, because it is relatively cheap, easy to wear, lightweight, and has high temporal resolution. In terms of behavior, this encompasses actions such as movements performed in response to the environment. However, methodological difficulties, such as movement artifacts, can occur when recording EEG during movement. Thus, most studies of the human brain have examined activations during static conditions. This article compiles and describes relevant methodological solutions that have emerged for measuring body and brain dynamics during motion. These descriptions cover suggestions on how to avoid and reduce motion artifacts, as well as hardware, software, and techniques for synchronously recording EEG, EMG, kinematics, kinetics, and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps, and methods for determining real/custom electrode positions. We conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks. PMID:24715858

  14. DEVELOPMENT OF PETROLEUM RESIDUA SOLUBILITY MEASUREMENT METHODOLOGY

    SciTech Connect

    Per Redelius

    2006-03-01

    In the present study an existing spectrophotometry system was upgraded to provide high-resolution ultraviolet (UV), visible (Vis), and near infrared (NIR) analyses of test solutions to measure the relative solubilities of petroleum residua dissolved in eighteen test solvents. Test solutions were prepared by dissolving ten percent petroleum residue in a given test solvent, agitating the mixture, followed by filtration and/or centrifugation to remove insoluble materials. These solutions were finally diluted with a good solvent resulting in a supernatant solution that was analyzed by spectrophotometry to quantify the degree of dissolution of a particular residue in the suite of test solvents that were selected. Results obtained from this approach were compared with spot-test data (to be discussed) obtained from the cosponsor.
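
The spectrophotometric quantification rests on the Beer-Lambert law (A = ε·l·c), under which absorbance is proportional to dissolved concentration. A hedged sketch of the relative-solubility calculation (the function name and dilution handling are our assumptions, not the report's procedure):

```python
def percent_dissolved(a_test, a_ref, dilution_test=1.0, dilution_ref=1.0):
    """Relative solubility of a residue in a test solvent: the ratio of its
    dilution-corrected absorbance to that of a fully dissolved reference
    solution, expressed as a percentage (Beer-Lambert: A proportional to c)."""
    return 100.0 * (a_test * dilution_test) / (a_ref * dilution_ref)
```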

  15. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

This dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions, and both are applied to Force Concept Inventory data obtained from students enrolled in The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and the possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory, and that the IRT parameters provide a better measure than the CTT parameters for the educational audience to investigate item features. The second part of the dissertation concerns the measurement of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity. The current popular measures of association fail under some extremely unbalanced conditions, yet the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages, and special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misuses of EFA are explored. The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills. The…
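
The contrast between the two theories can be made concrete with the basic item statistics each provides. A hedged sketch using textbook formulas (not the dissertation's code): CTT summarizes an item by its difficulty (proportion correct) and point-biserial discrimination, while IRT models the response probability itself, here in the two-parameter logistic (2PL) form.

```python
import math

def ctt_item_stats(scores, totals):
    """CTT statistics for one item: difficulty p (proportion correct) and
    point-biserial discrimination (Pearson correlation between the 0/1
    item score and the examinee's total test score)."""
    n = len(scores)
    p = sum(scores) / n
    mx, my = p, sum(totals) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(scores, totals)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in scores) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in totals) / n)
    return p, cov / (sx * sy)

def irt_2pl(theta, a, b):
    """IRT 2PL item response function: probability of a correct response
    given ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```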

  16. Measuring Bone Metabolism with Fluoride PET: Methodological Considerations.

    PubMed

    Apostolova, Ivayla; Brenner, Winfried

    2010-07-01

In recent years the more widespread availability of PET systems and the development of hybrid PET/computed tomography (CT) imaging, allowing improved morphologic characterization of sites with increased tracer uptake, have improved the accuracy of diagnosis and strengthened the role of 18F-fluoride PET for quantitative assessment of bone pathology. This article reviews the role of 18F-fluoride PET in the skeleton, with a focus on (1) the underlying physiologic and pathophysiologic processes of different conditions of bone metabolism and (2) methodological aspects of the quantitative measurement of 18F-fluoride kinetics. Recent comparative studies have demonstrated that 18F-fluoride PET and, to an even greater extent, PET/CT are more accurate than 99mTc-bisphosphonate single-photon emission CT for the identification of malignant and benign lesions of the skeleton. Quantitative 18F-fluoride PET has been shown to be valuable for direct non-invasive assessment of bone metabolism and for monitoring response to therapy. Copyright © 2010 Elsevier Inc. All rights reserved.

  17. Issues in Measurement and Methodology: CSE's 1978 Conference.

    ERIC Educational Resources Information Center

    Burry, James, Ed.; Quellmalz, Edys S., Ed.

    1978-01-01

    Abstracts are presented of the major conference papers and thematic discussions delivered at the 1978 Measurement and Methodology Conference. The titles of the presentations are: Policy-Responsive Evaluation (Wiley); When Educators Set Standards (Glass); Comments on Wiley and Glass (Schutz); Key Standard-Setting Considerations for Minimal…

  18. Methodological considerations for measuring spontaneous physical activity in rodents.

    PubMed

    Teske, Jennifer A; Perez-Leighton, Claudio E; Billington, Charles J; Kotz, Catherine M

    2014-05-15

    When exploring biological determinants of spontaneous physical activity (SPA), it is critical to consider whether methodological factors differentially affect rodents and the measured SPA. We determined whether acclimation time, sensory stimulation, vendor, or chamber size affected measures in rodents with varying propensity for SPA. We used principal component analysis to determine which SPA components (ambulatory and vertical counts, time in SPA, and distance traveled) best described the variability in SPA measurements. We compared radiotelemetry and infrared photobeams used to measure SPA and exploratory activity. Acclimation time, sensory stimulation, vendor, and chamber size independently influenced SPA, and the effect was moderated by the propensity for SPA. A 24-h acclimation period prior to SPA measurement was sufficient for habituation. Principal component analysis showed that ambulatory and vertical measurements of SPA describe different dimensions of the rodent's SPA behavior. Smaller testing chambers and a sensory attenuation cubicle around the chamber reduced SPA. SPA varies between rodents purchased from different vendors. Radiotelemetry and infrared photobeams differ in their sensitivity to detect phenotypic differences in SPA and exploratory activity. These data highlight methodological considerations in rodent SPA measurement and a need to standardize SPA methodology.
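
The principal component step used above can be illustrated for two SPA measures at a time with the closed-form eigendecomposition of a 2×2 covariance matrix. This is a hedged sketch of the idea with made-up data shapes (the study analyzed four SPA components); it returns the fraction of total variance explained by the first component.

```python
import math

def pca_2d_first_component_variance(xs, ys):
    """Fraction of total variance captured by the first principal component
    of two measures (e.g. ambulatory vs vertical counts). Values near 1
    mean the two measures describe one dimension; lower values mean they
    describe different dimensions of behavior."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / n
    syy = sum((y - my) ** 2 for y in ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    # eigenvalues of [[sxx, sxy], [sxy, syy]] via trace/determinant
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    l1 = tr / 2 + math.sqrt(max(0.0, tr * tr / 4 - det))
    return l1 / tr
```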

  19. Methodological considerations for measuring spontaneous physical activity in rodents

    PubMed Central

    Perez-Leighton, Claudio E.; Billington, Charles J.; Kotz, Catherine M.

    2014-01-01

    When exploring biological determinants of spontaneous physical activity (SPA), it is critical to consider whether methodological factors differentially affect rodents and the measured SPA. We determined whether acclimation time, sensory stimulation, vendor, or chamber size affected measures in rodents with varying propensity for SPA. We used principal component analysis to determine which SPA components (ambulatory and vertical counts, time in SPA, and distance traveled) best described the variability in SPA measurements. We compared radiotelemetry and infrared photobeams used to measure SPA and exploratory activity. Acclimation time, sensory stimulation, vendor, and chamber size independently influenced SPA, and the effect was moderated by the propensity for SPA. A 24-h acclimation period prior to SPA measurement was sufficient for habituation. Principal component analysis showed that ambulatory and vertical measurements of SPA describe different dimensions of the rodent's SPA behavior. Smaller testing chambers and a sensory attenuation cubicle around the chamber reduced SPA. SPA varies between rodents purchased from different vendors. Radiotelemetry and infrared photobeams differ in their sensitivity to detect phenotypic differences in SPA and exploratory activity. These data highlight methodological considerations in rodent SPA measurement and a need to standardize SPA methodology. PMID:24598463

  20. Design methodology of an automated scattering measurement facility

    NASA Astrophysics Data System (ADS)

    Mazur, D. G.

    1985-12-01

This thesis addresses the design methodology surrounding an automated scattering measurement facility. A brief historical survey of radar cross-section (RCS) measurements is presented. The electromagnetic theory associated with a continuous wave (CW) background cancellation technique for measuring RCS is discussed as background. In addition, problems associated with interfacing test equipment, data storage, and output are addressed. The facility used as a model for this thesis is located at the Air Force Institute of Technology, WPAFB, OH. The design methodology applies to any automated scattering measurement facility. A software package incorporating features that enhance students' operation of AFIT's facility is presented. Finally, sample outputs from the software package illustrate formats for displaying RCS data.
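
At its core, CW background cancellation is a complex (I/Q) vector subtraction: the empty-chamber response is recorded once and subtracted from the target-present response before power is computed. A hedged, minimal formulation of our own, not the thesis software:

```python
def cancel_background(target_present, empty_chamber):
    """Subtract the complex empty-chamber response sample-by-sample,
    leaving (ideally) only the target's scattered signal."""
    return [t - b for t, b in zip(target_present, empty_chamber)]

def relative_power(residual):
    """Received power is proportional to |residual|^2, summed over samples;
    RCS follows after calibration against a reference target."""
    return sum(abs(s) ** 2 for s in residual)
```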

  1. Holdup measurements under realistic conditions

    SciTech Connect

    Sprinkel, J.K. Jr.; Marshall, R.; Russo, P.A.; Siebelist, R.

    1997-11-01

This paper reviews the documentation of the precision and bias of holdup (residual nuclear material remaining in processing equipment) measurements and presents previously unreported results. Precision and bias results from training seminars with simulated holdup, which represent the best possible results, are compared to actual plutonium processing facility measurements. Holdup measurements for plutonium and uranium processing plants are also compared to reference values. Recommendations for measuring holdup are provided for highly enriched uranium facilities and for low-enriched uranium facilities. The random error component of holdup measurements is smaller than the systematic error component. The most likely factor in measurement error is incorrect assumptions about the measurement, such as background, measurement geometry, or signal attenuation. Measurement precision on the order of 10% can be achieved with some difficulty. The bias of poor-quality holdup measurements can also be improved. However, for most facilities, holdup measurement errors have no significant impact on inventory difference, sigma, or safety (criticality, radiation, or environmental); therefore, it is difficult to justify allocating more resources to improving holdup measurements. 25 refs., 10 tabs.

  2. A new methodology to measure the running biomechanics of amputees.

    PubMed

    Wilson, James Richard; Asfour, Shihab; Abdelrahman, Khaled Zakaria; Gailey, Robert

    2009-09-01

We present a new methodology to measure the running biomechanics of amputees. This methodology combines the use of a spring-mass model and a symmetry index, two standard techniques in the biomechanics literature not yet used in concert to evaluate amputee biomechanics. The methodology was examined in a pilot study of two transtibial amputee sprinters and showed biomechanically quantifiable changes for small adjustments in prosthetic prescription. Vertical ground reaction forces were measured in several trials for two transtibial amputees running at constant speed. A spring-mass model was used in conjunction with a symmetry index to observe the effect of varying prosthetic height and stiffness on running biomechanics. All spring-mass variables were significantly affected by changes in prosthetic prescription among the two subjects tested (p < 0.05). When prosthetic height was changed, both subjects showed significant differences in maximum vertical displacement (Δy_max), leg compression (Δl), and contact time (t_c) on the prosthetic limb, and in vertical stiffness (k_vert) and leg stiffness (k_leg) on the sound limb. The symmetry indices calculated for the spring-mass variables were all significantly affected by changes in prosthetic prescription for the male subject, and all but that for peak force (F_peak) for the female subject. This methodology is a straightforward tool for evaluating the effect of changes to prosthetic prescription.
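
The two techniques combined here reduce to short formulas. A hedged sketch using the standard definitions (spring-mass stiffnesses in the McMahon-Cheng sense; a Robinson-style symmetry index; the variable names are ours, not the paper's):

```python
def vertical_stiffness(f_peak, dy_max):
    """k_vert = F_peak / Δy_max: peak vertical ground reaction force over
    peak vertical displacement of the center of mass during stance."""
    return f_peak / dy_max

def leg_stiffness(f_peak, dl):
    """k_leg = F_peak / Δl: peak vertical GRF over peak leg compression."""
    return f_peak / dl

def symmetry_index(prosthetic_value, sound_value):
    """Percent asymmetry of a variable between limbs; 0 = perfect symmetry."""
    return (100.0 * (prosthetic_value - sound_value)
            / (0.5 * (prosthetic_value + sound_value)))
```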

  3. A methodology for measurements of nasal nitric oxide in children under 5 yr.

    PubMed

    Gupta, Rajesh; Gupta, Nisha; Turner, Stephen W

    2008-05-01

Measurements of nasal nitric oxide (nNO) may give insight into respiratory conditions in children aged under 5 yr, but no methodology has been described for this age group. The present study aimed to establish the methodology and reproducibility for measuring nNO during tidal breathing in young children and to relate nNO to allergic conditions. Children and siblings aged under 5 yr attending hospital clinics were enrolled. On-line nNO measurements were obtained during tidal breathing using a chemiluminescence analyser. To establish our methodology, nNO was measured over 3, 5 or 10 s NO plateaus, and from the left and right nostrils. nNO was then compared between children with and without allergic conditions. The reproducibility of nNO measurements over 24 h was studied in a separate group of children. Eighty-three children participated in the methodological part of the study, and nNO was successfully measured in 57 (69%), mean (s.d.) age 3.4 (1.1) years, 14 with allergic conditions. Neither the NO plateau duration nor the choice of nostril influenced nNO values. The mean (s.d.) nNO for non-atopic children was 208 (103) parts per billion (ppb) and for atopic children was 284 (122), p = 0.032. Nasal NO values were not related to ambient NO, gender, or passive smoke exposure; there was a non-significant trend for nNO to be positively related to age. Nasal NO measurements were reproducible in the 21 children tested, with a mean difference of 9.6 ppb (limits of agreement −127 to 146). We report a methodology for nNO measurement in young children. Further work is now required to establish the clinical utility of nNO in this age group.
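
The reproducibility figures quoted (mean difference with limits of agreement) follow the Bland-Altman approach. A hedged sketch with made-up numbers, not the study's data:

```python
import math

def limits_of_agreement(first_visit, second_visit):
    """Bland-Altman repeatability: mean of the paired differences and the
    95% limits of agreement (mean ± 1.96 * SD of the differences)."""
    diffs = [a - b for a, b in zip(first_visit, second_visit)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean, (mean - 1.96 * sd, mean + 1.96 * sd)
```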

  4. National working conditions surveys in Latin America: comparison of methodological characteristics

    PubMed Central

    Merino-Salazar, Pamela; Artazcoz, Lucía; Campos-Serna, Javier; Gimeno, David; Benavides, Fernando G.

    2015-01-01

    Background: High-quality and comparable data to monitor working conditions and health in Latin America are not currently available. In 2007, multiple Latin American countries started implementing national working conditions surveys. However, little is known about their methodological characteristics. Objective: To identify commonalities and differences in the methodologies of working conditions surveys (WCSs) conducted in Latin America through 2013. Methods: The study critically examined WCSs in Latin America between 2007 and 2013. Sampling design, data collection, and questionnaire content were compared. Results: Two types of surveys were identified: (1) surveys covering the entire working population and administered at the respondent's home and (2) surveys administered at the workplace. There was considerable overlap in the topics covered by the dimensions of employment and working conditions measured, but less overlap in terms of health outcomes, prevention resources, and activities. Conclusions: Although WCSs from Latin America are similar, there was heterogeneity across surveyed populations and location of the interview. Reducing differences in surveys between countries will increase comparability and allow for a more comprehensive understanding of occupational health in the region. PMID:26079314

  5. National working conditions surveys in Latin America: comparison of methodological characteristics.

    PubMed

    Merino-Salazar, Pamela; Artazcoz, Lucía; Campos-Serna, Javier; Gimeno, David; Benavides, Fernando G

    2015-01-01

    High-quality and comparable data to monitor working conditions and health in Latin America are not currently available. In 2007, multiple Latin American countries started implementing national working conditions surveys. However, little is known about their methodological characteristics. To identify commonalities and differences in the methodologies of working conditions surveys (WCSs) conducted in Latin America through 2013. The study critically examined WCSs in Latin America between 2007 and 2013. Sampling design, data collection, and questionnaire content were compared. Two types of surveys were identified: (1) surveys covering the entire working population and administered at the respondent's home and (2) surveys administered at the workplace. There was considerable overlap in the topics covered by the dimensions of employment and working conditions measured, but less overlap in terms of health outcomes, prevention resources, and activities. Although WCSs from Latin America are similar, there was heterogeneity across surveyed populations and location of the interview. Reducing differences in surveys between countries will increase comparability and allow for a more comprehensive understanding of occupational health in the region.

  6. Methodology of high-resolution photography for mural condition database

    NASA Astrophysics Data System (ADS)

    Higuchi, R.; Suzuki, T.; Shibata, M.; Taniguchi, Y.

    2015-08-01

Digital documentation is one of the most useful techniques for recording the condition of cultural heritage. Recently, high-resolution images have become increasingly useful because it is possible to show both general views of mural paintings and detailed mural conditions in a single image. As mural paintings are damaged by environmental stresses, it is necessary to record the details of the painting condition on high-resolution base maps. Unfortunately, the cost of high-resolution photography and the difficulty of operating its instruments and software have commonly been an impediment for researchers and conservators. However, the recent development of graphic software makes its operation simpler and less expensive. In this paper, we suggest a new approach to making digital heritage inventories without special instruments, based on our recent research project in the Üzümlü church in Cappadocia, Turkey. This method enables us to achieve a high-resolution image database at low cost, in a short time, and with limited human resources.

  7. Object shape-based optical sensing methodology and system for condition monitoring of contaminated engine lubricants

    NASA Astrophysics Data System (ADS)

    Bordatchev, Evgueni; Aghayan, Hamid; Yang, Jun

    2014-03-01

The presence of contaminants such as gasoline, moisture, and coolant in engine lubricant indicates mechanical failure within the engine and significantly reduces lubricant quality. This paper describes a novel sensing system, its methodology, and experimental verification of the analysis of the presence of contaminants in engine lubricants. The sensing methodology is based on statistical shape analysis, utilizing optical analysis of the distortion effect observed when an object image is obtained through a thin random optical medium. The novelty of the proposed sensing system lies in the employed methodology, in which an object with a known periodic shape is introduced behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant-object optical appearance, where the a priori known periodic structure of the object is distorted by the contaminated lubricant. The object, a stainless steel woven wire cloth with a mesh size of 65×65 µm² and a circular wire diameter of 33 µm, was placed behind a microfluidic channel containing engine lubricant, and optical images of the flowing lubricant with the stationary object were acquired and analyzed. Several parameters of the acquired optical images were proposed, such as the color of the lubricant and object, the object shape width at the object and lubricant levels, the object relative color, and the object width non-uniformity coefficient. The measured on-line parameters were used for optical analysis of fresh and contaminated lubricants. Estimation of contaminant presence and lubricant condition was performed by comparing parameters for fresh and contaminated lubricants. The developed methodology was verified experimentally, showing the ability to distinguish lubricants with 1%, 4%, 7%, and 10% coolant, gasoline, and water contamination individually, and in combinations of coolant (0%-5%) and gasoline (0%-5%).
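
One of the listed image parameters, the object width non-uniformity coefficient, can be sketched under the assumption that it is a coefficient of variation of the apparent wire widths measured across the image. That definition is our assumption for illustration; the paper's exact formulation may differ.

```python
def nonuniformity_coefficient(apparent_widths):
    """Coefficient of variation (population SD / mean) of the object's
    apparent wire widths across the image; larger distortion of the
    periodic pattern by the lubricant film yields a larger value."""
    n = len(apparent_widths)
    mean = sum(apparent_widths) / n
    sd = (sum((w - mean) ** 2 for w in apparent_widths) / n) ** 0.5
    return sd / mean
```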

  8. From lab to field conditions: a pilot study on EEG methodology in applied sports sciences.

    PubMed

    Reinecke, Kirsten; Cordes, Marjolijn; Lerch, Christiane; Koutsandréou, Flora; Schubert, Michael; Weiss, Michael; Baumeister, Jochen

    2011-12-01

Although neurophysiological aspects have become more important in sports and exercise sciences in recent years, it has not been possible to measure cortical activity during performance outside a laboratory, due to equipment limitations and, in particular, movement artifacts. With this pilot study we investigate whether electroencephalography (EEG) data obtained during a laboratory golf putting task differ from those obtained during a comparable putting task under field conditions. To this end, parameters of working memory (frontal Theta and parietal Alpha 2 power) were recorded under both conditions. Statistical analysis showed a significant difference between the "field" and "laboratory" putting conditions only for Theta power at F4. These findings support the idea that brain activity patterns obtained under laboratory conditions are comparable, but not equivalent, to those obtained under field conditions. Additionally, we were able to show that EEG seems to be a reliable tool for observing brain activity under field conditions in a golf putting task. However, given the persisting problem of movement artifacts during EEG measurements, eligible sports and exercises are limited to those that are relatively motionless during execution. Further studies are needed to confirm these pilot results.
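The working-memory parameters named above are band powers of the EEG spectrum. A minimal sketch of how such band power can be computed, assuming a plain FFT periodogram and conventional band limits (4-8 Hz for Theta, 10-12 Hz for Alpha 2) — neither the estimator nor the exact band edges is specified in the abstract:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Absolute spectral power in [f_lo, f_hi) Hz from a plain FFT
    periodogram (a minimal stand-in for the Welch-type estimators
    typically used in EEG studies)."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * n)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

# Synthetic 2 s epoch at 256 Hz: a strong 6 Hz (theta) plus a weaker
# 11 Hz (alpha2) component.
fs = 256
t = np.arange(0, 2, 1.0 / fs)
epoch = 2.0 * np.sin(2 * np.pi * 6 * t) + 1.0 * np.sin(2 * np.pi * 11 * t)

theta = band_power(epoch, fs, 4, 8)     # assumed Theta band
alpha2 = band_power(epoch, fs, 10, 12)  # assumed Alpha 2 band
print(theta > alpha2)  # the 6 Hz component dominates -> True
```

In a real analysis the epoch would come from an electrode such as F4 and be artifact-rejected first; the synthetic signal here only illustrates the band-power arithmetic.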

  9. Conditioning natural gas for measurement and transportation

    SciTech Connect

    Barnhard, E.E.

    1984-04-01

This paper discusses methods of conditioning natural gas for measurement and transportation. Gas mixtures measured at the wellhead or into a gathering system may not yet be conditioned to pipeline standards at the point of measurement, yet title to the gas passes from the seller to the buyer at that point. Therefore, it is sometimes necessary to measure the gas flow accurately without complete conditioning. Careful study of the conditioning steps that the gas has completed, or that must be performed prior to measurement, will affect the selection of the measurement equipment and the success of its operation.

  10. Novel CD-SEM measurement methodology for complex OPCed patterns

    NASA Astrophysics Data System (ADS)

    Lee, Hyung-Joo; Park, Won Joo; Choi, Seuk Hwan; Chung, Dong Hoon; Shin, Inkyun; Kim, Byung-Gook; Jeon, Chan-Uk; Fukaya, Hiroshi; Ogiso, Yoshiaki; Shida, Soichi; Nakamura, Takayuki

    2014-07-01

As lithography design rules shrink, accuracy and precision of Critical Dimension (CD) measurement and controllability of heavily OPCed patterns are required in semiconductor production. Critical Dimension Scanning Electron Microscopes (CD-SEM) are essential tools for confirming the quality of a mask, including CD control, CD uniformity, and CD mean to target (MTT). Repeatability and Reproducibility (R and R) performance fundamentally depends on the length of the Region of Interest (ROI), so the measured CD can easily fluctuate in the extremely narrow regions of OPCed patterns. Consequently, it is very difficult to define the MTT and uniformity of complex OPCed masks using the conventional SEM measurement approach. To overcome these difficulties, we evaluated Design Based Metrology (DBM) using the Large Field Of View (LFOV) of a CD-SEM. DBM can standardize measurement points and positions within the LFOV based on the inflections/jogs of OPCed patterns. DBM has thereby realized several-thousand-ROI measurements with an average CD. This new measurement technique removes local CD errors and improves the statistical treatment of the entire mask, enhancing the representativeness of global CD uniformity. With this study we confirmed the new technique to be a more reliable methodology for complex OPCed patterns than conventional technology. This paper summarizes the DBM-with-LFOV experiments using various types of patterns and compares them with current CD-SEM methods.

  11. Methodology for evaluating statistically predicted versus measured imagery

    NASA Astrophysics Data System (ADS)

    Kooper, Rob; Bajcsy, Peter; Andersh, Dennis

    2005-05-01

We present a novel methodology for evaluating statistically predicted versus measured multi-modal imagery, such as Synthetic Aperture Radar (SAR), Electro-Optical (EO), Multi-Spectral (MS) and Hyper-Spectral (HS) modalities. While several scene modeling approaches have been proposed in the past for multi-modal image prediction, the problem of evaluating synthetic against measured images has remained open. Although analytical prediction models are appropriate for accuracy evaluations of man-made objects, for example SAR target modeling based on Xpatch, they cannot be applied to prediction evaluation of natural scenes because of the randomness and high geometrical complexity of such scenes as imaged by any of the aforementioned sensor modalities. Thus, statistical prediction models are frequently chosen as the more appropriate scene modeling approach, and there is a need to evaluate the accuracy of statistically predicted versus measured imagery. This problem poses challenges in terms of selecting quantitative and qualitative evaluation techniques, and establishing a methodology for systematic comparisons of synthetic and measured images. In this work, we demonstrate clutter accuracy evaluations for modified measured and predicted synthetic images with statistically modeled clutter. We show experimental results for color (red, green and blue) and HS imaging modalities, and for statistical clutter models using Johnson's family of probability distribution functions (PDFs). The methodology includes several evaluation techniques for comparing image samples and their similarity, image histograms, statistical central moments, and estimated PDFs. In particular, we quantitatively assess correlation-, histogram-, chi-squared-, pixel- and PDF-parameter-based error metrics, and relate them to human visual perception of predicted image quality. The work is directly applicable to multi-sensor phenomenology modeling for exploitation
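Two of the quantitative error metrics listed (correlation-based and chi-squared histogram-based) can be sketched as follows; the bin count, intensity range and synthetic test images are illustrative assumptions, not the paper's data:

```python
import numpy as np

def hist_chi2(img_a, img_b, bins=32, value_range=(0, 256)):
    """Chi-squared distance between intensity histograms of two images --
    one of the histogram-based error metrics mentioned in the abstract
    (binning choices here are assumptions)."""
    ha, _ = np.histogram(img_a, bins=bins, range=value_range, density=True)
    hb, _ = np.histogram(img_b, bins=bins, range=value_range, density=True)
    denom = ha + hb
    mask = denom > 0
    return 0.5 * np.sum((ha[mask] - hb[mask]) ** 2 / denom[mask])

def pixel_correlation(img_a, img_b):
    """Pearson correlation of co-registered pixel values."""
    return np.corrcoef(img_a.ravel(), img_b.ravel())[0, 1]

rng = np.random.default_rng(0)
measured = rng.normal(120, 20, size=(64, 64))
predicted = measured + rng.normal(0, 5, size=(64, 64))  # good statistical match
unrelated = rng.normal(60, 40, size=(64, 64))           # poor match

print(hist_chi2(measured, predicted) < hist_chi2(measured, unrelated))  # True
print(pixel_correlation(measured, predicted) > 0.9)                     # True
```

A good statistical prediction should score low on the histogram distance; pixel-level correlation additionally requires spatial registration, which is why both kinds of metric are assessed.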

  12. Physiological outflow boundary conditions methodology for small arteries with multiple outlets: a patient-specific hepatic artery haemodynamics case study.

    PubMed

    Aramburu, Jorge; Antón, Raúl; Bernal, Nebai; Rivas, Alejandro; Ramos, Juan Carlos; Sangro, Bruno; Bilbao, José Ignacio

    2015-04-01

Physiological outflow boundary conditions are necessary to carry out computational fluid dynamics simulations that reliably represent the blood flow through arteries. When dealing with complex three-dimensional trees of small arteries, and therefore with multiple outlets, the robustness and speed of convergence are also important. This study derives physiological outflow boundary conditions for cases in which the physiological values at those outlets are not known (neither in vivo measurements nor literature-based values are available) and in which the tree exhibits symmetry to some extent. The inputs of the methodology are the three-dimensional domain, the inlet flow rate waveform, and the systolic and diastolic pressures at the inlet. The derived physiological outflow boundary conditions, a physiological pressure waveform for each outlet, are based on the results of a zero-dimensional model simulation. The methodology assumes symmetrical branching and is able to tackle the flow distribution problem when the domain outlets are at branches with a different number of upstream bifurcations. The methodology is applied to a group of patient-specific arteries in the liver. The methodology is considered to be valid because the pulsatile computational fluid dynamics simulation with the inflow flow rate waveform (input of the methodology) and the derived outflow boundary conditions leads to physiological results: the resulting systolic and diastolic pressures at the inlet match the inputs of the methodology, and the flow split is also physiological. © IMechE 2015.
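The symmetrical-branching assumption has a simple consequence that can be sketched directly: if every bifurcation splits flow evenly, an outlet lying behind n bifurcations receives 2^-n of the inlet flow, and the fractions over a full binary tree sum to one. The outlet depths below are illustrative, not patient data:

```python
def outlet_flow_fractions(upstream_bifurcations):
    """Flow fraction reaching each outlet of a bifurcating tree under the
    paper's symmetrical-branching assumption: every bifurcation splits the
    flow evenly, so an outlet behind n bifurcations gets 2**-n of the
    inlet flow."""
    return [2.0 ** (-n) for n in upstream_bifurcations]

# A tree whose outlets sit behind 1, 2 and 2 bifurcations respectively.
fractions = outlet_flow_fractions([1, 2, 2])
print(fractions)       # [0.5, 0.25, 0.25]
print(sum(fractions))  # 1.0 -- a full binary tree conserves flow
```

This is only the flow-split part of the methodology; deriving the pressure waveform per outlet additionally requires the zero-dimensional model driven by the inlet waveform and pressures.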

  13. Methodology of modeling and measuring computer architectures for plasma simulations

    NASA Technical Reports Server (NTRS)

    Wang, L. P. T.

    1977-01-01

A brief introduction to plasma simulation using computers, and the difficulties on currently available computers, is given. Through the use of an analyzing and measuring methodology - SARA - the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements, the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization can be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  15. Simple methodologies for spectral emissivity measurement of rocks

    NASA Astrophysics Data System (ADS)

    Danov, Miroslav; Borisova, Denitsa; Stoyanov, Dimitar; Petkov, Doyno

The presented investigation focuses on the measurement of spectral emissivity in the 8-14 µm spectral interval, performed with a ground-based technique. This spectral interval is generally used for investigation of vegetation, rock and water surfaces. The amount of radiated energy is a function of the object's temperature and its emissivity; for this reason, emissivity data are very useful for assessing an object's temperature in thermal imaging. We have developed and compared two simplified methodologies for measuring the spectral emissivity of rock and mineral samples, avoiding the additional heating or grinding of the samples that accompanies other techniques. The first relates to the measurement of the hemispherical spectral emissivity, while the second concerns the measurement of the directional spectral emissivity of samples. Both methodologies are suitable for laboratory and field measurements of samples with a small active area (10 cm2). As an illustration of the hemispherical approach, the emissivity spectrum of limestone is presented. Most frequently, reported emissivity refers to the normal emissivity; however, the directionality of the emissivity has an important effect on measurements, for example when the land surface temperature is deduced. A simple methodology for measuring the directional emissivity is proposed and developed. It is based on a collimated infrared (IR) source irradiating the investigated sample. The IR radiation is reflected by the sample and collected by a lithium tantalate pyroelectric detector. The spectral resolution of the emission reflected by the sample is provided by a set of 30 narrow-band transmission filters. A phase-sensitive detection technique is used to enhance the signal/noise ratio. The registered data are processed by a PC. The measuring process will be discussed and the experimentally measured directional emissivity spectra will be presented, related to some rock

  16. Laboratory measurements of gravel thermal properties. A methodology proposal

    NASA Astrophysics Data System (ADS)

    Cultrera, Matteo; Peron, Fabio; Bison, Paolo; Dalla Santa, Giorgia; Bertermann, David; Muller, Johannes; Bernardi, Adriana; Galgaro, Antonio

    2017-04-01

Measuring gravel thermal properties at the laboratory scale is quite challenging due to several technical and logistic issues, mainly connected to the sediment sizes and the variability of their mineralogical composition. Direct measurements of gravel thermal properties are usually unable to involve a representative volume of geological material; consequently, the thermal measurements produce highly dispersed and inconsistent results, due to the large interstitial voids and the poor physical contact with the measuring sensors. With the aim of directly measuring gravel thermal properties, a new methodology has been developed, and results are already available for several gravel deposit samples from around Europe. A single guarded hot plate (Taurus Instruments TLP 800) measured the gravel thermal properties. Some instrumental adjustments were necessary to adapt the measuring devices and to finalize the thermal measurements on gravels at the IUAV FISTEC laboratory (Environmental Technical Physics Laboratory of Venice University). This device usually provides thermal measurements according to ISO 8302, ASTM C177, EN 1946-2, EN 12664, EN 12667 and EN 12939 for building materials. A preliminary calibration was performed by comparing the outcomes of the single guarded hot plate with those of a needle probe of a portable thermal conductivity meter (ISOMET), using standard sand (ISO 67:2009) as the reference material. This study is provided under the Cheap-GSHPs project, which has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement no. 657982

  17. A new drop-shape methodology for surface tension measurement

    NASA Astrophysics Data System (ADS)

    Cabezas, M. G.; Bateni, A.; Montanero, J. M.; Neumann, A. W.

    2004-11-01

Drop-shape techniques, such as axisymmetric drop-shape analysis (ADSA), have been widely used to measure surface tension. In the current schemes, theoretical curves are fitted to the experimental profiles by adjusting the value of surface tension. The best match between theoretical and experimental profiles identifies the surface tension of the drop. Extracting the experimental drop profile using edge detection is an important part of the current drop-shape techniques. However, edge detection fails when acquisition of sharp images is not possible due to experimental or optical limitations. A new drop-shape approach is presented which eliminates the need for edge detection and provides a wider range of applicability. The new methodology, called theoretical image fitting analysis (TIFA), generates theoretical images of the drop and forms an error function that describes the pixel-by-pixel deviation of the theoretical image from the experimental one. Taking surface tension as an adjustable parameter, TIFA minimizes the error function, i.e. fits the theoretical image to the experimental one. The validity of the new methodology is examined by comparing the results with those of ADSA. Using the new methodology it is finally possible to extend the study of the surface tension of lung surfactants to higher concentrations; due to the opaqueness of the solution, such studies were heretofore limited to low concentrations of surfactants.
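The core TIFA idea — fitting a whole theoretical image to the experimental one with surface tension as the adjustable parameter, instead of detecting the drop edge — can be sketched with a toy one-dimensional image generator. The Gaussian profile and the grid search below are stand-ins for the real Young-Laplace image model and optimizer:

```python
import numpy as np

def theoretical_image(gamma, x):
    # Toy stand-in for the drop-image generator: the profile width scales
    # with the surface-tension-like parameter `gamma`.
    return np.exp(-(x / gamma) ** 2)

def tifa_fit(experimental, x, candidates):
    """Sketch of the TIFA scheme: choose the parameter whose *entire*
    theoretical image minimizes the pixel-by-pixel squared error against
    the experimental image -- no edge detection required."""
    errors = [np.sum((theoretical_image(g, x) - experimental) ** 2)
              for g in candidates]
    return candidates[int(np.argmin(errors))]

x = np.linspace(-3, 3, 200)
true_gamma = 1.4  # "true" surface-tension-like value for the synthetic drop
rng = np.random.default_rng(1)
experimental = theoretical_image(true_gamma, x) + 0.01 * rng.normal(size=x.size)

candidates = np.linspace(0.5, 2.5, 201)
best = tifa_fit(experimental, x, candidates)
print(abs(best - true_gamma) < 0.05)  # True: recovered despite pixel noise
```

Because the error functional uses every pixel rather than an extracted contour, blur or low contrast degrades the fit gracefully instead of breaking edge extraction outright, which is the advantage claimed for opaque surfactant solutions.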

  18. Methodology for measurement of shielding effectiveness in the microwave range

    NASA Astrophysics Data System (ADS)

    Baeckstroem, M.; Jansson, L.

    1995-02-01

In the last few years, increased attention has been paid to the threat posed by pulsed high-power microwaves to the operational reliability of electrical systems. In system design and hardness verification, the determination of shielding effectiveness plays an important role. This report is a brief summary of the authors' experience regarding measurement, presentation and evaluation of data from measurements of the shielding effectiveness of electronic compartments in the microwave range, primarily between 0.4 and 18 GHz. The design and function of field probes is dealt with, especially as concerns measurements in electrically small spaces. Guiding principles for the choice of frequency resolution and of probe locations are given. The significance of different ways to excite the test object is discussed, as well as a method to determine the receiving cross-section of cables. Proposals on how to present measurement data and how to evaluate data with respect to design requirements are presented. The report is meant both to serve as guidance on how to perform shielding measurements and to provide a foundation for future work to improve and further develop the methodology. Further development is in particular required for shielding measurements in electrically small cavities.

  19. New methodology for adjusting rotating shadowband irradiometer measurements

    NASA Astrophysics Data System (ADS)

    Vignola, Frank; Peterson, Josh; Wilbert, Stefan; Blanc, Philippe; Geuder, Norbert; Kern, Chris

    2017-06-01

A new method is developed for correcting systematic errors found in rotating shadowband irradiometer (RSI) measurements. Since the responsivity of the photodiode-based pyranometers typically used in RSI sensors depends on the wavelength of the incident radiation, and the spectral distribution of the incident radiation differs between the Direct Normal Irradiance and the Diffuse Horizontal Irradiance, spectral effects have to be considered. These cause the most problematic errors when applying currently available correction functions to RSI measurements. Hence, the direct normal and diffuse contributions are analyzed and modeled separately. An additional advantage of this methodology is that it provides a prescription for how to modify the adjustment algorithms for locations with atmospheric characteristics different from those of the location where the calibration and adjustment algorithms were developed. A summary of results and areas for future effort are then discussed.
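The separate treatment of the two contributions can be sketched via the standard closure relation GHI = DNI·cos(z) + DHI, with a distinct correction factor applied to each term. The factor values below are made-up placeholders, not the published correction functions:

```python
import math

def adjusted_ghi(dni, dhi, zenith_deg, c_direct, c_diffuse):
    """Sketch of the paper's key idea: correct the direct-normal and the
    diffuse contributions of a photodiode RSI separately. The closure
    equation GHI = DNI*cos(z) + DHI is standard; c_direct and c_diffuse
    stand in for the spectral-mismatch corrections derived in the paper."""
    mu = math.cos(math.radians(zenith_deg))
    return c_direct * dni * mu + c_diffuse * dhi

# Placeholder spectral-mismatch factors for a Si-photodiode sensor.
ghi = adjusted_ghi(dni=800.0, dhi=120.0, zenith_deg=30.0,
                   c_direct=1.02, c_diffuse=0.97)
print(round(ghi, 1))  # 823.1 W/m^2 for these illustrative inputs
```

A single lumped correction applied to GHI cannot capture this split, which is why correcting each component with its own spectral factor reduces the systematic error.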

  20. Select Methodology for Validating Advanced Satellite Measurement Systems

    NASA Technical Reports Server (NTRS)

    Larar, Allen M.; Zhou, Daniel K.; Liu, Xi; Smith, William L.

    2008-01-01

    Advanced satellite sensors are tasked with improving global measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring capability, and environmental change detection. Measurement system validation is crucial to achieving this goal and maximizing research and operational utility of resultant data. Field campaigns including satellite under-flights with well calibrated FTS sensors aboard high-altitude aircraft are an essential part of the validation task. This presentation focuses on an overview of validation methodology developed for assessment of high spectral resolution infrared systems, and includes results of preliminary studies performed to investigate the performance of the Infrared Atmospheric Sounding Interferometer (IASI) instrument aboard the MetOp-A satellite.

  2. A Methodology for Measuring Strain in Power Semiconductors

    NASA Astrophysics Data System (ADS)

    Avery, Seth M.

The objective of this work is to develop a strain measurement methodology for use in power electronics during electrical operation, such that strain models can be developed and used as the basis of an active strain controller, improving the reliability of power electronics modules. This research involves developing electronic speckle pattern interferometry (ESPI) into a technology capable of measuring thermal-mechanical strain in electrically active power semiconductors. ESPI is a non-contact optical technique capable of high-resolution (approx. 10 nm) surface displacement measurements. This work has developed a 3-D ESPI test stand in which simultaneously measured in-plane and out-of-plane components are combined to accurately determine the full-field surface displacement. Two cameras are used to capture both local (interconnect-level) displacements and strains, and global (device-level) displacements. Methods have been developed to enable strain measurements under larger loads while avoiding speckle decorrelation (which limits ESPI measurement of large deformations). A method of extracting strain estimates directly from unfiltered and wrapped phase maps has been developed, simplifying data analysis. Experimental noise measurements are made and used to develop optimal filtering, using model-based tracking and the determined strain noise characteristics. The experimental results of this work are strain measurements made on the surface of the leadframe of an electrically active IGBT. A model-based tracking technique has been developed to allow the optimal strain solution to be extracted from noisy displacement results. An experimentally validated thermal-mechanical FE strain model has also been developed. The results of this work demonstrate that in situ strain measurements in power devices are feasible. Using the procedures developed in this work, strain measurements can be made at relevant power levels at the critical locations of strain that limit device reliability.
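The step from a measured displacement field to strain is numerical differentiation. A minimal one-dimensional sketch of that step (real ESPI maps are two-dimensional and noisy, which is why the work above develops model-based filtering):

```python
import numpy as np

def surface_strain(u, dx):
    """Normal strain along x from an in-plane displacement field u(x):
    eps_xx = du/dx. This is the quantity an ESPI displacement map is
    differentiated to obtain (1-D sketch only)."""
    return np.gradient(u, dx)

x = np.linspace(0.0, 1e-3, 101)  # a 1 mm line on the leadframe (illustrative)
u = 2e-4 * x                     # uniform 200-microstrain displacement field
eps = surface_strain(u, x[1] - x[0])
print(np.allclose(eps, 2e-4))  # True: recovers 200 microstrain everywhere
```

Because differentiation amplifies noise, raw ESPI displacements would give very noisy strain; the model-based tracking described above effectively regularizes this step.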

  3. Methodology for measuring current distribution effects in electrochromic smart windows.

    PubMed

    Engfeldt, Johnny Degerman; Georen, Peter; Lagergren, Carina; Lindbergh, Göran

    2011-10-10

    Electrochromic (EC) devices for use as smart windows have a large energy-saving potential when used in the construction and transport industries. When upscaling EC devices to window size, a well-known challenge is to design the EC device with a rapid and uniform switching between colored (charged) and bleached (discharged) states. A well-defined current distribution model, validated with experimental data, is a suitable tool for optimizing the electrical system design for rapid and uniform switching. This paper introduces a methodology, based on camera vision, for experimentally validating EC current distribution models. The key is the methodology's capability to both measure and simulate current distribution effects as transmittance distribution. This paper also includes simple models for coloring (charging) and bleaching (discharging), taking into account secondary current distribution with charge transfer resistance and ohmic effects. Some window-size model predictions are included to show the potential for using a validated EC current distribution model as a design tool. © 2011 Optical Society of America

  4. Measuring user experience in digital gaming: theoretical and methodological issues

    NASA Astrophysics Data System (ADS)

    Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte

    2007-01-01

There are innumerable concepts, terms and definitions for user experience; few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to creating both theoretically and methodologically sound methods for quantification of the rich user experience in different digital environments. Our approach is based on the idea that the experience received from content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We present our two basic measurement frameworks, which have been developed and tested on a large data set (n=2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, the contents for specific viewing devices or viewing situations.

  5. Experimental Methodology for Measuring Combustion and Injection-Coupled Responses

    NASA Technical Reports Server (NTRS)

    Cavitt, Ryan C.; Frederick, Robert A.; Bazarov, Vladimir G.

    2006-01-01

    A Russian scaling methodology for liquid rocket engines utilizing a single, full scale element is reviewed. The scaling methodology exploits the supercritical phase of the full scale propellants to simplify scaling requirements. Many assumptions are utilized in the derivation of the scaling criteria. A test apparatus design is presented to implement the Russian methodology and consequently verify the assumptions. This test apparatus will allow researchers to assess the usefulness of the scaling procedures and possibly enhance the methodology. A matrix of the apparatus capabilities for a RD-170 injector is also presented. Several methods to enhance the methodology have been generated through the design process.

  6. Methodological considerations for measuring glucocorticoid metabolites in feathers

    PubMed Central

    Berk, Sara A.; McGettrick, Julie R.; Hansen, Warren K.; Breuner, Creagh W.

    2016-01-01

    In recent years, researchers have begun to use corticosteroid metabolites in feathers (fCORT) as a metric of stress physiology in birds. However, there remain substantial questions about how to measure fCORT most accurately. Notably, small samples contain artificially high amounts of fCORT per millimetre of feather (the small sample artefact). Furthermore, it appears that fCORT is correlated with circulating plasma corticosterone only when levels are artificially elevated by the use of corticosterone implants. Here, we used several approaches to address current methodological issues with the measurement of fCORT. First, we verified that the small sample artefact exists across species and feather types. Second, we attempted to correct for this effect by increasing the amount of methanol relative to the amount of feather during extraction. We consistently detected more fCORT per millimetre or per milligram of feather in small samples than in large samples even when we adjusted methanol:feather concentrations. We also used high-performance liquid chromatography to identify hormone metabolites present in feathers and measured the reactivity of these metabolites against the most commonly used antibody for measuring fCORT. We verified that our antibody is mainly identifying corticosterone (CORT) in feathers, but other metabolites have significant cross-reactivity. Lastly, we measured faecal glucocorticoid metabolites in house sparrows and correlated these measurements with corticosteroid metabolites deposited in concurrently grown feathers; we found no correlation between faecal glucocorticoid metabolites and fCORT. We suggest that researchers should be cautious in their interpretation of fCORT in wild birds and should seek alternative validation methods to examine species-specific relationships between environmental challenges and fCORT. PMID:27335650

  7. Innovative methodologies and technologies for thermal energy release measurement.

    NASA Astrophysics Data System (ADS)

    Marotta, Enrica; Peluso, Rosario; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Chiodini, Giovanni; Mangiacapra, Annarita; Petrillo, Zaccaria; Sansivero, Fabio; Vilardo, Giuseppe; Marfe, Barbara

    2016-04-01

Volcanoes exchange heat, gases and other fluids between the interior of the Earth and its atmosphere, influencing processes both at the surface and above it. This work is devoted to improving knowledge of the parameters that control the anomalies in heat flux and chemical species emissions associated with the diffuse degassing processes of volcanic and hydrothermal zones. We are studying and developing innovative medium-range remote sensing technologies to measure the variations of heat flux and chemical emissions through time, in order to sharpen the definition of the activity state of a volcano and allow a better assessment of the related hazard and risk mitigation. The current methodologies used to measure heat flux (i.e. CO2 flux or temperature gradient) are inefficient or ineffective, and are unable to detect short- to medium-term (days to months) variation trends in the heat flux. Remote sensing of these parameters will allow measurements faster than the already accredited methods; it will therefore be both more effective and more efficient in case of emergency, and it will be used for quick routine monitoring. We are currently developing a method based on drone-borne IR cameras to measure the ground surface temperature which, in a purely conductive regime, is directly correlated to the shallow temperature gradient. The use of flying drones will allow a quick mapping of areas with thermal anomalies and a measurement of their temperature at distances on the order of hundreds of meters. Further development of remote sensing will be pursued through the use, on flying drones, of multispectral and/or hyperspectral sensors and UV scanners, in order to detect the amount of chemical species released into the atmosphere.

  8. Contact angle measurements under thermodynamic equilibrium conditions.

    PubMed

    Lages, Carol; Méndez, Eduardo

    2007-08-01

Precise control of the ambient humidity during contact angle measurements is needed to obtain stable and valid data. For this purpose, a simple low-cost device was designed, and several modified surfaces relevant to biosensor design were studied. Static contact angle values for these surfaces are lower than the advancing contact angles published for ambient conditions, indicating that thermodynamic equilibrium conditions are needed to avoid drop evaporation during the measurements.

  9. Measuring persistence: A literature review focusing on methodological issues

    SciTech Connect

    Wolfe, A.K.; Brown, M.A.; Trumble, D.

    1995-03-01

This literature review was conducted as part of a larger project to produce a handbook on the measurement of persistence. The past decade has marked the development of the concept of persistence and a growing recognition that the long-term impacts of demand-side management (DSM) programs warrant careful assessment. Although increasing attention has been paid to the topic of persistence, no clear consensus has emerged either about its definition or about the methods most appropriate for its measurement and analysis. This project strives to fill that gap by reviewing the goals, terminology, and methods of past persistence studies. It was conducted from the perspective of a utility that seeks to acquire demand-side resources and is interested in their long-term durability; it was not conducted from the perspective of the individual consumer. Over 30 persistence studies, articles, and protocols were examined for this report. The review begins by discussing the underpinnings of persistence studies: namely, the definitions of persistence and the purposes of persistence studies. Then, it describes issues relevant to both the collection and analysis of data on the persistence of energy and demand savings. Findings from persistence studies are also summarized. Throughout the review, four studies are used repeatedly to illustrate different methodological and analytical approaches to persistence, so that readers can track the data collection, data analysis, and findings of a set of comprehensive studies that represent alternative approaches.

  10. Object recognition testing: methodological considerations on exploration and discrimination measures.

    PubMed

    Akkerman, Sven; Blokland, Arjan; Reneerkens, Olga; van Goethem, Nick P; Bollen, Eva; Gijselaers, Hieronymus J M; Lieben, Cindy K J; Steinbusch, Harry W M; Prickaerts, Jos

    2012-07-01

    The object recognition task (ORT) is a popular one-trial learning test for animals. In the current study, we investigated several methodological issues concerning the task. Data were pooled from 28 ORT studies, containing 731 male Wistar rats. We investigated the relationship between three common absolute and relative discrimination measures, as well as their relation to exploratory activity. In this context, the effects of pre-experimental habituation, object familiarity, trial duration, retention interval and the amnesic drugs MK-801 and scopolamine were investigated. Our analyses showed that the ORT is very sensitive, capable of detecting subtle differences in memory (discrimination) and exploratory performance. As a consequence, it is susceptible to potential biases due to (injection) stress and side effects of drugs. Our data indicated that a minimum amount of exploration is required in the sample and test trials for stable significant discrimination performance. However, there was no relationship between the level of exploration in the sample trial and discrimination performance. In addition, the level of exploration in the test trial was positively related to the absolute discrimination measure, whereas this was not the case for relative discrimination measures, which correct for exploratory differences, making them more resistant to exploration biases. Animals appeared to remember object information over multiple test sessions. Therefore, when animals have encountered both objects in prior test sessions, the object preference observed in the test trial at 1 h retention intervals is probably due to a relative difference in familiarity between the objects in the test trial, rather than true novelty per se. Taken together, our findings suggest taking into consideration pre-experimental exposure (familiarization) to objects, habituation to treatment procedures, and the use of relative discrimination measures when using the ORT.
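The absolute and relative discrimination measures the abstract compares are simple functions of novel- and familiar-object exploration times. A minimal sketch (the d1/d2/d3 labels follow common ORT usage and the exploration times are purely illustrative, not this study's data):

```python
# Minimal sketch of ORT discrimination measures; labels d1 (absolute) and
# d2/d3 (relative) follow common usage, times are illustrative seconds.

def discrimination_indices(t_novel: float, t_familiar: float) -> dict:
    """Compute absolute and relative object-discrimination measures."""
    total = t_novel + t_familiar
    if total <= 0:
        raise ValueError("no exploration recorded in the test trial")
    d1 = t_novel - t_familiar   # absolute: sensitive to overall exploration level
    d2 = d1 / total             # relative: corrects for total exploration
    d3 = t_novel / total        # preference ratio: 0.5 = chance
    return {"d1": d1, "d2": d2, "d3": d3}

print(discrimination_indices(20.0, 10.0))
```

Because d2 and d3 divide by total exploration, they are the measures that resist the exploration biases discussed above.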

  11. Methodological Considerations for Hair Cortisol Measurements in Children

    PubMed Central

    Slominski, Radomir; Rovnaghi, Cynthia R.; Anand, Kanwaljeet J. S.

    2015-01-01

    Background Hair cortisol levels are used increasingly as a measure for chronic stress in young children. We propose modifications to the current methods used for hair cortisol analysis to more accurately determine reference ranges for hair cortisol across different populations and age groups. Methods The authors compared standard (finely cutting hair) vs. milled methods for hair processing (n=16), developed a 4-step extraction process for hair protein and cortisol (n=16), and compared liquid chromatography-mass spectrometry (LCMS) vs. ELISA assays for measuring hair cortisol (n=28). The extraction process included sequential incubations in methanol and acetone, repeated twice. Hair protein was measured via spectrophotometric ratios at 260/280 nm to indicate the hair dissolution state using a BioTek® plate reader and dedicated software. Hair cortisol was measured using an ELISA assay kit. Individual (n=13), pooled hair samples (n=12) with high, intermediate, and low cortisol values and the ELISA assay internal standards (n=3) were also evaluated by LCMS. Results Milled and standard methods showed highly correlated hair cortisol (rs=0.951, p<0.0001) and protein values (rs=0.902, p=0.0002), although higher yields of cortisol and protein were obtained from the standard method in 13/16 and 14/16 samples respectively (p<0.05). Four sequential extractions yielded additional amounts of protein (36.5%, 27.5%, 30.5%, 3.1%) and cortisol (45.4%, 31.1%, 15.1%, 0.04%) from hair samples. Cortisol values measured by LCMS and ELISA were correlated (rs=0.737; p<0.0001), although cortisol levels (median [IQR]) detected in the same samples by LCMS (38.7 [14.4, 136] ng/ml) were lower than by ELISA (172.2 [67.9, 1051] ng/ml). LCMS also detected cortisone, which comprised 13.4% (3.7%, 25.9%) of the steroids detected. 
Conclusion Methodological studies suggest that finely cutting hair with sequential incubations in methanol and acetone, repeated twice, extracts greater yields of cortisol

  12. Adaptability of laser diffraction measurement technique in soil physics methodology

    NASA Astrophysics Data System (ADS)

    Barna, Gyöngyi; Szabó, József; Rajkai, Kálmán; Bakacsi, Zsófia; Koós, Sándor; László, Péter; Hauk, Gabriella; Makó, András

    2016-04-01

    There are intentions all around the world to harmonize soils' particle size distribution (PSD) data obtained by laser diffractometer measurements (LDM) with those of the sedimentation techniques (pipette or hydrometer methods). Unfortunately, depending on the applied methodology (e.g. type of pre-treatment, kind of dispersant, etc.), PSDs from the sedimentation methods (which follow different standards) are dissimilar and can hardly be harmonized with each other. A need therefore arose to build up a database containing PSD values measured by the pipette method according to the Hungarian standard (MSZ-08.0205:1978) and by LDM according to a widespread and widely used procedure. In this publication the first results of statistical analysis of the new and growing PSD database are presented: 204 soil samples measured with the pipette method and LDM (Malvern Mastersizer 2000, HydroG dispersion unit) were compared. Applying the usual size limits to the LDM data, the clay fraction was highly underestimated and the silt fraction overestimated compared to the pipette method. Consequently, soil texture classes determined from the LDM measurements differ significantly from those of the pipette method. Following previous surveys, and to optimize the agreement between the two datasets, the clay/silt boundary for LDM was changed. Comparing the PSDs of the pipette method with those of the LDM, the modified size limits gave higher similarities for the clay and silt fractions. Extending the upper size limit of the clay fraction from 0.002 to 0.0066 mm, and correspondingly the lower size limit of the silt fraction, makes the pipette method and LDM more easily comparable. With the modified limit, higher correlations were also found between clay content and water vapor adsorption and specific surface area. Texture classes were also found to be less dissimilar. 
The difference between the results of the two kinds of PSD measurement methods could be further reduced by taking into account other routinely analyzed soil parameters
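The effect of moving the clay/silt boundary can be sketched directly on a cumulative PSD curve: read the cumulative percentage at each class boundary and difference the values. The sample PSD below is invented for illustration; only the two boundary values (0.002 mm and 0.0066 mm) come from the abstract:

```python
import numpy as np

# Recompute clay/silt/sand fractions from a cumulative laser-diffraction PSD
# under two clay/silt boundaries (0.002 mm vs 0.0066 mm). Sample PSD invented.

def fractions(diam_mm, cum_pct, clay_limit, silt_limit=0.05):
    """Interpolate the cumulative PSD (on log-diameter) at class boundaries."""
    logd = np.log10(diam_mm)
    clay = np.interp(np.log10(clay_limit), logd, cum_pct)
    silt = np.interp(np.log10(silt_limit), logd, cum_pct) - clay
    sand = 100.0 - clay - silt
    return clay, silt, sand

diam = np.array([0.0005, 0.002, 0.0066, 0.02, 0.05, 0.2, 2.0])   # mm
cum  = np.array([3.0, 10.0, 25.0, 45.0, 60.0, 85.0, 100.0])      # cumulative %

for limit in (0.002, 0.0066):
    clay, silt, sand = fractions(diam, cum, limit)
    print(f"clay/silt limit {limit} mm: clay={clay:.1f}%  silt={silt:.1f}%  sand={sand:.1f}%")
```

With the wider clay limit, material between 0.002 and 0.0066 mm moves from the silt to the clay fraction, which is the direction of correction the abstract reports.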

  13. Definition and Validation of a Methodology to Measure the Local Static Stiffness of Large Appendages Interfaces Using Dynamic Measurements

    NASA Astrophysics Data System (ADS)

    Bernasconi, M.; Rodriguez Senin, A.; Laduree, G.

    2014-06-01

    This paper describes the methodology developed under the Sentinel 1 spacecraft structure project to measure the local static stiffness of synthetic aperture radar antenna (SAR-A) interfaces using dynamic measurements. The methodology consists in measuring accelerance [acceleration/force] frequency response functions (FRF) at the SAR-A interfaces and at several points selected as boundary conditions used for the derivation of the local stiffness [1]. The accelerance FRF is used to calculate the flexibility FRF [displacement/force], from which the static term of the flexibility is extracted. The static term is obtained via a least-squares approximation at low frequency of the real part of the flexibility FRF (curve-fitting approach) and extrapolation of the curve down to 0 Hz. Since the test was performed with the launch vehicle adapter ring clamped, the direct results of these measurements lead to global stiffness values. To calculate the local stiffness values, the results were post-processed to subtract the contribution of the global deformation of the spacecraft structure. The local flexibility matrix at the SAR-A interfaces is calculated by imposing zero displacement at the points selected as virtual boundary conditions. Then, the local stiffness components were obtained by inverting the diagonal terms of the local flexibility matrix for the three translational and the two in-plane rotational degrees of freedom at 9 SAR-A interfaces. The results obtained using this methodology were validated with a classical static test at one of the interfaces, showing a good correlation between static and dynamic test results. It was concluded that this methodology is suitable for the verification of static stiffness of large appendage interfaces and can be applied to future missions that carry large payloads with critical structural interfaces.
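The accelerance-to-static-stiffness chain can be sketched on a synthetic single-degree-of-freedom FRF: divide the accelerance by (i·2πf)² to get flexibility, least-squares fit the real part at low frequency, and extrapolate to 0 Hz. The system values k, m, c are assumed for illustration, not taken from the paper:

```python
import numpy as np

# Sketch: accelerance FRF -> flexibility FRF -> low-frequency fit -> 0 Hz.
# Synthetic SDOF system; k, m, c are illustrative assumptions.

k, m, c = 1.0e7, 50.0, 200.0                  # N/m, kg, N*s/m
f = np.linspace(5.0, 40.0, 200)               # Hz, band well below resonance
w = 2.0 * np.pi * f
H_acc = -w**2 / (k - m * w**2 + 1j * c * w)   # accelerance [accel/force]

H_flex = H_acc / (1j * w) ** 2                # flexibility [displacement/force]
# least-squares fit of the real part as a function of f^2, extrapolated to 0 Hz
slope, static_flex = np.polyfit(f**2, H_flex.real, 1)
print(f"estimated static stiffness: {1.0 / static_flex:.3e} N/m (true: {k:.1e})")
```

The inverse of the extrapolated static flexibility recovers the stiffness to within a few percent here; the residual bias comes from truncating the fit at second order in frequency.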

  14. [Optimization of ultrasonic extraction conditions of safflower yellow from Carthamus tinctorius by response surface methodology].

    PubMed

    Sun, Yan-wen; He, Yu; Zhang, Ru-ping; Huang, Jia-wei; Gu, Man-cang

    2013-12-01

    To investigate the optimization of extraction conditions of safflower yellow from Carthamus tinctorius by response surface methodology. Experimental factors and levels were selected by one-factor tests, and then, according to the central composite experimental design principle, response surface methodology with three factors and three levels was used to establish a mathematical model to obtain the optimal extraction conditions, with hydroxysafflower yellow A as the target and its extraction yield as the response value. The optimal extraction conditions of safflower yellow were as follows: extraction temperature 55 °C, ratio of water to raw material 16:1, and extraction time 39 min, extracting three times. Under these conditions, the extraction yield of safflower yellow is 1.798%, and the relative error between the predicted and actual values is 2.758%. The optimized method can provide a reference for the efficient extraction of safflower yellow from Carthamus tinctorius
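The response-surface step common to studies like this one is fitting a second-order model to design-point yields and solving for the stationary point. In the sketch below the design points are illustrative and the yields are generated from an assumed quadratic response (optimum at T = 52 °C, t = 39 min), not the paper's data:

```python
import numpy as np

# Hedged RSM sketch: fit y = b0 + b1*T + b2*t + b11*T^2 + b22*t^2 + b12*T*t
# to design-point yields, then solve grad(y) = 0 for the optimum.
# Design and response are invented for illustration.

T = np.array([45.0, 45, 55, 55, 50, 50, 50, 60, 40])   # temperature, deg C
t = np.array([30.0, 50, 30, 50, 40, 40, 40, 40, 40])   # time, min

def assumed_yield(T, t):
    """Assumed quadratic response surface with optimum at (52 C, 39 min)."""
    return 1.8 - 0.002 * (T - 52.0) ** 2 - 0.001 * (t - 39.0) ** 2

y = assumed_yield(T, t)

X = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# stationary point: set the gradient of the fitted quadratic to zero
A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(A, -np.array([b[1], b[2]]))
print(f"predicted optimum: T = {opt[0]:.1f} C, t = {opt[1]:.1f} min")
```

Because the synthetic yields are exactly quadratic, the fit recovers the assumed optimum; with real replicate error one would also check lack-of-fit before trusting the stationary point.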

  15. Conditional measurements as probes of quantum dynamics

    SciTech Connect

    Siddiqui, Shabnam; Erenso, Daniel; Vyas, Reeta; Singh, Surendra

    2003-06-01

    We discuss conditional measurements as probes of quantum dynamics and show that they provide different ways to characterize quantum fluctuations. We illustrate this by considering the light from a subthreshold degenerate parametric oscillator. Analytic results and curves are presented to illustrate the behavior.

  16. Aircraft emission measurements by remote sensing methodologies at airports

    NASA Astrophysics Data System (ADS)

    Schäfer, Klaus; Jahn, Carsten; Sturm, Peter; Lechner, Bernhard; Bacher, Michael

    The emission indices of aircraft engine exhausts measured under operating conditions, which are needed to calculate precise emission inventories of airports, have not been available up to now. To determine these data, measurement campaigns were performed on idling aircraft at major European airports using non-intrusive spectroscopic methods such as Fourier transform infrared spectrometry and differential optical absorption spectroscopy. Emission indices for CO and NOx were calculated and compared to the values given in the International Civil Aviation Organisation (ICAO) database. Emission indices were determined for CO for 36 different aircraft engine types and for NOx for 24 different engine types. It was shown that for idling aircraft, CO emissions are underestimated using the ICAO database. The emission indices for NOx determined in this work are lower than those given in the ICAO database. In addition, a high variance of emission indices within each aircraft family and from engine to engine of the same engine type was found. During the same measurement campaigns, the emission indices for CO and NO of eight different types of auxiliary power units were investigated.

  17. Developing a Methodology for Measuring Stress Transients at Seismogenic Depth

    NASA Astrophysics Data System (ADS)

    Silver, P. G.; Niu, F.; Daley, T.; Majer, E.

    2005-05-01

    The dependence of crack properties on stress means that crustal seismic velocity exhibits stress dependence. This dependence constitutes, in principle, a powerful means of studying transient changes in stress at seismogenic depth through the repeat measurement of travel time from a controlled source. While the scientific potential of this stress dependence has been known for decades, time-dependent seismic imaging has yet to become a reliable means of measuring subsurface stress changes in fault-zone environments. This is due to 1) insufficient delay-time precision necessary to detect small changes in stress, and 2) the difficulty in establishing a reliable in-situ calibration between stress and seismic velocity. These two problems are coupled because the best sources of calibration, solid-earth tides and barometric pressure, produce weak stress perturbations of order 10^2-10^3 Pa that require precision in the measurement of the fractional velocity change dlnv of order 10^-6, based on laboratory experiments. We have thus focused on developing a methodology that is capable of providing this high level of precision. For example, we have shown that precision in dlnv is maximized when there are Q/π wavelengths in the source-receiver path. This relationship provides a means of selecting an optimal geometry and/or source characteristic frequency in the planning of experiments. We have initiated a series of experiments to demonstrate the detectability of these stress-calibration signals in progressively more tectonically relevant settings. Initial tests have been completed on the smallest scale, with two boreholes 17 m deep and 3 m apart. We have used a piezoelectric source (0.1 ms source pulse repeated every 100 ms) and a string of 24 hydrophones to record P waves with a dominant frequency of 10 kHz. Recording was conducted for 160 hours. The massive stacking of ~36,000 high-SNR traces/hr leads to delay-time precision of 6 ns (hour sampling) corresponding to dlnv
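The precision argument above is simple arithmetic: stacking N traces improves delay-time precision by roughly √N, and a delay change dτ maps to a fractional velocity change dlnv = -dτ/τ. The per-trace timing precision and P velocity below are assumptions for illustration; only the trace rate and borehole separation come from the abstract:

```python
import math

# Back-of-envelope stacking-precision sketch. per_trace_sigma and
# p_velocity are assumed values, not from the study.

per_trace_sigma = 1.0e-6            # s, assumed single-trace timing precision
traces_per_hour = 36_000            # from the abstract (~36,000 traces/hr)
sigma_stack = per_trace_sigma / math.sqrt(traces_per_hour)

path_length = 3.0                   # m, borehole separation (from the abstract)
p_velocity = 1500.0                 # m/s, assumed
tau = path_length / p_velocity      # travel time, s

dlnv_resolution = sigma_stack / tau
print(f"stacked precision ~{sigma_stack * 1e9:.1f} ns -> dlnv ~{dlnv_resolution:.1e}")
```

With these assumptions the stacked delay precision lands near the 6 ns reported in the abstract, and the resolvable dlnv is of order 10^-6, matching the target set by the tidal and barometric calibration signals.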

  18. The Role of Measuring in the Learning of Environmental Occupations and Some Aspects of Environmental Methodology

    ERIC Educational Resources Information Center

    István, Lüko

    2016-01-01

    Methodology is a neglected area of pedagogy, and within the teaching methodology of environmental specialists its cultivation is the most incomplete part. In this article I attempt to present environmental methodology from one part of the University of West Environmental Education workshop, based on measurement-centred experiential learning as an…

  19. Weak measurement and Bohmian conditional wave functions

    SciTech Connect

    Norsen, Travis; Struyve, Ward

    2014-11-15

    It was recently pointed out and demonstrated experimentally by Lundeen et al. that the wave function of a particle (more precisely, the wave function possessed by each member of an ensemble of identically-prepared particles) can be “directly measured” using weak measurement. Here it is shown that if this same technique is applied, with appropriate post-selection, to one particle from a perhaps entangled multi-particle system, the result is precisely the so-called “conditional wave function” of Bohmian mechanics. Thus, a plausibly operationalist method for defining the wave function of a quantum mechanical sub-system corresponds to the natural definition of a sub-system wave function which Bohmian mechanics uniquely makes possible. Similarly, a weak-measurement-based procedure for directly measuring a sub-system’s density matrix should yield, under appropriate circumstances, the Bohmian “conditional density matrix” as opposed to the standard reduced density matrix. Experimental arrangements to demonstrate this behavior–and also thereby reveal the non-local dependence of sub-system state functions on distant interventions–are suggested and discussed. - Highlights: • We study a “direct measurement” protocol for wave functions and density matrices. • Weakly measured states of entangled particles correspond to Bohmian conditional states. • Novel method of observing quantum non-locality is proposed.

  20. Determining seabird body condition using nonlethal measures.

    PubMed

    Jacobs, Shoshanah R; Elliott, Kyle; Guigueno, Mélanie F; Gaston, Anthony J; Redman, Paula; Speakman, John R; Weber, Jean-Michel

    2012-01-01

    Energy stores are critical for successful breeding, and longitudinal studies require nonlethal methods to measure energy stores ("body condition"). Nonlethal techniques for measuring energy reserves are seldom verified independently. We compare body mass, size-corrected mass (SCM), plasma lipids, and isotopic dilution with extracted total body lipid content in three seabird species (thick-billed murres Uria lomvia, all four measures; northern fulmars Fulmarus glacialis, three measures; and black-legged kittiwakes Rissa tridactyla, two measures). SCM and body mass were better predictors of total body lipids for the species with high percent lipids (fulmars; R2 = 0.5-0.6) than for the species with low percent lipids (murres and kittiwakes; R2 = 0.2-0.4). The relationship between SCM and percent body lipids, which we argue is often a better measure of condition, was also poor (R2 < 0.2) for species with low lipids. In a literature comparison of 17 bird species, percent lipids was the only predictor of the strength of the relationship between mass and total body lipids; we suggest that SCM be used as an index of energy stores only when lipids exceed 15% of body mass. Across all three species we measured, SCM based on the ordinary least squares regression of mass on the first principal component outperformed other measures. Isotopic dilution was a better predictor of both total body lipids and percent body lipids than were mass, SCM, or plasma lipids in murres. Total body lipids decreased through the breeding season at both sites, while total and neutral plasma lipid concentrations increased at one site but not another, suggesting mobilization of lipid stores for breeding. A literature review showed substantial variation in the reliability of plasma markers, and we recommend isotopic dilution (oxygen-18, plateau) for determination of energy reserves in birds where lipid content is below 15%.
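The size-corrected mass (SCM) index the abstract favours regresses body mass on the first principal component of structural measurements and takes the residuals as the condition index. A sketch on simulated morphometrics (all numbers invented, not the study's data):

```python
import numpy as np

# SCM sketch: OLS of body mass on PC1 of structural measurements;
# residuals = size-corrected mass. Simulated seabird-like data.

rng = np.random.default_rng(0)
n = 100
size = rng.normal(0.0, 1.0, n)                      # latent structural size
wing = 300 + 10 * size + rng.normal(0, 2, n)        # mm
tarsus = 35 + 1.5 * size + rng.normal(0, 0.4, n)    # mm
mass = 950 + 40 * size + rng.normal(0, 15, n)       # g; 15 g "condition" scatter

X = np.column_stack([wing, tarsus])
Xc = X - X.mean(axis=0)
# first principal component of the structural measurements
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ vt[0]

# OLS of mass on PC1; residuals are the size-corrected mass
slope, intercept = np.polyfit(pc1, mass, 1)
scm = mass - (intercept + slope * pc1)
print(f"SCM mean ~{scm.mean():.2f} g, sd ~{scm.std():.1f} g")
```

By construction the residuals average zero, and their spread reflects the mass variation not explained by structural size, which is what the index treats as "condition".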

  1. Mass-based condition measures and their relationship with fitness: in what condition is condition?

    PubMed Central

    Barnett, Craig A.; Suzuki, Toshitaka N.; Sakaluk, Scott K.; Thompson, Charles F.

    2015-01-01

    Mass or body-size measures of ‘condition’ are of central importance to the study of ecology and evolution, and it is often assumed that differences in condition measures are positively and linearly related to fitness. Using examples drawn from ecological studies, we show that indices of condition frequently are unlikely to be related to fitness in a linear fashion. Researchers need to be more explicit in acknowledging the limitations of mass-based condition measures and accept that, under some circumstances, they may not relate to fitness as traditionally assumed. Any relationship between a particular condition measure and fitness should first be empirically validated before condition is used as a proxy for fitness. In the absence of such evidence, researchers should explicitly acknowledge that assuming such a relationship may be unrealistic. PMID:26019406

  2. Methodology of C factor verification in conditions of the Czech Republic

    NASA Astrophysics Data System (ADS)

    Dvorakova, T.; Dostal, T.; David, V.; Kavka, P.; Krasa, J.; Koudelka, P.

    2012-04-01

    Universal Soil Loss Equation (USLE) is a widely used tool for the assessment of soil erosion in the Czech Republic as well as in many other countries. The C factor is one of the six factors composing the USLE. It represents vegetation cover and management on agricultural land. Its values were derived from a comparison of the soil loss from a plot with a given crop and management against the soil loss from a plot tilled as continuous fallow. The influence of vegetation cover on soil loss varies during the growing season, and it is important to determine a representative value of the C factor for each vegetative stage. According to the original methodology, the year is divided into 5 or 6 periods, and the expected protective effect for each of these periods is calculated separately. C factor values were first published in the Czech Republic in 1984 and have never been revised since. The values were taken from a USA catalog, and it is uncertain how well they were verified for our conditions. In fact, the C factor depends strongly on the technologies and climate of the particular country: crop varieties are cultivated [bred] differently and their protective effect may differ, and for some crops no C factor value exists at all. The Department of Irrigation, Drainage and Landscape Engineering, Faculty of Civil Engineering, CTU in Prague has dealt with the protection of soil and water for many years. For several years it has operated an experimental basin with three erosion plots identical to those from which the USLE was derived. On these plots the value of the C factor can be measured very easily, but compiling a new catalog of values this way would take a very long time. The Department therefore acquired a mobile rainfall simulator to obtain a larger data set quickly. The parameters of the instrument and the methodology for determining individual C factor values using the mobile rainfall simulator are the subject of this contribution. Water erosion is affected most by the rainfall event
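The period-wise construction of an annual C factor can be sketched as a weighted sum: each crop-stage period contributes its soil loss ratio (cropped plot loss over continuous-fallow loss) weighted by the share of annual rainfall erosivity falling in that period. All numbers below are illustrative, not catalog values:

```python
# Annual C factor as an erosivity-weighted sum of period soil loss ratios.
# The ratios and erosivity shares are invented for illustration.

# (soil loss ratio, share of annual erosivity R) per crop-stage period
periods = [
    (0.45, 0.15),   # seedbed
    (0.30, 0.25),   # establishment
    (0.15, 0.35),   # development / full canopy
    (0.25, 0.15),   # harvest
    (0.50, 0.10),   # residue / post-harvest
]

assert abs(sum(share for _, share in periods) - 1.0) < 1e-9  # shares must sum to 1

c_factor = sum(slr * share for slr, share in periods)
print(f"annual C factor = {c_factor:.3f}")
```

Rainfall-simulator measurements like those described above supply the per-period soil loss ratios; the erosivity shares come from local rainfall records.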

  3. Bone drilling methodology and tool based on position measurements.

    PubMed

    Díaz, Iñaki; Gil, Jorge Juan; Louredo, Marcos

    2013-11-01

    Bone drilling, despite being a very common procedure in hospitals around the world, becomes very challenging when performed close to organs such as the cochlea or when depth control is critical for avoiding damage to surrounding tissue. To date, several mechatronic prototypes have been proposed to assist surgeons by automatically detecting bone layer transitions and breakthroughs. However, none of them is currently accurate enough to be part of the surgeon's standard equipment. The present paper shows a test bench specially designed to evaluate prior methodologies and analyze their drawbacks. Afterward, a new layer detection methodology with improved performance is described and tested. Finally, the prototype of a portable mechatronic bone drill that takes advantage of the proposed detection algorithm is presented.

  4. Bridge stay cable condition assessment using vibration measurement techniques

    NASA Astrophysics Data System (ADS)

    Tabatabai, Habib; Mehrabi, Armin B.; Yen, Wen-huei P.

    1998-03-01

    In this paper, results of a research project sponsored by the Federal Highway Administration (FHWA) on a non-destructive method for measurement of stay cable forces in cable-stayed bridges are presented. This project included development and verification of specific analytical and experimental procedures for measurement of stay cable forces. In one set of procedures, a single laser vibrometer is used to measure low- level cable vibrations due to ambient (wind and traffic) excitation. The laser device allows rapid measurement of cable vibrations at distances of up to several hundred feet. Procedures are also developed for utilization of accelerometers attached to cables. Contact sensors are more appropriate when long-term remote monitoring is desired. Measured natural frequencies of vibration are related to cable tension through a mathematical formulation developed during the course of this study. This formulation includes the effects of cable sag-extensibility, bending stiffness, various boundary conditions, intermediate springs or dampers, etc. This method can also be used during construction in lieu of the 'lift off' method. The accuracy and effectiveness of this methodology was tested in the laboratory on a scaled model of a cable, and on two cable-stayed bridges. This ability to rapidly measure stay cable forces provides an opportunity for global condition assessment of these major structures.
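The core frequency-to-force relation can be sketched with the zeroth-order taut-string model, f_n = (n / 2L)·√(T/m), solved for tension. The paper's full formulation adds sag-extensibility, bending stiffness, and boundary-condition corrections that this sketch omits; the cable values are illustrative:

```python
# Taut-string estimate of stay cable tension from a measured natural
# frequency: T = 4 * m * L^2 * (f_n / n)^2. Ignores sag and bending
# stiffness corrections treated in the paper. Cable values illustrative.

def cable_tension(f_n: float, n: int, length: float, mass_per_len: float) -> float:
    """Estimate tension (N) from the n-th natural frequency (Hz)."""
    return 4.0 * mass_per_len * length**2 * (f_n / n) ** 2

# illustrative stay cable: 100 m long, 80 kg/m, first mode at 1.1 Hz
T = cable_tension(1.1, 1, 100.0, 80.0)
print(f"estimated tension ~ {T / 1e6:.2f} MN")
```

In practice several mode frequencies are measured and fit together, which is what lets the method separate tension from the sag and stiffness effects.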

  5. Optimisation of conditions for the extraction of casearins from Casearia sylvestris using response surface methodology.

    PubMed

    Bandeira, Karin F; Tininis, Aristeu G; Da Bolzani, Vanderlan S; Cavalheiro, Alberto J

    2006-01-01

    Optimal conditions for the extraction of casearins from Casearia sylvestris were determined using response surface methodology. The maceration and sonication extraction techniques were performed using a 3 x 3 x 3 full factorial design including three acidity conditions, three solvents of different polarities and three extraction times. The yields and selectivities of the extraction of casearins were significantly influenced by acidity conditions. Taking into account all variables tested, the optimal conditions for maceration extraction were estimated to involve treatment with dichloromethane saturated with ammonium hydroxide for 26 h. Similar yields and selectivities for casearins were determined for sonication extraction using the same solvent but for the much shorter time of 1 h. The best results for stabilisation of the fresh plant material were obtained using leaves that had been oven dried at 40 degrees C for 48 h.

  6. Novel methodology for accurate resolution of fluid signatures from multi-dimensional NMR well-logging measurements

    NASA Astrophysics Data System (ADS)

    Anand, Vivek

    2017-03-01

    A novel methodology for accurate fluid characterization from multi-dimensional nuclear magnetic resonance (NMR) well-logging measurements is introduced. This methodology overcomes a fundamental challenge of poor resolution of features in multi-dimensional NMR distributions due to low signal-to-noise ratio (SNR) of well-logging measurements. Based on an unsupervised machine-learning concept of blind source separation, the methodology resolves fluid responses from simultaneous analysis of large quantities of well-logging data. The multi-dimensional NMR distributions from a well log are arranged in a database matrix that is expressed as the product of two non-negative matrices. The first matrix contains the unique fluid signatures, and the second matrix contains the relative contributions of the signatures for each measurement sample. No a priori information or subjective assumptions about the underlying features in the data are required. Furthermore, the dimensionality of the data is reduced by several orders of magnitude, which greatly simplifies the visualization and interpretation of the fluid signatures. Compared to traditional methods of NMR fluid characterization which only use the information content of a single measurement, the new methodology uses the orders-of-magnitude higher information content of the entire well log. Simulations show that the methodology can resolve accurate fluid responses in challenging SNR conditions. The application of the methodology to well-logging data from a heavy oil reservoir shows that individual fluid signatures of heavy oil, water associated with clays and water in interstitial pores can be accurately obtained.
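The abstract's blind-source-separation idea, factoring a non-negative data matrix into fluid signatures and per-sample contributions, is structurally a non-negative matrix factorization. A sketch using plain multiplicative updates on synthetic data (the "fluid" signatures and mixing weights are invented; the paper's own algorithm is not specified here):

```python
import numpy as np

# NMF sketch: D (samples x NMR amplitudes) ~ W @ H, H = fluid signatures,
# W = per-sample contributions. Lee-Seung multiplicative updates.
# Synthetic rank-2 data; signatures are invented.

rng = np.random.default_rng(1)
true_H = np.array([[1.0, 0.8, 0.1, 0.0],     # "fluid 1" signature (assumed)
                   [0.0, 0.2, 0.9, 1.0]])    # "fluid 2" signature (assumed)
true_W = rng.random((50, 2))
D = true_W @ true_H + 0.01 * rng.random((50, 4))   # small positive noise

k = 2
W = rng.random((50, k)) + 0.1
H = rng.random((k, 4)) + 0.1
for _ in range(500):                          # multiplicative updates
    H *= (W.T @ D) / (W.T @ W @ H + 1e-12)
    W *= (D @ H.T) / (W @ H @ H.T + 1e-12)

err = np.linalg.norm(D - W @ H) / np.linalg.norm(D)
print(f"relative reconstruction error: {err:.3f}")
```

The non-negativity constraint is what makes the recovered rows of H interpretable as physical fluid signatures, mirroring the dimensionality reduction the abstract describes.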

  7. Calibration methodology for proportional counters applied to yield measurements of a neutron burst.

    PubMed

    Tarifeño-Saldivia, Ariel; Mayer, Roberto E; Pavez, Cristian; Soto, Leopoldo

    2014-01-01

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
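The statistical idea can be sketched as charge bookkeeping: when pulses pile up, the number of detected neutrons is estimated from the burst's accumulated charge divided by the mean single-event charge measured during pulse-mode calibration. All charge values below are illustrative assumptions, not the paper's calibration data:

```python
# Estimate detected events in a pile-up burst from accumulated charge and
# the calibrated single-event charge distribution. Values illustrative.

mean_single_charge = 0.80e-12   # C, mean charge per detected neutron (calibration)
sd_single_charge = 0.25e-12     # C, spread of the single-event charge
total_charge = 3.2e-9           # C, integrated charge from the burst

n_events = total_charge / mean_single_charge
# relative uncertainty: Poisson counting plus single-event charge spread
rel_unc = ((1.0 + (sd_single_charge / mean_single_charge) ** 2) / n_events) ** 0.5
print(f"detected events ~ {n_events:.0f} +/- {n_events * rel_unc:.0f}")
```

The charge-spread term shows why a careful pulse-mode calibration of the single-event distribution, not just its mean, matters for the quoted accuracy.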

  8. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    SciTech Connect

    Tarifeño-Saldivia, Ariel; Pavez, Cristian; Soto, Leopoldo; Mayer, Roberto E.

    2014-01-15

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.

  9. Measurement-based auralization methodology for the assessment of noise mitigation measures

    NASA Astrophysics Data System (ADS)

    Thomas, Pieter; Wei, Weigang; Van Renterghem, Timothy; Botteldooren, Dick

    2016-09-01

    The effect of noise mitigation measures is generally expressed by noise levels only, neglecting the listener's perception. In this study, an auralization methodology is proposed that enables an auditive preview of noise abatement measures for road traffic noise, based on the direction dependent attenuation of a priori recordings made with a dedicated 32-channel spherical microphone array. This measurement-based auralization has the advantage that all non-road traffic sounds that create the listening context are present. The potential of this auralization methodology is evaluated through the assessment of the effect of an L-shaped mound. The angular insertion loss of the mound is estimated by using the ISO 9613-2 propagation model, the Pierce barrier diffraction model and the Harmonoise point-to-point model. The realism of the auralization technique is evaluated by listening tests, indicating that listeners had great difficulty in differentiating between a posteriori recordings and auralized samples, which shows the validity of the followed approaches.
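The central auralization step, attenuating each direction-dependent channel by the barrier's angular insertion loss before summing, can be sketched on two synthetic channels. The insertion-loss values are illustrative and do not come from the ISO 9613-2, Pierce, or Harmonoise models named above:

```python
import numpy as np

# Weight direction-dependent channels by angular insertion loss, then sum.
# Two synthetic tones stand in for 32 spherical-array channels.

fs = 8000
n = np.arange(fs)
channels = np.vstack([
    np.sin(2 * np.pi * 200 * n / fs),   # channel from direction 1
    np.sin(2 * np.pi * 500 * n / fs),   # channel from direction 2
])
il_db = np.array([12.0, 3.0])           # assumed insertion loss per direction, dB

gains = 10.0 ** (-il_db / 20.0)         # dB -> linear amplitude
mitigated = (gains[:, None] * channels).sum(axis=0)
baseline = channels.sum(axis=0)
level_change = 20.0 * np.log10(np.std(mitigated) / np.std(baseline))
print(f"overall level change: {level_change:.1f} dB")
```

Because the a priori recording already contains the non-traffic context sounds, only the traffic-dominated directions are attenuated in the real system; the summed signal is then what listeners hear in the listening tests.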

  10. Methodology for determining time-dependent mechanical properties of tuff subjected to near-field repository conditions

    SciTech Connect

    Blacic, J.D.; Andersen, R.

    1983-01-01

    We have established a methodology to determine the time dependence of strength and transport properties of tuff under conditions appropriate to a nuclear waste repository. Exploratory tests to determine the approximate magnitudes of thermomechanical property changes are nearly complete. In this report we describe the capabilities of an apparatus designed to precisely measure the time-dependent deformation and permeability of tuff at simulated repository conditions. Preliminary tests with this new apparatus indicate that microclastic creep failure of tuff occurs over a narrow strain range with little precursory Tertiary creep behavior. In one test, deformation under conditions of slowly decreasing effective pressure resulted in failure, whereas some strain indicators showed a decreasing rate of strain.

  11. Alcohol Measurement Methodology in Epidemiology: Recent Advances and Opportunities

    PubMed Central

    Greenfield, Thomas K.; Kerr, William C.

    2009-01-01

    Aim: To review and discuss measurement issues in survey assessment of alcohol consumption for epidemiological studies. Methods: The following areas are considered: implications of cognitive studies of question answering, like self-referenced schemata of drinking, reference period and retrospective recall, as well as the assets and liabilities of types of current (e.g., food frequency, quantity frequency, graduated frequencies, and heavy drinking indicators) and lifetime drinking measures. Finally, we consider units of measurement and improving measurement by detailing the ethanol content of drinks in natural settings. Results and conclusions: Cognitive studies suggest inherent limitations in the measurement enterprise, yet diary studies show promise of broadly validating methods that assess a range of drinking amounts per occasion; improvements in survey measures of drinking in the life course are indicated; attending in detail to on- and off-premise drink pour sizes and ethanol concentrations of various beverages shows promise of narrowing the coverage gap plaguing survey alcohol measurement. PMID:18422826

  12. Response Surface Methodology: An Extensive Potential to Optimize in vivo Photodynamic Therapy Conditions

    SciTech Connect

    Tirand, Loraine; Bastogne, Thierry; Bechet, Denise M.Sc.; Linder, Michel; Thomas, Noemie; Frochot, Celine; Guillemin, Francois; Barberi-Heyob, Muriel

    2009-09-01

    Purpose: Photodynamic therapy (PDT) is based on the interaction of a photosensitizing (PS) agent, light, and oxygen. Few new PS agents are being developed to the in vivo stage, partly because of the difficulty in finding the right treatment conditions. Response surface methodology, an empirical modeling approach based on data resulting from a set of designed experiments, was suggested as a rational solution with which to select in vivo PDT conditions by using a new peptide-conjugated PS agent targeting neuropilin-1. Methods and Materials: A Doehlert experimental design was selected to model effects and interactions of the PS dose, fluence, and fluence rate on the growth of U87 human malignant glioma cell xenografts in nude mice, using a fixed drug-light interval. All experimental results were computed by Nemrod-W software and Matlab. Results: Intrinsic diameter growth rate, a tumor growth parameter independent of the initial volume of the tumor, was selected as the response variable and was compared to tumor growth delay and relative tumor volumes. With only 13 experimental conditions tested, an optimal PDT condition was selected (PS agent dose, 2.80 mg/kg; fluence, 120 J/cm²; fluence rate, 85 mW/cm²). Treatment of glioma-bearing mice with the peptide-conjugated PS agent, followed by the optimized PDT condition, showed a statistically significant improvement in delaying tumor growth compared with animals that received PDT with the nonconjugated PS agent. Conclusions: Response surface methodology appears to be a useful experimental approach for rapid testing of different treatment conditions and determination of optimal values of PDT factors for any PS agent.

  13. A methodology for hard/soft information fusion in the condition monitoring of aircraft

    NASA Astrophysics Data System (ADS)

    Bernardo, Joseph T.

    2013-05-01

    Condition-based maintenance (CBM) refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. Traditionally, CBM involves equipping machinery with electronic sensors that continuously monitor components and collect data for analysis. The addition of the multisensory capability of human cognitive functions (i.e., sensemaking, problem detection, planning, adaptation, coordination, naturalistic decision making) to traditional CBM may create a fuller picture of machinery condition. Cognitive systems engineering techniques provide an opportunity to utilize a dynamic resource: people acting as soft sensors. The literature is extensive on techniques to fuse data from electronic sensors, but little work exists on fusing data from humans with that from electronic sensors (i.e., hard/soft fusion). The purpose of my research is to explore, observe, investigate, analyze, and evaluate the fusion of pilot and maintainer knowledge, experiences, and sensory perceptions with digital maintenance resources. Hard/soft information fusion has the potential to increase problem detection capability, improve flight safety, and increase mission readiness. This proposed project consists of the creation of a methodology based upon the Living Laboratories framework, a research methodology built upon cognitive engineering principles. This study performs a critical assessment of the concept, which will support development of activities to demonstrate hard/soft information fusion in operationally relevant scenarios of aircraft maintenance. It consists of fieldwork and knowledge elicitation to inform a simulation and a prototype.

  14. Application of Response Surface Methodology to Evaluation of Bioconversion Experimental Conditions

    PubMed Central

    Cheynier, Véronique; Feinberg, Max; Chararas, Constantin; Ducauze, Christian

    1983-01-01

    Using Candida tenuis, a yeast isolated from the digestive tube of the larva of Phoracantha semipunctata (Cerambycidae, Coleoptera), we were able to demonstrate the bioconversion of citronellal to citronellol. Response surface methodology was used to achieve the optimization of the experimental conditions for that bioconversion process. To study the proposed second-order polynomial model, we used a central composite experimental design with multiple linear regression to estimate the model coefficients of the five selected factors believed to influence the bioconversion process. Only four were demonstrated to be predominant: the incubation pH, temperature, time, and the amount of substrate. The best reduction yields (close to 90%) were obtained with alkaline pH conditions (pH 7.5), a low temperature (25°C), a small amount of substrate (15 μl), and short incubation time (16 h). This methodology was very efficient: only 36 experiments were necessary to assess these conditions, and model adequacy was very satisfactory as the coefficient of determination was 0.9411. PMID:16346211
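As an illustration of the response-surface machinery described above (reduced to two factors for brevity; the design points and data below are hypothetical, not the study's), a second-order polynomial model can be fitted by least squares and summarized with a coefficient of determination:

```python
import numpy as np

def quadratic_design_matrix(X):
    """Columns: 1, x1, x2, x1^2, x2^2, x1*x2 (two-factor second-order model)."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

def fit_response_surface(X, y):
    """Least-squares fit of the second-order model; returns coefficients and R^2."""
    A = quadratic_design_matrix(X)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return beta, 1.0 - ss_res / ss_tot

# Face-centered central composite design for two coded factors (illustrative).
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
              [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], dtype=float)
```

The model coefficients then locate the optimum inside the coded region, exactly as the coefficient of determination (0.9411 in the study) summarizes model adequacy.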

  15. A methodological framework for linking bioreactor function to microbial communities and environmental conditions.

    PubMed

    de los Reyes, Francis L; Weaver, Joseph E; Wang, Ling

    2015-06-01

    In the continuing quest to relate microbial communities in bioreactors to function and environmental and operational conditions, engineers and biotechnologists have adopted the latest molecular and 'omic methods. Despite the large amounts of data generated, gaining mechanistic insights and using the data for predictive and practical purposes is still a huge challenge. We present a methodological framework that can guide experimental design, and discuss specific issues that can affect how researchers generate and use data to elucidate the relationships. We also identify, in general terms, bioreactor research opportunities that appear promising. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. A comparison methodology for measured and predicted displacement fields in modal analysis

    NASA Astrophysics Data System (ADS)

    Sebastian, C. M.; López-Alba, E.; Patterson, E. A.

    2017-07-01

    Recent advances in experimental mechanics have enabled full-field measurements of deformation fields and, particularly in the field of solid mechanics, methodologies have been proposed for utilizing these fields in the validation of computational models. However, the comparison of modal shapes and the path from the undeformed shape to the deformed shape at the extreme of a vibration cycle is not straightforward. Therefore, a new method to compare vibration data from experiment to simulations is presented which uses full-field experimental data from the entire cycle of vibration. Here, the first three modes of vibration of an aerospace panel were compared, covering a frequency range of 14-59 Hz and maximum out-of-plane displacements of 2 mm. Two different comparison methodologies are considered; the first is the use of confidence bands, previously explored for quasi-static loading, the second is the use of a concordance correlation coefficient, which provides quantifiable information about the validity of the simulation. In addition, three different simulation conditions were considered, representing a systematic refinement of the model. It was found that meaningful conclusions can be drawn about the simulation by comparing individual components of deformation from the image decomposition process, such as the relative phase and magnitude. It was ultimately found that the best performing model did not entirely fall within the confidence bounds for all conditions, but returned a concordance correlation coefficient of nearly 70% for all three modes.
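The concordance correlation coefficient used as a validation metric above can be sketched with NumPy, following Lin's population-moment form (the function name is ours, not from the paper):

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between two 1-D arrays.

    Unlike Pearson's r, it penalizes shifts and scale differences, so it
    rewards agreement with the 45-degree line, not just linear association.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    # Population (biased) variances and covariance, as in Lin (1989).
    vx, vy = x.var(), y.var()
    sxy = ((x - mx) * (y - my)).mean()
    return 2.0 * sxy / (vx + vy + (mx - my) ** 2)
```

Perfect agreement gives 1.0; a perfectly correlated but rescaled prediction scores strictly below 1.0.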

  17. Recurrence measure of conditional dependence and applications

    NASA Astrophysics Data System (ADS)

    Ramos, Antônio M. T.; Builes-Jaramillo, Alejandro; Poveda, Germán; Goswami, Bedartha; Macau, Elbert E. N.; Kurths, Jürgen; Marwan, Norbert

    2017-05-01

    Identifying causal relations from observational data sets has posed great challenges in data-driven causality inference studies. One of the successful approaches to detect direct coupling in the information theory framework is transfer entropy. However, the core of entropy-based tools lies on the probability estimation of the underlying variables. Here we propose a data-driven approach for causality inference that incorporates recurrence plot features into the framework of information theory. We define it as the recurrence measure of conditional dependence (RMCD), and we present some applications. The RMCD quantifies the causal dependence between two processes based on joint recurrence patterns between the past of the possible driver and present of the potentially driven, excepting the contribution of the contemporaneous past of the driven variable. Finally, it can unveil the time scale of the influence of the sea-surface temperature of the Pacific Ocean on the precipitation in the Amazonia during recent major droughts.
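The RMCD itself conditions on the driven variable's own past; as a simplified sketch of its recurrence-plot ingredients only (not the authors' full method, and with hypothetical data), binary recurrence matrices and a joint recurrence rate can be computed as:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i, j] = 1 when |x_i - x_j| <= eps."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])
    return (d <= eps).astype(int)

def joint_recurrence_rate(x, y, eps):
    """Fraction of time pairs that are recurrent in both series simultaneously."""
    jr = recurrence_matrix(x, eps) & recurrence_matrix(y, eps)
    return jr.mean()
```

Coupled processes share recurrence structure, so their joint recurrence rate exceeds what independence would predict; RMCD builds on such joint patterns with the driver's past.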

  18. Recurrence measure of conditional dependence and applications.

    PubMed

    Ramos, Antônio M T; Builes-Jaramillo, Alejandro; Poveda, Germán; Goswami, Bedartha; Macau, Elbert E N; Kurths, Jürgen; Marwan, Norbert

    2017-05-01

    Identifying causal relations from observational data sets has posed great challenges in data-driven causality inference studies. One of the successful approaches to detect direct coupling in the information theory framework is transfer entropy. However, the core of entropy-based tools lies on the probability estimation of the underlying variables. Here we propose a data-driven approach for causality inference that incorporates recurrence plot features into the framework of information theory. We define it as the recurrence measure of conditional dependence (RMCD), and we present some applications. The RMCD quantifies the causal dependence between two processes based on joint recurrence patterns between the past of the possible driver and present of the potentially driven, excepting the contribution of the contemporaneous past of the driven variable. Finally, it can unveil the time scale of the influence of the sea-surface temperature of the Pacific Ocean on the precipitation in the Amazonia during recent major droughts.

  19. Eddy correlation measurements in wet environmental conditions

    NASA Astrophysics Data System (ADS)

    Cuenca, R. H.; Migliori, L.; O'Kane, J. P.

    2003-04-01

    The lower Feale catchment is a low-lying peaty area of 200 km^2 situated in southwest Ireland that is subject to inundation by flooding. The catchment lies adjacent to the Feale River and is subject to tidal signals as well as runoff processes. Various mitigation strategies are being investigated to reduce the damage due to flooding. Part of the effort has required development of a detailed hydrologic balance for the study area which is a wet pasture environment with local field drains that are typically flooded. An eddy correlation system was installed in the summer of 2002 to measure components of the energy balance, including evapotranspiration, along with special sensors to measure other hydrologic variables particular to this study. Data collected will be essential for validation of surface flux models to be developed for this site. Data filtering is performed using a combination of software developed by the Boundary-Layer Group (BLG) at Oregon State University together with modifications made to this system for conditions at this site. This automated procedure greatly reduces the tedious inspection of individual records. The package of tests, developed by the BLG for both tower and aircraft high frequency data, checks for electronic spiking, signal dropout, unrealistic magnitudes, extreme higher moment statistics, as well as other error scenarios not covered by the instrumentation diagnostics built into the system. Critical parameter values for each potential error were developed by applying the tests to real fast response turbulent time series. Potential instrumentation problems, flux sampling problems, and unusual physical situations records are flagged for removal or further analysis. A final visual inspection step is required to minimize rejection of physically unusual but real behavior in the time series. The problems of data management, data quality control, individual instrumentation sensitivity, potential underestimation of latent and sensible heat

  20. Optimization of hydrolysis conditions for bovine plasma protein using response surface methodology.

    PubMed

    Seo, Hyun-Woo; Jung, Eun-Young; Go, Gwang-Woong; Kim, Gap-Don; Joo, Seon-Tea; Yang, Han-Sul

    2015-10-15

    The purpose of this study was to establish optimal conditions for the hydrolysis of bovine plasma protein. Response surface methodology was used to model and optimize responses [degree of hydrolysis (DH), 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical-scavenging activity and Fe(2+)-chelating activity]. Hydrolysis conditions, such as hydrolysis temperature (46.6-63.4 °C), hydrolysis time (98-502 min), and hydrolysis pH (6.32-9.68) were selected as the main processing conditions in the hydrolysis of bovine plasma protein. Optimal conditions for maximum DH (%), DPPH radical-scavenging activity (%) and Fe(2+)-chelating activity (%) of the hydrolyzed bovine plasma protein were respectively established. We discovered the following three conditions for optimal hydrolysis of bovine plasma: pH of 7.82-8.32, temperature of 54.1 °C, and time of 338.4-398.4 min. We consequently succeeded in hydrolyzing bovine plasma protein under these conditions and confirmed the various desirable properties of optimal hydrolysis.

  1. Measuring service line competitive position. A systematic methodology for hospitals.

    PubMed

    Studnicki, J

    1991-01-01

    To mount a broad effort aimed at improving their competitive position for some service or group of services, hospitals have begun to pursue product line management techniques. A few hospitals have even reorganized completely under the product line framework. The benefits include focusing accountability for operations and results, facilitating coordination between departments and functions, stimulating market segmentation, and promoting rigorous examination of new and existing programs. As part of its strategic planning process, a suburban Baltimore hospital developed a product line management methodology with six basic steps: (1) define the service lines (which they did by grouping all existing diagnosis-related groups into 35 service lines), (2) determine the contribution of each service line to total inpatient volume, (3) determine trends in service line volumes (by comparing data over time), (4) derive a useful comparison group (competing hospitals or groups of hospitals with comparable size, scope of services, payer mix, and financial status), (5) review multiple time frames, and (6) summarize the long- and short-term performance of the hospital's service lines to focus further analysis. This type of systematic and disciplined analysis can become part of a permanent strategic intelligence program. When hospitals have such a program in place, their market research, planning, budgeting, and operations will be tied together in a true management decision support system.

  2. Modeling of the effect of freezer conditions on the hardness of ice cream using response surface methodology.

    PubMed

    Inoue, K; Ochi, H; Habara, K; Taketsuka, M; Saito, H; Ichihashi, N; Iwatsuki, K

    2009-12-01

    The effect of conventional continuous freezer parameters [mix flow (L/h), overrun (%), drawing temperature (°C), cylinder pressure (kPa), and dasher speed (rpm)] on the hardness of ice cream under varying measured temperatures (-5, -10, and -15 °C) was investigated systematically using response surface methodology (central composite face-centered design), and the relationships were expressed as statistical models. The range (maximum and minimum values) of each freezer parameter was set according to the actual capability of the conventional freezer and applicability to the manufacturing process. Hardness was measured using a penetrometer. These models showed that overrun and drawing temperature had significant effects on hardness. The models can be used to optimize freezer conditions to make ice cream of the least possible hardness under the highest overrun (120%) and a drawing temperature of approximately -5.5 °C (slightly warmer than the lowest drawing temperature of -6.5 °C) within the range of this study. With reference to the structural elements of the ice cream, we suggest that the volume of overrun and ice crystal content, ice crystal size, and fat globule destabilization affect the hardness of ice cream. In addition, the combination of a simple instrumental parameter and response surface methodology allows us to show the relation between freezer conditions and one of the most important properties, hardness, visually and quantitatively on the practical level.

  3. Causality analysis in business performance measurement system using system dynamics methodology

    NASA Astrophysics Data System (ADS)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality assumption have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using an econometric analysis, the Granger causality test, on 45 data points. However, well-established causality models proved insufficient, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations with insufficient historical data. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts, utilizing 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. The existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer modeling and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity, and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, and its robustness and validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very limited work has been done.
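A minimal NumPy-only sketch of the Granger causality test mentioned above, comparing a restricted lag regression (the target's own lags) against an unrestricted one (adding the candidate driver's lags); the function names and the simulated two-series example are illustrative, not the study's 45-point data set:

```python
import numpy as np

def _lags(v, p, n):
    """Matrix whose columns are v[t-1], ..., v[t-p] for t = p .. n-1."""
    return np.column_stack([v[p - k: n - k] for k in range(1, p + 1)])

def granger_f_stat(x, y, p=1):
    """F statistic for 'x Granger-causes y' from two lag regressions."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(y)
    Y = y[p:]
    R = np.column_stack([np.ones(n - p), _lags(y, p, n)])  # restricted: y's own lags
    U = np.column_stack([R, _lags(x, p, n)])               # unrestricted: plus x's lags
    rss = lambda A: float(((Y - A @ np.linalg.lstsq(A, Y, rcond=None)[0]) ** 2).sum())
    rss_r, rss_u = rss(R), rss(U)
    dof = len(Y) - U.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / dof)
```

A large F statistic in one direction but not the other is the unidirectional pattern the Strategy Map assumes; comparable statistics in both directions suggest the bidirectional causality the study reports.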

  4. Measurement-based reliability prediction methodology. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Linn, Linda Shen

    1991-01-01

    In the past, analytical and measurement based models were developed to characterize computer system behavior. An open issue is how these models can be used, if at all, for system design improvement. The issue is addressed here. A combined statistical/analytical approach to use measurements from one environment to model the system failure behavior in a new environment is proposed. A comparison of the predicted results with the actual data from the new environment shows a close correspondence.

  5. Aqueduct: a methodology to measure and communicate global water risks

    NASA Astrophysics Data System (ADS)

    Gassert, Francis; Reig, Paul

    2013-04-01

    , helping non-expert audiences better understand and evaluate risks facing water users. This presentation will discuss the methodology used to combine the indicator values into aggregated risk scores and lessons learned from working with diverse audiences in academia, development institutions, and the public and private sectors.

  6. Electrochemical treatment of deproteinated whey wastewater and optimization of treatment conditions with response surface methodology.

    PubMed

    Güven, Güray; Perendeci, Altunay; Tanyolaç, Abdurrahman

    2008-08-30

    Electrochemical treatment of deproteinated whey wastewater produced during cheese manufacture was studied as an alternative treatment method for the first time in the literature. Through the preliminary batch runs, the appropriate electrode material was determined to be iron, owing to its high removal efficiency of chemical oxygen demand (COD) and turbidity. The electrochemical treatment conditions were optimized through response surface methodology (RSM), where applied voltage was kept in the range, electrolyte concentration was minimized, and waste concentration and COD removal percentage were maximized at 25 °C. Optimum conditions at 25 °C were estimated through RSM as 11.29 V applied voltage, 100% waste concentration (containing 40 g/L lactose) and 19.87 g/L electrolyte concentration to achieve 29.27% COD removal. However, the highest COD removal through the set of runs was found to be 53.32% within 8 h. These results reveal the applicability of electrochemical treatment to deproteinated whey wastewater as an alternative advanced wastewater treatment method.

  7. Conditioning Methodologies for DanceSport: Lessons from Gymnastics, Figure Skating, and Concert Dance Research.

    PubMed

    Outevsky, David; Martin, Blake Cw

    2015-12-01

    Dancesport, the competitive branch of ballroom dancing, places high physiological and psychological demands on its practitioners, but pedagogical resources in these areas for this dance form are limited. Dancesport competitors could benefit from strategies used in other aesthetic sports. In this review, we identify conditioning methodologies from gymnastics, figure skating, and contemporary, modern, and ballet dance forms that could have relevance and suitability for dancesport training, and propose several strategies for inclusion in the current dancesport curriculum. We reviewed articles derived from Google Scholar, PubMed, ScienceDirect, Taylor & Francis Online, and Web of Science search engines and databases, with publication dates from 1979 to 2013. The keywords included MeSH terms: dancing, gymnastics, physiology, energy metabolism, physical endurance, and range of motion. Out of 47 papers examined, 41 papers met the inclusion criteria (validity of scientific methods, topic relevance, transferability to dancesport, publication date). Quality and validity of the data were assessed by examining the methodologies in each study and comparing studies on similar populations as well as across time using the PRISMA 2009 checklist and flowchart. The relevant research suggests that macro-cycle periodization planning, aerobic and anaerobic conditioning, range of motion and muscular endurance training, and performance psychology methods have potential for adaptation for dancesport training. Dancesport coaches may help their students fulfill their ambitions as competitive athletes and dance artists by adapting the relevant performance enhancement strategies from gymnastics, figure skating, and concert dance forms presented in this paper.

  8. Measuring Individual Differences in Decision Biases: Methodological Considerations

    PubMed Central

    Aczel, Balazs; Bago, Bence; Szollosi, Aba; Foldes, Andrei; Lukacs, Bence

    2015-01-01

    Individual differences in people's susceptibility to heuristics and biases (HB) are often measured by multiple-bias questionnaires consisting of one or a few items for each bias. This research approach relies on the assumptions that (1) different versions of a decision bias task measure are interchangeable, as they measure the same cognitive failure; and (2) some combination of these tasks measures the same underlying construct. Based on these assumptions, in Study 1 we developed two versions of a new decision bias survey for which we modified 13 HB tasks to increase their comparability, construct validity, and the participants' motivation. The analysis of the responses (N = 1279) showed weak internal consistency within the surveys and a great level of discrepancy between the extracted patterns of the underlying factors. To explore these inconsistencies, in Study 2 we used three original examples of HB tasks for each of seven biases. We created three decision bias surveys by allocating one version of each HB task to each survey. The participants' responses (N = 527) showed a similar pattern as in Study 1, questioning the assumption that the different examples of the HB tasks are interchangeable and that they measure the same underlying construct. These results emphasize the need to understand the domain-specificity of cognitive biases as well as the effect of the wording of the cover story and the response mode on bias susceptibility before employing them in multiple-bias questionnaires. PMID:26635677
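Internal consistency of multi-item surveys such as those described above is commonly summarized with Cronbach's alpha; a minimal sketch (not code from the study, and the score matrix in the usage below is hypothetical):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)
```

Items that all track the same construct drive alpha toward 1; the "weak internal consistency" reported above corresponds to low alpha values across the bias tasks.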

  9. Application of MetaRail railway noise measurement methodology: comparison of three track systems

    NASA Astrophysics Data System (ADS)

    Kalivoda, M.; Kudrna, M.; Presle, G.

    2003-10-01

    Within the fourth RTD Framework Programme, the European Union has supported a research project dealing with the improvement of railway noise (emission) measurement methodologies. This project was called MetaRail and proposed a number of procedures and methods to decrease systematic measurement errors and to increase reproducibility. In 1999 the Austrian Federal Railways installed 1000 m of test track to explore the long-term behaviour of three different ballast track systems. This test included track stability, rail forces and ballast forces, as well as vibration transmission and noise emission. The noise study was carried out using the experience and methods developed within MetaRail. This includes rail roughness measurements as well as measurements of vertical railhead, sleeper and ballast vibration in parallel with the noise emission measurement with a single microphone at a distance of 7.5 m from the track. Using a test train with block- and disc-braked vehicles helped to control operational conditions and indicated the influence of different wheel roughness. It has been shown that the parallel recording of several vibration signals together with the noise signal makes it possible to evaluate the contributions of car body, sleeper, track and wheel sources to the overall noise emission. It must be stressed that this method is not as focused as a microphone array; however, it is far easier to apply and thus cheaper. Within this study, noise emission was allocated to the different elements to answer questions such as whether the sleeper eigenfrequency is transmitted into the rail.

  10. Individualized Outcome Measures of Internal Change: Methodological Considerations.

    ERIC Educational Resources Information Center

    Beutler, Larry E.; Hamblin, David L.

    1986-01-01

    Issues related to the selection of assessment procedures, the scaling of assessment devices, the computation of difference or change scores, and the translation of these scores into common indexes of outcome are explored. Suggestions are provided for refining the measurement of inferred states. (Author/BL)

  11. Practical remarks on the heart rate and saturation measurement methodology

    NASA Astrophysics Data System (ADS)

    Kowal, M.; Kubal, S.; Piotrowski, P.; Staniec, K.

    2017-05-01

    A surface reflection-based method for measuring heart rate and saturation has been introduced as one having a significant advantage over legacy methods, in that it lends itself to use in special applications such as those where a person's mobility is of prime importance (e.g. during a miner's work), excluding the use of traditional clips. Then, a complete ATmega1281-based microcontroller platform has been described for performing the computational tasks of signal processing and wireless transmission. In the next section, remarks are provided regarding the basic signal processing rules, beginning with raw voltage samples of converted optical signals and their acquisition, storage and smoothing. This chapter ends with practical remarks demonstrating an exponential dependence between the minimum measurable heart rate and the readout resolution at different sampling frequencies for different cases of averaging depth (in bits). The following section is devoted strictly to the heart rate and hemoglobin oxygenation (saturation) measurement with the use of the presented platform, referenced to measurements obtained with a stationary certified pulse oximeter.
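As a hedged illustration of the smoothing-and-counting step such a platform might perform (the function names, box-filter width, and mean-crossing beat detector are our assumptions for the sketch, not the paper's algorithm):

```python
import numpy as np

def moving_average(samples, width=8):
    """Simple box smoothing of raw ADC samples (width in samples)."""
    kernel = np.ones(width) / width
    return np.convolve(samples, kernel, mode="same")

def heart_rate_bpm(samples, fs):
    """Estimate heart rate by counting upward crossings of the signal mean.

    fs is the sampling frequency in Hz; each rising edge through the mean
    is treated as one beat.
    """
    s = moving_average(np.asarray(samples, dtype=float))
    above = s > s.mean()
    beats = np.count_nonzero(above[1:] & ~above[:-1])  # rising edges only
    return 60.0 * beats * fs / len(s)
```

The exponential trade-off the paper describes appears here as quantization: coarser readout resolution flattens the smoothed waveform, so slow pulses stop producing detectable crossings first.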

  12. Conceptual and Methodological Issues in Treatment Integrity Measurement

    ERIC Educational Resources Information Center

    McLeod, Bryce D.; Southam-Gerow, Michael A.; Weisz, John R.

    2009-01-01

    This special series focused on treatment integrity in the child mental health and education field is timely. The articles do a laudable job of reviewing (a) the current status of treatment integrity research and measurement, (b) existing conceptual models of treatment integrity, and (c) the limitations of prior research. Overall, this thoughtful…

  13. Measure of Landscape Heterogeneity by Agent-Based Methodology

    NASA Astrophysics Data System (ADS)

    Wirth, E.; Szabó, Gy.; Czinkóczky, A.

    2016-06-01

    With the rapid increase of the world's population, efficient food production is one of the key factors of human survival. Since biodiversity and heterogeneity are the basis of sustainable agriculture, the authors tried to measure the heterogeneity of a chosen landscape. The EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity; nevertheless, exact measurements and calculations, apart from statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. It is achieved with so-called agent-based modelling, where randomly dispatched dynamic scouts record the observed land cover parameters and sum up the features of a new type of land. During the simulation the agents collect a Monte Carlo integral as a diversity landscape potential, which can be considered the unit of the 'greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.
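A minimal sketch of the scout idea under stated assumptions (Shannon entropy as the diversity measure, uniform random dispatch of scouts; the class names and grid are hypothetical, not the paper's data):

```python
import random
from collections import Counter
from math import log

def scout_diversity(grid, n_scouts=1000, seed=42):
    """Shannon diversity of land-cover classes sampled by randomly dispatched scouts.

    Each scout reads the class at a uniformly random cell; the Monte Carlo
    class frequencies are turned into an entropy score (0 = homogeneous).
    """
    rng = random.Random(seed)
    rows, cols = len(grid), len(grid[0])
    seen = Counter(grid[rng.randrange(rows)][rng.randrange(cols)]
                   for _ in range(n_scouts))
    total = sum(seen.values())
    return -sum((c / total) * log(c / total) for c in seen.values())
```

A perfectly homogeneous grid scores 0, while a grid split evenly between k classes approaches log(k), which is one plausible reading of a per-landscape "greening" potential.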

  15. Measuring Social Interaction of the Urban Elderly: A Methodological Synthesis.

    ERIC Educational Resources Information Center

    Sokolovsky, Jay; Cohen, Carl I.

    1981-01-01

    Investigates the myth of the inner-city elderly as isolated. Suggests: there are severe limitations to traditional measures of determining sociability; social network analysis can overcome many of the deficiencies of other methods; and a synthesis of the anthropological and sociological approaches to network analysis can optimize data collection.…

  16. Sensible organizations: technology and methodology for automatically measuring organizational behavior.

    PubMed

    Olguin Olguin, Daniel; Waber, Benjamin N; Kim, Taemie; Mohan, Akshay; Ara, Koji; Pentland, Alex

    2009-02-01

    We present the design, implementation, and deployment of a wearable computing platform for measuring and analyzing human behavior in organizational settings. We propose the use of wearable electronic badges capable of automatically measuring the amount of face-to-face interaction, conversational time, physical proximity to other people, and physical activity levels in order to capture individual and collective patterns of behavior. Our goal is to be able to understand how patterns of behavior shape individuals and organizations. By using on-body sensors in large groups of people for extended periods of time in naturalistic settings, we have been able to identify, measure, and quantify social interactions, group behavior, and organizational dynamics. We deployed this wearable computing platform in a group of 22 employees working in a real organization over a period of one month. Using these automatic measurements, we were able to predict employees' self-assessments of job satisfaction and their own perceptions of group interaction quality by combining data collected with our platform and e-mail communication data. In particular, the total amount of communication was predictive of both of these assessments, and betweenness in the social network exhibited a high negative correlation with group interaction satisfaction. We also found that physical proximity and e-mail exchange had a negative correlation of r = -0.55 (p < 0.01), which has far-reaching implications for past and future research on social networks.

  17. Methodological aspects of exhaled nitric oxide measurements in infants.

    PubMed

    Gabriele, Carmelo; van der Wiel, Els C; Nieuwhof, Eveline M; Moll, Henriette A; Merkus, Peter J F M; de Jongste, Johan C

    2007-02-01

    Guidelines for the measurement of fractional exhaled nitric oxide (FE(NO)) recommend refraining from lung function tests (LFT) and certain foods and beverages before performing FE(NO) measurements, as they may lead to transiently altered FE(NO) levels. Little is known of such factors in infants. The aim of the present study was to evaluate whether forced expiratory maneuvers, sedation, nasal contamination, and breastfeeding affect FE(NO) values in infants. FE(NO) was measured off-line during tidal breathing by means of a facemask covering nose and mouth. FE(NO) measurements were performed in 45 sedated infants (mean age 12.1 months) who underwent LFT because of airway diseases and in 83 unsedated healthy infants (mean age 4.3 months). In infants with airway diseases, no difference was found in FE(NO) values before and 5 min after LFT (n = 19 infants, p = 0.7) and FE(NO) values before sedation did not differ from FE(NO) values during sedation (n = 10 infants, p = 0.2). Oral FE(NO) values were significantly lower than mixed (nasal + oral) FE(NO) (n = 42 infants, p < 0.001). FE(NO) values before and 5 min after breastfeeding were not different (n = 11 healthy infants, p = 0.57). The short-term reproducibility in healthy infants (n = 54) was satisfactory (intraclass correlation coefficient = 0.94). We conclude that, in infants with airway diseases, LFT prior to FE(NO) measurement did not influence FE(NO) values and FE(NO) values did not change after sedation. Oral FE(NO) values were significantly lower than mixed (oral + nasal) FE(NO), and breastfeeding did not influence FE(NO). Short-term reproducibility in awake healthy infants was good.

  18. Measuring the human psychophysiological conditions without contact

    NASA Astrophysics Data System (ADS)

    Scalise, L.; Casacanditella, L.; Cosoli, G.

    2017-08-01

    Heart Rate Variability (HRV) describes the variations in cardiac rhythm caused by autonomic regulation. HRV analysis can be applied to study the effects of mental or physical stressors on psychophysiological conditions. The present work is a pilot study performed on a 23-year-old healthy subject. HRV was measured by means of two sensors: an electrocardiograph and a Laser Doppler Vibrometer (LDV), a non-contact device able to detect the skin vibrations related to cardiac activity. The study aims to evaluate the effects of a physical task on HRV parameters (in both time and frequency domain), and consequently on autonomic regulation, and the capability of Laser Doppler Vibrometry to correctly detect the effects of stress on HRV. The results show a significant reduction of HRV parameters caused by the execution of the physical task (variations of 25-40% for time-domain parameters, and even larger in the frequency domain); this is consistent with the fact that stress reduces the capability of the organism to vary the heart rate (and consequently limits HRV). LDV correctly detected this phenomenon in the time domain, while the frequency-domain parameters showed significant deviations with respect to the gold standard technique (ECG). This may be due to movement artefacts that substantially modified the shape of the vibration signal measured by LDV after the physical task. In the future, to avoid this drawback, the LDV technique could be used to evaluate the effects of a mental task on HRV signals (i.e. the evaluation of mental stress).
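
    As a concrete illustration of the time-domain parameters mentioned above, the sketch below computes SDNN and RMSSD, two standard HRV measures, from RR-interval series. The rest/stress values are invented to show the expected direction of the effect (lower variability under stress), not taken from the study:

```python
import math

def hrv_time_domain(rr_ms):
    """Time-domain HRV parameters from a series of RR intervals (ms):
    SDNN (overall variability) and RMSSD (beat-to-beat variability)."""
    n = len(rr_ms)
    mean = sum(rr_ms) / n
    sdnn = math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (n - 1))
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return sdnn, rmssd

rest = [800, 820, 790, 810, 805, 815, 795, 825]     # hypothetical, ms
stress = [600, 605, 598, 602, 601, 599, 603, 600]   # faster, flatter rhythm
sdnn_r, rmssd_r = hrv_time_domain(rest)
sdnn_s, rmssd_s = hrv_time_domain(stress)
```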

  19. Ketone measurements using dipstick methodology in cats with diabetes mellitus.

    PubMed

    Zeugswetter, F; Pagitz, M

    2009-01-01

    To compare the results of urine and plasma ketone dip test in a group of diabetic cats with possible ketosis or ketoacidosis, using laboratory plasma beta-hydroxybutyrate measurements as the gold standard. According to clinical examinations, plasma beta-hydroxybutyrate measurements and venous blood gas analysis, 54 cats with diabetes mellitus were classified as non-ketotic (n=3), ketotic (n=40) or ketoacidotic (n=11). Plasma and urine acetoacetate concentrations were determined using urine reagent strips. Although there was a significant positive correlation between blood and urine ketone measurements (r=0.695, P<0.001), the results differed significantly (Z=-3.494, P<0.001). Using the differential positive rates, the best cut-off value to detect cats with ketoacidosis was 1.5 mmol/l for urine and 4 mmol/l for plasma. The sensitivity/specificity was 82/95 per cent for urine and 100/88 per cent for plasma, respectively. The urine and plasma ketone dip tests have a different diagnostic accuracy, and results have to be interpreted differently. Because of its high sensitivity, the plasma ketone dip test performs better than the urine ketone dip test to identify cats with impending or established ketoacidosis.
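
    A minimal sketch of how such cut-off values can be screened: sensitivity and specificity are computed at each candidate cut-off, and Youden's J (a common criterion, used here as a stand-in for the `differential positive rates' named above) selects the better one. The ketone values and ketoacidosis labels are hypothetical:

```python
def sens_spec(values, labels, cutoff):
    """Sensitivity/specificity when 'value >= cutoff' flags positive."""
    tp = sum(1 for v, y in zip(values, labels) if v >= cutoff and y)
    fn = sum(1 for v, y in zip(values, labels) if v < cutoff and y)
    tn = sum(1 for v, y in zip(values, labels) if v < cutoff and not y)
    fp = sum(1 for v, y in zip(values, labels) if v >= cutoff and not y)
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(values, labels, candidates):
    """Pick the candidate cut-off maximizing Youden's J = sens + spec - 1."""
    return max(candidates, key=lambda c: sum(sens_spec(values, labels, c)) - 1)

# Hypothetical plasma ketone values (mmol/l) and ketoacidosis labels
ketones = [0.5, 1.0, 1.6, 2.0, 4.5, 5.0, 6.0, 0.8, 3.9, 4.2]
acidotic = [False, False, False, False, True, True, True, False, True, True]
best = best_cutoff(ketones, acidotic, [1.5, 4.0])
```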

  20. A Methodology for Robust Multiproxy Paleoclimate Reconstructions and Modeling of Temperature Conditional Quantiles.

    PubMed

    Janson, Lucas; Rajaratnam, Bala

    Great strides have been made in the field of reconstructing past temperatures based on models relating temperature to temperature-sensitive paleoclimate proxies. One of the goals of such reconstructions is to assess if current climate is anomalous in a millennial context. These regression based approaches model the conditional mean of the temperature distribution as a function of paleoclimate proxies (or vice versa). Some of the recent focus in the area has considered methods which help reduce the uncertainty inherent in such statistical paleoclimate reconstructions, with the ultimate goal of improving the confidence that can be attached to such endeavors. A second important scientific focus in the subject area is the area of forward models for proxies, the goal of which is to understand the way paleoclimate proxies are driven by temperature and other environmental variables. One of the primary contributions of this paper is novel statistical methodology for (1) quantile regression with autoregressive residual structure, (2) estimation of corresponding model parameters, (3) development of a rigorous framework for specifying uncertainty estimates of quantities of interest, yielding (4) statistical byproducts that address the two scientific foci discussed above. We show that by using the above statistical methodology we can demonstrably produce a more robust reconstruction than is possible by using conditional-mean-fitting methods. Our reconstruction shares some of the common features of past reconstructions, but we also gain useful insights. More importantly, we are able to demonstrate a significantly smaller uncertainty than that from previous regression methods. In addition, the quantile regression component allows us to model, in a more complete and flexible way than least squares, the conditional distribution of temperature given proxies. This relationship can be used to inform forward models relating how proxies are driven by temperature.
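
    The quantile-regression component rests on minimizing the pinball (check) loss rather than squared error. The sketch below fits a linear conditional median by subgradient descent on that loss; it omits the paper's autoregressive residual structure and uncertainty framework, and the data are synthetic:

```python
def pinball(residual, tau):
    """Check (pinball) loss, the objective minimized in quantile regression."""
    return tau * residual if residual >= 0 else (tau - 1) * residual

def fit_quantile(xs, ys, tau, lr=0.01, epochs=3000):
    """Fit the linear conditional tau-quantile y = a + b*x by subgradient
    descent on the pinball loss."""
    a = b = 0.0
    n = len(xs)
    for _ in range(epochs):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            r = y - (a + b * x)
            g = -tau if r > 0 else (1 - tau)  # subgradient wrt prediction
            ga += g
            gb += g * x
        a -= lr * ga / n
        b -= lr * gb / n
    return a, b

# Synthetic data: line y = 2x with alternating +/-1 noise
xs = [0.5 * i for i in range(21)]
ys = [2 * x + (1 if i % 2 else -1) for i, x in enumerate(xs)]
a, b = fit_quantile(xs, ys, tau=0.5)   # conditional median fit
```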

  1. Bayesian methodology to estimate and update safety performance functions under limited data conditions: a sensitivity analysis.

    PubMed

    Heydari, Shahram; Miranda-Moreno, Luis F; Lord, Dominique; Fu, Liping

    2014-03-01

    In road safety studies, decision makers must often cope with limited data conditions. In such circumstances, the maximum likelihood estimation (MLE), which relies on asymptotic theory, is unreliable and prone to bias. Moreover, it has been reported in the literature that (a) Bayesian estimates might be significantly biased when using non-informative prior distributions under limited data conditions, and that (b) the calibration of limited data is plausible when existing evidence in the form of proper priors is introduced into analyses. Although the Highway Safety Manual (2010) (HSM) and other research studies provide calibration and updating procedures, the data requirements can be very taxing. This paper presents a practical and sound Bayesian method to estimate and/or update safety performance function (SPF) parameters combining the information available from limited data with the SPF parameters reported in the HSM. The proposed Bayesian updating approach has the advantage of requiring fewer observations to get reliable estimates. This paper documents this procedure. The adopted technique is validated by conducting a sensitivity analysis through an extensive simulation study with 15 different models, which include various prior combinations. This sensitivity analysis contributes to our understanding of the comparative aspects of a large number of prior distributions. Furthermore, the proposed method contributes to unification of the Bayesian updating process for SPFs. The results demonstrate the accuracy of the developed methodology. Therefore, the suggested approach offers considerable promise as a methodological tool to estimate and/or update baseline SPFs and to evaluate the efficacy of road safety countermeasures under limited data conditions.
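
    The core idea of combining an informative prior with limited local data can be illustrated with a conjugate gamma-Poisson update of a crash rate. This is a deliberately simplified stand-in for the paper's hierarchical SPF updating procedure, and the prior moments and site-year counts are hypothetical:

```python
def gamma_poisson_update(prior_mean, prior_sd, counts):
    """Conjugate Bayesian update of a Poisson crash rate. The informative
    Gamma prior encodes an HSM-style baseline estimate; a handful of local
    site-year crash counts then updates it."""
    # Gamma(alpha, beta) parameterized so mean = alpha/beta, var = alpha/beta^2
    beta = prior_mean / prior_sd ** 2
    alpha = prior_mean * beta
    alpha_post = alpha + sum(counts)      # add observed crashes
    beta_post = beta + len(counts)        # add observed exposure (site-years)
    post_mean = alpha_post / beta_post
    post_sd = (alpha_post / beta_post ** 2) ** 0.5
    return post_mean, post_sd

# Hypothetical baseline: 3 crashes/year +/- 1; three local site-years observed
m, s = gamma_poisson_update(prior_mean=3.0, prior_sd=1.0, counts=[5, 4, 6])
```

    The posterior mean is pulled toward the local data while the posterior uncertainty shrinks below the prior's, which is the practical benefit of updating with informative priors under limited data.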

  3. Optimization of hull-less pumpkin seed roasting conditions using response surface methodology.

    PubMed

    Vujasinović, Vesna; Radočaj, Olga; Dimić, Etelka

    2012-05-01

    Response surface methodology (RSM) was applied to optimize hull-less pumpkin seed roasting conditions before seed pressing to maximize the biochemical composition and antioxidant capacity of the virgin pumpkin oils obtained using a hydraulic press. Hull-less pumpkin seeds were roasted for various lengths of time (30 to 70 min) at various roasting temperatures (90 to 130 °C), resulting in 9 different oil samples, while the responses were phospholipids content, total phenols content, α- and γ-tocopherols, and antioxidative activity [by 2,2-diphenyl-1-picrylhydrazyl (DPPH) free-radical assay]. Mathematical models have shown that roasting conditions influenced all dependent variables at P < 0.05. The higher roasting temperatures had a significant effect (P < 0.05) on phospholipids, phenols, and α-tocopherols contents, while longer roasting time had a significant effect (P < 0.05) on γ-tocopherol content and antioxidant capacity, among the samples prepared under different roasting conditions. The optimum conditions for roasting the hull-less pumpkin seeds were 120 °C for a duration of 49 min, which resulted in these oil concentrations: phospholipids 0.29%, total phenols 23.06 mg/kg, α-tocopherol 5.74 mg/100 g, γ-tocopherol 24.41 mg/100 g, and an antioxidative activity (EC(50)) of 27.18 mg oil/mg DPPH.
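
    The RSM workflow described above ends with locating the optimum of a fitted second-order polynomial. The sketch below evaluates such a surface over the studied temperature/time ranges and grid-searches its maximum; the coefficients are invented for illustration (chosen to give an interior maximum) and are not the fitted model from the study:

```python
def rsm_response(T, t, b):
    """Second-order response surface in temperature T and time t:
    y = b0 + b1*T + b2*t + b11*T^2 + b22*t^2 + b12*T*t"""
    b0, b1, b2, b11, b22, b12 = b
    return b0 + b1 * T + b2 * t + b11 * T * T + b22 * t * t + b12 * T * t

# Hypothetical coefficients; negative quadratic terms give an interior optimum
coef = (-300.0, 5.0, 4.0, -0.022, -0.04, 0.002)

# Grid search over the studied ranges: 90-130 degC, 30-70 min
grid = [(T, t) for T in range(90, 131) for t in range(30, 71)]
T_opt, t_opt = max(grid, key=lambda p: rsm_response(p[0], p[1], coef))
```

    In practice the stationary point can also be found analytically by setting both partial derivatives of the quadratic to zero; the grid search is just the most transparent way to show the idea.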

  4. Phenomenological methodology for assessing the influence of flow conditions on the acoustic response of exhaust aftertreatment systems

    NASA Astrophysics Data System (ADS)

    Torregrosa, A. J.; Arnau, F. J.; Piqueras, P.; Sanchis, E. J.; Tartoussi, H.

    2017-05-01

    The increasing stringency of standards on aerosol and gaseous emissions from internal combustion engines has led to the progressive inclusion of different exhaust aftertreatment systems (EATS) as part of the powertrain. Regulated emissions are generally abated with devices based on monolithic structures with different chemical functions. As a side effect, wave transmission across the device is affected, and so is the acoustic boundary at the exhaust line inlet, so that the design of the exhaust line is in turn affected. While some models are available for predicting these effects, the geometrical complexity of many devices still makes it necessary in many cases to rely on experimental measurements, which cannot cover the full diversity of flow conditions under which these devices operate. To overcome this limitation, a phenomenological methodology is proposed in this work that allows for sound extrapolation of experimental results to flow conditions different from those used in the measurements. The transfer matrix is obtained from tests in an impulse rig for different excitation amplitudes and mean flows. The experimental coefficients of the transmission matrix of the device are fitted to Fourier series. This allows the influence of the flow conditions on the acoustic response, manifested as changes in the characteristic periods, to be treated separately from the specific properties of each device. In order to provide predictive capabilities to the method, the Fourier series approach is coupled to a gas dynamics model able to account for the sensitivity of propagation velocity to variations in the flow conditions.

  5. Methodology Development for Measurement of Agent Fate in an Environmental Wind Tunnel

    DTIC Science & Technology

    2005-10-01

    METHODOLOGY DEVELOPMENT FOR MEASUREMENT OF AGENT FATE IN AN ENVIRONMENTAL WIND TUNNEL Wendel Shuely, Robert Nickol, GEO-Centers, and...managerial support by Dr. H. Durst, Mr. L. Bickford and Dr. J. Savage, ECBC, and Mr. Tim Bauer, NSWC.

  6. Measuring multiple discrimination through a survey-based methodology.

    PubMed

    Cea D'Ancona, Ma Ángeles

    2017-09-01

    This paper focuses on the concept of multiple discrimination and its measurement through survey methods. The study was designed as a quasi-experimental comparison of survey mode effects on the quality of discrimination measurement: the traditional 'face-to-face' survey, the conventional self-completed mode and CAWI (finally deleted due to its non-comparability). Consistent with our hypothesis, some support was obtained for the social desirability bias and survey mode effects: 1) self-administration of questionnaires favours the declaration of discriminatory attitudes and personal experiences of discrimination; 2) the effect of privacy is greater in direct indicators of discriminatory attitudes; 3) perceptions and experiences of discrimination are more frequently reported by highly educated respondents. Nevertheless, contrary to our expectations, less educated respondents are also affected by survey mode and continue to be underrepresented in self-completed methods. The current research aims to serve as a basis for further research in this area. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Effective speed and agility conditioning methodology for random intermittent dynamic type sports.

    PubMed

    Bloomfield, Jonathan; Polman, Remco; O'Donoghue, Peter; McNaughton, Lars

    2007-11-01

    Different coaching methods are often used to improve performance. This study compared the effectiveness of 2 methodologies for speed and agility conditioning for random, intermittent, and dynamic activity sports (e.g., soccer, tennis, hockey, basketball, rugby, and netball) and the necessity for specialized coaching equipment. Two groups were delivered either a programmed method (PC) or a random method (RC) of conditioning with a third group receiving no conditioning (NC). PC participants used the speed, agility, quickness (SAQ) conditioning method, and RC participants played supervised small-sided soccer games. PC was also subdivided into 2 groups where participants either used specialized SAQ equipment or no equipment. A total of 46 (25 males and 21 females) untrained participants received (mean +/- SD) 12.2 +/- 2.1 hours of physical conditioning over 6 weeks between a battery of speed and agility parameter field tests. Two-way analysis of variance results indicated that both conditioning groups showed a significant decrease in body mass and body mass index, although PC achieved significantly greater improvements on acceleration, deceleration, leg power, dynamic balance, and the overall summation of % increases when compared to RC and NC (p < 0.05). PC in the form of SAQ exercises appears to be a superior method for improving speed and agility parameters; however, this study found that specialized SAQ equipment was not a requirement to observe significant improvements. Further research is required to establish whether these benefits transfer to sport-specific tasks as well as to the underlying mechanisms resulting in improved performance.

  8. Global positioning system: a methodology for modelling the pseudorange measurements

    NASA Astrophysics Data System (ADS)

    Barros, M. S. S.; Rosa, L. C. L.; Walter, F.; Alves, L. H. P. M.

    1999-01-01

    A model for GPS pseudorange measurements based on time-series statistics (ARIMA) is presented. The model is based on data collected by a Trimble differential GPS receiver over a period of 20 minutes, corresponding to more than 700 data points. The best-fitting model, ARIMA(2,2,1), was obtained after testing different models. The final model shows a square-law behavior and a residue with a Gaussian-like shape, a mean close to zero, and a standard deviation of 1.21 m. This result is confirmed by the Kolmogorov-Smirnov test at the 5% significance level. Independence is tested by computing the autocorrelation function, and it is shown that within the confidence interval the independence hypothesis is confirmed (Durbin-Watson test).
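
    The independence check mentioned above relies on the sample autocorrelation function of the residuals. A minimal sketch follows, using a synthetic white-noise series with the reported 1.21 m standard deviation as a stand-in for the actual ARIMA(2,2,1) residuals; residuals of an adequate fit should stay inside roughly ±1.96/sqrt(n) at nonzero lags:

```python
import random

def autocorr(series, max_lag):
    """Sample autocorrelation function r_k = c_k / c_0 for k = 0..max_lag."""
    n = len(series)
    mean = sum(series) / n
    c0 = sum((x - mean) ** 2 for x in series) / n
    acf = []
    for k in range(max_lag + 1):
        ck = sum((series[i] - mean) * (series[i + k] - mean)
                 for i in range(n - k)) / n
        acf.append(ck / c0)
    return acf

random.seed(0)
white = [random.gauss(0, 1.21) for _ in range(700)]  # stand-in residuals
acf = autocorr(white, 5)
bound = 1.96 / 700 ** 0.5   # approximate 95% band for white noise
```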

  9. Measures of outdoor play and independent mobility in children and youth: A methodological review.

    PubMed

    Bates, Bree; Stone, Michelle R

    2015-09-01

    Declines in children's outdoor play have been documented globally, partly due to heightened restrictions around children's independent mobility. Literature on outdoor play and children's independent mobility is increasing, yet no paper has summarized the various methodological approaches used. A methodological review could highlight the most commonly used measures and comprehensive research designs that could result in more standardized methodological approaches. A standardized protocol guided a methodological review of published research on measures of outdoor play and children's independent mobility in children and youth (0-18 years). Online searches of 8 electronic databases were conducted and studies were included if they contained a subjective/objective measure of outdoor play or children's independent mobility. References of included articles were scanned to identify additional articles. Twenty-four studies were included on outdoor play, and twenty-three on children's independent mobility. Study designs were diverse. Common objective measures included accelerometry, global positioning systems and direct observation; questionnaires, surveys and interviews were common subjective measures. Focus groups, activity logs, monitoring sheets, travel/activity diaries, behavioral maps and guided tours were also utilized. Questionnaires were used most frequently, yet few studies used the same questionnaire. Five studies employed comprehensive, mixed-methods designs. Outdoor play and children's independent mobility have been measured using a wide variety of techniques, with only a few studies using similar methodologies. A standardized methodological approach does not exist. Future researchers should consider including both objective measures (accelerometry and global positioning systems) and subjective measures (questionnaires, activity logs, interviews), as more comprehensive designs will enhance understanding of each multidimensional construct.

  10. Methodology for reliability based condition assessment. Application to concrete structures in nuclear plants

    SciTech Connect

    Mori, Y.; Ellingwood, B.

    1993-08-01

    Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period.
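
    The time-dependent reliability computation can be illustrated with a plain (non-adaptive) Monte Carlo sketch: strength degrades over the service life while annual extreme loads are sampled at random, and the fraction of trials in which load ever exceeds strength estimates the cumulative failure probability. The linear degradation model, Gaussian load model and all numbers are assumptions for illustration, not values from the report:

```python
import random

def failure_probability(r0, degrade, load_mean, load_sd, years,
                        trials=20000, seed=7):
    """Monte Carlo estimate of the probability that an annual extreme load
    exceeds the (linearly degrading) strength at least once in 'years'."""
    random.seed(seed)
    failures = 0
    for _ in range(trials):
        for year in range(1, years + 1):
            strength = r0 * (1 - degrade * year)      # degraded strength
            load = random.gauss(load_mean, load_sd)   # annual extreme load
            if load > strength:
                failures += 1
                break
    return failures / trials

# Hypothetical units: initial strength 100, 0.5%/yr loss, loads N(60, 10)
p40 = failure_probability(100.0, 0.005, 60.0, 10.0, years=40)
p10 = failure_probability(100.0, 0.005, 60.0, 10.0, years=10)
```

    The growth of the failure probability with service period is what drives the inspection/maintenance optimization described in the report.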

  11. Calf venous compliance measured by venous occlusion plethysmography: methodological aspects.

    PubMed

    Skoog, Johan; Zachrisson, Helene; Lindenberger, Marcus; Ekman, Mikael; Ewerman, Lea; Länne, Toste

    2015-02-01

    Calf venous compliance (C_calf) is commonly evaluated with venous occlusion plethysmography (VOP) during a standard cuff deflation protocol. However, the technique relies on two not previously validated assumptions concerning thigh cuff pressure (P_cuff) transmission and the impact of net fluid filtration (F_filt) on C_calf. The aim was to validate VOP in the lower limb and to develop a model to correct for F_filt during VOP. Strain-gauge technique was used to study calf volume changes in 15 women and 10 age-matched men. A thigh cuff was inflated to 60 mmHg for 4 and 8 min with a subsequent decrease of 1 mmHg s(-1). Intravenous pressure (P_iv) was measured simultaneously. C_calf was determined with the commonly used equation [compliance = β1 + 2β2 × P_cuff] describing the pressure-compliance relationship. A model was developed to identify and correct for F_filt. Transmission of P_cuff to P_iv was 100%. The decrease in P_cuff correlated well with the P_iv reduction (r = 0.99, P < 0.001). Overall, our model showed that C_calf was underestimated when F_filt was not accounted for (all P < 0.01). F_filt was higher in women (P < 0.01) and showed a more pronounced effect on C_calf compared to men (P < 0.05). The impact of F_filt was similar during 4- and 8-min VOP. P_cuff is an adequate substitute for P_iv in the lower limb. F_filt is associated with an underestimation of C_calf, and differences in the effect of F_filt during VOP can be accounted for with the correction model. Thus, our model seems to be a valuable tool in future studies of venous wall function.
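
    The pressure-compliance relation used in VOP is the derivative of a quadratic pressure-volume curve, and a filtration correction amounts to removing the net filtration contribution from the measured volume trace. A minimal sketch with hypothetical β values and an assumed constant filtration rate (the paper's correction model is more elaborate):

```python
def calf_compliance(beta1, beta2, p_cuff):
    """Pressure-compliance relationship used in VOP:
    compliance = beta1 + 2*beta2*p_cuff, i.e. the derivative of the
    quadratic pressure-volume curve v = beta0 + beta1*p + beta2*p**2."""
    return beta1 + 2 * beta2 * p_cuff

def filtration_corrected(volume_trace, filt_rate, dt):
    """Subtract an assumed constant net fluid filtration rate (volume per
    unit time) from a measured volume trace sampled every dt."""
    return [v - filt_rate * i * dt for i, v in enumerate(volume_trace)]

# Hypothetical fit (beta1, beta2) evaluated at the 60 mmHg occlusion pressure
c60 = calf_compliance(0.05, -0.0003, 60)
corrected = filtration_corrected([10.0, 10.5, 11.0], filt_rate=0.2, dt=1.0)
```

    Without the correction, the filtration term inflates the apparent volume change, which is exactly why the abstract reports C_calf being underestimated when F_filt is ignored.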

  12. Optimizing conditions for methylmercury extraction from fish samples for GC analysis using response surface methodology.

    PubMed

    Hajeb, P; Jinap, S; Abu Bakar, F; Bakar, J

    2009-06-01

    Response surface methodology (RSM) was used to determine the optimum experimental conditions for extracting methylmercury from fish samples for GC analysis. The influence of four variables - acid concentration (3-12 M), cysteine concentration (0.5-2% w/v), solvent volume (3-9 ml) and extraction time (10-30 min) - on the recovery of methylmercury was evaluated. The detection limit for methylmercury analysis using a micro-electron capture detector was 7 ng g(-1) in fish samples. The mean recovery under optimum conditions was 94%. Experimental data were adequately fitted by a second-order polynomial model with a multiple regression coefficient (r(2)) of 0.977. The four variables had a significant effect (p < 0.05) on the recovery of methylmercury from a reference material (BCR-463). Optimum conditions for methylmercury extraction were found at an acid concentration of 12.2 M, cysteine concentration of 2.4%, solvent volume of 1.5 ml and extraction time of 35 min. Validation of the developed method on fish samples showed good agreement with the mercury content of the samples.

  13. Optimization of culture conditions for flexirubin production by Chryseobacterium artocarpi CECT 8497 using response surface methodology.

    PubMed

    Venil, Chidambaram Kulandaisamy; Zakaria, Zainul Akmar; Ahmad, Wan Azlina

    2015-01-01

    Flexirubins are a unique type of bacterial pigment produced by bacteria of the genus Chryseobacterium; they are used in the treatment of chronic skin diseases such as eczema and may serve as a chemotaxonomic marker. Chryseobacterium artocarpi CECT 8497, a yellowish-orange pigment producing strain, was investigated for maximum production of pigment by optimizing medium composition employing response surface methodology (RSM). Culture conditions affecting pigment production were optimized statistically in shake flask experiments. Lactose, l-tryptophan and KH2PO4 were the most significant variables affecting pigment production. A Box-Behnken design (BBD) and RSM analysis were adopted to investigate the interactions between variables and determine the optimal values for maximum pigment production. Evaluation of the experimental results showed that the optimum conditions for maximum production of pigment (521.64 mg/L) in a 50 L bioreactor were lactose 11.25 g/L, l-tryptophan 6 g/L and KH2PO4 650 ppm. Production under optimized conditions increased 7.23-fold compared to production prior to optimization. The results of this study show that statistical optimization of medium composition and interaction effects enables shortlisting of the significant factors influencing maximum pigment production by Chryseobacterium artocarpi CECT 8497. In addition, this is the first report optimizing the process parameters for flexirubin-type pigment production from Chryseobacterium artocarpi CECT 8497.

  14. [Factors conditioning primary care services utilization. Empirical evidence and methodological inconsistencies].

    PubMed

    Sáez, M

    2003-01-01

    In Spain, the degree and characteristics of primary care services utilization have been the subject of analysis since at least the 1980s. One of the main reasons for this interest is to assess the extent to which utilization matches primary care needs. In fact, the provision of an adequate health service for those who most need it is a generally accepted priority. The evidence shows that individual characteristics, mainly health status, are the factors most closely related to primary care utilization. Other personal characteristics, such as gender and age, could act as modulators of health care need. Some family and/or cultural variables, as well as factors related to the health care professional and institutions, could explain some of the observed variability in primary care services utilization. Socioeconomic variables, such as income, reveal a paradox. From an aggregate perspective, income is the main determinant of utilization as well as of health care expenditure. When data are analyzed for individuals, however, income is not related to primary health utilization. The situation is controversial, with methodological implications and, above all, consequences for the assessment of efficiency in primary care utilization. Review of the literature reveals certain methodological inconsistencies that could at least partly explain the disparity of the empirical results. Among others, the following flaws can be highlighted: design problems, measurement errors, misspecification, and misleading statistical methods. Possible solutions include quasi-experiments and the use of large administrative databases and primary data sources (design problems); differentiation between types of utilization and between units of analysis other than consultations, and correction of measurement errors in the explanatory variables (measurement errors); consideration of relevant explanatory variables (misspecification); and the use of multilevel models (statistical methods).

  15. A methodological approach of estimating resistance to flow under unsteady flow conditions

    NASA Astrophysics Data System (ADS)

    Mrokowska, M. M.; Rowiński, P. M.; Kalinowska, M. B.

    2015-10-01

    This paper presents an evaluation and analysis of resistance parameters in unsteady flow: friction slope, friction velocity and the Manning coefficient. A methodology is proposed to enhance the evaluation of resistance through relations derived from the flow equations. Its main points are (1) to choose a resistance relation with regard to the shape of the channel and (2) the type of wave, (3) to choose an appropriate method to evaluate the gradient of flow depth, and (4) to assess the uncertainty of the result. In addition to a critical analysis of existing methods, new approaches are presented: formulae for resistance parameters for a trapezoidal channel, and a translation method instead of Jones' formula to evaluate the gradient of flow depth. Measurements obtained from artificial dam-break flood waves in a small lowland watercourse have made it possible to apply the method and to analyse to what extent resistance parameters vary in unsteady flow. The study demonstrates that friction slope and friction velocity are more sensitive to the use of simplified formulae than the Manning coefficient (n). n is adequate as a flood routing parameter but may be misleading when information on the trend of resistance with flow rate is crucial; in that case, friction slope or friction velocity is the better choice.
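
    The paper's unsteady-flow formulae are not reproduced here, but the steady-flow baseline relations behind the three resistance parameters can be sketched directly (a minimal illustration with hypothetical readings; the Manning formula in SI units and the shear-velocity definition u* = sqrt(g R S_f)):

```python
import math

def manning_n(velocity, hydraulic_radius, friction_slope):
    """Manning coefficient (SI) from the Manning formula
    V = (1/n) R^(2/3) S_f^(1/2), rearranged to n = R^(2/3) sqrt(S_f) / V."""
    return hydraulic_radius ** (2.0 / 3.0) * math.sqrt(friction_slope) / velocity

def friction_velocity(hydraulic_radius, friction_slope, g=9.81):
    """Shear (friction) velocity u* = sqrt(g R S_f)."""
    return math.sqrt(g * hydraulic_radius * friction_slope)

# Hypothetical readings: V = 0.8 m/s, R = 0.5 m, S_f = 0.001
n = manning_n(0.8, 0.5, 0.001)          # ~0.025
u_star = friction_velocity(0.5, 0.001)  # ~0.07 m/s
```

    In unsteady flow the friction slope S_f is itself estimated from the flow equations (the step the paper's translation method addresses), so the simple rearrangement above is only the steady baseline.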

  16. Novel methodology for accurate resolution of fluid signatures from multi-dimensional NMR well-logging measurements.

    PubMed

    Anand, Vivek

    2017-03-01

    A novel methodology for accurate fluid characterization from multi-dimensional nuclear magnetic resonance (NMR) well-logging measurements is introduced. This methodology overcomes a fundamental challenge of poor resolution of features in multi-dimensional NMR distributions due to low signal-to-noise ratio (SNR) of well-logging measurements. Based on an unsupervised machine-learning concept of blind source separation, the methodology resolves fluid responses from simultaneous analysis of large quantities of well-logging data. The multi-dimensional NMR distributions from a well log are arranged in a database matrix that is expressed as the product of two non-negative matrices. The first matrix contains the unique fluid signatures, and the second matrix contains the relative contributions of the signatures for each measurement sample. No a priori information or subjective assumptions about the underlying features in the data are required. Furthermore, the dimensionality of the data is reduced by several orders of magnitude, which greatly simplifies the visualization and interpretation of the fluid signatures. Compared to traditional methods of NMR fluid characterization which only use the information content of a single measurement, the new methodology uses the orders-of-magnitude higher information content of the entire well log. Simulations show that the methodology can resolve accurate fluid responses in challenging SNR conditions. The application of the methodology to well-logging data from a heavy oil reservoir shows that individual fluid signatures of heavy oil, water associated with clays and water in interstitial pores can be accurately obtained. Copyright © 2017 Elsevier Inc. All rights reserved.
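
    The decomposition the abstract describes, a data matrix expressed as the product of two non-negative matrices (signatures times contributions), is the classic non-negative matrix factorization. A minimal sketch with multiplicative updates on synthetic data, not the paper's implementation:

```python
import numpy as np

def nmf(X, k, iters=2000, seed=0):
    """Factor non-negative data X (samples x bins) into W (samples x k) and
    H (k x bins) with multiplicative updates minimizing ||X - W H||_F."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k))
    H = rng.random((k, n))
    eps = 1e-12  # guards against division by zero
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Synthetic "well log": 100 measurement samples, each a non-negative
# mixture of 2 hypothetical fluid signatures over 64 distribution bins
rng = np.random.default_rng(1)
S = rng.random((2, 64))    # signatures
A = rng.random((100, 2))   # per-sample contributions
X = A @ S
W, H = nmf(X, 2)
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

    Here H plays the role of the recovered fluid signatures and W their relative contributions per sample; as in the paper, no prior information about the signatures is supplied.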

  17. Methodological quality of studies on the measurement properties of neck pain and disability questionnaires: a systematic review.

    PubMed

    Terwee, Caroline B; Schellingerhout, Jasper M; Verhagen, Arianne P; Koes, Bart W; de Vet, Henrica C W

    2011-05-01

    The aim of this study was to obtain an overview of the methodological quality of studies on the measurement properties of neck pain and disability questionnaires and to describe how well various aspects of the design and statistical analyses of studies on measurement properties are performed. A systematic review was performed of published studies on the measurement properties of neck pain and disability questionnaires. Two reviewers independently rated the quality of the studies using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. This checklist was developed in an international Delphi consensus study. A total of 47 articles were included on the measurement properties of 8 different questionnaires. The methodological quality of the included studies was adequate on some aspects (adequate statistical analyses are often used for assessing reliability, measurement error, and construct validity) but can be improved on others. The most important methodological aspects that need improvement are: assessing unidimensionality in internal consistency analysis, ensuring stable patients and similar test conditions in studies on reliability and measurement error, and placing more emphasis on the relevance and comprehensiveness of the items in content validity studies. Furthermore, it is recommended that studies on construct validity and responsiveness be based on predefined hypotheses and that better statistical methods be used in responsiveness studies. Considering the importance of adequate measurement properties, it is concluded that, in the field of measuring neck pain and disability, there is room for improvement in the methodological quality of studies on measurement properties. Copyright © 2011 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.

  18. Exploring the optimum conditions for maximizing the microbial growth of Candida intermedia by response surface methodology.

    PubMed

    Yönten, Vahap; Aktaş, Nahit

    2014-01-01

    An optimum and cost-efficient medium composition for microbial growth of the Candida intermedia Y-1981 yeast culture growing on whey was explored by applying a multistep response surface methodology. In the first step, a Plackett-Burman (PB) design was used to determine the fermentation medium factors most significant for microbial growth. The medium temperature and the sodium chloride and lactose concentrations were identified as the most important factors. Subsequently, the optimum combinations of the selected factors were explored by steepest ascent (SA) and central composite design (CCD). The optimum values for lactose concentration, sodium chloride concentration and medium temperature were found to be 18.4 g/L, 0.161 g/L, and 32.4°C, respectively. Experiments carried out under the optimum conditions revealed a maximum specific growth rate of 0.090 1/hr; 42% total lactose removal was achieved in 24 h of fermentation. The results were finally verified with batch reactor experiments carried out under the evaluated optimum conditions.
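
    The second-order polynomial model underlying CCD-based response surface methodology can be sketched as an ordinary least-squares fit followed by locating the stationary point; the data and optimum below are synthetic, not the fermentation results:

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """OLS fit of the second-order RSM model:
    y = b0 + sum_i b_i x_i + sum_i b_ii x_i^2 + sum_{i<j} b_ij x_i x_j."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    beta, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
    return beta

# Synthetic response with a known optimum at (x1, x2) = (1, -1)
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(30, 2))
y = 5 - (X[:, 0] - 1) ** 2 - (X[:, 1] + 1) ** 2
b = fit_quadratic_surface(X, y)  # order: 1, x1, x2, x1^2, x2^2, x1*x2
# Stationary point of the fitted surface (the cross term is ~0 here,
# so the two factors decouple): x_i* = -b_i / (2 * b_ii)
x_opt = (-b[1] / (2 * b[3]), -b[2] / (2 * b[4]))
```

    In a real CCD study the fitted optimum would then be verified experimentally, as the abstract describes for the batch reactor runs.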

  19. Development of a methodological tool for the assessment of the hydromorphological conditions of lakes in Europe

    NASA Astrophysics Data System (ADS)

    Gay, Aurore; Argillier, Christine; Reynaud, Nathalie; Nicolas, Delphine; Baudoin, Jean-Marc

    2017-04-01

    The assessment of the ecological status of surface waters considering the biological, physico-chemical and hydromorphological conditions is requested by the European Water Framework Directive (WFD). While research efforts have largely concentrated on rivers, lakes have received less attention. Nevertheless, given their function as receptacles of inland waters, the habitats they provide for an important biodiversity and the numerous services they support (water supply, recreational activities, hydroelectricity), assessing the ecological quality of lakes is crucial for their protection. Still, this task remains challenging, especially for the hydromorphological compartment. Indeed, while promising tools already exist to assess lake biological and physico-chemical status, our understanding of the impact of hydromorphological impairments on overall ecosystem functioning remains poor, and existing tools to assess such impacts often focus only on morphological aspects, in a qualitative rather than quantitative way. In this context, our study aims at providing stakeholders with a methodology to assess quantitatively the hydrological and morphological quality of lakes in Europe. The developed methodology, the LAKe HYdromorphological Conditions tool (LAKHYC tool), is based on our current knowledge of the functioning of lakes and on pre-existing works (e.g., Rowan et al., 2012; Rinaldi et al., 2013). The LAKHYC tool integrates the six parameters requested by the WFD, each one being assessed by at least three descriptors that are calculated as Ecological Quality Ratios, i.e. as the deviation from a reference condition. The originality of the present method lies in the fact that specific reference conditions are defined for each descriptor.
    In this way, we avoid using a predetermined set of lakes considered as not impacted by human activities, which often corresponds to natural lakes in specific areas (e.g., mountains) and does not represent the diversity

  20. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    NASA Astrophysics Data System (ADS)

    Tarifeño-Saldivia, Ariel; Mayer, Roberto E.; Pavez, Cristian; Soto, Leopoldo

    2015-03-01

    This work introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from detection of the burst of neutrons. An improvement of more than one order of magnitude in the accuracy of a paraffin wax moderated 3He-filled tube is obtained by using this methodology with respect to previous calibration methods.

  1. A combined linear optimisation methodology for water resources allocation in Alfeios River Basin (Greece) under uncertain and vague system conditions

    NASA Astrophysics Data System (ADS)

    Bekri, Eleni; Yannopoulos, Panayotis; Disse, Markus

    2013-04-01

    In the present study, a combined linear programming methodology, based on Li et al. (2010) and Bekri et al. (2012), is employed for optimizing water allocation under uncertain system conditions in the Alfeios River Basin, in Greece. The Alfeios River is a water resources system of great natural, ecological, social and economic importance for Western Greece, since it is the longest watercourse with the highest flow rate in the Peloponnisos region. Moreover, the river basin has been exposed in recent decades to a plethora of environmental stresses (e.g. hydrogeological alterations, intensively irrigated agriculture, surface and groundwater overexploitation and infrastructure developments), resulting in the degradation of its quantitative and qualitative characteristics. As in most Mediterranean countries, water resource management in the Alfeios River Basin has so far followed an essentially supply-driven approach. It is still characterized by a lack of effective operational strategies. Authority responsibility relationships are fragmented, and law enforcement and policy implementation are weak. The present regulated water allocation puzzle entails a mixture of hydropower generation, irrigation, drinking water supply and recreational activities. Under these conditions its water resources management is characterised by high uncertainty and by vague and imprecise data. The considered methodology has been developed to deal with uncertainties expressed as probability distributions and/or fuzzy boundary intervals, derived from associated α-cut levels. In this framework a set of deterministic submodels is studied through linear programming. The ad hoc water resources management and alternative management patterns in an Alfeios subbasin are analyzed and evaluated under various scenarios, using the above-mentioned methodology, aiming to promote sustainable and equitable water management. Li, Y.P., Huang, G.H. and S.L., Nie, (2010), Planning water resources
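
    A deterministic submodel of the kind described, with interval-valued water availability handled by solving a best-case and a worst-case linear program, can be sketched as follows. The benefits and demand ceilings are hypothetical, and `scipy.optimize.linprog` stands in for whatever solver the authors used:

```python
from scipy.optimize import linprog

def allocate(benefits, demand_caps, available):
    """Maximize total benefit of allocations x_i subject to x_i <= demand_i
    and sum(x_i) <= available (linprog minimizes, hence the sign flip)."""
    n = len(benefits)
    res = linprog(c=[-b for b in benefits],
                  A_ub=[[1.0] * n], b_ub=[available],
                  bounds=[(0.0, d) for d in demand_caps],
                  method="highs")
    return res.x, -res.fun

# Interval availability [60, 90] handled as two deterministic submodels
benefits = [5.0, 3.0, 1.0]    # hypothetical unit benefits per water use
demands = [40.0, 30.0, 50.0]  # hypothetical demand ceilings
x_lo, f_lo = allocate(benefits, demands, 60.0)  # pessimistic bound
x_hi, f_hi = allocate(benefits, demands, 90.0)  # optimistic bound
```

    The pair (f_lo, f_hi) brackets the attainable system benefit; the fuzzy α-cut treatment in the paper generalizes this by repeating the bounding solves at several cut levels.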

  2. Measuring and Targeting Internal Conditions for School Effectiveness in the Free State of South Africa

    ERIC Educational Resources Information Center

    Kgaile, Abraham; Morrison, Keith

    2006-01-01

    A questionnaire-based methodology for constructing an overall index of school effectiveness is reported, focusing on within-school conditions of schools, which is currently being used in the Free State of South Africa. The article reports the construction and use of a straightforward instrument for measuring the effectiveness of key aspects of the…

  3. Methodology of Blade Unsteady Pressure Measurement in the NASA Transonic Flutter Cascade

    NASA Technical Reports Server (NTRS)

    Lepicovsky, J.; McFarland, E. R.; Capece, V. R.; Jett, T. A.; Senyitko, R. G.

    2002-01-01

    In this report the methodology adopted to measure unsteady pressures on blade surfaces in the NASA Transonic Flutter Cascade under conditions of simulated blade flutter is described. The previous work done in this cascade reported that the oscillating cascade produced waves, which for some interblade phase angles reflected off the wind tunnel walls back into the cascade, interfered with the cascade unsteady aerodynamics, and contaminated the acquired data. To alleviate the problems with data contamination due to the back wall interference, a method of influence coefficients was selected for the future unsteady work in this cascade. In this approach only one blade in the cascade is oscillated at a time. The majority of the report is concerned with the experimental technique used and the experimental data generated in the facility. The report presents a list of all test conditions for the small amplitude of blade oscillations, and shows examples of some of the results achieved. The report does not discuss data analysis procedures like ensemble averaging, frequency analysis, and unsteady blade loading diagrams reconstructed using the influence coefficient method. Finally, the report presents the lessons learned from this phase of the experimental effort, and suggests the improvements and directions of the experimental work for tests to be carried out for large oscillation amplitudes.
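
    The influence-coefficient superposition mentioned above can be sketched as follows: the response measured when only blade m oscillates contributes a term weighted by the interblade phase angle (the coefficient values here are hypothetical):

```python
import numpy as np

def cascade_unsteady_pressure(influence_coeffs, sigma):
    """Superpose influence coefficients c_m (complex response on a reference
    blade when only blade m oscillates) into the response of a cascade whose
    blades all oscillate with interblade phase angle sigma:
        p(sigma) = sum_m c_m * exp(1j * m * sigma)."""
    return sum(c * np.exp(1j * m * sigma) for m, c in influence_coeffs.items())

# Hypothetical coefficients: only the oscillating blade and its neighbors matter
coeffs = {-1: 0.5 + 0.0j, 0: 1.0 + 0.0j, 1: 0.5 + 0.0j}
p_inphase = cascade_unsteady_pressure(coeffs, 0.0)      # constructive sum
p_antiphase = cascade_unsteady_pressure(coeffs, np.pi)  # destructive sum
```

    Because only one blade oscillates per test, the wall reflections that contaminated the full-cascade experiments are avoided, and the response at any interblade phase angle is reconstructed afterwards from the measured coefficients.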

  4. Theoretical and Methodological Challenges in Measuring Instructional Quality in Mathematics Education Using Classroom Observations

    ERIC Educational Resources Information Center

    Schlesinger, Lena; Jentsch, Armin

    2016-01-01

    In this article, we analyze theoretical as well as methodological challenges in measuring instructional quality in mathematics classrooms by examining standardized observational instruments. At the beginning, we describe the results of a systematic literature review for determining subject-specific aspects measured in recent lesson studies in…

  6. Mathematical simulation of power conditioning systems. Volume 1: Simulation of elementary units. Report on simulation methodology

    NASA Technical Reports Server (NTRS)

    Prajous, R.; Mazankine, J.; Ippolito, J. C.

    1978-01-01

    Methods and algorithms used for the simulation of elementary power conditioning units buck, boost, and buck-boost, as well as shunt PWM are described. Definitions are given of similar converters and reduced parameters. The various parts of the simulation to be carried out are dealt with; local stability, corrective network, measurements of input-output impedance and global stability. A simulation example is given.

  7. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology.

    PubMed

    Arulmathi, P; Elangovan, G; Begum, A Farjana

    2015-01-01

    Distillery industry is recognized as one of the most polluting industries in India with a large amount of annual effluent production. In this present study, the optimization of electrochemical treatment process variables was reported to treat the color and COD of distillery spent wash using Ti/Pt as an anode in a batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operation variables and chemical oxygen demand (COD) and color removal efficiency were considered as response variable for optimization using response surface methodology. Indirect electrochemical-oxidation process variables were optimized using Box-Behnken response surface design (BBD). The results showed that electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery industry spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm(2), electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L, respectively.

  8. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology

    PubMed Central

    Arulmathi, P.; Elangovan, G.; Begum, A. Farjana

    2015-01-01

    Distillery industry is recognized as one of the most polluting industries in India with a large amount of annual effluent production. In this present study, the optimization of electrochemical treatment process variables was reported to treat the color and COD of distillery spent wash using Ti/Pt as an anode in a batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operation variables and chemical oxygen demand (COD) and color removal efficiency were considered as response variable for optimization using response surface methodology. Indirect electrochemical-oxidation process variables were optimized using Box-Behnken response surface design (BBD). The results showed that electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery industry spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm2, electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L, respectively. PMID:26491716

  9. Optimization of culture conditions for hydrogen production by Ethanoligenens harbinense B49 using response surface methodology.

    PubMed

    Guo, Wan-Qian; Ren, Nan-Qi; Wang, Xiang-Jing; Xiang, Wen-Sheng; Ding, Jie; You, Yang; Liu, Bing-Feng

    2009-02-01

    The design of an optimum and cost-efficient medium for high-level production of hydrogen by Ethanoligenens harbinense B49 was attempted by using response surface methodology (RSM). Based on the Plackett-Burman design, Fe(2+) and Mg(2+) were selected as the most critical nutrient salts. Subsequently, the optimum combination of the selected factors and the sole carbon source glucose was investigated by the Box-Behnken design. Results showed that a maximum hydrogen yield of 2.21 mol/mol glucose was predicted when the concentrations of glucose, Fe(2+) and Mg(2+) were 14.57 g/L, 177.28 mg/L and 691.98 mg/L, respectively. The results were further verified by triplicate experiments. The batch reactors were operated under optimized conditions: glucose, Fe(2+) and Mg(2+) concentrations of 14.5 g/L, 180 mg/L and 690 mg/L, respectively, an initial pH of 6.0 and a temperature of 35±1 °C. Without further pH adjustment, a maximum hydrogen yield of 2.20 mol/mol glucose was obtained with the optimized medium, which further verified the practicability of this optimization strategy.

  10. Technical Report on Preliminary Methodology for Enhancing Risk Monitors with Integrated Equipment Condition Assessment

    SciTech Connect

    Ramuhalli, Pradeep; Coles, Garill A.; Coble, Jamie B.; Hirt, Evelyn H.

    2013-09-17

    Small modular reactors (SMRs) generally include reactors with electric output of ~350 MWe or less (this cutoff varies somewhat but is substantially less than full-size plant output of 700 MWe or more). Advanced SMRs (AdvSMRs) refer to a specific class of SMRs and are based on modularization of advanced reactor concepts. AdvSMRs may provide a longer-term alternative to traditional light-water reactors (LWRs) and SMRs based on integral pressurized water reactor concepts currently being considered. Enhancing affordability of AdvSMRs will be critical to ensuring wider deployment. AdvSMRs suffer from loss of economies of scale inherent in small reactors when compared to large (greater than ~600 MWe output) reactors. Some of this loss can be recovered through reduced capital costs from smaller size, fewer components, modular fabrication processes, and the opportunity for modular construction. However, the controllable day-to-day costs of AdvSMRs will be dominated by operation and maintenance (O&M) costs. Technologies that help characterize real-time risk are important for controlling O&M costs. Risk monitors are used in current nuclear power plants to provide a point-in-time estimate of the system risk given the current plant configuration (e.g., equipment availability, operational regime, and environmental conditions). However, current risk monitors are unable to support these capability requirements, as they do not take into account plant-specific normal, abnormal, and deteriorating states of active components and systems. This report documents technology developments that are a step towards enhanced risk monitors that, if integrated with supervisory plant control systems, can provide these capabilities and meet the goals of controlling O&M costs. The report describes research results from an initial methodology for enhanced risk monitors by integrating real-time information about equipment condition and probability of failure (POF) into risk monitors.
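
    As an illustration only (not the report's models), a point-in-time risk estimate that folds a normalized equipment-condition index into component failure probabilities might look like the following; the mapping function and all numbers are hypothetical:

```python
def pof_from_condition(base_pof, degradation_index, sensitivity=2.0):
    """Hypothetical mapping: a normalized degradation index in [0, 1]
    (0 = as-new, 1 = failed) pulls the probability of failure from its
    baseline toward 1."""
    return base_pof + (1.0 - base_pof) * degradation_index ** sensitivity

def system_risk(component_pofs):
    """Point-in-time risk of a series system: it fails if any component fails."""
    p_ok = 1.0
    for p in component_pofs:
        p_ok *= 1.0 - p
    return 1.0 - p_ok

# Two monitored components, one showing measurable degradation
pofs = [pof_from_condition(0.01, 0.0), pof_from_condition(0.01, 0.6)]
risk_now = system_risk(pofs)
```

    The point of an enhanced risk monitor is precisely that `degradation_index` is updated from live condition measurements rather than held at a static, fleet-average value.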

  11. An Updated Methodology for Enhancing Risk Monitors with Integrated Equipment Condition Assessment

    SciTech Connect

    Ramuhalli, Pradeep; Hirt, Evelyn H.; Coles, Garill A.; Bonebrake, Christopher A.; Ivans, William J.; Wootan, David W.; Mitchell, Mark R.

    2014-07-18

    Small modular reactors (SMRs) generally include reactors with electric output of ~350 MWe or less (this cutoff varies somewhat but is substantially less than full-size plant output of 700 MWe or more). Advanced SMRs (AdvSMRs) refer to a specific class of SMRs and are based on modularization of advanced reactor concepts. Enhancing affordability of AdvSMRs will be critical to ensuring wider deployment, as AdvSMRs suffer from loss of economies of scale inherent in small reactors when compared to large (greater than ~600 MWe output) reactors, and the controllable day-to-day costs of AdvSMRs will be dominated by operation and maintenance (O&M) costs. Technologies that help characterize real-time risk are important for controlling O&M costs. Risk monitors are used in current nuclear power plants to provide a point-in-time estimate of the system risk given the current plant configuration (e.g., equipment availability, operational regime, and environmental conditions). However, current risk monitors are unable to support these capability requirements, as they do not take into account plant-specific normal, abnormal, and deteriorating states of active components and systems. This report documents technology developments towards enhancing risk monitors that, if integrated with supervisory plant control systems, can provide these capabilities and meet the goals of controlling O&M costs. The report describes research results on augmenting an initial methodology for enhanced risk monitors that integrate real-time information about equipment condition and probability of failure (POF) into risk monitors. Methods to propagate uncertainty through the enhanced risk monitor are evaluated. Available data to quantify the level of uncertainty and the POF of key components are examined for their relevance, and a status update of this data evaluation is described.
Finally, we describe potential targets for developing new risk metrics that may be useful for studying trade-offs for economic

  12. Precision insolation measurement under field conditions

    NASA Technical Reports Server (NTRS)

    Reid, M. S.; Berdahl, C. M.; Gardner, R. A.

    1977-01-01

    The paper describes a precision insolation instrument - the Mark 3 Kendall Radiometer - developed for field survey work and for use as a calibration transfer standard. The instrument, which can be used to make accurate measurements of solar irradiance, has a windowless black cavity receptor mounted in a massive heat sink and has equal sensitivity to ultraviolet, infrared, and visible radiation. Attention is given to a calibration stability analysis over two years of operation.

  13. Comparison of Methodologies of Activation Barrier Measurements for Reactions with Deactivation

    DOE PAGES

    Xie, Zhenhua; Yan, Binhang; Zhang, Li; ...

    2017-01-25

    In this work, methodologies of activation barrier measurements for reactions with deactivation were theoretically analyzed. Reforming of ethane with CO2 was used as an example of a reaction with deactivation to evaluate these methodologies experimentally. Both the theoretical and experimental results showed that, due to catalyst deactivation, the conventional method inevitably leads to a much lower activation barrier than the intrinsic value, even when heat and mass transport limitations are excluded. An optimal method was identified that provides a reliable and efficient activation barrier measurement for reactions with deactivation.
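
    The underlying measurement, an Arrhenius fit of ln k against 1/T whose slope gives -Ea/R, can be sketched as follows (synthetic rate data, not the ethane reforming results; deactivation during the temperature sweep is exactly what biases this slope in the conventional method):

```python
import math

def activation_energy(temps_K, rates):
    """Arrhenius estimate: least-squares slope of ln(k) vs 1/T gives -Ea/R."""
    R = 8.314  # J/(mol K)
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(k) for k in rates]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -slope * R  # Ea in J/mol

# Synthetic rates generated from Ea = 80 kJ/mol are recovered exactly
Ea_true = 80e3
temps = (800.0, 850.0, 900.0, 950.0)
ks = [math.exp(25 - Ea_true / (8.314 * T)) for T in temps]
Ea = activation_energy(temps, ks)
```

    With a deactivating catalyst, each measured k would be scaled by a time-dependent activity factor, which is how the fitted barrier ends up below the intrinsic value.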

  14. The influence of construction methodology on structural brain network measures: A review.

    PubMed

    Qi, Shouliang; Meesters, Stephan; Nicolay, Klaas; Romeny, Bart M Ter Haar; Ossenblok, Pauly

    2015-09-30

    Structural brain networks based on diffusion MRI and tractography show robust attributes such as small-worldness, hierarchical modularity, and rich-club organization. However, there are large discrepancies in the reports about specific network measures. It is hypothesized that these discrepancies result from the influence of construction methodology. We surveyed the methodological options and their influences on network measures. It is found that most network measures are sensitive to the scale of brain parcellation, MRI gradient schemes and orientation model, and the tractography algorithm, which is in accordance with the theoretical analysis of the small-world network model. Different network weighting schemes represent different attributes of brain networks, which makes these schemes incomparable between studies. Methodology choice depends on the specific study objectives and a clear understanding of the pros and cons of a particular methodology. Because there is no way to eliminate these influences, it seems more practical to quantify them, optimize the methodologies, and construct structural brain networks with multiple spatial resolutions, multiple edge densities, and multiple weighting schemes.
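
    One of the methodological options surveyed, thresholding a weighted connectivity matrix to a fixed edge density so that networks are comparable across choices, can be sketched as follows (synthetic weights, not tractography output):

```python
import numpy as np

def threshold_to_density(W, density):
    """Keep the strongest edges of a symmetric weighted connectivity matrix W
    so the resulting binary graph has the requested edge density."""
    n = W.shape[0]
    iu = np.triu_indices(n, k=1)          # upper triangle, no self-loops
    weights = W[iu]
    k = int(round(density * weights.size))
    keep = np.argsort(weights)[::-1][:k]  # indices of the k strongest edges
    A = np.zeros_like(W, dtype=int)
    A[iu[0][keep], iu[1][keep]] = 1
    return A + A.T                        # symmetrize

rng = np.random.default_rng(0)
W = rng.random((20, 20))
W = (W + W.T) / 2
np.fill_diagonal(W, 0)
A = threshold_to_density(W, 0.2)
density = A.sum() / (20 * 19)  # fraction of possible directed edge slots used
```

    Constructing networks at several densities, as the review recommends, is then just a loop over this function with different `density` values.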

  15. Measuring DNA Replication in Hypoxic Conditions.

    PubMed

    Foskolou, Iosifina P; Biasoli, Deborah; Olcina, Monica M; Hammond, Ester M

    2016-01-01

    It is imperative that dividing cells maintain replication fork integrity in order to prevent DNA damage and cell death. The investigation of DNA replication is of high importance, as alterations in this process can lead to genomic instability, a known causative factor of tumor development. A simple, sensitive, and informative technique that enables the study of DNA replication is the DNA fiber assay, an adaptation of which is described in this chapter. The DNA fiber method is a powerful tool that allows the quantitative and qualitative analysis of DNA replication at the single-molecule level. The sequential pulse labeling of live cells with two thymidine analogues and the subsequent detection with specific antibodies and fluorescence imaging allows direct examination of sites of DNA synthesis. In this chapter, we describe how this assay can be performed in conditions of low oxygen levels (hypoxia), a physiologically relevant stress that occurs in most solid tumors. Moreover, we suggest ways to overcome the technical problems that arise when using hypoxic chambers.
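
    Measured analogue track lengths from the fiber assay convert to fork speeds with a simple ratio; the 2.59 kb/um stretching factor below is the value commonly assumed for this assay, not a figure from this chapter:

```python
def fork_speed_kb_per_min(track_length_um, pulse_min, kb_per_um=2.59):
    """Replication fork speed from a measured analogue track length,
    assuming the commonly used ~2.59 kb per um of stretched fiber."""
    return track_length_um * kb_per_um / pulse_min

# e.g. a 10 um track laid down during a 20 min analogue pulse
v = fork_speed_kb_per_min(10.0, 20.0)  # ~1.3 kb/min
```

    Comparing such speeds (and the ratio of the two analogue track lengths) between normoxic and hypoxic cultures is the quantitative readout the assay provides.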

  16. Application of conditional entropy measures to steganalysis

    NASA Astrophysics Data System (ADS)

    Marsh, John; Knapik, Timothy; Lo, Ephraim; Heitzenrater, Chad

    2006-02-01

    Many commercial steganographic programs use least significant bit (LSB) embedding techniques to hide data in 24-bit color images. We present the results from a new steganalysis algorithm that uses a variety of entropy and conditional entropy features of various image bitplanes to detect the presence of LSB hiding. Our technique uses a Support Vector Machine (SVM) for bivariate classification; we use the SVMLight implementation due to Joachims (available at http://svmlight.joachims.org/). A novel Genetic Algorithm (GA) approach was used to optimize the feature set used by the classifier. Results include correct identification rates above 98% and false positive rates below 2%. We applied the technique to the steganography programs stegHide and Hide4PGP. The hiding algorithms are capable of both sequential and distributed LSB embedding. The image library consisted of 40,000 digital images of varying size and content, which form a diverse test set. Training sets consisted of as many as 34,000 images, half "clean" and the other half a disjoint set containing embedded data. The hidden data consisted of files with various sizes and information densities, ranging from very low average entropy (e.g., standard word processing or spreadsheet files) to very high entropy (compressed data). The testing phase used a similarly prepared set, disjoint from the training data. Our work includes comparisons with current state-of-the-art techniques and a detailed study of how results depend on training set size and the feature sets used.
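
    A minimal example of the kind of bitplane-entropy feature used (illustrative only; the full method feeds many such features, including conditional entropies across bitplanes, to the SVM):

```python
import numpy as np

def bitplane_entropy(channel, bit):
    """Shannon entropy (bits) of a single bitplane of an 8-bit channel."""
    plane = (channel >> bit) & 1
    p1 = plane.mean()
    if p1 in (0.0, 1.0):
        return 0.0
    p0 = 1.0 - p1
    return float(-(p0 * np.log2(p0) + p1 * np.log2(p1)))

# A smooth cover region has a biased, low-entropy LSB plane; embedding
# random bits drives it toward the 1-bit maximum
rng = np.random.default_rng(0)
cover = np.clip(rng.normal(128, 0.3, (64, 64)), 0, 255).round().astype(np.uint8)
h_cover = bitplane_entropy(cover, 0)

stego = cover.copy()
stego &= 0xFE                                             # clear the LSBs
stego |= rng.integers(0, 2, stego.shape, dtype=np.uint8)  # embed random bits
h_stego = bitplane_entropy(stego, 0)
```

    The gap between `h_cover` and `h_stego` is exactly the signal such features expose; real detectors work per-region and per-channel rather than on a whole synthetic patch.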

  17. Measuring the entropy from shifted boundary conditions

    NASA Astrophysics Data System (ADS)

    Giusti, L.; Pepe, M.

    We explore a new computational strategy for determining the equation of state of the SU(3) Yang-Mills theory. By imposing shifted boundary conditions, the entropy density is computed from the vacuum expectation value of the off-diagonal components T_{0k} of the energy-momentum tensor. A step-scaling function is introduced to span a wide range of temperature values. We present preliminary numerical results for the entropy density and its step-scaling function obtained at eight temperature values in the range from T_c to 15 T_c. At each temperature, discretization effects are removed by simulating the theory at several lattice spacings and by extrapolating the results to the continuum limit. Finite-size effects are always kept below the statistical errors. The absence of ultraviolet power divergences and the remarkably small discretization effects allow for a precise determination of the step-scaling function in the explored temperature range. These findings establish this strategy as a viable solution for an accurate determination of the equation of state in a wide range of temperature values.

  18. Validation of a CFD Methodology for Variable Speed Power Turbine Relevant Conditions

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.; Giel, Paul W.; McVetta, Ashlie B.

    2013-01-01

    Analysis tools are needed to investigate aerodynamic performance of Variable-Speed Power Turbines (VSPT) for rotorcraft applications. The VSPT operates at low Reynolds numbers (transitional flow) and over a wide range of incidence. Previously, the capability of a published three-equation turbulence model to predict accurately the transition location for three-dimensional heat transfer problems was assessed. In this paper, the results of a post-diction exercise using a three-dimensional flow in a transonic linear cascade comprising VSPT blading are presented. The measured blade pressure distributions and exit total pressure and flow angles for two incidence angles corresponding to cruise (i = 5.8deg) and takeoff (i = -36.7deg) were used for this study. For the higher loading condition of cruise and the negative incidence condition of takeoff, overall agreement with data may be considered satisfactory but areas of needed improvement are also indicated.

  19. Optimization of culture conditions to improve Helicobacter pylori growth in Ham's F-12 medium by response surface methodology.

    PubMed

    Bessa, L J; Correia, D M; Cellini, L; Azevedo, N F; Rocha, I

    2012-01-01

    Helicobacter pylori is a gastroduodenal pathogen that colonizes the human stomach and is the causal agent of gastric diseases. From the clinical and epidemiological point of view, enhancing and improving the growth of this bacterium in liquid media is an important goal to achieve in order to allow the performance of accurate physiological studies. The aim of this work was to optimize three culture conditions that influence the growth of H. pylori in the defined medium Ham's F-12 supplemented with 5 percent fetal bovine serum by using response surface methodology as a statistical technique to obtain the optimal conditions. The factors studied in this experimental design (Box-Behnken design) were the pH of the medium, the shaking speed (rpm) and the percentage of atmospheric oxygen, in a total of 17 experiments. The biomass specific growth rate was the response measured. The model was validated for pH and shaking speed. The percentage of atmospheric oxygen did not influence the growth for the range of values studied. At the optimal values found for pH and shaking speed, 8 and 130 rpm, respectively, a specific growth rate value of 0.164 h^-1, corresponding to a maximal concentration of approximately 1.5×10^8 CFU/ml, was reached after 8 h. The experimental design strategy allowed, for the first time, the optimization of H. pylori growth in a semi-synthetic medium, which may be important to improve physiological and metabolic studies of this fastidious bacterium.
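    A three-factor Box-Behnken design with a quadratic response surface, as used above, can be sketched with a least-squares fit. The design matrix follows the standard three-factor Box-Behnken layout (12 edge midpoints plus 5 centre replicates = 17 runs, matching the study), but the response values and model coefficients below are hypothetical, not the study's data.

```python
import numpy as np

# Coded levels (-1, 0, +1) for x1 = pH, x2 = shaking speed, x3 = % oxygen.
design = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0], [0, 0, 0], [0, 0, 0],
], dtype=float)

def quadratic_model_matrix(x):
    """Columns: 1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1^2, x2^2, x3^2."""
    x1, x2, x3 = x.T
    return np.column_stack([np.ones(len(x)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

# Hypothetical specific-growth-rate responses (h^-1), for illustration only:
# a concave surface in pH and rpm, insensitive to oxygen.
rng = np.random.default_rng(1)
true_beta = np.array([0.16, 0.02, 0.01, 0.0, 0.0, 0.0, 0.0, -0.03, -0.02, 0.0])
X = quadratic_model_matrix(design)
y = X @ true_beta + rng.normal(0.0, 0.002, len(design))

# Fit the full quadratic model; negative squared-term coefficients indicate
# an interior optimum, as found for pH and shaking speed in the study.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 3))
```

    With real data, the stationary point of the fitted surface gives the optimal factor settings in coded units, which are then converted back to physical values.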

  20. A coupled Bayesian and fault tree methodology to assess future groundwater conditions in light of climate change

    NASA Astrophysics Data System (ADS)

    Huang, J. J.; Du, M.; McBean, E. A.; Wang, H.; Wang, J.

    2014-08-01

    Maintaining acceptable groundwater levels, particularly in arid areas, while protecting ecosystems is a key measure against desertification. Due to complicated hydrological processes and their inherent uncertainties, investigations of groundwater recharge conditions are challenging, particularly in arid areas under changing climatic conditions. To assist planning to protect against desertification, a fault tree methodology, in conjunction with fuzzy logic and Bayesian data mining, is applied to Minqin Oasis, a highly vulnerable region in northern China. A set of risk factors is employed within the fault tree framework, with fuzzy logic translating qualitative risk data into probabilities. Bayesian data mining is used to quantify the contribution of each risk factor to the final aggregated risk. The implications of both historical and future climate trends are employed for temperature, precipitation and potential evapotranspiration (PET) to assess water table changes under various future scenarios. The findings indicate that water table levels will continue to drop at a rate of 0.6 m yr^-1 in the future when climatic effects alone are considered, if agricultural and industrial production capacity remain at 2004 levels.
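    A fault-tree aggregation of fuzzy-derived probabilities, as described above, might be sketched as follows. The rating-to-probability mapping, the gate structure, and the event list are all illustrative assumptions, not the paper's calibrated model.

```python
# Illustrative sketch (not the study's model): qualitative risk ratings are
# mapped to probabilities, then combined through fault-tree gates assuming
# independent basic events.

RATING_TO_PROB = {"low": 0.1, "medium": 0.3, "high": 0.6}  # assumed mapping

def or_gate(probs):
    """P(at least one event occurs), for independent events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(all events occur), for independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical basic events feeding a "groundwater decline" top event:
climate = or_gate([RATING_TO_PROB["high"],     # reduced precipitation
                   RATING_TO_PROB["medium"]])  # increased PET
abstraction = and_gate([RATING_TO_PROB["high"],      # agricultural pumping
                        RATING_TO_PROB["medium"]])   # capacity held at 2004 level
top = or_gate([climate, abstraction])
print(round(top, 3))
```

    In the paper, Bayesian data mining then apportions the top-event risk among the basic events; here the gates alone show how qualitative ratings propagate to an aggregated probability.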

  1. Response surface methodology for optimising the culture conditions for eicosapentaenoic acid production by marine bacteria.

    PubMed

    Abd Elrazak, Ahmed; Ward, Alan C; Glassey, Jarka

    2013-05-01

    Polyunsaturated fatty acids (PUFAs), especially eicosapentaenoic acid (EPA), are increasingly attracting scientific attention owing to their significant health-promoting role in the human body. However, the human body lacks the ability to produce them in vivo. The limitations associated with the current sources of ω-3 fatty acids from animal and plant sources have led to increased interest in microbial production. Bacterial isolate 717 was identified as a potential high EPA producer. As an important step in the process development of the microbial PUFA production, the culture conditions at the bioreactor scale were optimised for the isolate 717 using a response surface methodology exploring the significant effect of temperature, pH and dissolved oxygen and the interaction between them on the EPA production. This optimisation strategy led to a significant increase in the amount of EPA produced by the isolate under investigation, where the amount of EPA increased from 9 mg/g biomass (33 mg/l representing 7.6 % of the total fatty acids) to 45 mg/g (350 mg/l representing 25 % of the total fatty acids). To avoid additional costs associated with extreme cooling at large scale, a temperature shock experiment was carried out reducing the overall cooling time from the whole cultivation process to 4 h only prior to harvest. The ability of the organism to produce EPA under the complete absence of oxygen was tested revealing that oxygen is not critically required for the biosynthesis of EPA but the production improved in the presence of oxygen. The stability of the produced oil and the complete absence of heavy metals in the bacterial biomass are considered as an additional benefit of bacterial EPA compared to other sources of PUFA. To our knowledge this is the first report of a bacterial isolate producing EPA with such high yields making the large-scale manufacture much more economically viable.

  2. Surface monitoring measurements of materials on environmental change conditions

    NASA Astrophysics Data System (ADS)

    Tornari, Vivi; Bernikola, Eirini; Bellendorf, Paul; Bertolin, Chiara; Camuffo, Dario; Kotova, Lola; Jacobs, Daniela; Zarnic, Roko; Rajcic, Vlatka; Leissner, Johanna

    2013-05-01

    Climate change is one of the most critical global challenges of our time, and the cultural heritage of Europe is particularly vulnerable if left unprotected. The Climate for Culture project investigates the damage impact of climate change on cultural heritage at regional scale. In this paper the progress of the study is described, combining in situ measurements and investigations at cultural heritage sites throughout Europe with laboratory simulations. Cultural works of art are susceptible to deterioration, with environmental changes causing an imperceptibly slow but steady accumulation of damaging effects that bear directly on structural integrity. A laser holographic interference method is employed to provide remote, non-destructive, field-wise detection of the structural differences occurring as climate responses. The first results from a climate simulation of South East Europe (Crete) are presented. A full study covering the four climate regions of Europe is foreseen to provide values for the development of a precise and integrated model of thermographic building simulations for evaluating the impact of climate change. A third-generation, software-optimised portable metrology system (DHSPI II) is designed to record at custom intervals the surfaces of materials witnessing reactions under simulated climatic conditions, both in the field and in the laboratory. The climate conditions refer to real data-logger readings representing a characteristic historical building in selected climate zones. New-generation impact sensors, termed Glass Sensors and Free Water Sensors, are employed in the monitoring procedure to cross-correlate climate data with deformation data. Results from the combined methodology are also presented.

  3. Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?

    ERIC Educational Resources Information Center

    Brondani, Mario; He, Sarah

    2013-01-01

    Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…

  4. Computer Science and Technology: Measurement of Interactive Computing: Methodology and Application.

    ERIC Educational Resources Information Center

    Cotton, Ira W.

    This dissertation reports the development and application of a new methodology for the measurement and evaluation of interactive computing, applied to either the users of an interactive computing system or to the system itself, including the service computer and any communications network through which the service is delivered. The focus is on the…

  5. Integrating Multi-Tiered Measurement Outcomes for Special Education Eligibility with Sequential Decision-Making Methodology

    ERIC Educational Resources Information Center

    Decker, Scott L.; Englund, Julia; Albritton, Kizzy

    2012-01-01

    Changes to federal guidelines for the identification of children with disabilities have supported the use of multi-tiered models of service delivery. This study investigated the impact of measurement methodology as used across numerous tiers in determining special education eligibility. Four studies were completed using a sample of inner-city…

  6. 39 CFR 3055.5 - Changes to measurement systems, service standards, service goals, or reporting methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 39 Postal Service 1 2014-07-01 2014-07-01 false Changes to measurement systems, service standards, service goals, or reporting methodologies. 3055.5 Section 3055.5 Postal Service POSTAL REGULATORY COMMISSION PERSONNEL SERVICE PERFORMANCE AND CUSTOMER SATISFACTION REPORTING Annual Reporting of...

  7. 39 CFR 3055.5 - Changes to measurement systems, service standards, service goals, or reporting methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 39 Postal Service 1 2011-07-01 2011-07-01 false Changes to measurement systems, service standards, service goals, or reporting methodologies. 3055.5 Section 3055.5 Postal Service POSTAL REGULATORY COMMISSION PERSONNEL SERVICE PERFORMANCE AND CUSTOMER SATISFACTION REPORTING Annual Reporting of...

  8. 39 CFR 3055.5 - Changes to measurement systems, service standards, service goals, or reporting methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 39 Postal Service 1 2013-07-01 2013-07-01 false Changes to measurement systems, service standards, service goals, or reporting methodologies. 3055.5 Section 3055.5 Postal Service POSTAL REGULATORY COMMISSION PERSONNEL SERVICE PERFORMANCE AND CUSTOMER SATISFACTION REPORTING Annual Reporting of...

  9. 39 CFR 3055.5 - Changes to measurement systems, service standards, service goals, or reporting methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 39 Postal Service 1 2012-07-01 2012-07-01 false Changes to measurement systems, service standards, service goals, or reporting methodologies. 3055.5 Section 3055.5 Postal Service POSTAL REGULATORY COMMISSION PERSONNEL SERVICE PERFORMANCE AND CUSTOMER SATISFACTION REPORTING Annual Reporting of...

  12. Measuring College Students' Alcohol Consumption in Natural Drinking Environments: Field Methodologies for Bars and Parties

    ERIC Educational Resources Information Center

    Clapp, John D.; Holmes, Megan R.; Reed, Mark B.; Shillington, Audrey M.; Freisthler, Bridget; Lange, James E.

    2007-01-01

    In recent years researchers have paid substantial attention to the issue of college students' alcohol use. One limitation to the current literature is an over reliance on retrospective, self-report survey data. This article presents field methodologies for measuring college students' alcohol consumption in natural drinking environments.…

  15. Difficult to measure constructs: conceptual and methodological issues concerning participation and environmental factors.

    PubMed

    Whiteneck, Gale; Dijkers, Marcel P

    2009-11-01

    For rehabilitation and disability research, participation and environment are 2 crucial constructs that have been placed center stage by the International Classification of Functioning, Disability and Health (ICF). However, neither construct is adequately conceptualized by the ICF, and both are difficult to measure. This article addresses conceptual and methodologic issues related to these ICF constructs, and recommends an improved distinction between activities and participation, as well as elaboration of environment. A division of the combined ICF categories for activity and participation into 2 separate taxonomies is proposed to guide future research. The issue of measuring participation from objective and subjective perspectives is examined, and maintaining these distinct conceptual domains in the measurement of participation is recommended. The methodological issues contributing to the difficulty of measuring participation are discussed, including potential dimensionality, alternative metrics, and the appropriateness of various measurement models. For environment, the need for theory to focus research on those aspects of the environment that interact with individuals' impairments and functional limitations in affecting activities and participation is discussed, along with potential measurement models for those aspects. The limitations resulting from reliance on research participants as reporters on their own environment are set forth. Addressing these conceptual and methodological issues is required before the measurement of participation and environmental factors can advance and these important constructs can be used more effectively in rehabilitation and disability observational research and trials.

  16. Scalar mixing and strain dynamics methodologies for PIV/LIF measurements of vortex ring flows

    NASA Astrophysics Data System (ADS)

    Bouremel, Yann; Ducci, Andrea

    2017-01-01

    Fluid mixing operations are central to virtually all chemical, petrochemical, and pharmaceutical industries, whether related to biphasic blending in polymerisation processes, cell suspension for biopharmaceuticals production, or fractionation of complex oil mixtures. This work aims at providing a fundamental understanding of the mixing and stretching dynamics occurring in a reactor in the presence of a vortical structure; the vortex ring was selected as a flow paradigm of vortices commonly encountered in stirred and shaken reactors under laminar flow conditions. High-resolution laser-induced fluorescence and particle image velocimetry measurements were carried out to fully resolve the flow dissipative scales and provide a complete data set for assessing macro- and micro-mixing characteristics. The analysis builds upon the Lamb-Oseen vortex work of Meunier and Villermaux ["How vortices mix," J. Fluid Mech. 476, 213-222 (2003)] and the engulfment model of Baldyga and Bourne ["Simplification of micromixing calculations. I. Derivation and application of new model," Chem. Eng. J. 42, 83-92 (1989); "Simplification of micromixing calculations. II. New applications," ibid. 42, 93-101 (1989)], which are valid for diffusion-free conditions, and a comparison is made between three methodologies for assessing mixing characteristics. The first method is commonly used in macro-mixing studies and is based on a control-area analysis estimating the variation in time of the concentration standard deviation, while the other two are formulated to provide insight into local segregation dynamics, using either an iso-concentration approach or an iso-concentration-gradient approach to take diffusion into account.
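    The first (control-area) method above tracks the decay of the concentration standard deviation as mixing proceeds. A minimal sketch of such a segregation-intensity measure, applied to synthetic concentration fields rather than PIV/LIF data, is:

```python
import numpy as np

def segregation_intensity(c):
    """Normalized concentration variance over a control area:
    I = var(c) / (mean(c) * (1 - mean(c))), for c scaled to [0, 1].
    I = 1 for a fully segregated field, I -> 0 as the field homogenizes."""
    m = c.mean()
    return c.var() / (m * (1.0 - m))

# Fully segregated field: half dye (c = 1), half clean fluid (c = 0).
segregated = np.concatenate([np.ones(5000), np.zeros(5000)])
# Fully mixed field: uniform concentration at the mean value.
mixed = np.full(10000, 0.5)

print(segregation_intensity(segregated))  # 1.0
print(segregation_intensity(mixed))       # 0.0
```

    Evaluating this quantity frame by frame on an LIF concentration sequence gives the macro-mixing decay curve; the iso-concentration approaches of the paper refine this with local, gradient-based statistics.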

  17. 2015 Review on the Extension of the AMedP-8(C) Methodology to New Agents, Materials, and Conditions

    DTIC Science & Technology

    2016-06-01

    2015 Review on the Extension of the AMedP-8(C) Methodology to New Agents, Materials, and Conditions, by Carl A. Curling and Mark E. Bohannon. Institute for Defense Analyses, 4850 Mark Center Drive, Alexandria, Virginia 22311-1882, June 2016. IDA Document D-8047, Log: H 16-000769. Approved for public release; distribution is unlimited.

  18. Analytical Methodology Used To Assess/Refine Observatory Thermal Vacuum Test Conditions For the Landsat 8 Data Continuity Mission

    NASA Technical Reports Server (NTRS)

    Fantano, Louis

    2015-01-01

    Thermal and Fluids Analysis Workshop Silver Spring, MD NCTS 21070-15 The Landsat 8 Data Continuity Mission, which is part of the United States Geologic Survey (USGS), launched February 11, 2013. A Landsat environmental test requirement mandated that test conditions bound worst-case flight thermal environments. This paper describes a rigorous analytical methodology applied to assess refine proposed thermal vacuum test conditions and the issues encountered attempting to satisfy this requirement.

  19. Nonlinear Parabolic Equations Involving Measures as Initial Conditions.

    DTIC Science & Technology

    1981-09-01

    Nonlinear Parabolic Equations Involving Measures as Initial Conditions, by Haim Brezis and Avner Friedman. MRC Technical Summary Report #2277, September 1981.

  20. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review.

    PubMed

    Pollard, Beth; Johnston, Marie; Dixon, Diane

    2007-03-07

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criteria in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of their i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest) and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general the patient self-report measures had good methodological development, the

  1. A system and methodology for measuring volatile organic compounds produced by hydroponic lettuce in a controlled environment

    NASA Technical Reports Server (NTRS)

    Charron, C. S.; Cantliffe, D. J.; Wheeler, R. M.; Manukian, A.; Heath, R. R.

    1996-01-01

    A system and methodology were developed for the nondestructive qualitative and quantitative analysis of volatile emissions from hydroponically grown 'Waldmann's Green' leaf lettuce (Lactuca sativa L.). Photosynthetic photon flux (PPF), photoperiod, and temperature were automatically controlled and monitored in a growth chamber modified for the collection of plant volatiles. The lipoxygenase pathway products (Z)-3-hexenal, (Z)-3-hexenol, and (Z)-3-hexenyl acetate were emitted by lettuce plants after the transition from the light period to the dark period. The volatile collection system developed in this study enabled measurements of volatiles emitted by intact plants, from planting to harvest, under controlled environmental conditions.

  3. A system and methodology for measuring volatile organic compounds produced by hydroponic lettuce in a controlled environment.

    PubMed

    Charron, C S; Cantliffe, D J; Wheeler, R M; Manukian, A; Heath, R R

    1996-05-01

    A system and methodology were developed for the nondestructive qualitative and quantitative analysis of volatile emissions from hydroponically grown 'Waldmann's Green' leaf lettuce (Lactuca sativa L.). Photosynthetic photon flux (PPF), photoperiod, and temperature were automatically controlled and monitored in a growth chamber modified for the collection of plant volatiles. The lipoxygenase pathway products (Z)-3-hexenal, (Z)-3-hexenol, and (Z)-3-hexenyl acetate were emitted by lettuce plants after the transition from the light period to the dark period. The volatile collection system developed in this study enabled measurements of volatiles emitted by intact plants, from planting to harvest, under controlled environmental conditions.

  4. ASSESSING THE CONDITION OF SOUTH CAROLINA'S ESTUARIES: A NEW APPROACH INVOLVING INTEGRATED MEASURES OF CONDITION

    EPA Science Inventory

    The South Carolina Estuarine and Coastal Assessment Program (SCECAP) was initiated in 1999 to assess the condition of the state's coastal habitats using multiple measures of water quality, sediment quality, and biological condition. Sampling has subsequently been expanded to incl...

  6. Small mass measurement instrument for measuring weight under weightless conditions

    NASA Astrophysics Data System (ADS)

    Solberg, R. F., Jr.

    1984-05-01

    A small mass measurement instrument (SMMI), developed for NASA experiments conducted in a near-zero-gravity environment, is described. The SMMI is based on the principle of an oscillating spring-mass system, for which the period of oscillation is a function of the system's mass. It can weigh specimens ranging from below 1 g to over 10,000 g, with an accuracy of 0.05 percent. The instrument has a keyboard, liquid crystal display, and microprocessor, which provide capabilities for entering and deleting data; display of messages, prompts, and specimen weight values; memory; and self-calibration. The SMMI is scheduled for use beginning with Spacelab 4. The description includes the SMMI block diagram, detailed descriptions of the principles involved in the construction of the instrument's assemblies, and photographs of its various parts.
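    The spring-mass principle behind the SMMI can be illustrated with a short worked sketch: from T = 2π√(m/k), the measured period yields the mass. The spring constant and tare mass below are made-up values for illustration, not SMMI specifications.

```python
import math

def mass_from_period(period_s, spring_constant, tare_mass=0.0):
    """Mass (kg) from the oscillation period of a spring-mass system:
    T = 2*pi*sqrt(m_total / k)  =>  m_total = k * (T / (2*pi))**2.
    tare_mass is the effective mass of the empty tray/fixture, removed
    here in place of the instrument's self-calibration."""
    total = spring_constant * (period_s / (2.0 * math.pi)) ** 2
    return total - tare_mass

# Illustrative numbers: k = 100 N/m, empty-tray effective mass 0.050 kg.
k = 100.0
tare = 0.050
m = 0.200  # a 200 g specimen
T = 2.0 * math.pi * math.sqrt((m + tare) / k)  # period the instrument would time
print(round(mass_from_period(T, k, tare), 6))  # 0.2
```

    Because only the period is measured, the method works identically in weightlessness, which is what makes it suitable for on-orbit mass measurement.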

  7. [Basic questionnaire and methodological criteria for Surveys on Working Conditions, Employment, and Health in Latin America and the Caribbean].

    PubMed

    Benavides, Fernando G; Merino-Salazar, Pamela; Cornelio, Cecilia; Assunção, Ada Avila; Agudelo-Suárez, Andrés A; Amable, Marcelo; Artazcoz, Lucía; Astete, Jonh; Barraza, Douglas; Berhó, Fabián; Milián, Lino Carmenate; Delclòs, George; Funcasta, Lorena; Gerke, Johanna; Gimeno, David; Itatí-Iñiguez, María José; Lima, Eduardo de Paula; Martínez-Iñigo, David; Medeiros, Adriane Mesquita de; Orta, Lida; Pinilla, Javier; Rodrigo, Fernando; Rojas, Marianela; Sabastizagal, Iselle; Vallebuona, Clelia; Vermeylen, Greet; Villalobos, Gloria H; Vives, Alejandra

    2016-10-10

    This article aimed to present a basic questionnaire and minimum methodological criteria for consideration in future Surveys on Working Conditions, Employment, and Health in Latin America and the Caribbean. A virtual and face-to-face consensus process was conducted with participation by a group of international experts who used the surveys available up until 2013 as the point of departure for defining the proposal. The final questionnaire included 77 questions grouped in six dimensions: socio-demographic characteristics of workers and companies; employment conditions; working conditions; health status; resources and preventive activities; and family characteristics. The minimum methodological criteria feature the interviewee's home as the place for the interview and aspects related to the quality of the fieldwork. These results can help improve the comparability of future surveys in Latin America and the Caribbean, which would in turn help improve information on workers' health in the region.

  8. The development and application of an automatic boundary segmentation methodology to evaluate the vaporizing characteristics of diesel spray under engine-like conditions

    NASA Astrophysics Data System (ADS)

    Ma, Y. J.; Huang, R. H.; Deng, P.; Huang, S.

    2015-04-01

    Studying the vaporizing characteristics of diesel spray could greatly help to reduce engine emissions and improve performance. The high-speed schlieren imaging method is an important optical technique for investigating the macroscopic vaporizing morphological evolution of liquid fuel, and pre-combustion constant volume combustion bombs are often used to simulate the high pressure and high temperature conditions occurring in diesel engines. Complicated background schlieren noises make it difficult to segment the spray region in schlieren spray images. To tackle this problem, this paper develops a vaporizing spray boundary segmentation methodology based on an automatic threshold determination algorithm. The methodology was also used to quantify the macroscopic characteristics of vaporizing sprays, including tip penetration, near-field and far-field angles, and projected spray area and spray volume. The spray boundary segmentation methodology was realized in a MATLAB-based program. Comparisons were made between the spray characteristics obtained using the program method and those acquired using a manual method and the Hiroyasu prediction model. It is demonstrated that the methodology can segment and measure vaporizing sprays precisely and efficiently. Furthermore, the experimental results show that the spray angles were slightly affected by the injection pressure at high temperature and high pressure and under inert conditions. A higher injection pressure leads to longer spray tip penetration and a larger projected area and volume, while elevating the temperature of the environment can significantly promote the evaporation of cold fuel.
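    The paper's automatic threshold determination algorithm is realized in MATLAB and not reproduced here; the sketch below instead uses Otsu's classical between-class-variance threshold as a stand-in, together with illustrative spray metrics (projected area and tip penetration) computed on a synthetic image. The pixel scale and nozzle position are assumed values.

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's automatic threshold for an 8-bit grayscale image:
    pick the level that maximizes the between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        m0 = sum0 / w0
        m1 = (sum_all - sum0) / (total - w0)
        var_between = w0 * (total - w0) * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def spray_metrics(gray, nozzle_row=0, mm_per_px=0.1):
    """Projected area (mm^2) and tip penetration (mm) from the binary spray mask."""
    mask = gray > otsu_threshold(gray)
    area = mask.sum() * mm_per_px ** 2
    rows = np.nonzero(mask.any(axis=1))[0]
    penetration = (rows.max() - nozzle_row) * mm_per_px if rows.size else 0.0
    return area, penetration

# Synthetic schlieren-like frame: dark background with a brighter "spray" region.
img = np.full((100, 80), 20, dtype=np.uint8)
img[0:40, 30:50] = 200
area, pen = spray_metrics(img)
print(area, pen)
```

    Real schlieren frames would first need background subtraction to suppress the density-gradient noise the paper discusses; spray angles and volume follow from the same binary mask.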

  9. Conditional Inference and Logic for Intelligent Systems: A Theory of Measure-Free Conditioning

    DTIC Science & Technology

    1991-08-01

    Conditional Inference and Logic for Intelligent Systems: A Theory of Measure-Free Conditioning, by I. R. Goodman, Command and Control Department. The report pursues a complete and satisfactory theory of "measure-free" conditioning, formalizing the concept of the "conditional event" and a suitable algebra of conditional events.

  10. Thermal effusivity measurements for liquids: a self-consistent photoacoustic methodology.

    PubMed

    Balderas-López, J A

    2007-06-01

    A self-consistent photoacoustic methodology for the measurement of the thermal effusivity for liquids is presented. This methodology makes use of the analytical solution for the one-dimensional heat diffusion problem for a single layer, assuming a harmonic heat source in the surface absorption limit. The analytical treatment involves fitting procedures over normalized amplitudes and phases, obtained as the ratio of photoacoustic signals in the front configuration with and without the liquid sample, as functions of the modulation frequency. Two values of thermal effusivity for each liquid sample are obtained, one from the analysis of the normalized amplitudes and the other one from the normalized phases. The comparison between the experimental and theoretical phases allows the description of a simple criterion for deciding on the appropriate modulation frequency range for the analysis in each case. This methodology was applied for measuring the thermal effusivity of some pure liquids; a very good agreement between the thermal effusivity values obtained by this methodology and the corresponding ones reported in the literature was obtained.

  11. An Inverse Problem for a Class of Conditional Probability Measure-Dependent Evolution Equations.

    PubMed

    Mirzaev, Inom; Byrne, Erin C; Bortz, David M

    2016-01-01

    We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by Partial Differential Equation (PDE) models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach.

  12. An inverse problem for a class of conditional probability measure-dependent evolution equations

    NASA Astrophysics Data System (ADS)

    Mirzaev, Inom; Byrne, Erin C.; Bortz, David M.

    2016-09-01

    We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by partial differential equation models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach.

  13. 42 CFR 486.318 - Condition: Outcome measures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SUPPLIERS Requirements for Certification and Designation and Conditions for Coverage: Organ Procurement Organizations Organ Procurement Organization Outcome Requirements § 486.318 Condition: Outcome measures. (a..., territories, or possessions, an OPO must meet all 3 of the following outcome measures: (1) The OPO's...

  14. 42 CFR 486.318 - Condition: Outcome measures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... SUPPLIERS Requirements for Certification and Designation and Conditions for Coverage: Organ Procurement Organizations Organ Procurement Organization Outcome Requirements § 486.318 Condition: Outcome measures. (a..., territories, or possessions, an OPO must meet all 3 of the following outcome measures: (1) The OPO's...

  15. 42 CFR 486.318 - Condition: Outcome measures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SUPPLIERS Requirements for Certification and Designation and Conditions for Coverage: Organ Procurement Organizations Organ Procurement Organization Outcome Requirements § 486.318 Condition: Outcome measures. (a..., territories, or possessions, an OPO must meet all 3 of the following outcome measures: (1) The OPO's...

  16. 42 CFR 486.318 - Condition: Outcome measures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SUPPLIERS Requirements for Certification and Designation and Conditions for Coverage: Organ Procurement Organizations Organ Procurement Organization Outcome Requirements § 486.318 Condition: Outcome measures. (a..., territories, or possessions, an OPO must meet all 3 of the following outcome measures: (1) The OPO's...

  17. A practical methodology to measure unbiased gas chromatographic retention factor vs. temperature relationships.

    PubMed

    Peng, Baijie; Kuo, Mei-Yi; Yang, Panhia; Hewitt, Joshua T; Boswell, Paul G

    2014-12-29

    Compound identification continues to be a major challenge. Gas chromatography-mass spectrometry (GC-MS) is a primary tool used for this purpose, but the GC retention information it provides is underutilized because existing retention databases are experimentally restrictive and unreliable. A methodology called "retention projection" has the potential to overcome these limitations, but it requires the retention factor (k) vs. T relationship of a compound to calculate its retention time. Direct methods of measuring k vs. T relationships from a series of isothermal runs are tedious and time-consuming. Instead, a series of temperature programs can be used to quickly measure the k vs. T relationships, but they are generally not as accurate when measured this way because they are strongly biased by non-ideal behavior of the GC system in each of the runs. In this work, we overcome that problem by using the retention times of 25 n-alkanes to back-calculate the effective temperature profile and hold-up time vs. T profiles produced in each of the six temperature programs. When the profiles were measured this way and taken into account, the k vs. T relationships measured from each of two different GC-MS instruments were nearly as accurate as the ones measured isothermally, showing less than two-fold more error. Furthermore, temperature-programmed retention times calculated in five other laboratories from the new k vs. T relationships had the same distribution of error as when they were calculated from k vs. T relationships measured isothermally. Free software was developed to make the methodology easy to use. The new methodology potentially provides a relatively fast and easy way to measure unbiased k vs. T relationships. Copyright © 2014 Elsevier B.V. All rights reserved.
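The abstract does not state the functional form used for the k vs. T relationship; a common two-parameter retention model is ln k = A + B/T. As a hedged sketch, the following fits that form to hypothetical isothermal measurements by linear least squares and then projects k at an arbitrary temperature (all numbers are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical isothermal measurements: retention factor k at several
# column temperatures T (kelvin). Values are illustrative only.
T = np.array([313.0, 333.0, 353.0, 373.0, 393.0])
k = np.array([42.0, 14.8, 6.1, 2.8, 1.4])

# Fit ln k = A + B/T by ordinary least squares on the linearized model.
X = np.column_stack([np.ones_like(T), 1.0 / T])
coef, *_ = np.linalg.lstsq(X, np.log(k), rcond=None)
A, B = coef

def k_of_T(temp):
    """Project the retention factor at an arbitrary temperature (K)."""
    return np.exp(A + B / temp)
```

Once a compound's fitted (A, B) pair is stored, temperature-programmed retention times can be computed by integrating k over the program, which is the essence of retention projection.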

  18. A Practical Methodology to Measure Unbiased Gas Chromatographic Retention Factor vs. Temperature Relationships

    PubMed Central

    Peng, Baijie; Kuo, Mei-Yi; Yang, Panhia; Hewitt, Joshua T.; Boswell, Paul G.

    2014-01-01

    Compound identification continues to be a major challenge. Gas chromatography-mass spectrometry (GC-MS) is a primary tool used for this purpose, but the GC retention information it provides is underutilized because existing retention databases are experimentally restrictive and unreliable. A methodology called “retention projection” has the potential to overcome these limitations, but it requires the retention factor (k) vs. T relationship of a compound to calculate its retention time. Direct methods of measuring k vs. T relationships from a series of isothermal runs are tedious and time-consuming. Instead, a series of temperature programs can be used to quickly measure the k vs. T relationships, but they are generally not as accurate when measured this way because they are strongly biased by non-ideal behavior of the GC system in each of the runs. In this work, we overcome that problem by using the retention times of 25 n-alkanes to back-calculate the effective temperature profile and hold-up time vs. T profiles produced in each of six temperature programs. When the profiles were measured this way and taken into account, the k vs. T relationships measured from each of two different GC-MS instruments were nearly as accurate as the ones measured isothermally, showing less than 2-fold more error. Furthermore, temperature-programmed retention times calculated in five other labs from the new k vs. T relationships had the same distribution of error as when they were calculated from k vs. T relationships measured isothermally. Free software was developed to make the methodology easy to use. The new methodology potentially provides a relatively fast and easy way to measure unbiased k vs. T relationships. PMID:25496658

  19. A new transmission methodology for quality assurance in radiotherapy based on radiochromic film measurements.

    PubMed

    do Amaral, Leonardo L; de Oliveira, Harley F; Pavoni, Juliana F; Sampaio, Francisco; Ghilardi Netto, Thomaz

    2015-09-08

    Despite individual quality assurance (QA) being recommended for complex techniques in radiotherapy (RT) treatment, the possibility of errors in dose delivery during therapeutic application has been verified. Therefore, it is fundamentally important to conduct in vivo QA during treatment. This work presents an in vivo transmission quality control methodology, using radiochromic film (RCF) coupled to the linear accelerator (linac) accessory holder. This QA methodology compares the dose distribution measured by the film in the linac accessory holder with the dose distribution expected by the treatment planning software. The calculated dose distribution is obtained in the coronal and central plane of a phantom with the same dimensions of the acrylic support used for positioning the film but in a source-to-detector distance (SDD) of 100 cm, as a result of transferring the IMRT plan in question with all the fields positioned with the gantry vertically, that is, perpendicular to the phantom. To validate this procedure, first of all a Monte Carlo simulation using PENELOPE code was done to evaluate the differences between the dose distributions measured by the film in a SDD of 56.8 cm and 100 cm. After that, several simple dose distribution tests were evaluated using the proposed methodology, and finally a study using IMRT treatments was done. In the Monte Carlo simulation, the mean percentage of points approved in the gamma function comparing the dose distribution acquired in the two SDDs were 99.92% ± 0.14%. In the simple dose distribution tests, the mean percentage of points approved in the gamma function were 99.85% ± 0.26% and the mean percentage differences in the normalization point doses were -1.41%. The transmission methodology was approved in 24 of 25 IMRT test irradiations. Based on these results, it can be concluded that the proposed methodology using RCFs can be applied for in vivo QA in RT treatments.
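The gamma-function comparison used above to score measured against calculated dose distributions can be sketched as a brute-force global 2D gamma analysis. The 3%/3 mm criteria, grid, and dose planes below are illustrative assumptions, not the paper's film data:

```python
import numpy as np

def gamma_pass_rate(d_ref, d_eval, spacing_mm, dd=0.03, dta_mm=3.0):
    """Brute-force global 2D gamma analysis; returns the fraction of
    reference points with a gamma index <= 1 (here 3%/3 mm by default)."""
    ny, nx = d_ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    norm = d_ref.max()                        # global dose normalization
    gammas = np.empty((ny, nx))
    for i in range(ny):
        for j in range(nx):
            dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
            dose2 = ((d_eval - d_ref[i, j]) / (dd * norm)) ** 2
            gammas[i, j] = np.sqrt(np.min(dist2 / dta_mm ** 2 + dose2))
    return float((gammas <= 1.0).mean())

# A smooth reference "dose plane" and a 1% rescaled copy of it.
ref = np.outer(np.hanning(32), np.hanning(32)) * 100.0
rate_same = gamma_pass_rate(ref, ref, spacing_mm=1.0)
rate_scaled = gamma_pass_rate(ref, 1.01 * ref, spacing_mm=1.0)
```

A 1% global dose rescaling stays well inside a 3% dose tolerance, so both comparisons pass at every point; production QA tools use accelerated search strategies, but the criterion itself is the one shown.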

  20. A new transmission methodology for quality assurance in radiotherapy based on radiochromic film measurements.

    PubMed

    do Amaral, Leonardo L; de Oliveira, Harley F; Pavoni, Juliana F; Sampaio, Francisco; Netto, Thomaz Ghilardi

    2015-09-01

    Despite individual quality assurance (QA) being recommended for complex techniques in radiotherapy (RT) treatment, the possibility of errors in dose delivery during therapeutic application has been verified. Therefore, it is fundamentally important to conduct in vivo QA during treatment. This work presents an in vivo transmission quality control methodology, using radiochromic film (RCF) coupled to the linear accelerator (linac) accessory holder. This QA methodology compares the dose distribution measured by the film in the linac accessory holder with the dose distribution expected by the treatment planning software. The calculated dose distribution is obtained in the coronal and central plane of a phantom with the same dimensions of the acrylic support used for positioning the film but in a source-to-detector distance (SDD) of 100 cm, as a result of transferring the IMRT plan in question with all the fields positioned with the gantry vertically, that is, perpendicular to the phantom. To validate this procedure, first of all a Monte Carlo simulation using PENELOPE code was done to evaluate the differences between the dose distributions measured by the film in a SDD of 56.8 cm and 100 cm. After that, several simple dose distribution tests were evaluated using the proposed methodology, and finally a study using IMRT treatments was done. In the Monte Carlo simulation, the mean percentage of points approved in the gamma function comparing the dose distribution acquired in the two SDDs were 99.92%±0.14%. In the simple dose distribution tests, the mean percentage of points approved in the gamma function were 99.85%±0.26% and the mean percentage differences in the normalization point doses were -1.41%. The transmission methodology was approved in 24 of 25 IMRT test irradiations. Based on these results, it can be concluded that the proposed methodology using RCFs can be applied for in vivo QA in RT treatments. PACS number: 87.55.Qr, 87.55.km, 87.55.N.

  1. A Methodology for Evaluating the Relationship Between Measures of Evaluation (MOEVs): The STF Approach

    DTIC Science & Technology

    1991-06-01

    air C2, respectively, will be defined. Chapter IV will describe the STF approach methodology, developed by Monti Callero and Clairice T. Veit, both...1986. 6. Callero, Monti, Willard Naslund, Clairice T. Veit, Subjective Measurement of Tactical Air Command and Control--Vol. 1: Background and Approach, The Rand Corporation, March 1981. 7. Veit, Clairice T. and Monti Callero, Subjective Transfer Function Approach to Complex System Analysis, The Rand

  2. Methodological Challenges in Causal Research on Racial and Ethnic Patterns of Cognitive Trajectories: Measurement, Selection, and Bias

    PubMed Central

    Glymour, M. Maria; Weuve, Jennifer; Chen, Jarvis T.

    2011-01-01

    Research focused on understanding how and why cognitive trajectories differ across racial and ethnic groups can be compromised by several possible methodological challenges. These difficulties are especially relevant in research on racial and ethnic disparities and neuropsychological outcomes because of the particular influence of selection and measurement in these contexts. In this article, we review the counterfactual framework for thinking about causal effects versus statistical associations. We emphasize that causal inferences are key to predicting the likely consequences of possible interventions, for example in clinical settings. We summarize a number of common biases that can obscure causal relationships, including confounding, measurement ceilings/floors, baseline adjustment bias, practice or retest effects, differential measurement error, conditioning on common effects in direct and indirect effects decompositions, and differential survival. For each, we describe how to recognize when such biases may be relevant and some possible analytic or design approaches to remediating these biases. PMID:18819008

  3. Radiotherapy for supradiaphragmatic Hodgkin's disease: determination of the proper fetal shielding conditions using Monte Carlo methodology.

    PubMed

    Mazonakis, Michalis; Tzedakis, Antonis; Varveris, Charalambos; Damilakis, John

    2011-10-01

    This study aimed to estimate fetal dose from mantle field irradiation with 6 MV photons and to determine the proper fetal shielding conditions. The Monte Carlo N-particle code and mathematical phantoms representing pregnancy at the first, second and third trimesters of gestation were used to calculate fetal dose with or without the presence of a 5-cm-thick lead shield of dimensions 35×35 cm². Fetal exposure was calculated for lead thicknesses of 2, 3, 4, 6, 7 and 8 cm. The dependence of fetal dose upon the distance separating the shield from the beam edge and phantom's abdomen was investigated. Dose measurements were performed on a physical phantom using thermoluminescent dosimetry. The radiation dose to an unshielded and shielded fetus was 0.578-0.861% and 0.180-0.641% of the prescribed tumor dose, respectively, depending upon the gestational age. The lead thickness increase from 2 to 5 cm led to a fetal dose reduction up to 23.4%. The use of 5- to 8-cm-thick lead resulted in dose values differing less than 4.5%. The shift of the lead from the closer to the more distant position relative to the field edge increased fetal dose up to 42.5%. The respective increase by changing the distance from the phantom's abdomen was 21.9%. The difference between dose calculations and measurements at specific points was 8.3±3.9%. The presented data may be used for fetal dose assessment with different shielding settings before treatment and, then, for the design and construction of the appropriate shielding device.

  4. Optimization of fermentation conditions for P450 BM-3 monooxygenase production by hybrid design methodology*

    PubMed Central

    Lu, Yan; Mei, Le-he

    2007-01-01

    Factorial design and response surface techniques were used to design and optimize conditions for increasing P450 BM-3 expression in E. coli. Operational conditions for maximum production were determined with twelve parameters under consideration, including the concentration of FeCl3, induction at OD578 (optical density measured at 578 nm), induction time and inoculum concentration. Initially, Plackett-Burman (PB) design was used to evaluate the process variables relevant to P450 BM-3 production. Four statistically significant parameters for response were selected and utilized in order to optimize the process. With the 416C model of hybrid design, response surfaces were generated, and P450 BM-3 production was improved to 57.90×10⁻³ U/ml by the best combinations of the physicochemical parameters at optimum levels of 0.12 mg/L FeCl3, inoculum concentration of 2.10%, induction at OD578 equal to 1.07, and with 6.05 h of induction. PMID:17173359

  5. Optimization of processing conditions for the sterilization of retorted short-rib patties using the response surface methodology.

    PubMed

    Choi, Su-Hee; Cheigh, Chan-Ick; Chung, Myong-Soo

    2013-05-01

    The aim of this study was to determine the optimum sterilization conditions for short-rib patties in retort trays by considering microbiological safety, nutritive value, sensory characteristics, and textural properties. In total, 27 sterilization conditions with various temperatures, times, and processing methods were tested using a 3³ factorial design. The response surface methodology (RSM) and contour analysis were applied to find the optimum sterilization conditions for the patties. Quality attributes were significantly affected by the sterilization temperature, time, and processing method. From RSM and contour analysis, the final optimum sterilization condition of the patties that simultaneously satisfied all specifications was determined to be 119.4°C for 18.55 min using a water-cascading rotary mode. The findings of the present study suggest that using optimized sterilization conditions will improve the microbial safety, sensory attributes, and nutritional retention for retorted short-rib patties.
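As an illustration of the RSM step, the following sketch fits a full second-order response surface to illustrative coded factorial data and solves for the stationary point, which is how an optimum condition like the one above is typically located (the data and coefficients are invented, not the study's measurements):

```python
import numpy as np

# Illustrative 3x3 factorial data: response y vs. coded temperature x1
# and coded time x2 (values invented for the sketch).
x1, x2 = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
x1, x2 = x1.ravel(), x2.ravel()
y = 80 - 5 * x1 ** 2 - 3 * x2 ** 2 + 2 * x1 + 1 * x2 + 0.5 * x1 * x2

# Fit the full second-order response-surface model by least squares:
# y = b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point (candidate optimum): grad(y) = 0 leads to the linear
# system [[2*b4, b3], [b3, 2*b5]] @ x_opt = -[b1, b2].
H = np.array([[2 * b[4], b[3]], [b[3], 2 * b[5]]])
x_opt = np.linalg.solve(H, -np.array([b[1], b[2]]))
```

Because the Hessian H is negative definite here, the stationary point is a maximum; decoding x_opt back to physical units gives the recommended temperature/time combination.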

  6. A novel, noninvasive transdermal fluid sampling methodology: IGF-I measurement following exercise.

    PubMed

    Scofield, D E; McClung, H L; McClung, J P; Kraemer, W J; Rarick, K R; Pierce, J R; Cloutier, G J; Fielding, R A; Matheny, R W; Young, A J; Nindl, B C

    2011-06-01

    This study tested the hypothesis that transdermal fluid (TDF) provides a more sensitive and accurate measure of exercise-induced increases in insulin-like growth factor-I (IGF-I) than serum, and that these increases are detectable proximal, but not distal, to the exercising muscle. A novel, noninvasive methodology was used to collect TDF, followed by sampling of total IGF-I (tIGF-I) and free IGF-I (fIGF-I) in TDF and serum following an acute bout of exercise. Experiment 1: eight men (23 ± 3 yrs, 79 ± 7 kg) underwent two conditions (resting and 60 min of cycling exercise at 60% VO2peak) in which serum and forearm TDF were collected for comparison. There were no significant changes in tIGF-I or fIGF-I in TDF obtained from the forearm or from serum following exercise (P > 0.05); however, the proportion of fIGF-I to tIGF-I in TDF was approximately fourfold greater than that of serum (P ≤ 0.05). These data suggest that changes in TDF IGF-I are not evident when TDF is sampled distal from the working tissue. To determine whether exercise-induced increases in local IGF-I could be detected when TDF was sampled directly over the active muscle group, we performed a second experiment. Experiment 2: fourteen subjects (22 ± 4 yr, 68 ± 11 kg) underwent an acute plyometric exercise condition consisting of 10 sets of 10 plyometric jumps with 2-min rest between sets. We observed a significant increase in TDF tIGF-I following exercise (P ≤ 0.05) but no change in serum tIGF-I (P > 0.05). Overall, these data suggest that TDF may provide a noninvasive means of monitoring acute exercise-induced changes in local IGF-I when sampled in proximity to exercising muscles. Moreover, our finding that the proportion of free to tIGF-I was greater in TDF than in serum suggests that changes in local IGF-I may be captured more readily using this system.

  7. Sensitive Measures of Condition Change in EEG Data

    SciTech Connect

    Hively, L.M.; Gailey, P.C.; Protopopescu, V.

    1999-03-10

    We present a new, robust, model-independent technique for measuring condition change in nonlinear data. We define indicators of condition change by comparing distribution functions (DF) defined on the attractor for time windowed data sets via L1-distance and χ² statistics. The new measures are applied to EEG data with the objective of detecting the transition between non-seizure and epileptic brain activity in an accurate and timely manner. We find a clear superiority of the new metrics in comparison to traditional nonlinear measures as discriminators of condition change.
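A minimal sketch of the windowed distribution-function comparison, assuming simple histogram estimates of the DFs and synthetic Gaussian windows in place of EEG data (bin count and window sizes are illustrative):

```python
import numpy as np

def dissimilarity(x_base, x_test, bins=32):
    """L1 distance and chi-square statistic between the normalized
    histograms (empirical DFs) of two time-windowed data sets."""
    lo = min(x_base.min(), x_test.min())
    hi = max(x_base.max(), x_test.max())
    p, _ = np.histogram(x_base, bins=bins, range=(lo, hi))
    q, _ = np.histogram(x_test, bins=bins, range=(lo, hi))
    p = p / p.sum()
    q = q / q.sum()
    l1 = float(np.abs(p - q).sum())
    denom = p + q
    nz = denom > 0                     # skip empty bins in the chi-square sum
    chi2 = float(((p[nz] - q[nz]) ** 2 / denom[nz]).sum())
    return l1, chi2

rng = np.random.default_rng(1)
baseline = rng.normal(0.0, 1.0, 4096)    # reference window ("non-seizure")
window_a = rng.normal(0.0, 1.0, 4096)    # another window from the same regime
window_b = rng.normal(1.5, 2.0, 4096)    # window after a condition change

l1_a, chi2_a = dissimilarity(baseline, window_a)
l1_b, chi2_b = dissimilarity(baseline, window_b)
```

Windows drawn from the same regime score near zero on both indicators, while the shifted/rescaled window scores far higher, which is the discrimination the paper exploits.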

  8. Rapid assessment methodology in NORM measurements from building materials of Uzbekistan.

    PubMed

    Safarov, A A; Safarov, A N; Azimov, A N; Darby, I G

    2017-04-01

    Utilizing low cost NaI(Tl) scintillation detector systems we present methodology for the rapid screening of building material samples and the determination of their Radium Equivalent Activity (Raeq). Materials from Uzbekistan as a representative developing country have been measured and a correction coefficient for Radium activity is deduced. The use of the correction coefficient offers the possibility to decrease analysis times thus enabling the express measurement of a large quantity of samples. The reduction in time, cost and the use of inexpensive equipment can democratize the practice of screening NORM in building materials in the international community.
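The radium equivalent activity referred to above combines the three NORM activities with standard weights. A small sketch using the conventional formula Raeq = A_Ra + 1.43·A_Th + 0.077·A_K (the sample activities below are illustrative, not Uzbek measurements):

```python
def radium_equivalent(a_ra, a_th, a_k):
    """Radium equivalent activity (Bq/kg) from the specific activities of
    Ra-226, Th-232 and K-40: Raeq = A_Ra + 1.43*A_Th + 0.077*A_K."""
    return a_ra + 1.43 * a_th + 0.077 * a_k

# Illustrative activities (Bq/kg) for a building-material sample; the
# conventional acceptance criterion for bulk building use is Raeq <= 370 Bq/kg.
raeq = radium_equivalent(45.0, 38.0, 420.0)
acceptable = raeq <= 370.0
```

A screening workflow like the paper's would apply a detector-specific correction coefficient to the raw NaI(Tl) activities before evaluating Raeq against the 370 Bq/kg limit.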

  9. Direct sample positioning and alignment methodology for strain measurement by diffraction

    NASA Astrophysics Data System (ADS)

    Ratel, N.; Hughes, D. J.; King, A.; Malard, B.; Chen, Z.; Busby, P.; Webster, P. J.

    2005-05-01

    An ISO (International Organization for Standardization) TTA (Technology Trends Assessment) was published in 2001 for the determination of residual stress using neutron diffraction which identifies sample alignment and positioning as a key source of strain measurement error. Although the measurement uncertainty by neutron and synchrotron x-ray diffraction for an individual measurement of lattice strain is typically of the order of 10-100×10⁻⁶, specimens commonly exhibit strain gradients of 1000×10⁻⁶ mm⁻¹ or more, making sample location a potentially considerable source of error. An integrated approach to sample alignment and positioning is described which incorporates standard base-plates and sample holders, instrument alignment procedures, accurate digitization using a coordinate measuring machine and automatic generation of instrument control scripts. The methodology that has been developed is illustrated by the measurement of the transverse residual strain field in a welded steel T-joint using neutrons.

  10. Measurements of Intracellular Ca2+ Content and Phosphatidylserine Exposure in Human Red Blood Cells: Methodological Issues.

    PubMed

    Wesseling, Mauro C; Wagner-Britz, Lisa; Boukhdoud, Fatima; Asanidze, Salome; Nguyen, Duc Bach; Kaestner, Lars; Bernhardt, Ingolf

    2016-01-01

    The increase of the intracellular Ca2+ content as well as the exposure of phosphatidylserine (PS) on the outer cell membrane surface after activation of red blood cells (RBCs) by lysophosphatidic acid (LPA) has been investigated by a variety of research groups. Carrying out experiments, which we described in several previous publications, we observed some discrepancies when comparing data obtained by different investigators within our research group and also between batches of LPA. In addition, we found differences comparing the results of double and single labelling experiments (for Ca2+ and PS). Furthermore, the results of PS exposure depended on the fluorescent dye used (annexin V-FITC versus annexin V alexa fluor® 647). Therefore, it seems necessary to investigate these methodological approaches in more detail to be able to quantify results and to compare data obtained by different research groups. The intracellular Ca2+ content and the PS exposure of RBCs separated from whole blood have been investigated after treatment with LPA (2.5 µM) obtained from three different companies (Sigma-Aldrich, Cayman Chemical Company, and Santa Cruz Biotechnology Inc.). Fluo-4 and x-rhod-1 have been used to detect intracellular Ca2+ content, annexin V alexa fluor® 647 and annexin V-FITC have been used for PS exposure measurements. Both parameters (Ca2+ content, PS exposure) were studied using flow cytometry and fluorescence microscopy. The percentage of RBCs showing increased intracellular Ca2+ content as well as PS exposure changes significantly between different LPA manufacturers as well as on the condition of mixing of LPA with the RBC suspension. Furthermore, the percentage of RBCs showing PS exposure is reduced in double labelling compared to single labelling experiments and depends also on the fluorescent dye used. Finally, data on Ca2+ content are slightly affected whereas PS exposure data are not affected significantly by the measuring method (flow cytometry

  11. An in-situ soil structure characterization methodology for measuring soil compaction

    NASA Astrophysics Data System (ADS)

    Dobos, Endre; Kriston, András; Juhász, András; Sulyok, Dénes

    2016-04-01

    The agricultural cultivation has several direct and indirect effects on the soil properties, among which soil structure degradation is the best known and most detectable one. Soil structure degradation leads to several water and nutrient management problems, which reduce the efficiency of agricultural production. There are several innovative technological approaches aiming to reduce these negative impacts on the soil structure. The tests, validation and optimization of these methods require an adequate technology to measure the impacts on the complex soil system. This study aims to develop an in-situ soil structure and root development testing methodology, which can be used in field experiments and which allows one to follow the real-time changes in the soil structure (evolution/degradation) and their quantitative characterization. The method is adapted from remote sensing image processing technology. A specially transformed A4-size scanner is placed into the soil at a depth safe from the agrotechnical treatments. Only the scanner's USB cable comes to the surface, allowing image acquisition without any soil disturbance. Several images of the same place can be taken throughout the vegetation season to follow the soil consolidation and structure development after the last tillage treatment for seedbed preparation. The scanned image of the soil profile is classified using supervised image classification, namely the maximum likelihood classification algorithm. The resulting image has two principal classes, soil matrix and pore space, plus complementary classes covering the other occurring thematic classes, such as roots and stones. The calculated data are calibrated with field-sampled porosity data. As the scanner is buried under the soil with no changes in light conditions, the image processing can be automated for better temporal comparison. Besides the total porosity, each pore size fraction and its distribution can be calculated for
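A sketch of the supervised maximum-likelihood classification step on a synthetic scanned image, assuming a single grey-level band and Gaussian class densities (the training pixels and image are illustrative, not the study's scans):

```python
import numpy as np

rng = np.random.default_rng(4)

# Training pixels for the two principal classes: soil matrix (bright)
# and pore space (dark). Grey levels are invented for the sketch.
matrix_px = rng.normal(150, 12, 500)
pore_px = rng.normal(60, 15, 500)

def fit(px):
    """Estimate the Gaussian class density parameters from training pixels."""
    return px.mean(), px.std()

def log_like(x, mu, sd):
    """Gaussian log-likelihood up to a constant shared by both classes."""
    return -np.log(sd) - 0.5 * ((x - mu) / sd) ** 2

m_mu, m_sd = fit(matrix_px)
p_mu, p_sd = fit(pore_px)

# Synthetic soil-profile scan: bright matrix with one dark pore region.
image = rng.normal(150, 12, (50, 50))
image[10:40, 10:20] = rng.normal(60, 15, (30, 10))

# Maximum-likelihood rule: assign each pixel to the more likely class.
is_pore = log_like(image, p_mu, p_sd) > log_like(image, m_mu, m_sd)
porosity = float(is_pore.mean())         # pore-space area fraction
```

Repeating this on scans taken through the season, and calibrating the pore fraction against field-sampled porosity, gives the temporal structure-change record the study describes.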

  12. A methodology to determine boundary conditions from forced convection experiments using liquid crystal thermography

    NASA Astrophysics Data System (ADS)

    Jakkareddy, Pradeep S.; Balaji, C.

    2016-05-01

    This paper reports the results of an experimental study to estimate the heat flux and convective heat transfer coefficient using liquid crystal thermography and Bayesian inference in a heat generating sphere, enclosed in a cubical Teflon block. The geometry considered for the experiments comprises a heater inserted in a hollow hemispherical aluminium ball, resulting in a volumetric heat generation source that is placed at the center of the Teflon block. Calibrated thermochromic liquid crystal sheets are used to capture the temperature distribution at the front face of the Teflon block. The forward model is the three dimensional conduction equation which is solved within the Teflon block to obtain steady state temperatures, using COMSOL. Match up experiments are carried out for various velocities by minimizing the residual between TLC and simulated temperatures for every assumed loss coefficient, to obtain a correlation of average Nusselt number against Reynolds number. This is used for prescribing the boundary condition for the solution to the forward model. A surrogate model obtained by artificial neural network built upon the data from COMSOL simulations is used to drive a Markov Chain Monte Carlo based Metropolis Hastings algorithm to generate the samples. Bayesian inference is adopted to solve the inverse problem for determination of heat flux and heat transfer coefficient from the measured temperature field. Point estimates of the posterior like the mean, maximum a posteriori and standard deviation of the retrieved heat flux and convective heat transfer coefficient are reported. Additionally the effect of number of samples on the performance of the estimation process has been investigated.
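The surrogate-driven Metropolis-Hastings step can be sketched as follows; the forward model here is a made-up stand-in for the ANN surrogate, and the prior bounds, noise level, and tuning constants are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(2)

def surrogate_temperature(h):
    """Stand-in for the ANN surrogate: predicted sensor temperature (K) as a
    function of the heat transfer coefficient h (purely illustrative)."""
    return 300.0 + 50.0 / (1.0 + 0.05 * h)

# One synthetic "measured" temperature generated at a known h.
h_true = 40.0
sigma = 0.2                                   # assumed measurement noise (K)
t_meas = surrogate_temperature(h_true) + rng.normal(0.0, sigma)

def log_post(h):
    """Gaussian likelihood with a uniform prior on h."""
    if not (1.0 <= h <= 200.0):
        return -np.inf
    return -0.5 * ((t_meas - surrogate_temperature(h)) / sigma) ** 2

# Random-walk Metropolis-Hastings driven by the cheap surrogate.
h, lp, samples = 20.0, log_post(20.0), []
for _ in range(20000):
    prop = h + rng.normal(0.0, 2.0)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept with min(1, ratio)
        h, lp = prop, lp_prop
    samples.append(h)

post = np.array(samples[5000:])               # discard burn-in
h_mean = float(post.mean())                   # posterior point estimate
```

The posterior mean, MAP, and standard deviation reported in the paper are exactly these kinds of summaries of the retained chain; the surrogate matters because each MCMC step would otherwise require a full COMSOL solve.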

  13. A methodology to determine boundary conditions from forced convection experiments using liquid crystal thermography

    NASA Astrophysics Data System (ADS)

    Jakkareddy, Pradeep S.; Balaji, C.

    2017-02-01

    This paper reports the results of an experimental study to estimate the heat flux and convective heat transfer coefficient using liquid crystal thermography and Bayesian inference in a heat generating sphere, enclosed in a cubical Teflon block. The geometry considered for the experiments comprises a heater inserted in a hollow hemispherical aluminium ball, resulting in a volumetric heat generation source that is placed at the center of the Teflon block. Calibrated thermochromic liquid crystal sheets are used to capture the temperature distribution at the front face of the Teflon block. The forward model is the three dimensional conduction equation which is solved within the Teflon block to obtain steady state temperatures, using COMSOL. Match up experiments are carried out for various velocities by minimizing the residual between TLC and simulated temperatures for every assumed loss coefficient, to obtain a correlation of average Nusselt number against Reynolds number. This is used for prescribing the boundary condition for the solution to the forward model. A surrogate model obtained by artificial neural network built upon the data from COMSOL simulations is used to drive a Markov Chain Monte Carlo based Metropolis Hastings algorithm to generate the samples. Bayesian inference is adopted to solve the inverse problem for determination of heat flux and heat transfer coefficient from the measured temperature field. Point estimates of the posterior like the mean, maximum a posteriori and standard deviation of the retrieved heat flux and convective heat transfer coefficient are reported. Additionally the effect of number of samples on the performance of the estimation process has been investigated.

  14. Traction and film thickness measurements under starved elastohydrodynamic conditions

    NASA Technical Reports Server (NTRS)

    Wedeven, L. D.

    1974-01-01

    Traction measurements under starved elastohydrodynamic conditions were obtained for a point contact geometry. Simultaneous measurements of the film thickness and the location of the inlet lubricant boundary were made optically. The thickness of a starved film under combined rolling and sliding conditions varies with the location of the inlet boundary in the same way found previously for pure rolling. A starved film was observed to possess greater traction than a flooded film for the same slide-roll ratio. For a given slide-roll ratio, a starved film simply increases the shear rate in the Hertz region. The maximum shear rate depends on the degree of starvation and has no theoretical limit. Traction measurements under starved conditions were compared with flooded conditions at equivalent shear rates in the Hertz region. When the shear rates in the Hertz region were low and the film severely starved, the measured tractions were found to be much lower than expected.

  15. A methodology to condition distorted acoustic emission signals to identify fracture timing from human cadaver spine impact tests.

    PubMed

    Arun, Mike W J; Yoganandan, Narayan; Stemper, Brian D; Pintar, Frank A

    2014-12-01

    While studies have used acoustic sensors to determine fracture initiation time in biomechanical studies, a systematic procedure for processing acoustic signals has not been established. The objective of the study was to develop a methodology to condition distorted acoustic emission data using signal processing techniques to identify fracture initiation time. The methodology was developed from testing a human cadaver lumbar spine column. Acoustic sensors were glued to all vertebrae, high-rate impact loading was applied, load-time histories were recorded with a load cell, and fracture was documented using CT. Compression fracture occurred at L1 while the other vertebrae were intact. FFTs of the raw voltage-time traces were used to determine an optimum frequency range associated with high decibel levels, and the signals were bandpass filtered in this range. A bursting pattern was found in the fractured vertebra while signals from the other vertebrae were silent; the bursting time was associated with the time of fracture initiation. Force at fracture was determined using this time and the force-time data. The methodology is independent of parameters selected a priori, such as a fixed voltage level, bandpass frequency, and/or use of the force-time signal, and allows determination of force based on the time identified during signal processing. The methodology can be used for different body regions in cadaver experiments.
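
The pipeline described above (FFT to choose an energetic band, bandpass filtering, burst detection) can be sketched on a synthetic trace. The sampling rate, burst model, band width, and threshold rule below are illustrative assumptions, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a raw AE voltage trace: background noise plus a
# decaying high-frequency burst injected at a known "fracture" time.
fs = 1.0e6                                   # sampling rate, Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)               # 10 ms record
t_frac = 0.004                               # true fracture time, s
burst = (t >= t_frac) * np.exp(-np.maximum(t - t_frac, 0.0) / 5e-4) \
        * np.sin(2 * np.pi * 150e3 * t)
signal = 0.05 * rng.normal(size=t.size) + burst

# 1) FFT of the raw trace to locate the energetic frequency range
spec = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
f_peak = freqs[np.argmax(spec)]

# 2) Band-pass filter around the dominant band by masking the spectrum
mask = np.abs(freqs - f_peak) <= 50e3
filtered = np.fft.irfft(np.fft.rfft(signal) * mask, n=t.size)

# 3) Burst onset = first crossing of a threshold set from the pre-event
#    noise floor of the rectified, smoothed signal
env = np.convolve(np.abs(filtered), np.ones(50) / 50, mode="same")
noise = env[: int(0.001 * fs)]               # first 1 ms is noise only
threshold = noise.mean() + 8 * noise.std()
onset = t[np.argmax(env > threshold)]        # fracture-initiation estimate
```

With the onset time in hand, the force at fracture would be read off the load-cell force-time history at that instant, as the abstract describes.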

  16. Assessing the Practical Equivalence of Conversions when Measurement Conditions Change

    ERIC Educational Resources Information Center

    Liu, Jinghua; Dorans, Neil J.

    2012-01-01

    At times, the same set of test questions is administered under different measurement conditions that might affect the psychometric properties of the test scores enough to warrant different score conversions for the different conditions. We propose a procedure for assessing the practical equivalence of conversions developed for the same set of test…

  17. Effects of Testing Conditions on Self-Concept Measurement

    ERIC Educational Resources Information Center

    Chandler, Theodore A.; And Others

    1976-01-01

    Many self-concept measures employ several different scales to which the subject responds in a set order at one sitting. This study examined effects of different testing conditions. The Index of Adjustment and Values (IAV) was administered to 191 graduate students under two different sequences and two time delay conditions. Results indicate…

  18. 42 CFR 486.318 - Condition: Outcome measures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... SUPPLIERS Requirements for Certification and Designation and Conditions for Coverage: Organ Procurement Organizations Organ Procurement Organization Outcome Requirements § 486.318 Condition: Outcome measures. (a... donation rate of eligible donors as a percentage of eligible deaths is no more than 1.5 standard...

  19. Methodology for the calibration of and data acquisition with a six-degree-of-freedom acceleration measurement device

    NASA Astrophysics Data System (ADS)

    Lee, Harvey; Plank, Gordon; Weinstock, Herbert; Coltman, Michael

    1989-06-01

    Described here is a methodology for calibrating and gathering data with a six-degree-of-freedom acceleration measurement device that is intended to measure head acceleration of anthropomorphic dummies and human volunteers in automotive crash testing and head impact trauma studies. Error models (system equations) were developed for systems using six accelerometers in a coplanar (3-2-1) configuration, nine accelerometers in a coplanar (3-3-3) configuration and nine accelerometers in a non-coplanar (3-2-2-2) configuration and the accuracy and stability of these systems were compared. The model was verified under various input and computational conditions. Results of parametric sensitivity analyses which included parameters such as system geometry, coordinate system location, data sample rate and accelerometer cross axis sensitivities are presented. Recommendations to optimize data collection and reduction are given. Complete source listings of all of the software developed are presented.
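
The rigid-body relation underlying such accelerometer arrays can be illustrated with a small linear solve. Each single-axis sensor reads s_i = n_i · (a0 + α × r_i + ω × (ω × r_i)), which is linear in the unknown linear acceleration a0 and angular acceleration α once the angular rate ω is known. The six-sensor 3-2-1 layout, axes, and motion below are assumptions for illustration, not the report's calibration geometry.

```python
import numpy as np

# Assumed 3-2-1 layout: a triax at the origin, two sensors at +x, one at +y
r = np.array([[0, 0, 0], [0, 0, 0], [0, 0, 0],
              [0.1, 0, 0], [0.1, 0, 0],
              [0, 0.1, 0]], float)          # sensor positions, m
n = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [0, 1, 0], [0, 0, 1],
              [0, 0, 1]], float)            # sensing axes (unit vectors)

w = np.array([1.0, -2.0, 0.5])              # known angular rate, rad/s
a0_true = np.array([0.3, -1.2, 9.8])        # linear acceleration, m/s^2
alpha_true = np.array([5.0, 0.0, -3.0])     # angular acceleration, rad/s^2

# Simulated readings: s_i = n_i . (a0 + alpha x r_i + w x (w x r_i))
cent = np.cross(w, np.cross(w, r))          # centripetal term at each site
s = np.einsum('ij,ij->i', n, a0_true + np.cross(alpha_true, r) + cent)

# n.(alpha x r) = alpha.(r x n), so each reading gives one linear equation
# [n_i, r_i x n_i] . [a0; alpha] = s_i - n_i.(w x (w x r_i))
A = np.hstack([n, np.cross(r, n)])          # 6 x 6 system matrix
rhs = s - np.einsum('ij,ij->i', n, cent)
sol = np.linalg.solve(A, rhs)
a0, alpha = sol[:3], sol[3:]                # recovered accelerations
```

A nine-accelerometer configuration adds redundant rows and would be solved in the least-squares sense instead, which is one reason the report compares the accuracy and stability of the different layouts.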

  20. Optimisation of Ultrasound-Assisted Extraction Conditions for Phenolic Content and Antioxidant Capacity from Euphorbia tirucalli Using Response Surface Methodology.

    PubMed

    Vuong, Quan V; Goldsmith, Chloe D; Dang, Trung Thanh; Nguyen, Van Tang; Bhuyan, Deep Jyoti; Sadeqzadeh, Elham; Scarlett, Christopher J; Bowyer, Michael C

    2014-09-17

    Euphorbia tirucalli (E. tirucalli) is now widely distributed around the world and is well known as a source of traditional medicine in many countries. This study aimed to utilise response surface methodology (RSM) to optimise ultrasound-assisted extraction (UAE) conditions for total phenolic compounds (TPC) and antioxidant capacity from E. tirucalli leaf. The results showed that ultrasonic temperature, time and power affected TPC and antioxidant capacity; however, the effects varied. Ultrasonic power had the strongest influence on TPC, whereas ultrasonic temperature had the greatest impact on antioxidant capacity. Ultrasonic time had the least impact on both TPC and antioxidant capacity. The optimum UAE conditions were determined to be 50 °C, 90 min and 200 W. Under these conditions, the E. tirucalli leaf extract yielded 2.93 mg GAE/g FW of TPC and exhibited potent antioxidant capacity. These conditions can be utilised for further isolation and purification of phenolic compounds from E. tirucalli leaf.

  1. Optimisation of Ultrasound-Assisted Extraction Conditions for Phenolic Content and Antioxidant Capacity from Euphorbia tirucalli Using Response Surface Methodology

    PubMed Central

    Vuong, Quan V.; Goldsmith, Chloe D.; Dang, Trung Thanh; Nguyen, Van Tang; Bhuyan, Deep Jyoti; Sadeqzadeh, Elham; Scarlett, Christopher J.; Bowyer, Michael C.

    2014-01-01

    Euphorbia tirucalli (E. tirucalli) is now widely distributed around the world and is well known as a source of traditional medicine in many countries. This study aimed to utilise response surface methodology (RSM) to optimise ultrasound-assisted extraction (UAE) conditions for total phenolic compounds (TPC) and antioxidant capacity from E. tirucalli leaf. The results showed that ultrasonic temperature, time and power affected TPC and antioxidant capacity; however, the effects varied. Ultrasonic power had the strongest influence on TPC, whereas ultrasonic temperature had the greatest impact on antioxidant capacity. Ultrasonic time had the least impact on both TPC and antioxidant capacity. The optimum UAE conditions were determined to be 50 °C, 90 min and 200 W. Under these conditions, the E. tirucalli leaf extract yielded 2.93 mg GAE/g FW of TPC and exhibited potent antioxidant capacity. These conditions can be utilised for further isolation and purification of phenolic compounds from E. tirucalli leaf. PMID:26785074

  2. [Drawbacks and new methodological approaches to forensic medical examination of sexual conditions].

    PubMed

    Dmitrieva, O A

    2005-01-01

    Drawbacks of forensic medical examination of sexual conditions stem from problems of management and diagnosis. Updating such expertise requires standardization, application of modern examination tools and devices, performance of more than one examination, skill in examining men with suspected erectile dysfunction, and purposeful search for sperm.

  3. An innovative methodology for measurement of stress distribution of inflatable membrane structures

    NASA Astrophysics Data System (ADS)

    Zhao, Bing; Chen, Wujun; Hu, Jianhui; Chen, Jianwen; Qiu, Zhenyu; Zhou, Jinyu; Gao, Chengjun

    2016-02-01

    The inflatable membrane structure has been widely used in the fields of civil building, industrial building, airship, super pressure balloon and spacecraft. It is important to measure the stress distribution of the inflatable membrane structure because it influences the safety of the structural design. This paper presents an innovative methodology for the measurement and determination of the stress distribution of the inflatable membrane structure under different internal pressures, combining photogrammetry and the force-finding method. The shape of the inflatable membrane structure is maintained by the use of pressurized air, and the internal pressure is controlled and measured by means of an automatic pressure control system. The 3D coordinates of the marking points pasted on the membrane surface are acquired by three photographs captured from three cameras based on photogrammetry. After digitizing the markings on the photographs, the 3D curved surfaces are rebuilt. The continuous membrane surfaces are discretized into quadrilateral mesh and simulated by membrane links to calculate the stress distributions using the force-finding method. The internal pressure is simplified to the external node forces in the normal direction according to the contributory area of the node. Once the geometry x, the external force r and the topology C are obtained, the unknown force densities q in each link can be determined. Therefore, the stress distributions of the inflatable membrane structure can be calculated, combining the linear adjustment theory and the force density method based on the force equilibrium of inflated internal pressure and membrane internal force without considering the mechanical properties of the constitutive material. As the use of the inflatable membrane structure is attractive in the field of civil building, an ethylene-tetrafluoroethylene (ETFE) cushion is used with the measurement model to validate the proposed methodology. 
The comparisons between the
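
The force-finding step named in the abstract (given the geometry x, the external force r, and the topology C, solve for the force densities q) reduces to a linear equilibrium system. Below is a minimal numpy sketch on an assumed one-free-node toy network, not the ETFE cushion of the paper.

```python
import numpy as np

# Toy force-finding example (assumed geometry): one free node lifted by
# internal pressure, tied by four membrane links to fixed corner nodes.
free = np.array([0.0, 0.0, 1.0])             # free node coordinates
fixed = np.array([[ 1.0,  1.0, 0.0],
                  [ 1.0, -1.0, 0.0],
                  [-1.0,  1.0, 0.0],
                  [-1.0, -1.0, 0.0]])        # anchored nodes
r = np.array([0.0, 0.0, 4.0])                # external node force (pressure)

# Equilibrium of the free node:  sum_i q_i * (fixed_i - free) + r = 0,
# i.e. A q = -r with one row per coordinate direction.
A = (fixed - free).T                         # 3 x 4 coefficient matrix
q, *_ = np.linalg.lstsq(A, -r, rcond=None)   # minimum-norm force densities

# Member force in each link = force density times link length
forces = q * np.linalg.norm(fixed - free, axis=1)
```

For a full mesh the same equations are assembled with the branch-node matrix C over all free nodes; the paper additionally applies linear adjustment theory to the redundant system, which the least-squares solve here only hints at.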

  4. Structural acoustic control of plates with variable boundary conditions: design methodology.

    PubMed

    Sprofera, Joseph D; Cabell, Randolph H; Gibbs, Gary P; Clark, Robert L

    2007-07-01

    A method for optimizing a structural acoustic control system subject to variations in plate boundary conditions is provided. The assumed modes method is used to build a plate model with varying levels of rotational boundary stiffness to simulate the dynamics of a plate with uncertain edge conditions. A transducer placement scoring process, involving Hankel singular values, is combined with a genetic optimization routine to find spatial locations robust to boundary condition variation. Predicted frequency response characteristics are examined, and theoretically optimized results are discussed in relation to the range of boundary conditions investigated. Modeled results indicate that it is possible to minimize the impact of uncertain boundary conditions in active structural acoustic control by optimizing the placement of transducers with respect to those uncertainties.

  5. Ultrasonic Technique for Density Measurement of Liquids in Extreme Conditions

    PubMed Central

    Kazys, Rymantas; Sliteris, Reimondas; Rekuviene, Regina; Zukauskas, Egidijus; Mazeika, Liudas

    2015-01-01

    An ultrasonic technique, invariant to temperature changes, for density measurement of different liquids under in situ extreme conditions is presented. The influence of the geometry and material parameters of the measurement system (transducer, waveguide, matching layer) on measurement accuracy and reliability is analyzed theoretically along with experimental results. The proposed method is based on measurement of the amplitude of the ultrasonic wave reflected from the interface between the solid and the liquid medium under investigation. In order to enhance sensitivity, the use of a quarter-wavelength acoustic matching layer is proposed; with it, the sensitivity of the measurement system increases significantly. Density measurements quite often must be performed in extreme conditions at high temperature (up to 220 °C) and high pressure. In this case, metal waveguides between the piezoelectric transducer and the measured liquid are used in order to protect the conventional transducer from the influence of high temperature and to avoid depolarization. The presented ultrasonic density measurement technique is suitable for density measurement in different materials, including liquids and polymer melts in extreme conditions. A new calibration algorithm was proposed. The metrological evaluation of the measurement method was performed. The expanded measurement uncertainty is Uρ = 7.4 × 10⁻³ g/cm³ (1%). PMID:26262619
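
The reflection-coefficient principle behind the method can be written out directly: the pressure reflection coefficient at the waveguide/liquid interface is R = (Z_l − Z_s)/(Z_l + Z_s) with acoustic impedance Z = ρc, so a measured R can be inverted for the liquid density when the sound speed is known. A sketch with assumed values (the matching-layer and calibration details of the paper are omitted):

```python
# Reflection-coefficient sketch; impedance and sound-speed values below
# are illustrative assumptions, not the paper's calibration data.
Z_s = 17.3e6        # acoustic impedance of the solid waveguide, kg/(m^2 s)
c_l = 1480.0        # sound speed in the liquid, m/s (assumed known)

def reflection_coeff(rho_l):
    """Pressure reflection coefficient at the solid/liquid interface."""
    Z_l = rho_l * c_l
    return (Z_l - Z_s) / (Z_l + Z_s)

def density_from_reflection(R):
    """Invert R = (Z_l - Z_s)/(Z_l + Z_s) for the liquid density."""
    Z_l = Z_s * (1.0 + R) / (1.0 - R)
    return Z_l / c_l

# Round trip: a water-like liquid at 1000 kg/m^3
R = reflection_coeff(1000.0)
rho = density_from_reflection(R)
```

Because Z_l is much smaller than Z_s, R sits close to −1 and changes only weakly with density, which is exactly why the paper inserts a quarter-wave matching layer to boost sensitivity.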

  6. The methodologies and instruments of vehicle particulate emission measurement for current and future legislative regulations

    NASA Astrophysics Data System (ADS)

    Otsuki, Yoshinori; Nakamura, Hiroshi; Arai, Masataka; Xu, Min

    2015-09-01

    Since the health risks associated with fine particles whose aerodynamic diameters are smaller than 2.5 μm were first proven, regulations restricting particulate matter (PM) mass emissions from internal combustion engines have become increasingly severe. Accordingly, the gravimetric method of PM mass measurement is facing its lower limit of detection as the emissions from vehicles are further reduced. For example, variation in the adsorption of gaseous components such as hydrocarbons from unburned fuel and lubricant oil, and the presence of agglomerated particles, which are not generated directly in engine combustion but re-entrained from the walls of sampling pipes, can cause uncertainty in measurement. PM mass measurement systems and methodologies have therefore been continuously refined in order to improve measurement accuracy. As an alternative metric, the Particle Measurement Programme (PMP) within the United Nations Economic Commission for Europe (UNECE) developed a solid particle number measurement method in order to improve the sensitivity of particulate emission measurement from vehicles. Consequently, particle number (PN) limits were implemented in European regulations from 2011. Recently, portable emission measurement systems (PEMS) for in-use vehicle emission measurements have also been attracting attention, currently in North America and Europe, and real-time PM mass and PN instruments are under evaluation.

  7. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used to identify parts of the code where cache utilization improvements may significantly impact overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling the cache performance of a selected code block includes the base virtual addresses of arrays, the virtual addresses of variables, and the loop bounds for that code block; the rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for exploring various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
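
The kind of "what-if" exploration described above can be mimicked with a toy trace-driven simulation: generate the address trace of a matrix-matrix multiplication for two loop orders and replay it through a small direct-mapped cache model. The cache geometry, data layout, and sizes below are assumptions for illustration, not the paper's configuration.

```python
# Toy trace-driven cache model: compare data-cache miss rates for the
# ijk and ikj loop orders of C = A*B (row-major doubles). All parameters
# are assumptions for illustration.
LINE = 64                     # line size in bytes
LINES = 256                   # direct-mapped, 16 KiB cache
ELEM = 8                      # 8-byte doubles
N = 64                        # matrix dimension

def simulate(trace):
    """Replay an address trace through a direct-mapped cache."""
    tags = [None] * LINES
    misses = 0
    for a in trace:
        line = a // LINE
        idx = line % LINES
        if tags[idx] != line:
            tags[idx] = line  # fill on miss
            misses += 1
    return misses / len(trace)

def addr(base, i, j):         # row-major element address
    return base + (i * N + j) * ELEM

# Base addresses; C is offset slightly so A and C rows do not alias
A, B, C = 0, N * N * ELEM, 2 * N * N * ELEM + 8 * LINE

def trace(order):
    t = []
    for i in range(N):
        for x in range(N):
            for y in range(N):
                # ijk: x=j, y=k; ikj: x=k, y=j. Each iteration reads
                # A[i,k], B[k,j] and touches C[i,j].
                j, k = (x, y) if order == "ijk" else (y, x)
                t += [addr(A, i, k), addr(B, k, j), addr(C, i, j)]
    return t

miss_ijk = simulate(trace("ijk"))
miss_ikj = simulate(trace("ikj"))   # ikj streams B row-wise: fewer misses
```

With these (assumed) parameters the column-wise walk of B in the ijk order thrashes the direct-mapped sets, while ikj streams B and C sequentially, which is the sort of conclusion the methodology is designed to predict without full address traces.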

  8. Design and Verification Methodology of Boundary Conditions for Finite Volume Schemes

    DTIC Science & Technology

    2012-07-01

    Dadone and Grossman advocate a curvature-corrected symmetry condition for an inviscid wall [3]. Balakrishnan and Fernandez advocate a variety of other methods...boundary source terms is straightforward, generally requiring much less algebraic manipulation than interior source terms. A number of test cases...Meeting, Reno, NV, January 1986. [3] A. Dadone and B. Grossman. Surface boundary conditions for the numerical solution of the Euler equations. AIAA

  9. Don't fear 'fear conditioning': Methodological considerations for the design and analysis of studies on human fear acquisition, extinction, and return of fear.

    PubMed

    Lonsdorf, Tina B; Menz, Mareike M; Andreatta, Marta; Fullana, Miguel A; Golkar, Armita; Haaker, Jan; Heitland, Ivo; Hermann, Andrea; Kuhn, Manuel; Kruse, Onno; Drexler, Shira Meir; Meulders, Ann; Nees, Frauke; Pittig, Andre; Richter, Jan; Römer, Sonja; Shiban, Youssef; Schmitz, Anja; Straube, Benjamin; Vervliet, Bram; Wendt, Julia; Baas, Johanna M P; Merz, Christian J

    2017-03-03

    The so-called 'replicability crisis' has sparked methodological discussions in many areas of science in general, and in psychology in particular. This has led to recent endeavours to promote the transparency, rigour, and ultimately, replicability of research. Originating from this zeitgeist, the challenge to discuss critical issues on terminology, design, methods, and analysis considerations in fear conditioning research is taken up by this work, which involved representatives from fourteen of the major human fear conditioning laboratories in Europe. This compendium is intended to provide a basis for the development of a common procedural and terminology framework for the field of human fear conditioning. Whenever possible, we give general recommendations. When this is not feasible, we provide evidence-based guidance for methodological decisions on study design, outcome measures, and analyses. Importantly, this work is also intended to raise awareness and initiate discussions on crucial questions with respect to data collection, processing, statistical analyses, the impact of subtle procedural changes, and data reporting specifically tailored to the research on fear conditioning.

  10. Measuring exhaled nitric oxide in infants during tidal breathing: methodological issues.

    PubMed

    Franklin, Peter J; Turner, Stephen W; Mutch, Raewyn C; Stick, Stephen M

    2004-01-01

    Exhaled nitric oxide (FENO) may provide a tool for identifying infants at risk of developing allergic disease in childhood. In infants there is no standardized collection technique; however, the easiest method is to measure FENO during tidal breathing. In this study we investigated various methodological issues for tidal breathing (TB) FENO in infants. These included the effect of ambient NO, oral or nasal breathing, sedation, and tidal expiratory flow. Furthermore, we compared TB FENO in 88 infants with and without wheeze. Ambient NO greater than 5 ppb significantly affected FENO. There was no significant difference between NO levels measured during either oral or nasal breathing; however, there was a significant difference between levels collected from infants before and after sedation (P < 0.001). Tidal breathing FENO decreased with increasing tidal flows (P < 0.001) and increased with age (P = 0.002). There was no significant difference in mixed expired NO between healthy and wheezy children, but children with doctor-diagnosed eczema had significantly raised levels (P = 0.014). There seem to be important methodological limitations for measuring FENO in infants during TB. Copyright 2004 Wiley-Liss, Inc.

  11. Aerosol classification using airborne High Spectral Resolution Lidar measurements - methodology and examples

    NASA Astrophysics Data System (ADS)

    Burton, S. P.; Ferrare, R. A.; Hostetler, C. A.; Hair, J. W.; Rogers, R. R.; Obland, M. D.; Butler, C. F.; Cook, A. L.; Harper, D. B.; Froyd, K. D.

    2012-01-01

    The NASA Langley Research Center (LaRC) airborne High Spectral Resolution Lidar (HSRL) on the NASA B200 aircraft has acquired extensive datasets of aerosol extinction (532 nm), aerosol optical depth (AOD) (532 nm), backscatter (532 and 1064 nm), and depolarization (532 and 1064 nm) profiles during 18 field missions that have been conducted over North America since 2006. The lidar measurements of aerosol intensive parameters (lidar ratio, depolarization, backscatter color ratio, and spectral depolarization ratio) are shown to vary with location and aerosol type. A methodology based on observations of known aerosol types is used to qualitatively classify the extensive set of HSRL aerosol measurements into eight separate types. Several examples are presented showing how the aerosol intensive parameters vary with aerosol type and how these aerosols are classified according to this new methodology. The HSRL-based classification reveals vertical variability of aerosol types during the NASA ARCTAS field experiment conducted over Alaska and northwest Canada during 2008. In two examples derived from flights conducted during ARCTAS, the HSRL classification of biomass burning smoke is shown to be consistent with aerosol types derived from coincident airborne in situ measurements of particle size and composition. The HSRL retrievals of AOD and inferences of aerosol types are used to apportion AOD to aerosol type; results of this analysis are shown for several experiments.
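
One simple way to implement a qualitative typing scheme of this kind is nearest-centroid classification in the space of intensive parameters (lidar ratio, depolarization, backscatter color ratio). The centroid values, scaling, and feature choice below are illustrative assumptions, not the HSRL reference values or the paper's actual classification procedure.

```python
import numpy as np

# Assumed type centroids in (lidar ratio [sr], 532 nm depolarization,
# backscatter color ratio) space -- illustrative values only.
centroids = {
    "marine":          np.array([25.0, 0.02, 0.6]),
    "dust":            np.array([45.0, 0.30, 0.5]),
    "smoke":           np.array([65.0, 0.05, 0.8]),
    "urban/pollution": np.array([55.0, 0.04, 1.0]),
}
scale = np.array([70.0, 0.35, 1.2])   # normalize disparate parameter ranges

def classify(obs):
    """Assign the aerosol type whose centroid is nearest in scaled space."""
    dist = {name: np.linalg.norm((obs - c) / scale)
            for name, c in centroids.items()}
    return min(dist, key=dist.get)

# A dust-like observation: high depolarization, moderate lidar ratio
label = classify(np.array([44.0, 0.28, 0.52]))
```

Applied profile by profile, such a classifier reproduces the vertical variability of types the abstract describes, and the per-type AOD apportionment follows by summing extinction over the range bins assigned to each type.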

  12. Aerosol classification using airborne High Spectral Resolution Lidar measurements - methodology and examples

    NASA Astrophysics Data System (ADS)

    Burton, S. P.; Ferrare, R. A.; Hostetler, C. A.; Hair, J. W.; Rogers, R. R.; Obland, M. D.; Butler, C. F.; Cook, A. L.; Harper, D. B.; Froyd, K. D.

    2011-09-01

    The NASA Langley Research Center (LaRC) airborne High Spectral Resolution Lidar (HSRL) on the NASA B200 aircraft has acquired extensive datasets of aerosol extinction (532 nm), aerosol optical thickness (AOT) (532 nm), backscatter (532 and 1064 nm), and depolarization (532 and 1064 nm) profiles during 18 field missions that have been conducted over North America since 2006. The lidar measurements of aerosol intensive parameters (lidar ratio, depolarization, backscatter color ratio, and spectral depolarization ratio) are shown to vary with location and aerosol type. A methodology based on observations of known aerosol types is used to qualitatively classify the extensive set of HSRL aerosol measurements into eight separate types. Several examples are presented showing how the aerosol intensive parameters vary with aerosol type and how these aerosols are classified according to this new methodology. The HSRL-based classification reveals vertical variability of aerosol types during the NASA ARCTAS field experiment conducted over Alaska and northwest Canada during 2008. In two examples derived from flights conducted during ARCTAS, the HSRL classification of biomass burning smoke is shown to be consistent with aerosol types derived from coincident airborne in situ measurements of particle size and composition. The HSRL retrievals of AOT and inferences of aerosol types are used to apportion AOT to aerosol type; results of this analysis are shown for several experiments.

  13. Measurement of Lubricating Condition between Swashplate and Shoe in Swashplate Compressors under Practical Operating Conditions

    NASA Astrophysics Data System (ADS)

    Suzuki, Hisashi; Fukuta, Mitsuhiro; Yanagisawa, Tadashi

    In this paper, the lubricating conditions between the swashplate and the shoe in a swashplate compressor for automotive air conditioners are investigated experimentally. The conditions are measured with an electric resistance method that uses the swashplate and the shoe as electrodes. The instrumented compressor is connected to an experimental cycle with R134a and operated under various conditions of pressure and rotational speed. An improved measurement technique, together with the application of a solid contact ratio to the measurement results, makes it possible to measure the lubricating condition at high rotational speed (more than 8000 rpm) and to predict the occurrence of scuffing between the swashplate and the shoe, enabling a detailed study of lubricating characteristics. The measurements show that the voltage of the contact signal decreases, indicating a better lubricating condition, as the compression pressure decreases and as the rotational speed increases from 1000 rpm to 5000 rpm; above 5000 rpm the lubricating condition tends to worsen. Furthermore, it is confirmed that the lubricating condition under transient operation is clearly worse than that under steady-state operation.

  14. Absolute Radiation Measurements in Earth and Mars Entry Conditions

    NASA Technical Reports Server (NTRS)

    Cruden, Brett A.

    2014-01-01

    This paper reports on the measurement of radiative heating for shock-heated flows which simulate conditions for Mars and Earth entries. Radiation measurements are made in NASA Ames' Electric Arc Shock Tube at velocities from 3-15 km/s in mixtures of N2/O2 and CO2/N2/Ar. The technique and limitations of the measurement are summarized in some detail. The absolute measurements are discussed with regard to spectral features, radiative magnitude, and spatiotemporal trends. Via analysis of the spectra it is possible to extract properties such as electron density and rotational, vibrational, and electronic temperatures. Relaxation behind the shock is analyzed to determine how these properties relax to equilibrium, and the results are used to validate and refine kinetic models. It is found that, for some conditions, some of these values diverge from equilibrium, indicating a lack of similarity between the shock tube and free-flight conditions. Possible reasons for this are discussed.

  15. A method of measuring dynamic strain under electromagnetic forming conditions.

    PubMed

    Chen, Jinling; Xi, Xuekui; Wang, Sijun; Lu, Jun; Guo, Chenglong; Wang, Wenquan; Liu, Enke; Wang, Wenhong; Liu, Lin; Wu, Guangheng

    2016-04-01

    Dynamic strain measurement is important for characterizing mechanical behavior in the electromagnetic forming process, but it has been hindered for years by high strain rates and serious electromagnetic interference. In this work, a simple and effective strain-measuring technique for physical and mechanical behavior studies in the electromagnetic forming process has been developed. High-resolution (∼5 ppm) strain curves of a bulging aluminum tube in a pulsed electromagnetic field have been successfully measured using this technique. The measured strain rate is about 10⁵ s⁻¹, depending on the discharging conditions, nearly one order of magnitude higher than that under conventional split Hopkinson pressure bar loading conditions (∼10⁴ s⁻¹). It has been found that the dynamic fracture toughness of an aluminum alloy is significantly enhanced during electromagnetic forming, which explains why the formability is much larger under electromagnetic forming conditions in comparison with conventional forming processes.

  16. Dielectric Barrier Discharge (DBD) Plasma Actuators Thrust-Measurement Methodology Incorporating New Anti-Thrust Hypothesis

    NASA Technical Reports Server (NTRS)

    Ashpis, David E.; Laun, Matthew C.

    2014-01-01

    We discuss thrust measurements of Dielectric Barrier Discharge (DBD) plasma actuator devices used for aerodynamic active flow control. After a review of our experience with conventional thrust measurement and the significant non-repeatability of its results, we devised a suspended-actuator test setup and now present a methodology of thrust measurements with decreased uncertainty. The methodology consists of frequency scans at constant voltages: the frequency is increased step-wise from several Hz to the maximum frequency of several kHz, then decreased back down to the start frequency. This sequence is performed first at the highest voltage of interest, then repeated at lower voltages. The data in the descending frequency direction are more consistent and are selected for reporting. Sample results show a strong dependence of thrust on humidity, which also affects the consistency and fluctuations of the measurements. We also observed negative values of thrust, or "anti-thrust", at low frequencies from 4 Hz up to 64 Hz. The anti-thrust is proportional to the mean-squared voltage and is frequency independent. Departures from the parabolic anti-thrust curve are correlated with the appearance of visible plasma discharges. We propose the anti-thrust hypothesis: the measured thrust is a sum of plasma thrust and anti-thrust, where the anti-thrust exists at all frequencies and voltages and depends on the actuator geometry and materials and on the test installation. This enables the separation of the plasma thrust from the measured total thrust, and in turn more meaningful comparisons between actuators at different installations and laboratories. The dependence on test installation was validated by surrounding the actuator with a large-diameter, grounded metal sleeve.
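
The anti-thrust decomposition can be exercised numerically: model the measured thrust as plasma thrust plus a frequency-independent anti-thrust proportional to the mean-squared voltage, estimate the proportionality constant from the low-frequency points where the plasma contribution vanishes, and subtract. The data, constants, and plasma-thrust curve below are synthetic assumptions, not the paper's measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

# T_measured = T_plasma(f) - k * Vrms^2, with the anti-thrust term
# frequency independent. All numbers below are assumed for the sketch.
k_true = 2.0e-12                        # anti-thrust coefficient, N/V^2
v_rms = 7000.0                          # drive voltage, V
freqs = np.array([4., 8., 16., 32., 64., 250., 500., 1000., 2000.])  # Hz

def plasma_thrust(f):
    # hypothetical plasma contribution: zero at low frequency, then growing
    return np.where(f > 64.0, 2.0e-5 * np.log(f / 64.0), 0.0)

measured = (plasma_thrust(freqs) - k_true * v_rms**2
            + rng.normal(0.0, 2.0e-7, freqs.size))   # N, with sensor noise

# 1) Fit k from the low-frequency points, where plasma thrust vanishes
low = freqs <= 64.0
k_est = -measured[low].mean() / v_rms**2

# 2) Subtract the anti-thrust to recover the plasma thrust alone
plasma_est = measured + k_est * v_rms**2
```

Repeating the fit at several voltages would check the parabolic (mean-squared voltage) dependence the abstract reports, and installation-dependent values of k would reflect the sleeve validation experiment.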

  17. Optimization of Culture Conditions for Fermentation of Soymilk Using Lactobacillus casei by Response Surface Methodology.

    PubMed

    Khoshayand, Feriyar; Goodarzi, Sanaz; Shahverdi, Ahmad Reza; Khoshayand, Mohammad Reza

    2011-12-01

    Soymilk was fermented with Lactobacillus casei, and statistical experimental design was used to investigate factors affecting the viable cell count of L. casei, including temperature, glucose, niacin, riboflavin, pyridoxine, folic acid and pantothenic acid. Initial screening by Plackett-Burman design revealed that among these factors, temperature, glucose and niacin have significant effects on the growth of L. casei. Further optimization with Box-Behnken design and response surface analysis showed that a second-order polynomial model fits the experimental data appropriately. The optimum conditions for temperature, glucose and niacin were found to be 15.77 °C, 5.23 g/L and 0.63 g/L, respectively. The concentration of viable L. casei cells under these conditions was 8.23 log₁₀ CFU/mL. The close agreement between the observed values and the values predicted by the equation confirms the statistical significance of the model and its adequate precision in predicting optimum conditions.
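
The response-surface step can be sketched as an ordinary least-squares fit of a second-order polynomial followed by solving for the stationary point. The synthetic data, coefficients, and the random design (used here in place of the Box-Behnken runs for brevity) are all assumptions, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(3)

# Coded factor levels for temperature (x1), glucose (x2), niacin (x3);
# a random design stands in for the Box-Behnken runs in this sketch.
X = rng.uniform(-1, 1, size=(30, 3))

def true_response(x):
    # assumed quadratic with optimum at coded (0, 0, 0)
    return 8.2 - 1.0 * x[:, 0]**2 - 0.6 * x[:, 1]**2 - 0.4 * x[:, 2]**2

y = true_response(X) + rng.normal(0.0, 0.02, 30)   # log10 CFU/mL "data"

# Second-order model: intercept, linear, interaction, and square terms
def design(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

# Stationary point of the fitted quadratic y = b0 + b.x + x'Bx:
# grad = b + 2Bx = 0  =>  x_opt = solve(-2B, b)
b = beta[1:4]
B = np.array([[beta[7],     beta[4] / 2, beta[5] / 2],
              [beta[4] / 2, beta[8],     beta[6] / 2],
              [beta[5] / 2, beta[6] / 2, beta[9]]])
x_opt = np.linalg.solve(-2 * B, b)              # optimum in coded units
y_opt = design(x_opt[None, :]) @ beta           # predicted response there
```

Decoding x_opt back to physical units (°C and g/L) gives optimum settings analogous to the 15.77 °C, 5.23 g/L and 0.63 g/L reported above; checking that B is negative definite confirms the stationary point is a maximum.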

  18. Technical Guide for "Measuring Up 2006": Documenting Methodology, Indicators, and Data Sources. National Center #06-6

    ERIC Educational Resources Information Center

    National Center for Public Policy and Higher Education, 2006

    2006-01-01

    This "Technical Guide" describes the methodology and concepts used to measure and grade the performance of the 50 states in the higher education arena. Part I presents the methodology for grading states and provides information on data collection and reporting. Part II explains the indicators that comprise each of the graded categories.…

  19. Financial Measures Project: Measuring Financial Conditions of Colleges and Universities, 1978 Working Conference.

    ERIC Educational Resources Information Center

    Coldren, Sharon L., Ed.

    Papers are presented from a 1978 working conference on measuring financial conditions of colleges and universities. Contents include the following: "The Federal Government's Interest in the Development of Financial Measures" by M. Chandler; "Improving the Conceptual Framework for Measuring Financial Condition Using Institutional…

  20. Effects of Specimen Collection Methodologies and Storage Conditions on the Short-Term Stability of Oral Microbiome Taxonomy

    PubMed Central

    Luo, Ting; Srinivasan, Usha; Ramadugu, Kirtana; Shedden, Kerby A.; Neiswanger, Katherine; Trumble, Erika; Li, Jiean J.; McNeil, Daniel W.; Crout, Richard J.; Weyant, Robert J.; Marazita, Mary L.

    2016-01-01

    … resources available at a study site, and shipping requirements. The research presented in this paper measures the effects of multiple storage parameters and collection methodologies on the measured ecology of the oral microbiome from healthy adults and children. These results will potentially enable investigators to conduct oral microbiome studies at maximal efficiency by guiding informed administrative decisions pertaining to the necessary field or clinical work. PMID:27371581

  1. Effects of Specimen Collection Methodologies and Storage Conditions on the Short-Term Stability of Oral Microbiome Taxonomy.

    PubMed

    Luo, Ting; Srinivasan, Usha; Ramadugu, Kirtana; Shedden, Kerby A; Neiswanger, Katherine; Trumble, Erika; Li, Jiean J; McNeil, Daniel W; Crout, Richard J; Weyant, Robert J; Marazita, Mary L; Foxman, Betsy

    2016-09-15

    … a study site, and shipping requirements. The research presented in this paper measures the effects of multiple storage parameters and collection methodologies on the measured ecology of the oral microbiome from healthy adults and children. These results will potentially enable investigators to conduct oral microbiome studies at maximal efficiency by guiding informed administrative decisions pertaining to the necessary field or clinical work. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  2. Hyper-X Mach 10 Engine Flowpath Development: Fifth Entry Test Conditions and Methodology

    NASA Technical Reports Server (NTRS)

    Bakos, R. J.; Tsai, C.-Y.; Rogers, R. C.; Shih, A. T.

    2001-01-01

    A series of Hyper-X Mach 10 flowpath ground tests are underway to obtain engine performance and operation data and to confirm and refine the flowpath design methods. The model used is a full-scale-height, partial-width replica of the Hyper-X Research Vehicle propulsive flowpath with truncated forebody and aftbody. This is the fifth test entry for this model in the NASA-HYPULSE facility at GASL. For this entry the facility nozzle and model forebody were modified to better simulate the engine inflow conditions at the target flight conditions: the forebody was changed to a wide flat plate with no flow fences, the facility nozzle Mach number was increased, and the model was positioned for testing in a semi-direct-connect arrangement. This paper presents a review of the test conditions and model calibrations, and a description of steady-flow confirmation. The test series included runs using hydrogen fuel and a silane-in-hydrogen fuel mixture. Other test parameters included the model mounting angle (relative to the tunnel flow) and the test-gas oxygen fraction, adjusted to account for the presence of NO in the test gas at the Mach 10 conditions.

  3. Using intensive indicators of wetland condition to evaluate a rapid assessment methodology in Oregon tidal wetlands

    EPA Science Inventory

    In 2011, the US EPA and its partners will conduct the first-ever national survey on the condition of the Nation’s wetlands. This survey will utilize a three-tiered assessment approach that includes landscape level indicators (Level 1), rapid indicators (Level 2), and intensive, ...

  5. Deception detection with behavioral, autonomic, and neural measures: Conceptual and methodological considerations that warrant modesty.

    PubMed

    Meijer, Ewout H; Verschuere, Bruno; Gamer, Matthias; Merckelbach, Harald; Ben-Shakhar, Gershon

    2016-05-01

    The detection of deception has attracted increased attention among psychological researchers, legal scholars, and ethicists during the last decade, driven largely by the possibility of using neuroimaging techniques for lie detection. Yet neuroimaging studies of deception detection are clouded by a lack of conceptual clarity and a host of methodological problems that are not unique to neuroimaging. We review the various research paradigms and dependent measures that have been adopted to study deception and its detection. In doing so, we differentiate between basic research designed to shed light on the neurocognitive mechanisms underlying deceptive behavior and applied research aimed at detecting lies. We also stress the distinction between paradigms attempting to detect deception directly and those attempting to establish involvement by detecting crime-related knowledge, and discuss the methodological difficulties and threats to validity associated with each paradigm. Our conclusion is that the main challenge for future research is to find paradigms that can isolate the cognitive factors associated with deception, rather than to discover a unique (brain) correlate of lying. We argue that the Comparison Question Test currently applied in many countries has weak scientific validity, which cannot be remedied by using neuroimaging measures. Other paradigms are promising, but the absence of data from ecologically valid studies poses a challenge for the legal admissibility of their outcomes.

  6. Gas concentration measurement by optical similitude absorption spectroscopy: methodology and experimental demonstration.

    PubMed

    Anselmo, Christophe; Welschinger, Jean-Yves; Cariou, Jean-Pierre; Miffre, Alain; Rairoux, Patrick

    2016-06-13

    We propose a new methodology to measure gas concentration by light-absorption spectroscopy when the light source spectrum is broader than the spectral width of one or several molecular absorption lines. We name it optical similitude absorption spectroscopy (OSAS), as the gas concentration is derived from a similitude between the light source and target gas spectra. The main novelty of OSAS lies in the development of a robust inversion methodology, based on the Newton-Raphson algorithm, which retrieves the target gas concentration from spectrally integrated differential light-absorption measurements. As a proof of concept, OSAS is applied in the laboratory to the 2ν3 methane absorption band at 1.66 µm, with uncertainties characterized by the Allan variance. OSAS has also been applied to non-dispersive infrared and optical correlation spectroscopy arrangements. This all-optical gas concentration retrieval does not require a gas calibration cell and opens new tracks for monitoring atmospheric gas pollution and greenhouse gas sources.
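    A minimal sketch of an OSAS-style retrieval, assuming a simple Beer-Lambert forward model with synthetic spectra (the actual method works on measured source and absorption spectra); a Newton-Raphson iteration inverts the spectrally integrated transmission for the concentration:

```python
import numpy as np

wl = np.linspace(0.0, 1.0, 201)                 # arbitrary spectral axis
I0 = np.exp(-((wl - 0.5) / 0.2) ** 2)           # broadband source spectrum
sigma = 1.0 + 0.8 * np.sin(8 * np.pi * wl)      # synthetic absorption cross section
L = 1.0                                         # optical path length

def transmitted_fraction(C):
    """Spectrally integrated transmission for gas concentration C (forward model)."""
    return np.sum(I0 * np.exp(-C * sigma * L)) / np.sum(I0)

def retrieve_concentration(measured, C0=0.1, tol=1e-12, itmax=50):
    """Newton-Raphson on f(C) = transmitted_fraction(C) - measured."""
    C = C0
    for _ in range(itmax):
        f = transmitted_fraction(C) - measured
        # analytic derivative of the forward model with respect to C
        df = -np.sum(I0 * sigma * L * np.exp(-C * sigma * L)) / np.sum(I0)
        step = f / df
        C -= step
        if abs(step) < tol:
            break
    return C

# Round trip: simulate a measurement at C = 0.7, then invert it.
C_hat = retrieve_concentration(transmitted_fraction(0.7))
print(C_hat)   # converges back to 0.7
```

    Because the integrated transmission is smooth and monotone in C, the Newton-Raphson iteration converges in a few steps from a rough initial guess.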

  7. Advanced haptic sensor for measuring human skin conditions

    NASA Astrophysics Data System (ADS)

    Tsuchimi, Daisuke; Okuyama, Takeshi; Tanaka, Mami

    2009-12-01

    This paper is concerned with the development of a tactile sensor using PVDF (polyvinylidene fluoride) film as the sensory receptor to evaluate the softness, smoothness, and stickiness of human skin. Touch is among the most important senses of the human body along with eyesight, and we can examine skin condition quickly by touch. However, its subjectivity and ambiguity make it difficult to quantify skin conditions. A measurement device that can evaluate skin conditions easily and objectively is therefore in demand among dermatologists, the cosmetics industry, and others. In this paper, an advanced haptic sensor system that can measure multiple indicators of skin condition on various parts of the human body is developed. The application of the sensor system to evaluating the softness, smoothness, and stickiness of skin is investigated through two experiments.

  9. Multispectroscopic methodology to study Libyan desert glass and its formation conditions.

    PubMed

    Gomez-Nubla, Leticia; Aramendia, Julene; Fdez-Ortiz de Vallejuelo, Silvia; Alonso-Olazabal, Ainhoa; Castro, Kepa; Zuluaga, Maria Cruz; Ortega, Luis Ángel; Murelaga, Xabier; Madariaga, Juan Manuel

    2017-03-27

    Libyan desert glass (LDG) is a melt product whose origin is still a matter of controversy. To add new information to this enigma, the present paper analyzes the inner part of LDG specimens and compares the results with those from LDG surfaces. An integrated analytical methodology was used, combining techniques such as Raman spectroscopy in point-by-point and imaging modes, scanning electron microscopy with X-ray microanalysis (SEM-EDS), energy-dispersive micro X-ray fluorescence spectrometry (μ-EDXRF), electron probe microanalysis (EPMA), and optical cathodoluminescence (optical-CL). According to our results, flow structures of the melt and the amorphous nature of the matrix could be discerned. Moreover, the observed displacement of Raman bands, as in the cases of quartz and zircon, and the identification of certain compounds such as coesite (the clearest indicator of high pressure), α-cristobalite, gypsum, anhydrite, corundum, rutile, amorphous calcite, aragonite, and calcite indicate that the LDGs could have been subjected to shock pressures between 6 and more than 30 GPa, and temperatures between 300 and 1470 °C. The differences in temperature and pressure are likely due to different cooling processes during the impact. In most cases the minerals corresponding to high pressures and temperatures were located in the inner part of the LDGs, with some exceptions that could be explained if they were trapped after the impact, if there was more than one impact, or if cooling was heterogeneous. Furthermore, nitrogen and oxygen gases were identified inside bubbles; these could have been introduced from the terrestrial atmosphere during the meteorite impact. These data help clarify some clues about the origin of these enigmatic samples.

  10. Noncontacting measurement technologies for space propulsion condition monitoring

    NASA Technical Reports Server (NTRS)

    Randall, M. R.; Barkhoudarian, S.; Collins, J. J.; Schwartzbart, A.

    1987-01-01

    This paper describes four noncontacting measurement technologies that can be used in a turbopump condition monitoring system. The isotope wear analyzer, fiberoptic deflectometer, brushless torque-meter, and fiberoptic pyrometer can be used to monitor component wear, bearing degradation, instantaneous shaft torque, and turbine blade cracking, respectively. A complete turbopump condition monitoring system including these four technologies could predict remaining component life, thus reducing engine operating costs and increasing reliability.

  12. Measuring Pavlovian fear with conditioned freezing and conditioned suppression reveals different roles for the basolateral amygdala.

    PubMed

    McDannald, Michael A; Galarce, Ezequiel M

    2011-02-16

    In Pavlovian fear conditioning, pairing a neutral cue with aversive foot shock endows the cue with fear-eliciting properties. Studies of Pavlovian fear conditioning measuring freezing have demonstrated that the basolateral amygdala (BLA) is critical to both fear learning and memory. The nucleus accumbens core (NAc), while not important for freezing, is important for the enhancement of instrumental responding by cues paired with food reward. In the present study we investigated the role of the BLA and the NAc in another property of fear cues, the ability to suppress instrumental responding for food rewards (conditioned suppression). Sham, BLA- and NAc-lesioned rats received a fear discrimination procedure in which one visual cue (CS+) predicted foot shock while a second cue (CS-) did not. Conditioning took place over a baseline of instrumental responding, allowing concurrent measurement of freezing and instrumental suppression. NAc lesions left fear conditioning fully intact. BLA lesions impaired acquisition and discrimination of fear when assessed with conditioned freezing. However, BLA lesions only altered fear acquisition and left discrimination completely intact when assessed with conditioned suppression. These findings suggest a critical role for the BLA in fear when assessed with conditioned freezing but a diminished role when assessed with conditioned suppression.

  13. Analytical methodology for the study of endosulfan bioremediation under controlled conditions with white rot fungi.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cesio, Verónica; Cerdeiras, M Pía; Heinzen, Horacio

    2012-10-15

    A general procedure to study the biodegradation of endosulfan under laboratory conditions by white rot fungi isolated from native sources growing in YNB (yeast nitrogen base) medium with 1% glucose is presented. The evaluation of endosulfan biodegradation, as well as of endosulfan sulfate, endosulfan ether and endosulfan alcohol production throughout the whole bioremediation process, was performed using an original and straightforward validated analytical procedure, with recoveries between 78 and 112% at all concentration levels studied except for endosulfan sulfate at 50 mg L(-1), which yielded 128%, and RSDs < 20%. Under the developed conditions, the basidiomycete Bjerkandera adusta was able to degrade 83% of (alpha+beta) endosulfan after 27 days; 6 mg kg(-1) of endosulfan diol was determined, and endosulfan ether and endosulfan sulfate were produced below 1 mg kg(-1) (LOQ, limit of quantitation).

  14. Methodological approach towards the definition of new storage conditions for inert wastes.

    PubMed

    Perrodin, Y; Méhu, J; Grelier-Volatier, L; Charbonnier, P; Baranger, P; Thoraval, L

    2002-01-01

    In 1997, the French Ministry of Environment launched studies aiming to define a specific regulation for inert waste disposal, in order to limit the potential impact of such facilities on the environment by fixing minimum requirements. A coupled chemical/hydrodynamic model was developed to determine disposal conditions. This model was then applied to two defined scenarios (landfill surface, effective rainfalls...) in order to study sulphate concentrations in the aquifer system immediately downstream from the storage facility. The results allow us to determine under which conditions the sulphate concentrations are compatible with the potentially drinkable character of the groundwater. These conditions concern, more specifically, the nature of the waste disposed of, the effective rainfall and the landfill area.

  15. Advanced methodology for measuring the extensive elastic compliance and mechanical loss directly in k31 mode piezoelectric ceramic plates

    NASA Astrophysics Data System (ADS)

    Majzoubi, Maryam; Shekhani, Husain N.; Bansal, Anushka; Hennig, Eberhard; Scholehwar, Timo; Uchino, Kenji

    2016-12-01

    Dielectric, elastic, and piezoelectric constants, and their corresponding losses, are defined under constant conditions of two categories: intensive (i.e., E, electric field, or T, stress) and extensive (i.e., D, dielectric displacement, or x, strain). So far, only the intensive parameters and losses could be measured directly in a k31 mode sample; their corresponding extensive parameters could be calculated indirectly using the coupling factor and "K" matrix. However, the extensive loss parameters calculated through this indirect method can have large uncertainty due to error propagation in the calculation. To overcome this issue, extensive losses should be measured separately from the measurable intensive ones in lead zirconate titanate (PZT) k31 mode rectangular plate ceramics. We propose a new mechanical-excitation methodology, using a non-destructive testing approach by means of a partial-electrode configuration instead of the conventional full-electrode configuration. For this purpose, a sample was prepared in which the electrode covered only 10% of the top and bottom surfaces at the center, both to actuate the whole sample and to monitor the responding vibration. The admittance spectrum of this sample corresponds to PZT properties under the constant dielectric displacement (D) condition. Furthermore, ceramics with partial electrodes were also prepared to create short- and open-circuit boundary conditions, corresponding to resonance and anti-resonance modes. In the proposed way, we were able to measure both intensive and extensive elastic compliances and mechanical losses directly for the first time. The accuracy of this new method is compared with that of the conventional measurements based on indirect calculations. The preliminary results (neglecting the 10% actuator-part difference at this point) were in good agreement (less than 3% difference) with the previous indirect method.

  16. Intercomparison of magnetic field measurements near MV/LV transformer substations: methodological planning and results.

    PubMed

    Violanti, S; Fraschetta, M; Adda, S; Caputo, E

    2009-12-01

    Within the framework of the Environmental Agencies system's activities, coordinated by ISPRA (Superior Institute for Environmental Protection and Research), an intercomparison of measurements was designed and carried out in order to examine measurement problems in depth and to evaluate magnetic fields at power frequencies. These measurements were taken near medium-voltage/low-voltage transformer substations. The project was developed with the contribution of several experts from different Regional Agencies. In three of the regions, substations having specific international-standard characteristics were chosen; a measurement and data analysis protocol was then arranged. Data analysis showed a good level of coherence among the results obtained by different laboratories. However, a range of problems emerged, both during the drafting of the protocol and the definition of the data analysis procedure, and during the execution of measurements and data reprocessing, because of the spatial and temporal variability of the magnetic field. These problems are of particular interest in determining a correct measurement methodology whose purpose is comparison with exposure limits, attention values and quality targets.

  17. Comparison of methodologies in determining bone marrow fat percentage under different environmental conditions.

    PubMed

    Murden, David; Hunnam, Jaimie; De Groef, Bert; Rawlin, Grant; McCowan, Christina

    2017-01-01

    The use of bone marrow fat percentage has been recommended in assessing body condition at the time of death in wild and domestic ruminants, but few studies have looked at the effects of time and exposure on animal bone marrow. We investigated the utility of bone marrow fat extraction as a tool for establishing antemortem body condition in postmortem specimens from sheep and cattle, particularly after exposure to high heat, and compared different techniques of fat extraction for this purpose. Femora were collected from healthy and "skinny" sheep and cattle. The bones were either frozen or subjected to 40°C heat; heated bones were either wrapped in plastic to minimize desiccation or were left unwrapped. Marrow fat percentage was determined at different time intervals by oven-drying, or by solvent extraction using hexane in manual equipment or a Soxhlet apparatus. Extraction was performed, where possible, on both wet and dried tissue. Multiple samples were tested from each bone. Bone marrow fat analysis using a manual, hexane-based extraction technique was found to be a moderately sensitive method of assessing antemortem body condition of cattle up to 6 d after death. Multiple replicates should be analyzed where possible. Samples from "skinny" sheep showed a different response to heat from those of "healthy" sheep; "skinny" samples were so reduced in quantity by day 6 (the first sampling day) that no individual testing could be performed. Further work is required to understand the response of sheep marrow.

  18. A smartphone-driven methodology for estimating physical activities and energy expenditure in free living conditions.

    PubMed

    Guidoux, Romain; Duclos, Martine; Fleury, Gérard; Lacomme, Philippe; Lamaudière, Nicolas; Manenq, Pierre-Henri; Paris, Ludivine; Ren, Libo; Rousset, Sylvie

    2014-12-01

    This paper introduces a function dedicated to the estimation of the total energy expenditure (TEE) of daily activities, based on data from accelerometers integrated into smartphones. The use of mass-market sensors such as accelerometers offers a promising solution for the general public, given the growth of the smartphone market over the last decade. The quality of the TEE estimation function was evaluated using data from intensive numerical experiments based, first, on 12 volunteers equipped with a smartphone and two research sensors (Armband and Actiheart) in controlled conditions (CC) and, then, on 30 other volunteers in free-living conditions (FLC). The TEE given by the two research sensors in both conditions, and the TEE estimated from metabolic equivalent of task (MET) values in CC, served as references during the creation and evaluation of the function. The mean absolute gap between the function and the three references was 7.0%, 16.4% and 2.7% in CC, and 17.0% and 23.7% relative to Armband and Actiheart, respectively, in FLC. This is the first step in the definition of a new feedback mechanism that promotes self-management and daily efficiency evaluation of physical activity as part of an information system dedicated to the prevention of chronic diseases.
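    The reported mean absolute gaps can be reproduced in form (not in value) with a simple mean absolute percentage error; the TEE values below are hypothetical, not the study's data:

```python
def mean_absolute_percentage_gap(estimates, references):
    """Mean of |estimate - reference| / reference, expressed in percent."""
    return 100.0 * sum(abs(e - r) / r for e, r in zip(estimates, references)) / len(references)

# Hypothetical daily TEE values (kcal) from the smartphone function vs. one
# reference sensor:
est = [2100, 2500, 1900]
ref = [2000, 2400, 2000]
print(mean_absolute_percentage_gap(est, ref))   # ≈ 4.72 (%)
```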

  19. [Methodology and Implementation of Forced Oscillation Technique for Respiratory Mechanics Measurement].

    PubMed

    Zhang, Zhengbo; Ni, Lu; Liu, Xiaoli; Li, Deyu; Wang, Weidong

    2015-11-01

    The forced oscillation technique (FOT) is a noninvasive method for measuring respiratory mechanics. In the FOT, external signals (e.g., forced oscillations at around 4-40 Hz) are used to drive the respiratory system, and the mechanical characteristics of the respiratory system can be determined using linear system identification theory. Respiratory mechanical properties and components at different frequencies and locations in the airway can thus be explored with specifically developed forcing waveforms. In this paper, the theory, methodology and clinical application of the FOT are reviewed, including measurement theory, driving signals, models of the respiratory system, algorithms for impedance identification, and requirements on the apparatus. Finally, the future development of this technique is also discussed.
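    The linear-system-identification step can be sketched with a synthetic, purely resistive load. The single-frequency Fourier-ratio estimator below is one standard approach, not necessarily the algorithm reviewed in the paper:

```python
import numpy as np

fs = 1000.0                       # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)      # 10 s record
f0 = 8.0                          # forcing frequency, Hz (within the 4-40 Hz band)

# Synthetic forced oscillation: flow input into a purely resistive
# respiratory system (R = 2 cmH2O·s/L) plus measurement noise.
rng = np.random.default_rng(0)
flow = np.sin(2 * np.pi * f0 * t)
pressure = 2.0 * flow + 0.05 * rng.standard_normal(t.size)

# Impedance at the forcing frequency as the ratio of Fourier coefficients
# of pressure and flow (a basic linear-system-identification estimate).
F = np.fft.rfft(flow)
P = np.fft.rfft(pressure)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
idx = np.argmin(np.abs(freqs - f0))
Z = P[idx] / F[idx]
print(abs(Z))                     # ≈ 2, recovering the resistance
```

    Repeating the estimate at each component of a multi-frequency forcing waveform yields the impedance spectrum Zrs(f).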

  20. [Methodological issues in the measurement of alcohol consumption: the importance of drinking patterns].

    PubMed

    Valencia Martín, José L; González, M José; Galán, Iñaki

    2014-08-01

    Measurement of alcohol consumption is essential for proper investigation of its effects on health. However, its estimation is extremely complex, because of the diversity of forms of alcohol consumption and their highly heterogeneous classification. Moreover, each form may have different effects on health; therefore, not considering the most important drinking patterns when estimating alcohol intake could mask the important role of consumption patterns in these effects. All these issues make it very difficult to compare the results of different studies and to establish consistent associations for understanding the true effects of alcohol consumption, both overall and specific to each drinking pattern. This article reviews the main methods and sources of information available in Spain for estimating the most important aspects of alcohol consumption, as well as the most frequent methodological problems encountered in the measurement and classification of drinking patterns.

  1. Optimization of fermentation conditions for 1,3-propanediol production by marine Klebsiella pneumoniae HSL4 using response surface methodology

    NASA Astrophysics Data System (ADS)

    Li, Lili; Zhou, Sheng; Ji, Huasong; Gao, Ren; Qin, Qiwei

    2014-09-01

    The industrially important organic compound 1,3-propanediol (1,3-PDO) is mainly used as a building block for the production of various polymers. In the present study, a response surface methodology protocol was followed to determine and optimize fermentation conditions for maximum production of 1,3-PDO using marine-derived Klebsiella pneumoniae HSL4. Four nutritional supplements together with three independent culture conditions were optimized as follows: 29.3 g/L glycerol, 8.0 g/L K2HPO4, 7.6 g/L (NH4)2SO4, 3.0 g/L KH2PO4, pH 7.1, and cultivation at 35°C for 12 h. Under the optimal conditions, a maximum 1,3-PDO concentration of 14.5 g/L, a productivity of 1.21 g/(L·h) and a glycerol conversion of 0.49 g/g were obtained. In comparison with the control conditions, fermentation under the optimized conditions achieved increases of 38.8% in 1,3-PDO concentration, 39.0% in productivity and 25.7% in glycerol conversion in flasks. This enhancement was further confirmed when the fermentation was conducted in a 5-L fermentor. The optimized fermentation conditions could be an important basis for developing low-cost, large-scale methods for industrial production of 1,3-PDO in the future.

  2. An ultrasonic methodology for muscle cross section measurement of support space flight

    NASA Astrophysics Data System (ADS)

    Hatfield, Thomas R.; Klaus, David M.; Simske, Steven J.

    2004-09-01

    The number one priority for any manned space mission is the health and safety of its crew. The study of the short and long term physiological effects on humans is paramount to ensuring crew health and mission success. One of the challenges associated in studying the physiological effects of space flight on humans, such as loss of bone and muscle mass, has been that of readily attaining the data needed to characterize the changes. The small sampling size of astronauts, together with the fact that most physiological data collection tends to be rather tedious, continues to hinder elucidation of the underlying mechanisms responsible for the observed changes that occur in space. Better characterization of the muscle loss experienced by astronauts requires that new technologies be implemented. To this end, we have begun to validate a 360° ultrasonic scanning methodology for muscle measurements and have performed empirical sampling of a limb surrogate for comparison. Ultrasonic wave propagation was simulated using 144 stations of rotated arm and calf MRI images. These simulations were intended to provide a preliminary check of the scanning methodology and data analysis before its implementation with hardware. Pulse-echo waveforms were processed for each rotation station to characterize fat, muscle, bone, and limb boundary interfaces. The percentage error between MRI reference values and calculated muscle areas, as determined from reflection points for calf and arm cross sections, was -2.179% and +2.129%, respectively. These successful simulations suggest that ultrasound pulse scanning can be used to effectively determine limb cross-sectional areas. Cross-sectional images of a limb surrogate were then used to simulate signal measurements at several rotation angles, with ultrasonic pulse-echo sampling performed experimentally at the same stations on the actual limb surrogate to corroborate the results. The objective of the surrogate sampling was to compare the signal
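    The cross-sectional area computation from boundary reflection points at the rotation stations can be sketched in polar form; the boundary radii below are synthetic, not derived from the MRI data:

```python
import numpy as np

n = 144                                   # rotation stations, as in the study
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
# Hypothetical boundary radii recovered from pulse-echo reflection times;
# a slightly elliptical "limb" of mean radius 5 cm.
r = 5.0 + 0.3 * np.cos(2 * theta)

# Cross-sectional area from the polar boundary: A = (1/2) * integral of r^2 dtheta
area = 0.5 * np.sum(r ** 2) * (2 * np.pi / n)

# Analytic check for r = a + b*cos(2*theta): A = pi * (a^2 + b^2 / 2)
print(area, np.pi * (5.0 ** 2 + 0.3 ** 2 / 2))
```

    Comparing such areas against MRI reference values is what produces the percentage errors quoted in the abstract.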

  3. A new metric for measuring condition in large predatory sharks.

    PubMed

    Irschick, D J; Hammerschlag, N

    2014-09-01

    A simple metric (span condition analysis; SCA) is presented for quantifying the condition of sharks based on four measurements of body girth relative to body length. Data on 104 live sharks from four species that vary in body form, behaviour and habitat use (Carcharhinus leucas, Carcharhinus limbatus, Ginglymostoma cirratum and Galeocerdo cuvier) are given. Condition shows similar levels of variability among individuals within each species. Carcharhinus leucas showed a positive relationship between condition and body size, whereas the other three species showed no relationship. There was little evidence for strong differences in condition between males and females, although more male sharks are needed for some species (e.g. G. cuvier) to verify this finding. SCA is potentially viable for other large marine or terrestrial animals that are captured live and then released.

  4. Conditions of viscosity measurement for detecting irradiated peppers

    NASA Astrophysics Data System (ADS)

    Hayashi, Toru; Todoriki, Setsuko; Okadome, Hiroshi; Kohyama, Kaoru

    1995-04-01

    The viscosity of gelatinized suspensions of black and white peppers decreased in a dose-dependent manner. The viscosity was influenced by the gelatinization and viscosity measurement conditions: the difference between unirradiated and irradiated pepper was larger at higher gelatinization pH and temperature. A viscosity parameter normalized to the starch content of the pepper sample and to the viscosity of a 5% corn starch suspension eliminated the influence of viscosity measurement conditions such as viscometer type, shear rate and temperature.

  5. Radiation measurements during cavities conditioning on APS RF test stand

    SciTech Connect

    Grudzien, D.M.; Kustom, R.L.; Moe, H.J.; Song, J.J.

    1993-07-01

    In order to determine the shielding structure around the Advanced Photon Source (APS) synchrotron and storage ring RF stations, the X-ray radiation has been measured in the near field and far field regions of the RF cavities during the normal conditioning process. Two cavity types, a prototype 352-MHz single-cell cavity and a 352-MHz five-cell cavity, are used on the APS and are conditioned in the RF test stand. Vacuum measurements are also taken on a prototype 352-MHz single-cell cavity and a 352-MHz five-cell cavity. The data will be compared with data on the five-cell cavities from CERN.

  6. Mass measuring instrument for use under microgravity conditions

    SciTech Connect

    Fujii, Yusaku; Yokota, Masayuki; Hashimoto, Seiji; Sugita, Yoichi; Ito, Hitomi; Shimada, Kazuhito

    2008-05-15

    A prototype instrument for measuring astronaut body mass under microgravity conditions has been developed, and its performance was evaluated in parabolic flight tests. The instrument, called the space scale, is applied as follows: the subject astronaut is connected to the space scale with a rubber cord; a force transducer measures the force acting on the subject, while an optical interferometer measures the subject's velocity. The subject's mass is then calculated as the impulse divided by the velocity change, i.e., M = ∫F dt / Δv. Parabolic flight in a jet aircraft produces a zero-gravity condition lasting approximately 20 s. The performance of the prototype space scale was evaluated during such a flight by measuring the mass of a sample object.
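    The impulse-over-velocity-change calculation described above is easy to make concrete. The sketch below uses hypothetical sampled data and trapezoidal integration (not the flight instrument's actual signal processing) to recover a 70 kg mass from a simulated 50 N pull:

```python
def mass_from_impulse(force, velocity, dt):
    """Estimate mass as M = (integral of F dt) / delta_v.

    force:    list of force samples [N] over the measurement window
    velocity: list of velocity samples [m/s] at the same instants
    dt:       sampling interval [s]
    """
    # Trapezoidal approximation of the impulse integral of F dt
    impulse = sum((force[i] + force[i + 1]) * dt / 2.0
                  for i in range(len(force) - 1))
    delta_v = velocity[-1] - velocity[0]
    return impulse / delta_v

# Hypothetical data: a constant 50 N pull on a 70 kg subject for 1 s
# changes the velocity by 50/70 m/s.
n, dt = 101, 0.01
force = [50.0] * n
velocity = [50.0 / 70.0 * i * dt for i in range(n)]
print(round(mass_from_impulse(force, velocity, dt), 3))  # ≈ 70.0
```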

  7. An unstructured direct simulation Monte Carlo methodology with Kinetic-Moment inflow and outflow boundary conditions

    NASA Astrophysics Data System (ADS)

    Gatsonis, Nikolaos A.; Chamberlin, Ryan E.; Averkin, Sergey N.

    2013-01-01

    The mathematical and computational aspects of the direct simulation Monte Carlo method on unstructured tetrahedral grids (U3DSMC) with a Kinetic-Moment (KM) boundary conditions method are presented. The algorithms for particle injection, particle loading, particle motion, and particle tracking are presented. The KM method, applicable to subsonic or supersonic inflow/outflow boundaries, couples kinetic (particle) U3DSMC properties with fluid (moment) properties. The KM method obtains the number density, temperature and mean velocity needed to define the equilibrium drifting Maxwellian distribution at a boundary. The moment component of KM is based on the local one-dimensional inviscid (LODI) boundary conditions method consistent with the 5-moment compressible Euler equations. The kinetic component of KM is based on U3DSMC for interior properties and on the equilibrium drifting Maxwellian at the boundary. The KM method is supplemented with a time-averaging procedure that allows for choices in sampling-cell procedures, minimizes fluctuations, and accelerates convergence in subsonic flows. Collision sampling in U3DSMC implements the no-time-counter method and includes elastic and inelastic collisions. The U3DSMC method with KM boundary conditions is validated and verified extensively with simulations of subsonic nitrogen flows in a cylindrical tube with imposed inlet pressure and density and imposed outlet pressure. The simulations cover the regime from slip to free-molecular flow, with inlet Knudsen numbers between 0.183 and 18.27 and resulting inlet Mach numbers between 0.037 and 0.027. The pressure and velocity profiles from U3DSMC-KM simulations are compared with analytical solutions obtained from first-order and second-order slip boundary conditions. Mass flow rates from U3DSMC-KM are compared with validated analytical solutions for the entire Knudsen number regime considered. Error and sensitivity analysis is performed and numerical fractional errors are in agreement with theoretical

  8. Conditioning a segmented stem profile model for two diameter measurements

    Treesearch

    Raymond L. Czaplewski; Joe P. Mcclure

    1988-01-01

    The stem profile model of Max and Burkhart (1976) is conditioned for dbh and a second upper stem measurement. This model was applied to a loblolly pine data set using diameter outside bark at 5.3m (i.e., height of 17.3 foot Girard form class) as the second upper stem measurement, and then compared to the original, unconditioned model. Variance of residuals was reduced...

  9. Methodology to measure fundamental performance parameters of x-ray detectors

    NASA Astrophysics Data System (ADS)

    Busse, Falko; Ruetten, Walter; Wischmann, Hans-Aloys; Geiger, Bernhard; Spahn, Martin F.; Bastiaens, Raoul J. M.; Ducourant, Thierry

    2001-06-01

    To judge the potential benefit of a new x-ray detector technology and to be able to compare different technologies, some standard performance measurements must be defined. In addition to technology-related parameters which may influence weight, shape, image distortions and readout speed, there are fundamental performance parameters which directly influence the achievable image quality and dose efficiency of x-ray detectors. A standardization activity for detective quantum efficiency (DQE) for static detectors is already in progress. In this paper we present a methodology for noise power spectrum (NPS), low frequency drop (LFD) and signal to electronic noise ratio (SENR), and the influence of these parameters on DQE. The individual measurement methods are described in detail with their theoretical background and experimental procedure. Corresponding technical phantoms have been developed. The design of the measurement methods and technical phantoms is tuned so that only minimum requirements are placed on the detector properties. The measurement methods can therefore be applied to both static and dynamic x-ray systems. Measurement results from flat panel imagers and II/TV systems are presented.

  10. Improving inferior vena cava filter retrieval rates with the define, measure, analyze, improve, control methodology.

    PubMed

    Sutphin, Patrick D; Reis, Stephen P; McKune, Angie; Ravanzo, Maria; Kalva, Sanjeeva P; Pillai, Anil K

    2015-04-01

    To design a sustainable process to improve optional inferior vena cava (IVC) filter retrieval rates based on the Define, Measure, Analyze, Improve, Control (DMAIC) methodology of the Six Sigma process improvement paradigm. DMAIC was employed to design and implement a quality improvement project to increase IVC filter retrieval rates at a tertiary academic hospital. Retrievable IVC filters were placed in 139 patients over a 2-year period. The baseline IVC filter retrieval rate (n = 51) was reviewed through a retrospective analysis, and two strategies were devised to improve the filter retrieval rate: (a) mailing of letters to clinicians and patients for patients who had filters placed within 8 months of implementation of the project (n = 43) and (b) prospective automated scheduling of a clinic visit 4 weeks after filter placement for all new patients (n = 45). The effectiveness of these strategies was assessed by measuring the filter retrieval rates and the estimated increase in revenue to interventional radiology. IVC filter retrieval rates increased from a baseline of 8% to 40% with the mailing of letters and to 52% with the automated scheduling of a clinic visit 4 weeks after IVC filter placement. The estimated revenue per 100 IVC filters placed increased from $2,249 to $10,518 with the mailing of letters and to $17,022 with the automated scheduling of a clinic visit. Using the DMAIC methodology, a simple and sustainable quality improvement intervention was devised that markedly improved IVC filter retrieval rates in eligible patients. Copyright © 2015 SIR. Published by Elsevier Inc. All rights reserved.

  11. Probiotics production and alternative encapsulation methodologies to improve their viabilities under adverse environmental conditions.

    PubMed

    Coghetto, Chaline Caren; Brinques, Graziela Brusch; Ayub, Marco Antônio Záchia

    2016-12-01

    Probiotic products are dietary supplements containing live microorganisms that produce beneficial health effects on the host by improving intestinal balance and nutrient absorption. Among probiotic microorganisms, those classified as lactic acid bacteria are of major importance to the food and feed industries. Probiotic cells can be produced using alternative carbon and nitrogen sources, such as agroindustrial residues, while also reducing process costs. On the other hand, the survival of probiotic cells in formulated food products, as well as in the host gut, is essential to their health benefits. Therefore, several cell microencapsulation techniques have been investigated as a way to improve cell viability and survival under adverse environmental conditions, such as the gastrointestinal milieu of hosts. In this review, different aspects of probiotic cells and the technologies of their related products are discussed, including the formulation of culture media and the cell microencapsulation techniques required to improve their survival in the host.

  12. Stabilizing Conditional Standard Errors of Measurement in Scale Score Transformations

    ERIC Educational Resources Information Center

    Moses, Tim; Kim, YoungKoung

    2017-01-01

    The focus of this article is on scale score transformations that can be used to stabilize conditional standard errors of measurement (CSEMs). Three transformations for stabilizing the estimated CSEMs are reviewed, including the traditional arcsine transformation, a recently developed general variance stabilization transformation, and a new method…

  13. Measurement of automobile exhaust emissions under realistic road conditions

    SciTech Connect

    Staab, J.; Schurmann, D.

    1987-01-01

    An exhaust gas measurement system for on-board use has been developed that enables the direct and continuous determination of exhaust mass emissions from vehicles on the road. Such measurements under realistic traffic conditions are a valuable supplement to measurements taken on test benches, although the latter remain necessary. In the last two years numerous test runs were undertaken. The reliability of the on-board system was demonstrated, and the test results provided a very informative view of the exhaust emission behavior of a vehicle on the road.

  14. Methodology for application of field rainfall simulator to revise c-factor database for conditions of the Czech Republic

    NASA Astrophysics Data System (ADS)

    Neumann, Martin; Dostál, Tomáš; Krása, Josef; Kavka, Petr; Davidová, Tereza; Brant, Václav; Kroulík, Milan; Mistr, Martin; Novotný, Ivan

    2016-04-01

    The presentation will introduce a methodology for determining the crop and cover management factor (C-factor) of the Universal Soil Loss Equation (USLE) using a field rainfall simulator. The aim of the project is to determine the C-factor value for the different phenophases of the main crops of the Central European region, while also taking into account different agrotechnical methods. By using the field rainfall simulator, it is possible to perform the measurements in specific phenophases, which is otherwise difficult to do given the variability and unpredictability of natural rainfall. Because of the number of measurements needed, two identical simulators will be used, operated by two independent teams following a coordinated methodology. The methodology will mainly specify the length of simulation, the rainfall intensity, and the sampling technique. The presentation includes a more detailed account of the methods selected. Because of the wide range of variable crops and soils, it is not possible to execute the measurements for all possible combinations. We therefore decided to perform the measurements for previously selected combinations of soils, crops and agrotechnologies that are the most common in the Czech Republic. During the experiments, the volume of the surface runoff and the amount of sediment will be measured in their temporal distribution, as well as several other important parameters. The key values of the 3D matrix of combinations of crop, agrotechnique and soil will be determined experimentally; the remaining values will be determined by interpolation or by model analogy. There are several methods used for C-factor calculation from measured experimental data, some of which are not suitable for the type of data gathered here. The presentation will discuss the benefits and drawbacks of these methods, as well as the final design of the method used. The problems concerning the selection of a relevant measurement method as well as the final
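    By definition, the USLE C-factor compares soil loss from the cropped plot with soil loss from a bare-fallow reference plot under the same (here simulated) rainfall. A minimal sketch with invented plot data (the project's actual protocol and numbers are not specified in this record):

```python
def c_factor(soil_loss_crop, soil_loss_fallow):
    """USLE cover-management factor: soil loss from the cropped plot
    divided by soil loss from a bare-fallow reference plot that
    received the same (simulated) rainfall."""
    if soil_loss_fallow <= 0:
        raise ValueError("reference (fallow) soil loss must be positive")
    return soil_loss_crop / soil_loss_fallow

def mean(xs):
    return sum(xs) / len(xs)

# Invented simulator runs (kg of sediment per plot per run):
losses_crop   = [1.8, 2.2, 2.0]  # cropped plot, one phenophase
losses_fallow = [5.0, 5.5, 4.5]  # bare-fallow reference plot

print(round(c_factor(mean(losses_crop), mean(losses_fallow)), 2))  # → 0.4
```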

  15. Measurement of apolipoproteins B and A by radial immunodiffusion: methodological assessment and clinical applications.

    PubMed

    Cano, M D; Gonzalvo, C; Scheen, A J; Castillo, M J

    1994-01-01

    The clinical evaluation of apolipoproteins is of interest in order to characterize the risk profile for ischemic heart disease in both normolipidemic and hyperlipidemic subjects. In the non-specialized and/or small clinical laboratory, the measurement of some apolipoproteins can be undertaken by simple immunological methods, among which radial immunodiffusion is of interest for its simplicity of use and because it does not require specific equipment. In this work several methodological questions concerning the measurement of plasma apolipoproteins B and A by radial immunodiffusion have been addressed; the results show that this method is particularly reliable for the apo B assay. Regression analysis between values obtained with radial immunodiffusion and radioimmunoassay yielded r = 0.972 for apo B and r = 0.782 for apo A. The recovery rate was above 90% for both apolipoproteins (93.8% for apo B and 99.5% for apo A). The inter- and intra-assay coefficients of variation were below 5%, and the detection limits were estimated as 9.6 mg/dl for apo A and 6.9 mg/dl for apo B. Neither the ingestion of a standard breakfast (500 Cal, 17 g fat, 120 mg cholesterol) 2 h prior to testing nor freezing the sample significantly affected the measurement of apolipoproteins B and A. Mean plasma concentrations of both apolipoproteins measured by radial immunodiffusion in normo- and hyperlipidemic subjects are also presented.

  16. Design of measurement methodology for the evaluation of human exposure to vibration in residential environments.

    PubMed

    Sica, G; Peris, E; Woodcock, J S; Moorhouse, A T; Waddington, D C

    2014-06-01

    Exposure-response relationships are important tools for policy makers to assess the impact of an environmental stressor on the populace. Their validity lies partly in their statistical strength, which is greatly influenced by the size of the sample from which the relationship is derived. As such, the derivation of meaningful exposure-response relationships requires estimates of vibration exposure at a large number of receiver locations. In the United Kingdom a socio-vibrational survey has been conducted with the aim of deriving exposure-response relationships for annoyance due to vibration from (a) railway traffic and (b) the construction of a new light rail system. Response to vibration was measured via a questionnaire conducted face-to-face with residents in their own homes, and vibration exposure was estimated using data from a novel measurement methodology. In total, 1281 questionnaires were conducted: 931 for vibration from railway traffic and 350 for vibration from construction sources. Considering the interdisciplinary nature of this work along with the volume of experimental data required, a number of significant technical and logistical challenges needed to be overcome in the planning and implementation of the fieldwork. Four of these challenges are considered in this paper: identifying sites that provide a robust sample of the affected residents, the strategies used for measuring exposure, the strategies used for measuring response, and the coordination between the teams carrying out the social survey and the vibration measurements. © 2013 Elsevier B.V. All rights reserved.

  17. Chlorophyll-a determination via continuous measurement of plankton fluorescence: methodology development.

    PubMed

    Pinto, A M; Von Sperling, E; Moreira, R M

    2001-11-01

    A methodology is presented for the continuous measurement of the chlorophyll-a concentration due to plankton in surface water environments. A Turner 10-AU fluorometer equipped with the F4T5.B2/BP lamp (blue lamp), a Cs 5-60 equivalent excitation path filter, and a 680 nm emission filter has been used. This configuration allows the in vivo, in situ determination of chlorophyll-a by measuring the fluorescence due to the pigments. In the field work, the fluorometer and the data-logging and positioning equipment were placed aboard a small boat, which navigated a scheme of regularly spaced crossings. Some water samples were collected during the measurements for laboratory chlorophyll-a determination by the spectrophotometric method, thus providing for calibration and comparison. Spatial chlorophyll-a concentration distributions can thus be easily mapped in large water bodies such as reservoirs. Two distinct environments have been monitored: in the Vargem das Flores reservoir, chlorophyll-a concentrations varied between 0.7 and 2.6 mg/m3, whereas in the Lagoa Santa lake these values lay in the 12 to 18 mg/m3 range. The simplicity, versatility and economy of the method, added to the large amount of data that can be gathered in a single run, clearly justify its use in field environmental studies.

  18. Preferences for health care and self-management among Dutch adolescents with chronic conditions: a Q-methodological investigation.

    PubMed

    Jedeloo, Susan; van Staa, AnneLoes; Latour, Jos M; van Exel, N Job A

    2010-05-01

    Adolescents with chronic conditions have to learn to self-manage their health in preparation for the transition to adult care. Nurses often struggle with how to approach young people with chronic conditions successfully, and little is known about the preferences and attitudes of these young people themselves. The aim was to uncover the preferences for self-management and hospital care of adolescents with various chronic conditions. A Q-methodological study was conducted. Semi-structured interviews were held in which adolescents rank-ordered 37 opinion statements on preferences for care delivery and self-management and were asked to explain their rankings. By-person factor analysis was conducted to uncover patterns in the rankings of statements, and the resulting factors were described as preference profiles. A purposive sample of 66 adolescents (12-19 years) treated in a university children's hospital in the Netherlands was invited to participate. Thirty-one adolescents, 16 boys and 15 girls with various chronic conditions, eventually participated (response 47%). Eight participants (26%) had a recently acquired chronic condition, while the rest (74%) had been diagnosed at birth or in the first 5 years of life. Four distinct preference profiles for health care delivery and self-management were identified: 'Conscious & Compliant'; 'Backseat Patient'; 'Self-confident & Autonomous'; and 'Worried & Insecure'. The profiles differ in the level of independence, involvement with self-management, adherence to the therapeutic regimen, and appreciation of the parents' and health care providers' roles. The desire to participate in treatment-related decisions is important to all preference profiles. The profiles are recognizable to adolescents and nurses alike. As Q-methodology allows no inferences with respect to the relative distribution of these profiles in a given population, only tentative hypotheses were formulated about associations between profiles and patient characteristics.
This study increases our understanding of

  19. The Harmonizing Outcome Measures for Eczema (HOME) roadmap: a methodological framework to develop core sets of outcome measurements in dermatology.

    PubMed

    Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C

    2015-01-01

    Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcomes Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the consented COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care.

  20. Optimal condition for measurement observable via error-propagation

    NASA Astrophysics Data System (ADS)

    Zhong, Wei; Lu, Xiao Ming; Jing, Xiao Xing; Wang, Xiaoguang

    2014-09-01

    Propagation of error is a widely used estimation tool in experiments where the estimation precision of a parameter depends on the fluctuation of a physical observable; the observable that is chosen will therefore greatly affect the estimation sensitivity. Here we study the optimal observable for reaching the ultimate sensitivity set by the quantum Cramér-Rao theorem in parameter estimation. By invoking the Schrödinger-Robertson uncertainty relation, we derive the necessary and sufficient condition for an observable to saturate the ultimate sensitivity in single-parameter estimation. By applying this condition to Greenberger-Horne-Zeilinger states, we obtain the general expression of the optimal observable for separable measurements achieving Heisenberg-limited precision and show that it is closely related to the parity measurement. However, Jose et al (2013 Phys. Rev. A 87 022330) have claimed that the Heisenberg limit may not be obtained via separable measurements. We show that this claim is incorrect.
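    In standard notation (a summary in commonly used symbols, not a quotation from the paper), the error-propagation sensitivity for an observable \(\hat{A}\) and the bound it is compared against read:

```latex
\delta\theta \;=\; \frac{\Delta \hat{A}}{\bigl|\partial_{\theta}\langle \hat{A}\rangle\bigr|},
\qquad
(\Delta \hat{A})^{2} \;=\; \langle \hat{A}^{2}\rangle - \langle \hat{A}\rangle^{2},
\qquad
\delta\theta \;\ge\; \frac{1}{\sqrt{F_{Q}}},
```

    where \(F_{Q}\) is the quantum Fisher information; an optimal observable is one for which the left-hand expression saturates the quantum Cramér-Rao bound on the right.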

  1. An implication of novel methodology to study pancreatic acinar mitochondria under in situ conditions.

    PubMed

    Manko, Bohdan O; Klevets, Myron Yu; Manko, Volodymyr V

    2013-03-01

    Mitochondria maintain numerous energy-consuming processes in pancreatic acinar cells, yet the characteristics of pancreatic mitochondrial oxidative phosphorylation under native conditions are poorly studied. Moreover, it is not known which type of solution is most adequate to preserve the functions of pancreatic mitochondria in situ. Here we propose a novel experimental protocol suitable for in situ analysis of the metabolic states of pancreatic mitochondria. Isolated rat pancreatic acini were permeabilized with low doses of digitonin. Different metabolic states of the mitochondria were examined in KCl- and sucrose-based solutions using a Clark oxygen electrode. Respiration of digitonin-treated, unlike intact, acini was substantially intensified by succinate or a mixture of pyruvate plus malate. The substrate-stimulated respiration rate did not depend on solution composition. In sucrose-based solution, oligomycin inhibited State 3 respiration by 65.4% with succinate oxidation and by 60.2% with pyruvate plus malate oxidation, whereas in KCl-based solution the inhibition was 32.0% and 36.1%, respectively. Apparent respiratory control indices were considerably higher in sucrose-based solution. Rotenone or thenoyltrifluoroacetone severely inhibited respiration stimulated by pyruvate plus malate or succinate, respectively. This revealed low levels of non-mitochondrial oxygen consumption in permeabilized acinar cells. These results suggest a stronger coupling between respiration and oxidative phosphorylation in sucrose-based solution.

  2. Measuring domestic water use: a systematic review of methodologies that measure unmetered water use in low-income settings.

    PubMed

    Tamason, Charlotte C; Bessias, Sophia; Villada, Adriana; Tulsiani, Suhella M; Ensink, Jeroen H J; Gurley, Emily S; Mackie Jensen, Peter Kjaer

    2016-11-01

    To present a systematic review of methods for measuring domestic water use in settings where water meters cannot be used. We systematically searched the EMBASE, PubMed, Water Intelligence Online, Water Engineering and Development Center, IEEExplore, Scielo, and Science Direct databases for articles that reported methodologies for measuring water use at the household level where water metering infrastructure was absent or incomplete. A narrative review explored similarities and differences between the included studies and provided recommendations for future research on water use. A total of 21 studies were included in the review. Methods ranged from single-day to 14-consecutive-day visits, and water use recall ranged from 12 h to 7 days. Data were collected using questionnaires, observations or both. Many studies only collected information on water that was carried into the household, and some failed to mention whether water was used outside the home. Water use in the selected studies ranged from 2 to 113 l per capita per day. No standardised methods for measuring unmetered water use were found, which brings into question the validity and comparability of studies that have measured unmetered water use. In future studies, it will be essential to define all components that make up water use and determine how they will be measured. A pre-study that involves observations and direct measurements during water collection periods (these will have to be determined through questioning) should be used to determine optimal methods for obtaining water use information in a survey. Day-to-day and seasonal variation should be included. A study that investigates water use recall is warranted to further develop standardised methods to measure water use; in the meantime, water use recall should be limited to 24 h or fewer. © 2016 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  3. Assessing Long-Term Wind Conditions by Combining Different Measure-Correlate-Predict Algorithms: Preprint

    SciTech Connect

    Zhang, J.; Chowdhury, S.; Messac, A.; Hodge, B. M.

    2013-08-01

    This paper significantly advances the hybrid measure-correlate-predict (MCP) methodology, enabling it to account for variations of both wind speed and direction. The advanced hybrid MCP method uses the recorded data of multiple reference stations to estimate the long-term wind condition at a target wind plant site. The results show that the accuracy of the hybrid MCP method is highly sensitive to the combination of the individual MCP algorithms and reference stations. It was also found that the best combination of MCP algorithms varies based on the length of the correlation period.
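    The core of any MCP algorithm is a transfer function fitted over the period when the reference station and the target site were measured concurrently, then applied to the reference station's long-term record. A minimal linear-regression sketch with invented wind speeds (a single reference station and simple least squares, not the paper's hybrid multi-station weighting):

```python
def fit_linear_mcp(ref, target):
    """Least-squares transfer function target = a * ref + b,
    fitted over the concurrent measurement period."""
    n = len(ref)
    mx, my = sum(ref) / n, sum(target) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(ref, target))
    sxx = sum((x - mx) ** 2 for x in ref)
    a = sxy / sxx
    return a, my - a * mx

def predict_long_term(ref_series, a, b):
    """Apply the fitted transfer function to the reference station's
    long-term record to estimate the target site's wind climate."""
    return [a * x + b for x in ref_series]

# Invented concurrent wind speeds [m/s]: target runs ~20% above reference.
ref_concurrent    = [4.0, 5.0, 6.0, 7.0, 8.0]
target_concurrent = [4.8, 6.0, 7.2, 8.4, 9.6]
a, b = fit_linear_mcp(ref_concurrent, target_concurrent)
estimate = predict_long_term([5.5, 6.5], a, b)
print(round(a, 3), round(b, 3))  # ≈ 1.2 0.0
```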

  4. Measurement Methodology of the Fissile Mass Flow Monitor for the HEU Transparency Implementation Instrumentation in Russia

    SciTech Connect

    Uckan, T.

    2001-06-28

    The highly enriched uranium (HEU) Transparency Agreement between the U.S. and the Russian Federation (RF) requires the implementation of transparency measures in the Russian facilities that are supplying product low enriched uranium (LEU) to the U.S. from down-blended weapon-grade HEU material. To satisfy the agreement's non-proliferation objectives, the U.S. DOE is implementing the fissile mass flow monitor (FMFM) instrumentation developed by Oak Ridge National Laboratory. The FMFM provides unattended, non-intrusive measurements of the ²³⁵U mass flow of the uranium hexafluoride (UF₆) gas in the process lines of the HEU, the LEU blend stock, and the resulting lower-assay product LEU (P-LEU) that is used for U.S. reactors. The instrumentation continuously traces the HEU flow through the blending point to the product LEU, enabling the U.S. to verify HEU material down-blending. The FMFM relies on detecting delayed gamma rays emitted from fission fragments carried by the UF₆ flow. A thermalized californium-252 (²⁵²Cf) neutron source, placed in an annular sleeve filled with moderator material that surrounds the pipe, is modulated by a neutron-absorbent shutter to induce fission in the UF₆. For this technique to be effective, the average range of the resulting fission fragments in the UF₆ gas must be smaller than the pipe diameter. The fission fragment range can be very large in low-density materials. Therefore, a methodology has been developed to determine the fission fragment range and its distribution in order to assess the fraction of the fission fragments that remain in the flow; this methodology is the primary topic of this paper.

  5. Measurement of additional shear during sludge conditioning and dewatering.

    PubMed

    Ormeci, Banu; Ahmad, Ayaz

    2009-07-01

    Optimum polymer dose is influenced both by the polymer demand of the sludge and the shear applied during conditioning. Sludge exposed to additional shear following conditioning will experience a decrease in cake solids concentration for the same polymer dose. Therefore, it is necessary to measure or quantify the additional shear in order to optimize the conditioning and dewatering. There is currently no direct or indirect method to achieve this. The main objective of this study was to develop a method based on torque rheology to measure the amount of shear that a sludge network experiences during conditioning and dewatering. Anaerobically digested sludge samples were exposed to increasing levels of mixing intensities and times, and rheological characteristics of samples were measured using a torque rheometer. Several rheological parameters were evaluated including the peak torque and totalized torque (area under the rheograms). The results of this study show that at the optimum polymer dose, a linear relationship exists between the applied shear and the area under the rheograms, and this relationship can be used to estimate an unknown amount of shear that the sludge was exposed to. The method is useful as a research tool to study the effect of shear on dewatering but also as an optimization tool in a dewatering automation system based on torque rheology.

  6. Optimizing conditions for E-and Z-ajoene formation from garlic juice using response surface methodology

    PubMed Central

    Yoo, Miyoung; Lee, Sanghee; Kim, Sunyoung; Shin, Dongbin

    2014-01-01

    The optimum conditions for the formation of E- and Z-ajoene from garlic juice mixed with soybean oil were determined using response surface methodology. A central composite design was used to investigate the effects of three independent variables: temperature (°C, X1), reaction time (h, X2), and oil amount (w/w, X3). The dependent variables were Z-ajoene (Y1) and E-ajoene (Y2) in oil-macerated garlic. The optimal conditions found by ridge analysis were 98.80°C, 6.87 h, and an oil-to-juice ratio (w/w) of 2.57 for E-ajoene, and 42.24°C, 9.71 h, and a ratio of 3.08 for Z-ajoene. These conditions gave predicted E- and Z-ajoene values of 234.17 and 752.62 μg/g of garlic juice, respectively; the corresponding experimental values were 222.75 and 833.59 μg/g. The estimated maximum values at the predicted optimum conditions were in good agreement with the experimental values. PMID:25473520

  7. Application of nitric oxide measurements in clinical conditions beyond asthma

    PubMed Central

    Malinovschi, Andrei; Ludviksdottir, Dora; Tufvesson, Ellen; Rolla, Giovanni; Bjermer, Leif; Alving, Kjell; Diamant, Zuzana

    2015-01-01

    Fractional exhaled nitric oxide (FeNO) is a convenient, non-invasive method for the assessment of active, mainly Th2-driven, airway inflammation, which is sensitive to treatment with standard anti-inflammatory therapy. Consequently, FeNO serves as a valuable tool to aid diagnosis and monitoring in several asthma phenotypes. More recently, FeNO has been evaluated in several other respiratory, infectious, and/or immunological conditions. In this short review, we provide an overview of several clinical studies and discuss the status of potential applications of NO measurements in clinical conditions beyond asthma. PMID:26672962

  8. Fuel Conditioning Facility Electrorefiner Model Predictions versus Measurements

    SciTech Connect

    D Vaden

    2007-10-01

    Electrometallurgical treatment of spent nuclear fuel is performed in the Fuel Conditioning Facility (FCF) at the Idaho National Laboratory (INL) by electrochemically separating uranium from the fission products and structural materials in a vessel called an electrorefiner (ER). To continue processing without waiting for sample analyses to assess process conditions, an ER process model predicts the composition of the ER inventory and effluent streams via multicomponent, multi-phase chemical equilibrium for chemical reactions and a numerical solution to differential equations for electrochemical transport. The results of the process model were compared with measured ER data.

  9. Curriculum-based measurement of math problem solving: a methodology and rationale for establishing equivalence of scores.

    PubMed

    Montague, Marjorie; Penfield, Randall D; Enders, Craig; Huang, Jia

    2010-02-01

    The purpose of this article is to discuss curriculum-based measurement (CBM) as it is currently utilized in research and practice and to propose a new approach for developing measures to monitor the academic progress of students longitudinally. To accomplish this, we first describe CBM and provide several exemplars of CBM in reading and mathematics. Then, we present the research context for developing a set of seven curriculum-based measures for monitoring student progress in math problem solving. The rationale for and advantages of using statistical equating methodology are discussed. Details of the methodology as it was applied to the development of these math problem solving measures are provided.

  10. Exhaled nitric oxide measurements in the first 2 years of life: methodological issues, clinical and epidemiological applications

    PubMed Central

    Gabriele, Carmelo; de Benedictis, Fernando M; de Jongste, Johan C

    2009-01-01

    Fractional exhaled nitric oxide (FeNO) is a useful tool to diagnose and monitor eosinophilic bronchial inflammation in asthmatic children and adults. In children younger than 2 years of age, FeNO has been successfully measured both with the tidal breathing and with the single breath techniques. However, a number of methodological issues need to be addressed in order to increase the reproducibility of FeNO measurements within and between infants. Indeed, a standardized method to measure FeNO in the first 2 years of life would be extremely useful for meaningfully interpreting FeNO values in this age group. Several factors related to the measurement conditions have been found to influence FeNO, such as expiratory flow, ambient NO and nasal contamination. Furthermore, exposure to pre- and postnatal risk factors for respiratory morbidity has been shown to influence FeNO values. Therefore, these factors should always be assessed, and their association with FeNO values in the specific study population should be evaluated and, if necessary, controlled for. There is consistent evidence suggesting that FeNO is increased in infants with a family history of atopy/atopic diseases and in infants with recurrent wheezing. These findings could support the hypothesis that eosinophilic bronchial inflammation is present at an early stage in those infants at increased risk of developing persistent respiratory symptoms and asthma. Furthermore, it has been shown that FeNO measurements could represent a useful tool to assess bronchial inflammation in other airway diseases, such as primary ciliary dyskinesia, bronchopulmonary dysplasia and cystic fibrosis. Further studies are needed to improve the reproducibility of the measurements, and large prospective studies are warranted to evaluate whether FeNO values measured in the first years of life can predict the future development of asthma or other respiratory diseases. PMID:19712438

  11. Measuring hand hygiene compliance rates in different special care settings: a comparative study of methodologies.

    PubMed

    Magnus, Thyago Pereira; Marra, Alexandre R; Camargo, Thiago Zinsly Sampaio; Victor, Elivane da Silva; da Costa, Lidiane Soares Sodré; Cardoso, Vanessa Jonas; dos Santos, Oscar Fernando Pavão; Edmond, Michael B

    2015-04-01

    The purpose of this study was to compare methods for assessing compliance with hand hygiene in an intensive care unit (ICU), a step-down unit (SDU), and a hematology-oncology unit. Over a 20-week period, we compared hand hygiene compliance measurements by three different methods: direct observation, electronic handwash counter for alcohol gel, and measuring the volume of product used (alcohol gel) in an ICU, an SDU, and a hematology-oncology unit of a tertiary care, private hospital. By direct observation we evaluated 1078 opportunities in the ICU, 1075 in the SDU, and 517 in the hematology-oncology unit, with compliance rates of 70.7%, 75.4%, and 73.3%, respectively. A total of 342,299, 235,914, and 248,698 hand hygiene episodes were recorded by the electronic devices in the ICU, SDU, and hematology-oncology unit, respectively. There were also 127.2 ml, 85.3 ml, and 67.6 ml of alcohol gel used per patient-day in these units. We could find no correlation between the three methods. Hand hygiene compliance was reasonably high in these units, as measured by direct observation. However, a lack of correlation with results obtained by other methodologies brings into question the validity of direct observation results, and suggests that periodic audits using other methods may be needed.

  12. Methodological considerations in the measurement of institutional and structural forms of HIV discrimination.

    PubMed

    Chan, K Y; Reidpath, D D

    2005-07-01

    The systematic measurement of HIV/AIDS-related discrimination is imperative within the current rhetoric that holds discrimination as one of the two 'biggest' barriers to HIV/AIDS pandemic intervention. This paper provides a methodological critique of the UNAIDS (2000b) Protocol for the Identification of Discrimination against People Living with HIV (the Protocol). Specifically, the paper focuses on the Protocol's capacity to accurately identify and measure institutional levels of HIV-related discrimination in a way that yields data that are reliable and comparable across time and contexts. Conceptual issues, including the Protocol's objective as an indicator versus a direct measure of discrimination and the role of the Protocol as a tool of research versus a tool of advocacy, are explored. Design issues such as the operationalization of discrimination, the appropriateness of indicator content, sampling and data collection strategies, and issues of scoring are also evaluated. It is hoped that the matters outlined will provide readers with ways of critically reflecting on and evaluating the findings of the research papers presented in this Special Issue, as well as pointing to ways of improving research design.

  13. Moving to Capture Children’s Attention: Developing a Methodology for Measuring Visuomotor Attention

    PubMed Central

    Coats, Rachel O.; Mushtaq, Faisal; Williams, Justin H. G.; Aucott, Lorna S.; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child’s development. However, methodological limitations currently make large-scale assessment of children’s attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of ‘Visual Motor Attention’ (VMA)—a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method’s core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults’ attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action). PMID:27434198

  14. Characteristics measurement methodology of the large-size autostereoscopic 3D LED display

    NASA Astrophysics Data System (ADS)

    An, Pengli; Su, Ping; Zhang, Changjie; Cao, Cong; Ma, Jianshe; Cao, Liangcai; Jin, Guofan

    2014-11-01

    Large-size autostereoscopic 3D LED displays are commonly used outdoors or in large indoor spaces, and have the properties of long viewing distance and relatively low light intensity at the viewing distance. The instruments used to measure the characteristics of such displays (crosstalk, inconsistency, chromatic dispersion, etc.) should therefore have a long working distance and high sensitivity. In this paper, we propose a measurement methodology based on a distribution photometer with a working distance of 5.76 m and an illumination sensitivity of 0.001 mlx. A display panel holder is fabricated and attached to the turning stage of the distribution photometer. Specific test images are loaded on the display separately, and the luminance data at a distance of 5.76 m from the panel are measured. The data are then transformed into the light intensity at the optimum viewing distance. From the definitions of the characteristics of 3D displays, the crosstalk, inconsistency, and chromatic dispersion can be calculated. Test results and an analysis of the characteristics of an autostereoscopic 3D LED display are presented.
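The two computations involved are simple: crosstalk compares the luminance leaking from the unintended view against the intended view's luminance, and far-field photometer data can be rescaled to another distance with the inverse-square law. A generic sketch; the formulas follow common display-metrology usage and the numbers are assumed, not the authors' data:

```python
def crosstalk_percent(l_leak, l_signal, l_black=0.0):
    """Crosstalk: luminance leaking into a view from the other channel,
    normalised by the intended view's luminance (black level subtracted)."""
    return 100.0 * (l_leak - l_black) / (l_signal - l_black)

def illuminance_at(e_measured, d_measured, d_target):
    """Inverse-square rescaling of illuminance from the photometer
    distance to the optimum viewing distance (point-source approximation)."""
    return e_measured * (d_measured / d_target) ** 2

ct = crosstalk_percent(12.0, 212.0, l_black=2.0)  # leak and signal in cd/m^2
e_view = illuminance_at(0.5, 5.76, 10.0)          # 0.5 lx measured at 5.76 m
```

An extended LED panel is not a point source at short range, so the inverse-square rescaling is only valid when both distances are large compared with the panel size.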

  15. Development of plant condition measurement - The Jimah Model

    NASA Astrophysics Data System (ADS)

    Evans, Roy F.; Syuhaimi, Mohd; Mazli, Mohammad; Kamarudin, Nurliyana; Maniza Othman, Faiz

    2012-05-01

    The Jimah Model is an information management model designed to facilitate analysis of machine condition by integrating diagnostic data with quantitative and qualitative information. The model treats data as a single strand of information - metaphorically a 'genome' of data. The 'genome' is structured to be representative of plant function and identifies the condition of selected components (or 'genes') in each machine. To date in industry, computer-aided work processes used with traditional industrial practices have been unable to consistently deliver a standard of information suitable for holistic evaluation of machine condition and change. Significantly, the re-engineered site strategies necessary for implementation of this "data genome concept" have resulted in enhanced knowledge and management of plant condition. In large plant with high initial equipment cost and subsequent high maintenance costs, accurate measurement of major component condition becomes central to whole-of-life management and replacement decisions. A case study following implementation of the model at a major power station site in Malaysia (Jimah) shows that modeling of plant condition and wear (in real time) can be made a practical reality.

  16. Temperature Distribution Measurement of The Wing Surface under Icing Conditions

    NASA Astrophysics Data System (ADS)

    Isokawa, Hiroshi; Miyazaki, Takeshi; Kimura, Shigeo; Sakaue, Hirotaka; Morita, Katsuaki; Japan Aerospace Exploration Agency Collaboration; Univ of Notre Dame Collaboration; Kanagawa Institute of Technology Collaboration; Univ of Electro-(UEC) Team, Comm

    2016-11-01

    A de- or anti-icing system is necessary for the safe operation of an aircraft. Icing occurs when supercooled water droplets freeze on impact with an object. In-flight icing can change the wing cross section, causing stall and, in the worst case, loss of the aircraft. It is therefore important to know the surface temperature of the wing when designing a de- or anti-icing system. In the aerospace field, temperature-sensitive paint (TSP) has been widely used to obtain the surface temperature distribution on a test article: the luminescent image from the TSP can be related to the temperature distribution. Using this TSP measurement system in an icing wind tunnel, we measured the surface temperature distribution of a wing model. The effect of icing conditions on the TSP measurement system is discussed.

  17. A new methodology for measurement of sludge residence time distribution in a paddle dryer using X-ray fluorescence analysis.

    PubMed

    Charlou, Christophe; Milhé, Mathieu; Sauceau, Martial; Arlabosse, Patricia

    2015-02-01

    Drying is a necessary step before the energetic valorization of sewage sludge. Paddle dryers allow working with such a complex material; however, little is known about sludge flow in this kind of process. This study sets up an original methodology for measuring sludge residence time distribution (RTD) in a continuous paddle dryer, based on the detection of mineral tracers by X-ray fluorescence. This accurate analytical technique offers a linear response to tracer concentration in dry sludge, and the protocol leads to good repeatability of RTD measurements. Its equivalence to RTD measurement by NaCl conductivity in sludge leachates is assessed. Moreover, it is shown that tracer solubility has no influence on the RTD: the liquid and solid phases have the same flow pattern. Applying the technique to sludge with different storage durations at 4 °C emphasizes the influence of this parameter on sludge RTD, and thus on paddle dryer performance: the mean residence time in a paddle dryer almost doubled between 24 and 48 h of storage under identical operating conditions.
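Once a tracer-concentration time series has been measured at the dryer outlet, the RTD and its mean follow from the standard moment formulas, E(t) = C(t)/∫C dt and t̄ = ∫t·E(t) dt. A sketch on synthetic, uniformly sampled data (all values assumed):

```python
import numpy as np

t = np.arange(0.0, 61.0, 5.0)            # sampling times (min), assumed
c = np.exp(-((t - 30.0) / 10.0) ** 2)    # tracer signal (e.g. XRF counts), assumed

e_t = c / np.sum(c)                      # discrete RTD E(t) on a uniform grid
mean_residence_time = np.sum(t * e_t)    # first moment of the RTD
```

On a uniform grid the spacing cancels out of the normalisation, so plain sums suffice; for unevenly spaced samples a quadrature rule would be needed instead.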

  18. Phoretic and Radiometric Force Measurements on Microparticles in Microgravity Conditions

    NASA Technical Reports Server (NTRS)

    Davis, E. James

    1996-01-01

    Thermophoretic, diffusiophoretic and radiometric forces on microparticles are being measured over a wide range of gas phase and particle conditions using electrodynamic levitation of single particles to simulate microgravity conditions. The thermophoretic force, which arises when a particle exists in a gas having a temperature gradient, is measured by levitating an electrically charged particle between heated and cooled plates mounted in a vacuum chamber. The diffusiophoretic force arising from a concentration gradient in the gas phase is measured in a similar manner except that the heat exchangers are coated with liquids to establish a vapor concentration gradient. These phoretic forces and the radiation pressure force acting on a particle are measured directly in terms of the change in the dc field required to levitate the particle with and without the force applied. The apparatus developed for the research and the experimental techniques are discussed, and results obtained by thermophoresis experiments are presented. The determination of the momentum and energy accommodation coefficients associated with molecular collisions between gas molecules and particles and the measurement of the interaction between electromagnetic radiation and small particles are of particular interest.
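The "change in dc field" readout rests on a simple force balance: at levitation the electrostatic force equals the particle's weight, so an added vertical force appears as a proportional change in the required levitation voltage. A sketch under that idealised balance (symbols and numbers are assumptions, not the apparatus's actual calibration):

```python
def phoretic_force(m_kg, v_without, v_with, g=9.81):
    """At balance q*E = m*g, with E proportional to the dc voltage, an
    extra downward-opposing force F changes the required voltage:
    F = m*g*(v_with/v_without - 1)."""
    return m_kg * g * (v_with / v_without - 1.0)

# A 1 ng particle needing 5% more dc voltage under a temperature gradient:
f_newtons = phoretic_force(1.0e-12, 100.0, 105.0)
```

The sign convention here is that a positive result means the extra force acts with gravity; a voltage decrease would yield a negative (upward) force.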

  19. Using Reported Rates of Sexually Transmitted Diseases to Illustrate Potential Methodological Issues in the Measurement of Racial and Ethnic Disparities.

    PubMed

    Chesson, Harrell W; Patel, Chirag G; Gift, Thomas L; Bernstein, Kyle T; Aral, Sevgi O

    2017-09-01

    Racial disparities in the burden of sexually transmitted diseases (STDs) have been documented and described for decades. Similarly, methodological issues and limitations in the use of disparity measures to quantify disparities in health have also been well documented. The purpose of this study was to use historic STD surveillance data to illustrate four of the most well-known methodological issues associated with the use of disparity measures. We manually searched STD surveillance reports to find examples of racial/ethnic distributions of reported STDs that illustrate key methodological issues in the use of disparity measures. The disparity measures we calculated included the black-white rate ratio, the Index of Disparity (weighted and unweighted by subgroup population), and the Gini coefficient. The 4 examples we developed included illustrations of potential differences in relative and absolute disparity measures, potential differences in weighted and nonweighted disparity measures, the importance of the reference point when calculating disparities, and differences in disparity measures in the assessment of trends in disparities over time. For example, the gonorrhea rate increased for all minority groups (relative to whites) from 1992 to 1993, yet the Index of Disparity suggested that racial/ethnic disparities had decreased. Although imperfect, disparity measures can be useful to quantify racial/ethnic disparities in STDs, to assess trends in these disparities, and to inform interventions to reduce these disparities. Our study uses reported STD rates to illustrate potential methodological issues with these disparity measures and highlights key considerations when selecting disparity measures for quantifying disparities in STDs.
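The disparity measures named above are straightforward to compute. A sketch of generic implementations; the definitions follow common usage (e.g. the Pearcy-Keppel Index of Disparity and a population-weighted Gini coefficient) and are not taken verbatim from the article:

```python
import numpy as np

def rate_ratio(rate, rate_ref):
    """Relative disparity of one group's rate versus a reference group."""
    return rate / rate_ref

def index_of_disparity(rates, ref):
    """Unweighted Index of Disparity: mean absolute deviation of subgroup
    rates from a reference rate, as a percentage of the reference."""
    rates = np.asarray(rates, dtype=float)
    return 100.0 * np.mean(np.abs(rates - ref)) / ref

def gini(rates, populations):
    """Population-weighted Gini coefficient across subgroups."""
    r = np.asarray(rates, dtype=float)
    p = np.asarray(populations, dtype=float) / np.sum(populations)
    mu = np.sum(r * p)
    # half the population-weighted mean absolute pairwise difference
    diff = np.sum(p[:, None] * p[None, :] * np.abs(r[:, None] - r[None, :]))
    return diff / (2.0 * mu)
```

Note how the choices discussed in the article show up directly: the rate ratio depends on which group is the reference, the Index of Disparity changes if subgroup weights are introduced, and the Gini coefficient is relative, so disparities can "shrink" even as all absolute rates rise.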

  20. Absolute gain measurement by the image method under mismatched condition

    NASA Technical Reports Server (NTRS)

    Lee, Richard Q.; Baddour, Maurice F.

    1987-01-01

    Purcell's image method for measuring the absolute gain of an antenna is particularly attractive for small test antennas. The method is simple to use and utilizes only one antenna with a reflecting plane to provide an image for the receiving antenna. However, the method provides accurate results only if the antenna is matched to its waveguide. In this paper, a waveguide junction analysis is developed to determine the gain of an antenna under mismatched conditions. Absolute gain measurements for two standard gain horn antennas have been carried out. Experimental results agree closely with published data.

  1. Wall Conditioning and Impurity Measurements in the PEGASUS Experiment

    NASA Astrophysics Data System (ADS)

    Ono, M.; Fonck, R.; Toonen, R.; Thorson, T.; Tritz, K.; Winz, G.

    1999-11-01

    Wall conditioning and impurity effects on plasma evolution are increasingly relevant to the PEGASUS program. Surface conditioning consists of hydrogen glow discharge cleaning (GDC) to remove water and oxides, followed by He GDC to reduce the hydrogen inventory. Isotope exchange measurements indicate that periodic He GDC almost eliminates uncontrolled fueling from gas desorbed from the limiting surfaces. Additional wall conditioning will include Ti gettering and/or boronization. Impurity monitoring is provided by the recent installation of a SPRED multichannel VUV spectrometer (wavelength range = 10-110 nm; 1 msec time resolution), several interference filter (IF) monochromators, and a multichannel Ross-filter SXR diode assembly (for CV, CVI, OVII, and OVIII). The IF monitors indicate increased C radiation upon contact of the plasma with the upper and lower limiters for highly elongated plasmas. This radiation appears correlated with a subsequent rollover in the plasma current, and motivates an upgrade to the poloidal limiters to provide better plasma-wall interaction control.

  2. Optimizing spray drying conditions of sour cherry juice based on physicochemical properties, using response surface methodology (RSM).

    PubMed

    Moghaddam, Arasb Dabbagh; Pero, Milad; Askari, Gholam Reza

    2017-01-01

    In this study, the effects of the main spray drying conditions, inlet air temperature (100-140 °C), maltodextrin concentration (MDC, 30-60%), and aspiration rate (AR, 30-50%), on the physicochemical properties of sour cherry powder, namely moisture content (MC), hygroscopicity, water solubility index (WSI), and bulk density, were investigated. The investigation employed response surface methodology, and the process conditions were optimized using this technique. The MC of the powder was negatively related to the linear effects of the MDC and inlet air temperature (IT) and directly related to the AR. Hygroscopicity of the powder was significantly influenced by the MDC: increasing the MDC in the juice decreased the hygroscopicity of the powder. MDC and inlet temperature had a positive effect, but the AR a negative effect, on the WSI of the powder. MDC and inlet temperature negatively affected the bulk density of the powder: increasing these two variables decreased the bulk density. The optimization procedure revealed that the following conditions resulted in a powder with the maximum solubility and minimum hygroscopicity: MDC = 60%, IT = 134 °C, and AR = 30%, with a desirability of 0.875.
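The "desirability of 0.875" refers to Derringer-type desirability functions, which map each response onto [0, 1] and combine them as a geometric mean. A generic sketch; the targets, ranges, and response values below are assumed for illustration, not the study's:

```python
import numpy as np

def d_larger_is_better(y, lo, hi):
    """1 at or above the target hi (e.g. solubility), 0 at or below lo."""
    return float(np.clip((y - lo) / (hi - lo), 0.0, 1.0))

def d_smaller_is_better(y, lo, hi):
    """1 at or below the target lo (e.g. hygroscopicity), 0 at or above hi."""
    return float(np.clip((hi - y) / (hi - lo), 0.0, 1.0))

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    return float(np.prod(ds) ** (1.0 / len(ds)))

D = overall_desirability([
    d_larger_is_better(92.0, 70.0, 95.0),   # WSI (%), assumed
    d_smaller_is_better(14.0, 12.0, 20.0),  # hygroscopicity (%), assumed
])
```

Because the combination is a geometric mean, any single response falling to zero desirability drives the overall score to zero, which is exactly the behaviour wanted when all targets must be met simultaneously.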

  3. Methodological considerations in service use assessment for children and youth with mental health conditions; issues for economic evaluation.

    PubMed

    Woolderink, M; Lynch, F L; van Asselt, A D I; Beecham, J; Evers, S M A A; Paulus, A T G; van Schayck, C P

    2015-05-01

    Economic evaluations are increasingly used in decision-making, and accurate measurement of service use is critical to economic evaluation. This qualitative study, based on expert interviews, aims to identify the best approaches to service use measurement for child mental health conditions and to identify problems with current methods. Results suggest considerable agreement on the strengths (e.g., availability of accurate instruments to measure service use) and weaknesses (e.g., lack of unit prices for services outside the health sector) of alternative approaches to service use measurement. Experts also identified some unresolved problems, for example the lack of uniform definitions for some mental health services.

  4. Quantification of Greenhouse Gas Emission Rates from strong Point Sources by Airborne IPDA-Lidar Measurements: Methodology and Experimental Results

    NASA Astrophysics Data System (ADS)

    Ehret, G.; Amediek, A.; Wirth, M.; Fix, A.; Kiemle, C.; Quatrevalet, M.

    2016-12-01

    We report on a new method and on the first demonstration to quantify emission rates from strong greenhouse gas (GHG) point sources using airborne Integrated Path Differential Absorption (IPDA) Lidar measurements. In order to build trust in the self-reported emission rates by countries, verification against independent monitoring systems is a prerequisite to check the reported budget. A significant fraction of the total anthropogenic emission of CO2 and CH4 originates from localized strong point sources of large energy production sites or landfills. Both are not monitored with sufficient accuracy by the current observation system. There is a debate whether airborne remote sensing could fill in the gap to infer those emission rates from budgeting or from Gaussian plume inversion approaches, whereby measurements of the GHG column abundance beneath the aircraft can be used to constrain inverse models. In contrast to passive sensors, the use of an active instrument like CHARM-F for such emission verification measurements is new. CHARM-F is a new airborne IPDA-Lidar devised for the German research aircraft HALO for the simultaneous measurement of the column-integrated dry-air mixing ratio of CO2 and CH4, commonly denoted as XCO2 and XCH4, respectively. It has successfully been tested in a series of flights over Central Europe to assess its performance under various reflectivity conditions and in a strongly varying topography like the Alps. The analysis of a methane plume measured in crosswind direction of a coal mine ventilation shaft revealed an instantaneous emission rate of 9.9 ± 1.7 kt CH4 yr-1. We discuss the methodology of our point source estimation approach and give an outlook on the CoMet field experiment scheduled in 2017 for the measurement of anthropogenic and natural GHG emissions by a combination of active and passive remote sensing instruments on research aircraft.
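The cross-wind mass-balance behind such a plume inversion is compact: integrate the column enhancement across the transect downwind of the source and multiply by the mean wind speed, Q = u ∫ΔX dy. A sketch with a synthetic Gaussian plume; all parameters are assumed illustration values, not CHARM-F data:

```python
import numpy as np

u = 5.0        # mean wind speed across the plume (m/s), assumed
sigma = 100.0  # cross-wind plume width (m), assumed
amp = 1.0e-6   # peak CH4 column enhancement (kg/m^2), assumed

y = np.linspace(-500.0, 500.0, 1001)           # cross-wind coordinate (m)
dX = amp * np.exp(-y**2 / (2.0 * sigma**2))    # column enhancement profile

# Emission rate = wind speed * cross-wind integral (trapezoidal rule).
q_kg_s = u * np.sum(0.5 * (dX[1:] + dX[:-1]) * np.diff(y))
q_kt_yr = q_kg_s * 3600 * 24 * 365.25 / 1.0e6  # convert kg/s to kt/yr
```

For the Gaussian profile the integral has the closed form amp·σ·√(2π), which makes a convenient consistency check on the numerical quadrature.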

  5. Modeling of the effect of freezer conditions on the principal constituent parameters of ice cream by using response surface methodology.

    PubMed

    Inoue, K; Ochi, H; Taketsuka, M; Saito, H; Sakurai, K; Ichihashi, N; Iwatsuki, K; Kokubo, S

    2008-05-01

    A systematic analysis was carried out by using response surface methodology to create a quantitative model of the synergistic effects of conditions in a continuous freezer [mix flow rate (L/h), overrun (%), cylinder pressure (kPa), drawing temperature (°C), and dasher speed (rpm)] on the principal constituent parameters of ice cream [rate of fat destabilization (%), mean air cell diameter (μm), and mean ice crystal diameter (μm)]. A central composite face-centered design was used for this study. Thirty-one combinations of the 5 above-mentioned freezer conditions were designed (including replicates at the center point), and ice cream samples were manufactured and examined in a continuous freezer under the selected conditions. The responses were the 3 variables given above. A quadratic model was constructed, with the freezer conditions as the independent variables and the ice cream characteristics as the dependent variables. The coefficients of determination (R(2)) were greater than 0.9 for all 3 responses, but Q(2), the index used here for the capability of the model for predicting future observed values of the responses, was negative for both the mean ice crystal diameter and the mean air cell diameter. Therefore, pruned models were constructed by removing terms that had contributed little to the prediction in the original model and by refitting the regression model. It was demonstrated that these pruned models provided good fits to the data in terms of R(2), Q(2), and ANOVA. The effects of freezer conditions were expressed quantitatively in terms of the 3 responses. The drawing temperature (°C) was found to have a greater effect on ice cream characteristics than any of the other factors.
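A quadratic response surface model of this kind is ordinary least squares on a design matrix containing intercept, linear, squared, and pairwise interaction columns, with R² computed from the residuals. A sketch on synthetic, noise-free data with two coded factors (the coefficients are assumed for illustration, not the study's):

```python
import numpy as np

def quadratic_design_matrix(X):
    """Full quadratic RSM model in k factors: intercept, linear,
    squared, and pairwise interaction terms."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(31, 2))          # 31 runs, 2 coded factors
y = 5.0 + 2.0 * X[:, 0] - 3.0 * X[:, 1] \
    + 1.5 * X[:, 0] ** 2 + 0.5 * X[:, 0] * X[:, 1]

D = quadratic_design_matrix(X)
beta, *_ = np.linalg.lstsq(D, y, rcond=None)       # fitted coefficients
y_hat = D @ beta
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

"Pruning" the model as the abstract describes amounts to dropping columns of D whose coefficients contribute little and refitting; Q², unlike R², is computed from leave-one-out (cross-validated) residuals, which is why it can go negative while R² stays above 0.9.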

  6. Use of response surface methodology to optimise environmental stress conditions on Penicillium glabrum, a food spoilage mould.

    PubMed

    Nevarez, Laurent; Vasseur, Valérie; Debaets, Stella; Barbier, Georges

    2010-01-01

    Fungi are ubiquitous microorganisms often associated with spoilage and biodeterioration of a large variety of foods and feedstuffs. Their growth may be influenced by temporary changes in intrinsic or environmental factors such as temperature, water activity, pH, preservatives, atmosphere composition, all of which may represent potential sources of stress. Molecular-based analyses of their physiological responses to environmental conditions would help to better manage the risk of alteration and potential toxicity of food products. However, before investigating molecular stress responses, appropriate experimental stress conditions must be precisely defined. Penicillium glabrum is a filamentous fungus widely present in the environment and frequently isolated in the food processing industry as a contaminant of numerous products. Using response surface methodology, the present study evaluated the influence of two environmental factors (temperature and pH) on P. glabrum growth to determine 'optimised' environmental stress conditions. For thermal and pH shocks, a large range of conditions was applied by varying factor intensity and exposure time according to a two-factorial central composite design. Temperature and exposure duration varied from 30 to 50 °C and from 10 min to 230 min, respectively. The effects of interaction between both variables were observed on fungal growth. For pH, the duration of exposure, from 10 to 230 min, had no significant effect on fungal growth. Experiments were thus carried out on a range of pH from 0.15 to 12.50 for a single exposure time of 240 min. Based on fungal growth results, a thermal shock of 120 min at 40 °C or a pH shock of 240 min at 1.50 or 9.00 may therefore be useful to investigate stress responses to non-optimal conditions.

  7. Optimization of Fermentation Conditions for Recombinant Human Interferon Beta Production by Escherichia coli Using the Response Surface Methodology

    PubMed Central

    Morowvat, Mohammad Hossein; Babaeipour, Valiollah; Rajabi Memari, Hamid; Vahidi, Hossein

    2015-01-01

    Background: The periplasmic overexpression of recombinant human interferon beta (rhIFN-β)-1b using a synthetic gene in Escherichia coli BL21 (DE3) was optimized in shake flasks using Response Surface Methodology (RSM) based on the Box-Behnken Design (BBD). Objectives: This study aimed to predict and develop the optimal fermentation conditions for periplasmic expression of rhIFN-β-1b in shake flasks while keeping acetate excretion to a minimum, and to apply the best conditions to rhIFN-β production in a bench-top bioreactor. Materials and Methods: The process variables studied were the concentration of glucose as carbon source, cell density prior to induction (OD 600 nm), and induction temperature. A three-factor, three-level BBD was employed during the optimization process. The rhIFN-β production and the acetate excretion served as the evaluated responses. Results: The proposed optimum fermentation condition consisted of 7.81 g L-1 glucose, an OD 600 nm prior to induction of 1.66, and an induction temperature of 30.27°C. The model prediction of 0.267 g L-1 of rhIFN-β and 0.961 g L-1 of acetate at the optimum conditions was verified experimentally as 0.255 g L-1 of rhIFN-β and 0.981 g L-1 of acetate. This agreement between the predicted and observed values confirms the precision of the applied method in predicting the optimum conditions. Conclusions: It can be concluded that RSM is an effective method for the optimization of recombinant protein expression using synthetic genes in E. coli. PMID:26034535
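A three-factor, three-level Box-Behnken design in coded units is easy to generate: each factor pair takes all four (±1, ±1) combinations while the remaining factor sits at the centre, plus replicated centre points. A generic sketch (the centre-point count is an assumption, not taken from the study):

```python
from itertools import combinations

import numpy as np

def box_behnken(k, n_center=3):
    """Coded Box-Behnken design: every factor pair takes all four (+/-1)
    combinations while the other factors stay at 0, plus centre points."""
    rows = []
    for i, j in combinations(range(k), 2):
        for a in (-1.0, 1.0):
            for b in (-1.0, 1.0):
                row = [0.0] * k
                row[i], row[j] = a, b
                rows.append(row)
    rows.extend([[0.0] * k] * n_center)
    return np.array(rows)

design = box_behnken(3)  # 12 edge-midpoint runs + 3 centre points = 15 runs
```

The coded levels (-1, 0, +1) are then mapped linearly onto the actual factor ranges, e.g. glucose concentration, pre-induction OD 600 nm, and induction temperature.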

  8. Body temperature as a conditional response measure for pavlovian fear conditioning.

    PubMed

    Godsil, B P; Quinn, J J; Fanselow, M S

    2000-01-01

    On each of six days, rats were exposed to two contexts: they received an electric shock in one context and nothing in the other. The rats were later tested in each environment without shock. The rats froze and defecated more often in the shock-paired environment; they also exhibited a significantly larger elevation in rectal temperature in that environment. The rats discriminated between the contexts, and we suggest that the elevation in temperature is the consequence of associative learning. Thus, body temperature can be used as a conditional response measure in Pavlovian fear conditioning experiments that use footshock as the unconditional stimulus.

  9. Analysis and methodology for measuring oxygen concentration in liquid sodium with a plugging meter

    SciTech Connect

    Nollet, B. K.; Hvasta, M.; Anderson, M.

    2012-07-01

    Oxygen concentration in liquid sodium is a critical measurement in assessing the potential for corrosion damage in sodium-cooled fast reactors (SFRs). There has been little recent work on sodium reactors and oxygen detection; thus, the technical expertise in oxygen measurement within sodium is no longer readily available in the U.S. Two methods of oxygen detection that have been investigated are the plugging meter and the galvanic cell. One of the overall goals of the Univ. of Wisconsin's sodium research program is to develop an affordable, reliable galvanic cell oxygen sensor. Accordingly, attention must first be dedicated to a well-known standard, the plugging meter. A sodium loop has therefore been constructed on campus in an effort to develop the plugging meter technique and gain experience working with liquid metal. The loop contains both a galvanic cell test section and a plugging meter test section. Consistent plugging results have been achieved below 20 wppm, and a detailed process for achieving effective plugging has been developed. This paper will focus both on an accurate methodology for obtaining oxygen concentrations from a plugging meter, and on how to easily control the oxygen concentration of sodium in a test loop. Details of the design, materials, manufacturing, and operation will be presented. Data interpretation will also be discussed, since a modern discussion of plugging data interpretation does not currently exist. (authors)
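A plugging meter infers oxygen content from the temperature at which sodium oxide begins to precipitate and plug a cooled orifice. Converting a measured plugging (saturation) temperature to a concentration is commonly done with a Noden-type solubility correlation; the coefficients below are widely cited values but should be treated as an assumption, since the record does not state which correlation the authors use:

```python
def oxygen_wppm_from_plugging_temp(t_celsius):
    """Oxygen saturation concentration in sodium (wppm) at the plugging
    temperature, from a Noden-type correlation (assumed coefficients):

        log10 C[wppm] = 6.2571 - 2444.5 / T[K]
    """
    t_kelvin = t_celsius + 273.15
    return 10 ** (6.2571 - 2444.5 / t_kelvin)
```

A plugging temperature near 150°C corresponds to a few wppm of oxygen, consistent with the sub-20-wppm plugging results reported above.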

  10. Urea enzymatic hydrolysis reaction: optimization by response surface methodology based on potentiometric measurements.

    PubMed

    Deyhimi, Farzad; Bajalan, Maryam

    2008-11-01

    The enzymatic hydrolysis of urea by urease is optimized in this work by the chemometric response surface methodology (RSM), based on initial-rate potentiometric measurements using an NH(4)(+) ion-selective electrode (ISE). In this investigation, the ranges of the critical variables determined by a preliminary "one variable at a time" (OVAT) procedure were used as input for the subsequent RSM chemometric analysis. The RSM quadratic response was found to be quite appropriate for modeling and optimization of the hydrolysis reaction, as illustrated by the relatively high determination coefficient (R(2)=90.1%) along with the satisfactory results obtained by the analysis of variance (ANOVA). All the evaluated analytical characteristics of the optimized method, such as the linear calibration curve, the upper and lower detection limits, the within-day precision at low and high levels, the assay recovery in pooled serum media, and the activation kinetic parameters, are also reported. Further, in order to check the quality of the optimization and the validity of the model, assays of urea in both aqueous laboratory samples and human serum samples were performed. It should be noted that the kinetic initial-rate measurement method used in this work made it possible to overcome the general problem of the low selectivity of NH(4)(+) ISEs against Na(+) and K(+) interfering ions in real samples.
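The initial-rate approach above amounts to fitting the early, quasi-linear portion of the NH4+ signal versus time. A minimal sketch of that slope estimate in Python; the helper name and the choice of the first five points are illustrative assumptions, and the paper's calibration from ISE potential to concentration is not shown:

```python
def initial_rate(t, y, n=5):
    """Least-squares slope of the first n (time, signal) points,
    taken as the initial-rate estimate of the reaction."""
    t, y = t[:n], y[:n]
    m = len(t)
    mt, my = sum(t) / m, sum(y) / m
    num = sum((a - mt) * (b - my) for a, b in zip(t, y))
    den = sum((a - mt) ** 2 for a in t)
    return num / den
```

In practice only the points before substrate depletion or product inhibition become visible should enter the fit, which is why `n` is kept small.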

  11. Extremely low frequency electromagnetic field measurements at the Hylaty station and methodology of signal analysis

    NASA Astrophysics Data System (ADS)

    Kulak, Andrzej; Kubisz, Jerzy; Klucjasz, Slawomir; Michalec, Adam; Mlynarczyk, Janusz; Nieckarz, Zenon; Ostrowski, Michal; Zieba, Stanislaw

    2014-06-01

    We present the Hylaty geophysical station, a high-sensitivity and low-noise facility for extremely low frequency (ELF, 0.03-300 Hz) electromagnetic field measurements, which enables a variety of geophysical and climatological research related to atmospheric, ionospheric, magnetospheric, and space weather physics. The first systematic observations of ELF electromagnetic fields at the Jagiellonian University were undertaken in 1994. At the beginning the measurements were carried out sporadically, during expeditions to sparsely populated areas of the Bieszczady Mountains in the southeast of Poland. In 2004, an automatic Hylaty ELF station was built there, in a very low electromagnetic noise environment, which enabled continuous recording of the magnetic field components of the ELF electromagnetic field in the frequency range below 60 Hz. In 2013, after 8 years of successful operation, the station was upgraded by extending its frequency range up to 300 Hz. In this paper we show the station's technical setup, and how it has changed over the years. We discuss the design of ELF equipment, including antennas, receivers, the time control circuit, and power supply, as well as antenna and receiver calibration. We also discuss the methodology we developed for observations of the Schumann resonance and wideband observations of ELF field pulses. We provide examples of various kinds of signals recorded at the station.

  12. Measuring the impact of arthritis on worker productivity: perspectives, methodologic issues, and contextual factors.

    PubMed

    Tang, Kenneth; Escorpizo, Reuben; Beaton, Dorcas E; Bombardier, Claire; Lacaille, Diane; Zhang, Wei; Anis, Aslam H; Boonen, Annelies; Verstappen, Suzanne M M; Buchbinder, Rachelle; Osborne, Richard H; Fautrel, Bruno; Gignac, Monique A M; Tugwell, Peter S

    2011-08-01

    Leading up to the Outcome Measures in Rheumatology (OMERACT) 10 meeting, the goal of the Worker Productivity Special Interest Group (WP-SIG) was to make progress on 3 key issues related to the application and interpretation of worker productivity outcomes in arthritis: (1) to review existing conceptual frameworks to help consolidate our intended target and scope of measurement; (2) to examine the methodologic issues associated with our goal of combining multiple indicators of worker productivity loss (e.g., absenteeism and presenteeism) into a single comprehensive outcome; and (3) to examine the relevant contextual factors of work and their potential implications for the interpretation of scores derived from existing outcome measures. Progress was made on all 3 issues at OMERACT 10. We identified 3 theoretical frameworks that offered unique but converging perspectives on worker productivity loss and/or work disability to provide guidance with classification, selection, and future recommendation of outcomes. Several measurement and analytic approaches to combining absenteeism and presenteeism outcomes were proposed, and the need for further validation of such approaches was also recognized. Finally, participants at the WP-SIG were engaged to brainstorm and provide preliminary endorsements of key contextual factors of worker productivity through an anonymous "dot voting" exercise. A total of 24 specific factors were identified, with 16 receiving ≥ 1 vote among members, reflecting highly diverse views on which factors were considered most important. Moving forward, further progress on these issues remains a priority to help inform the best application of worker productivity outcomes in arthritis research.
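One of the simplest candidate approaches for issue (2), combining absenteeism and presenteeism into a single loss score, is a multiplicative form in which presenteeism losses apply only to the time actually spent at work. This sketch is an illustrative assumption, not a method endorsed by the WP-SIG:

```python
def total_productivity_loss(absenteeism, presenteeism):
    """Combine absenteeism and presenteeism fractions (both in 0..1)
    into a single loss score, assuming presenteeism loss applies only
    to the fraction of time the worker is present:

        total = absenteeism + (1 - absenteeism) * presenteeism
    """
    return absenteeism + (1 - absenteeism) * presenteeism
```

The form guarantees the combined score stays in [0, 1] and equals 1 whenever the worker is fully absent, which a naive sum of the two indicators does not.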

  13. H/L ratio as a measurement of stress in laying hens - methodology and reliability.

    PubMed

    Lentfer, T L; Pendl, H; Gebhardt-Henrich, S G; Fröhlich, E K F; Von Borell, E

    2015-04-01

    Measuring the ratio of heterophils to lymphocytes (H/L) in response to different stressors is a standard tool for assessing long-term stress in laying hens, but detailed information on the reliability of the measurements, on measurement techniques and methods, and on absolute cell counts is often lacking. Laying hens offered different nest-box sites at different ages were compared in a two-treatment crossover experiment to provide detailed information on the measurement procedure and on the difficulties of interpreting H/L ratios under commercial conditions. H/L ratios were pen-specific and depended on age and aviary system. There was no effect of nest position. Heterophil and lymphocyte counts were not correlated within individuals. Individuals differed in heterophil counts, lymphocyte counts, and H/L ratios, whereas absolute leucocyte counts were similar between individuals. The reliability of the method using relative cell counts was good, yielding a correlation coefficient between double counts of r > 0.9. It was concluded that population-based reference values may not be sensitive enough to detect individual stress reactions, that the H/L ratio may be of limited use as an indicator of stress under commercial conditions because of confounding factors, and that other, non-invasive measurements should be adopted.
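The reported double-count reliability (r > 0.9) is a Pearson correlation between repeated counts of the same samples. A minimal sketch of that check in Python; the function name and the example data are illustrative, not from the record:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between paired double counts."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Running this on first and second counts of the same smears gives the reliability coefficient the abstract cites; values above 0.9 indicate the relative-count method is reproducible.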

  14. Probability distributions of continuous measurement results for conditioned quantum evolution

    NASA Astrophysics Data System (ADS)

    Franquet, A.; Nazarov, Yuli V.

    2017-02-01

    We address the statistics of continuous weak linear measurement on a few-state quantum system that is subject to a conditioned quantum evolution. For a conditioned evolution, both the initial and final states of the system are fixed: the latter is achieved by postselection at the end of the evolution. The statistics may drastically differ from the nonconditioned case, and the interference between initial and final states can be observed in the probability distributions of measurement outcomes as well as in average values exceeding the conventional range of nonconditioned averages. We develop a proper formalism to compute the distributions of measurement outcomes, and evaluate and discuss the distributions in experimentally relevant setups. We demonstrate the manifestations of the interference between initial and final states in various regimes. We consider analytically simple examples of nontrivial probability distributions. We reveal peaks (or dips) at half-quantized values of the measurement outputs. We discuss in detail the case of zero overlap between initial and final states, demonstrating anomalously large average outputs and a sudden jump in the time-integrated output. We present and discuss a numerical evaluation of the probability distribution, aiming to extend the analytical results and to describe a realistic experimental situation of a qubit in the regime of resonant fluorescence.

  15. Noninvasive measurement system for human respiratory condition and body temperature

    NASA Astrophysics Data System (ADS)

    Toba, Eiji; Sekiguchi, Sadamu; Nishimatsu, Toyonori

    1995-06-01

    A special chromel (C) and alumel (A) wire thermopile has been developed that can measure the human respiratory condition and body temperature without a sensor directly contacting the body. The measurement system enables high-speed, real-time, noninvasive, and simultaneous measurement of respiratory rate and body temperature with the same sensor. The special CA thermopile, with each sensing junction approximately 25 μm, was constructed using spot-welded thermopile junctions. The thermoelectric power of the 17-pair special CA thermopile is 0.7 mV/°C. The special CA thermopile provides high sensitivity and fine frequency characteristics, with a gain that is flat to approximately 10 Hz.

  16. Extension of laboratory-measured soil spectra to field conditions

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.; Weismiller, R. A.; Biehl, L. L.; Robinson, B. F.

    1982-01-01

    Spectral responses of two glaciated soils, Chalmers silty clay loam and Fincastle silt loam, formed under prairie grass and forest vegetation, respectively, were measured in the laboratory under controlled moisture equilibria using an Exotech Model 20C spectroradiometer under artificial illumination. The same spectroradiometer was used outdoors under solar illumination to obtain the spectral response of dry and moistened field plots, with and without corn residue cover, representing the two soils. Results indicate that laboratory-measured spectra of moist soil are directly proportional to the spectral response of the same moist bare soil measured in the field over the 0.52 micrometer to 1.75 micrometer wavelength range. The differences in spectral response between identically treated Chalmers and Fincastle soils are greatest in the 0.6 micrometer to 0.8 micrometer transition region between the visible and near infrared, regardless of the field condition or laboratory preparation studied.
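The reported direct proportionality between laboratory and field spectra can be quantified as a least-squares slope constrained through the origin. A minimal sketch; the helper name and reflectance values are hypothetical, for illustration only:

```python
def proportionality_constant(lab, field):
    """Least-squares slope through the origin relating lab-measured
    to field-measured reflectance, i.e. field ~= k * lab."""
    return sum(l * f for l, f in zip(lab, field)) / sum(l * l for l in lab)
```

A single constant k fitting well across the 0.52-1.75 micrometer range is exactly what "directly proportional" means operationally here.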

  17. Thermophysical Properties Measurement of High-Temperature Liquids Under Microgravity Conditions in Controlled Atmospheric Conditions

    NASA Technical Reports Server (NTRS)

    Watanabe, Masahito; Ozawa, Shumpei; Mizuno, Akotoshi; Hibiya, Taketoshi; Kawauchi, Hiroya; Murai, Kentaro; Takahashi, Suguru

    2012-01-01

    Microgravity conditions offer advantages for measuring the surface tension and viscosity of metallic liquids by the oscillating drop method with an electromagnetic levitation (EML) device. We are therefore preparing thermophysical property measurements using the Materials-Science Laboratory ElectroMagnetic-Levitator (MSL-EML) facilities on the International Space Station (ISS). Recently, it has been recognized that the dependence of surface tension on oxygen partial pressure (Po2) must be considered for the industrial application of surface tension values. An effect of Po2 on surface tension would also change the apparent viscosity derived from the damped-oscillation model; surface tension and viscosity must therefore be measured simultaneously under the same atmospheric conditions. Moreover, the effect of the electromagnetic force (EMF) on the surface oscillations must be clarified in order to obtain ideal surface oscillations, because the EMF acts as an external force on the oscillating droplets and an excessive EMF inflates the apparent viscosity values. Using the parabolic flight levitation experimental facilities (PFLEX), our group systematically investigated the effects of Po2 and of the external EMF on the surface oscillations of levitated liquid droplets, with a view to precise measurements of the surface tension and viscosity of high-temperature liquids in future ISS experiments. We observed the surface oscillations of levitated liquid alloys using PFLEX in flight experiments on board a Gulfstream II (G-II) airplane operated by DAS, under controlled Po2 and suitable EMF conditions. From these experiments we obtained density, viscosity, and surface tension values for liquid Cu. We compare these results with previously reported data and discuss how the surface oscillations change with the EMF conditions.
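In the oscillating drop method referred to above, surface tension follows from the l=2 oscillation frequency (Rayleigh relation) and viscosity from the damping time (Lamb relation) of a freely oscillating drop. A minimal sketch of those textbook relations; the function names and the numerical values in the usage note are illustrative, not taken from the record:

```python
import math

def surface_tension_rayleigh(freq_hz, mass_kg):
    """Surface tension from the l=2 oscillation frequency of a levitated
    drop, via the Rayleigh relation f^2 = 8*sigma / (3*pi*m)."""
    return 3 * math.pi * mass_kg * freq_hz ** 2 / 8

def viscosity_lamb(decay_time_s, mass_kg, radius_m):
    """Viscosity from the oscillation damping time, via Lamb's relation
    tau = 3*m / (20*pi*R*eta)."""
    return 3 * mass_kg / (20 * math.pi * radius_m * decay_time_s)
```

These idealized relations are exactly what a Po2 shift or an excessive EMF distorts: a changed surface tension moves the oscillation frequency, and any extra damping inflates the apparent viscosity.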

  18. Detection of Chamber Conditioning Through Optical Emission and Impedance Measurements

    NASA Technical Reports Server (NTRS)

    Cruden, Brett A.; Rao, M. V. V. S.; Sharma, Surendra P.; Meyyappan, Meyya

    2001-01-01

    During oxide etch processes, buildup of fluorocarbon residues on reactor sidewalls can cause run-to-run drift and will necessitate some time for conditioning and seasoning of the reactor. Though diagnostics can be applied to study and understand these phenomena, many of them are not practical for use in an industrial reactor. For instance, measurements of ion fluxes and energy by mass spectrometry show that the buildup of insulating fluorocarbon films on the reactor surface will cause a shift in both ion energy and current in an argon plasma. However, such a device cannot be easily integrated into a processing system. The shift in ion energy and flux will be accompanied by an increase in the capacitance of the plasma sheath. The shift in sheath capacitance can be easily measured by a common commercially available impedance probe placed on the inductive coil. A buildup of film on the chamber wall is expected to affect the production of fluorocarbon radicals, and thus the presence of such species in the optical emission spectrum of the plasma can be monitored as well. These two techniques are employed on a GEC (Gaseous Electronics Conference) Reference Cell to assess the validity of optical emission and impedance monitoring as a metric of chamber conditioning. These techniques are applied to experimental runs with CHF3 and CHF3/O2/Ar plasmas, with intermediate monitoring of pure argon plasmas as a reference case for chamber conditions.

  20. Positional uncertainty of isocontours: condition analysis and probabilistic measures.

    PubMed

    Pöthkow, Kai; Hege, Hans-Christian

    2011-10-01

    Uncertainty is ubiquitous in science, engineering and medicine. Drawing conclusions from uncertain data is the normal case, not an exception. While the field of statistical graphics is well established, only a few 2D and 3D visualization and feature extraction methods have been devised that consider uncertainty. We present mathematical formulations for uncertain equivalents of isocontours based on standard probability theory and statistics and employ them in interactive visualization methods. As input data, we consider discretized uncertain scalar fields and model these as random fields. To create a continuous representation suitable for visualization we introduce interpolated probability density functions. Furthermore, we introduce numerical condition as a general means in feature-based visualization. The condition number, which potentially diverges in the isocontour problem, describes how errors in the input data are amplified in feature computation. We show how the average numerical condition of isocontours aids the selection of thresholds that correspond to robust isocontours. Additionally, we introduce the isocontour density and the level crossing probability field; these two measures for the spatial distribution of uncertain isocontours are directly based on the probabilistic model of the input data. Finally, we adapt interactive visualization methods to evaluate and display these measures and apply them to 2D and 3D data sets. © 2011 IEEE
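For a single grid edge whose two endpoint values are modeled as independent Gaussians, the level-crossing probability mentioned above has a simple closed form: the isocontour crosses the edge when the isovalue separates the two endpoint values. A minimal sketch; the independence assumption is a simplification of the paper's full random-field model:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def level_crossing_probability(mu1, s1, mu2, s2, theta):
    """Probability that an isocontour at level theta crosses the edge
    between two grid points with independent Gaussian values
    N(mu1, s1^2) and N(mu2, s2^2)."""
    p1 = norm_cdf((theta - mu1) / s1)  # P(Y1 < theta)
    p2 = norm_cdf((theta - mu2) / s2)  # P(Y2 < theta)
    # crossing means exactly one endpoint lies below the level
    return p1 * (1 - p2) + (1 - p1) * p2
```

Evaluating this over all edges of a grid yields a field of crossing probabilities, one ingredient of the spatial measures the abstract describes.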

  1. Classification of heart valve condition using acoustic measurements

    SciTech Connect

    Clark, G.

    1994-11-15

    Prosthetic heart valves and the many great strides in valve design have been responsible for extending the life spans of many people with serious heart conditions. Even though prosthetic valves are extremely reliable, they are eventually susceptible to the long-term fatigue and structural failure effects expected of mechanical devices operating over long periods of time. The purpose of our work is to classify the condition of in vivo Bjork-Shiley Convexo-Concave (BSCC) heart valves by processing acoustic measurements of heart valve sounds. The structural failure of interest for BSCC valves is called single leg separation (SLS). SLS can occur if the outlet strut cracks and separates from the main structure of the valve. We measure acoustic opening and closing sounds (waveforms) using high-sensitivity contact microphones on the patient's thorax. For our analysis, we focus our processing and classification efforts on the opening sounds because they yield direct information about outlet strut condition with minimal distortion caused by energy radiated from the valve disc.

  2. Optimization of Manufacturing Conditions for Improving Storage Stability of Coffee-Supplemented Milk Beverage Using Response Surface Methodology

    PubMed Central

    Kim, Jae-Hoon; Oh, Duk-Geun; Kim, Moojoong; Chung, Donghwa

    2017-01-01

    This study aimed at optimizing the manufacturing conditions of a milk beverage supplemented with coffee, and at monitoring its physicochemical and sensory properties during storage. Raw milk, skim milk powder, coffee extract, and emulsifiers were used to manufacture the beverage. Two sucrose fatty acid esters, F110 and F160, were identified as suitable emulsifiers. The optimum manufacturing conditions, determined by response surface methodology (RSM) to satisfy both responses simultaneously, were a primary homogenization speed of 5,000 rpm and a sucrose fatty acid emulsifier addition of 0.207%. The particle size and zeta-potential of the beverage under the optimum conditions were 190.1 nm and -25.94±0.06 mV, respectively. In a comparison between the F110 group (GF110) and the F160 group (GF160) during storage, all samples maintained a pH of around 6.6 to 6.7, with no significant difference (p>0.05). In addition, GF110 showed a significantly higher zeta-potential than GF160 (p<0.05). The initial particle sizes of GF110 and GF160 were approximately 190.1 and 223.1 nm, respectively; however, the size distribution of GF160 tended to increase during storage, and the increase in GF160 particle size was visible in microphotographs taken during storage. The L* values gradually decreased in all groups, whereas the a* and b* values did not vary significantly (p>0.05). Compared with GF160, bitterness, floating cream, and rancid flavor were more pronounced in GF110. Based on these results, the sucrose fatty acid ester F110 appears to be a more suitable emulsifier than F160 for manufacturing this beverage, and also contributes to extending product shelf-life. PMID:28316475

  3. Effect of Culture Condition Variables on Human Endostatin Gene Expression in Escherichia coli Using Response Surface Methodology

    PubMed Central

    Mohajeri, Abbas; Pilehvar-Soltanahmadi, Yones; Abdolalizadeh, Jalal; Karimi, Pouran; Zarghami, Nosratollah

    2016-01-01

    Background Recombinant human endostatin (rhES) is an angiogenesis inhibitor used as a specific drug for the treatment of non-small-cell lung cancer. As mRNA concentration affects the recombinant protein expression level, any factor affecting mRNA concentration can alter the protein expression level. Response surface methodology (RSM) based on the Box-Behnken design (BBD) is a statistical tool for experimental design and for optimizing biotechnological processes. Objectives This investigation aimed to predict and develop the optimal culture conditions for mRNA expression of the synthetic human endostatin (hES) gene in Escherichia coli BL21 (DE3). Materials and Methods The hES gene was amplified, cloned, and expressed in the E. coli expression system. Three factors, isopropyl β-D-1-thiogalactopyranoside (IPTG) concentration, post-induction time, and cell density before induction, were selected as important factors. The mRNA expression level was determined using real-time PCR. The expression levels of hES mRNA under the different growth conditions were analyzed. SDS-PAGE and western blot analyses were carried out for further confirmation of expression of the gene of interest. Results A maximum rhES mRNA level of 376.16% was obtained under the following conditions: 0.6 mM IPTG, a post-induction time of 7 hours, and a cell density of 0.9 before induction. The rhES mRNA level was significantly correlated with post-induction time, IPTG concentration, and cell density before induction (P < 0.05). The expression of the hES gene was confirmed by western blot. Conclusions The obtained results indicate that RSM is an effective method for the optimization of culture conditions for hES gene expression in E. coli. PMID:27800134

  4. What about temperature? Measuring permeability at magmatic conditions.

    NASA Astrophysics Data System (ADS)

    Kushnir, Alexandra R. L.; Martel, Caroline; Champallier, Rémi; Reuschlé, Thierry

    2015-04-01

    The explosive potential of volcanoes is intimately linked to permeability, which is governed by the connectivity of the porous structure of the magma and surrounding edifice. As magma ascends, volatiles exsolve from the melt and expand, creating a gas phase within the conduit. In the absence of a permeable structure capable of dissipating these gases, the propulsive force of an explosive eruption arises from the gas expansion and the build up of subsurface overpressures. Thus, characterizing the permeability of volcanic rocks under in-situ conditions (high temperature and pressure) allows us to better understand the outgassing potential and explosivity of volcanic systems. Current studies of the permeabilities of volcanic rocks generally measure permeability at room temperature using gas permeameters or model permeability using analytic imaging. Our goal is to perform and assess permeability measurements made at high temperature and high pressure in the interest of approaching the permeability of the samples at magmatic conditions. We measure the permeability of andesitic samples expelled during the 2010 Mt. Merapi eruption. We employ and compare two protocols for measuring permeability at high temperature and under high pressure using argon gas in an internally heated Paterson apparatus with an isolated pore fluid system. We first use the pulse decay method to measure the permeability of our samples, then compare these values to permeability measurements performed under steady state flow. We consider the steady state flow method the more rigorous of the two protocols, as we are more capable of accounting for the temperature gradient within the entire pore fluid system. At temperatures in excess of 700°C and pressures of 100 MPa, permeability values plummet by several orders of magnitude. These values are significantly lower than those commonly reported for room temperature permeameter measurements. The reduction in permeability at high temperature is a
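The pulse-decay method mentioned above infers permeability from how quickly a small pressure step across the sample relaxes. A minimal sketch using a Brace-type single-reservoir simplification, in which the differential pressure decays exponentially; the function name, argument list, and this simplified solution are assumptions for illustration, and the authors' actual apparatus and analysis may differ:

```python
import math

def pulse_decay_permeability(times, dp, mu, beta, V, L, A):
    """Permeability from a pressure-pulse decay, assuming the simplified
    solution dp(t) = dp0 * exp(-alpha * t) with

        alpha = k * A / (mu * beta * V * L)   =>   k = alpha*mu*beta*V*L/A

    times [s], dp differential pressure [Pa], mu viscosity [Pa s],
    beta fluid compressibility [1/Pa], V reservoir volume [m^3],
    L sample length [m], A cross-sectional area [m^2].
    """
    n = len(times)
    ln_dp = [math.log(p) for p in dp]
    mt, ml = sum(times) / n, sum(ln_dp) / n
    slope = (sum((t - mt) * (l - ml) for t, l in zip(times, ln_dp))
             / sum((t - mt) ** 2 for t in times))
    alpha = -slope  # decay rate from the log-linear fit
    return alpha * mu * beta * V * L / A
```

At magmatic conditions the fluid properties (mu, beta for argon at high temperature) and the thermal gradient along the pore-fluid lines dominate the error budget, which is the difficulty the abstract alludes to.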

  5. Anonymous indexing of health conditions for a similarity measure.

    PubMed

    Song, Insu; Marsh, Nigel V

    2012-07-01

    A health social network is an online information service which facilitates information sharing between closely related members of a community with the same or a similar health condition. Over the years, many automated recommender systems have been developed for social networking in order to help users find their communities of interest. For health social networking, the ideal source of information for measuring similarities of patients is the medical information of the patients. However, it is not desirable that such sensitive and private information be shared over the Internet. This is also true for many other security sensitive domains. A new information-sharing scheme is developed where each patient is represented as a small number of (possibly disjoint) d-words (discriminant words) and the d-words are used to measure similarities between patients without revealing sensitive personal information. The d-words are simple words like "food,'' and thus do not contain identifiable personal information. This makes our method an effective one-way hashing of patient assessments for a similarity measure. The d-words can be easily shared on the Internet to find peers who might have similar health conditions.
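Because d-words are plain, non-identifying tokens, patient similarity can be computed from shared d-words alone. A Jaccard overlap is one natural choice of set-based measure; this is an illustrative assumption, since the record does not specify the exact similarity function used:

```python
def dword_similarity(dwords_a, dwords_b):
    """Jaccard similarity between two patients' d-word sets:
    |intersection| / |union|, in [0, 1]."""
    a, b = set(dwords_a), set(dwords_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)
```

Only the d-word sets ever need to be shared, so the sensitive assessments from which they were derived stay private, which is the point of the one-way hashing scheme described above.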

  6. The application of conditioning paradigms in the measurement of pain.

    PubMed

    Li, Jun-Xu

    2013-09-15

    Pain is a private experience that involves both sensory and emotional components. In animal studies, pain can only be inferred from responses, and the measurement of reflexive responses has therefore dominated the pain literature for nearly a century. It has been argued that although reflexive responses are important for unveiling the sensory nature of pain in organisms, pain affect is equally important but largely ignored in pain studies, primarily due to the lack of validated animal models. One strategy for beginning to understand pain affect is to use conditioning principles to indirectly reveal the affective component of pain. This review critically analyzes several procedures that are thought to measure affective learning of pain. Current knowledge of these procedures, their applications, and their advantages and disadvantages in pain research are discussed. It is proposed that these procedures should be combined with traditional reflex-based pain measurements in future studies of pain, which could greatly benefit both the understanding of the neural underpinnings of pain and the preclinical assessment of novel analgesics. © 2013 Elsevier B.V. All rights reserved.

  7. The application of conditioning paradigms in the measurement of pain

    PubMed Central

    Li, Jun-Xu

    2013-01-01

    Pain is a private experience that involves both sensory and emotional components. In animal studies, pain can only be inferred from responses, and the measurement of reflexive responses has therefore dominated the pain literature for nearly a century. It has been argued that although reflexive responses are important for unveiling the sensory nature of pain in organisms, pain affect is equally important but largely ignored in pain studies, primarily due to the lack of validated animal models. One strategy for beginning to understand pain affect is to use conditioning principles to indirectly reveal the affective component of pain. This review critically analyzes several procedures that are thought to measure affective learning of pain. Current knowledge of these procedures, their applications, and their advantages and disadvantages in pain research are discussed. It is proposed that these procedures should be combined with traditional reflex-based pain measurements in future studies of pain, which could greatly benefit both the understanding of the neural underpinnings of pain and the preclinical assessment of novel analgesics. PMID:23500202

  8. Optimization of entrapping conditions to improve the release of BMP-2 from PELA carriers by response surface methodology.

    PubMed

    Li, Xialin; Min, Shaoxiong; Zhao, Xiaoyan; Lu, Zhifang; Jin, Anmin

    2014-12-23

    A microcapsule prepared from the triblock copolymer poly(lactic acid)-poly(ethylene glycol)-poly(lactic acid) (PLA-PEG-PLA, PELA) was investigated as a controlled-release carrier for recombinant human bone morphogenetic protein-2 (rhBMP-2). The rhBMP-2/PELA microspheres were prepared using the water-in-oil-in-water (W/O/W) solvent evaporation method. This work was conducted to optimize the conditions for entrapping rhBMP-2 in the PELA copolymer. The effects on encapsulation efficiency (EE) of the molecular weight (MW) of the PEG block, the amount of PELA, the amount of rhBMP-2, the span-20 concentration, the polyvinyl alcohol (PVA) concentration, and the stirring time were tested. On the basis of single-factor experiments, the optimum parameters were obtained using response surface methodology (RSM). The results showed that the highest EE of rhBMP-2 was achieved with a span-20 concentration of 0.5%, a PEG MW of 4000 Da, stirring for 30 min at 800 rpm, 282.3 mg of PELA, 1 μg of rhBMP-2, and a PVA concentration of 0.79%. Under these optimal conditions, the highest achievable EE was predicted to be 76.5%; the actual EE achieved was 75%.

  9. Statistical Optimization of Ultraviolet Irradiate Conditions for Vitamin D2 Synthesis in Oyster Mushrooms (Pleurotus ostreatus) Using Response Surface Methodology

    PubMed Central

    Wu, Wei-Jie; Ahn, Byung-Yong

    2014-01-01

    Response surface methodology (RSM) was used to determine the optimum vitamin D2 synthesis conditions in oyster mushrooms (Pleurotus ostreatus). Ultraviolet B (UV-B) was selected as the most efficient irradiation source in a preliminary experiment, which also established the levels of the three independent variables: ambient temperature (25–45°C), exposure time (40–120 min), and irradiation intensity (0.6–1.2 W/m2). The statistical analysis indicated that, over the range studied, irradiation intensity was the most critical factor affecting vitamin D2 synthesis in oyster mushrooms. Under optimal conditions (ambient temperature of 28.16°C, UV-B intensity of 1.14 W/m2, and exposure time of 94.28 min), the experimental vitamin D2 content of 239.67 µg/g (dry weight) was in very good agreement with the predicted value of 245.49 µg/g, which verified the practicability of this strategy. Compared to fresh mushrooms, lyophilized mushroom powder can synthesize a remarkably higher level of vitamin D2 (498.10 µg/g) within a much shorter UV-B exposure time (10 min), and thus should receive attention from the food processing industry. PMID:24736742

  10. Current psychometric and methodological issues in the measurement of overgeneral autobiographical memory.

    PubMed

    Griffith, James W; Sumner, Jennifer A; Raes, Filip; Barnhofer, Thorsten; Debeer, Elise; Hermans, Dirk

    2012-12-01

    Autobiographical memory is a multifaceted construct that is related to psychopathology and other difficulties in functioning. Across many studies, a variety of methods have been used to study autobiographical memory. The relationship between overgeneral autobiographical memory (OGM) and psychopathology has been of particular interest, and many studies of this cognitive phenomenon rely on the Autobiographical Memory Test (AMT) to assess it. In this paper, we examine several methodological approaches to studying autobiographical memory, and focus primarily on methodological and psychometric considerations in OGM research. We pay particular attention to what is known about the reliability, validity, and methodological variations of the AMT. The AMT has adequate psychometric properties, but there is great variability in methodology across studies that use it. Methodological recommendations and suggestions for future studies are presented.

  11. High Accuracy Temperature Measurements Using RTDs with Current Loop Conditioning

    NASA Technical Reports Server (NTRS)

    Hill, Gerald M.

    1997-01-01

    To measure temperatures with a greater degree of accuracy than is possible with thermocouples, RTDs (Resistive Temperature Detectors) are typically used. Calibration standards use specialized high-precision RTD probes with accuracies approaching 0.001 °F. These are extremely delicate devices, and far too costly to be used in test facility instrumentation. Less costly sensors designed for aeronautical wind tunnel testing are available and can be readily adapted to probes, rakes, and test rigs. With proper signal conditioning of the sensor, temperature accuracies of 0.1 °F are obtainable. For reasons explored in this paper, the Anderson current loop is the preferred method of signal conditioning. This scheme has been used in NASA Lewis Research Center's 9 x 15 Low Speed Wind Tunnel and is detailed in this paper.
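
    The current-loop conditioning delivers an accurate resistance; converting that resistance to temperature is done with the Callendar-Van Dusen equation. The sketch below uses the standard IEC 60751 Pt100 coefficients for the range at and above 0 °C:

```python
import math

# IEC 60751 Callendar-Van Dusen coefficients for a standard Pt100 RTD (T >= 0 degC)
R0 = 100.0       # resistance in ohms at 0 degC
A = 3.9083e-3
B = -5.775e-7

def pt100_resistance(t_c):
    """Resistance (ohms) of a Pt100 at temperature t_c in degC (0..850 range)."""
    return R0 * (1.0 + A * t_c + B * t_c * t_c)

def pt100_temperature(r_ohm):
    """Invert R(T) = R0*(1 + A*T + B*T^2) with the quadratic formula."""
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r_ohm / R0))) / (2.0 * B)
```

    Below 0 °C a third coefficient C enters the polynomial and the inversion is usually done numerically rather than in closed form.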

  12. Closed-loop snowplow applicator control using road condition measurements

    NASA Astrophysics Data System (ADS)

    Erdogan, Gurkan; Alexander, Lee; Rajamani, Rajesh

    2011-04-01

    Closed-loop control of a snowplow applicator, based on direct measurement of the road surface condition, is a valuable technology for optimising winter road maintenance costs and for protecting the environment from the negative impacts of excessive use of de-icing chemicals. To this end, a novel friction measurement wheel was designed to provide a continuous measurement of the road friction coefficient, which is, in turn, used to control the applicator automatically on a snowplow. The automated applicator should deploy de-icing materials from the very beginning of any slippery surface detected by the friction wheel, so that no portion of the slippery road surface is left untreated as the snowplow travels over it at reasonably high speed. This paper describes the wheel-based measurement system, the friction estimation algorithm and the expected performance of the closed-loop applicator system. Conventional and zero-velocity applicators are introduced, and their hardware time delays are measured in addition to the time delay of the friction estimation algorithm. The overall performance of the closed-loop applicator control system is shown to be reliable at typical snowplowing speeds if the zero-velocity applicator is used.
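
    The measured hardware and algorithm delays matter because at plowing speed they translate directly into untreated pavement unless compensated. A minimal sketch of that dead-time compensation (all numbers hypothetical, not the paper's measured delays):

```python
def applicator_lead_distance(speed_mps, estimation_delay_s, applicator_delay_s):
    """Distance ahead of the friction wheel at which a detected slippery
    patch must trigger the applicator so that de-icing material lands at
    the start of the patch (simple dead-time compensation)."""
    return speed_mps * (estimation_delay_s + applicator_delay_s)

# e.g. 11 m/s (~40 km/h) plowing speed, 0.2 s estimation + 0.5 s applicator delay
lead = applicator_lead_distance(11.0, 0.2, 0.5)  # 7.7 m
```

    A zero-velocity applicator shrinks the second delay term, which is why the closed loop remains reliable at typical plowing speeds.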

  13. Innovative methodology for electrical conductivity measurements and metal partition in biosolid pellets

    NASA Astrophysics Data System (ADS)

    Jordan, Manuel Miguel; Rincón-Mora, Beatriz; Belén Almendro-Candel, María; Navarro-Pedreño, José; Gómez-Lucas, Ignacio; Bech, Jaume

    2017-04-01

    Use of biosolids to improve the nutrient content of a soil is a common practice. The obligation to restore abandoned mines and the correct application of biosolids are guaranteed by legislation on waste management, biosolids and soil conservation (Jordán et al. 2008). The present research was conducted to determine electrical conductivity in dry wastes (pellets) using an innovative methodology (Camilla and Jordán, 2009). In addition, the study was designed to examine the distribution of selected heavy metals in biosolid pellets and to relate the distribution patterns of these metals. In this context, heavy metal concentrations were studied in biosolid pellets under different pressures. Electrical conductivity measurements were taken in biosolid pellets under pressures on the order of 50 to 150 MPa and with currents of 10-15 A. Measurements of electrical conductivity and heavy metal content were taken for different areas (H1, H2, and H3). Total metal content was determined following microwave digestion and analysed by ICP/MS. Triplicate portions were weighed in polycarbonate centrifuge tubes and sequentially extracted. The distribution of chemical forms of Cd, Ni, Cr, and Pb in the biosolids was studied using a sequential extraction procedure that fractionates the metal into soluble-exchangeable, specifically sorbed-carbonate bound, oxidizable, reducible, and residual forms. The residual, reducible, and carbonate-sorbed forms were dominant. Higher Cr and Ni contents were detected in pellets made with biosolids from H3. The highest Cd and Ni values were detected in H2. The trends of the conductivity curves were similar for the sludge from the isolation surface (H1) and for the mesophilous area (H2). In the case of the thermophilous area (H3), the electrical conductivity showed extremely high values. This behaviour was similar for the Cr and Ni content. However, in the case of Cd and Pb, the highest values were detected in

  14. Innovative Methodologies for thermal Energy Release Measurement: case of La Solfatara volcano (Italy)

    NASA Astrophysics Data System (ADS)

    Marfe`, Barbara; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Marotta, Enrica; Peluso, Rosario

    2015-04-01

    This work is devoted to improving knowledge of the parameters that control the heat flux anomalies associated with diffuse degassing in volcanic and hydrothermal areas. The methodologies currently used to measure heat flux (i.e., via CO2 flux or temperature gradient) are either inefficient or of limited effectiveness, and are unable to detect short- to medium-term (days to months) trends in the heat flux. A new method, based on thermal imaging cameras, has been applied to estimate the heat flux and its variation over time. This approach allows faster heat flux measurement than the established methods, improving the definition of the activity state of a volcano and allowing a better assessment of the related hazard and risk mitigation. The idea is to extrapolate the heat flux from the ground surface temperature, which, in a purely conductive regime, is directly correlated with the shallow temperature gradient. We use thermal imaging cameras at short distances (meters to hundreds of meters) to quickly map areas with thermal anomalies and measure their temperature. Preliminary studies were carried out throughout the La Solfatara crater to investigate a possible correlation between the surface temperature and the shallow thermal gradient. We used a FLIR SC640 thermal camera and K-type thermocouples to take the two measurements at the same time. Results suggest a good correlation between the shallow temperature gradient ΔTs and the background-corrected surface temperature Ts, and although the campaigns took place over a period of a few years, this correlation appears to be stable over time. This is an extremely encouraging result for the further development of a measurement method based only on a short-range thermal imaging camera.
Surveys with thermal cameras may be manually done using a tripod to take thermal images of small contiguous areas and then joining
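
    The correlation described above can be turned into a heat flux estimate via Fourier's law. A minimal sketch, assuming a purely conductive regime and a site-specific linear calibration between the background-corrected surface temperature and the shallow gradient (the conductivity and calibration slope below are illustrative placeholders, not campaign values):

```python
def surface_heat_flux(t_surface_c, t_background_c, k_ground=1.0, cal_slope=25.0):
    """Estimate conductive heat flux (W/m^2) from a thermal-camera surface
    temperature, assuming (i) Fourier's law q = k * dT/dz in a purely
    conductive regime and (ii) an empirical linear calibration
    dT/dz = cal_slope * (Ts - Tbg) against thermocouple gradient data."""
    gradient = cal_slope * (t_surface_c - t_background_c)  # K/m
    return k_ground * gradient

q = surface_heat_flux(40.0, 20.0)  # 500.0 W/m^2 with these placeholder constants
```

    In practice the calibration slope is what the joint thermal-camera and thermocouple campaigns establish, and its stability over the years is what makes the camera-only method viable.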

  15. Measurement error and outcome distributions: Methodological issues in regression analyses of behavioral coding data.

    PubMed

    Holsclaw, Tracy; Hallgren, Kevin A; Steyvers, Mark; Smyth, Padhraic; Atkins, David C

    2015-12-01

    Behavioral coding is increasingly used for studying mechanisms of change in psychosocial treatments for substance use disorders (SUDs). However, behavioral coding data typically include features that can be problematic in regression analyses, including measurement error in independent variables, non-normal distributions of count outcome variables, and conflation of predictor and outcome variables with third variables, such as session length. Methodological research in econometrics has shown that these issues can lead to biased parameter estimates, inaccurate standard errors, and increased Type I and Type II error rates, yet these statistical issues are not widely known within SUD treatment research, or more generally, within psychotherapy coding research. Using minimally technical language intended for a broad audience of SUD treatment researchers, the present paper illustrates the nature in which these data issues are problematic. We draw on real-world data and simulation-based examples to illustrate how these data features can bias estimation of parameters and interpretation of models. A weighted negative binomial regression is introduced as an alternative to ordinary linear regression that appropriately addresses the data characteristics common to SUD treatment behavioral coding data. We conclude by demonstrating how to use and interpret these models with data from a study of motivational interviewing. SPSS and R syntax for weighted negative binomial regression models is included in online supplemental materials.
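
    The paper's models are fitted in SPSS and R; as a language-neutral illustration of the core idea, the stdlib-only sketch below evaluates a weighted negative binomial likelihood (dispersion fixed, intercept-only) and recovers the weighted mean count, with the weights standing in for an inverse-session-length adjustment. The counts and weights are made up for illustration:

```python
import math

def nb_logpmf(y, mu, k):
    """Negative binomial log-pmf with mean mu and dispersion k
    (variance mu + mu^2 / k)."""
    return (math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
            + k * math.log(k / (k + mu)) + y * math.log(mu / (k + mu)))

def fit_weighted_nb_mean(counts, weights, k=2.0):
    """Grid-search MLE of the NB mean under observation weights; for an
    intercept-only model this converges to the weighted sample mean."""
    best_mu, best_ll = None, -float("inf")
    mu = 0.01
    while mu <= 10.0:
        ll = sum(w * nb_logpmf(y, mu, k) for y, w in zip(counts, weights))
        if ll > best_ll:
            best_mu, best_ll = mu, ll
        mu += 0.001
    return best_mu

# behavioral-code counts per session and illustrative inverse-length weights
counts = [2, 5, 1, 8]
weights = [1.0, 0.5, 2.0, 0.25]
mu_hat = fit_weighted_nb_mean(counts, weights)  # ~ weighted mean 8.5/3.75
```

    A full regression version replaces the single mean with a log-linear predictor, which is what the supplemental SPSS and R syntax implements.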

  16. Measurement error and outcome distributions: Methodological issues in regression analyses of behavioral coding data

    PubMed Central

    Holsclaw, Tracy; Hallgren, Kevin A.; Steyvers, Mark; Smyth, Padhraic; Atkins, David C.

    2015-01-01

    Behavioral coding is increasingly used for studying mechanisms of change in psychosocial treatments for substance use disorders (SUDs). However, behavioral coding data typically include features that can be problematic in regression analyses, including measurement error in independent variables, non-normal distributions of count outcome variables, and conflation of predictor and outcome variables with third variables, such as session length. Methodological research in econometrics has shown that these issues can lead to biased parameter estimates, inaccurate standard errors, and increased type-I and type-II error rates, yet these statistical issues are not widely known within SUD treatment research, or more generally, within psychotherapy coding research. Using minimally-technical language intended for a broad audience of SUD treatment researchers, the present paper illustrates the nature in which these data issues are problematic. We draw on real-world data and simulation-based examples to illustrate how these data features can bias estimation of parameters and interpretation of models. A weighted negative binomial regression is introduced as an alternative to ordinary linear regression that appropriately addresses the data characteristics common to SUD treatment behavioral coding data. We conclude by demonstrating how to use and interpret these models with data from a study of motivational interviewing. SPSS and R syntax for weighted negative binomial regression models is included in supplementary materials. PMID:26098126

  17. New apparent beam width artifact and measurement methodology for CD-SEM resolution monitoring

    NASA Astrophysics Data System (ADS)

    Mayer, Jason A.; Huizenga, Kylee J.; Solecky, Eric P.; Archie, Charles N.; Banke, G. W., Jr.; Cogley, Robert M.; Nathan, Claudine; Robert, James M.

    2003-05-01

    The Apparent Beam Width (ABW) total system resolution metric is part of the International SEMATECH CD-SEM specification and benchmarking activities. It is also used in our own CD-SEM specification, evaluation, and tool maintenance activities. Our first set of ABW artifacts, constructed a few years ago, needs retirement for several reasons, including: (1) their materials and dimensions no longer represent current manufacturing line samples, and (2) their line edge variation is too large for current and future ABW applications. The construction and testing of a new ABW artifact are discussed in this paper. The use of ABW as a monitor for total system resolution requires a unique set of sample characteristics, including near-vertical sidewalls, minimal top corner rounding, minimal line edge roughness (LER), and good line edge uniformity across the artifact set. Several process iterations were performed using the latest photolithographic processes, with numerous measurement evaluations, in order to achieve these characteristics. A sampling methodology has been formulated to take advantage of the good within-field, field-to-field, and wafer-to-wafer uniformities of the artifacts. In addition to driving resolution improvements, ABW also serves as a metric for tool-to-tool matching in a manufacturing environment.

  18. Thermal-Diffusivity Measurements of Mexican Citrus Essential Oils Using Photoacoustic Methodology in the Transmission Configuration

    NASA Astrophysics Data System (ADS)

    Muñoz, G. A. López; González, R. F. López; López, J. A. Balderas; Martínez-Pérez, L.

    2011-05-01

    Photoacoustic methodology in the transmission configuration (PMTC) was used to study the thermophysical properties of Mexican citrus essential oils and their relation to composition, demonstrating the viability of photothermal techniques for quality control and for authentication of oils and detection of their adulteration. Linear relations for the amplitude (on a semi-log scale) and phase, as functions of the sample's thickness, were obtained through a theoretical model fit to the experimental data for thermal-diffusivity measurements in Mexican orange, pink grapefruit, mandarin, and lime type A centrifuged essential oils, and in Mexican distilled lime essential oil. Gas chromatography of the distilled lime essential oil and the centrifuged lime essential oil type A is reported to complement the study. Experimental results showed close thermal-diffusivity values among the Mexican citrus essential oils obtained by centrifugation, but a significant difference between distilled lime oil and the corresponding centrifuged oil, which is due to the different chemical compositions resulting from the two extraction processes.
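
    The semi-log amplitude relation described above yields the thermal diffusivity directly: in the thermally thick regime the photoacoustic amplitude decays as exp(-l·sqrt(πf/α)), so the slope of ln(amplitude) versus thickness l gives α = πf/slope². A self-contained sketch with synthetic data (the modulation frequency and diffusivity are illustrative, not the paper's values):

```python
import math

def thermal_diffusivity(thicknesses_m, amplitudes, freq_hz):
    """Least-squares slope of ln(amplitude) vs sample thickness, then
    alpha = pi * f / slope^2 (thermally thick transmission regime)."""
    n = len(thicknesses_m)
    xs, ys = thicknesses_m, [math.log(a) for a in amplitudes]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return math.pi * freq_hz / slope ** 2

# synthetic check: data generated with alpha = 8e-8 m^2/s at 1 Hz modulation
f, alpha_true = 1.0, 8e-8
m = math.sqrt(math.pi * f / alpha_true)
ls = [20e-6, 40e-6, 60e-6, 80e-6, 100e-6]
amps = [math.exp(-m * l) for l in ls]
alpha = thermal_diffusivity(ls, amps, f)  # recovers ~8e-8 m^2/s
```

    The phase-versus-thickness slope gives an independent estimate of the same quantity, which is one internal consistency check the PMTC model fit provides.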

  19. A METHODOLOGY TO INTEGRATE MAGNETIC RESONANCE AND ACOUSTIC MEASUREMENTS FOR RESERVOIR CHARACTERIZATION

    SciTech Connect

    Jorge O. Parra; Chris L. Hackert; Lorna L. Wilson

    2002-09-20

    The work reported herein represents the third year of development efforts on a methodology to interpret magnetic resonance and acoustic measurements for reservoir characterization. In this last phase of the project we characterize a vuggy carbonate aquifer in the Hillsboro Basin, Palm Beach County, South Florida, using two data sets--the first generated by velocity tomography and the second generated by reflection tomography. First, we integrate optical macroscopic (OM), scanning electron microscope (SEM) and x-ray computed tomography (CT) images, as well as petrography, as a first step in characterizing the aquifer pore system. This pore scale integration provides information with which to evaluate nuclear magnetic resonance (NMR) well log signatures for NMR well log calibration, interpret ultrasonic data, and characterize flow units at the field scale between two wells in the aquifer. Saturated and desaturated NMR core measurements estimate the irreducible water in the rock and the variable T{sub 2} cut-offs for the NMR well log calibration. These measurements establish empirical equations to extract permeability from NMR well logs. Velocity and NMR-derived permeability and porosity relationships integrated with velocity tomography (based on crosswell seismic measurements recorded between two wells 100 m apart) capture two flow units that are supported with pore scale integration results. Next, we establish a more detailed picture of the complex aquifer pore structures and the critical role they play in water movement, which aids in our ability to characterize not only carbonate aquifers, but reservoirs in general. We analyze petrography and cores to reveal relationships between the rock physical properties that control the compressional and shear wave velocities of the formation. 
A digital thin section analysis provides the pore size distributions of the rock matrix, which allows us to relate pore structure to permeability and to characterize flow units at the
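
    The empirical NMR permeability equations mentioned above commonly take the form of the Schlumberger-Doll Research (SDR) transform, k ∝ T2lm²·φ⁴. A minimal sketch of that widely used form (the calibration constant is illustrative and must be fitted to core data, as the study does with its own saturated and desaturated core measurements):

```python
def sdr_permeability(phi, t2lm_ms, c=4.0):
    """SDR-style NMR permeability estimate: k = c * T2lm^2 * phi^4,
    with phi the porosity fraction, T2lm the log-mean T2 in ms, and c a
    lithology-dependent constant calibrated against core permeability."""
    return c * (t2lm_ms ** 2) * (phi ** 4)

# 25% porosity, 100 ms log-mean T2, illustrative c
k_md = sdr_permeability(0.25, 100.0)  # 156.25 (in the units set by c)
```

    In carbonates with vuggy porosity this simple transform degrades, which is why the study ties the T2 cut-offs and constants to pore-scale imaging before applying them between wells.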

  20. A Novel Fiber Bragg Grating Based Sensing Methodology for Direct Measurement of Surface Strain on Body Muscles during Physical Exercises

    NASA Astrophysics Data System (ADS)

    Prasad Arudi Subbarao, Guru; Subbaramajois Narasipur, Omkar; Kalegowda, Anand; Asokan, Sundarrajan

    2012-07-01

    The present work proposes a new sensing methodology, which uses Fiber Bragg Gratings (FBGs) to measure in vivo the surface strain and strain rate on calf muscles while performing certain exercises. Two simple exercises, namely ankle dorsi-flexion and ankle plantar-flexion, have been considered and the strain induced on the medial head of the gastrocnemius muscle while performing these exercises has been monitored. The real time strain generated has been recorded and the results are compared with those obtained using a commercial Color Doppler Ultrasound (CDU) system. It is found that the proposed sensing methodology is promising for surface strain measurements in biomechanical applications.
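
    For context, an FBG converts surface strain into a Bragg wavelength shift via Δλ/λB = (1 − pe)·ε. A minimal sketch using a typical effective photo-elastic coefficient for silica fiber (pe ≈ 0.22) and a 1550 nm grating, both generic assumptions rather than values from the study:

```python
def fbg_strain(delta_lambda_nm, bragg_nm=1550.0, p_e=0.22):
    """Axial strain from a Bragg wavelength shift:
    delta_lambda / lambda_B = (1 - p_e) * strain, with p_e the effective
    photo-elastic coefficient of silica fiber (typical value ~0.22)."""
    return delta_lambda_nm / (bragg_nm * (1.0 - p_e))

# a 1.2 pm shift at 1550 nm corresponds to roughly 1 microstrain
microstrain = fbg_strain(0.0012) * 1e6
```

    The strain rate reported during the flexion exercises is then just the time derivative of this quantity over successive interrogator readings.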

  1. Assessment of hygienic conditions of ground pepper (Piper nigrum L.) on the market in Sao Paulo City, by means of two methodologies for detecting the light filth

    USDA-ARS's Scientific Manuscript database

    Pepper should be collected, processed, and packed under optimum conditions to avoid the presence of foreign matter. The hygienic conditions of ground pepper marketed in São Paulo city were assessed by determining the presence of foreign matter using two extraction methodologies. This study...

  2. Practical Experience of Discharge Measurement in Flood Conditions with ADP

    NASA Astrophysics Data System (ADS)

    Vidmar, A.; Brilly, M.; Rusjan, S.

    2009-04-01

    Accurate discharge estimation is important for efficient river basin management and especially for flood forecasting. The traditional way of estimating discharge in hydrological practice is to measure the water stage and to convert the recorded values into discharge using a single-valued rating curve. The stage-discharge relationship for extreme events is usually extrapolated with various mathematical methods rather than measured directly. Our practice shows that with the Acoustic Doppler Profiler (ADP) we can very successfully record the actual relation between water stage and flow velocity during flood waves. Measurement in flood conditions is not an easy task, because of the high surface velocity, the large amounts of sediment in the water, and floating objects such as branches, bushes, trees and piles, which can easily damage the ADP instrument. We made several measurements during such extreme events on the Sava River downstream to the Krško nuclear power plant, where we have installed a fixed cableway. During several measurements with the traditional "moving-boat" technique, a moving-bed phenomenon was clearly observed. To measure flow accurately with an ADP using the "moving-boat" technique, the system needs a static reference against which to relate water velocities. This reference is the river bed, which must not move. During flood events we had difficulty finding a static bed surface to which to relate water velocities, caused by motion of the surface layer of bed material and by sediment suspended very densely near the bed. The traditional "moving-boat" measurement techniques that we normally use therefore fail completely. Using the stationary measurement method to make individual velocity profile measurements with an Acoustic Doppler Profiler (ADP) at fixed locations across the width of a stream gave
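
    Stationary velocity profiles measured at fixed verticals are typically combined into a total discharge with the velocity-area (mid-section) method. A minimal sketch (station spacings, depths and velocities are illustrative, not Sava River data):

```python
def midsection_discharge(stations_m, depths_m, velocities_mps):
    """Velocity-area (mid-section) discharge from stationary verticals:
    each vertical contributes mean velocity * depth * a width spanning
    half the distance to each neighbouring vertical."""
    q = 0.0
    n = len(stations_m)
    for i in range(n):
        left = stations_m[max(i - 1, 0)]
        right = stations_m[min(i + 1, n - 1)]
        width = (right - left) / 2.0
        q += velocities_mps[i] * depths_m[i] * width
    return q

# three verticals across a 20 m section (illustrative numbers)
q = midsection_discharge([0.0, 10.0, 20.0], [1.0, 3.0, 1.2], [0.5, 1.5, 0.6])
```

    Because each vertical is referenced to the instrument's fixed position rather than to a bottom-tracked boat path, the method is immune to the moving-bed problem described above.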

  3. Non-invasive ultrasound based temperature measurements at reciprocating screw plastication units: Methodology and applications

    NASA Astrophysics Data System (ADS)

    Straka, Klaus; Praher, Bernhard; Steinbichler, Georg

    2015-05-01

    Previous attempts to accurately measure the real polymer melt temperature in the screw chamber as well as in the screw channels have failed on account of the challenging metrological boundary conditions (high pressure, high temperature, rotational and axial screw movement). We developed a novel ultrasound system, based on reflection measurements, for the online determination of these important process parameters. Using available pressure-volume-temperature (pvT) data for a polymer, it is possible to estimate the density and adiabatic compressibility of the material, and therefore the pressure- and temperature-dependent longitudinal ultrasound velocity. From the measured ultrasonic reflection times from the screw root and barrel wall, together with the pressure, it is possible to calculate the mean temperature in the screw channel or in the chamber in front of the screw (in contrast to flush-mounted infrared or thermocouple probes, which sense only near the wall). By means of the system described above, we are able to measure axial profiles of the mean temperature in the screw chamber. The data gathered by the measurement system can be used to develop control strategies for the plastication process to reduce temperature gradients within the screw chamber, or as input data for injection moulding simulation.
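
    The temperature retrieval described above can be sketched as a pulse-echo inversion: the reflection transit time gives the mean sound speed along the acoustic path, and an assumed pvT-derived sound-speed law is then inverted for temperature. All material constants below are illustrative placeholders, not polymer data:

```python
def mean_channel_temperature(transit_time_s, path_m, t_ref_c=220.0,
                             c_ref=1000.0, dcdt=-2.5):
    """Mean melt temperature from a pulse-echo transit time across a
    channel of known acoustic path length (twice the gap in reflection).
    Assumes a locally linear sound-speed law c(T) = c_ref + dcdt*(T - t_ref)
    valid at the measured pressure, with c_ref (m/s) and dcdt (m/s per K)
    taken from pvT-derived data; the numbers here are placeholders."""
    c = path_m / transit_time_s          # mean sound speed along the path
    return t_ref_c + (c - c_ref) / dcdt  # invert the linear law for T

# a 5 mm gap gives a 10 mm reflection path; check the round trip at 240 degC
t_mean = mean_channel_temperature(0.01 / 950.0, 0.01)
```

    Because the sound speed also depends on pressure, the real system uses the simultaneously measured pressure to select the correct c(T) curve before inverting.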

  4. Collector probe measurements of ohmic conditioning discharges in TFTR

    SciTech Connect

    Kilpatrick, S.J.; Dylla, H.F.; Manos, D.M.; Cohen, S.A.; Wampler, W.R.; Bastasz, R.

    1989-03-01

    Special limiter conditioning techniques using low density deuterium or helium discharges have produced enhanced plasma confinement in TFTR. Measurements with a rotatable collector probe have been made to increase our understanding of the boundary layer during these conditioning sequences. A set of silicon films behind slits facing in the ion and electron drift directions was exposed to four different D+ and He2+ discharge sequences. The amounts of deuterium and impurities trapped in the surface regions of the samples have been measured by different analytical techniques, including nuclear reaction analysis for retained deuterium, Rutherford backscattering spectroscopy for carbon and metals, and Auger electron spectroscopy for carbon, oxygen, and metals. Up to 1.9 × 10^17 cm^-2 of deuterium was detected in codeposited carbon layers, with D/C generally in the range of the bulk saturation limit. Radial profiles and ion drift/electron drift asymmetries are discussed. 21 refs., 3 figs., 1 tab.

  5. Lab measurements to support modeling terahertz propagation in brownout conditions

    NASA Astrophysics Data System (ADS)

    Fiorino, Steven T.; Grice, Phillip M.; Krizo, Matthew J.; Bartell, Richard J.; Haiducek, John D.; Cusumano, Salvatore J.

    2010-04-01

    Brownout, the loss of visibility caused by dust and debris introduced into the atmosphere by the downwash of a helicopter, currently represents a serious challenge to U.S. military operations in Iraq and Afghanistan, where it has been cited as a factor in the majority of helicopter accidents. Brownout not only reduces visibility, but can create visual illusions for the pilot and difficult conditions for crew beneath the aircraft. Terahertz imaging may provide one solution to this problem. Terahertz frequency radiation readily propagates through the dirt aerosols present in brownout, and therefore can provide an imaging capability to improve effective visibility for pilots, helping prevent the associated accidents. To properly model the success of such systems, it is necessary to determine the optical properties of such obscurants in the terahertz regime. This research attempts to empirically determine, and measure in the laboratory, the full complex index of refraction optical properties of dirt aerosols representative of brownout conditions. These properties are incorporated into the AFIT/CDE Laser Environmental Effects Definition and Reference (LEEDR) software, allowing this program to more accurately assess the propagation of terahertz radiation under brownout conditions than was done in the past with estimated optical properties.

  6. Safety Assessment for a Surface Repository in the Chernobyl Exclusion Zone - Methodology for Assessing Disposal under Intervention Conditions - 13476

    SciTech Connect

    Haverkamp, B.; Krone, J.; Shybetskyi, I.

    2013-07-01

    The Radioactive Waste Disposal Facility (RWDF) Buryakovka was constructed in 1986 as part of the intervention measures after the accident at the Chernobyl NPP (ChNPP). Today, RWDF Buryakovka is still in operation, but its maximum capacity is nearly reached. Plans for enlargement of the facility have existed for more than 10 years but have not yet been implemented. In the framework of a European Commission project, DBE Technology GmbH prepared a safety analysis report for the facility in its current state (SAR) and a preliminary safety analysis report (PSAR) based on the planned enlargement. Due to its history, RWDF Buryakovka does not fully comply with today's best international practices or the latest Ukrainian regulations in this area. The most critical aspects are its inventory of long-lived radionuclides and the absence of a multi-barrier waste confinement system. A significant part of the project was therefore dedicated to developing a methodology for the safety assessment that takes into consideration the facility's special situation, and to reaching an agreement with all stakeholders involved in the later review and approval procedure for the safety analysis reports. The main aspect of the agreed methodology was to analyze safety not strictly on the basis of regulatory requirements but on an assessment of the actual situation of the facility, including its location within the Exclusion Zone. For both safety analysis reports, SAR and PSAR, the assessment of long-term safety led to results that were either within regulatory limits or within limits allowing for a specific situational evaluation by the regulator. (authors)

  7. Geometric scaling of artificial hair sensors for flow measurement under different conditions

    NASA Astrophysics Data System (ADS)

    Su, Weihua; Reich, Gregory W.

    2017-03-01

    Artificial hair sensors (AHSs) have been developed for predicting the local flow speed and aerodynamic force around an airfoil, with subsequent application in vibration control of the airfoil. Usually, a specific sensor design is sensitive only to flow speeds within its operating flow measurement region. This paper expands this flow measurement concept to different flow speed conditions by properly sizing the parameters of the sensors, including the dimensions of the artificial hair, capillary, and carbon nanotubes (CNTs) that make up the sensor design, starting from a baseline sensor design and its working flow condition. In doing so, the glass fiber hair is modeled as a cantilever beam on an elastic foundation, subject to the distributed aerodynamic drag over the length of the hair. Hair length and diameter, capillary depth, and CNT height are scaled by keeping the maximum compressive strain of the CNTs constant for different sensors under different speed conditions. Numerical studies demonstrate the feasibility of the geometric scaling methodology by designing AHSs for aircraft with different dimensions and flight conditions, starting from the same baseline sensor. Finally, the operating bandwidth of the scaled sensors is explored.
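
    A first-order version of the constant-strain scaling can be written down from simple cantilever bending: with a drag load per unit length q ∝ ρv²d, the root bending strain scales as v²L²/d², so holding strain constant at fixed diameter suggests L ∝ 1/v. This is a simplified sketch under that assumption, not the paper's full elastic-foundation model:

```python
def scale_hair_length(l_base_m, v_base_mps, v_target_mps):
    """First-order constant-strain scaling of hair length at fixed diameter.
    For a cantilever under distributed drag q ~ rho * v^2 * d, the root
    bending strain ~ v^2 * L^2 / d^2; keeping it constant gives L ~ 1/v."""
    return l_base_m * (v_base_mps / v_target_mps)

# a 1.5 mm baseline hair tuned for 10 m/s, rescaled for a 30 m/s regime
l_fast = scale_hair_length(1.5e-3, 10.0, 30.0)  # 0.5 mm
```

    The full design problem also rescales diameter, capillary depth and CNT height together, which is why the paper works with the coupled elastic-foundation model rather than this single-parameter rule.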

  8. "MARK I" MEASUREMENT METHODOLOGY FOR POLLUTION PREVENTION PROGRESS OCCURRING AS A RESULT OF PRODUCT DECISIONS

    EPA Science Inventory

    A methodology for assessing progress in pollution prevention resulting from product redesign, reformulation or replacement is described. The method compares the pollution generated by the original product with that from the modified or replacement product, taking into account, if...

  9. Optimisation of the operational conditions of trichloroethylene degradation using Trametes versicolor under quinone redox cycling conditions using central composite design methodology.

    PubMed

    Vilaplana, Marcel; García, Ana Belén; Caminal, Gloria; Guillén, Francisco; Sarrà, Montserrat

    2012-04-01

    Extracellular radicals produced by Trametes versicolor under quinone redox cycling conditions can degrade a large variety of pollutant compounds, including trichloroethylene (TCE). This study investigated the effect of the agitation speed and the gas-liquid phase volume ratio on TCE degradation using central composite design (CCD) methodology for a future scale-up to a reactor system. The agitation speed ranged from 90 to 200 rpm, and the volume ratio ranged from 0.5 to 4.4. The results demonstrated the important and positive effect of the agitation speed and an interaction between the two factors on TCE degradation. Although the volume ratio did not have a significant effect if the agitation speed value was between 160 and 200 rpm, at lower speed values, the specific pollutant degradation was clearly more extensive at low volume ratios than at high volume ratios. The fitted response surface was validated by performing an experiment using the parameter combination in the model that maximised TCE degradation. The results of the experiments carried out using different biomass concentrations demonstrated that the biomass concentration had a positive effect on pollutant degradation if the amount of biomass present was lower than 1.6 g dry weight l(-1). The results show that the maximum TCE degradation was obtained at the highest speed (200 rpm), gas-liquid phase volume ratio (4.4), and a biomass concentration of 1.6 g dry weight l(-1).
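
    For reference, the experimental runs behind such a central composite design are generated mechanically: 2^k factorial corners, 2k axial ("star") points at ±α, and replicated centre runs. A minimal sketch in coded units (here the two factors would be agitation speed and gas-liquid volume ratio; the number of centre runs is illustrative):

```python
import itertools

def central_composite_design(n_factors=2, n_center=5):
    """Coded-unit CCD: 2^k factorial corners, 2k axial points at +/- alpha
    with alpha = (2^k)^(1/4) for rotatability, plus replicated centre runs."""
    alpha = (2 ** n_factors) ** 0.25
    corners = list(itertools.product([-1.0, 1.0], repeat=n_factors))
    axial = []
    for i in range(n_factors):
        for s in (-alpha, alpha):
            pt = [0.0] * n_factors
            pt[i] = s
            axial.append(tuple(pt))
    center = [(0.0,) * n_factors] * n_center
    return corners + axial + center

design = central_composite_design()  # 4 + 4 + 5 = 13 runs, alpha = sqrt(2)
```

    The coded levels are then mapped back to physical units (e.g. -1/+1 spanning 90-200 rpm) before the runs are executed and the quadratic surface is fitted.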

  10. Inference of human affective states from psychophysiological measurements extracted under ecologically valid conditions

    PubMed Central

    Betella, Alberto; Zucca, Riccardo; Cetnarski, Ryszard; Greco, Alberto; Lanatà, Antonio; Mazzei, Daniele; Tognetti, Alessandro; Arsiwalla, Xerxes D.; Omedas, Pedro; De Rossi, Danilo; Verschure, Paul F. M. J.

    2014-01-01

    Compared to standard laboratory protocols, the measurement of psychophysiological signals in real world experiments poses technical and methodological challenges due to external factors that cannot be directly controlled. To address this problem, we propose a hybrid approach based on an immersive and human accessible space called the eXperience Induction Machine (XIM), that incorporates the advantages of a laboratory within a life-like setting. The XIM integrates unobtrusive wearable sensors for the acquisition of psychophysiological signals suitable for ambulatory emotion research. In this paper, we present results from two different studies conducted to validate the XIM as a general-purpose sensing infrastructure for the study of human affective states under ecologically valid conditions. In the first investigation, we recorded and classified signals from subjects exposed to pictorial stimuli corresponding to a range of arousal levels, while they were free to walk and gesticulate. In the second study, we designed an experiment that follows the classical conditioning paradigm, a well-known procedure in the behavioral sciences, with the additional feature that participants were free to move in the physical space, as opposed to similar studies measuring physiological signals in constrained laboratory settings. Our results indicate that, by using our sensing infrastructure, it is indeed possible to infer human event-elicited affective states through measurements of psychophysiological signals under ecological conditions. PMID:25309310

  11. Measuring the impact of methodological research: a framework and methods to identify evidence of impact.

    PubMed

    Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F

    2014-11-27

    Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally. We have used the information

  12. Suspended matter concentrations in coastal waters: Methodological improvements to quantify individual measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Röttgers, Rüdiger; Heymann, Kerstin; Krasemann, Hajo

    2014-12-01

    Measurements of total suspended matter (TSM) concentration and the discrimination of the particulate inorganic (PIM) and organic matter fraction by the loss-on-ignition method are susceptible to significant and contradictory bias errors caused by: (a) retention of sea salt in the filter (despite washing with deionized water), and (b) filter material loss during washing and combustion procedures. Several methodological procedures have been described to avoid or correct errors associated with these biases, but no analysis of the final uncertainty for the overall mass concentration determination has yet been performed. Typically, the exact values of these errors are unknown and can only be estimated. Measurements were performed in coastal and estuarine waters of the German Bight that allowed the individual error for each sample to be determined with respect to a systematic mass offset. This was achieved by using different volumes of the sample and analyzing the mass over volume relationship by linear regression. The results showed that the variation in the mass offset is much larger than expected (mean mass offset: 0.85 ± 0.84 mg, range: -2.4 to 7.5 mg) and that it often leads to rather large relative errors even when TSM concentrations were high. Similarly large variations were found for the mass offset for PIM measurements. Correction with a mean offset determined with procedural control filters reduced the maximum error to <60%. The determination error for the TSM concentration was <40% when three different volumes were used, and for the majority of the samples the error was <10%. When six different volumes were used and outliers removed, the error was always <25%, and very often errors of only a few percent were obtained. The approach proposed here can determine the individual determination error for each sample, is independent of bias errors, can be used for TSM and PIM determination, and allows individual quality control for samples from coastal and estuarine waters. It should
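    The mass-over-volume regression described above is straightforward to sketch: filtering several volumes of the same sample and regressing filter mass gain on volume yields the TSM concentration as the slope and the sample-specific mass offset as the intercept. The numbers below are fabricated for illustration only.

```python
import numpy as np

# Fabricated example: filter mass gain (mg) for several filtered volumes (mL)
# of the same sample. A true TSM of 12 mg/L and a 0.85 mg filter offset are
# assumed, plus small weighing noise.
volumes = np.array([100.0, 200.0, 300.0, 400.0, 500.0, 600.0])
masses = 0.012 * volumes + 0.85 + np.array([0.05, -0.03, 0.02, -0.04, 0.01, -0.01])

# Slope = concentration (mg/mL); intercept = sample-specific mass offset (mg),
# the bias that a single-volume measurement cannot separate from the sample.
slope, offset = np.polyfit(volumes, masses, 1)
print(f"TSM = {slope * 1000:.1f} mg/L, mass offset = {offset:.2f} mg")
```

    A single-volume measurement would fold the 0.85 mg offset into the concentration estimate; the regression isolates it per sample, which is the core of the methodological improvement described.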

  13. An analysis of the pilot point methodology for automated calibration of an ensemble of conditionally simulated transmissivity fields

    USGS Publications Warehouse

    Cooley, R.L.

    2000-01-01

    An analysis of the pilot point method for automated calibration of an ensemble of conditionally simulated transmissivity fields was conducted on the basis of the simplifying assumption that the flow model is a linear function of log transmissivity. The analysis shows that the pilot point and conditional simulation method of model calibration and uncertainty analysis can produce accurate uncertainty measures if it can be assumed that errors of unknown origin in the differences between observed and model-computed water pressures are small. When this assumption is not met, the method could yield significant errors from overparameterization and the neglect of potential sources of model inaccuracy. The conditional simulation part of the method is also shown to be a variant of the percentile bootstrap method, so that when applied to a nonlinear model, the method is subject to bootstrap errors. These sources of error must be considered when using the method.

  14. Reverse micellar extraction of lectin from black turtle bean (Phaseolus vulgaris): optimisation of extraction conditions by response surface methodology.

    PubMed

    He, Shudong; Shi, John; Walid, Elfalleh; Zhang, Hongwei; Ma, Ying; Xue, Sophia Jun

    2015-01-01

    Lectin from black turtle bean (Phaseolus vulgaris) was extracted and purified by the reverse micellar extraction (RME) method. Response surface methodology (RSM) was used to optimise the processing parameters for both forward and backward extraction. Hemagglutinating activity analysis, SDS-PAGE, RP-HPLC and FTIR techniques were used to characterise the lectin. The optimum extraction conditions were determined as 77.59 mM NaCl, pH 5.65, and 127.44 mM sodium bis(2-ethylhexyl) sulfosuccinate (AOT) for the forward extraction, and 592.97 mM KCl, pH 8.01 for the backward extraction. The yield was 63.21 ± 2.35 mg protein/g bean meal with a purification factor of 8.81 ± 0.17. The efficiency of RME was confirmed by SDS-PAGE and RP-HPLC. FTIR analysis indicated there were no significant changes in the secondary protein structure. Comparison with the conventional chromatographic method confirmed that the RME method could be used for the purification of lectin from the crude extract.

  15. Optimization of Extraction Condition for Alisol B and Alisol B Acetate in Alismatis Rhizoma using Response Surface Methodology

    PubMed Central

    Lee, A Yeong; Park, Jun Yeon; Chun, Jin Mi; Moon, Byeong Cheol; Kang, Byoung Kab; Seo, Young Bae; Shin, Hyeun-Kyoo; Kim, Ho Kyoung

    2012-01-01

    Alismatis Rhizoma is a perennial herb originating from the rhizomes of Alisma orientalis (Sam.) Juzep. and related species, which have been used to treat seborrheic dermatitis, eczema, polydipsia, and pedal edema. We aimed to determine the concentrations of the compounds alisol B and alisol B acetate present in a sample of the herb using high-performance liquid chromatography coupled with a photodiode array detector. We selected methanol as the optimal solvent considering the structures of alisol B and alisol B acetate. We estimated the proportions of alisol B and alisol B acetate in a standard methanol extract to be 0.0434% and 0.2365%, respectively. To optimize extraction, we employed response surface methodology to model the yields of alisol B and alisol B acetate, using a central composite design consisting of 15 experimental points. The extraction parameters were time, methanol concentration, and sample weight. The predicted concentration of alisol B derivatives was estimated to be 0.2388% under the following conditions: an extraction time of 81 min, a methanol concentration of 76%, and a sample weight of 1.52 g. PMID:23335845

  16. Evaluation of methodological aspects of digestibility measurements in ponies fed different grass hays.

    PubMed

    Schaafstra, F J W C; van Doorn, D A; Schonewille, J T; Wartena, F C; Zoon, M V; Blok, M C; Hendriks, W H

    2015-10-01

    Methodological aspects of digestibility measurements of feedstuffs for equines were studied in four Welsh pony geldings consuming four grass-hay diets in a 4 × 4 Latin square design. Diets contained either a low (L), medium (M), high (H), or very high (VH) ADF content (264, 314, 375, or 396 g·kg DM, respectively). Diets were supplemented with minerals, vitamins, and TiO2 (3.9 g Ti·d(-1)). Daily excreted feces were collected quantitatively over 10 consecutive days and analyzed for moisture, ash, ADL, AIA, and titanium (Ti). The minimum duration of total fecal collection (TFC) required for an accurate estimation of apparent organic matter digestibility (OMD) of grass hay was assessed. Based on the literature and the calculated cumulative OMD assessed over 10 consecutive days of TFC, a minimum duration of at least 5 consecutive days of fecal collection is recommended for accurate estimation of dry matter digestibility (DMD) and OMD in ponies. The 5-d collection should be preceded by a 14-d adaptation period to allow the animals to adapt to the diets and become accustomed to the collection procedures. Mean fecal recovery over 10 d across diets for ADL, AIA, and Ti was 93.1% (SE 1.9), 98.9% (SE 5.5), and 97.1% (SE 1.8), respectively. Evaluation of the CV of mean fecal recoveries obtained by ADL, AIA, and Ti showed that variation in fecal Ti (6.8) and ADL (7.0) excretion was relatively low compared to AIA (12.3). In conclusion, internal ADL and externally supplemented Ti are preferred as markers for digestibility trials in equines fed grass-hay diets.
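    The marker-based alternative to total fecal collection implied above can be sketched with the standard indicator-method formula for apparent digestibility, in which an indigestible marker such as ADL, AIA, or Ti substitutes for quantitative collection; the concentrations below are hypothetical.

```python
def apparent_digestibility(marker_diet, marker_feces, nutrient_diet, nutrient_feces):
    """Indicator-method apparent digestibility (%): an indigestible marker
    (e.g. ADL, AIA, or Ti) concentrates in feces in proportion to how much
    of the diet was absorbed, so total fecal collection is not required."""
    return 100.0 * (1.0 - (marker_diet / marker_feces) * (nutrient_feces / nutrient_diet))

# Hypothetical concentrations (g/kg DM): the marker doubles in feces while the
# nutrient concentration is unchanged, so half the nutrient was digested.
print(apparent_digestibility(marker_diet=2.0, marker_feces=4.0,
                             nutrient_diet=900.0, nutrient_feces=900.0))  # -> 50.0
```

    This is why fecal marker recovery matters: a marker recovered at well under 100% (as the study found for AIA's high variability) biases the ratio and hence the digestibility estimate.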

  17. Nuclear Structure Measurements of Fermium-254 and Advances in Target Production Methodologies

    NASA Astrophysics Data System (ADS)

    Gothe, Oliver Ralf

    The Berkeley Gas-filled Separator (BGS) has been upgraded with a new gas control system. It allows for accurate control of hydrogen and helium gas mixtures. This greatly increases the capabilities of the separator by reducing background signals in the focal plane detector for asymmetric nuclear reactions. It has also been shown that gas mixtures can be used to focus the desired reaction products into a smaller area, thereby increasing the experimental efficiency. A new electrodeposition cell has been developed to produce metal oxide targets for experiments at the BGS. The new cell has been characterized and was used to produce americium targets for the production of element 115 in the reaction 243Am(48Ca,3n)288115. Additionally, a new method of producing targets for nuclear reactions was explored. A procedure for producing targets via Polymer Assisted Deposition (PAD) was developed, and targets produced via this method were tested using the nuclear reaction 208Pb(40Ar,4n)244Fm to determine their in-beam performance. It was determined that the silicon nitride backings used in this procedure are not feasible due to their crystal structures, and alternative backing materials have been tested and proposed. A previously unknown level in 254Fm has been identified at 985.7 keV utilizing a newly developed low-background coincidence apparatus. 254Fm was populated following the reaction 208Pb(48Ca,n)254No. Reaction products were guided to the two-clover, low-background detector setup via a recoil transfer chamber. The new level has been assigned a spin of 2- and has tentatively been identified as the octupole vibration in 254Fm. Transporting evaporation residues to a two-clover, low-background detector setup can effectively be used to perform gamma-spectroscopy measurements of nuclei that are not accessible by current common methodologies. This technique provides an excellent addition to previously available tools such as in-beam spectroscopy and gamma-ray tracking arrays.

  18. Optimization of the production conditions of the lipase produced by Bacillus cereus from rice flour through Plackett-Burman Design (PBD) and response surface methodology (RSM).

    PubMed

    Vasiee, Alireza; Behbahani, Behrooz Alizadeh; Yazdi, Farideh Tabatabaei; Moradi, Samira

    2016-12-01

    In this study, the screening of lipase-positive bacteria from rice flour was carried out by the Rhodamine B agar plate method. Bacillus cereus was identified by the 16S rDNA method. Screening of the appropriate variables and optimization of lipase production were performed using a Plackett-Burman design (PBD) and response surface methodology (RSM). Among the isolated bacteria, an aerobic Bacillus cereus strain was recognized as the best lipase producer (177.3 ± 20 U/ml). Based on the results, the optimal enzyme production conditions were a coriander seed extract (CSE)/yeast extract ratio of 16.9 w/w and olive oil (OO) and MgCl2 concentrations of 2.37 g/L and 24.23 mM, respectively. Under these conditions, the lipase activity (LA) predicted by the model was 343 U/mL, close to the experimentally measured value (324 U/mL), a 1.83-fold increase in LA compared with the non-optimized lipase. The kinetic parameters Vmax and Km of the lipase were measured as 0.367 μM/(min·mL) and 5.3 mM, respectively. The lipase-producing Bacillus cereus was isolated, and RSM was used for the optimization of enzyme production. The CSE/yeast extract ratio of 16.9 w/w, OO concentration of 2.37 g/L, and MgCl2 concentration of 24.23 mM were found to be the optimal conditions for the enzyme production process. LA under the optimal enzyme production conditions was 1.83 times that under non-optimal conditions. Ultimately, it can be concluded that the B. cereus isolated from rice flour is a proper source of lipase. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Measurement in Learning Games Evolution: Review of Methodologies Used in Determining Effectiveness of "Math Snacks" Games and Animations

    ERIC Educational Resources Information Center

    Trujillo, Karen; Chamberlin, Barbara; Wiburg, Karin; Armstrong, Amanda

    2016-01-01

    This article captures the evolution of research goals and methodologies used to assess the effectiveness and impact of a set of mathematical educational games and animations for middle-school aged students. The researchers initially proposed using a mixed model research design of formative and summative measures, such as user-testing,…

  1. Anthropometric measurements, deficiency signs and their relationship under drought conditions.

    PubMed

    Ramnath, T; Krishnamachari, K A

    1993-01-01

    In October-November 1987 in India, the Desert Medicine Research Centre in Jodhpur conducted a rapid anthropometric survey of 555 preschool children in 4 districts of Rajasthan which had been severely affected by drought (Jodhpur, Jalore, Nagaur, and Barmer districts) to determine the association between anthropometric measurements and various nutritional deficiency signs and infections. Based on weight for age, 82.3% of the children were undernourished, and 13.3% of all children were severely malnourished (grade III undernutrition). Anemia, protein-energy malnutrition (PEM), and upper respiratory infections occurred significantly more often as one digressed from the normal nutrition grade. These 3 conditions were also closely linked to weight status. Based on height for age, 62.4% of the children were chronically undernourished, and 11.9% of all children were severely so. PEM was the only deficiency sign or infection associated with height status (6.2% of children with normal nutrition had PEM vs. 15.% for grade I undernutrition and 34.8% for grade II undernutrition; p < .001). Vitamin A deficiency, anemia, and PEM occurred more frequently as one went from normal nutrition to grade II undernutrition based on fat fold at triceps (FFT) measurements. PEM and upper respiratory infections were significantly associated with weight-for-height status. Weight correctly identified 84% of all nutritional deficiency signs and infections. The corresponding figures for height, FFT, and weight for height were 64.2%, 75.4%, and 31%. Thus, weight was the most sensitive screening measurement for identifying nutritional deficiency signs and infections. Based on weight alone, the odds ratios of undernourished children developing vitamin B-complex deficiency, PEM, and upper respiratory infections were 1.58, 3.25, and 1.77, respectively. Weight for height was the most specific screening measurement (88.2% vs. 44.7% for height, 29.3% for FFT, and 26.1% for weight).
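    The sensitivity, specificity, and odds-ratio comparisons above come from standard 2×2 screening-table arithmetic, sketched below. The counts are hypothetical, chosen only to mirror an ~84% sensitive weight-based screen; they are not the survey's data.

```python
def screening_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, and odds ratio from a 2x2 screening table
    (tp: screen-positive with the condition, fp: screen-positive without it,
    fn: screen-negative with it, tn: screen-negative without it)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    odds_ratio = (tp * tn) / (fp * fn)
    return sensitivity, specificity, odds_ratio

# Hypothetical counts for a weight-for-age screen of deficiency signs:
sens, spec, or_ = screening_stats(tp=84, fp=60, fn=16, tn=40)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} odds ratio={or_:.2f}")
# -> sensitivity=0.84 specificity=0.40 odds ratio=3.50
```

    The trade-off in the abstract follows directly: a highly sensitive measure (weight) catches most deficiency signs at the cost of specificity, while weight for height behaves in the opposite way.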

  2. Measurement of Two-Phase Flow Characteristics Under Microgravity Conditions

    NASA Technical Reports Server (NTRS)

    Keshock, E. G.; Lin, C. S.; Edwards, L. G.; Knapp, J.; Harrison, M. E.; Xhang, X.

    1999-01-01

    This paper describes the technical approach and initial results of a test program for studying two-phase annular flow under the simulated microgravity conditions of KC-135 aircraft flights. A helical coil flow channel orientation was utilized in order to circumvent the restrictions normally associated with drop tower or aircraft flight tests with respect to two-phase flow, namely spatial restrictions preventing channel lengths of sufficient size to accurately measure pressure drops. Additionally, the helical coil geometry is of interest in itself, considering that operating in a microgravity environment vastly simplifies the two-phase flows occurring in coiled flow channels under 1-g conditions for virtually any orientation. Pressure drop measurements were made across four stainless steel coil test sections, having a range of inside tube diameters (0.95 to 1.9 cm), coil diameters (25 - 50 cm), and length-to-diameter ratios (380 - 720). High-speed video photographic flow observations were made in the transparent straight sections immediately preceding and following the coil test sections. A transparent coil of tygon tubing of 1.9 cm inside diameter was also used to obtain flow visualization information within the coil itself. Initial test data has been obtained from one set of KC-135 flight tests, along with benchmark ground tests. Preliminary results appear to indicate that accurate pressure drop data is obtainable using a helical coil geometry that may be related to straight channel flow behavior. Also, video photographic results appear to indicate that the observed slug-annular flow regime transitions agree quite reasonably with the Dukler microgravity map.

  3. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist.

    PubMed

    Terwee, Caroline B; Mokkink, Lidwine B; Knol, Dirk L; Ostelo, Raymond W J G; Bouter, Lex M; de Vet, Henrica C W

    2012-05-01

    The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. The scoring system was developed based on discussions among experts and testing of the scoring system on 46 articles from a systematic review. Four response options were defined for each COSMIN item (excellent, good, fair, and poor). A quality score per measurement property is obtained by taking the lowest rating of any item in a box ("worst score counts"). Specific criteria for excellent, good, fair, and poor quality for each COSMIN item are described. In defining the criteria, the "worst score counts" algorithm was taken into consideration. This means that only fatal flaws were defined as poor quality. The scores of the 46 articles show how the scoring system can be used to provide an overview of the methodological quality of studies included in a systematic review of measurement properties. Based on experience in testing this scoring system on 46 articles, the COSMIN checklist with the proposed scoring system seems to be a useful tool for assessing the methodological quality of studies included in systematic reviews of measurement properties.
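    The "worst score counts" algorithm described above is simple enough to state in a few lines: the quality score for a COSMIN box is the lowest rating given to any of its items, so a single fatal flaw caps the whole box. A minimal sketch:

```python
RATING_ORDER = {"poor": 0, "fair": 1, "good": 2, "excellent": 3}

def box_quality(item_ratings):
    """COSMIN 'worst score counts': the quality score for a
    measurement-property box is the lowest rating of any of its items."""
    return min(item_ratings, key=lambda rating: RATING_ORDER[rating])

print(box_quality(["excellent", "good", "fair", "good"]))  # -> fair
print(box_quality(["excellent", "poor", "excellent"]))     # -> poor
```

    This is why the authors defined only fatal flaws as "poor": under the minimum rule, any "poor" item would otherwise drag down a box that is methodologically sound in every other respect.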

  4. Optimization of Extraction Conditions for Maximal Phenolic, Flavonoid and Antioxidant Activity from Melaleuca bracteata Leaves Using the Response Surface Methodology

    PubMed Central

    Hou, Wencheng; Zhang, Wei; Chen, Guode; Luo, Yanping

    2016-01-01

    Melaleuca bracteata is a yellow-leaved tree belonging to the Melaleuca genus. Species from this genus are known to be good sources of natural antioxidants; for example, the “tea tree oil” derived from M. alternifolia is used in food processing to extend the shelf life of products. In order to determine whether M. bracteata contains novel natural antioxidants, the components of M. bracteata ethanol extracts were analyzed by gas chromatography–mass spectrometry. Total phenolic and flavonoid contents were extracted and the antioxidant activities of the extracts evaluated. Single-factor experiments, a central composite rotatable design (CCRD) and response surface methodology (RSM) were used to optimize the extraction conditions for total phenolic content (TPC) and total flavonoid content (TFC). Ferric reducing power (FRP) and 1,1-diphenyl-2-picrylhydrazyl radical (DPPH·) scavenging capacity were used as the evaluation indices of antioxidant activity. The results showed that the main components of M. bracteata ethanol extracts are methyl eugenol (86.86%) and trans-cinnamic acid methyl ester (6.41%). The single-factor experiments revealed that the ethanol concentration is the key factor determining the TPC, TFC, FRP and DPPH· scavenging capacity. RSM results indicated that the optimum for all four evaluation indices was achieved by extracting for 3.65 days at 53.26°C in 34.81% ethanol. Under these conditions, the TPC, TFC, FRP and DPPH· scavenging capacity reached values of 88.6 ± 1.3 mg GAE/g DW, 19.4 ± 0.2 mg RE/g DW, 2.37 ± 0.01 mM Fe2+/g DW and 86.0 ± 0.3%, respectively, which were higher than those of the positive control, methyl eugenol (FRP 0.97 ± 0.02 mM, DPPH· scavenging capacity 58.6 ± 0.7%) at comparable concentrations. Therefore, the extracts of M. bracteata leaves have high antioxidant activity, which is not attributable to methyl eugenol alone. Further research could lead to the development of a potent new natural antioxidant.

  5. Optimization of Extraction Conditions for Maximal Phenolic, Flavonoid and Antioxidant Activity from Melaleuca bracteata Leaves Using the Response Surface Methodology.

    PubMed

    Hou, Wencheng; Zhang, Wei; Chen, Guode; Luo, Yanping

    2016-01-01

    Melaleuca bracteata is a yellow-leaved tree belonging to the Melaleuca genus. Species from this genus are known to be good sources of natural antioxidants; for example, the "tea tree oil" derived from M. alternifolia is used in food processing to extend the shelf life of products. In order to determine whether M. bracteata contains novel natural antioxidants, the components of M. bracteata ethanol extracts were analyzed by gas chromatography-mass spectrometry. Total phenolic and flavonoid contents were extracted and the antioxidant activities of the extracts evaluated. Single-factor experiments, a central composite rotatable design (CCRD) and response surface methodology (RSM) were used to optimize the extraction conditions for total phenolic content (TPC) and total flavonoid content (TFC). Ferric reducing power (FRP) and 1,1-diphenyl-2-picrylhydrazyl radical (DPPH·) scavenging capacity were used as the evaluation indices of antioxidant activity. The results showed that the main components of M. bracteata ethanol extracts are methyl eugenol (86.86%) and trans-cinnamic acid methyl ester (6.41%). The single-factor experiments revealed that the ethanol concentration is the key factor determining the TPC, TFC, FRP and DPPH· scavenging capacity. RSM results indicated that the optimum for all four evaluation indices was achieved by extracting for 3.65 days at 53.26°C in 34.81% ethanol. Under these conditions, the TPC, TFC, FRP and DPPH· scavenging capacity reached values of 88.6 ± 1.3 mg GAE/g DW, 19.4 ± 0.2 mg RE/g DW, 2.37 ± 0.01 mM Fe2+/g DW and 86.0 ± 0.3%, respectively, which were higher than those of the positive control, methyl eugenol (FRP 0.97 ± 0.02 mM, DPPH· scavenging capacity 58.6 ± 0.7%) at comparable concentrations. Therefore, the extracts of M. bracteata leaves have high antioxidant activity, which is not attributable to methyl eugenol alone. Further research could lead to the development of a potent new natural antioxidant.

  6. Measurements of optical underwater turbulence under controlled conditions

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.; Gladysz, S.; Almeida de Sá Barros, R.; Matt, S.; Nootz, G. A.; Josset, D. B.; Hou, W.

    2016-05-01

    Laser beam propagation underwater is becoming an important research topic because of the high demand for its potential applications. Namely, the ability to image underwater at long distances is highly desired for scientific and military purposes, including submarine awareness, diver visibility, and mine detection. Optical communication in the ocean can provide covert data transmission with much higher rates than those available with acoustic techniques, and it is now desired for certain military and scientific applications that involve sending large quantities of data. Unfortunately, the underwater environment presents serious challenges for the propagation of laser beams. Even in clean ocean water, the extinction due to absorption and scattering theoretically limits the useful range to a few attenuation lengths. However, extending the laser light propagation range to the theoretical limit leads to significant beam distortions due to optical underwater turbulence. Experiments show that the magnitude of the distortions caused by water temperature and salinity fluctuations can significantly exceed the magnitude of the beam distortions due to atmospheric turbulence, even for relatively short propagation distances. We present direct measurements of optical underwater turbulence under the controlled conditions of a laboratory water tank using two separate techniques involving a wavefront sensor and an LED array. These independent approaches will enable the development of an underwater turbulence power spectrum model based directly on spatial-domain measurements and will lead to accurate predictions of underwater beam propagation.

  7. A novel methodology to measure methane bubble sizes in the water column

    NASA Astrophysics Data System (ADS)

    Hemond, H.; Delwiche, K.; Senft-Grupp, S.; Manganello, T.

    2014-12-01

    The fate of methane ebullition from lake sediments is dependent on initial bubble size. Rising bubbles are subject to dissolution, reducing the fraction of methane that ultimately enters the atmosphere while increasing concentrations of aqueous methane. Smaller bubbles not only rise more slowly, but also dissolve more rapidly than larger bubbles. Thus, understanding methane bubble size distributions in the water column is critical to predicting atmospheric methane emissions from ebullition. However, current methods of measuring methane bubble sizes in situ are resource-intensive, typically requiring divers, video equipment, sonar, or hydroacoustic instruments. The complexity and cost of these techniques point to the strong need for a simple, autonomous device that can measure bubble size distributions and be deployed unattended over long periods of time. We describe a bubble sizing device that can be moored in the subsurface and can intercept and measure the size of bubbles as they rise. The instrument uses a novel optical measurement technique with infrared LEDs and IR-sensitive photodetectors combined with a custom-designed printed circuit board. An on-board microcomputer handles raw optical signals and stores the relevant information needed to calculate bubble volume. The electronics are housed within a pressure case fabricated from standard PVC fittings and are powered by size C alkaline batteries. The bill of materials cost is less than $200, allowing us to deploy multiple sensors at various locations within Upper Mystic Lake, MA. This novel device will provide information on how methane bubble sizes may vary both spatially and temporally. We present data from tests under controlled laboratory conditions and from deployments in Upper Mystic Lake.

  8. Generic design methodology for the development of three-dimensional structured-light sensory systems for measuring complex objects

    NASA Astrophysics Data System (ADS)

    Marin, Veronica E.; Chang, Wei Hao Wayne; Nejat, Goldie

    2014-11-01

    Structured-light (SL) techniques are emerging as popular noncontact approaches for obtaining three-dimensional (3-D) measurements of complex objects for real-time applications in manufacturing, bioengineering, and robotics. The performance of SL systems is determined by the emitting (i.e., projector) and capturing (i.e., camera) hardware components and the triangulation configuration between them and an object of interest. A generic design methodology is presented to determine optimal triangulation configurations for SL systems. These optimal configurations are determined with respect to a set of performance metrics: (1) minimizing the 3-D reconstruction errors, (2) maximizing the pixel-to-pixel correspondence between the projector and camera, and (3) maximizing the dispersion of the measured 3-D points within a measurement volume, while satisfying design constraints based on hardware and user-defined specifications. The proposed methodology utilizes a 3-D geometric triangulation model based on ray-tracing geometry and pin-hole models for the projector and camera. Using the methodology, a set of optimal system configurations can be determined for a given set of hardware components. The design methodology was applied to a real-time SL system for surface profiling of complex objects. Experiments were conducted with an optimal sensor configuration and its performance verified with respect to a nonoptimal hardware configuration.
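    The ray-tracing triangulation at the core of such a model can be sketched as the classic closest-point computation between a camera ray and a projector ray: for noisy, generally skew rays, the midpoint of the shortest connecting segment is taken as the reconstructed 3-D point. This is a generic illustration under pin-hole assumptions, not the authors' implementation.

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Given ray origins (camera and projector centres) and direction
    vectors, return the midpoint of the shortest segment between the two
    (generally skew) rays."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b          # zero only for parallel rays
    t1 = (b * e - c * d) / denom   # parameter along ray 1
    t2 = (a * e - b * d) / denom   # parameter along ray 2
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# Two rays that intersect exactly at (0, 0, 1):
p = triangulate_midpoint(np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 1.0]),
                         np.array([1.0, 0.0, 0.0]), np.array([-1.0, 0.0, 1.0]))
print(np.allclose(p, [0.0, 0.0, 1.0]))  # -> True
```

    The length of the shortest segment (the distance between the two closest points) is itself a useful diagnostic: in an optimally configured system it stays small, and large values flag calibration or correspondence errors.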

  9. Antioxidant activity in barley (Hordeum Vulgare L.) grains roasted in a microwave oven under conditions optimized using response surface methodology.

    PubMed

    Omwamba, Mary; Hu, Qiuhui

    2010-01-01

    Microwave processing and cooking of foods is a recent development that is gaining momentum in household as well as large-scale food applications. Barley contains phenol compounds which possess antioxidant activity. In this study, the microwave oven roasting conditions were optimized to obtain grains with high antioxidant activity, measured as the ability to scavenge the 1,1-diphenyl-2-picrylhydrazyl (DPPH) free radical. Antioxidant activity of grains roasted under optimum conditions was assessed based on DPPH radical scavenging activity, reducing power and inhibition of oxidation in a linoleic acid system. The optimum condition for obtaining roasted barley with high antioxidant activity (90.5% DPPH inhibition) was found to be 600 W microwave power, 8.5 min roasting time, and 61.5 g or 2 layers of grains. The roasting factors influenced antioxidant activity both individually and interactively. Statistical analysis showed that the model was significant (P < 0.0001). The acetone extract had significantly higher inhibition of lipid peroxidation and DPPH radical scavenging activity than the aqueous extract and alpha-tocopherol. The reducing power of acetone extracts was not significantly different from alpha-tocopherol. The acetone extract had twice the phenol content of the aqueous extract, indicating its high extraction efficiency. GC-MS analysis revealed the presence of phenol acids, amino phenols, and quinones. The aqueous extract did not contain 3,4-dihydroxybenzaldehyde and 4-hydroxycinnamic acid, which are phenol compounds reported to contribute to antioxidant activity in barley grain.
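    For reference, DPPH scavenging figures such as the 90.5% inhibition quoted above follow from the standard absorbance-based formula; the absorbance values below are illustrative, not data from the study.

```python
# Standard DPPH radical-scavenging calculation (absorbance read at ~517 nm).
def dpph_inhibition(a_control, a_sample):
    """Percent inhibition relative to the DPPH control solution."""
    return (a_control - a_sample) / a_control * 100.0

# Illustrative absorbances giving ~90.5 % inhibition:
pct = dpph_inhibition(a_control=1.052, a_sample=0.100)
```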

  10. Measuring sporadic gastrointestinal illness associated with drinking water - an overview of methodologies.

    PubMed

    Bylund, John; Toljander, Jonas; Lysén, Maria; Rasti, Niloofar; Engqvist, Jannes; Simonsson, Magnus

    2017-06-01

    There is an increasing awareness that drinking water contributes to sporadic gastrointestinal illness (GI) in high-income countries of the northern hemisphere. A literature search was conducted in order to review: (1) methods used for investigating the effects of public drinking water on GI; (2) evidence of a possible dose-response relationship between sporadic GI and drinking water consumption; and (3) associations between sporadic GI and factors affecting drinking water quality. Seventy-four articles were selected, and key findings and information gaps were identified. In-home intervention studies have only been conducted in areas using surface water sources, and intervention studies in communities supplied by ground water are therefore needed. Community-wide intervention studies may constitute a cost-effective alternative to in-home intervention studies. Proxy data that correlate with GI in the community can be used for detecting changes in the incidence of GI. Proxy data cannot, however, be used for measuring the prevalence of illness. Local conditions affecting water safety may vary greatly, making direct comparisons between studies difficult unless sufficient knowledge about these conditions is acquired. Drinking water in high-income countries contributes to endemic levels of GI, and there are public health benefits to further improvements of drinking water safety.

  11. Use of Response Surface Methodology to Optimize Culture Conditions for Hydrogen Production by an Anaerobic Bacterial Strain from Soluble Starch

    NASA Astrophysics Data System (ADS)

    Kieu, Hoa Thi Quynh; Nguyen, Yen Thi; Dang, Yen Thi; Nguyen, Binh Thanh

    2016-05-01

    Biohydrogen is a clean source of energy that produces no harmful byproducts during combustion, being a potential sustainable energy carrier for the future. Therefore, biohydrogen produced by anaerobic bacteria via dark fermentation has attracted attention worldwide as a renewable energy source. However, the hydrogen production capability of these bacteria depends on major factors such as substrate, iron-containing hydrogenase, reduction agent, pH, and temperature. In this study, the response surface methodology (RSM) with central composite design (CCD) was employed to improve the hydrogen production by an anaerobic bacterial strain isolated from animal waste in Phu Linh, Soc Son, Vietnam (PL strain). The hydrogen production process was investigated as a function of three critical factors: soluble starch concentration (8 g L-1 to 12 g L-1), ferrous iron concentration (100 mg L-1 to 200 mg L-1), and l-cysteine concentration (300 mg L-1 to 500 mg L-1). RSM analysis showed that all three factors significantly influenced hydrogen production. Among them, the ferrous iron concentration presented the greatest influence. The optimum hydrogen concentration of 1030 mL L-1 medium was obtained with 10 g L-1 soluble starch, 150 mg L-1 ferrous iron, and 400 mg L-1 l-cysteine after 48 h of anaerobic fermentation. The hydrogen concentration produced by the PL strain was doubled after using RSM. The obtained results indicate that RSM with CCD can be used as a technique to optimize culture conditions for enhancement of hydrogen production by the selected anaerobic bacterial strain. Hydrogen production from low-cost organic substrates such as soluble starch using anaerobic fermentation methods may be one of the most promising approaches.
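    A minimal numerical sketch of the RSM/CCD workflow described above: build a coded three-factor central composite design, fit a full second-order model by least squares, and solve for the stationary point. The synthetic response is constructed to peak at the coded center point (corresponding to the paper's reported optimum of 10 g/L starch, 150 mg/L ferrous iron, 400 mg/L L-cysteine); it is not the study's data.

```python
# Response surface methodology sketch: CCD design + quadratic fit with numpy.
import itertools
import numpy as np

def quadratic_terms(x):
    x1, x2, x3 = x
    return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1*x1, x2*x2, x3*x3]

# Coded CCD points: 2^3 factorial cube, axial points (alpha = 1.682), 6 center runs
points = [list(p) for p in itertools.product([-1, 1], repeat=3)]
for i in range(3):
    for a in (-1.682, 1.682):
        p = [0.0, 0.0, 0.0]
        p[i] = a
        points.append(p)
points += [[0.0, 0.0, 0.0] for _ in range(6)]
X = np.array([quadratic_terms(p) for p in points])

# Synthetic hydrogen yield (mL/L medium) peaking at the coded center point
y = np.array([1030.0 - 50*p[0]**2 - 80*p[1]**2 - 30*p[2]**2 for p in points])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface: solve grad(y_hat) = 0
A = np.array([[2*beta[7], beta[4],   beta[5]],
              [beta[4],   2*beta[8], beta[6]],
              [beta[5],   beta[6],   2*beta[9]]])
opt_coded = np.linalg.solve(A, -beta[1:4])  # optimum in coded units
```

    Decoding `opt_coded` back to physical units (g/L, mg/L) is a linear rescaling by each factor's center and step size.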

  12. Ciona intestinalis as an emerging model organism: its regeneration under controlled conditions and methodology for egg dechorionation.

    PubMed

    Liu, Li-ping; Xiang, Jian-hai; Dong, Bo; Natarajan, Pavanasam; Yu, Kui-jie; Cai, Nan-er

    2006-06-01

    The ascidian Ciona intestinalis is a model organism of developmental and evolutionary biology and may provide crucial clues concerning two fundamental matters, namely, how chordates originated from the putative deuterostome ancestor and how advanced chordates originated from the simplest chordates. In this paper, a whole-life-span culture of C. intestinalis was conducted. Fed with a diet combination of dry Spirulina, egg yolk, Dicrateria sp., edible yeast and weaning diet for shrimp, C. intestinalis grew to an average of 59 mm and matured after 60 d of cultivation. This culture process could be repeated using the artificially cultured mature ascidians as material. When the fertilized eggs were maintained at 10, 15, 20 and 25 degrees C, they hatched within 30 h, 22 h, 16 h and 12 h 50 min, respectively, passing through cleavage, blastulation, gastrulation, neurulation, tailbud stage and tadpole stage. The tadpole larvae were characterized as typical but simplified chordates because of their dorsal nerve cord, notochord and primordial brain. After 8-24 h of free swimming, the tadpole larvae settled on the substrates and metamorphosed within 1-2 d into filter-feeding sessile juvenile ascidians. In addition, unfertilized eggs were successfully dechorionated within 40 min in filtered seawater containing 1% trypsin and 0.25% EDTA at pH 10.5. After fertilization, the dechorionated eggs developed well and hatched at a normal hatching rate. In conclusion, this paper presents a feasible methodology for rearing the tadpole larvae of C. intestinalis to sexual maturity under controlled conditions, together with detailed observations on the embryogenesis of the laboratory-cultured ascidians, which will facilitate developmental and genetic research using this model system.

  13. A methodological evaluation of volumetric measurement techniques including three-dimensional imaging in breast surgery.

    PubMed

    Hoeffelin, H; Jacquemin, D; Defaweux, V; Nizet, J L

    2014-01-01

    Breast surgery currently remains very subjective and each intervention depends on the ability and experience of the operator. To date, no objective measurement of this anatomical region can codify surgery. In this light, we wanted to compare and validate a new technique for 3D scanning (LifeViz 3D) and its clinical application. We tested the use of the 3D LifeViz system (Quantificare) to perform volumetric calculations in various settings (in situ in cadaveric dissection, of control prostheses, and in clinical patients) and we compared this system to other techniques (CT scanning and Archimedes' principle) under the same conditions. We were able to identify the benefits (feasibility, safety, portability, and low patient stress) and limitations (underestimation of the in situ volume, subjectivity of contouring, and patient selection) of the LifeViz 3D system, concluding that the results are comparable with other measurement techniques. The prospects of this technology seem promising in numerous applications in clinical practice to limit the subjectivity of breast surgery.

  14. A Methodological Evaluation of Volumetric Measurement Techniques including Three-Dimensional Imaging in Breast Surgery

    PubMed Central

    Hoeffelin, H.; Jacquemin, D.; Defaweux, V.; Nizet, J L.

    2014-01-01

    Breast surgery currently remains very subjective and each intervention depends on the ability and experience of the operator. To date, no objective measurement of this anatomical region can codify surgery. In this light, we wanted to compare and validate a new technique for 3D scanning (LifeViz 3D) and its clinical application. We tested the use of the 3D LifeViz system (Quantificare) to perform volumetric calculations in various settings (in situ in cadaveric dissection, of control prostheses, and in clinical patients) and we compared this system to other techniques (CT scanning and Archimedes' principle) under the same conditions. We were able to identify the benefits (feasibility, safety, portability, and low patient stress) and limitations (underestimation of the in situ volume, subjectivity of contouring, and patient selection) of the LifeViz 3D system, concluding that the results are comparable with other measurement techniques. The prospects of this technology seem promising in numerous applications in clinical practice to limit the subjectivity of breast surgery. PMID:24511536

  15. What Do We Measure? Methodological versus Institutional Validity in Student Surveys

    ERIC Educational Resources Information Center

    Johnson, Jeffrey Alan

    2011-01-01

    This paper examines the tension in the process of designing student surveys between the methodological requirements of good survey design and the institutional needs for survey data. Building on the commonly used argumentative approach to construct validity, I build an interpretive argument for student opinion surveys that allows assessment of the…

  16. The Measure of a Nation: The USDA and the Rise of Survey Methodology

    ERIC Educational Resources Information Center

    Mahoney, Kevin T.; Baker, David B.

    2007-01-01

    Survey research has played a major role in American social science. An outgrowth of efforts by the United States Department of Agriculture in the 1930s, the Division of Program Surveys (DPS) played an important role in the development of survey methodology. The DPS was headed by the ambitious and entrepreneurial Rensis Likert, populated by young…

  17. Measuring Team Shared Understanding Using the Analysis-Constructed Shared Mental Model Methodology

    ERIC Educational Resources Information Center

    Johnson, Tristan E.; O'Connor, Debra L.

    2008-01-01

    Teams are an essential part of successful performance in learning and work environments. Analysis-constructed shared mental model (ACSMM) methodology is a set of techniques where individual mental models are elicited and sharedness is determined not by the individuals who provided their mental models but by an analytical procedure. This method…

  18. Measuring Sense of Community: A Methodological Interpretation of the Factor Structure Debate

    ERIC Educational Resources Information Center

    Peterson, N. Andrew; Speer, Paul W.; Hughey, Joseph

    2006-01-01

    Instability in the factor structure of the Sense of Community Index (SCI) was tested as a methodological artifact. Confirmatory factor analyses, tested with two data sets, supported neither the proposed one-factor nor the four-factor (needs fulfillment, group membership, influence, and emotional connection) SCI. Results demonstrated that the SCI…

  20. Defining and Measuring Engagement and Learning in Science: Conceptual, Theoretical, Methodological, and Analytical Issues

    ERIC Educational Resources Information Center

    Azevedo, Roger

    2015-01-01

    Engagement is one of the most widely misused and overgeneralized constructs found in the educational, learning, instructional, and psychological sciences. The articles in this special issue represent a wide range of traditions and highlight several key conceptual, theoretical, methodological, and analytical issues related to defining and measuring…

  1. The Continued Salience of Methodological Issues for Measuring Psychiatric Disorders in International Surveys

    ERIC Educational Resources Information Center

    Tausig, Mark; Subedi, Janardan; Broughton, Christopher; Pokimica, Jelena; Huang, Yinmei; Santangelo, Susan L.

    2011-01-01

    We investigated the extent to which methodological concerns explicitly addressed by the designers of the World Mental Health Surveys persist in the results that were obtained using the WMH-CIDI instrument. We compared rates of endorsement of mental illness symptoms in the United States (very high) and Nepal (very low) as they were affected by…

  4. Methodological aspects to be considered when measuring the approximate number system (ANS) – a research review

    PubMed Central

    Dietrich, Julia F.; Huber, Stefan; Nuerk, Hans-Christoph

    2015-01-01

    According to a dominant view, the approximate number system (ANS) is the foundation of symbolic math abilities. Due to the importance of math abilities for education and career, a lot of research focuses on the investigation of the ANS and its relationship with math performance. However, the results are inconsistent. This might be caused by studies differing greatly regarding the operationalization of the ANS (i.e., tasks, dependent variables). Moreover, many methodological aspects vary from one study to the next. In the present review, we discuss commonly used ANS tasks and dependent variables regarding their theoretical foundation and psychometric features. We argue that the inconsistent findings concerning the relationship between ANS acuity and math performance may be partially explained by differences in reliability. Furthermore, this review summarizes methodological aspects of ANS tasks having important impacts on the results, including stimulus range, visual controls, presentation duration of the stimuli and feedback. Based on this review, we give methodological recommendations on how to assess the ANS most reliably and most validly. All important methodological aspects to be considered when designing an ANS task or comparing results of different studies are summarized in two practical checklists. PMID:25852612

  6. A methodology for successfully producing global translations of patient reported outcome measures for use in multiple countries.

    PubMed

    Two, Rebecca; Verjee-Lorenz, Aneesa; Clayson, Darren; Dalal, Mehul; Grotzinger, Kelly; Younossi, Zobair M

    2010-01-01

    The production of accurate and culturally relevant translations of patient reported outcome (PRO) measures is essential for the success of international clinical trials. Although there are many published reports regarding the translation of PRO measures, the techniques used to produce single translations for use in multiple countries (global translations) are not well documented. This article addresses this apparent lack of documentation and presents the methodology used to create global translations of the Chronic Liver Disease Questionnaire-Hepatitis C Virus (CLDQ-HCV). The challenges of creating a translation for use in multiple countries are discussed, and the criteria for a global translation project are explained. The methodology comprised a thorough translation and linguistic validation process: concept elaboration, multiple forward translations, two back translations, reviews by in-country clinicians and the instrument developer, pilot testing in each target country, and multiple rounds of proofreading. The key concept of the global translation methodology is consistent international harmonization, achieved through the involvement of linguists from each target country at every stage of the process. This methodology enabled the successful resolution of the translation issues encountered, and resulted in consistent translations of the CLDQ-HCV that were linguistically and culturally appropriate for all target countries.

  7. Ultrasonic measurements at in-situ conditions in a geothermal field: Ngatamariki field, New Zealand.

    NASA Astrophysics Data System (ADS)

    Durán, E.; Adam, L.; Wallis, I. C.

    2016-12-01

    A set of volcaniclastic and pyroclastic rocks was collected from the Ngatamariki Geothermal Field. Two sets of measurements were carried out on core samples from geological intervals used for injection. The first set of measurements was made at surface conditions using ultrasonic transducers. The second set was made inside a pressure vessel, simulating the in-situ confining and fluid pressures of the field. A comparison of both approaches is made in order to validate existing data and expand the geophysical information collected in the field. Previous work on these rocks has shown large variation in their physical and mechanical properties with depth, which might indicate that lithology and hydrothermal alteration are controlling factors in the observed variability; the effect of fluid pressure, however, has never been studied in these rocks. Both datasets have been used to improve the identification and interpretation of P- and S-wave arrivals and to understand their variation with pressure and fluid content. Previous laboratory results on mineralogy, clay content, porosity, permeability, and crack density and orientation are incorporated into the analysis. Finally, a methodology is presented to aid in the calibration and interpretation of S-wave arrivals for the transducers built to perform the experiments at in-situ conditions. Since the compressional and shear piezoelectric crystals used are packed in a single casing, converted waves must be identified on top of the direct arrivals. By comparing the source signature of the measurements performed on the bench to the waveforms recorded at field conditions, we aid the visual interpretation of picked times by adapting a Dynamic Time Warping algorithm for the task.
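    The Dynamic Time Warping algorithm mentioned above can be sketched compactly; this is the generic textbook formulation, not the authors' adaptation for S-wave picking.

```python
# Classic O(n*m) dynamic time warping with absolute-difference local cost.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# A time-shifted copy of a pulse aligns with low DTW cost:
t = np.linspace(0.0, 1.0, 100)
pulse = np.exp(-((t - 0.3) / 0.05) ** 2)
shifted = np.exp(-((t - 0.5) / 0.05) ** 2)
d_warp = dtw_distance(pulse, shifted)
d_euclid = float(np.abs(pulse - shifted).sum())
```

    The warped cost `d_warp` stays small for the time-shifted pulse while the sample-by-sample difference `d_euclid` is large, which is why warping helps compare bench source signatures with waveforms recorded at field conditions.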

  8. A methodological frame for assessing benzene induced leukemia risk mitigation due to policy measures.

    PubMed

    Karakitsios, Spyros P; Sarigiannis, Dimosthenis Α; Gotti, Alberto; Kassomenos, Pavlos A; Pilidis, Georgios A

    2013-01-15

    The study relies on the development of a methodology for assessing the determinants that comprise the overall leukemia risk due to benzene exposure and how these are affected by outdoor and indoor air quality regulation. An integrated modeling environment was constructed comprising traffic emissions, dispersion models, human exposure models and a coupled internal dose/biology-based dose-response risk assessment model, in order to assess the benzene-imposed leukemia risk, as well as the impact of traffic fleet renewal and smoking bans on these levels. Regarding traffic fleet renewal, several "what if" scenarios were tested. The detailed full-chain methodology was applied in a South-Eastern European urban setting in Greece and a limited version of the methodology in Helsinki. The non-smoking population runs an average risk equal to 4.1·10(-5) compared to 23.4·10(-5) for smokers. The estimated lifetime risk for the examined occupational groups was higher than the one estimated for the general public by 10-20%. Active smoking constitutes a dominant parameter for benzene-attributable leukemia risk, much stronger than any related activity, occupational or not. From the assessment of mitigation policies it was found that the associated leukemia risk in the optimum traffic fleet scenario could be reduced by up to 85% for non-smokers and up to 8% for smokers. In contrast, smoking bans provided smaller gains (7% for non-smokers, 1% for smokers), while for Helsinki, smoking policies were found to be more efficient than traffic fleet renewal. The proposed methodology provides a general framework for assessing aggregated exposure and the consequent leukemia risk from benzene (incorporating mechanistic data), capturing exposure and internal dosimetry dynamics and translating changes in exposure determinants into actual changes in population risk, thereby providing a valuable tool for risk management evaluation and, consequently, policy support.
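    The population-level figures above imply a simple prevalence-weighted aggregation. The sketch below combines the quoted smoker and non-smoker risks; the 30% smoking prevalence is an assumed illustrative value, not a number from the study.

```python
# Prevalence-weighted population-average risk from subgroup risks.
def population_risk(risk_nonsmoker, risk_smoker, smoking_prevalence):
    return (1.0 - smoking_prevalence) * risk_nonsmoker + smoking_prevalence * risk_smoker

# Risks quoted in the abstract; 30 % smoking prevalence is illustrative.
avg = population_risk(4.1e-5, 23.4e-5, 0.30)
```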

  9. A new methodology of determining directional albedo models from Nimbus 7 ERB scanning radiometer measurements

    NASA Technical Reports Server (NTRS)

    House, Frederick B.

    1986-01-01

    The Nimbus 7 Earth Radiation Budget (ERB) data set is reviewed to examine its strong and weak points. In view of the timing of this report relative to the processing schedule of Nimbus 7 ERB observations, emphasis is placed on the methodology of interpreting the scanning radiometer data to develop directional albedo models. These findings enhance the value of the Nimbus 7 ERB data set and can be applied to the interpretation of both the scanning and nonscanning radiometric observations.

  10. Martian dust threshold measurements: Simulations under heated surface conditions

    NASA Technical Reports Server (NTRS)

    White, Bruce R.; Greeley, Ronald; Leach, Rodman N.

    1991-01-01

    Diurnal changes in solar radiation on Mars set up a cycle of cooling and heating of the planetary boundary layer; this effect strongly influences the wind field. The stratification of the air layer is stable in early morning since the ground is cooler than the air above it. When the ground is heated and becomes warmer than the air, its heat is transferred to the air above it. The heated parcels of air near the surface will, in effect, increase the near-surface wind speed, increasing the aeolian stress the wind exerts on the surface compared to an unheated or cooled surface. This means that for the same wind speed at a fixed height above the surface, ground-level shear stress will be greater for a heated surface than an unheated one. Thus, it is possible to reach saltation threshold conditions at lower mean wind speeds when the surface is heated. Even though the mean wind speed is less when the surface is heated, the surface shear stress required to initiate particle movement remains the same in both cases. To investigate this phenomenon, low-density surface dust aeolian threshold measurements have been made in the MARSWIT wind tunnel located at NASA Ames Research Center, Moffett Field, California. The first series of tests examined threshold values of the 100 micron sand material. At 13 mb surface pressure the unheated surface had a threshold friction speed of 2.93 m/s (approximately corresponding to a velocity of 41.4 m/s at a height of 1 meter), while the heated surface, with an equivalent bulk Richardson number of -0.02, yielded a threshold friction speed of 2.67 m/s (approximately corresponding to a velocity of 38.0 m/s at a height of 1 meter). This change represents an 8.8 percent decrease in threshold conditions for the heated case. The values of velocities are well within the threshold range observed by Arvidson et al., 1983. As the surface was heated the threshold decreased. At a value of bulk Richardson number equal to -0.02 the threshold
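    The quoted reduction can be checked directly from the two threshold friction speeds reported above.

```python
# Relative decrease in threshold friction speed for the heated surface.
def percent_decrease(unheated, heated):
    return (unheated - heated) / unheated * 100.0

# 2.93 m/s unheated vs 2.67 m/s heated (bulk Richardson number -0.02);
# evaluates to ~8.9 %, matching the reported ~8.8 % decrease to rounding.
drop = percent_decrease(2.93, 2.67)
```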

  11. Skin Absorption of Anions: Part One. Methodology for In Vitro Cutaneous Absorption Measurements.

    PubMed

    Paweloszek, Raphaël; Briançon, Stéphanie; Chevalier, Yves; Gilon-Delepine, Nicole; Pelletier, Jocelyne; Bolzinger, Marie-Alexandrine

    2016-07-01

    Measurement of skin absorption of ions requires specific experimental protocols regarding the use of pig skin as a model, the viability of excised skin in water medium over 24 h, the presence of endogenous ions, and evaluation of the contributions of facilitated transport through ion channels and ion transporters. Absorption experiments of halide anions F(-), Cl(-), Br(-) and I(-) in excised skin were performed in Franz diffusion cells. Experiments were performed on human and porcine skin under various conditions so as to define and validate experimental protocols. The distributions of endogenous ions and the absorption kinetics of halide ions were similar in both porcine and human skin models. Fresh skin kept its viability over 24 h in salt-free water, allowing experiments following OECD guidelines. Permeation increased in the order F(-) < Cl(-) < Br(-) < I(-) for all receptor media and skin samples. Absorption was larger in fresh skin due to the transport through chloride channels or exchangers. Skin absorption experiments of ions in Franz cells rely on working with fresh excised skin (human or porcine) and pure water as receptor fluid. Experiments with chloride blockers or frozen/thawed skin allow discriminating passive diffusion and facilitated transport.

  12. In vivo measurements of patellar tracking and finite helical axis using a static magnetic resonance based methodology.

    PubMed

    Yao, Jie; Yang, Bin; Niu, Wenxin; Zhou, Jianwei; Wang, Yuxing; Gong, He; Ma, Huasong; Tan, Rong; Fan, Yubo

    2014-12-01

    Patellofemoral (PF) maltracking is a critical factor predisposing to PF pain syndrome, yet many novel techniques for measuring patellar tracking remain research tools. This study aimed to develop a method to measure in vivo patellar tracking and the finite helical axis (FHA) using a static magnetic resonance (MR) based methodology. Geometrical models of the PF joint at 0°, 45°, 60°, 90°, and 120° of knee flexion were developed from MR images. The approximate patellar tracking was derived from the discrete PF models with a spline interpolation algorithm and was validated against previous in vitro and in vivo experiments. The patellar FHA throughout knee flexion was then calculated. In the present case, the FHA drew an "L-shaped" curve in the sagittal section. This methodology could advance the examination of PF kinematics in clinics and may also provide preliminary knowledge for patellar FHA studies.
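    The finite helical axis itself is the standard screw-axis decomposition of a rigid-body displacement; the sketch below extracts it from a rotation matrix and translation vector. This is the textbook formulation, not the authors' implementation.

```python
# Screw-axis (finite helical axis) decomposition of the motion x -> R @ x + t.
import numpy as np

def finite_helical_axis(R, t):
    """Return axis direction n, rotation angle (rad), translation d along the
    axis, and a point c on the axis."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    n = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    n = n / (2.0 * np.sin(angle))          # valid away from 0 and 180 degrees
    d = float(n @ t)                        # translation along the axis
    # Point on the axis: solve (I - R) c = t - d*n (rank-deficient -> lstsq)
    c, *_ = np.linalg.lstsq(np.eye(3) - R, t - d * n, rcond=None)
    return n, angle, d, c

# 90-degree rotation about a z-axis through (1, 0, 0), plus a 0.1 lift along z:
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
p = np.array([1.0, 0.0, 0.0])
t = p - Rz @ p + np.array([0.0, 0.0, 0.1])
n, angle, d, c = finite_helical_axis(Rz, t)
```

    Applied between successive MR-derived patellar poses, the same decomposition yields the FHA track through knee flexion.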

  13. Measuring the structure factor of simple fluids under extreme conditions

    NASA Astrophysics Data System (ADS)

    Weck, Gunnar

    2013-06-01

    The structure and dynamics of fluids, although a long-standing matter of investigation, are still far from being well established. In particular, with the existence of a first-order liquid-liquid phase transition (LLT) discovered in liquid phosphorus at 0.9 GPa and 1300 K, it is now recognized that the fluid state can present complex structural changes. At present, very few examples of LLTs have been clearly evidenced, which may mean that a larger range of densities must be probed. First-order transitions between a molecular and a polymeric liquid have recently been predicted by first-principles calculations in liquid nitrogen at 88 GPa and 2000 K and in liquid CO2 at 45 GPa and 1850 K. The only device capable of reaching these extreme conditions is the diamond anvil cell (DAC), in which the sample is sandwiched between two diamond anvils whose thickness is roughly 100 times that of the sample. Consequently, the diffracted signal from the sample is very weak compared to the Compton signal of the anvils, and becomes hardly measurable for pressures above ~20 GPa. A similar problem has been faced by the high-pressure community using large-volume presses, where the x-ray background from the sample environment must be drastically reduced. In the angle-dispersive diffraction configuration, it was proposed to use a multichannel collimator (MCC). This solution has been implemented to fit the constraints of the Paris-Edinburgh (PE) large-volume press and is now routinely used on beamline ID27 of the European Synchrotron Radiation Facility. In this contribution, we present our adaptation of the MCC device available at ID27 for DAC experiments. Because of the small sample volume, a careful alignment procedure between the MCC slits and the DAC had to be implemented. The data analysis procedure initially developed by Eggert et al. has also been extended in order to take into account the complex contribution of the MCC slits. A large reduction of the Compton scattering from the diamond anvils is obtained.

  14. Sources of particulate matter components in the Athabasca oil sands region: investigation through a comparison of trace element measurement methodologies

    NASA Astrophysics Data System (ADS)

    Phillips-Smith, Catherine; Jeong, Cheol-Heon; Healy, Robert M.; Dabek-Zlotorzynska, Ewa; Celo, Valbona; Brook, Jeffrey R.; Evans, Greg

    2017-08-01

    The province of Alberta, Canada, is home to three oil sands regions which, combined, contain the third largest deposit of oil in the world. Of these, the Athabasca oil sands region is the largest. As part of Environment and Climate Change Canada's program in support of the Joint Canada-Alberta Implementation Plan for Oil Sands Monitoring, concentrations of trace elements in PM2.5 (particulate matter smaller than 2.5 µm in diameter) were measured through two campaigns that involved different methodologies: a long-term filter campaign and a short-term intensive campaign. In the long-term campaign, 24 h filter samples were collected once every 6 days over a 2-year period (December 2010-November 2012) at three air monitoring stations in the Regional Municipality of Wood Buffalo. For the intensive campaign (August 2013), hourly measurements were made with an online instrument at one air monitoring station; daily filter samples were also collected. The hourly and 24 h filter data were analyzed individually using positive matrix factorization. Seven emission sources of PM2.5 trace elements were thereby identified: two types of upgrader emissions, soil, haul road dust, biomass burning, and two sources of mixed origin. The upgrader emissions, soil, and haul road dust sources were identified by both methodologies; each methodology also resolved a mixed source, but these two mixed sources exhibited more differences than similarities. The second upgrader emissions source and the biomass burning source were resolved only by the hourly and filter methodologies, respectively. The similarity of the receptor modeling results from the two methodologies provided reassurance as to the identity of the sources. Overall, much of the PM2.5-related trace elements were found to be anthropogenic, or at least to be aerosolized through anthropogenic activities. These emissions may in part explain the previously reported higher levels of trace elements in snow, water, and biota samples collected
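    Positive matrix factorization approximates the concentration matrix X (samples × species) as a product of non-negative source contributions and source profiles. The sketch below uses plain Lee-Seung multiplicative updates on synthetic data as a simplified stand-in for the EPA PMF model used in studies like this one; the profiles are invented for illustration.

```python
# Minimal non-negative factorization sketch (simplified stand-in for PMF).
import numpy as np

def nmf_mu(X, k, iters=500, seed=0):
    """Lee-Seung multiplicative updates minimizing the Frobenius error."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.uniform(0.1, 1.0, (n, k))   # source contributions per sample
    F = rng.uniform(0.1, 1.0, (k, m))   # source profiles (species signatures)
    eps = 1e-12
    for _ in range(iters):
        F *= (G.T @ X) / (G.T @ G @ F + eps)
        G *= (X @ F.T) / (G @ F @ F.T + eps)
    return G, F

rng = np.random.default_rng(1)
# Two synthetic "sources" with distinct trace-element profiles (illustrative):
profiles = np.array([[5.0, 1.0, 0.1, 0.0],    # e.g. a crustal/dust-like profile
                     [0.1, 0.2, 4.0, 2.0]])   # e.g. an upgrader-like profile
contributions = rng.uniform(0.0, 1.0, (200, 2))
X = contributions @ profiles + rng.uniform(0.0, 0.01, (200, 4))

G, F = nmf_mu(X, k=2)
recon_error = float(np.linalg.norm(X - G @ F) / np.linalg.norm(X))
```

    Real PMF additionally weights each entry by its measurement uncertainty, which is central to how trace-element data with detection limits are handled.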

  15. A supervised vibration-based statistical methodology for damage detection under varying environmental conditions & its laboratory assessment with a scale wind turbine blade

    NASA Astrophysics Data System (ADS)

    Gómez González, A.; Fassois, S. D.

    2016-03-01

    The problem of vibration-based damage detection under varying environmental conditions and uncertainty is considered, and a novel, supervised, PCA-type statistical methodology is postulated. The methodology employs vibration data records from the healthy and damaged states of a structure under various environmental conditions. Unlike standard PCA-type methods in which a feature vector corresponding to the least important eigenvalues is formed in a single step, the postulated methodology uses supervised learning in which damaged-state data records are employed to sequentially form a feature vector by appending a transformed scalar element at a time under the condition that it optimally, among all remaining elements, improves damage detectability. This leads to the formulation of feature vectors with optimized sensitivity to damage, and thus high damage detectability. Within this methodology three particular methods, two non-parametric and one parametric, are formulated. These are validated and comparatively assessed via a laboratory case study focusing on damage detection on a scale wind turbine blade under varying temperature and the potential presence of sprayed water. Damage detection performance is shown to be excellent based on a single vibration response sensor and a limited frequency bandwidth.
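
    A loose sketch of the supervised selection idea described above, under assumed details (a Mahalanobis-type separability score and greedy appending); this is not the authors' exact algorithm:

```python
# Sketch: after a PCA-type transform learned on healthy data, greedily append
# one transformed feature at a time, keeping at each step the one that most
# improves healthy/damaged separability. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, size=(300, 8))          # healthy-state features
damaged = healthy[:50] + rng.normal(0.0, 0.2, size=(50, 8))
damaged[:, 5] += 1.5                                   # damage shifts one direction

mean = healthy.mean(axis=0)                            # PCA from healthy data only
_, _, Vt = np.linalg.svd(healthy - mean, full_matrices=False)
h_scores = (healthy - mean) @ Vt.T
d_scores = (damaged - mean) @ Vt.T

def separability(idx):
    """Mahalanobis-style distance of the damaged mean in the selected subspace."""
    h, d = h_scores[:, idx], d_scores[:, idx]
    cov = np.cov(h, rowvar=False) + 1e-9 * np.eye(len(idx))
    diff = d.mean(axis=0) - h.mean(axis=0)
    return float(diff @ np.linalg.solve(cov, diff))

selected, remaining = [], list(range(8))
for _ in range(3):                                     # build a 3-element feature vector
    best = max(remaining, key=lambda j: separability(selected + [j]))
    selected.append(best)
    remaining.remove(best)

print("selected transformed features:", selected)
```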

  16. Surface pressure characteristics of a highly loaded turbine blade at design and off-design conditions; a CFD methodology

    NASA Astrophysics Data System (ADS)

    Vakilipour, S.; Habibnia, M.; Sabour, M. H.; Riazi, R.; Mohammadi, M.

    2017-05-01

    The flow field passing through a highly loaded low pressure (LP) turbine cascade is numerically investigated at design and off-design conditions. The Open-source Field Operation And Manipulation (OpenFOAM) platform is used as the Computational Fluid Dynamics (CFD) tool. In this regard, the influences of grid resolution on the results of k-ε, k-ω, and large-eddy simulation (LES) turbulence models are investigated and compared with those of experimental measurements. A numerical pressure undershoot appears near the end of the blade pressure surface; it is sensitive to grid resolution and flow turbulence modeling. The LES model is able to resolve separation on both coarse and fine grid resolutions. In addition, the off-design flow condition is modeled by negative and positive inflow incidence angles. The numerical experiments show that a separation bubble generated on the blade pressure side is predicted by LES. The total pressure drop has also been calculated at incidence angles between -20° and +8°. The minimum total pressure drop is obtained by k-ω and LES at the design point.

  17. Methodological challenges in measuring vaccine effectiveness using population cohorts in low resource settings.

    PubMed

    King, C; Beard, J; Crampin, A C; Costello, A; Mwansambo, C; Cunliffe, N A; Heyderman, R S; French, N; Bar-Zeev, N

    2015-09-11

    Post-licensure real world evaluation of vaccine implementation is important for establishing evidence of vaccine effectiveness (VE) and programme impact, including indirect effects. Large cohort studies offer an important epidemiological approach for evaluating VE, but have inherent methodological challenges. Since March 2012, we have conducted an open prospective cohort study in two sites in rural Malawi to evaluate the post-introduction effectiveness of 13-valent pneumococcal conjugate vaccine (PCV13) against all-cause post-neonatal infant mortality and monovalent rotavirus vaccine (RV1) against diarrhoea-related post-neonatal infant mortality. Our study sites cover a population of 500,000, with a baseline post-neonatal infant mortality of 25 per 1000 live births. We conducted a methodological review of cohort studies for vaccine effectiveness in a developing country setting, applied to our study context. Based on published literature, we outline key considerations when defining the denominator (study population), exposure (vaccination status) and outcome ascertainment (mortality and cause of death) of such studies. We assess various definitions in these three domains, in terms of their impact on power, effect size and potential biases and their direction, using our cohort study for illustration. Based on this iterative process, we discuss the pros and cons of our final per-protocol analysis plan. Since no single set of definitions or analytical approach accounts for all possible biases, we propose sensitivity analyses to interrogate our assumptions and methodological decisions. In the poorest regions of the world, where routine vital birth and death surveillance is frequently unavailable and the burden of disease and death is greatest, we conclude that, provided the balance between definitions and their overall assumed impact on estimated VE is acknowledged, such large scale real-world cohort studies can provide crucial information to policymakers by providing
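
    The core VE estimate in such cohort analyses is commonly computed as one minus an incidence rate ratio. A toy calculation with hypothetical numbers (not study data) illustrates the arithmetic:

```python
# Sketch: per-protocol vaccine effectiveness from person-time incidence rates,
# VE = (1 - IRR) * 100, with a normal-approximation CI on log(IRR).
# All counts below are hypothetical.
import math

deaths_vacc, pt_vacc = 40, 10_000        # deaths, person-years (vaccinated)
deaths_unvacc, pt_unvacc = 90, 9_000     # deaths, person-years (unvaccinated)

rate_v = deaths_vacc / pt_vacc
rate_u = deaths_unvacc / pt_unvacc
irr = rate_v / rate_u                    # incidence rate ratio
ve = (1 - irr) * 100

# 95% CI: se of log(IRR) is sqrt(1/a + 1/b) for event counts a, b
se = math.sqrt(1 / deaths_vacc + 1 / deaths_unvacc)
irr_hi = math.exp(math.log(irr) + 1.96 * se)
irr_lo = math.exp(math.log(irr) - 1.96 * se)
ve_lo, ve_hi = 100 * (1 - irr_hi), 100 * (1 - irr_lo)
print(f"VE = {ve:.1f}% (95% CI {ve_lo:.1f}% to {ve_hi:.1f}%)")
```

    The definitional choices discussed in the paper (who counts in the denominator, how vaccination status and cause of death are ascertained) enter exactly through these counts and person-time totals, which is why they move the estimated VE.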

  18. Methodological challenges in measuring vaccine effectiveness using population cohorts in low resource settings

    PubMed Central

    King, C.; Beard, J.; Crampin, A.C.; Costello, A.; Mwansambo, C.; Cunliffe, N.A.; Heyderman, R.S.; French, N.; Bar-Zeev, N.

    2015-01-01

    Post-licensure real world evaluation of vaccine implementation is important for establishing evidence of vaccine effectiveness (VE) and programme impact, including indirect effects. Large cohort studies offer an important epidemiological approach for evaluating VE, but have inherent methodological challenges. Since March 2012, we have conducted an open prospective cohort study in two sites in rural Malawi to evaluate the post-introduction effectiveness of 13-valent pneumococcal conjugate vaccine (PCV13) against all-cause post-neonatal infant mortality and monovalent rotavirus vaccine (RV1) against diarrhoea-related post-neonatal infant mortality. Our study sites cover a population of 500,000, with a baseline post-neonatal infant mortality of 25 per 1000 live births. We conducted a methodological review of cohort studies for vaccine effectiveness in a developing country setting, applied to our study context. Based on published literature, we outline key considerations when defining the denominator (study population), exposure (vaccination status) and outcome ascertainment (mortality and cause of death) of such studies. We assess various definitions in these three domains, in terms of their impact on power, effect size and potential biases and their direction, using our cohort study for illustration. Based on this iterative process, we discuss the pros and cons of our final per-protocol analysis plan. Since no single set of definitions or analytical approach accounts for all possible biases, we propose sensitivity analyses to interrogate our assumptions and methodological decisions. In the poorest regions of the world, where routine vital birth and death surveillance is frequently unavailable and the burden of disease and death is greatest, we conclude that, provided the balance between definitions and their overall assumed impact on estimated VE is acknowledged, such large scale real-world cohort studies can provide crucial information to policymakers by providing

  19. Neurolinguistic measures of typological effects in multilingual transfer: introducing an ERP methodology.

    PubMed

    Rothman, Jason; Alemán Bañón, José; González Alonso, Jorge

    2015-01-01

    This article has two main objectives. First, we offer an introduction to the subfield of generative third language (L3) acquisition. Concerned primarily with modeling initial stages transfer of morphosyntax, one goal of this program is to show how initial stages L3 data make significant contributions toward a better understanding of how the mind represents language and how (cognitive) economy constrains acquisition processes more generally. Our second objective is to argue for and demonstrate how this subfield will benefit from a neuro/psycholinguistic methodological approach, such as event-related potential experiments, to complement the claims currently made on the basis of exclusively behavioral experiments.

  20. Neurolinguistic measures of typological effects in multilingual transfer: introducing an ERP methodology

    PubMed Central

    Rothman, Jason; Alemán Bañón, José; González Alonso, Jorge

    2015-01-01

    This article has two main objectives. First, we offer an introduction to the subfield of generative third language (L3) acquisition. Concerned primarily with modeling initial stages transfer of morphosyntax, one goal of this program is to show how initial stages L3 data make significant contributions toward a better understanding of how the mind represents language and how (cognitive) economy constrains acquisition processes more generally. Our second objective is to argue for and demonstrate how this subfield will benefit from a neuro/psycholinguistic methodological approach, such as event-related potential experiments, to complement the claims currently made on the basis of exclusively behavioral experiments. PMID:26300800

  1. Measures assessing spirituality as more than religiosity: a methodological review of nursing and health-related literature.

    PubMed

    Sessanna, Loralee; Finnell, Deborah S; Underhill, Meghan; Chang, Yu-Ping; Peng, Hsi-Ling

    2011-08-01

    This paper is a report of a methodological review conducted to analyse, evaluate and synthesize the rigour of measures found in nursing and health-related literature used to assess and evaluate patient spirituality as more than religiosity. Holistic healthcare practitioners recognize important distinctions exist about what constitutes spiritual care needs and preferences and what constitutes religious care needs and preferences in patient care practice. Databases searched, limited to the years 1982 to 2009, included AMED, Alt Health Watch, CINAHL Plus with Full Text, EBSCO Host, EBSCO Host Religion and Philosophy, ERIC, Google Scholar, HAPI, HUBNET, IngentaConnect, Mental Measurements Yearbook Online, Ovid MEDLINE, Social Work Abstracts and Hill and Hood's Measures of Religiosity text. A methodological review was carried out. Measures assessing spirituality as more than religiosity were critically reviewed including quality appraisal, relevant data extraction and a narrative synthesis of findings. Ten measures fitting inclusion criteria were included in the review. Despite agreement among nursing and health-related disciplines that spirituality and religiosity are distinct and diverse concepts, the concept of spirituality was often used interchangeably with the concept of religion to assess and evaluate patient spirituality. The term spiritual or spirituality was used in a preponderance of items to assess or evaluate spirituality. Measures differentiating spirituality from religiosity are grossly lacking in nursing and health-related literature. © 2011 Blackwell Publishing Ltd.

  2. An approach for prediction of optimum reaction conditions for laccase-catalyzed bio-transformation of 1-naphthol by response surface methodology (RSM).

    PubMed

    Ceylan, Hasan; Kubilay, Senol; Aktas, Nahit; Sahiner, Nurettin

    2008-04-01

    Response surface methodology (RSM) was successfully applied to enzymatic bio-transformation of 1-naphthol. The experiments were conducted in a closed system containing acetone and sodium acetate buffer, with laccase enzyme. Laccase enzyme used as catalyst was derived from Trametes versicolor (ATCC 200801). The enzymatic bio-transformation rate of 1-naphthol, based on measurements of initial dissolved oxygen (DO) consumption rate in the closed system, was optimized by the application of RSM. The independent variables, which had been found as the most effective variables on the initial DO consumption rate by screening experiments, were determined as medium temperature, pH and acetone content. A quadratic model was developed through RSM in terms of related independent variables to describe the DO consumption rate as the response. Based on contour plots and variance analysis, optimum operational conditions for maximizing the initial DO consumption rate, while keeping acetone content at its minimum value, were a temperature of 301 K, pH 6 and an acetone content of 7%, yielding an initial oxidation rate of 9.17 × 10⁻³ mM DO/min.
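
    The RSM step can be sketched as fitting a second-order polynomial to experimental responses and locating its stationary point. The surface and numbers below are synthetic stand-ins, not the paper's model:

```python
# Sketch: fit y = b0 + b1*t + b2*p + b3*t^2 + b4*p^2 + b5*t*p by least squares
# (t = centered temperature, p = centered pH), then solve grad = 0 for the
# optimum of the fitted quadratic. Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
T = rng.uniform(290, 310, 30)          # temperature, K
pH = rng.uniform(4.0, 8.0, 30)
# Hypothetical true surface with a maximum near T = 301 K, pH = 6
y = 9.0 - 0.02 * (T - 301) ** 2 - 0.5 * (pH - 6) ** 2 + rng.normal(0, 0.05, 30)

# Center predictors to keep the design matrix well conditioned
t, p = T - 300.0, pH - 6.0
X = np.column_stack([np.ones_like(t), t, p, t**2, p**2, t * p])
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point: gradient of the quadratic equals zero
A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
t_opt, p_opt = np.linalg.solve(A, -b[1:3])
print(f"fitted optimum: T = {t_opt + 300:.1f} K, pH = {p_opt + 6:.2f}")
```

    In practice the fitted coefficients are also screened by analysis of variance, and the optimum is cross-checked against contour plots, as the abstract describes.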

  3. 2010 Review on the Extension of the AMedP-8(C) Methodology to New Agents, Materials, and Conditions

    DTIC Science & Technology

    2010-12-01

    pneumonic plague, smallpox, and Venezuelan equine encephalitis (VEE)). D. Human Response Injury Profile (HRIP) Parameters The HRIP methodology...would be a significantly different response due to the use of antibiotics (as chemoprophylaxis) or immunizations, and a separate set of prophylaxis...vaccination protocols or existing vaccination research programs and for bacterial agents that respond to antibiotics . E. The 2009 Report and

  4. Quality of life assessment in children: a review of conceptual and methodological issues in multidimensional health status measures.

    PubMed Central

    Pal, D K

    1996-01-01

    STUDY OBJECTIVE: To clarify concepts and methodological problems in existing multidimensional health status measures for children. DESIGN: Thematic review of instruments found by computerised and manual searches, 1979-95. SUBJECTS: Nine health status instruments. MAIN RESULTS: Many instruments did not satisfy criteria of being child centered or family focussed; few had sufficient psychometric properties for research or clinical use; underlying conceptual assumptions were rarely explicit. CONCLUSIONS: Quality of life measures should be viewed cautiously. Interdisciplinary discussion is required, as well as discussion with children and parents, to establish constructs that are truly useful. PMID:8882220

  5. Measurements methodology for evaluation of Digital TV operation in VHF high-band

    NASA Astrophysics Data System (ADS)

    Pudwell Chaves de Almeida, M.; Vladimir Gonzalez Castellanos, P.; Alfredo Cal Braz, J.; Pereira David, R.; Saboia Lima de Souza, R.; Pereira da Soledade, A.; Rodrigues Nascimento Junior, J.; Ferreira Lima, F.

    2016-07-01

    This paper describes the experimental setup of field measurements carried out for evaluating the operation of the ISDB-TB (Integrated Services Digital Broadcasting, Terrestrial, Brazilian version) standard digital TV in the VHF high-band. Measurements were performed in urban and suburban areas in a medium-sized Brazilian city. Besides the direct measurements of received power and environmental noise, a measurement procedure involving the injection of additive Gaussian noise was employed to achieve the signal to noise ratio threshold at each measurement site. The analysis includes results of static reception measurements for evaluating the received field strength and the signal to noise ratio thresholds for correct signal decoding.

  6. Shape measurement by a multi-view methodology based on the remote tracking of a 3D optical scanner

    NASA Astrophysics Data System (ADS)

    Barone, Sandro; Paoli, Alessandro; Viviano Razionale, Armando

    2012-03-01

    Full field optical techniques can be reliably used for 3D measurements of complex shapes by multi-view processes, which require the computation of transformation parameters relating different views into a common reference system. Although several multi-view approaches have been proposed, the alignment process is still the crucial step of a shape reconstruction. In this paper, a methodology to automatically align 3D views has been developed by integrating a stereo vision system and a full field optical scanner. In particular, the stereo vision system is used to remotely track the optical scanner within a working volume. The tracking system uses stereo images to detect the 3D coordinates of retro-reflective infrared markers rigidly connected to the scanner. Stereo correspondences are established by a robust methodology based on combining the epipolar geometry with an image spatial transformation constraint. The proposed methodology has been validated by experimental tests regarding both the evaluation of the measurement accuracy and the 3D reconstruction of an industrial shape.

  7. Ethical and methodological issues in qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions: a critical review.

    PubMed

    Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika

    2017-01-01

    Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness.

  8. Experimental Measurements And Evaluation Of Indoor Microclimate Conditions

    NASA Astrophysics Data System (ADS)

    Kraliková, Ružena; Sokolová, Hana

    2015-07-01

    The paper deals with the monitoring of a workplace where technological equipment produces heat during hot summer days. The thermo-hygric microclimate measurement took place during a daily work shift and was carried out at five chosen measuring points. Since radiant heat was present in the workplace and workers moved between different places, the thermal environment was classified as heterogeneous and non-stationary. The measurement, result processing and interpretation were carried out according to the valid legislation of the Slovak Republic.

  9. A Novel Methodology for Measurements of an LED's Heat Dissipation Factor

    NASA Astrophysics Data System (ADS)

    Jou, R.-Y.; Haung, J.-H.

    2015-12-01

    Heat generation is an inevitable byproduct with high-power light-emitting diode (LED) lighting. The increase in junction temperature that accompanies the heat generation sharply degrades the optical output of the LED and has a significant negative influence on the reliability and durability of the LED. For these reasons, the heat dissipation factor, Kh, is an important factor in modeling and thermal design of LED installations. In this study, a methodology is proposed and experiments are conducted to determine LED heat dissipation factors. Experiments are conducted for two different brands of LED. The average heat dissipation factor of the Edixeon LED is 0.69, and is 0.60 for the OSRAM LED. By using the developed test method and comparing the results to the calculated luminous fluxes using theoretical equations, the interdependence of optical, electrical, and thermal powers can be predicted with a reasonable accuracy. The difference between the theoretical and experimental values is less than 9 %.
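
    The energy balance behind the heat dissipation factor can be shown with a short calculation. The power values below are hypothetical, chosen only so that Kh matches the 0.69 reported for the Edixeon part:

```python
# Sketch: Kh is the fraction of electrical input power dissipated as heat,
# so the radiant (optical) output power is (1 - Kh) * P_electrical.
# The 10 W input and 6.9 W heat figures are hypothetical illustration values.
def heat_dissipation_factor(p_electrical_w: float, p_heat_w: float) -> float:
    """Heat dissipation factor Kh = thermal power / electrical input power."""
    return p_heat_w / p_electrical_w

p_el = 10.0      # W, electrical input (hypothetical)
p_heat = 6.9     # W, measured heat dissipation (hypothetical)
kh = heat_dissipation_factor(p_el, p_heat)
p_optical = (1 - kh) * p_el
print(f"Kh = {kh:.2f}, optical power = {p_optical:.1f} W")
```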

  10. [Techniques for measuring phakic and pseudophakic accommodation. Methodology for distinguishing between neurological and mechanical accommodative insufficiency].

    PubMed

    Roche, O; Roumes, C; Parsa, C

    2007-11-01

    The methods available for studying accommodation are evaluated: Donder's "push-up" method, dynamic retinoscopy, infrared optometry using the Scheiner principle, and wavefront analysis are each discussed with their inherent advantages and limitations. Based on the methodology described, one can also distinguish between causes of accommodative insufficiency. Dioptric insufficiency (accommodative lag) that remains equal at various testing distances from the subject indicates a sensory/neurologic (afferent) defect, whereas accommodative insufficiency changing with distance indicates a mechanical/restrictive (efferent) defect, such as in presbyopia. Determining accommodative insufficiency and the cause can be particularly useful when examining patients with a variety of diseases associated with reduced accommodative ability (e.g., Down syndrome and cerebral palsy) as well as in evaluating the effectiveness of various potentially accommodating intraocular lens designs.

  11. [Optimization of reaction conditions and methodological investigation on microtox-based fast testing system for traditional Chinese medicine injection].

    PubMed

    Gao, Hong-Li; Li, Xiao-Rong; Yan, Liang-Chun; Zhao, Jun-Ning

    2016-05-01

    Vibrio fischeri CS234 was used to establish and optimize a microtox assay system, laying a foundation for the application of this method in comprehensive acute toxicity testing of traditional Chinese medicine (TCM) injections. Firstly, the Plackett-Burman method was carried out to optimize the factors affecting Vibrio fischeri CS234 luminescence. Secondly, ZnSO4•7H2O was chosen as the reference substance to establish its reaction system with quality control samples. The optimal luminescence conditions were as follows: at a temperature of (15±1) ℃, Vibrio fischeri CS234 lyophilized powders were equilibrated for 15 min; then, 1 mL resuscitation fluid was added and blended for 10 min. A 100 μL bacteria suspension was taken to measure the initial luminescence intensity, and then 1 mL resuscitation fluid or test sample was immediately added; after reaction for 10 min, the corresponding luminescence intensity was measured again. Resuscitation diluent, osmotic pressure regulator and ZnSO4•7H2O stock solution showed no interference with the determination of Vibrio fischeri CS234 luminescence intensity, so the method showed good specificity. The within- and between-batch precisions of quality controls and the lower limit of quantification (LLOQ) samples were <5% and <10% respectively, while the accuracy ranged between 85.8% and 103.2%. Over final concentrations ranging from 3.86 mg•L⁻¹ to 77.22 mg•L⁻¹, the standard curve equation of ZnSO4•7H2O was y = 21.78 ln x - 15.14 (R² = 0.998); meanwhile, the IC₅₀ of ZnSO4•7H2O for Vibrio fischeri CS234 was 19.90 mg•L⁻¹. ZnSO4•7H2O stock solution and its quality controls were continuously monitored for 120 h and 8 h respectively, and their RSD was lower than 2%, indicating stability at room temperature and under 4 ℃ storage conditions. Between pH 4.5 and 8.0, the luminescence intensity of Vibrio fischeri CS234 varied within ±10%, and such a pH range could meet the testing needs of the vast majority of traditional

  12. Measurement of Vehicle Air Conditioning Pull-Down Period

    SciTech Connect

    Thomas, John F.; Huff, Shean P.; Moore, Larry G.; West, Brian H.

    2016-08-01

    Air conditioner usage was characterized for high heat-load summer conditions during short driving trips using a 2009 Ford Explorer and a 2009 Toyota Corolla. Vehicles were parked in the sun with windows closed to allow the cabin to become hot. Experiments were conducted by entering the instrumented vehicles in this heated condition and driving on-road with the windows up, the air conditioning set to maximum cooling and maximum fan speed, and the air flow set to recirculate cabin air rather than pull in outside humid air. The main purpose was to determine the length of time the air conditioner system would remain at or very near maximum cooling power under these severe-duty conditions. Because of the variable and somewhat uncontrolled nature of the experiments, they serve only to show that for short vehicle trips, air conditioning can remain near or at full cooling capacity for 10 minutes or significantly longer, and the cabin may be uncomfortably warm during much of this time.

  13. Methodological challenges surrounding direct-to-consumer advertising research--the measurement conundrum.

    PubMed

    Hansen, Richard A; Droege, Marcus

    2005-06-01

    Numerous studies have focused on the impact of direct-to-consumer (DTC) prescription drug advertising on consumer behavior and health outcomes. These studies have used various approaches to assess exposure to prescription drug advertising and to measure the subsequent effects of such advertisements. The objectives of this article are to (1) discuss measurement challenges involved in DTC advertising research, (2) summarize measurement approaches commonly identified in the literature, and (3) discuss contamination, time to action, and endogeneity as specific problems in measurement design and application. We conducted a review of the professional literature to identify illustrative approaches to advertising measurement. Specifically, our review of the literature focused on measurement of DTC advertising exposure and effect. We used the hierarchy-of-effects model to guide our discussion of processing and communication effects. Other effects were characterized as target audience action, sales, market share, and profit. Overall, existing studies have used a variety of approaches to measure advertising exposure and effect, yet the ability of measures to produce a valid and reliable understanding of the effects of DTC advertising can be improved. Our review provides a framework for conceptualizing DTC measurement, and can be used to identify gaps in the literature not sufficiently addressed by existing measures. Researchers should continue to explore correlations between exposure and effect of DTC advertising, but are obliged to improve and validate measurement in this area.

  14. Grapevine Yield and Leaf Area Estimation Using Supervised Classification Methodology on RGB Images Taken under Field Conditions

    PubMed Central

    Diago, Maria-Paz; Correa, Christian; Millán, Borja; Barreiro, Pilar; Valero, Constantino; Tardaguila, Javier

    2012-01-01

    The aim of this research was to implement a methodology through the generation of a supervised classifier based on the Mahalanobis distance to characterize the grapevine canopy and assess leaf area and yield using RGB images. The method automatically processes sets of images, and calculates the areas (number of pixels) corresponding to seven different classes (Grapes, Wood, Background, and four classes of Leaf, of increasing leaf age). Each one is initialized by the user, who selects a set of representative pixels for every class in order to induce the clustering around them. The proposed methodology was evaluated with 70 grapevine (V. vinifera L. cv. Tempranillo) images, acquired in a commercial vineyard located in La Rioja (Spain), after several defoliation and de-fruiting events on 10 vines, with a conventional RGB camera and no artificial illumination. The segmentation results showed a performance of 92% for leaves and 98% for clusters, and allowed the grapevine’s leaf area and yield to be assessed with R² values of 0.81 (p < 0.001) and 0.73 (p = 0.002), respectively. This methodology, which operates with a simple image acquisition setup and guarantees the right number and kind of pixel classes, has been shown to be suitable and robust enough to provide valuable information for vineyard management. PMID:23235443
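
    The supervised classification step can be sketched as minimum-Mahalanobis-distance assignment of RGB pixels, with each class initialized from user-selected representative pixels. The class names follow the abstract; the pixel values and two-class setup are hypothetical:

```python
# Sketch: classify RGB pixels by the smallest Mahalanobis distance to
# user-initialized classes (e.g. Grapes, Leaf). Colors are synthetic.
import numpy as np

def fit_class(samples):
    """Mean and inverse covariance from user-selected representative pixels."""
    mean = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False) + 1e-6 * np.eye(3)  # ridge for stability
    return mean, np.linalg.inv(cov)

def classify(pixels, classes):
    """Assign each RGB pixel to the class with the smallest Mahalanobis distance."""
    dists = []
    for mean, inv_cov in classes.values():
        diff = pixels - mean
        dists.append(np.einsum("ij,jk,ik->i", diff, inv_cov, diff))
    labels = list(classes)
    return [labels[i] for i in np.argmin(np.array(dists), axis=0)]

rng = np.random.default_rng(3)
classes = {
    "Grapes": fit_class(rng.normal([60, 30, 90], 5, (100, 3))),   # bluish pixels
    "Leaf":   fit_class(rng.normal([60, 140, 40], 5, (100, 3))),  # green pixels
}
test_pixels = np.array([[58, 32, 88], [62, 138, 42]], dtype=float)
print(classify(test_pixels, classes))   # → ['Grapes', 'Leaf']
```

    Counting the pixels assigned to each class then gives the per-class areas from which leaf area and yield are regressed.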

  15. Evaluation of constraint methodologies applied to a shallow-flaw cruciform bend specimen tested under biaxial loading conditions

    SciTech Connect

    Bass, B.R.; McAfee, W.J.; Williams, P.T.; Pennell, W.E.

    1998-01-01

    A technology to determine shallow-flaw fracture toughness of reactor pressure vessel (RPV) steels is being developed for application to the safety assessment of RPVs containing postulated shallow surface flaws. Matrices of cruciform beam tests were developed to investigate and quantify the effects of temperature, biaxial loading, and specimen size on fracture initiation toughness of two-dimensional (constant-depth) shallow surface flaws. The cruciform beam specimens were developed at Oak Ridge National Laboratory (ORNL) to introduce a prototypic, far-field, out-of-plane biaxial stress component in the test section that approximates the nonlinear stresses resulting from pressurized-thermal-shock or pressure-temperature loading of an RPV. Tests were conducted under biaxial load ratios ranging from uniaxial to equibiaxial. These tests demonstrated that biaxial loading can have a pronounced effect on shallow-flaw fracture toughness in the lower transition temperature region for RPV materials. The cruciform fracture toughness data were used to evaluate fracture methodologies for predicting the observed effects of biaxial loading on shallow-flaw fracture toughness. Initial emphasis was placed on the assessment of stress-based methodologies, namely the J-Q formulation, the Dodds-Anderson toughness scaling model, and the Weibull approach. Applications of these methodologies based on the hydrostatic stress fracture criterion indicated an effect of loading biaxiality on fracture toughness, whereas the conventional maximum principal stress criterion indicated no effect.

  16. Creating ‘obesogenic realities’; do our methodological choices make a difference when measuring the food environment?

    PubMed Central

    2013-01-01

    Background The use of Geographical Information Systems (GIS) to objectively measure ‘obesogenic’ food environment (foodscape) exposure has become common-place. This increase in usage has coincided with the development of a methodologically heterogeneous evidence-base, with subsequent perceived difficulties for inter-study comparability. However, when used together in previous work, different types of food environment metric have often demonstrated some degree of covariance. Differences and similarities between density and proximity metrics, and within methodologically different conceptions of density and proximity metrics need to be better understood. Methods Frequently used measures of food access were calculated for North East England, UK. Using food outlet data from local councils, densities of food outlets per 1000 population and per km2 were calculated for small administrative areas. Densities (counts) were also calculated based on population-weighted centroids of administrative areas buffered at 400/800/1000m street network and Euclidean distances. Proximity (street network and Euclidean distances) from these centroids to the nearest food outlet were also calculated. Metrics were compared using Spearman’s rank correlations. Results Measures of foodscape density and proximity were highly correlated. Densities per km2 and per 1000 population were highly correlated (rs = 0.831). Euclidean and street network based measures of proximity (rs = 0.865) and density (rs = 0.667-0.764, depending on neighbourhood size) were also highly correlated. Density metrics based on administrative areas and buffered centroids of administrative areas were less strongly correlated (rs = 0.299-0.658). Conclusions Density and proximity metrics were largely comparable, with some exceptions. Whilst results suggested a substantial degree of comparability across existing studies, future comparability could be ensured by moving towards a more standardised set of
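
    The metric comparison above rests on Spearman's rank correlation. A toy version with synthetic values (not the UK data) shows the computation for one density metric against one proximity metric:

```python
# Sketch: Spearman's rs between a foodscape density metric and a proximity
# metric for the same set of small areas, computed as the Pearson correlation
# of ranks (no tie handling, adequate for continuous measurements).
import numpy as np

def spearman(x, y):
    """Spearman's rank correlation via Pearson correlation of ranks."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(4)
outlets_per_km2 = rng.gamma(2.0, 3.0, 200)               # density metric
# Distance to the nearest outlet tends to fall as density rises
dist_to_nearest_m = 1000 / (1 + outlets_per_km2) + rng.normal(0, 20, 200)

rs = spearman(outlets_per_km2, dist_to_nearest_m)
print(f"rs = {rs:.3f}")    # strongly negative: denser areas lie closer to outlets
```

    A negative rs between density and proximity (distance) corresponds to the positive density-density and proximity-proximity correlations reported in the abstract: all three say the metrics rank areas similarly.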

  17. Conditioning of sexual proceptivity in female quail: Measures of conditioned place preference

    PubMed Central

    Gutiérrez, Germán; Domjan, Michael

    2011-01-01

    The present experiments were conducted to explore the nature of conditioned sexual proceptivity in female quail. Females exposed to males subsequently approached the area where the males were previously housed (Experiment 1). This increased preference for the male’s area reflected an increase in female sexual proceptivity and not an increase in non-directed locomotor activity (Experiment 2). These findings provide the first evidence that female quail show conditioned responses that may be considered to be proceptive responses toward male conspecifics. The proceptive responses are expressed as tonic changes in preference for areas where males have been observed in the past rather than as specific phasic conditioned responses. PMID:21664442

  18. Measures of Chronic Conditions and Diseases Associated With Aging in the National Social Life, Health, and Aging Project

    PubMed Central

    Pham-Kanter, Genevieve; Leitsch, Sara A.

    2009-01-01

    Objectives This paper presents a description of the methods used in the National Social Life, Health, and Aging Project to detect the presence of chronic conditions and diseases associated with aging. It also discusses the validity and distribution of these measures. Methods Markers associated with common chronic diseases and conditions of aging were collected from 3,005 community-dwelling older adults living in the United States, aged 57–85 years, during 2006. Dried blood spots, physical function tests, anthropometric measurements, self-reported history, and self-rated assessments were used to detect the presence of chronic conditions associated with aging or of risk factors associated with the development of chronic diseases. Results The distribution of each measure, disaggregated by age group and gender, is presented. Conclusions This paper describes the methodology used as well as the distribution of each of these measures. In addition, we discuss how the measures used in the study relate to specific chronic diseases and conditions associated with aging and how these measures might be used in social science analyses. PMID:19204070

  19. Measuring Science Teachers' Stress Level Triggered by Multiple Stressful Conditions

    ERIC Educational Resources Information Center

    Halim, Lilia; Samsudin, Mohd Ali; Meerah, T. Subahan M.; Osman, Kamisah

    2006-01-01

    The complexity of science teaching requires science teachers to encounter a range of tasks. Some tasks are perceived as stressful while others are not. This study aims to investigate the extent to which different teaching situations lead to different stress levels. It also aims to identify the easiest and most difficult conditions to be regarded…

  20. Measurement of Phonated Intervals during Four Fluency-Inducing Conditions

    ERIC Educational Resources Information Center

    Davidow, Jason H.; Bothe, Anne K.; Andreatta, Richard D.; Ye, Jun

    2009-01-01

    Purpose: Previous investigations of persons who stutter have demonstrated changes in vocalization variables during fluency-inducing conditions (FICs). A series of studies has also shown that a reduction in short intervals of phonation, those from 30 to 200 ms, is associated with decreased stuttering. The purpose of this study, therefore, was to…

  1. A Methodology to Integrate Magnetic Resonance and Acoustic Measurements for Reservoir Characterization

    SciTech Connect

    Parra, J.O.

    2001-01-26

    The objective of this project was to develop an advanced imaging method, including pore scale imaging, to integrate magnetic resonance (MR) techniques and acoustic measurements to improve predictability of the pay zone in two hydrocarbon reservoirs. This was accomplished by extracting the fluid property parameters using MR laboratory measurements and the elastic parameters of the rock matrix from acoustic measurements to create poroelastic models of different parts of the reservoir. Laboratory measurements were compared with petrographic analysis results to determine the relative roles of petrographic elements such as porosity type, mineralogy, texture, and distribution of clay and cement in creating permeability heterogeneity.

  2. A Methodology to Integrate Magnetic Resonance and Acoustic Measurements for Reservoir Characterization

    SciTech Connect

    Parra, Jorge O.; Hackert, Chris L.; Collier, Hughbert A.; Bennett, Michael

    2002-01-29

    The objective of this project was to develop an advanced imaging method, including pore scale imaging, to integrate NMR techniques and acoustic measurements to improve predictability of the pay zone in hydrocarbon reservoirs. This is accomplished by extracting the fluid property parameters using NMR laboratory measurements and the elastic parameters of the rock matrix from acoustic measurements to create poroelastic models of different parts of the reservoir. Laboratory measurement techniques and core imaging are being linked with a balanced petrographical analysis of the core and theoretical model.

  3. A Methodology to Integrate Magnetic Resonance and Acoustic Measurements for Reservoir Characterization

    SciTech Connect

    Parra, Ph.D., Jorge O.

    2002-06-10

    The objective of the project was to develop an advanced imaging method, including pore scale imaging, to integrate nuclear magnetic resonance (NMR) techniques and acoustic measurements to improve predictability of the pay zone in hydrocarbon reservoirs. This will be accomplished by extracting the fluid property parameters using NMR laboratory measurements and the elastic parameters of the rock matrix from acoustic measurements to create poroelastic models of different parts of the reservoir. Laboratory measurement techniques and core imaging were linked with a balanced petrographical analysis of cores and theoretical modeling.

  4. A methodology for investigating interdependencies between measured throughfall, meteorological variables and canopy structure on a small catchment.

    NASA Astrophysics Data System (ADS)

    Maurer, Thomas; Gustavos Trujillo Siliézar, Carlos; Oeser, Anne; Pohle, Ina; Hinz, Christoph

    2016-04-01

    In evolving initial landscapes, vegetation development depends on a variety of feedback effects. One of the less understood feedback loops is the interaction between throughfall and plant canopy development. The amount of throughfall is governed by the characteristics of the vegetation canopy, whereas vegetation pattern evolution may in turn depend on the spatio-temporal distribution of throughfall. Meteorological factors that may influence throughfall, while at the same time interacting with the canopy, are e.g. wind speed, wind direction and rainfall intensity. Our objective is to investigate how throughfall, vegetation canopy and meteorological variables interact in an exemplary eco-hydrological system in its initial development phase, in which the canopy is very heterogeneous and rapidly changing. For that purpose, we developed a methodological approach combining field methods, raster image analysis and multivariate statistics. The research area for this study is the Hühnerwasser ('Chicken Creek') catchment in Lower Lusatia, Brandenburg, Germany, where after eight years of succession, the spatial distribution of plant species is highly heterogeneous, leading to increasingly differentiated throughfall patterns. The constructed 6-ha catchment offers ideal conditions for our study due to the rapidly changing vegetation structure and the availability of complementary monitoring data. Throughfall data were obtained by 50 tipping bucket rain gauges arranged in two transects and connected via a wireless sensor network that cover the predominant vegetation types on the catchment (locust copses, dense sallow thorn bushes and reeds, base herbaceous and medium-rise small-reed vegetation, and open areas covered by moss and lichens). The spatial configuration of the vegetation canopy for each measurement site was described via digital image analysis of hemispheric photographs of the canopy using the ArcGIS Spatial Analyst, GapLight and ImageJ software. 
Meteorological data

  5. Negotiating Measurement: Methodological and Interpersonal Considerations in the Choice and Interpretation of Instruments

    ERIC Educational Resources Information Center

    Braverman, Marc T.

    2013-01-01

    Sound evaluation planning requires numerous decisions about how constructs in a program theory will be translated into measures and instruments that produce evaluation data. This article, the first in a dialogue exchange, examines how decisions about measurement are (and should be) made, especially in the context of small-scale local program…

  7. Construct Validity for Measures of Childhood Depression: Application of Multitrait-Multimethod Methodology.

    ERIC Educational Resources Information Center

    Saylor, Conway Fleming; And Others

    1984-01-01

    Presents results from two studies that investigate the measurement of childhood depression through self-report and reports from others. Results were discussed in terms of the need to consider self-report in the discussion of depression with its covert components and in terms of the need for improved measurement instruments. (BH)

  8. Interobserver agreement in perineal ultrasound measurement of the anovaginal distance: a methodological study.

    PubMed

    Pihl, Sofia; Uustal, Eva; Hjertberg, Linda; Blomberg, Marie

    2017-06-17

    Objective outcome measures of the extent of laceration at delivery are needed. In this study we evaluated, and describe here, a method for learning perineal ultrasound measurement of the anovaginal distance (AVD). The learning period needed for examiners proficient in vaginal ultrasound examination and the interobserver agreement after reaching proficiency in AVD measurement were determined. The hypothesis was that the method is feasible to learn and reproducible for use in further research. The method was taught by an examiner experienced in perineal ultrasonography. The distance between the mucosal margin of the internal anal sphincter and the vaginal mucosa was measured with a vaginal probe. The studied examiners measured the AVD until similar results (±5 mm) were achieved. The AVD in 40 women was then measured and documented by two examiners who were blinded to each other's results. Interobserver agreement was calculated using the kappa score. Examiners with previous experience in vaginal ultrasonography had learned the method after performing five sets of co-measurements. The AVD measurements after the learning period showed almost perfect agreement (κ = 0.87) between the examiners. The method for perineal ultrasound measurement of AVD was learned quickly with high interobserver agreement. The method is feasible to learn and reproducible for use in further research.
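
    The interobserver agreement reported above is a kappa score. A minimal sketch of Cohen's kappa for two raters is below; the category labels and ratings are hypothetical (the record does not state how AVD values were categorized), and the numbers are not the study's data.

    ```python
    # Cohen's kappa: chance-corrected agreement between two raters.
    # kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    # and p_e the agreement expected by chance from the marginals.
    from collections import Counter

    def cohens_kappa(rater1, rater2):
        n = len(rater1)
        p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
        c1, c2 = Counter(rater1), Counter(rater2)
        categories = set(c1) | set(c2)
        p_e = sum((c1[c] / n) * (c2[c] / n) for c in categories)
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical dichotomized AVD ratings from two examiners.
    examiner1 = ["short", "short", "long", "long", "long", "short", "long", "short"]
    examiner2 = ["short", "short", "long", "long", "short", "short", "long", "short"]
    print(round(cohens_kappa(examiner1, examiner2), 3))
    ```

    Values above roughly 0.8 are conventionally read as "almost perfect" agreement, which is the interpretation the record applies to its κ = 0.87.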

  10. A methodology to quantify the differences between alternative methods of heart rate variability measurement.

    PubMed

    García-González, M A; Fernández-Chimeno, M; Guede-Fernández, F; Ferrer-Mileo, V; Argelagós-Palau, A; Álvarez-Gómez, L; Parrado, E; Moreno, J; Capdevila, L; Ramos-Castro, J

    2016-01-01

    This work proposes a systematic procedure for reporting the differences between heart rate variability time series obtained from alternative measurements: it reports the spread and mean of the differences and the agreement between measuring procedures, and quantifies how stationary, random and normal the differences are. A description of the complete automatic procedure for obtaining a differences time series (DTS) from two alternative methods, a proposal for a battery of statistical tests, and a set of statistical indicators to better describe the differences in RR interval estimation are also provided. Results show that the spread and agreement depend on the choice of alternative measurements and that the DTS cannot, in general, be considered a white or a normally distributed process. Nevertheless, in controlled measurements the DTS can be considered a stationary process.
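
    One representative member of such a battery of tests is a randomness check on the DTS. The sketch below implements a Wald-Wolfowitz runs test about the median; the data are synthetic illustrative values, and the record does not specify which tests its battery actually contains.

    ```python
    # Runs test for randomness of a differences time series (DTS).
    # Counts runs of values above/below the median; |z| > 1.96 suggests
    # the sequence is not random at the 5% level.
    import math
    import statistics

    def runs_test_z(dts):
        med = statistics.median(dts)
        signs = [x > med for x in dts if x != med]  # drop ties with the median
        n1 = sum(signs)
        n2 = len(signs) - n1
        runs = 1 + sum(a != b for a, b in zip(signs, signs[1:]))
        mu = 2 * n1 * n2 / (n1 + n2) + 1
        var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / (
            (n1 + n2) ** 2 * (n1 + n2 - 1))
        return (runs - mu) / math.sqrt(var)

    # A strongly alternating DTS: far more runs than chance predicts,
    # so the z-statistic is large and randomness is rejected.
    dts = [1, -1] * 6
    print(round(runs_test_z(dts), 2))
    ```

    A full battery along the lines the record describes would pair this with stationarity and normality tests on the same DTS.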

  11. The Sensor Fish: Measuring Fish Passage in Severe Hydraulic Conditions

    SciTech Connect

    Carlson, Thomas J.; Duncan, Joanne P.; Gilbride, Theresa L.

    2003-05-28

    This article describes PNNL's efforts to develop the Sensor Fish, a waterproof sensor package that travels through the turbines or spillways of hydroelectric dams to collect pressure and acceleration data on the conditions experienced by live salmon smolts during dam passage. Sensor Fish development is sponsored by the DOE Advanced Hydropower Turbine Survival Program. The article also gives two recent examples of Sensor Fish use: turbine passage through a Kaplan turbine at McNary Dam and spill passage in top spill at Rock Island Dam.

  12. Factors of psychological distress: clinical value, measurement substance, and methodological artefacts.

    PubMed

    Böhnke, J R; Croudace, T J

    2015-04-01

    Psychometric models and statistical techniques are cornerstones of research into latent structures of specific psychopathology and general mental health. We discuss "pivot points" for future research efforts from a psychometric epidemiology perspective, emphasising sampling and selection processes of both indicators that guide data collection as well as samples that are confronted with them. First, we discuss how a theoretical model of psychopathology determines which empirical indicators (questions, diagnoses, etc.) and modelling methods are appropriate to test its implications. Second, we deal with how different research designs introduce different (co-)variances between indicators, potentially leading to a different understanding of latent structures. Third, we discuss widening the range of statistical models available within the "psychometrics class": the inclusion of categorical approaches can help to enlighten the debate on the structure of psychopathology and agreement on a minimal set of models might lead to greater convergence between studies. Fourth, we deal with aspects of methodology that introduce spurious (co-)variance in latent structure analysis (response styles, clustered data) and differential item functioning to gather more detailed information and to guard against over-generalisation of results, which renders assessments unfair. Building on established insights, future research efforts should be more explicit about their theoretical understanding of psychopathology and how the analysis of a given indicator-respondent set informs this theoretical model. A coherent treatment of theoretical assumptions, indicators, and samples holds the key to building a comprehensive account of the latent structures of different types of psychopathology and mental health in general.

  13. Comparative methodologies for measuring metabolizable energy of various types of resistant high amylose corn starch.

    PubMed

    Tulley, Richard T; Appel, Marko J; Enos, Tanya G; Hegsted, Maren; McCutcheon, Kathleen L; Zhou, Jun; Raggio, Anne M; Jeffcoat, Roger; Birkett, Anne; Martin, Roy J; Keenan, Michael J

    2009-09-23

    Energy values of high amylose corn starches high in resistant starch (RS) were determined in vivo by two different methodologies. In one study, energy values were determined according to growth relative to glucose-based diets in rats fed diets containing RS(2), heat-treated RS(2) (RS(2)-HT), RS(3), and amylase predigested versions to isolate the RS component. Net metabolizable energy values ranged from 2.68 to 3.06 kcal/g for the RS starches, and 1.91-2.53 kcal/g for the amylase predigested versions. In a second study, rats were fed a diet containing RS(2)-HT and the metabolizable energy value was determined by bomb calorimetry. The metabolizable energy value was 2.80 kcal/g, consistent with Study 1. Thus, high amylose corn based RS ingredients and their amylase predigested equivalents have energy values approximately 65-78% and 47-62% of available starch (Atwater factor), respectively, according to the RS type (Garcia, T. A.; McCutcheon, K. L.; Francis, A. R.; Keenan, M. J.; O'Neil, C. E.; Martin, R. J.; Hegsted, M. The effects of resistant starch on gastrointestinal organs and fecal output in rats. FASEB J. 2003, 17, A335).

  14. Use of Balanced Scorecard Methodology for Performance Measurement of the Health Extension Program in Ethiopia.

    PubMed

    Teklehaimanot, Hailay D; Teklehaimanot, Awash; Tedella, Aregawi A; Abdella, Mustofa

    2016-05-04

    In 2004, Ethiopia introduced a community-based Health Extension Program to deliver basic and essential health services. We developed a comprehensive performance scoring methodology to assess the performance of the program. A balanced scorecard with six domains and 32 indicators was developed. Data collected from 1,014 service providers, 433 health facilities, and 10,068 community members sampled from 298 villages were used to generate weighted national, regional, and agroecological zone scores for each indicator. The national median indicator scores ranged from 37% to 98% with poor performance in commodity availability, workforce motivation, referral linkage, infection prevention, and quality of care. Indicator scores showed significant difference by region (P < 0.001). Regional performance varied across indicators suggesting that each region had specific areas of strength and deficiency, with Tigray and the Southern Nations, Nationalities and Peoples Region being the best performers while the mainly pastoral regions of Gambela, Afar, and Benishangul-Gumuz were the worst. The findings of this study suggest the need for strategies aimed at improving specific elements of the program and its performance in specific regions to achieve quality and equitable health services.
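
    The scorecard described above rolls weighted indicator scores up into domain and overall scores. A minimal sketch of that aggregation step is below; the domain name, indicator scores and weights are illustrative assumptions, not the study's actual values or weighting scheme.

    ```python
    # Weighted roll-up of balanced-scorecard indicator scores (0-100%).
    def domain_score(indicator_scores, weights):
        """Weighted mean of indicator scores within one scorecard domain."""
        total_weight = sum(weights)
        return sum(s * w for s, w in zip(indicator_scores, weights)) / total_weight

    # Hypothetical domain: three indicators, the first weighted double.
    quality_of_care = domain_score([37, 72, 98], [2, 1, 1])
    print(round(quality_of_care, 1))
    ```

    Repeating this per domain, then per region, yields the kind of regional score comparisons the record reports.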

  15. Thermal decomposition of hydroxylamine: isoperibolic calorimetric measurements at different conditions.

    PubMed

    Adamopoulou, Theodora; Papadaki, Maria I; Kounalakis, Manolis; Vazquez-Carreto, Victor; Pineda-Solano, Alba; Wang, Qingsheng; Mannan, M Sam

    2013-06-15

    Thermal decomposition of hydroxylamine, NH2OH, was responsible for two serious accidents. However, its reactive behavior and the synergy of factors affecting its decomposition are not well understood. In this work, the global enthalpy of hydroxylamine decomposition has been measured in the temperature range of 130-150 °C employing isoperibolic calorimetry. Measurements were performed in a metal reactor, employing 30-80 ml solutions containing 1.4-20 g of pure hydroxylamine (2.8-40 g of the supplied reagent). The measurements showed that increased concentration or temperature results in higher global enthalpies of reaction per unit mass of reactant. At 150 °C, specific enthalpies as high as 8 kJ per gram of hydroxylamine were measured, although in general they were in the range of 3-5 kJ g⁻¹. The accurate measurement of the generated heat proved to be a cumbersome task because (a) it is difficult to identify the end of decomposition, which, after a fast initial stage, proceeds very slowly, especially at lower temperatures, and (b) the environment of gases affects the reaction rate. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Advanced ultrasonic measurement methodology for non-invasive interrogation and identification of fluids in sealed containers

    NASA Astrophysics Data System (ADS)

    Tucker, Brian J.; Diaz, Aaron A.; Eckenrode, Brian A.

    2006-03-01

    Government agencies and homeland security related organizations have identified the need to develop and establish a wide range of unprecedented capabilities for providing scientific and technical forensic services to investigations involving hazardous chemical, biological, and radiological materials, including extremely dangerous chemical and biological warfare agents. Pacific Northwest National Laboratory (PNNL) has developed a prototype portable, hand-held, hazardous materials acoustic inspection prototype that provides noninvasive container interrogation and material identification capabilities using nondestructive ultrasonic velocity and attenuation measurements. Due to the wide variety of fluids as well as container sizes and materials encountered in various law enforcement inspection activities, the need for high measurement sensitivity and advanced ultrasonic measurement techniques were identified. The prototype was developed using a versatile electronics platform, advanced ultrasonic wave propagation methods, and advanced signal processing techniques. This paper primarily focuses on the ultrasonic measurement methods and signal processing techniques incorporated into the prototype. High bandwidth ultrasonic transducers combined with an advanced pulse compression technique allowed researchers to 1) obtain high signal-to-noise ratios and 2) obtain accurate and consistent time-of-flight (TOF) measurements through a variety of highly attenuative containers and fluid media. Results of work conducted in the laboratory have demonstrated that the prototype experimental measurement technique also provided information regarding container properties, which will be utilized in future container-independent measurements of hidden liquids.
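
    The pulse-compression approach described above ultimately comes down to estimating time-of-flight (TOF) by cross-correlating the received echo with the transmitted reference pulse. The sketch below shows that core step on synthetic signals; the waveforms and function name are illustrative, not the prototype's actual processing chain.

    ```python
    # TOF estimation by cross-correlation: the lag at which the received
    # signal best matches the transmitted reference is the echo delay.
    def cross_correlate_tof(tx, rx):
        """Return the lag (in samples) maximizing the cross-correlation."""
        best_lag, best_val = 0, float("-inf")
        for lag in range(len(rx) - len(tx) + 1):
            val = sum(t * r for t, r in zip(tx, rx[lag:lag + len(tx)]))
            if val > best_val:
                best_lag, best_val = lag, val
        return best_lag

    # Synthetic example: an attenuated copy of the reference pulse,
    # delayed by 5 samples and embedded in low-level noise.
    chirp = [0.0, 0.5, 1.0, -1.0, 0.5, -0.25]
    noise = [0.05, -0.02, 0.01, 0.03, -0.04]
    received = noise + [0.8 * c for c in chirp] + noise
    print(cross_correlate_tof(chirp, received))  # echo delay in samples
    ```

    With a long coded (chirped) reference, this correlation concentrates the pulse energy at the true delay, which is how pulse compression buys signal-to-noise ratio through highly attenuative containers.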

  17. Advanced Ultrasonic Measurement Methodology for Non-Invasive Interrogation and Identification of Fluids in Sealed Containers

    SciTech Connect

    Tucker, Brian J.; Diaz, Aaron A.; Eckenrode, Brian A.

    2006-03-16

    The Hazardous Materials Response Unit (HMRU) and the Counterterrorism and Forensic Science Research Unit (CTFSRU), Laboratory Division, Federal Bureau of Investigation (FBI) have been mandated to develop and establish a wide range of unprecedented capabilities for providing scientific and technical forensic services to investigations involving hazardous chemical, biological, and radiological materials, including extremely dangerous chemical and biological warfare agents. Pacific Northwest National Laboratory (PNNL) has developed a portable, hand-held, hazardous materials acoustic inspection device (HAZAID) that provides noninvasive container interrogation and material identification capabilities using nondestructive ultrasonic velocity and attenuation measurements. Due to the wide variety of fluids as well as container sizes and materials, the need for high measurement sensitivity and advanced ultrasonic measurement techniques were identified. The HAZAID prototype was developed using a versatile electronics platform, advanced ultrasonic wave propagation methods, and advanced signal processing techniques. This paper primarily focuses on the ultrasonic measurement methods and signal processing techniques incorporated into the HAZAID prototype. High bandwidth ultrasonic transducers combined with the advanced pulse compression technique allowed researchers to 1) impart large amounts of energy, 2) obtain high signal-to-noise ratios, and 3) obtain accurate and consistent time-of-flight (TOF) measurements through a variety of highly attenuative containers and fluid media. Results of this feasibility study demonstrated that the HAZAID experimental measurement technique also provided information regarding container properties, which will be utilized in future container-independent measurements of hidden liquids.

  18. Advanced ultrasonic measurement methodology for non-invasive interrogation and identification of fluids in sealed containers

    SciTech Connect

    Tucker, Brian J.; Diaz, Aaron A.; Eckenrode, Brian A.

    2006-05-01

    Government agencies and homeland security related organizations have identified the need to develop and establish a wide range of unprecedented capabilities for providing scientific and technical forensic services to investigations involving hazardous chemical, biological, and radiological materials, including extremely dangerous chemical and biological warfare agents. Pacific Northwest National Laboratory (PNNL) has developed a prototype portable, hand-held, hazardous materials acoustic inspection prototype that provides noninvasive container interrogation and material identification capabilities using nondestructive ultrasonic velocity and attenuation measurements. Due to the wide variety of fluids as well as container sizes and materials encountered in various law enforcement inspection activities, the need for high measurement sensitivity and advanced ultrasonic measurement techniques were identified. The prototype was developed using a versatile electronics platform, advanced ultrasonic wave propagation methods, and advanced signal processing techniques. This paper primarily focuses on the ultrasonic measurement methods and signal processing techniques incorporated into the prototype. High bandwidth ultrasonic transducers combined with an advanced pulse compression technique allowed researchers to 1) obtain high signal-to-noise ratios and 2) obtain accurate and consistent time-of-flight (TOF) measurements through a variety of highly attenuative containers and fluid media. Results of work conducted in the laboratory have demonstrated that the prototype experimental measurement technique also provided information regarding container properties, which will be utilized in future container-independent measurements of hidden liquids.

  19. Methodology for Measuring Parent Involvement Program Implementation across Diverse Program Sites.

    ERIC Educational Resources Information Center

    Abramson, Lisa S.

    Much evidence indicates that raising the overall achievement level of an urban school requires parent participation in the school--participation beyond the traditional fundraising activities. This study identified program characteristics and district and school conditions that affect the implementation of parent-involvement programs. The first…

  20. Conditioning of sexual proceptivity in female quail: measures of conditioned place preference.

    PubMed

    Gutiérrez, Germán; Domjan, Michael

    2011-07-01

    The present experiments were conducted to explore the nature of conditioned sexual proceptivity in female quail. Females exposed to males subsequently approached the area where the males were previously housed (Experiment 1). This increased preference for the male's area reflected an increase in female sexual proceptivity and not an increase in non-directed locomotor activity (Experiment 2). These findings provide the first evidence that female quail show conditioned responses that may be considered to be proceptive responses toward male conspecifics. The proceptive responses are expressed as tonic changes in preference for areas where males have been observed in the past rather than as specific phasic conditioned responses. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Using Measured Plane-of-Array Data Directly in Photovoltaic Modeling: Methodology and Validation: Preprint

    SciTech Connect

    Freeman, Janine; Freestate, David; Riley, Cameron; Hobbs, William

    2016-11-01

    Measured plane-of-array (POA) irradiance may provide a lower-cost alternative to standard irradiance component data for photovoltaic (PV) system performance modeling without loss of accuracy. Previous work has shown that transposition models typically used by PV models to calculate POA irradiance from horizontal data introduce error into the POA irradiance estimates, and that measured POA data can correlate better to measured performance data. However, popular PV modeling tools historically have not directly used input POA data. This paper introduces a new capability in NREL's System Advisor Model (SAM) to directly use POA data in PV modeling, and compares SAM results from both POA irradiance and irradiance components inputs against measured performance data for eight operating PV systems.

  2. Using Measured Plane-of-Array Data Directly in Photovoltaic Modeling: Methodology and Validation

    SciTech Connect

    Freeman, Janine; Freestate, David; Hobbs, William; Riley, Cameron

    2016-06-05

    Measured plane-of-array (POA) irradiance may provide a lower-cost alternative to standard irradiance component data for photovoltaic (PV) system performance modeling without loss of accuracy. Previous work has shown that transposition models typically used by PV models to calculate POA irradiance from horizontal data introduce error into the POA irradiance estimates, and that measured POA data can correlate better to measured performance data. However, popular PV modeling tools historically have not directly used input POA data. This paper introduces a new capability in NREL's System Advisor Model (SAM) to directly use POA data in PV modeling, and compares SAM results from both POA irradiance and irradiance components inputs against measured performance data for eight operating PV systems.

  3. Using Measured Plane-of-Array Data Directly in Photovoltaic Modeling: Methodology and Validation

    SciTech Connect

    Freeman, Janine; Freestate, David; Hobbs, William; Riley, Cameron

    2016-11-21

    Measured plane-of-array (POA) irradiance may provide a lower-cost alternative to standard irradiance component data for photovoltaic (PV) system performance modeling without loss of accuracy. Previous work has shown that transposition models typically used by PV models to calculate POA irradiance from horizontal data introduce error into the POA irradiance estimates, and that measured POA data can correlate better to measured performance data. However, popular PV modeling tools historically have not directly used input POA data. This paper introduces a new capability in NREL's System Advisor Model (SAM) to directly use POA data in PV modeling, and compares SAM results from both POA irradiance and irradiance components inputs against measured performance data for eight operating PV systems.

  4. A New Methodology For In-Situ Residual Stress Measurement In MEMS Structures

    SciTech Connect

    Sebastiani, M.; Bemporad, E.; Melone, G.; Rizzi, L.; Korsunsky, A. M.

    2010-11-24

    In this paper, a new approach is presented for local residual stress measurement in MEMS structures. The newly proposed approach involves incremental focused ion beam (FIB) milling of annular trenches at material surface, combined with high resolution SEM imaging and Digital Image Correlation (DIC) analysis for the measurement of the strain relief over the surface of the remaining central pillar. The proposed technique allows investigating the average residual stress on suspended micro-structures, with a spatial resolution lower than 1 μm. Results are presented for residual stress measurement on double clamped micro-beams, whose layers are obtained by DC-sputtering (PVD) deposition. Residual stresses were also independently measured by the conventional curvature method (Stoney's equation) on a similar homogeneous coating obtained by the same deposition parameters and a comparison and discussion of obtained results is performed.

  5. A New Methodology For In-Situ Residual Stress Measurement In MEMS Structures

    NASA Astrophysics Data System (ADS)

    Sebastiani, M.; Bemporad, E.; Melone, G.; Rizzi, L.; Korsunsky, A. M.

    2010-11-01

In this paper, a new approach is presented for local residual stress measurement in MEMS structures. The proposed approach involves incremental focused ion beam (FIB) milling of annular trenches at the material surface, combined with high-resolution SEM imaging and Digital Image Correlation (DIC) analysis for the measurement of the strain relief over the surface of the remaining central pillar. The technique allows the average residual stress on suspended micro-structures to be investigated with a spatial resolution below 1 μm. Results are presented for residual stress measurement on double-clamped micro-beams, whose layers are obtained by DC-sputtering (PVD) deposition. Residual stresses were also independently measured by the conventional curvature method (Stoney's equation) on a similar homogeneous coating obtained with the same deposition parameters, and the results obtained are compared and discussed.
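The curvature method cited above rests on Stoney's equation, which converts a measured substrate curvature into an average film stress. A minimal sketch with illustrative substrate values (not taken from the paper):

```python
def stoney_stress(E_s, nu_s, t_s, t_f, R):
    """Average film stress (Pa) via Stoney's equation.
    E_s: substrate Young's modulus (Pa); nu_s: substrate Poisson ratio;
    t_s, t_f: substrate and film thickness (m); R: curvature radius (m)."""
    return E_s * t_s ** 2 / (6.0 * (1.0 - nu_s) * t_f * R)

# Illustrative case: Si substrate (~170 GPa, nu ~0.28), 500 um thick,
# carrying a 1 um film that bends it to a 20 m radius of curvature
sigma = stoney_stress(170e9, 0.28, 500e-6, 1e-6, 20.0)   # ~0.5 GPa
```

The equation assumes the film is much thinner than the substrate and the stress is equibiaxial; a larger curvature radius (flatter wafer) implies a lower stress.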

  6. Perspectives for clinical measures of dynamic foot function-reference data and methodological considerations.

    PubMed

    Rathleff, M S; Nielsen, R G; Simonsen, O; Olesen, C G; Kersting, U G

    2010-02-01

Several studies have investigated whether static posture assessments can predict dynamic function of the foot, with diverse outcomes. However, it has been suggested that dynamic measures may be better suited to predict foot-related overuse problems. The purpose of this study was to establish the reliability of dynamic measures of the longitudinal arch angle (LAA) and navicular height (NH) and to examine to what extent static and dynamic measures thereof are related. Intra-rater reliability of LAA and NH measures was tested on a sample of 17 control subjects. Subsequently, 79 subjects were tested while walking on a treadmill. The ranges and minimum values for LAA and NH during ground contact were identified over 20 consecutive steps. A geometric error model was used to simulate the effects of marker placement uncertainty and skin movement artifacts. Results demonstrated the highest reliability for the minimum NH (MinNH), followed by the minimum LAA (MinLAA), the dynamic range of navicular height (DeltaNH) and the range of LAA (DeltaLAA), while all measures were highly reliable. Marker location uncertainty and skin movement artifacts had the smallest effects on measures of NH. The use of an alignment device for marker placement was shown to reduce error ranges for NH measures. Therefore, DeltaNH and MinNH were recommended for functional dynamic foot characterization in the sagittal plane. There is potential for such measures to be a suitable predictor for overuse injuries while being obtainable in clinical settings. Future research needs to include such dynamic but simple foot assessments in large-scale clinical studies.

  7. Measuring ICT Use and Contributing Conditions in Primary Schools

    ERIC Educational Resources Information Center

    Vanderlinde, Ruben; Aesaert, Koen; van Braak, Johan

    2015-01-01

Information and communication technology (ICT) use has become of major importance for primary schools across the world, as ICT has the potential to foster teaching and learning processes. ICT use is therefore a central measurement concept (dependent variable) in many ICT integration studies. This data paper presents two datasets (2008 and 2011) that…

  8. Measuring ICT Use and Contributing Conditions in Primary Schools

    ERIC Educational Resources Information Center

    Vanderlinde, Ruben; Aesaert, Koen; van Braak, Johan

    2015-01-01

Information and communication technology (ICT) use has become of major importance for primary schools across the world, as ICT has the potential to foster teaching and learning processes. ICT use is therefore a central measurement concept (dependent variable) in many ICT integration studies. This data paper presents two datasets (2008 and 2011) that…

  9. Conditional Standard Errors of Measurement for Composite Scores Using IRT

    ERIC Educational Resources Information Center

    Kolen, Michael J.; Wang, Tianyou; Lee, Won-Chan

    2012-01-01

    Composite scores are often formed from test scores on educational achievement test batteries to provide a single index of achievement over two or more content areas or two or more item types on that test. Composite scores are subject to measurement error, and as with scores on individual tests, the amount of error variability typically depends on…

  10. Conditional Standard Errors of Measurement for Composite Scores Using IRT

    ERIC Educational Resources Information Center

    Kolen, Michael J.; Wang, Tianyou; Lee, Won-Chan

    2012-01-01

    Composite scores are often formed from test scores on educational achievement test batteries to provide a single index of achievement over two or more content areas or two or more item types on that test. Composite scores are subject to measurement error, and as with scores on individual tests, the amount of error variability typically depends on…

  11. Comparison of efficiency of distance measurement methodologies in mango (Mangifera indica) progenies based on physicochemical descriptors.

    PubMed

    Alves, E O S; Cerqueira-Silva, C B M; Souza, A M; Santos, C A F; Lima Neto, F P; Corrêa, R X

    2012-03-14

We investigated seven distance measures in a set of observations of physicochemical variables of mango (Mangifera indica) submitted to multivariate analyses (distance, projection and grouping). To estimate the distance measurements, five mango progenies (25 genotypes in total) were analyzed using six fruit physicochemical descriptors (fruit weight, equatorial diameter, longitudinal diameter, total soluble solids in °Brix, total titratable acidity, and pH). The distance measurements were compared by the Spearman correlation test, projection in two-dimensional space and grouping efficiency. The Spearman correlation coefficients between the seven distance measurements were high and significant (rs ≥ 0.91; P < 0.001), except for Mahalanobis' generalized distance (0.41 ≤ rs ≤ 0.63). Regardless of the origin of the distance matrix, the unweighted pair-group method with arithmetic mean (UPGMA) proved to be the most adequate grouping method. The various distance measurements and grouping methods gave different values for distortion (-116.5 ≤ D ≤ 74.5), cophenetic correlation (0.26 ≤ rc ≤ 0.76) and stress (-1.9 ≤ S ≤ 58.9). The choice of distance measurement and analysis methods therefore influences the results.
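The comparisons described above (rank correlation between distance matrices, UPGMA grouping, cophenetic correlation) can be sketched with standard SciPy routines; the data below are a random stand-in for the 25 x 6 descriptor matrix, not the study's measurements:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
X = rng.normal(size=(25, 6))       # stand-in: 25 genotypes x 6 descriptors

# Two of the candidate distance measures, as condensed distance vectors
d_euclid = pdist(X, metric="euclidean")
d_cityblock = pdist(X, metric="cityblock")

# Rank agreement between the two distance matrices (Spearman test)
rho, _ = spearmanr(d_euclid, d_cityblock)

# UPGMA corresponds to 'average' linkage; the cophenetic correlation
# scores how faithfully the dendrogram preserves the original distances
Z = linkage(d_euclid, method="average")
r_coph, _ = cophenet(Z, d_euclid)
```

Repeating the same loop over all seven distance measures reproduces the kind of pairwise comparison table reported in the abstract.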

  12. Contact Thermocouple Methodology and Evaluation for Temperature Measurement in the Laboratory

    NASA Technical Reports Server (NTRS)

    Brewer, Ethan J.; Pawlik, Ralph J.; Krause, David L.

    2013-01-01

Laboratory testing of advanced aerospace components very often requires highly accurate temperature measurement and control devices, as well as methods to precisely analyze and predict the performance of such components. Analysis of test articles depends on accurate measurements of temperature across the specimen. Where possible, this task is accomplished using many thermocouples welded directly to the test specimen, which can produce results with great precision. However, it is known that thermocouple spot welds can initiate deleterious cracks in some materials, prohibiting the use of welded thermocouples. Such is the case for the nickel-based superalloy MarM-247, which is used in the high temperature, high pressure heater heads for the Advanced Stirling Converter component of the Advanced Stirling Radioisotope Generator space power system. To overcome this limitation, a method was developed that uses small diameter contact thermocouples to measure the temperature of heater head test articles with the same level of accuracy as welded thermocouples. This paper includes a brief introduction and a background describing the circumstances that compelled the development of the contact thermocouple measurement method. Next, the paper describes studies performed on contact thermocouple readings to determine the accuracy of results. It then describes the developed measurement method in detail and evaluates the results produced. A further study that evaluates the performance of different measurement output devices is also described. Finally, a brief conclusion and summary of results is provided.

  13. A Novel Instrument and Methodology for the In-Situ Measurement of the Stress in Thin Films

    NASA Technical Reports Server (NTRS)

    Broadway, David M.; Omokanwaye, Mayowa O.; Ramsey, Brian D.

    2014-01-01

We introduce a novel methodology for the in-situ measurement of mechanical stress during thin film growth utilizing a highly sensitive non-contact variation of the classic spherometer. By exploiting the known spherical deformation of the substrate, the value of the stress-induced curvature is inferred from measurement of only one point on the substrate's surface: the sagittal depth. From the known curvature the stress can be calculated using the well-known Stoney equation. Based on this methodology, a stress sensor has been designed which is simple, highly sensitive, compact, and low cost. As a result of its compact nature, the sensor can be mounted in any orientation to accommodate a given deposition geometry without the need for extensive modification to an already existing deposition system. The technique employs a double-side polished substrate that offers good specular reflectivity and is isotropic in its mechanical properties, such as <111> oriented crystalline silicon or amorphous soda lime glass, for example. The measurement of the displacement of the uncoated side during deposition is performed with a high-resolution (i.e., 5 nm), commercially available, inexpensive fiber optic sensor which can be used in both high vacuum and high temperature environments (i.e., 10⁻⁷ Torr and 480 °C, respectively). A key attribute of this instrument lies in its potential to achieve sensitivity that rivals other measurement techniques such as the micro-cantilever method but, due to the comparatively larger substrate area, offers a more robust and practical alternative for subsequent measurement of additional characteristics of the film that might be correlated to film stress. We present measurement results for nickel films deposited by magnetron sputtering which show good qualitative agreement with the known behavior of polycrystalline films previously reported by Hoffman.
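The geometry behind this instrument is simple: for a spherically deformed substrate, the curvature radius follows from the single measured sagittal depth, and the stress then follows from the Stoney equation. A sketch with assumed, illustrative numbers (not the paper's measurements):

```python
def radius_from_sagitta(sag, half_chord):
    """Radius of curvature (m) of a spherical cap from its sagitta
    (center deflection, m) over a circle of the given half-chord (m)."""
    return half_chord ** 2 / (2.0 * sag) + sag / 2.0

def stoney_stress(E_s, nu_s, t_s, t_f, R):
    """Average film stress (Pa) from curvature radius via Stoney's equation."""
    return E_s * t_s ** 2 / (6.0 * (1.0 - nu_s) * t_f * R)

# Assumed: 50 nm sagitta over a 25 mm half-chord on a 1 mm thick
# soda-lime glass substrate (E ~72 GPa, nu ~0.23) with a 100 nm film
R = radius_from_sagitta(50e-9, 25e-3)                    # ~6250 m
sigma_film = stoney_stress(72e9, 0.23, 1e-3, 100e-9, R)  # ~25 MPa
```

The quoted 5 nm displacement resolution thus sets the smallest resolvable curvature change, and hence the stress resolution, for a given substrate and film thickness.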

  14. Thermal measurements for jets in disturbed and undisturbed crosswind conditions

    NASA Technical Reports Server (NTRS)

    Wark, Candace E.; Foss, John F.

    1988-01-01

A direct comparison is made of the thermal field properties for a low-disturbance and a high-disturbance condition affecting the low-temperature air jets introduced into gas turbine combustor aft sections, both to cool the high-temperature gases and to quench the combustion reactions. Sixty-four fast-response thermocouples were simultaneously sampled and corrected for their time-constant effect at a downstream plane close to the jet exit. Histograms formed from independent samples were sufficiently smooth to approximate a probability density function (pdf).
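The time-constant correction mentioned above treats each thermocouple as a first-order sensor, so the gas temperature can be recovered as T_gas ≈ T_tc + τ·dT_tc/dt. A sketch with an assumed sinusoidal signal, sampling rate, and time constant (all illustrative):

```python
import math

dt = 1e-3      # sample interval, s (1 kHz)
tau = 0.05     # assumed first-order sensor time constant, s
t = [i * dt for i in range(1000)]
true = [300.0 + 20.0 * math.sin(2.0 * math.pi * 5.0 * ti) for ti in t]

# Simulate the lagged reading of a first-order (single-pole) sensor
meas = [true[0]]
for k in range(1, len(t)):
    meas.append(meas[-1] + dt / tau * (true[k] - meas[-1]))

# Time-constant compensation: T_gas ~ T_tc + tau * dT_tc/dt,
# with the derivative taken by central differences
corrected = [meas[k] + tau * (meas[k + 1] - meas[k - 1]) / (2.0 * dt)
             for k in range(1, len(t) - 1)]
```

The corrected trace restores both the amplitude attenuation and the phase lag that the sensor's thermal inertia introduces.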

  15. Measurement of Rubidium Number Density Under Optically Thick Conditions

    DTIC Science & Technology

    2010-11-15

Voigt profiles. A Voigt line shape is represented by equations 4.1 and 4.2: g_Voigt(λ, λ_FF′) = (1/2π)(1/√π) ∫ Δλ_L exp(−t²) / (λ − λ_FF′ − t Δλ_D … Absorbance spectra measured at various cell conditions of temperature and pressure were then fit to a pressure-broadened Voigt profile, thereby allowing the determination of the rubidium number density. Background: In recent years, alkali metals have garnered a
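The Voigt line shape referenced in this excerpt (a Gaussian Doppler core convolved with a Lorentzian pressure-broadened wing) is commonly evaluated through the Faddeeva function; a generic sketch with illustrative widths, not the report's parameters:

```python
import numpy as np
from scipy.special import wofz

def voigt(x, sigma, gamma):
    """Area-normalized Voigt profile: convolution of a Gaussian
    (standard deviation sigma) with a Lorentzian (half-width gamma),
    evaluated through the Faddeeva function w(z)."""
    z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
    return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

# Detuning grid (arbitrary wavelength units) and assumed widths
x = np.linspace(-50.0, 50.0, 20001)
g = voigt(x, sigma=0.5, gamma=0.3)
area = g.sum() * (x[1] - x[0])   # ~1 up to the truncated Lorentzian tails
```

Fitting this profile to measured absorbance spectra yields the broadening widths and, via the integrated absorbance, the number density.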

  16. Development of Methodologies For The Analysis of The Efficiency of Flood Reduction Measures In The Rhine Basin On The Basis of Reference Floods (deflood)

    NASA Astrophysics Data System (ADS)

    Krahe, P.; Herpertz, D.; Buiteveld, H.; Busch, N.; Engel, H.; Helbig, A.; Naef, F.; Wilke, K.

After some years of extreme flooding in the 1990s, extended efforts were made to improve flood protection by means of integrated river basin management. Part of this strategy is the implementation of decentralised flood reduction measures (FRM). With this in mind, the CHR/IRMA-SPONGE project DEFLOOD was initiated. By establishing a set of methodological tools, this project aims at making a step further towards a quantitative hydrological evaluation of the effects of local FRM on flood generation in large river basins. The basin of the River Mosel, and in particular that of its tributary Saar, served as the case study area for testing the methodological approach. A framework for an integrated river basin modelling approach (FIRM - Flood Reduction), based on the generation of hydrometeorological reference conditions, precipitation-runoff modelling and flood routing procedures, was set up. In this approach, interfaces to incorporate the results of scenario calculations by meso-scale hydrological modelling are defined in order to study the downstream propagation of the effect of decentralised flood reduction measures, including the potential retention along minor rivers, in large rivers. Examples of scenario calculations are given. Based on the experience gained, the strategy for the use of the methodological framework within the context of river basin management practice is identified. The application of the methodology requires a set of actions which have to be put in place in the Rhine/Meuse basins. The recommendations suggest that, besides progress in hydrological modelling, a knowledge base needs to be built up and maintained which encompasses hydrologically relevant information on the actual state and prospected developments in the River Rhine basin. Furthermore, problem-oriented hydrological process studies in selected small-scale river basins ought to be carried out. Based on these studies, conceptual meso-scale modelling approaches can be improved.

  17. Extreme Sea Conditions in Shallow Water: Estimation based on in-situ measurements

    NASA Astrophysics Data System (ADS)

    Le Crom, Izan; Saulnier, Jean-Baptiste

    2013-04-01

The design of marine renewable energy devices and components is based, among other factors, on the assessment of the extreme environmental conditions (winds, currents, waves, and water level) that must be combined in order to evaluate the maximal loads on a floating or fixed structure, and on the anchoring system, over a given return period. Measuring devices are generally deployed at sea over relatively short durations (a few months to a few years), typically when describing water free-surface elevation, so extrapolation methods based on hindcast data (and therefore on wave simulation models) have to be used. How to combine, in a realistic way, the action of the different loads (winds and waves for instance) and which correlation of return periods should be used are highly topical issues. However, the assessment of the extreme condition itself remains a crucial, sensitive, and not fully solved task. In shallow water above all, the extreme wave height, Hmax, is the most significant contribution in the dimensioning process of MRE devices. As a case study, existing methodologies for deep water have been applied to SEMREV, the French marine energy test site. The interest of this study, especially at this location, goes beyond the simple application to the deployment of SEMREV's wave energy converters and floating wind turbines, as it could also be extended to the Banc de Guérande offshore wind farm planned close by and, more generally, to pipes and communication cables, for which the same problem recurs. The paper will first present the existing measurements (wave and wind on site) and the prediction chain that has been developed via wave models, then the extrapolation methods applied to hindcast data, and will try to formulate recommendations for improving this assessment in shallow water.
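A standard building block for such extrapolations is the block-maxima approach: fit a generalized extreme value (GEV) distribution to annual maxima from the hindcast and read off the desired return level. A sketch on synthetic annual maxima (the series and parameters below are assumptions, not SEMREV data):

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical 40-year series of annual-maximum significant wave heights (m),
# standing in for hindcast output at a shallow-water site
rng = np.random.default_rng(1)
annual_max_hs = genextreme.rvs(c=-0.1, loc=6.0, scale=0.8, size=40,
                               random_state=rng)

# Fit the GEV distribution to the annual maxima (block-maxima approach)
c, loc, scale = genextreme.fit(annual_max_hs)

# 50-year return level: the quantile exceeded with probability 1/50 per year
h50 = genextreme.ppf(1.0 - 1.0 / 50.0, c, loc=loc, scale=scale)
```

In shallow water, depth-induced breaking caps the attainable wave height, so a fit like this must be checked against the local depth limitation before use in design.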

  18. Setting the light conditions for measuring root transparency for age-at-death estimation methods.

    PubMed

    Adserias-Garriga, Joe; Nogué-Navarro, Laia; Zapico, Sara C; Ubelaker, Douglas H

    2017-03-30

Age-at-death estimation is one of the main goals in forensic identification, being an essential parameter of the biological profile and narrowing the possibilities of identification in cases involving missing persons and unidentified bodies. The study of dental tissues has long been considered a proper tool for age estimation, with several age estimation methods based on them. Dental age estimation methods can be divided into three categories: tooth formation and development, post-formation changes, and histological changes. While tooth formation and growth changes are informative for fetuses and infants, once dental and skeletal growth has ended, post-formation or biochemical changes can be applied. Lamendin et al. (J Forensic Sci 37:1373-1379, 1992) developed an adult age estimation method based on root transparency and periodontal recession. The regression formula demonstrated its accuracy for individuals aged 40 to 70 years. Later on, Prince and Ubelaker (J Forensic Sci 47(1):107-116, 2002) evaluated the effects of ancestry and sex and incorporated root height into the equation, developing four new regression formulas for males and females of African and European ancestry. Even though root transparency is a key element in the method, the conditions for measuring this element have not been established. The aim of the present study is to set the light conditions, measured in lumens, that offer the greatest accuracy when applying the Lamendin et al. method as modified by Prince and Ubelaker. The results should also be taken into account in the application of other age estimation methodologies that use root transparency to estimate age-at-death.
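For orientation, the original Lamendin et al. regression is commonly cited in the form sketched below; the coefficients are the commonly quoted ones and the input values are invented, so verify against the original paper (and the Prince and Ubelaker revisions) before any real use:

```python
def lamendin_age(periodontosis_mm, transparency_mm, root_height_mm):
    """Adult age estimate (years) from the commonly cited Lamendin et al.
    (1992) regression: A = 0.18*P + 0.42*T + 25.53, where P and T are the
    periodontosis and root transparency heights expressed as a
    percentage of root height."""
    P = periodontosis_mm * 100.0 / root_height_mm
    T = transparency_mm * 100.0 / root_height_mm
    return 0.18 * P + 0.42 * T + 25.53

# Invented example measurements on a single-rooted tooth
age = lamendin_age(periodontosis_mm=3.0, transparency_mm=6.0,
                   root_height_mm=14.0)   # ~47 years
```

Since transparency height enters the formula directly, any systematic bias from the lighting used to read it propagates straight into the age estimate, which is the motivation of the study above.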

  19. Towards a methodology for validation of centrality measures in complex networks.

    PubMed

    Batool, Komal; Niazi, Muaz A

    2014-01-01

Living systems are associated with social networks: networks made up of nodes, some of which may be more important in various aspects than others. While different quantitative measures labeled as "centralities" have previously been used in the network analysis community to find influential nodes in a network, it is debatable how valid the centrality measures actually are. In other words, the research question that remains unanswered is: how exactly do these measures perform in the real world? So, as an example, if the centrality of a particular node identifies it as important, is the node actually important? The goal of this paper is not just to perform a traditional social network analysis but rather to evaluate different centrality measures by conducting an empirical study analyzing exactly how network centralities correlate with data from published multidisciplinary network data sets. We take standard published network data sets while using a random network to establish a baseline. These data sets included Zachary's Karate Club network, a dolphin social network and a neural network of the nematode Caenorhabditis elegans. Each of the data sets was analyzed in terms of different centrality measures and compared with existing knowledge from associated published articles to review the role of each centrality measure in the determination of influential nodes. Our empirical analysis demonstrates that in the chosen network data sets, nodes which had a high Closeness Centrality also had a high Eccentricity Centrality. Likewise, high Degree Centrality correlated closely with high Eigenvector Centrality, whereas Betweenness Centrality varied according to network topology and did not demonstrate any noticeable pattern. In terms of identification of key nodes, we discovered that, as compared with other centrality measures, Eigenvector and Eccentricity Centralities were better able to identify important nodes.
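Correlations like those reported (e.g. degree vs. eigenvector centrality) are easy to reproduce on the same public data; a sketch using NetworkX's bundled copy of Zachary's Karate Club (the specific correlation checks are illustrative, not the paper's exact analysis):

```python
import networkx as nx
from scipy.stats import spearmanr

G = nx.karate_club_graph()   # Zachary's Karate Club, one of the data sets above

# Three of the centrality measures compared in the study
deg = nx.degree_centrality(G)
eig = nx.eigenvector_centrality(G, max_iter=1000)
clo = nx.closeness_centrality(G)

# Rank correlation between centrality rankings across all nodes
nodes = sorted(G.nodes())
rho_deg_eig, _ = spearmanr([deg[n] for n in nodes], [eig[n] for n in nodes])
rho_deg_clo, _ = spearmanr([deg[n] for n in nodes], [clo[n] for n in nodes])
```

Betweenness and eccentricity can be added the same way (`nx.betweenness_centrality`, `nx.eccentricity`) to mirror the full comparison.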

  20. Towards a Methodology for Validation of Centrality Measures in Complex Networks

    PubMed Central

    2014-01-01

Background Living systems are associated with social networks: networks made up of nodes, some of which may be more important in various aspects than others. While different quantitative measures labeled as "centralities" have previously been used in the network analysis community to find influential nodes in a network, it is debatable how valid the centrality measures actually are. In other words, the research question that remains unanswered is: how exactly do these measures perform in the real world? So, as an example, if the centrality of a particular node identifies it as important, is the node actually important? Purpose The goal of this paper is not just to perform a traditional social network analysis but rather to evaluate different centrality measures by conducting an empirical study analyzing exactly how network centralities correlate with data from published multidisciplinary network data sets. Method We take standard published network data sets while using a random network to establish a baseline. These data sets included Zachary's Karate Club network, a dolphin social network and a neural network of the nematode Caenorhabditis elegans. Each of the data sets was analyzed in terms of different centrality measures and compared with existing knowledge from associated published articles to review the role of each centrality measure in the determination of influential nodes. Results Our empirical analysis demonstrates that in the chosen network data sets, nodes which had a high Closeness Centrality also had a high Eccentricity Centrality. Likewise, high Degree Centrality correlated closely with high Eigenvector Centrality, whereas Betweenness Centrality varied according to network topology and did not demonstrate any noticeable pattern. In terms of identification of key nodes, we discovered that, as compared with other centrality measures, Eigenvector and Eccentricity Centralities were better able to identify important nodes.

  1. Nonequilibrium free-energy estimation conditioned on measurement outcomes

    NASA Astrophysics Data System (ADS)

    Asban, Shahaf; Rahav, Saar

    2017-08-01

The Jarzynski equality is one of the most influential results in the field of nonequilibrium statistical mechanics. This celebrated equality allows the calculation of equilibrium free-energy differences from work distributions of nonequilibrium processes. In practice, such calculations often suffer from poor convergence due to the need to sample rare events. Here we examine whether the inclusion of measurement and feedback can improve the convergence of nonequilibrium free-energy calculations. A modified version of the Jarzynski equality, in which realizations with a given outcome are kept while others are discarded, is used. We find that discarding realizations with unwanted outcomes can result in improved convergence compared to calculations based on the Jarzynski equality. We argue that the observed improvement in convergence is closely related to Bennett's acceptance ratio method, which was developed without any reference to measurements or feedback.
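The plain Jarzynski estimator that the paper builds on is ΔF = −kT ln⟨exp(−W/kT)⟩, averaged over nonequilibrium work samples. A self-contained numerical check on a synthetic Gaussian work distribution (all values assumed for illustration):

```python
import math
import random

random.seed(0)
kT = 1.0
dF_true = 1.0   # target free-energy difference (assumed, in units of kT)

# Gaussian work distribution consistent with the Jarzynski equality:
# for W ~ N(mu, s^2), <exp(-W/kT)> = exp(-dF/kT) when mu = dF + s^2/(2 kT),
# i.e. the dissipated work shifts the mean above dF
s = 1.0
mu = dF_true + s ** 2 / (2.0 * kT)
works = [random.gauss(mu, s) for _ in range(200_000)]

# Jarzynski estimator: dF = -kT * ln <exp(-W/kT)>
mean_exp = sum(math.exp(-w / kT) for w in works) / len(works)
dF_est = -kT * math.log(mean_exp)
```

For broader work distributions the exponential average is dominated by rare low-work realizations, which is exactly the convergence problem the conditioning scheme in the paper targets.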

  2. Doctoral Training in Statistics, Measurement, and Methodology in Psychology: Replication and Extension of Aiken, West, Sechrest, and Reno's (1990) Survey of PhD Programs in North America

    ERIC Educational Resources Information Center

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…

  3. Doctoral Training in Statistics, Measurement, and Methodology in Psychology: Replication and Extension of Aiken, West, Sechrest, and Reno's (1990) Survey of PhD Programs in North America

    ERIC Educational Resources Information Center

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…

  4. Technical Guide Documenting Methodology, Indicators, and Data Sources for "Measuring Up 2004: The National and State Report Card on Higher Education." National Center #04-6

    ERIC Educational Resources Information Center

    Ryu, Mikyung

    2004-01-01

    This "Technical Guide" describes the methodology and concepts used to measure and grade the performance of the 50 states in the higher education arena. Part I presents the methodology for grading states and provides information on data collection and reporting. Part II explains the indicators that comprise each of the graded categories.…

  5. Conditionally-Sampled Turbulent and Nonturbulent Measurements of Entropy Generation Rate in the Transition Region of Boundary Layers

    SciTech Connect

    D. M. McEligot; J. R. Wolf; K. P. Nolan; E. J. Walsh; R. J. Volino

    2006-05-01

Conditionally-sampled boundary layer data for an accelerating transitional boundary layer have been analyzed to calculate the entropy generation rate in the transition region. By weighting the nondimensional dissipation coefficient for the laminar-conditioned data and the turbulent-conditioned data with the intermittency factor, the average entropy generation rate in the transition region can be determined and hence compared to the time-averaged data and correlations for steady laminar and turbulent flows. It is demonstrated that this method provides, for the first time, an accurate and detailed picture of the entropy generation rate during transition. The data used in this paper have been taken from detailed boundary layer measurements available in the literature. This paper provides, using an intermittency-weighted approach, a methodology for predicting entropy generation in a transitional boundary layer.
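The intermittency weighting described above reduces to a convex combination of the two conditioned dissipation coefficients; a trivial sketch (the numeric values are illustrative, not the paper's data):

```python
def transitional_cd(gamma, cd_laminar, cd_turbulent):
    """Intermittency-weighted dissipation coefficient in the transition
    region: laminar- and turbulent-conditioned values blended by the
    intermittency factor gamma (fraction of time the flow is turbulent)."""
    return (1.0 - gamma) * cd_laminar + gamma * cd_turbulent

# Illustrative values only: 40% intermittency
cd = transitional_cd(gamma=0.4, cd_laminar=0.002, cd_turbulent=0.005)
```

At gamma = 0 the expression recovers the laminar-conditioned value and at gamma = 1 the turbulent-conditioned value, so integrating it through the transition region bridges the steady laminar and turbulent correlations.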

  6. Microcalorimetric assays for measuring cell growth and metabolic activity: methodology and applications.

    PubMed

    Braissant, O; Bachmann, A; Bonkat, G

    2015-04-01

Isothermal microcalorimetry measures the heat released or consumed by physical or chemical processes. Metabolic activity releases heat that can be measured; however, the calorimetric signal can be difficult for new users to interpret. This paper compares microcalorimetry to other techniques and reviews its application in different fields where microbiological activity is important. We also describe different ways to analyze the data and translate it into meaningful (micro)biological equivalents. This paper aims to provide the non-specialist reader with the tools to understand how isothermal microcalorimetry can be used for microbiological applications. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. A methodology suitable for TEM local measurements of carbon concentration in retained austenite

    SciTech Connect

Kammouni, A.; Saikaly, W.; Dumont, M.; Marteau, C.; Bano, X.; Charai, A.

    2008-09-15

Carbon concentration in retained austenite grains is of great importance in determining the mechanical properties of hot-rolled TRansformation Induced Plasticity (TRIP) steels. Among the different techniques available to measure such concentrations, Kikuchi lines obtained in Transmission Electron Microscopy provide a relatively easy and accurate method. The major problem, however, is locating an austenitic grain in the Transmission Electron Microscopy thin foil. Focused Ion Beam milling in combination with Scanning Electron Microscopy was used to successfully prepare a thin foil for Transmission Electron Microscopy and carbon concentration measurements from a 700 nm retained austenite grain.

  8. Conditions necessary for low-level measurements of reactive oxidants

    SciTech Connect

    Nakareseisoon, S.

    1988-01-01

Chlorine dioxide and ozone are considered to be the alternatives to chlorine for the disinfection of drinking water supplies and also for the treatment of wastewaters prior to discharge. Chlorine dioxide, under normal circumstances, is reduced to chlorite ion, which is toxic. The recommended seven-day suggested no-adverse-response level (SNARL) for chlorite ion is 0.007 mg/L (7 ppb). Chlorite ion at these low levels cannot be satisfactorily determined by existing methods, so it became necessary to develop an analytical method for determining ppb levels of chlorite ion. Such a method can be developed using differential pulse polarography (DPP). The electrochemical reduction of chlorite ion has been studied between pH 3.7-14 and in an ionic strength range of 0.05-3.0 M. The optimum conditions are pH 4.1-4.4 and an ionic strength of 0.45 M. The current under these conditions is a linear function of chlorite ion concentration ranging from 2.77 × 10⁻⁷ to 2.80 × 10⁻⁴ M (19 ppb to 19 ppm). The imprecision is better than ±1.0% and ±3.4% at concentrations of 2.87 × 10⁻⁵ M and 1.74 × 10⁻⁶ M, respectively, with a detection limit of 1 × 10⁻⁷ M (7 ppb). The rate of ozone decomposition has been studied in highly basic solutions (8-15 M NaOH), where ozone becomes stable. A mechanism of ozone regeneration was proposed to explain the observed kinetics and to clarify the contradiction concerning the very slow observed rate of ozone decomposition in basic solution.
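Quantitation in such a DPP method comes down to a linear calibration of peak current against concentration within the stated linear range; a generic least-squares sketch with invented data (not the study's measurements):

```python
# Invented calibration points spanning the linear range reported above
conc = [5e-7, 1e-6, 5e-6, 1e-5, 5e-5, 1e-4]        # chlorite, mol/L
current = [0.051, 0.102, 0.498, 1.01, 4.97, 10.1]  # peak current, uA

# Ordinary least-squares fit of current = slope * conc + intercept
n = len(conc)
mx = sum(conc) / n
my = sum(current) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, current))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

# Read back an unknown sample's concentration from its measured peak current
i_unknown = 2.5                              # uA, hypothetical sample
c_unknown = (i_unknown - intercept) / slope  # mol/L
```

The detection limit then corresponds to the smallest current distinguishable from the blank, reported above as 1 × 10⁻⁷ M.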

  9. Rigorous Measures of Implementation: A Methodological Framework for Evaluating Innovative STEM Programs

    ERIC Educational Resources Information Center

    Cassata-Widera, Amy; Century, Jeanne; Kim, Dae Y.

    2011-01-01

    The practical need for multidimensional measures of fidelity of implementation (FOI) of reform-based science, technology, engineering, and mathematics (STEM) instructional materials, combined with a theoretical need in the field for a shared conceptual framework that could support accumulating knowledge on specific enacted program elements across…

  10. Standardized Test Methodology for Measuring Pressure Suit Glove Performance and Demonstration Units.

    DTIC Science & Technology

    1994-01-01

Foam Sensor Data: Run 3 … Tekscan Ink Sensor Data: Run 1 … Tekscan Ink Sensor Data: Run 2 … Tekscan Ink Sensor Data … sensor and Tekscan techniques for range of motion measurement. With the foam sensor, runs 1-3, the pressure was increased by 1 psi every 60 seconds up to

  11. The Sidewalk Survey: A Field Methodology to Measure Late-Night College Drinking

    ERIC Educational Resources Information Center

    Johnson, Mark B.; Lange, James E.; Voas, Robert B.; Clapp, John D.; Lauer, Elizabeth; Snowden, Cecelia B.

    2006-01-01

    Alcohol use is highly prevalent among U.S. college students, and alcohol-related problems are often considered the most serious public health threat on American college campuses. Although empirical examinations of college drinking have relied primarily on self-report measures, several investigators have implemented field studies to obtain…

  12. METHODOLOGY FOR MEASURING PM 2.5 SEPARATOR CHARACTERISTICS USING AN AEROSIZER

    EPA Science Inventory

    A method is presented that enables the measurement of the particle size separation characteristics of an inertial separator in a rapid fashion. Overall penetration is determined for discrete particle sizes using an Aerosizer (Model LD, TSI, Incorporated, Particle Instruments/Am...

  13. Measuring the Impact of University Technology Transfer: A Guide to Methodologies, Data Needs, and Sources

    ERIC Educational Resources Information Center

    Lowe, Robert A.; Quick, Suzanne K.

    2005-01-01

    This paper discusses measures that capture the impact of university technology transfer activities on a university's local and regional economies (economic impact). Such assessments are of increasing interest to policy makers, researchers and technology transfer professionals, yet there have been few published discussions of the merits of various…

  14. METHODOLOGY FOR MEASURING PM 2.5 SEPARATOR CHARACTERISTICS USING AN AEROSIZER

    EPA Science Inventory

    A method is presented that enables the measurement of the particle size separation characteristics of an inertial separator in a rapid fashion. Overall penetration is determined for discrete particle sizes using an Aerosizer (Model LD, TSI, Incorporated, Particle Instruments/Am...

  15. METHODOLOGICAL APPROACH FOR MEASURING PRIORITY DBPS IN REVERSE OSMOSIS CONCENTRATED DRINKING WATER

    EPA Science Inventory

    Many disinfection by-products (DBPs) are formed when drinking water is chlorinated, but only a few are routinely measured or regulated. Various studies have revealed a plethora of DBPs for which sensitive and quantitative analytical methods have always been a major limiting facto...

  16. A novel, non-invasive transdermal fluid sampling methodology: IGF-I measurement following exercise

    USDA-ARS?s Scientific Manuscript database

    This study tested the hypothesis that transdermal fluid (TDF) provides a more sensitive and accurate measure of exercise-induced increases in insulin-like growth factor-I (IGF-I) than serum, and that these increases are detectable proximal, but not distal, to the exercising muscle. A novel, noninvas...

  17. Measurement of masticatory forces and implant loads: a methodologic clinical study.

    PubMed

    Morneburg, Thomas R; Pröschel, Peter A

    2002-01-01

    The aim of this study was to measure vertical masticatory forces in vivo using a method that should be insensitive to the location of bite force impact. Two exchangeable implant abutments were equipped with strain gauges. In nine patients, the abutments were attached to implants supporting three-unit fixed partial dentures (FPD) in one mandibular chewing center. The signals of the two abutments were summed to give a force reading that was independent of the location of force impact along the FPD. In two subjects, an additional strain gauge was fixed under the pontic. With both setups, masticatory forces were measured during chewing of wine gum. Total masticatory force displayed by the sum signal proved to be independent of the site of force application. Pontic strain gauges indicated only 42% or 84% of the force measured simultaneously by the corresponding sum signal of the abutments. In all nine patients, a mean total masticatory force of 220 N, with a maximum of 450 N, was found. The single abutments experienced mean loads of 91 N (anterior) and 129 N (posterior), with a maximum of 314 N. Measuring chewing force via bending of a pontic involves the risk of underestimation. Masticatory forces obtained with a method that was insensitive to the site of force application were higher than forces found with some other setups.

  18. METHODOLOGICAL APPROACH FOR MEASURING PRIORITY DBPS IN REVERSE OSMOSIS CONCENTRATED DRINKING WATER

    EPA Science Inventory

    Many disinfection by-products (DBPs) are formed when drinking water is chlorinated, but only a few are routinely measured or regulated. Various studies have revealed a plethora of DBPs for which sensitive and quantitative analytical methods have always been a major limiting facto...

  19. Auditory evoked potential measurement methodology for odontocetes and a comparison of measured thresholds with those obtained using psychophysical techniques

    NASA Astrophysics Data System (ADS)

    Nachtigall, Paul E.; Yuen, Michelle; Mooney, T. Aran; Taylor, Kristen

    2005-04-01

    Most measurements of the hearing capabilities of toothed whales and dolphins have been taken using traditional psychophysical procedures in which the animals have been maintained in laboratory environments and trained to behaviorally report the sensation or difference of acoustic stimuli. Because of the advantage of rapid data collection, increased opportunities, and new methods, Auditory Evoked Potentials (AEPs) have become increasingly used to measure audition. The use of this new procedure calls into question the comparability of the established literature and the new results collected with AEPs. The results of behavioral and AEP methods have been directly compared with basic audiogram measurements and have been shown to produce similar (but not exactly the same) values when the envelope following response procedure has been used and the length of the stimulus is taken into account. The AEP methods allow possible audiometric opportunities beyond those available with conventional psychophysics including: (1) the measurement of stranded dolphins and whales that may never be kept in laboratories, (2) the testing of stranded animals for hearing deficits perhaps caused by overexposure to noise, and (3) passive testing of hearing mechanisms while animals actively echolocate. [Work supported by the Office of Naval Research and NOAA-NMFS.]

  20. Quantitative colorimetric measurement of cellulose degradation under microbial culture conditions.

    PubMed

    Haft, Rembrandt J F; Gardner, Jeffrey G; Keating, David H

    2012-04-01

    We have developed a simple, rapid, quantitative colorimetric assay to measure cellulose degradation based on the absorbance shift of Congo red dye bound to soluble cellulose. We term this assay "Congo Red Analysis of Cellulose Concentration," or "CRACC." CRACC can be performed directly in culture media, including rich and defined media containing monosaccharides or disaccharides (such as glucose and cellobiose). We show example experiments from our laboratory that demonstrate the utility of CRACC in probing enzyme kinetics, quantifying cellulase secretion, and assessing the physiology of cellulolytic organisms. CRACC complements existing methods to assay cellulose degradation, and we discuss its utility for a variety of applications.
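
    A CRACC-style readout reduces to a standard curve relating Congo red absorbance to soluble cellulose concentration. The numbers below are hypothetical; only the workflow (fit the curve, invert it, difference two time points) mirrors the assay described.

```python
import numpy as np

# Illustrative CRACC-style standard curve: absorbance of Congo red bound to
# known soluble-cellulose concentrations (g/L). All values are invented.
std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
std_abs = np.array([0.02, 0.21, 0.40, 0.78, 1.54])

slope, intercept = np.polyfit(std_conc, std_abs, 1)

def cellulose_g_per_L(absorbance):
    """Invert the standard curve: absorbance -> soluble cellulose (g/L)."""
    return (absorbance - intercept) / slope

def percent_degraded(abs_t0, abs_t):
    """Percentage of cellulose removed between two culture time points."""
    c0, ct = cellulose_g_per_L(abs_t0), cellulose_g_per_L(abs_t)
    return 100.0 * (c0 - ct) / c0
```

    Because the readings can be taken directly in culture medium, the same two-point difference works for enzyme kinetics or growth-physiology time courses.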

  1. Radon-222 activity flux measurement using activated charcoal canisters: revisiting the methodology.

    PubMed

    Alharbi, Sami H; Akber, Riaz A

    2014-03-01

    The measurement of radon ((222)Rn) activity flux using activated charcoal canisters was examined to investigate the distribution of the adsorbed (222)Rn in the charcoal bed and the relationship between (222)Rn activity flux and exposure time. The activity flux of (222)Rn from five sources of varying strengths was measured for exposure times of one, two, three, five, seven, 10, and 14 days. The distribution of the adsorbed (222)Rn in the charcoal bed was obtained by dividing the bed into six layers and counting each layer separately after the exposure. (222)Rn activity decreased in the layers that were away from the exposed surface. Nevertheless, the results demonstrated that only a small correction might be required in the actual application of charcoal canisters for activity flux measurement, where calibration standards were often prepared by the uniform mixing of radium ((226)Ra) in the matrix. This was because the diffusion of (222)Rn in the charcoal bed and the detection efficiency as a function of the charcoal depth tended to counterbalance each other. The influence of exposure time on the measured (222)Rn activity flux was observed in two situations of the canister exposure layout: (a) canister sealed to an open bed of the material and (b) canister sealed over a jar containing the material. The measured (222)Rn activity flux decreased as the exposure time increased. The change in the former situation was significant, with an exponential decrease as the exposure time increased. In the latter case, a smaller reduction in the observed activity flux with exposure time was noticed. This reduction might have been related to factors such as adsorption site saturation or the back diffusion of (222)Rn gas occurring at the canister-soil interface.
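
    A common way to convert the activity collected on a canister into a flux is to assume a constant flux over the exposure and correct for Rn-222 decay during collection. The sketch below uses that standard build-up relation; the canister area and activity are invented, and the saturation/back-diffusion effects discussed above are deliberately not modelled.

```python
import math

RN222_LAMBDA = math.log(2) / 3.8235  # Rn-222 decay constant (per day)

def rn222_flux(activity_Bq, area_m2, exposure_days):
    """
    Estimate Rn-222 activity flux (Bq m^-2 d^-1) from the activity adsorbed
    on a charcoal canister at the end of exposure. Assumes a constant flux J
    over the exposure and corrects for decay during collection:
    A(T) = J * S * (1 - exp(-lambda*T)) / lambda.
    """
    buildup = (1.0 - math.exp(-RN222_LAMBDA * exposure_days)) / RN222_LAMBDA
    return activity_Bq / (area_m2 * buildup)

# Example: 50 Bq collected on a 0.018 m^2 canister face in a 3-day exposure
# (all numbers illustrative).
flux = rn222_flux(50.0, 0.018, 3.0)
```

    For exposures short compared with the 3.82-day half-life the expression reduces to activity / (area × time), as expected.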

  2. Measuring functional connectivity using MEG: Methodology and comparison with fcMRI

    PubMed Central

    Brookes, Matthew J.; Hale, Joanne R.; Zumer, Johanna M.; Stevenson, Claire M.; Francis, Susan T.; Barnes, Gareth R.; Owen, Julia P.; Morris, Peter G.; Nagarajan, Srikantan S.

    2011-01-01

    Functional connectivity (FC) between brain regions is thought to be central to the way in which the brain processes information. Abnormal connectivity is thought to be implicated in a number of diseases. The ability to study FC is therefore a key goal for neuroimaging. Functional connectivity (fc) MRI has become a popular tool to make connectivity measurements but the technique is limited by its indirect nature. A multimodal approach is therefore an attractive means to investigate the electrodynamic mechanisms underlying hemodynamic connectivity. In this paper, we investigate resting state FC using fcMRI and magnetoencephalography (MEG). In fcMRI, we exploit the advantages afforded by ultra-high magnetic field. In MEG, we apply envelope correlation and coherence techniques to source space projected MEG signals. We show that beamforming provides an excellent means to measure FC in source space using MEG data. However, care must be taken when interpreting these measurements since cross talk between voxels in source space can potentially lead to spurious connectivity and this must be taken into account in all studies of this type. We show good spatial agreement between FC measured independently using MEG and fcMRI; FC between sensorimotor cortices was observed using both modalities, with the best spatial agreement when MEG data are filtered into the β band. This finding helps to reduce the potential confounds associated with each modality alone: while it helps reduce the uncertainties in spatial patterns generated by MEG (brought about by the ill-posed inverse problem), the addition of an electrodynamic metric confirms the neural basis of fcMRI measurements. Finally, we show that multiple MEG-based FC metrics offer the potential to move beyond what is possible using fcMRI, and investigate the nature of electrodynamic connectivity. Our results extend those from previous studies and add weight to the argument that neural oscillations are intimately related to functional
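
    The envelope-correlation metric applied to source-space signals can be sketched generically: band-pass each time series (here the β band), take the amplitude envelope of the analytic signal, and correlate the envelopes. This illustration uses an FFT-based Hilbert transform and a hard frequency mask; it is not the authors' beamformer pipeline.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT (same idea as scipy.signal.hilbert)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def band_envelope(x, fs, lo, hi):
    """Band-pass x to [lo, hi] Hz with a hard FFT mask, then take the
    amplitude envelope of the analytic signal."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[(freqs < lo) | (freqs > hi)] = 0.0
    xb = np.fft.irfft(X, n=len(x))
    return np.abs(analytic_signal(xb))

def envelope_correlation(x, y, fs, lo=13.0, hi=30.0):
    """Amplitude-envelope correlation of two time series in a band
    (default: the beta band, where MEG/fcMRI agreement was best)."""
    ex, ey = band_envelope(x, fs, lo, hi), band_envelope(y, fs, lo, hi)
    return np.corrcoef(ex, ey)[0, 1]
```

    Two β-band signals sharing a common slow amplitude modulation should yield an envelope correlation close to 1, even when their carrier frequencies and phases differ.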

  3. System Error Compensation Methodology Based on a Neural Network for a Micromachined Inertial Measurement Unit

    PubMed Central

    Liu, Shi Qiang; Zhu, Rong

    2016-01-01

    Error compensation of micromachined inertial measurement units (MIMU) is essential in practical applications. This paper presents a new compensation method using a neural-network-based identification for MIMU, which capably solves the universal problems of cross-coupling, misalignment, eccentricity, and other deterministic errors existing in a three-dimensional integrated system. Using a neural network to model a complex multivariate and nonlinear coupling system, the errors could be readily compensated through a comprehensive calibration. In this paper, we also present a thermal-gas MIMU based on thermal expansion, which measures three-axis angular rates and three-axis accelerations using only three thermal-gas inertial sensors, each of which capably measures one-axis angular rate and one-axis acceleration simultaneously in one chip. The developed MIMU (100 × 100 × 100 mm3) possesses the advantages of simple structure, high shock resistance, and large measuring ranges (three-axis angular rates of ±4000°/s and three-axis accelerations of ±10 g) compared with conventional MIMU, due to using a gas medium instead of a mechanical proof mass as the key moving and sensing elements. However, the gas MIMU suffers from cross-coupling effects, which corrupt the system accuracy. The proposed compensation method is, therefore, applied to compensate the system errors of the MIMU. Experiments validate the effectiveness of the compensation, and the measurement errors of three-axis angular rates and three-axis accelerations are reduced to less than 1% and 3% of uncompensated errors in the rotation range of ±600°/s and the acceleration range of ±1 g, respectively. PMID:26840314
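
    The idea of neural-network error compensation can be illustrated on synthetic data: generate "true" six-axis signals, corrupt them with a fixed cross-coupling matrix plus a mild nonlinearity, and train a small network to invert the distortion. Everything below (the coupling model, network size, and training schedule) is invented for the sketch and is not the paper's calibration procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for calibration data: "true" six-axis signals
# (3 angular rates + 3 accelerations, normalized), distorted by a fixed
# cross-coupling matrix plus a mild nonlinearity. All of this is invented.
true = rng.uniform(-1.0, 1.0, size=(1000, 6))
coupling = np.eye(6) + 0.1 * rng.standard_normal((6, 6))
raw = true @ coupling.T + 0.05 * np.tanh(3.0 * true)

# One-hidden-layer network mapping raw sensor outputs back to true values.
W1 = 0.4 * rng.standard_normal((6, 32)); b1 = np.zeros(32)
W2 = 0.2 * rng.standard_normal((32, 6)); b2 = np.zeros(6)
lr = 0.1

for _ in range(3000):
    h = np.tanh(raw @ W1 + b1)            # forward pass
    err = (h @ W2 + b2) - true
    # backpropagation of squared error (constant factors folded into lr)
    gW2 = h.T @ err / len(raw); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)
    gW1 = raw.T @ dh / len(raw); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

pred = np.tanh(raw @ W1 + b1) @ W2 + b2
residual = np.abs(pred - true).mean()       # compensated error
uncompensated = np.abs(raw - true).mean()   # raw cross-coupling error
```

    The trained network absorbs the cross-coupling and the nonlinearity in one identification step, which is the attraction of the approach over per-axis polynomial calibration.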

  4. System Error Compensation Methodology Based on a Neural Network for a Micromachined Inertial Measurement Unit.

    PubMed

    Liu, Shi Qiang; Zhu, Rong

    2016-01-29

    Error compensation of micromachined inertial measurement units (MIMU) is essential in practical applications. This paper presents a new compensation method using a neural-network-based identification for MIMU, which capably solves the universal problems of cross-coupling, misalignment, eccentricity, and other deterministic errors existing in a three-dimensional integrated system. Using a neural network to model a complex multivariate and nonlinear coupling system, the errors could be readily compensated through a comprehensive calibration. In this paper, we also present a thermal-gas MIMU based on thermal expansion, which measures three-axis angular rates and three-axis accelerations using only three thermal-gas inertial sensors, each of which capably measures one-axis angular rate and one-axis acceleration simultaneously in one chip. The developed MIMU (100 × 100 × 100 mm³) possesses the advantages of simple structure, high shock resistance, and large measuring ranges (three-axis angular rates of ±4000°/s and three-axis accelerations of ±10 g) compared with conventional MIMU, due to using a gas medium instead of a mechanical proof mass as the key moving and sensing elements. However, the gas MIMU suffers from cross-coupling effects, which corrupt the system accuracy. The proposed compensation method is, therefore, applied to compensate the system errors of the MIMU. Experiments validate the effectiveness of the compensation, and the measurement errors of three-axis angular rates and three-axis accelerations are reduced to less than 1% and 3% of uncompensated errors in the rotation range of ±600°/s and the acceleration range of ±1 g, respectively.

  5. Radio weak lensing shear measurement in the visibility domain - I. Methodology

    NASA Astrophysics Data System (ADS)

    Rivi, M.; Miller, L.; Makhathini, S.; Abdalla, F. B.

    2016-12-01

    The high sensitivity of the new generation of radio telescopes such as the Square Kilometre Array (SKA) will allow cosmological weak lensing measurements at radio wavelengths that are competitive with optical surveys. We present an adaptation to radio data of lensfit, a method for galaxy shape measurement originally developed and used for optical weak lensing surveys. This likelihood method uses an analytical galaxy model and makes a Bayesian marginalization of the likelihood over uninteresting parameters. It has the feature of working directly in the visibility domain, which is the natural approach to adopt with radio interferometer data, avoiding systematics introduced by the imaging process. As a proof of concept, we provide results for visibility simulations of individual galaxies with flux density S ≥ 10 μJy at the phase centre of the proposed SKA1-MID baseline configuration, adopting 12 frequency channels in the band 950-1190 MHz. Weak lensing shear measurements from a population of galaxies with realistic flux and scalelength distributions are obtained after natural gridding of the raw visibilities. Shear measurements are expected to be affected by `noise bias': we estimate the bias in the method as a function of signal-to-noise ratio (SNR). We obtain additive and multiplicative bias values that are comparable to SKA1 requirements for SNR > 18 and SNR > 30, respectively. The multiplicative bias for SNR >10 is comparable to that found in ground-based optical surveys such as CFHTLenS, and we anticipate that similar shear measurement calibration strategies to those used for optical surveys may be used to good effect in the analysis of SKA radio interferometer data.

  6. Explosive Strength of the Knee Extensors: The Influence of Criterion Trial Detection Methodology on Measurement Reproducibility.

    PubMed

    Dirnberger, Johannes; Wiesinger, Hans-Peter; Wiemer, Nicolas; Kösters, Alexander; Müller, Erich

    2016-04-01

    The present study was conducted to assess test-retest reproducibility of explosive strength measurements during single-joint isometric knee extension using the IsoMed 2000 dynamometer. Thirty-one physically active male subjects (mean age: 23.7 years) were measured on two occasions separated by 48-72 h. The intraclass correlation coefficient (ICC 2,1) and the coefficient of variation (CV) were calculated for (i) maximum torque (MVC), (ii) the peak rate of torque development (RTDpeak), as well as for (iii) the average rate of torque development (RTD) and the impulse taken at several predefined time intervals (0-30 to 0-300 ms). Explosive strength variables were derived in two conceptually different versions: on the one hand from the MVC trial (version I), on the other hand from the trial showing the RTDpeak (version II). High ICC values (0.80-0.99) and acceptable CV values (1.9-8.7%) were found for MVC as well as for the RTD and the impulse taken at time intervals of ≥100 ms, regardless of whether version I or II was used. In contrast, measurements of the RTDpeak as well as the RTD and the impulse taken during the very early contraction phase (i.e. RTD/impulse0-30ms and RTD/impulse0-50ms) showed clearly weaker reproducibility (ICC: 0.53-0.84; CV: 7.3-16.4%) and gave rise to considerable doubts as to their clinical usefulness, especially when derived using version I. However, if there is a need to measure explosive strength for earlier time intervals in practice, it is recommended, in view of the stronger reproducibility results, to concentrate on measures derived from version II, which is based on the RTDpeak trial.
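
    The variables named above (MVC, RTDpeak, windowed average RTD and impulse) can all be computed from a torque-time series, as in this sketch. The synthetic exponential contraction is illustrative; this is a generic implementation, not the IsoMed 2000 software.

```python
import numpy as np

def explosive_strength_metrics(torque, fs, intervals=(30, 50, 100, 200, 300)):
    """
    MVC, peak RTD, and windowed average RTD / impulse from a torque-time
    series (N*m, sampled at fs Hz) with t = 0 at contraction onset.
    """
    dt = 1.0 / fs
    out = {"MVC": torque.max(), "RTDpeak": np.max(np.gradient(torque, dt))}
    for ms in intervals:
        n = int(round(ms / 1000.0 * fs))
        out[f"RTD0-{ms}ms"] = (torque[n] - torque[0]) / (n * dt)
        # trapezoid-rule impulse over [0, ms]
        out[f"impulse0-{ms}ms"] = dt * (torque[: n + 1].sum()
                                        - 0.5 * (torque[0] + torque[n]))
    return out

# Synthetic contraction: torque rising exponentially toward 200 N*m.
fs = 1000
t = np.arange(0.0, 0.5, 1.0 / fs)
torque = 200.0 * (1.0 - np.exp(-t / 0.08))
metrics = explosive_strength_metrics(torque, fs)
```

    For such a smooth rise, RTDpeak occurs at onset and exceeds every windowed average RTD, which illustrates why the early-phase measures are the noise-sensitive ones in real recordings.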

  7. Development and characterization of an annular denuder methodology for the measurement of divalent inorganic reactive gaseous mercury in ambient air.

    PubMed

    Landis, Matthew S; Stevens, Robert K; Schaedlich, Frank; Prestbo, Eric M

    2002-07-01

    Atmospheric mercury is predominantly present in the gaseous elemental form (Hg0). However, anthropogenic emissions (e.g., incineration, fossil fuel combustion) emit and natural processes create particulate-phase mercury (Hg(p)) and divalent reactive gas-phase mercury (RGM). RGM species (e.g., HgCl2, HgBr2) are water-soluble and have much shorter residence times in the atmosphere than Hg0 due to their higher removal rates through wet and dry deposition mechanisms. Manual and automated annular denuder methodologies, to provide high-resolution (1-2 h) ambient RGM measurements, were developed and evaluated. Following collection of RGM onto KCl-coated quartz annular denuders, RGM was thermally decomposed and quantified as Hg0. Laboratory and field evaluations of the denuders found the RGM collection efficiency to be >94% and mean collocated precision to be <15%. Method detection limits for sampling durations ranging from 1 to 12 h were 6.2-0.5 pg m(-3), respectively. As part of this research, the authors observed that methods to measure Hg(p) had a significant positive artifact when RGM coexists with Hg(p). This artifact was eliminated if a KCl-coated annular denuder preceded the filter. This new atmospheric mercury speciation methodology has dramatically enhanced our ability to investigate the mechanisms of transformation and deposition of mercury in the atmosphere.

  8. Methodology developed for the simultaneous measurement of bone formation and bone resorption in rats based on the pharmacokinetics of fluoride.

    PubMed

    Lupo, Maela; Brance, Maria Lorena; Fina, Brenda Lorena; Brun, Lucas Ricardo; Rigalli, Alfredo

    2015-01-01

    This paper describes a novel methodology for the simultaneous estimation of bone formation (BF) and resorption (BR) in rats using fluoride as a nonradioactive bone-seeking ion. The pharmacokinetics of fluoride have been extensively studied in rats; its constants have all been characterized. This knowledge was the cornerstone for the underlying mathematical model that we used to measure bone fluoride uptake and elimination rate after a dose of fluoride. Bone resorption and formation were estimated by bone fluoride uptake and elimination rate, respectively. ROC analysis showed that sensitivity, specificity and area under the ROC curve were not different from deoxypyridinoline and bone alkaline phosphatase, well-known bone markers. Sprague-Dawley rats with modified bone remodelling (ovariectomy, hyper- and hypocalcic diets, antiresorptive treatment) were used to validate the values obtained with this methodology. The results of BF and BR obtained with this technique were as expected for each biological model. Although the method should be performed under general anesthesia, it has several advantages: simultaneous measurement of BR and BF, low cost, and the use of compounds with no expiration date.

  9. Improving the reliability of closed chamber methodologies for methane emissions measurement in treatment wetlands.

    PubMed

    Corbella, Clara; Puigagut, Jaume

    2013-01-01

    Non-homogeneous mixing of methane (NHM) within closed chambers was studied under laboratory conditions. The experimental set-up consisted of a PVC vented chamber of 5.3 litres of effective volume fitted with a power-adjustable 12 V fan. NHM within the chamber was studied according to fan position (top vs lateral), fan airflow strength (23 vs 80 cubic feet per minute) and the mixing time before sample withdrawal (5, 10, 15 and 20 minutes). The potential bias of methane flux densities caused by NHM was addressed by monitoring the difference between linearly expected and estimated flux densities of ca. 400, ca. 800 and ca. 1,600 mg CH(4).m(-2) d(-1). Methane within the chamber was found to be non-homogeneously mixed. Accordingly, methane concentrations at the bottom of the chamber were 20 to 70% higher than those recorded at the middle or top sections of the chamber, regardless of fan position, fan airflow strength or time before sample withdrawal. NHM led to notable biases in flux density estimation. Accordingly, flux densities estimated from the top and middle sampling sections were systematically lower (ca. 50%) than expected. Flux densities estimated from bottom samples were between 10% higher and 25% lower than expected, regardless of the flux density considered.
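
    Closed-chamber flux densities of the kind discussed here are typically estimated from the slope of the concentration rise, scaled by the chamber volume-to-area ratio, which is exactly where a sampling-height bias in concentration propagates into the flux. A minimal sketch with invented numbers (the 5.3 L volume matches the chamber above; the footprint area and concentrations are assumptions):

```python
import numpy as np

def chamber_flux(times_min, conc_mg_m3, volume_L, area_m2):
    """
    Flux density (mg CH4 m^-2 d^-1) from a closed-chamber time series:
    linear fit of concentration vs time, scaled by volume/area.
    """
    slope_per_min = np.polyfit(times_min, conc_mg_m3, 1)[0]  # mg m^-3 min^-1
    return slope_per_min * 60.0 * 24.0 * (volume_L / 1000.0) / area_m2

# Example: 5.3 L chamber over an assumed 0.02 m^2 footprint, sampled every
# 5 minutes; the concentration readings are invented.
times = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
conc = np.array([1.9, 2.4, 2.9, 3.4, 3.9])  # mg CH4 m^-3
flux = chamber_flux(times, conc, 5.3, 0.02)
```

    A systematic 50% under-read of the concentrations, as reported for top and middle sampling, halves the fitted slope and therefore halves the flux estimate.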

  10. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    NASA Astrophysics Data System (ADS)

    Renbaum-Wolff, L.; Grayson, J. W.; Bertram, A. K.

    2012-10-01

    Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties and with real-time relative humidity and temperature control; hence, the technique should be well-suited for determining the viscosities, under atmospherically relevant conditions, of particles collected from environmental chambers. In this technique, supermicron particles are first deposited on an inert hydrophobic substrate. Then, insoluble beads (~1 μm in diameter) are embedded in the particles. Next, a flow of gas is introduced over the particles, which generates a shear stress on the particle surfaces. The sample responds to this shear stress by generating internal circulations, which are quantified with an optical microscope by monitoring the movement of the beads. The rate of internal circulation is shown to be a function of particle viscosity but independent of the particle material for a wide range of organic and organic-water samples. A calibration curve is constructed from the experimental data that relates the rate of internal circulation to particle viscosity, and this calibration curve is successfully used to predict viscosities in multicomponent organic mixtures.
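
    Because the circulation rate decreases monotonically with viscosity, the calibration curve can be inverted by interpolation. The sketch below assumes a hypothetical power-law calibration table spanning the reported range; the real curve must of course be measured on viscosity standards.

```python
import numpy as np

# Hypothetical calibration table: bead circulation rate (um/s) measured on
# standards of known viscosity (Pa s), spanning the reported 1e-3..1e3 range.
rate_um_s = np.array([3000.0, 300.0, 30.0, 3.0, 0.3, 0.03, 0.003])
eta_Pa_s = np.array([1e-3, 1e-2, 1e-1, 1.0, 1e1, 1e2, 1e3])

def viscosity_from_rate(r):
    """Invert the calibration by interpolating in log-log space
    (np.interp needs ascending x, hence the [::-1] reversals)."""
    return 10 ** np.interp(np.log10(r),
                           np.log10(rate_um_s[::-1]),
                           np.log10(eta_Pa_s[::-1]))
```

    Log-log interpolation is the natural choice when the measured quantity spans six orders of magnitude, as here.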

  11. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    NASA Astrophysics Data System (ADS)

    Renbaum-Wolff, L.; Grayson, J. W.; Bertram, A. K.

    2013-01-01

    Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties and with real-time relative humidity and temperature control; hence, the technique should be well-suited for determining the viscosities, under atmospherically relevant conditions, of particles collected from environmental chambers. In this technique, supermicron particles are first deposited on an inert hydrophobic substrate. Then, insoluble beads (~1 μm in diameter) are embedded in the particles. Next, a flow of gas is introduced over the particles, which generates a shear stress on the particle surfaces. The sample responds to this shear stress by generating internal circulations, which are quantified with an optical microscope by monitoring the movement of the beads. The rate of internal circulation is shown to be a function of particle viscosity but independent of the particle material for a wide range of organic and organic-water samples. A calibration curve is constructed from the experimental data that relates the rate of internal circulation to particle viscosity, and this calibration curve is successfully used to predict viscosities in multicomponent organic mixtures.

  12. High frequency measurement of P- and S-wave velocities on crystalline rock massif surface - methodology of measurement

    NASA Astrophysics Data System (ADS)

    Vilhelm, Jan; Slavík, Lubomír

    2014-05-01

    For the purpose of non-destructive monitoring of rock properties in an underground excavation it is possible to perform repeated high-accuracy P- and S-wave velocity measurements. This contribution deals with preliminary results gained during the preparation of a micro-seismic long-term monitoring system. The field velocity measurements were made by the pulse-transmission technique directly on the rock outcrop (granite) in the Bedrichov gallery (northern Bohemia). The gallery at the experimental site was excavated using a TBM (Tunnel Boring Machine) and is used for drinking water supply, which is conveyed in a pipe. The requirement for a stable, automatically operated measuring system led to the use of piezoceramic transducers both as seismic sources and as receivers. The length of the measuring base at the gallery wall was 0.5 to 3 meters. Different transducer coupling possibilities were tested, particularly with regard to the repeatability of velocity determination. The arrangement of the measuring system on the surface of the rock massif gives the S-transducers better sensitivity for P-wave measurement than the P-transducers; similarly, P-transducers were found more suitable for S-wave velocity determination than S-transducers. The frequency-dependent attenuation of the fresh rock massif limits the frequency content of the registered seismic signals: it was found that at source-receiver distances of 0.5 m and more, frequency components above 40 kHz are significantly attenuated, so 100 kHz transducers are most suitable for exciting the seismic wave. The limited frequency range should also be taken into account when choosing the shape of the electric impulse used to excite the piezoceramic transducer. A spike pulse generates a broad-band seismic signal that is short in the time domain; however, after low-pass filtering in the rock its energy is significantly lower than that of the seismic signal generated by a square-wave pulse. Acknowledgments: This work was partially
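
    At its core, the pulse-transmission measurement reduces each velocity to base length over picked first-arrival time, after subtracting the system delay of the transducers and electronics. The picks and delay below are invented for illustration.

```python
def wave_velocity(distance_m, travel_time_us, system_delay_us=0.0):
    """Pulse-transmission velocity (m/s): base length divided by the picked
    first-arrival time, corrected for the calibrated system delay."""
    return distance_m / ((travel_time_us - system_delay_us) * 1e-6)

# Illustrative picks on a 1.5 m base at a gallery wall (invented values).
vp = wave_velocity(1.5, 268.0, 8.0)  # P-wave first arrival
vs = wave_velocity(1.5, 452.0, 8.0)  # S-wave first arrival
vp_vs_ratio = vp / vs
```

    Repeatability of the monitoring then rests on the stability of the picks and of the system delay, which is why transducer coupling received so much attention above.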

  13. Thermal conductivity measurements of particulate materials under Martian conditions

    NASA Technical Reports Server (NTRS)

    Presley, M. A.; Christensen, P. R.

    1993-01-01

    The mean particle diameter of surficial units on Mars has been approximated by applying thermal inertia determinations from the Mariner 9 Infrared Radiometer and the Viking Infrared Thermal Mapper data together with thermal conductivity measurement. Several studies have used this approximation to characterize surficial units and infer their nature and possible origin. Such interpretations are possible because previous measurements of the thermal conductivity of particulate materials have shown that particle size significantly affects thermal conductivity under martian atmospheric pressures. The transfer of thermal energy due to collisions of gas molecules is the predominant mechanism of thermal conductivity in porous systems for gas pressures above about 0.01 torr. At martian atmospheric pressures the mean free path of the gas molecules becomes greater than the effective distance over which conduction takes place between the particles. Gas particles are then more likely to collide with the solid particles than they are with each other. The average heat transfer distance between particles, which is related to particle size, shape and packing, thus determines how fast heat will flow through a particulate material. The derived one-to-one correspondence of thermal inertia to mean particle diameter implies a certain homogeneity in the materials analyzed. Yet the samples used were often characterized by fairly wide ranges of particle sizes with little information about the possible distribution of sizes within those ranges. Interpretation of thermal inertia data is further limited by the lack of data on other effects on the interparticle spacing relative to particle size, such as particle shape, bimodal or polymodal mixtures of grain sizes and formation of salt cements between grains. To address these limitations and to provide a more comprehensive set of thermal conductivities vs. particle size, a linear heat source apparatus, similar to that of Cremers, was assembled to

  14. Thermal conductivity measurements of particulate materials under Martian conditions

    NASA Technical Reports Server (NTRS)

    Presley, M. A.; Christensen, P. R.

    1993-01-01

    The mean particle diameter of surficial units on Mars has been approximated by applying thermal inertia determinations from the Mariner 9 Infrared Radiometer and the Viking Infrared Thermal Mapper data together with thermal conductivity measurements. Several studies have used this approximation to characterize surficial units and infer their nature and possible origin. Such interpretations are possible because previous measurements of the thermal conductivity of particulate materials have shown that particle size significantly affects thermal conductivity under martian atmospheric pressures. The transfer of thermal energy due to collisions of gas molecules is the predominant mechanism of thermal conductivity in porous systems for gas pressures above about 0.01 torr. At martian atmospheric pressures the mean free path of the gas molecules becomes greater than the effective distance over which conduction takes place between the particles. Gas molecules are then more likely to collide with the solid particles than with each other. The average heat transfer distance between particles, which is related to particle size, shape and packing, thus determines how fast heat will flow through a particulate material. The derived one-to-one correspondence of thermal inertia to mean particle diameter implies a certain homogeneity in the materials analyzed. Yet the samples used were often characterized by fairly wide ranges of particle sizes with little information about the possible distribution of sizes within those ranges. Interpretation of thermal inertia data is further limited by the lack of data on other effects on the interparticle spacing relative to particle size, such as particle shape, bimodal or polymodal mixtures of grain sizes, and formation of salt cements between grains. To address these limitations and to provide a more comprehensive set of thermal conductivities vs. particle size, a linear heat source apparatus, similar to that of Cremers, was assembled to
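
    The pressure regime described above can be illustrated with a kinetic-theory estimate: when the gas mean free path exceeds the effective conduction distance between grains, gas molecules collide with grains more often than with each other. The sketch below uses assumed values (a CO2 molecular diameter of ~4.6 Å, a 210 K temperature, and a ~1 µm effective interparticle gap); these numbers are illustrative, not those of the cited apparatus.

    ```python
    # Sketch: gas mean free path at martian surface pressure vs. an assumed
    # interparticle conduction distance. All parameter values are illustrative.
    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def mean_free_path(pressure_pa, temp_k=210.0, molecule_diam_m=4.6e-10):
        """Kinetic theory: lambda = k*T / (sqrt(2) * pi * d^2 * P)."""
        return K_B * temp_k / (math.sqrt(2) * math.pi * molecule_diam_m**2 * pressure_pa)

    lam = mean_free_path(600.0)   # ~600 Pa, typical martian surface pressure
    particle_gap = 1e-6           # assumed ~1 micron effective gap between grains

    print(f"mean free path ~ {lam * 1e6:.1f} um")
    print("gas-grain collisions dominate:", lam > particle_gap)
    ```

    With these assumptions the mean free path comes out at a few microns, larger than the assumed gap, which is the regime in which interparticle spacing (and hence particle size) controls the effective conductivity.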

  15. Active microwave measurements of Arctic sea ice under summer conditions

    NASA Technical Reports Server (NTRS)

    Onstott, R. G.; Gogineni, S. P.

    1985-01-01

    Radar provides a valuable tool in the study of sea-ice conditions and the solution of sea-ice operational problems. For this reason, the U.S. and Canada have conducted studies to define a bilateral synthetic aperture radar (SAR) satellite program. The present paper concerns work performed to explore the measurement needs associated with the study of sea-ice-covered waters. The design of a suitable research or operational spaceborne SAR or real-aperture radar must be based on an adequate knowledge of the backscatter coefficients of the ice features of interest. To obtain the needed information, helicopter-borne studies were conducted in which calibrated L-, C-, X-, and Ku-band radar data were acquired over areas of Arctic first-year and multiyear ice during the first half of the summer of 1982. The results show that the microwave response of sea ice is greatly influenced by summer melt, which produces significant changes in the properties of the snowpack and ice sheet.

  16. [New methodology for heavy metals measurement in water samples by PGNAA-XRF].

    PubMed

    Jia, Wen-Bao; Zhang, Yan; Hei, Da-Qian; Ling, Yong-Sheng; Shan, Qing; Cheng, Can

    2014-11-01

    In the present paper, a new combined detection method is proposed that uses prompt gamma neutron activation analysis (PGNAA) together with characteristic X-ray fluorescence (XRF) to improve the accuracy of heavy-metal measurements in in-situ environmental water analysis by PGNAA. In particular, the characteristic X-ray fluorescence of the heavy metals is induced directly by the prompt gamma rays rather than by a traditional excitation source. A combined measurement facility with a 241Am-Be neutron source, a BGO detector, and a NaI-Be detector was developed to analyze pollutants in water, with the two detectors recording the prompt gamma rays and the characteristic X-ray fluorescence of the heavy metals, respectively. The prompt gamma-ray intensity I(γ) and the characteristic X-ray fluorescence intensity I(x) were determined by MCNP calculations for different concentrations c(i) of chromium (Cr), cadmium (Cd), mercury (Hg), and lead (Pb). The simulation results showed a good linear relationship between each intensity and c(i). An empirical formula for the combined detection method was derived from these calculations. Comparison of I(γ) and I(x) showed that the combined method is more sensitive for measuring high-atomic-number heavy metals such as Hg and Pb than lower-atomic-number metals such as Cr and Cd. The limits of detection for Hg and Pb with the combined measurement instrument were 17.4 and 24.2 mg x kg(-1), respectively.
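
    The calibration idea in this abstract, a linear intensity-vs-concentration relationship from which a detection limit follows, can be sketched as below. The concentrations, counts, and blank noise are fabricated placeholders, not the MCNP results from the paper; the detection limit is estimated with the common 3-sigma-over-slope convention.

    ```python
    # Sketch: fit I = a*c + b to intensity-vs-concentration pairs, then
    # estimate a limit of detection as 3 * sigma_blank / slope.
    def linear_fit(c, i):
        """Ordinary least-squares slope and intercept for I = a*c + b."""
        n = len(c)
        mc, mi = sum(c) / n, sum(i) / n
        slope = (sum((x - mc) * (y - mi) for x, y in zip(c, i))
                 / sum((x - mc) ** 2 for x in c))
        return slope, mi - slope * mc

    # Hypothetical concentrations (mg/kg) and fluorescence counts.
    conc = [0.0, 50.0, 100.0, 200.0, 400.0]
    counts = [12.0, 61.0, 113.0, 208.0, 411.0]

    a, b = linear_fit(conc, counts)
    sigma_blank = 2.0  # assumed std. dev. of repeated blank measurements
    lod = 3.0 * sigma_blank / a
    print(f"slope={a:.3f} counts per mg/kg, intercept={b:.1f}, LOD ~ {lod:.1f} mg/kg")
    ```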

  17. Vertical profiles of aerosol volume from high-spectral-resolution infrared transmission measurements. I. Methodology.

    PubMed

    Eldering, A; Irion, F W; Chang, A Y; Gunson, M R; Mills, F P; Steele, H M

    2001-06-20

    The wavelength-dependent aerosol extinction in the 800-1250-cm(-1) region has been derived from ATMOS (atmospheric trace molecule spectroscopy) high-spectral-resolution IR transmission measurements. Using models of aerosol and cloud extinction, we have performed weighted nonlinear least-squares fitting to determine the aerosol-volume columns and vertical profiles of stratospheric sulfate aerosol and cirrus cloud volume. Modeled extinction using cold-temperature aerosol optical constants for a 70-80% sulfuric-acid-water solution shows good agreement with the measurements, and the derived aerosol volumes for a 1992 occultation are consistent with data from other experiments after the eruption of Mt. Pinatubo. The retrieved sulfuric acid aerosol-volume profiles are insensitive to the aerosol-size distribution and somewhat sensitive to the set of optical constants used. The nonspherical cirrus extinction model agrees well with a 1994 mid-latitude measurement, indicating the presence of cirrus clouds at the tropopause.
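
    The core of such a retrieval, expressing the measured extinction spectrum as volumes times per-unit-volume extinction basis spectra and solving by weighted least squares, can be sketched as follows. The basis spectra, noise level, and weights here are fabricated placeholders, not ATMOS data, and the model is linearized for brevity (the paper uses a weighted nonlinear fit).

    ```python
    # Sketch: retrieve sulfate and cirrus volumes from a synthetic extinction
    # spectrum via weighted least squares (normal equations A^T W A v = A^T W y).
    import numpy as np

    wavenumbers = np.linspace(800.0, 1250.0, 10)       # cm^-1 grid
    k_sulfate = 1.0 + 0.002 * (wavenumbers - 800.0)    # assumed basis spectrum
    k_cirrus = 2.0 - 0.001 * (wavenumbers - 800.0)     # assumed basis spectrum

    true_v = np.array([0.8, 0.3])                      # "true" volumes for the demo
    A = np.column_stack([k_sulfate, k_cirrus])
    noise = 0.01 * np.random.default_rng(0).standard_normal(10)
    measured = A @ true_v + noise

    W = np.diag(np.full(10, 100.0))                    # weights = 1/sigma^2, assumed
    v_hat = np.linalg.solve(A.T @ W @ A, A.T @ W @ measured)
    print("retrieved volumes:", v_hat)
    ```

    With uniform weights this reduces to ordinary least squares; non-uniform weights let noisier spectral channels contribute less, which is the role the weighting plays in the fit described above.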

  18. Manipulating measurement scales in medical statistical analysis and data mining: A review of methodologies

    PubMed Central

    Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario

    2014-01-01

    Background: Selecting the correct statistical test and data mining method depends strongly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are examined using several medical examples. We present two ordinal-variable clustering examples, ordinal variables being more challenging to analyze, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed by two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold-standard groups of malignant and benign cases that had been identified by clinical tests. Results: The sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively; their specificities were comparable. Conclusion: Using a clustering algorithm appropriate to the measurement scale of the variables in the study yields high performance. Moreover, descriptive and inferential statistics, as well as the modeling approach, must be selected based on the scale of the variables. PMID:24672565
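
    The evaluation described in this abstract, comparing cluster assignments against gold-standard benign/malignant labels to obtain sensitivity, specificity, and accuracy, can be sketched as below. The labels are toy values, not the WBCD results.

    ```python
    # Sketch: score predicted cluster labels against a gold standard.
    def confusion_stats(gold, predicted, positive=1):
        """Return (sensitivity, specificity, accuracy) for binary labels."""
        tp = sum(g == positive and p == positive for g, p in zip(gold, predicted))
        tn = sum(g != positive and p != positive for g, p in zip(gold, predicted))
        fp = sum(g != positive and p == positive for g, p in zip(gold, predicted))
        fn = sum(g == positive and p != positive for g, p in zip(gold, predicted))
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        accuracy = (tp + tn) / len(gold)
        return sensitivity, specificity, accuracy

    gold      = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]  # 1 = malignant (toy labels)
    clustered = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]  # hypothetical clustering output

    sens, spec, acc = confusion_stats(gold, clustered)
    print(f"sensitivity={sens:.2f} specificity={spec:.2f} accuracy={acc:.2f}")
    ```

    One practical point this makes concrete: cluster labels are arbitrary, so before scoring, each cluster must be mapped to the gold-standard class it best matches.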