Science.gov

Sample records for conditions measurement methodology

  1. Methodological Quality of National Guidelines for Pediatric Inpatient Conditions

    PubMed Central

    Hester, Gabrielle; Nelson, Katherine; Mahant, Sanjay; Eresuma, Emily; Keren, Ron; Srivastava, Rajendu

    2014-01-01

Background Guidelines help inform standardization of care for quality improvement (QI). The Pediatric Research in Inpatient Settings (PRIS) network published a prioritization list of inpatient conditions with high prevalence, cost, and variation in resource utilization across children's hospitals. The methodological quality of guidelines for priority conditions is unknown. Objective To rate the methodological quality of national guidelines for 20 priority pediatric inpatient conditions. Design We searched sources including PubMed for national guidelines published 2002–2012. Guidelines specific to one organism, test or treatment, or institution were excluded. Guidelines were rated by two raters using a validated tool (AGREE II) with an overall rating on a 7-point scale (7 = highest). Inter-rater reliability was measured with a weighted kappa coefficient. Results 17 guidelines met inclusion criteria for 13 conditions; 7 conditions yielded no relevant national guidelines. The highest methodological quality guidelines were for asthma, tonsillectomy, and bronchiolitis (mean overall ratings 7, 6.5, and 6.5, respectively); the lowest were for sickle cell disease (2 guidelines) and dental caries (mean overall ratings 4, 3.5, and 3, respectively). The overall weighted kappa was 0.83 (95% confidence interval 0.78–0.87). Conclusions We identified a group of moderate to high methodological quality national guidelines for priority pediatric inpatient conditions. Hospitals should consider these guidelines to inform QI initiatives. PMID:24677729
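The inter-rater agreement statistic reported above (a weighted kappa) can be sketched in pure Python; the ratings below are hypothetical illustrations on a 7-point scale, not the study's data:

```python
def weighted_kappa(r1, r2, k):
    """Linear-weighted Cohen's kappa for two raters on a k-point scale (1..k)."""
    n = len(r1)
    # observed agreement matrix as proportions
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[a - 1][b - 1] += 1.0 / n
    p1 = [sum(row) for row in obs]                                  # rater-1 marginals
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]       # rater-2 marginals
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = abs(i - j) / (k - 1)        # linear disagreement weight
            num += w * obs[i][j]            # observed weighted disagreement
            den += w * p1[i] * p2[j]        # disagreement expected by chance
    return 1.0 - num / den

# hypothetical overall ratings from two raters (1..7 scale)
r1 = [7, 6, 7, 5, 4, 3, 6, 2, 5, 7]
r2 = [7, 7, 6, 5, 4, 4, 6, 3, 5, 6]
print(round(weighted_kappa(r1, r2, 7), 2))  # -> 0.7
```

Perfect agreement yields kappa = 1; the weighting penalizes large rating gaps more than near misses.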

  2. DEVELOPMENT OF MEASUREMENT METHODOLOGY FOR EVALUATING FUGITIVE PARTICULATE EMISSIONS

    EPA Science Inventory

    A measurement methodology to evaluate fugitive particulate emissions was developed and demonstrated. The project focused on the application of the lidar (laser radar) technique under field conditions, but in circumstances that simplified and controlled the variables of the genera...

  3. Structural health monitoring methodology for aircraft condition-based maintenance

    NASA Astrophysics Data System (ADS)

    Saniger, Jordi; Reithler, Livier; Guedra-Degeorges, Didier; Takeda, Nobuo; Dupuis, Jean Pierre

    2001-06-01

Reducing maintenance costs while keeping a constant level of safety is a major issue for air forces and airlines. The long-term perspective is to implement condition-based maintenance to guarantee a constant safety level while decreasing maintenance costs. For this purpose, the development of a generalized Structural Health Monitoring System (SHMS) is needed. The objective of such a system is to localize damage and assess its severity, with enough accuracy to allow low-cost corrective actions. The present paper describes an SHMS based on acoustic emission technology. This choice was driven by its reliability and wide use in the aerospace industry. The described SHMS uses a new learning methodology which relies on the generation of artificial acoustic emission events on the structure and an acoustic emission sensor network. The calibrated acoustic emission events picked up by the sensors constitute the knowledge set that the system relies on. With this methodology, the anisotropy of composite structures is taken into account, thus avoiding the major cause of error in classical localization methods. Moreover, the method adapts to different structures, as it does not rely on any particular model but on measured data. The acquired data are processed to compute each event's location and corrected amplitude. The methodology has been demonstrated, and experimental tests on elementary samples showed an accuracy on the order of 1 cm.
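The data-driven localization idea can be sketched as nearest-neighbour matching against a knowledge set of calibrated events. Everything below (sensor layout, grid, and the uniform-wave-speed arrival-time proxy) is an illustrative assumption, not the paper's implementation, which learns the anisotropic propagation directly from the calibrated events:

```python
import math

SENSORS = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]  # assumed sensor positions

def features(point):
    """Arrival-time proxy for an event at `point`: distance to each sensor
    (uniform wave speed assumed for this sketch)."""
    return [math.dist(point, s) for s in SENSORS]

def localize(event_features, knowledge_set):
    """Assign the measured event the position of the calibrated artificial
    event whose features are closest (nearest-neighbour matching)."""
    best = min(knowledge_set, key=lambda e: math.dist(e["features"], event_features))
    return best["position"]

# knowledge set: artificial AE events generated at known grid positions
knowledge = [{"position": (x / 10, y / 10), "features": features((x / 10, y / 10))}
             for x in range(11) for y in range(11)]

print(localize(features((0.52, 0.31)), knowledge))  # -> (0.5, 0.3)
```

Because the knowledge set is measured rather than modeled, any material anisotropy is baked into the stored features, which is the point of the learning approach described above.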

  4. A Wavelet-Based Methodology for Grinding Wheel Condition Monitoring

    SciTech Connect

    Liao, T. W.; Ting, C.F.; Qu, Jun; Blau, Peter Julian

    2007-01-01

    Grinding wheel surface condition changes as more material is removed. This paper presents a wavelet-based methodology for grinding wheel condition monitoring based on acoustic emission (AE) signals. Grinding experiments in creep feed mode were conducted to grind alumina specimens with a resinoid-bonded diamond wheel using two different conditions. During the experiments, AE signals were collected when the wheel was 'sharp' and when the wheel was 'dull'. Discriminant features were then extracted from each raw AE signal segment using the discrete wavelet decomposition procedure. An adaptive genetic clustering algorithm was finally applied to the extracted features in order to distinguish different states of grinding wheel condition. The test results indicate that the proposed methodology can achieve 97% clustering accuracy for the high material removal rate condition, 86.7% for the low material removal rate condition, and 76.7% for the combined grinding conditions if the base wavelet, the decomposition level, and the GA parameters are properly selected.
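The feature-extraction step can be sketched with a recursively applied one-level Haar transform; the synthetic signals below stand in for AE segments, and the paper's actual base wavelet, decomposition level, and clustering differ:

```python
import math
import random

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2) for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2) for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def band_energies(signal, levels=3):
    """Relative energy in each detail band: simple discriminant features."""
    feats, a = [], list(signal)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append(sum(x * x for x in d))
    total = sum(feats) or 1.0
    return [e / total for e in feats]

random.seed(0)
# synthetic AE segments: a 'sharp' wheel gives broadband (noisy) emission,
# a 'dull' wheel a smoother, low-frequency signal -- an assumption for illustration
sharp = [random.gauss(0, 1) for _ in range(256)]
dull = [math.sin(2 * math.pi * i / 64) + random.gauss(0, 0.1) for i in range(256)]
print(band_energies(sharp), band_energies(dull))
```

The highest-frequency band carries a much larger energy fraction for the broadband "sharp" segment, which is the kind of separation a clustering algorithm can then exploit.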

  5. PIMM: A Performance Improvement Measurement Methodology

    SciTech Connect

    Not Available

    1994-05-15

This report presents a Performance Improvement Measurement Methodology (PIMM) for measuring and reporting the mission performance for organizational elements of the U.S. Department of Energy to comply with the Chief Financial Officer's Act (CFOA) of 1990 and the Government Performance and Results Act (GPRA) of 1993. The PIMM is illustrated by application to the Morgantown Energy Technology Center (METC), a Research, Development and Demonstration (RD&D) field center of the Office of Fossil Energy, along with limited applications to the Strategic Petroleum Reserve Office and the Office of Fossil Energy. METC is now implementing the first year of a pilot project under GPRA using the PIMM. The PIMM process is applicable to all elements of the Department; organizations may customize measurements to their specific missions. The PIMM has four aspects: (1) an achievement measurement that applies to any organizational element, (2) key indicators that apply to institutional elements, (3) a risk reduction measurement that applies to all RD&D elements and to elements with long-term activities leading to risk-associated outcomes, and (4) a cost performance evaluation. Key indicators show how close the institution is to attaining long-range goals. Risk reduction analysis is especially relevant to RD&D. Product risk is defined as the chance that the product of new technology will not meet the requirements of the customer. RD&D is conducted to reduce technology risks to acceptable levels. The PIMM provides a profile to track risk reduction as RD&D proceeds. Cost performance evaluations provide a measurement of the expected costs of outcomes relative to their actual costs.

  6. Measuring Program Outcomes: Using Retrospective Pretest Methodology.

    ERIC Educational Resources Information Center

    Pratt, Clara C.; McGuigan, William M.; Katsev, Aphra R.

    2000-01-01

Used longitudinal data from 307 mothers of firstborn infants participating in a home-visitation, child abuse prevention program in a retrospective pretest methodology. Results showed that when response shift bias was present, the retrospective pretest methodology produced a more legitimate assessment of program outcomes than did the traditional…

  7. Relative Hazard and Risk Measure Calculation Methodology

    SciTech Connect

    Stenner, Robert D.; Strenge, Dennis L.; Elder, Matthew S.

    2004-03-20

The relative hazard (RH) and risk measure (RM) methodology and computer code is a health risk-based tool designed to allow managers and environmental decision makers to readily consider human health risks (i.e., public and worker risks) in their screening-level analysis of alternative cleanup strategies. Environmental management decisions involve consideration of costs, schedules, regulatory requirements, health hazards, and risks. The RH-RM tool is a risk-based environmental management decision tool that allows managers to predict and track health hazards and risks over time as they change in relation to mitigation and cleanup actions. Analysis of the hazards and risks associated with planned mitigation and cleanup actions provides a baseline against which alternative strategies can be compared. This new tool allows managers to explore “what if” scenarios, to better understand the impact of alternative mitigation and cleanup actions (i.e., alternatives to the planned actions) on health hazards and risks. It also allows managers to screen alternatives on the basis of human health risk and compare the results with cost and other factors pertinent to the decision. Once an alternative or a narrow set of alternatives is selected, it will then be more cost-effective to perform the detailed risk analysis necessary for programmatic and regulatory acceptance of the selected alternative. The RH-RM code has been integrated into the PNNL-developed Framework for Risk Analysis In Multimedia Environmental Systems (FRAMES) to allow the input and output data of the RH-RM code to be readily shared with more comprehensive risk analysis models, such as the PNNL-developed Multimedia Environmental Pollutant Assessment System (MEPAS) model.

  8. Methodology and Process for Condition Assessment at Existing Hydropower Plants

    SciTech Connect

    Zhang, Qin Fen; Smith, Brennan T; Cones, Marvin; March, Patrick; Dham, Rajesh; Spray, Michael

    2012-01-01

The Hydropower Advancement Project (HAP) was initiated by the U.S. Department of Energy Office of Energy Efficiency and Renewable Energy to develop and implement a systematic process with a standard methodology to identify opportunities for performance improvement at existing hydropower facilities and to predict and trend the overall condition and improvement opportunity within the U.S. hydropower fleet. The concept of performance for the HAP focuses on water use efficiency: how well a plant or individual unit converts potential energy to electrical energy over a long-term averaging period of a year or more. Performance improvement involves not only optimization of plant dispatch and scheduling but also enhancement of efficiency and availability through advanced technology and asset upgrades, and thus requires inspection and condition assessment of equipment, control systems, and other generating assets. This paper discusses the standard methodology and process for condition assessment of approximately 50 nationwide facilities, including sampling techniques to ensure valid expansion of the 50 assessment results to the entire hydropower fleet. The application and refinement process and the results from three demonstration assessments are also presented in this paper.

  9. Quantum Measurement and Initial Conditions

    NASA Astrophysics Data System (ADS)

    Stoica, Ovidiu Cristinel

    2016-03-01

Quantum measurement finds the observed system in a collapsed state, rather than in the state predicted by the Schrödinger equation. Yet there is a relatively widespread opinion that the wavefunction collapse can be explained by unitary evolution (for instance in the decoherence approach, if we take into account the environment). In this article a mathematical result is proven which severely restricts the initial conditions for which measurements have definite outcomes, if pure unitary evolution is assumed. This no-go theorem remains true even if we take the environment into account. The result does not forbid a unitary description of the measurement process; it only shows that such a description is possible for very restricted initial conditions. The existence of such restrictions on the initial conditions can be understood from the four-dimensional block universe perspective, as a requirement of global self-consistency of the solutions of the Schrödinger equation.

  10. Wastewater Sampling Methodologies and Flow Measurement Techniques.

    ERIC Educational Resources Information Center

    Harris, Daniel J.; Keffer, William J.

    This document provides a ready source of information about water/wastewater sampling activities using various commercial sampling and flow measurement devices. The report consolidates the findings and summarizes the activities, experiences, sampling methods, and field measurement techniques conducted by the Environmental Protection Agency (EPA),…

  11. Methodological Challenges in Measuring Child Maltreatment

    ERIC Educational Resources Information Center

    Fallon, Barbara; Trocme, Nico; Fluke, John; MacLaurin, Bruce; Tonmyr, Lil; Yuan, Ying-Ying

    2010-01-01

Objective: This article reviewed the different surveillance systems used to monitor the extent of reported child maltreatment in North America. Methods: Key measurement and definitional differences between the surveillance systems are detailed, along with their potential impact on the measured rate of victimization. The infrastructure…

  12. A methodological review of resilience measurement scales

    PubMed Central

    2011-01-01

    Background The evaluation of interventions and policies designed to promote resilience, and research to understand the determinants and associations, require reliable and valid measures to ensure data quality. This paper systematically reviews the psychometric rigour of resilience measurement scales developed for use in general and clinical populations. Methods Eight electronic abstract databases and the internet were searched and reference lists of all identified papers were hand searched. The focus was to identify peer reviewed journal articles where resilience was a key focus and/or is assessed. Two authors independently extracted data and performed a quality assessment of the scale psychometric properties. Results Nineteen resilience measures were reviewed; four of these were refinements of the original measure. All the measures had some missing information regarding the psychometric properties. Overall, the Connor-Davidson Resilience Scale, the Resilience Scale for Adults and the Brief Resilience Scale received the best psychometric ratings. The conceptual and theoretical adequacy of a number of the scales was questionable. Conclusion We found no current 'gold standard' amongst 15 measures of resilience. A number of the scales are in the early stages of development, and all require further validation work. Given increasing interest in resilience from major international funders, key policy makers and practice, researchers are urged to report relevant validation statistics when using the measures. PMID:21294858

  13. Experimental comparison of exchange bias measurement methodologies

    SciTech Connect

    Hovorka, Ondrej; Berger, Andreas; Friedman, Gary

    2007-05-01

    Measurements performed on all-ferromagnetic bilayer systems and supported by model calculation results are used to compare different exchange bias characterization methods. We demonstrate that the accuracy of the conventional two-point technique based on measuring the sum of the coercive fields depends on the symmetry properties of hysteresis loops. On the other hand, the recently proposed center of mass method yields results independent of the hysteresis loop type and coincides with the two-point measurement only if the loops are symmetric. Our experimental and simulation results clearly demonstrate a strong correlation between loop asymmetry and the difference between these methods.
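The two estimators compared in this abstract can be sketched directly. The loop below is a hypothetical symmetric square hysteresis loop shifted by one field unit, so the two methods should coincide, matching the abstract's claim for symmetric loops:

```python
def two_point_bias(hc_down, hc_up):
    """Conventional two-point estimate: midpoint of the two coercive fields."""
    return (hc_down + hc_up) / 2.0

def center_of_mass_bias(h, m):
    """Center-of-mass estimate: average field weighted by |dM| along the loop."""
    num = den = 0.0
    for i in range(1, len(h)):
        w = abs(m[i] - m[i - 1])            # magnetization change on this field step
        num += w * (h[i] + h[i - 1]) / 2.0  # field at which the change occurred
        den += w
    return num / den

# hypothetical square loop shifted by an exchange bias of -1 field unit:
# ascending branch switches at H = 0, descending branch at H = -2
h_up = list(range(-5, 6))
m_up = [-1 if x < 0 else 1 for x in h_up]
h_down = list(range(5, -6, -1))
m_down = [1 if x > -2 else -1 for x in h_down]
h, m = h_up + h_down, m_up + m_down

print(two_point_bias(-2.0, 0.0), center_of_mass_bias(h, m))
```

For an asymmetric loop (e.g., a more gradual reversal on one branch) the center-of-mass value shifts while the two-point value does not, which is the discrepancy the experiments quantify.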

  14. New seeding methodology for gas concentration measurements.

    PubMed

    Chan, Qing N; Medwell, Paul R; Dally, Bassam B; Alwahabi, Zeyad T; Nathan, Graham J

    2012-07-01

This paper presents the first demonstration of the pulsed laser ablation technique to seed a laminar non-reacting gaseous jet at atmospheric pressure. The focused second harmonic from a pulsed Nd:YAG laser is used to ablate a neutral indium rod at atmospheric pressure and temperature. The ablation products generated with the new seeding method are used to seed the jet, as a marker of the scalar field. The neutral indium atoms so generated are found to be stable and survive a convection time of the order of tens of seconds before entering the interrogation region. The measurements of planar laser-induced fluorescence (PLIF) with indium and laser nephelometry measurements with the ablation products are both reported. The resulting average and root mean square (RMS) of the measurements are found to agree reasonably well, although some differences are found. The results show that the pulsed laser ablation method has potential to provide scalar measurement for mixing studies. PMID:22710315

  15. Diffusions conditioned on occupation measures

    NASA Astrophysics Data System (ADS)

    Angeletti, Florian; Touchette, Hugo

    2016-02-01

    A Markov process fluctuating away from its typical behavior can be represented in the long-time limit by another Markov process, called the effective or driven process, having the same stationary states as the original process conditioned on the fluctuation observed. We construct here this driven process for diffusions spending an atypical fraction of their evolution in some region of state space, corresponding mathematically to stochastic differential equations conditioned on occupation measures. As an illustration, we consider the Langevin equation conditioned on staying for a fraction of time in different intervals of the real line, including the positive half-line which leads to a generalization of the Brownian meander problem. Other applications related to quasi-stationary distributions, metastable states, noisy chemical reactions, queues, and random walks are discussed.
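The conditioning observable itself, the fraction of time a diffusion spends in a region, is easy to illustrate numerically. The sketch below simulates an overdamped Langevin equation (an Ornstein-Uhlenbeck process, chosen here for illustration) and records its occupation fraction of the positive half-line; constructing the driven process itself requires the spectral machinery of the paper and is not attempted here:

```python
import math
import random

def langevin_occupation(steps=100_000, dt=1e-3, interval=(0.0, math.inf), seed=1):
    """Euler-Maruyama simulation of dX = -X dt + sqrt(2) dW, returning the
    fraction of time steps spent in `interval` (the occupation observable)."""
    rng = random.Random(seed)
    x, inside = 0.0, 0
    for _ in range(steps):
        x += -x * dt + math.sqrt(2 * dt) * rng.gauss(0, 1)
        inside += interval[0] <= x < interval[1]
    return inside / steps

frac = langevin_occupation()
print(frac)  # roughly 0.5 for this symmetric potential
```

Conditioning on an atypical value of `frac` (say, 0.9) singles out the rare trajectories whose long-time statistics the effective driven process reproduces.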

  16. Toward a new methodology for measuring the threshold Shields number

    NASA Astrophysics Data System (ADS)

    Rousseau, Gauthier; Dhont, Blaise; Ancey, Christophe

    2016-04-01

    A number of bedload transport equations involve the threshold Shields number (corresponding to the threshold of incipient motion for particles resting on the streambed). Different methods have been developed for determining this threshold Shields number; they usually assume that the initial streambed is plane prior to sediment transport. Yet, there are many instances in real-world scenarios, in which the initial streambed is not free of bed forms. We are interested in developing a new methodology for determining the threshold of incipient motion in gravel-bed streams in which smooth bed forms (e.g., anti-dunes) develop. Experiments were conducted in a 10-cm wide, 2.5-m long flume, whose initial inclination was 3%. Flows were supercritical and fully turbulent. The flume was supplied with water and sediment at fixed rates. As bed forms developed and migrated, and sediment transport rates exhibited wide fluctuations, measurements had to be taken over long times (typically 10 hr). Using a high-speed camera, we recorded the instantaneous bed load transport rate at the outlet of the flume by taking top-view images. In parallel, we measured the evolution of the bed slope, water depth, and shear stress by filming through a lateral window of the flume. These measurements allowed for the estimation of the space and time-averaged slope, from which we deduced the space and time-averaged Shields number under incipient bed load transport conditions. In our experiments, the threshold Shields number was strongly dependent on streambed morphology. Experiments are under way to determine whether taking the space and time average of incipient motion experiments leads to a more robust definition of the threshold Shields number. If so, this new methodology will perform better than existing approaches at measuring the threshold Shields number.
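The quantity being estimated can be written down directly. A minimal sketch with illustrative values (not the study's data), using the depth-slope product for the space- and time-averaged bed shear stress:

```python
def bed_shear_stress(rho, g, h, slope):
    """Depth-slope product for steady uniform flow: tau = rho * g * h * S."""
    return rho * g * h * slope

def shields_number(tau, rho_s, rho, d, g=9.81):
    """Dimensionless Shields number for shear stress tau and grain diameter d."""
    return tau / ((rho_s - rho) * g * d)

# illustrative values: 2 cm flow depth, 3% slope, 5 mm gravel
tau = bed_shear_stress(rho=1000.0, g=9.81, h=0.02, slope=0.03)
theta = shields_number(tau, rho_s=2650.0, rho=1000.0, d=0.005)
print(round(theta, 4))
```

The methodology above amounts to evaluating this expression with slope, depth, and stress averaged over the long experiment rather than instantaneous values.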

  17. Evaluation of UT Wall Thickness Measurements and Measurement Methodology

    SciTech Connect

    Weier, Dennis R.; Pardini, Allan F.

    2007-10-01

    CH2M HILL has requested that PNNL examine the ultrasonic methodology utilized in the inspection of the Hanford double shell waste tanks. Specifically, PNNL is to evaluate the UT process variability and capability to detect changes in wall thickness and to document the UT operator's techniques and methodology in the determination of the reported minimum and average UT data and how it compares to the raw (unanalyzed) UT data.

  18. Methodological aspects of EEG and body dynamics measurements during motion.

    PubMed

    Reis, Pedro M R; Hebenstreit, Felix; Gabsteiger, Florian; von Tscharner, Vinzenz; Lochmann, Matthias

    2014-01-01

EEG involves the recording, analysis, and interpretation of voltages recorded on the human scalp which originate from brain gray matter. EEG is one of the most popular methods of studying and understanding the processes that underlie behavior. This is because EEG is relatively cheap, easy to wear, lightweight, and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements that are performed in response to the environment. However, there are methodological difficulties which can occur when recording EEG during movement, such as movement artifacts. Thus, most studies of the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that emerged in order to measure body and brain dynamics during motion. These descriptions cover suggestions on how to avoid and reduce motion artifacts, as well as hardware, software, and techniques for synchronously recording EEG, EMG, kinematics, kinetics, and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps, and methods for determining real/custom electrode positions. We conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks. PMID:24715858

  19. Methodological aspects of EEG and body dynamics measurements during motion

    PubMed Central

    Reis, Pedro M. R.; Hebenstreit, Felix; Gabsteiger, Florian; von Tscharner, Vinzenz; Lochmann, Matthias

    2014-01-01

EEG involves the recording, analysis, and interpretation of voltages recorded on the human scalp which originate from brain gray matter. EEG is one of the most popular methods of studying and understanding the processes that underlie behavior. This is because EEG is relatively cheap, easy to wear, lightweight, and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements that are performed in response to the environment. However, there are methodological difficulties which can occur when recording EEG during movement, such as movement artifacts. Thus, most studies of the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that emerged in order to measure body and brain dynamics during motion. These descriptions cover suggestions on how to avoid and reduce motion artifacts, as well as hardware, software, and techniques for synchronously recording EEG, EMG, kinematics, kinetics, and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps, and methods for determining real/custom electrode positions. We conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks. PMID:24715858

  20. DEVELOPMENT OF PETROLEUM RESIDUA SOLUBILITY MEASUREMENT METHODOLOGY

    SciTech Connect

    Per Redelius

    2006-03-01

    In the present study an existing spectrophotometry system was upgraded to provide high-resolution ultraviolet (UV), visible (Vis), and near infrared (NIR) analyses of test solutions to measure the relative solubilities of petroleum residua dissolved in eighteen test solvents. Test solutions were prepared by dissolving ten percent petroleum residue in a given test solvent, agitating the mixture, followed by filtration and/or centrifugation to remove insoluble materials. These solutions were finally diluted with a good solvent resulting in a supernatant solution that was analyzed by spectrophotometry to quantify the degree of dissolution of a particular residue in the suite of test solvents that were selected. Results obtained from this approach were compared with spot-test data (to be discussed) obtained from the cosponsor.

  21. The Piston Compressor: The Methodology of the Real-Time Condition Monitoring

    NASA Astrophysics Data System (ADS)

    Naumenko, A. P.; Kostyukov, V. N.

    2012-05-01

The methodology of diagnostic signal processing and a functional chart of the monitoring system are considered in the article. The monitoring and diagnostic methodology is based on measuring the parameters of indirect processes (vibroacoustic oscillations), so no more than five sensors are installed on each cylinder; measurement of direct structural and thermodynamic parameters is envisioned as well. The structure and operating principle of the decision-making expert system are given. The automatic expert-system algorithm includes calculating diagnostic attribute values against their normative values, forming sets of diagnostic attributes that correspond to individual malfunction classes, and generating expert-system messages. The scheme of a real-time condition monitoring system for piston compressors is considered. The system has a serial-parallel structure of information-measuring equipment, which allows it to measure the vibroacoustic signal for condition monitoring of reciprocating compressors and their operating modes. In addition, the system can measure parameters of other physical processes and use them for monitoring and diagnosis, such as the in-cylinder pressure (the indicator diagram), the inlet and discharge pressure of each cylinder, inlet and delivery gas temperatures, valve temperatures, rod position, leakage through the compression packing, and others.
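The expert-system step described above can be sketched as a small rule base. The attribute names, normative values, and malfunction classes below are illustrative assumptions, not the system's actual rules:

```python
def diagnose(attributes, norms, classes):
    """Flag each diagnostic attribute exceeding its normative value, then
    report every malfunction class whose required flag set is fully present."""
    flags = {name for name, value in attributes.items() if value > norms[name]}
    return sorted(c for c, required in classes.items() if required <= flags)

# hypothetical normative values and malfunction classes
norms = {"vibration_rms": 4.0, "valve_temp": 90.0, "packing_leak": 1.0}
classes = {
    "valve wear": {"vibration_rms", "valve_temp"},
    "packing failure": {"packing_leak"},
}

measured = {"vibration_rms": 5.1, "valve_temp": 97.0, "packing_leak": 0.2}
print(diagnose(measured, norms, classes))  # -> ['valve wear']
```

A real-time system would evaluate such rules on every acquisition cycle and turn each matched class into an operator message.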

  22. Measuring the Quality of Publications: New Methodology and Case Study.

    ERIC Educational Resources Information Center

    Kleijnen, Jack P. C.; Van Groenendaal, Willem

    2000-01-01

    Proposes a methodology using citations to measure the quality of journals, proceedings, and book publishers. Explains the use of statistical sampling, bootstrapping, and classification that results in ranked lists of journals, proceedings, and publishers, as evidenced in a case study of the information systems field. (Author/LRW)
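The bootstrapping step mentioned above can be sketched as a percentile bootstrap on citation counts; the data below are hypothetical, not the case study's:

```python
import random

def bootstrap_mean_ci(data, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the mean of `data`:
    resample with replacement, collect means, take the alpha/2 percentiles."""
    rng = random.Random(seed)
    means = sorted(sum(rng.choices(data, k=len(data))) / len(data)
                   for _ in range(n_boot))
    return means[int(n_boot * alpha / 2)], means[int(n_boot * (1 - alpha / 2)) - 1]

# hypothetical citation counts for one journal's articles
citations = [0, 1, 1, 2, 3, 3, 4, 5, 8, 13]
low, high = bootstrap_mean_ci(citations)
print(low, high)
```

Comparing such intervals, rather than raw means, is what lets overlapping journals be placed in the same rank tier instead of being ordered spuriously.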

  23. Methodological considerations for measuring spontaneous physical activity in rodents

    PubMed Central

    Perez-Leighton, Claudio E.; Billington, Charles J.; Kotz, Catherine M.

    2014-01-01

    When exploring biological determinants of spontaneous physical activity (SPA), it is critical to consider whether methodological factors differentially affect rodents and the measured SPA. We determined whether acclimation time, sensory stimulation, vendor, or chamber size affected measures in rodents with varying propensity for SPA. We used principal component analysis to determine which SPA components (ambulatory and vertical counts, time in SPA, and distance traveled) best described the variability in SPA measurements. We compared radiotelemetry and infrared photobeams used to measure SPA and exploratory activity. Acclimation time, sensory stimulation, vendor, and chamber size independently influenced SPA, and the effect was moderated by the propensity for SPA. A 24-h acclimation period prior to SPA measurement was sufficient for habituation. Principal component analysis showed that ambulatory and vertical measurements of SPA describe different dimensions of the rodent's SPA behavior. Smaller testing chambers and a sensory attenuation cubicle around the chamber reduced SPA. SPA varies between rodents purchased from different vendors. Radiotelemetry and infrared photobeams differ in their sensitivity to detect phenotypic differences in SPA and exploratory activity. These data highlight methodological considerations in rodent SPA measurement and a need to standardize SPA methodology. PMID:24598463

  24. Holdup measurements under realistic conditions

    SciTech Connect

    Sprinkel, J.K. Jr.; Marshall, R.; Russo, P.A.; Siebelist, R.

    1997-11-01

    This paper reviews the documentation of the precision and bias of holdup (residual nuclear material remaining in processing equipment) measurements and presents previously unreported results. Precision and bias results for holdup measurements are reported from training seminars with simulated holdup, which represent the best possible results, and compared to actual plutonium processing facility measurements. Holdup measurements for plutonium and uranium processing plants are also compared to reference values. Recommendations for measuring holdup are provided for highly enriched uranium facilities and for low enriched uranium facilities. The random error component of holdup measurements is less than the systematic error component. The most likely factor in measurement error is incorrect assumptions about the measurement, such as background, measurement geometry, or signal attenuation. Measurement precision on the order of 10% can be achieved with some difficulty. Bias of poor quality holdup measurement can also be improved. However, for most facilities, holdup measurement errors have no significant impact on inventory difference, sigma, or safety (criticality, radiation, or environmental); therefore, it is difficult to justify the allocation of more resources to improving holdup measurements. 25 refs., 10 tabs.

  25. National working conditions surveys in Latin America: comparison of methodological characteristics

    PubMed Central

    Merino-Salazar, Pamela; Artazcoz, Lucía; Campos-Serna, Javier; Gimeno, David; Benavides, Fernando G.

    2015-01-01

    Background: High-quality and comparable data to monitor working conditions and health in Latin America are not currently available. In 2007, multiple Latin American countries started implementing national working conditions surveys. However, little is known about their methodological characteristics. Objective: To identify commonalities and differences in the methodologies of working conditions surveys (WCSs) conducted in Latin America through 2013. Methods: The study critically examined WCSs in Latin America between 2007 and 2013. Sampling design, data collection, and questionnaire content were compared. Results: Two types of surveys were identified: (1) surveys covering the entire working population and administered at the respondent's home and (2) surveys administered at the workplace. There was considerable overlap in the topics covered by the dimensions of employment and working conditions measured, but less overlap in terms of health outcomes, prevention resources, and activities. Conclusions: Although WCSs from Latin America are similar, there was heterogeneity across surveyed populations and location of the interview. Reducing differences in surveys between countries will increase comparability and allow for a more comprehensive understanding of occupational health in the region. PMID:26079314

  26. From lab to field conditions: a pilot study on EEG methodology in applied sports sciences.

    PubMed

    Reinecke, Kirsten; Cordes, Marjolijn; Lerch, Christiane; Koutsandréou, Flora; Schubert, Michael; Weiss, Michael; Baumeister, Jochen

    2011-12-01

Although neurophysiological aspects have become more important in sports and exercise sciences in recent years, it has not been possible to measure cortical activity during performance outside a laboratory, due in particular to equipment limits or movement artifacts. With this pilot study we investigate whether electroencephalography (EEG) data obtained during a laboratory golf putting task differ from those obtained during a comparable putting task under field conditions. Parameters of working memory (frontal Theta and parietal Alpha 2 power) were recorded during these two conditions. Statistical calculations demonstrated a significant difference only for Theta power at F4 between the two putting conditions, "field" and "laboratory". These findings support the idea that brain activity patterns obtained under laboratory conditions are comparable but not equivalent to those obtained under field conditions. Additionally, we were able to show that the EEG methodology seems to be a reliable tool to observe brain activity under field conditions in a golf putting task. However, considering the still existing problems of movement artifacts during EEG measurements, eligible sports and exercises are limited to those that are relatively motionless during execution. Further studies are needed to confirm these pilot results. PMID:21800184

  7. Methodology of high-resolution photography for mural condition database

    NASA Astrophysics Data System (ADS)

    Higuchi, R.; Suzuki, T.; Shibata, M.; Taniguchi, Y.

    2015-08-01

    Digital documentation is one of the most useful techniques for recording the condition of cultural heritage. Recently, high-resolution images have become increasingly useful because they make it possible to show both general views of mural paintings and detailed mural conditions in a single image. As mural paintings are damaged by environmental stresses, it is necessary to record the details of painting condition on high-resolution base maps. Unfortunately, the cost of high-resolution photography and the difficulty of operating its instruments and software have commonly been impediments for researchers and conservators. However, recent developments in graphic software have made the operation simpler and less expensive. In this paper, we suggest a new approach to making digital heritage inventories without special instruments, based on our recent research project in the Üzümlü church in Cappadocia, Turkey. This method enables us to build a high-resolution image database at low cost, in a short time, and with limited human resources.

  8. Research Methodology: Endocrinologic Measurements in Exercise Science and Sports Medicine

    PubMed Central

    Hackney, Anthony C; Viru, Atko

    2008-01-01

    Objective: To provide background information on methodologic factors that influence and add variance to endocrine outcome measurements. Our intent is to aid and improve the quality of exercise science and sports medicine research endeavors of investigators inexperienced in endocrinology. Background: Numerous methodologic factors influence human endocrine (hormonal) measurements and, consequently, can dramatically compromise the accuracy and validity of exercise and sports medicine research. These factors can be categorized into those that are biologic and those that are procedural-analytic in nature. Recommendations: Researchers should design their studies to monitor, control, and adjust for the biologic and procedural-analytic factors discussed within this paper. By doing so, they will find less variance in their hormonal outcomes and thereby will increase the validity of their physiologic data. These actions can assist the researcher in the interpretation and understanding of endocrine data and, in turn, make their research more scientifically sound. PMID:19030142

  9. Physiological outflow boundary conditions methodology for small arteries with multiple outlets: a patient-specific hepatic artery haemodynamics case study.

    PubMed

    Aramburu, Jorge; Antón, Raúl; Bernal, Nebai; Rivas, Alejandro; Ramos, Juan Carlos; Sangro, Bruno; Bilbao, José Ignacio

    2015-04-01

    Physiological outflow boundary conditions are necessary to carry out computational fluid dynamics simulations that reliably represent the blood flow through arteries. When dealing with complex three-dimensional trees of small arteries, and therefore with multiple outlets, the robustness and speed of convergence are also important. This study derives physiological outflow boundary conditions for cases in which the physiological values at those outlets are not known (neither in vivo measurements nor literature-based values are available) and in which the tree exhibits symmetry to some extent. The inputs of the methodology are the three-dimensional domain and the flow rate waveform and the systolic and diastolic pressures at the inlet. The derived physiological outflow boundary conditions, which are a physiological pressure waveform for each outlet, are based on the results of a zero-dimensional model simulation. The methodology assumes symmetrical branching and is able to tackle the flow distribution problem when the domain outlets are at branches with a different number of upstream bifurcations. The methodology is applied to a group of patient-specific arteries in the liver. The methodology is considered to be valid because the pulsatile computational fluid dynamics simulation with the inflow flow rate waveform (input of the methodology) and the derived outflow boundary conditions lead to physiological results, that is, the resulting systolic and diastolic pressures at the inlet match the inputs of the methodology, and the flow split is also physiological. PMID:25934258
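
    The abstract does not give the details of the zero-dimensional model, so as a hedged illustration only: the general idea of deriving an outlet pressure waveform from an inlet flow waveform can be sketched with a generic two-element Windkessel model. All numerical values below (resistance, compliance, flow waveform) are hypothetical, not the paper's.

```python
import math

def windkessel_pressure(flow, dt, R, C, p0):
    """Two-element Windkessel: C*dP/dt = Q(t) - P/R, integrated by explicit
    Euler. Returns the pressure waveform sampled like the flow waveform."""
    p = p0
    out = []
    for q in flow:
        out.append(p)
        p += dt * (q - p / R) / C
    return out

# Hypothetical pulsatile flow (ml/s) over one 0.8 s cardiac cycle.
dt = 0.001
n = 800
flow = [5.0 + 4.0 * math.sin(2 * math.pi * i / n) for i in range(n)]

# Hypothetical distal resistance (mmHg*s/ml) and compliance (ml/mmHg);
# with mean flow 5 ml/s and R = 16, mean pressure settles near 80 mmHg.
pressure = windkessel_pressure(flow, dt, R=16.0, C=0.05, p0=80.0)
```

    Tuning R and C per outlet is one way such lumped models produce a physiological pressure waveform at each outlet of a branching tree.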

  10. Novel CD-SEM measurement methodology for complex OPCed patterns

    NASA Astrophysics Data System (ADS)

    Lee, Hyung-Joo; Park, Won Joo; Choi, Seuk Hwan; Chung, Dong Hoon; Shin, Inkyun; Kim, Byung-Gook; Jeon, Chan-Uk; Fukaya, Hiroshi; Ogiso, Yoshiaki; Shida, Soichi; Nakamura, Takayuki

    2014-07-01

    As lithography design rules shrink, accuracy and precision of Critical Dimension (CD) measurement and controllability of heavily OPCed patterns are required in semiconductor production. Critical Dimension Scanning Electron Microscopes (CD-SEM) are essential tools for confirming mask quality metrics such as CD control, CD uniformity, and CD mean to target (MTT). Repeatability and Reproducibility (R and R) performance depends on the length of the Region of Interest (ROI), so the measured CD can fluctuate easily in the extremely narrow regions of OPCed patterns. Consequently, it is very difficult to define MTT and uniformity of complex OPCed masks using the conventional SEM measurement approach. To overcome these difficulties, we evaluated Design Based Metrology (DBM) using the Large Field Of View (LFOV) of a CD-SEM. DBM can standardize measurement points and positions within the LFOV based on the inflections/jogs of OPCed patterns, and has thereby realized measurements averaging CD over several thousand ROIs. This new measurement technique removes local CD errors and improves the statistical methodology across the entire mask, enhancing the representativeness of global CD uniformity. With this study we confirmed the new technique as a more reliable methodology for complex OPCed patterns than conventional technology. This paper summarizes experiments with DBM and LFOV on various pattern types and compares them with current CD-SEM methods.
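
    The statistical benefit of averaging several thousand ROIs, as DBM does, can be illustrated with a small simulation: the spread of site-level CD values shrinks roughly as 1/sqrt(n). The target CD and noise level below are hypothetical.

```python
import random
import statistics

random.seed(0)
TRUE_CD, LOCAL_SIGMA = 100.0, 2.0  # nm; hypothetical target CD and local noise

def measured_cd():
    """One ROI measurement, perturbed by local CD error."""
    return random.gauss(TRUE_CD, LOCAL_SIGMA)

def site_cd(n_rois):
    """DBM-style site value: CD averaged over many ROIs within the LFOV."""
    return statistics.mean(measured_cd() for _ in range(n_rois))

single = [measured_cd() for _ in range(200)]    # conventional: 1 ROI per site
averaged = [site_cd(2000) for _ in range(200)]  # DBM: thousands of ROIs per site
print(statistics.stdev(single), statistics.stdev(averaged))
```

    The averaged sites track the global CD far more tightly, which is why multi-ROI averaging makes global uniformity estimates more representative.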

  11. Methodology for measurement in schools and kindergartens: experiences.

    PubMed

    Fojtíková, I; Navrátilová Rovenská, K

    2015-06-01

    In more than 1500 schools and preschool facilities, long-term radon measurement was carried out in the last 3 y. The negative effect of thermal retrofitting on the resulting long-term radon averages is evident. In some of the facilities, low ventilation rates and correspondingly high radon levels were found, so it was recommended to change ventilation habits. However, some of the facilities had high radon levels due to its ingress from soil gas. Technical measures should be undertaken to reduce radon exposure in this case. The paper presents the long-term experiences with the two-stage measurement methodology for investigation of radon levels in school and preschool facilities and its possible improvements. PMID:25979748

  12. Methodological Research Priorities in Palliative Care and Hospice Quality Measurement.

    PubMed

    Dy, Sydney Morss; Herr, Keela; Bernacki, Rachelle E; Kamal, Arif H; Walling, Anne M; Ersek, Mary; Norton, Sally A

    2016-02-01

    Quality measurement is a critical tool for improving palliative care and hospice, but significant research is needed to improve the application of quality indicators. We defined methodological priorities for advancing the science of quality measurement in this field based on discussions of the Technical Advisory Panel of the Measuring What Matters consensus project of the American Academy of Hospice and Palliative Medicine and Hospice and Palliative Nurses Association and a subsequent strategy meeting to better clarify research challenges, priorities, and quality measurement implementation strategies. In this article, we describe three key priorities: 1) defining the denominator(s) (or the population of interest) for palliative care quality indicators, 2) developing methods to measure quality from different data sources, and 3) conducting research to advance the development of patient/family-reported indicators. We then apply these concepts to the key quality domain of advance care planning and address relevance to implementation of indicators in improving care. Developing the science of quality measurement in these key areas of palliative care and hospice will facilitate improved quality measurement across all populations with serious illness and care for patients and families. PMID:26596877

  13. A Methodology for Measuring Strain in Power Semiconductors

    NASA Astrophysics Data System (ADS)

    Avery, Seth M.

    The objective of this work is to develop a strain measurement methodology for use in power electronics during electrical operation; such that strain models can be developed and used as the basis of an active strain controller---improving the reliability of power electronics modules. This research involves developing electronic speckle pattern interferometry (ESPI) into a technology capable of measuring thermal-mechanical strain in electrically active power semiconductors. ESPI is a non-contact optical technique capable of high resolution (approx. 10 nm) surface displacement measurements. This work has developed a 3-D ESPI test stand, where simultaneous in- and out-of-plane measured components are combined to accurately determine full-field surface displacement. Two cameras are used to capture both local (interconnect level) displacements and strains, and global (device level) displacements. Methods have been developed to enable strain measurements of larger loads, while avoiding speckle decorrelation (which limits ESPI measurement of large deformations). A method of extracting strain estimates directly from unfiltered and wrapped phase maps has been developed, simplifying data analysis. Experimental noise measurements are made and used to develop optimal filtering using model-based tracking and determined strain noise characteristics. The experimental results of this work are strain measurements made on the surface of a leadframe of an electrically active IGBT. A model-based tracking technique has been developed to allow for the optimal strain solution to be extracted from noisy displacement results. Also, an experimentally validated thermal-mechanical FE strain model has been developed. The results of this work demonstrate that in situ strain measurements in power devices are feasible. Using the procedures developed in the work, strain measurements at critical locations of strain, which limit device reliability, at relevant power levels can be completed.
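
    ESPI itself yields displacement maps; strain then follows as the spatial derivative of displacement. A minimal sketch of that last step, using a synthetic noisy displacement profile with hypothetical values rather than real interferometry data:

```python
import numpy as np

# Synthetic in-plane displacement u(x) along a 1 mm line, standing in for an
# ESPI displacement map: uniform strain of 500 microstrain plus ~2 nm noise.
x = np.linspace(0.0, 1.0e-3, 201)               # position, m
u = 500e-6 * x                                  # displacement, m (u = eps * x)
u_noisy = u + np.random.default_rng(1).normal(0.0, 2e-9, x.size)

# Strain as the numerical derivative of displacement: eps = du/dx.
strain = np.gradient(u_noisy, x)
print(strain.mean())                            # close to 5e-4
```

    On real data the per-point derivative is noisy, which is why the work uses model-based tracking and filtering rather than raw differentiation.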

  14. Select Methodology for Validating Advanced Satellite Measurement Systems

    NASA Technical Reports Server (NTRS)

    Larar, Allen M.; Zhou, Daniel K.; Liu, Xi; Smith, William L.

    2008-01-01

    Advanced satellite sensors are tasked with improving global measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring capability, and environmental change detection. Measurement system validation is crucial to achieving this goal and maximizing research and operational utility of resultant data. Field campaigns including satellite under-flights with well calibrated FTS sensors aboard high-altitude aircraft are an essential part of the validation task. This presentation focuses on an overview of validation methodology developed for assessment of high spectral resolution infrared systems, and includes results of preliminary studies performed to investigate the performance of the Infrared Atmospheric Sounding Interferometer (IASI) instrument aboard the MetOp-A satellite.

  15. Conditional Standard Error of Measurement in Prediction.

    ERIC Educational Resources Information Center

    Woodruff, David

    1990-01-01

    A method of estimating conditional standard error of measurement at specific score/ability levels is described that avoids theoretical problems identified for previous methods. The method focuses on variance of observed scores conditional on a fixed value of an observed parallel measurement, decomposing these variances into true and error parts.…
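
    Under classical test theory with parallel forms, the decomposition can be illustrated by simulation: fix the observed score on one form and examine the variance of the other. The sketch below assumes normally distributed true and error scores with hypothetical SDs; it illustrates the variance decomposition, not the article's estimator.

```python
import random
import statistics

random.seed(42)
SIGMA_T, SIGMA_E = 10.0, 4.0   # hypothetical true-score and error SDs

# Parallel measurements X1, X2 share the same true score T.
pairs = []
for _ in range(200000):
    t = random.gauss(50.0, SIGMA_T)
    pairs.append((t + random.gauss(0.0, SIGMA_E),
                  t + random.gauss(0.0, SIGMA_E)))

# Condition on a fixed (binned) value of X1 and look at Var(X2 | X1).
x2_given = [x2 for x1, x2 in pairs if 49.5 <= x1 < 50.5]
cond_var = statistics.variance(x2_given)

# Decomposition into true and error parts: Var(X2|X1) = Var(T|X1) + sigma_E^2,
# with Var(T|X1) = sigma_T^2 * sigma_E^2 / (sigma_T^2 + sigma_E^2) for
# normally distributed scores.
expected = SIGMA_T**2 * SIGMA_E**2 / (SIGMA_T**2 + SIGMA_E**2) + SIGMA_E**2
print(cond_var, expected)
```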

  16. Measuring user experience in digital gaming: theoretical and methodological issues

    NASA Astrophysics Data System (ADS)

    Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte

    2007-01-01

    There are innumerable concepts, terms and definitions for user experience, and few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to creating theoretically and methodologically sound methods for quantifying the rich user experience in different digital environments. Our approach is based on the idea that the experience received from content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We present our two basic measurement frameworks, which have been developed and tested on a large data set (n=2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, content for specific viewing devices or viewing situations.

  17. Telemetric measurement system of beehive environment conditions

    NASA Astrophysics Data System (ADS)

    Walendziuk, Wojciech; Sawicki, Aleksander

    2014-11-01

    This work presents a measurement system for beehive environmental conditions. The purpose of the device is to measure parameters such as ambient temperature, atmospheric pressure, internal temperature, humidity and sound level. The measured values are transferred over the GPRS protocol to a MySQL database located on an external server. A website presents the measurement data in the form of tables and graphs. The study also shows exemplary results of environmental conditions recorded in the beehive in an hourly cycle.
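
    The logging path described (sensor readings stored in a server-side database) can be sketched as follows. sqlite3 stands in here for the remote MySQL backend, and the table and column names are hypothetical.

```python
import datetime
import sqlite3

# In-memory sqlite3 database as a stand-in for the paper's remote MySQL server.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE hive_log (
    ts TEXT, ambient_c REAL, internal_c REAL,
    pressure_hpa REAL, humidity_pct REAL, sound_db REAL)""")

def log_reading(ambient_c, internal_c, pressure_hpa, humidity_pct, sound_db):
    """Store one set of beehive sensor readings with a timestamp."""
    conn.execute("INSERT INTO hive_log VALUES (?, ?, ?, ?, ?, ?)",
                 (datetime.datetime.now().isoformat(),
                  ambient_c, internal_c, pressure_hpa, humidity_pct, sound_db))
    conn.commit()

log_reading(18.2, 34.6, 1013.2, 61.0, 48.5)   # one example hourly sample
rows = conn.execute("SELECT COUNT(*), internal_c FROM hive_log").fetchone()
```

    A website can then query the same table to render the hourly tables and graphs.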

  18. Methodological considerations for measuring glucocorticoid metabolites in feathers

    PubMed Central

    Berk, Sara A.; McGettrick, Julie R.; Hansen, Warren K.; Breuner, Creagh W.

    2016-01-01

    In recent years, researchers have begun to use corticosteroid metabolites in feathers (fCORT) as a metric of stress physiology in birds. However, there remain substantial questions about how to measure fCORT most accurately. Notably, small samples contain artificially high amounts of fCORT per millimetre of feather (the small sample artefact). Furthermore, it appears that fCORT is correlated with circulating plasma corticosterone only when levels are artificially elevated by the use of corticosterone implants. Here, we used several approaches to address current methodological issues with the measurement of fCORT. First, we verified that the small sample artefact exists across species and feather types. Second, we attempted to correct for this effect by increasing the amount of methanol relative to the amount of feather during extraction. We consistently detected more fCORT per millimetre or per milligram of feather in small samples than in large samples even when we adjusted methanol:feather concentrations. We also used high-performance liquid chromatography to identify hormone metabolites present in feathers and measured the reactivity of these metabolites against the most commonly used antibody for measuring fCORT. We verified that our antibody is mainly identifying corticosterone (CORT) in feathers, but other metabolites have significant cross-reactivity. Lastly, we measured faecal glucocorticoid metabolites in house sparrows and correlated these measurements with corticosteroid metabolites deposited in concurrently grown feathers; we found no correlation between faecal glucocorticoid metabolites and fCORT. We suggest that researchers should be cautious in their interpretation of fCORT in wild birds and should seek alternative validation methods to examine species-specific relationships between environmental challenges and fCORT. PMID:27335650

  19. Methodological considerations for measuring glucocorticoid metabolites in feathers.

    PubMed

    Berk, Sara A; McGettrick, Julie R; Hansen, Warren K; Breuner, Creagh W

    2016-01-01

    In recent years, researchers have begun to use corticosteroid metabolites in feathers (fCORT) as a metric of stress physiology in birds. However, there remain substantial questions about how to measure fCORT most accurately. Notably, small samples contain artificially high amounts of fCORT per millimetre of feather (the small sample artefact). Furthermore, it appears that fCORT is correlated with circulating plasma corticosterone only when levels are artificially elevated by the use of corticosterone implants. Here, we used several approaches to address current methodological issues with the measurement of fCORT. First, we verified that the small sample artefact exists across species and feather types. Second, we attempted to correct for this effect by increasing the amount of methanol relative to the amount of feather during extraction. We consistently detected more fCORT per millimetre or per milligram of feather in small samples than in large samples even when we adjusted methanol:feather concentrations. We also used high-performance liquid chromatography to identify hormone metabolites present in feathers and measured the reactivity of these metabolites against the most commonly used antibody for measuring fCORT. We verified that our antibody is mainly identifying corticosterone (CORT) in feathers, but other metabolites have significant cross-reactivity. Lastly, we measured faecal glucocorticoid metabolites in house sparrows and correlated these measurements with corticosteroid metabolites deposited in concurrently grown feathers; we found no correlation between faecal glucocorticoid metabolites and fCORT. We suggest that researchers should be cautious in their interpretation of fCORT in wild birds and should seek alternative validation methods to examine species-specific relationships between environmental challenges and fCORT. PMID:27335650

  20. Experimental Methodology for Measuring Combustion and Injection-Coupled Responses

    NASA Technical Reports Server (NTRS)

    Cavitt, Ryan C.; Frederick, Robert A.; Bazarov, Vladimir G.

    2006-01-01

    A Russian scaling methodology for liquid rocket engines utilizing a single, full-scale element is reviewed. The scaling methodology exploits the supercritical phase of the full-scale propellants to simplify scaling requirements. Many assumptions are made in the derivation of the scaling criteria. A test apparatus design is presented to implement the Russian methodology and consequently verify the assumptions. This test apparatus will allow researchers to assess the usefulness of the scaling procedures and possibly enhance the methodology. A matrix of the apparatus capabilities for an RD-170 injector is also presented. Several methods to enhance the methodology were generated through the design process.

  1. Innovative methodologies and technologies for thermal energy release measurement.

    NASA Astrophysics Data System (ADS)

    Marotta, Enrica; Peluso, Rosario; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Chiodini, Giovanni; Mangiacapra, Annarita; Petrillo, Zaccaria; Sansivero, Fabio; Vilardo, Giuseppe; Marfe, Barbara

    2016-04-01

    Volcanoes exchange heat, gases and other fluids between the interior of the Earth and its atmosphere, influencing processes both at the surface and above it. This work is devoted to improving knowledge of the parameters that control the anomalies in heat flux and chemical species emissions associated with the diffuse degassing processes of volcanic and hydrothermal zones. We are studying and developing innovative medium-range remote sensing technologies to measure the variations through time of heat flux and chemical emissions, in order to sharpen the definition of the activity state of a volcano and allow a better assessment of the related hazard and risk mitigation. The current methodologies used to measure heat flux (i.e. CO2 flux or temperature gradient) are either poorly efficient or poorly effective, and are unable to detect short- to medium-term (days to months) variation trends in the heat flux. Remote sensing of these parameters will allow measurements faster than already accredited methods; it will therefore be both more effective and more efficient in an emergency, and usable for quick routine monitoring. We are currently developing a method based on drone-borne IR cameras to measure the ground surface temperature which, in a purely conductive regime, is directly correlated to the shallow temperature gradient. The use of flying drones will allow us to quickly map areas with thermal anomalies and measure their temperature at distances on the order of hundreds of meters. Further development of remote sensing will use multispectral and/or hyperspectral sensors and UV scanners on flying drones in order to detect the amount of chemical species released into the atmosphere.
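
    The stated link between the shallow temperature gradient and heat flux in a purely conductive regime is Fourier's law. A minimal illustration, with hypothetical hydrothermal-area values:

```python
def conductive_heat_flux(t_depth_c, t_surface_c, depth_m, k):
    """Fourier's law in 1-D: q = -k * dT/dz (W/m^2, positive upward here)."""
    return k * (t_depth_c - t_surface_c) / depth_m

# Hypothetical values: 65 degC at 0.3 m depth, 25 degC at the surface,
# thermal conductivity k ~ 1.2 W/(m*K) for wet soil.
q = conductive_heat_flux(65.0, 25.0, 0.3, 1.2)
print(q)  # about 160 W/m^2
```

    This is why a drone-borne map of surface temperature, combined with an assumed or measured conductivity, constrains the conductive heat flux of the anomaly.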

  2. Optimization of Preparation Conditions for Lysozyme Nanoliposomes Using Response Surface Methodology and Evaluation of Their Stability.

    PubMed

    Wu, Zhipan; Guan, Rongfa; Lyu, Fei; Liu, Mingqi; Gao, Jianguo; Cao, Guozou

    2016-01-01

    The main purpose of this study was to optimize the preparation of lysozyme nanoliposomes using response surface methodology and to measure their stability. The stability of lysozyme nanoliposomes in simulated gastric fluid (SGF) and simulated intestinal fluid (SIF), and under varying pH, temperature and sonication treatment time, was evaluated. The reverse-phase evaporation method is an easy, rapid, and effective approach for nanoliposome preparation and optimization. The optimal preparative conditions were as follows: phosphatidylcholine-to-cholesterol ratio of 3.86, lysozyme concentration of 1.96 mg/mL, magnetic stirring time of 40.61 min, and ultrasound time of 14.15 min. At the optimal point, encapsulation efficiency and particle size were found to be 75.36% ± 3.20% and 245.6 nm ± 5.2 nm, respectively. The lysozyme nanoliposomes demonstrated reasonable stability in SGF and SIF at 37 °C for 4 h, and short sonication times were sufficient to attain nano-scaled liposomes. Under conditions of high temperature, acidity or alkalinity, lysozyme nanoliposomes are unstable. PMID:27338315
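
    The core of response surface methodology is fitting a second-order model to the measured response and locating its stationary point. A one-factor sketch on synthetic data; the factor levels and responses below are hypothetical, not the study's measurements.

```python
import numpy as np

# Hypothetical single-factor example: encapsulation efficiency (%) versus
# phosphatidylcholine:cholesterol ratio.
ratio = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
ee = np.array([61.0, 72.0, 75.0, 71.0, 60.0])

# Fit the second-order response model y = b0 + b1*x + b2*x^2.
b2, b1, b0 = np.polyfit(ratio, ee, 2)

# Stationary point of the fitted surface (a maximum when b2 < 0).
optimum_ratio = -b1 / (2.0 * b2)
print(round(optimum_ratio, 2))
```

    Full RSM designs do the same with several factors at once (here ratio, concentration, stirring time, ultrasound time) and a cross-term quadratic model.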

  3. Measuring persistence: A literature review focusing on methodological issues

    SciTech Connect

    Wolfe, A.K.; Brown, M.A.; Trumble, D.

    1995-03-01

    This literature review was conducted as part of a larger project to produce a handbook on the measurement of persistence. The past decade has marked the development of the concept of persistence and a growing recognition that the long-term impacts of demand-side management (DSM) programs warrant careful assessment. Although increasing attention has been paid to the topic of persistence, no clear consensus has emerged either about its definition or about the methods most appropriate for its measurement and analysis. This project strives to fill that gap by reviewing the goals, terminology, and methods of past persistence studies. It was conducted from the perspective of a utility that seeks to acquire demand-side resources and is interested in their long-term durability; it was not conducted from the perspective of the individual consumer. Over 30 persistence studies, articles, and protocols were examined for this report. The review begins by discussing the underpinnings of persistence studies: namely, the definitions of persistence and the purposes of persistence studies. Then it describes issues relevant to both the collection and analysis of data on the persistence of energy and demand savings. Findings from persistence studies are also summarized. Throughout the review, four studies are used repeatedly to illustrate different methodological and analytical approaches to persistence, so that readers can track the data collection, data analysis, and findings of a set of comprehensive studies that represent alternative approaches.

  4. Measurement of steroid concentrations in brain tissue: methodological considerations.

    PubMed

    Taves, Matthew D; Ma, Chunqi; Heimovics, Sarah A; Saldanha, Colin J; Soma, Kiran K

    2011-01-01

    It is well recognized that steroids are synthesized de novo in the brain (neurosteroids). In addition, steroids circulating in the blood enter the brain. Steroids play numerous roles in the brain, such as influencing neural development, adult neuroplasticity, behavior, neuroinflammation, and neurodegenerative diseases such as Alzheimer's disease. In order to understand the regulation and functions of steroids in the brain, it is important to directly measure steroid concentrations in brain tissue. In this brief review, we discuss methods for the detection and quantification of steroids in the brain. We concisely present the major advantages and disadvantages of different technical approaches at various experimental stages: euthanasia, tissue collection, steroid extraction, steroid separation, and steroid measurement. We discuss, among other topics, the potential effects of anesthesia and saline perfusion prior to tissue collection; microdissection via Palkovits punch; solid phase extraction; chromatographic separation of steroids; and immunoassays and mass spectrometry for steroid quantification, particularly the use of mass spectrometry for "steroid profiling." Finally, we discuss the interpretation of local steroid concentrations, such as comparing steroid levels in brain tissue with those in the circulation (plasma vs. whole blood samples; total vs. free steroid levels). We also present reference values for a variety of steroids in different brain regions of adult rats. This brief review highlights some of the major methodological considerations at multiple experimental stages and provides a broad framework for designing studies that examine local steroid levels in the brain as well as other steroidogenic tissues, such as thymus, breast, and prostate. PMID:22654806

  5. Measurement of testosterone in human sexuality research: methodological considerations.

    PubMed

    van Anders, Sari M; Goldey, Katherine L; Bell, Sarah N

    2014-02-01

    Testosterone (T) and other androgens are incorporated into an increasingly wide array of human sexuality research, but there are a number of issues that can affect or confound research outcomes. This review addresses various methodological issues relevant to research design in human studies with T; unaddressed, these issues may introduce unwanted noise, error, or conceptual barriers to interpreting results. Topics covered are (1) social and demographic factors (gender and sex; sexual orientations and sexual diversity; social/familial connections and processes; social location variables), (2) biological rhythms (diurnal variation; seasonality; menstrual cycles; aging and menopause), (3) sample collection, handling, and storage (saliva vs. blood; sialogogues, saliva, and tubes; sampling frequency, timing, and context; shipping samples), (4) health, medical issues, and the body (hormonal contraceptives; medications and nicotine; health conditions and stress; body composition, weight, and exercise), and (5) incorporating multiple hormones. Detailing a comprehensive set of important issues and relevant empirical evidence, this review provides a starting point for best practices in human sexuality research with T and other androgens that may be especially useful for those new to hormone research. PMID:23807216

  6. Adaptability of laser diffraction measurement technique in soil physics methodology

    NASA Astrophysics Data System (ADS)

    Barna, Gyöngyi; Szabó, József; Rajkai, Kálmán; Bakacsi, Zsófia; Koós, Sándor; László, Péter; Hauk, Gabriella; Makó, András

    2016-04-01

    There are intentions all around the world to harmonize soil particle size distribution (PSD) data obtained by laser diffractometer measurements (LDM) with those of sedimentation techniques (pipette or hydrometer methods). Unfortunately, depending on the applied methodology (e.g. type of pre-treatment, kind of dispersant, etc.), the PSDs of the sedimentation methods (which follow different standards) are themselves dissimilar and can hardly be harmonized with each other. A need therefore arose to build a database containing PSD values measured both by the pipette method according to the Hungarian standard (MSZ-08. 0205: 1978) and by LDM according to a widespread and widely used procedure. In this publication the first results of statistical analysis of the new and growing PSD database are presented: 204 soil samples measured with the pipette method and LDM (Malvern Mastersizer 2000, HydroG dispersion unit) were compared. With the usual size limits applied to LDM, the clay fraction was strongly underestimated and the silt fraction overestimated relative to the pipette method; consequently, soil texture classes determined from LDM measurements differed significantly from the pipette results. Following earlier surveys, and by relating the two datasets to each other, the clay/silt boundary for LDM was therefore modified. Comparing the PSD results of the pipette method with those of LDM, the modified size limits gave higher similarity for the clay and silt fractions. Extending the upper size limit of the clay fraction from 0.002 to 0.0066 mm, and correspondingly shifting the lower size limit of the silt fraction, makes the pipette method and LDM easier to compare. With the modified limit, higher correlations were also found between clay content and water vapor adsorption and specific surface area, and the texture classes were less dissimilar. The difference between the results of the two PSD measurement methods could be further reduced by taking other routinely analyzed soil parameters into account.
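
    The effect of moving the clay/silt boundary can be computed directly by interpolating a cumulative PSD curve at the old and new limits; the curve below is hypothetical, not taken from the study's database.

```python
import numpy as np

# Hypothetical cumulative LDM PSD: percent of particles finer than each size.
size_mm = np.array([0.0005, 0.001, 0.002, 0.0066, 0.02, 0.05, 0.25, 2.0])
pct_finer = np.array([8.0, 14.0, 21.0, 34.0, 55.0, 72.0, 93.0, 100.0])

def fraction_below(limit_mm):
    """Percent finer than limit_mm, interpolated log-linearly in size."""
    return float(np.interp(np.log10(limit_mm), np.log10(size_mm), pct_finer))

clay_standard = fraction_below(0.002)    # conventional clay upper limit
clay_modified = fraction_below(0.0066)   # modified limit proposed in the study
print(clay_standard, clay_modified)      # 21.0 34.0
```

    Raising the limit moves part of the fine silt into the clay fraction, which is how the LDM clay underestimation relative to the pipette method is compensated.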

  7. Measurement of Steroid Concentrations in Brain Tissue: Methodological Considerations

    PubMed Central

    Taves, Matthew D.; Ma, Chunqi; Heimovics, Sarah A.; Saldanha, Colin J.; Soma, Kiran K.

    2011-01-01

    It is well recognized that steroids are synthesized de novo in the brain (neurosteroids). In addition, steroids circulating in the blood enter the brain. Steroids play numerous roles in the brain, such as influencing neural development, adult neuroplasticity, behavior, neuroinflammation, and neurodegenerative diseases such as Alzheimer’s disease. In order to understand the regulation and functions of steroids in the brain, it is important to directly measure steroid concentrations in brain tissue. In this brief review, we discuss methods for the detection and quantification of steroids in the brain. We concisely present the major advantages and disadvantages of different technical approaches at various experimental stages: euthanasia, tissue collection, steroid extraction, steroid separation, and steroid measurement. We discuss, among other topics, the potential effects of anesthesia and saline perfusion prior to tissue collection; microdissection via Palkovits punch; solid phase extraction; chromatographic separation of steroids; and immunoassays and mass spectrometry for steroid quantification, particularly the use of mass spectrometry for “steroid profiling.” Finally, we discuss the interpretation of local steroid concentrations, such as comparing steroid levels in brain tissue with those in the circulation (plasma vs. whole blood samples; total vs. free steroid levels). We also present reference values for a variety of steroids in different brain regions of adult rats. This brief review highlights some of the major methodological considerations at multiple experimental stages and provides a broad framework for designing studies that examine local steroid levels in the brain as well as other steroidogenic tissues, such as thymus, breast, and prostate. PMID:22654806

  8. [Measuring the intracoronary pressure gradient--value and methodologic limitations].

    PubMed

    Sievert, H; Kaltenbach, M

    1987-06-01

Measurements of pressure gradients were performed in a fluid-filled model. The hydrostatically regulated perfusion pressure, as well as the diameter of the tube segments and the regulation of the flow by peripheral resistance, were comparable to conditions in human coronary arteries. Pressure gradients above 20 mm Hg were only measured with a reduction in cross-sectional area of more than 90%. Even after increasing the flow four-fold, which corresponds to the human coronary flow reserve, as well as after probing the stenosis with different catheters (2F-5F), gradients greater than 20 mm Hg were only recorded with high-grade stenoses (more than 80% reduction in cross-sectional area). The findings in this model demonstrate that measurement of pressure gradients allows only a qualitative differentiation between high-grade (greater than 80%) and low-grade (less than 80%) stenoses. The catheter itself can substantially contribute to the gradient by vessel obstruction, depending on the diameter of the catheter and of the coronary vessel. A quantitative assessment of the stenosis therefore requires knowledge of the pre- and post-stenotic vessel diameter as well as of the catheter diameter. However, pressure measurements during transluminal coronary angioplasty should not be abandoned. They can be useful to aid catheter positioning and to estimate dilatation efficacy. Moreover, measurement of coronary capillary wedge pressure during balloon expansion provides valuable information about the extent of collateralisation. PMID:2957862
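The steep nonlinearity described above can be illustrated with the simplified Bernoulli relation (ΔP ≈ 4v², v in m/s, ΔP in mm Hg) familiar from Doppler echocardiography; this is a sketch with assumed vessel dimensions and flows, not the authors' model:

```python
import math

def pressure_gradient_mmhg(flow_ml_s, diameter_mm, area_reduction):
    """Simplified Bernoulli estimate of the trans-stenotic gradient.

    flow_ml_s      : volumetric flow through the vessel (mL/s)
    diameter_mm    : unobstructed vessel diameter (mm)
    area_reduction : fractional reduction in cross-sectional area (0..1)
    """
    radius_m = diameter_mm / 2 / 1000.0
    stenosed_area_m2 = math.pi * radius_m ** 2 * (1.0 - area_reduction)
    v = (flow_ml_s * 1e-6) / stenosed_area_m2   # jet velocity, m/s
    return 4.0 * v ** 2                          # dP ~ 4 v^2, in mm Hg

# Assumed resting coronary flow ~1 mL/s in a 3 mm vessel:
for reduction in (0.50, 0.80, 0.90, 0.95):
    print(reduction, round(pressure_gradient_mmhg(1.0, 3.0, reduction), 1))
```

With these assumed numbers the gradient stays below 20 mm Hg until the area reduction exceeds roughly 90%, and scales 16-fold when flow is quadrupled, mirroring the qualitative behavior reported above.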

  9. Apparatus for measuring rat body volume: a methodological proposition.

    PubMed

    Hohl, Rodrigo; de Oliveira, Renato Buscariolli; Vaz de Macedo, Denise; Brenzikofer, René

    2007-03-01

    We propose a communicating-vessels system to measure body volume in live rats through water level detection by hydrostatic weighing. The reproducibility, accuracy, linearity, and reliability of this apparatus were evaluated in two tests using previously weighed water or six aluminum cylinders of known volume after proper system calibration. The applicability of this apparatus to measurement of live animals (Wistar rats) was tested in a transversal experiment with five rats, anesthetized and nonanesthetized. We took 18 measurements of the volume under each condition (anesthetized and nonanesthetized), totaling 90 measurements. The addition of water volumes (50-700 ml) produced a regression equation with a slope of 1.0006 +/- 0.0017, intercept of 0.75 +/- 0.81 (R(2) = 0.99999, standard error of estimate = 0.58 ml), and bias of approximately 1 ml. The differences between cylinders of known volumes and volumes calculated by the system were <0.4 ml. Mean volume errors were 0.01-0.07%. Among the live models, the difference between the volumes obtained for anesthetized and nonanesthetized rats was 0.31 +/- 2.34 (SD) ml (n = 90). These data showed that animal movement does not interfere with the volume measured by the proposed apparatus, and neither anesthesia nor fur shaving is needed for this procedure. Nevertheless, some effort should be taken to eliminate air bubbles trapped in the apparatus or the fur. The proposed apparatus for measuring rat body volume is inexpensive and may be useful for a range of scientific purposes. PMID:17082370
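The calibration step can be sketched as an ordinary least-squares fit of measured versus added water volumes; the synthetic data below simply reuse the reported slope (1.0006) and intercept (0.75 mL) as ground truth, with an assumed noise level:

```python
import numpy as np

# Hypothetical calibration run: known water volumes (mL) poured into the
# system vs. volumes read back from the water-level detector.
added = np.arange(50.0, 701.0, 50.0)                    # 50-700 mL
rng = np.random.default_rng(1)
measured = 1.0006 * added + 0.75 + rng.normal(0, 0.6, added.size)

# Ordinary least squares: measured = slope * added + intercept
slope, intercept = np.polyfit(added, measured, 1)
residuals = measured - (slope * added + intercept)
see = residuals.std(ddof=2)          # standard error of estimate (mL)

print(f"slope={slope:.4f} intercept={intercept:.2f} SEE={see:.2f} mL")
```

A slope indistinguishable from 1 with a sub-milliliter standard error is what qualifies the apparatus as accurate and linear over the tested range.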

  10. Responsive Surface Methodology Optimizes Extraction Conditions of Industrial by-products, Camellia japonica Seed Cake

    PubMed Central

    Kim, Jae Kyeom; Lim, Ho-Jeong; Kim, Mi-So; Choi, Soo Jung; Kim, Mi-Jeong; Kim, Cho Rong; Shin, Dong-Hoon; Shin, Eui-Cheol

    2016-01-01

Background: The central nervous system is easily damaged by oxidative stress due to high oxygen consumption and poor defensive capacity. Hence, multiple studies have demonstrated that inhibiting oxidative stress-induced damage, through an antioxidant-rich diet, might be a reasonable approach to prevent neurodegenerative disease. Objective: In the present study, response surface methodology was utilized to optimize the extraction of neuro-protective constituents of Camellia japonica byproducts. Materials and Methods: Rat pheochromocytoma cells were used to evaluate the protective potential of Camellia japonica byproducts. Results: Optimum conditions were 33.84 min, 75.24%, and 75.82°C for time, ethanol concentration, and temperature. Further, we demonstrated that major organic acid contents were significantly impacted by the extraction conditions, which may explain the varying magnitude of protective potential between fractions. Conclusions: Given the paucity of information regarding defatted C. japonica seed cake and its health-promoting potential, our results herein provide interesting preliminary data for utilization of this byproduct from oil processing in both academic and industrial applications. SUMMARY Neuro-protective potential of C. japonica seed cake on cell viability was affected by extraction conditions. Extraction conditions effectively influenced the active constituents of C. japonica seed cake. Biological activity of C. japonica seed cake was optimized by response surface methodology. Abbreviations used: GC-MS: Gas chromatography-mass spectrometry, MTT: 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide, PC12 cells: Pheochromocytoma, RSM: Response surface methodology. PMID:27601847
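A minimal sketch of the response-surface idea: fit a full second-order model to designed-experiment data and locate the stationary point. The study optimized three factors; for brevity this sketch assumes a hypothetical two-factor response peaking near the reported ethanol and temperature optima:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical screening data: ethanol fraction (%) and temperature (degC),
# with a response (e.g. cell viability, %) peaking near 75.2% / 75.8 degC.
x1 = rng.uniform(40, 95, 30)        # ethanol, %
x2 = rng.uniform(50, 90, 30)        # temperature, degC
y = 80 - 0.02*(x1 - 75.2)**2 - 0.03*(x2 - 75.8)**2 + rng.normal(0, 0.5, 30)

# Full second-order (response surface) model:
# y = b0 + b1 x1 + b2 x2 + b11 x1^2 + b22 x2^2 + b12 x1 x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Locate the optimum of the fitted surface on a grid.
g1, g2 = np.meshgrid(np.linspace(40, 95, 200), np.linspace(50, 90, 200))
G = np.column_stack([np.ones(g1.size), g1.ravel(), g2.ravel(),
                     g1.ravel()**2, g2.ravel()**2, (g1*g2).ravel()])
i = int((G @ beta).argmax())
print(f"optimum ~ {g1.ravel()[i]:.1f}% ethanol, {g2.ravel()[i]:.1f} degC")
```

The same machinery extends to three factors by adding the extra linear, quadratic, and cross terms to the design matrix.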

  11. Aircraft emission measurements by remote sensing methodologies at airports

    NASA Astrophysics Data System (ADS)

    Schäfer, Klaus; Jahn, Carsten; Sturm, Peter; Lechner, Bernhard; Bacher, Michael

Emission indices of aircraft engine exhausts measured under real operating conditions, which are needed to calculate airport emission inventories precisely, have so far not been available. To determine these data, measurement campaigns were performed on idling aircraft at major European airports using non-intrusive spectroscopic methods such as Fourier transform infrared spectrometry and differential optical absorption spectroscopy. Emission indices for CO and NOx were calculated and compared to the values given in the International Civil Aviation Organisation (ICAO) database. Emission indices were determined for 36 different aircraft engine types for CO and for 24 different engine types for NOx. It was shown that for idling aircraft, CO emissions are underestimated using the ICAO database, while the emission indices for NOx determined in this work are lower than those given in the ICAO database. In addition, a high variance of emission indices within each aircraft family and from engine to engine of the same engine type was found. During the same measurement campaigns, the emission indices for CO and NO of eight different types of auxiliary power units were investigated.
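A sketch of the standard ratio method for deriving emission indices from open-path plume measurements: excess mixing ratios of a species are referenced to excess CO2, whose emission index follows from fuel stoichiometry. The kerosene value (~3160 g CO2 per kg fuel) and the plume numbers below are illustrative assumptions, not campaign values:

```python
# Emission-index estimate from path-averaged concentration ratios, as is
# common for open-path (FTIR/DOAS) exhaust-plume measurements.
M_CO2, M_CO, M_NO2 = 44.01, 28.01, 46.01   # molar masses, g/mol
EI_CO2 = 3160.0                            # g CO2 per kg kerosene burned (typical)

def emission_index(delta_x_ppm, delta_co2_ppm, molar_mass):
    """EI of species X in g per kg fuel, from excess mixing ratios
    measured in the plume relative to ambient background."""
    return (delta_x_ppm / delta_co2_ppm) * (molar_mass / M_CO2) * EI_CO2

# e.g. an idling-engine plume with 2.0 ppm excess CO per 30 ppm excess CO2:
print(round(emission_index(2.0, 30.0, M_CO), 1), "g CO / kg fuel")
```

NOx emission indices are conventionally reported as NO2-mass equivalent, hence the M_NO2 constant.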

  12. 39 CFR 3055.5 - Changes to measurement systems, service standards, service goals, or reporting methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., service goals, or reporting methodologies. 3055.5 Section 3055.5 Postal Service POSTAL REGULATORY... reporting methodologies. The Postal Service shall file notice with the Commission describing all changes to measurement systems, service standards, service goals or reporting methodologies, including the use of...

  13. The Role of Measuring in the Learning of Environmental Occupations and Some Aspects of Environmental Methodology

    ERIC Educational Resources Information Center

    István, Lüko

    2016-01-01

Methodology is a neglected area of pedagogy, and within environmental specialist teaching the cultivation of methodology is especially incomplete. In this article I attempt to present environmental methodology through one part of the University of West Environmental Education workshop, based on the measurement of experiential learning as an…

  14. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

The dissertation consists of three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. Both are applied to Force Concept Inventory data obtained from students enrolled at The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation concerns measures of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity. The currently popular measures of association fail under some extremely unbalanced conditions, yet the occurrence of such conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages; special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misusages of EFA are explored. The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills.
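The CTT side of the comparison can be sketched with simulated binary responses; a Rasch-like generating model is assumed here purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical scored responses: 200 students x 10 binary items
# (1 = correct), with items of varying difficulty.
ability = rng.normal(0, 1, (200, 1))
difficulty = np.linspace(-1.5, 1.5, 10)
prob_correct = 1.0 / (1.0 + np.exp(-(ability - difficulty)))
responses = (rng.random((200, 10)) < prob_correct).astype(int)

total = responses.sum(axis=1)

# Classical test theory item statistics:
p = responses.mean(axis=0)          # item difficulty (proportion correct)
# discrimination: corrected item-total (point-biserial) correlation
disc = np.array([np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
                 for j in range(10)])

print("difficulty:", np.round(p, 2))
print("discrimination:", np.round(disc, 2))
```

IRT would instead fit per-item parameters of the response curve itself, which is what makes it sensitive to item context in ways these sample proportions are not.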

  15. Weak measurement and Bohmian conditional wave functions

    SciTech Connect

    Norsen, Travis; Struyve, Ward

    2014-11-15

It was recently pointed out and demonstrated experimentally by Lundeen et al. that the wave function of a particle (more precisely, the wave function possessed by each member of an ensemble of identically-prepared particles) can be “directly measured” using weak measurement. Here it is shown that if this same technique is applied, with appropriate post-selection, to one particle from a perhaps entangled multi-particle system, the result is precisely the so-called “conditional wave function” of Bohmian mechanics. Thus, a plausibly operationalist method for defining the wave function of a quantum mechanical sub-system corresponds to the natural definition of a sub-system wave function which Bohmian mechanics uniquely makes possible. Similarly, a weak-measurement-based procedure for directly measuring a sub-system’s density matrix should yield, under appropriate circumstances, the Bohmian “conditional density matrix” as opposed to the standard reduced density matrix. Experimental arrangements to demonstrate this behavior (and thereby also reveal the non-local dependence of sub-system state functions on distant interventions) are suggested and discussed. - Highlights: • We study a “direct measurement” protocol for wave functions and density matrices. • Weakly measured states of entangled particles correspond to Bohmian conditional states. • Novel method of observing quantum non-locality is proposed.
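In symbols, the conditional wave function discussed above is obtained from the universal wave function Ψ by inserting the actual Bohmian configuration Y(t) of the other particles; sketched here for orientation, with the conditional density matrix written analogously:

```latex
\psi(x,t) = \Psi\bigl(x,\, Y(t),\, t\bigr),
\qquad
\rho_c(x,x',t) \;\propto\; \Psi\bigl(x,\, Y(t),\, t\bigr)\,\Psi^{*}\bigl(x',\, Y(t),\, t\bigr),
```

normalized by its trace. The dependence on Y(t), rather than an average over it, is what distinguishes the conditional density matrix from the standard reduced density matrix.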

  16. Measurement conditions in the space experiment MONICA

    NASA Astrophysics Data System (ADS)

    Bakaldin, A. V.; Karelin, A. V.; Koldashov, S. V.; Voronov, S. A.

    2016-02-01

The present contribution investigates the background conditions for cosmic-ray ion ionization-state measurements in the MONICA experiment [1]. The future MONICA experiment aims to study cosmic-ray ion fluxes from H to Ni in the energy range 10-300 MeV/n with a multilayer semiconductor telescope-spectrometer installed onboard a satellite. The satellite orbit parameters (circular, altitude about 600 km, polar) were chosen to realize a unique method for measuring the charge state of ions with energies >10 MeV/n, using the Earth's magnetic field as a separator of ion charge. An analysis of the background particle fluxes is presented, taking into account recent satellite-experiment data and the well-known AE-8 and AP-8 models, and recommendations are given to improve the background conditions for the MONICA experiment.

  17. Methodology for determining time-dependent mechanical properties of tuff subjected to near-field repository conditions

    SciTech Connect

    Blacic, J.D.; Andersen, R.

    1983-01-01

We have established a methodology to determine the time dependence of strength and transport properties of tuff under conditions appropriate to a nuclear waste repository. Exploratory tests to determine the approximate magnitudes of thermomechanical property changes are nearly complete. In this report we describe the capabilities of an apparatus designed to precisely measure the time-dependent deformation and permeability of tuff at simulated repository conditions. Preliminary tests with this new apparatus indicate that microclastic creep failure of tuff occurs over a narrow strain range with little precursory tertiary creep behavior. In one test, deformation under conditions of slowly decreasing effective pressure resulted in failure, whereas some strain indicators showed a decreasing rate of strain.

  18. Response Surface Methodology: An Extensive Potential to Optimize in vivo Photodynamic Therapy Conditions

    SciTech Connect

    Tirand, Loraine; Bastogne, Thierry; Bechet, Denise M.Sc.; Linder, Michel; Thomas, Noemie; Frochot, Celine; Guillemin, Francois; Barberi-Heyob, Muriel

    2009-09-01

    Purpose: Photodynamic therapy (PDT) is based on the interaction of a photosensitizing (PS) agent, light, and oxygen. Few new PS agents are being developed to the in vivo stage, partly because of the difficulty in finding the right treatment conditions. Response surface methodology, an empirical modeling approach based on data resulting from a set of designed experiments, was suggested as a rational solution with which to select in vivo PDT conditions by using a new peptide-conjugated PS targeting agent, neuropilin-1. Methods and Materials: A Doehlert experimental design was selected to model effects and interactions of the PS dose, fluence, and fluence rate on the growth of U87 human malignant glioma cell xenografts in nude mice, using a fixed drug-light interval. All experimental results were computed by Nemrod-W software and Matlab. Results: Intrinsic diameter growth rate, a tumor growth parameter independent of the initial volume of the tumor, was selected as the response variable and was compared to tumor growth delay and relative tumor volumes. With only 13 experimental conditions tested, an optimal PDT condition was selected (PS agent dose, 2.80 mg/kg; fluence, 120 J/cm{sup 2}; fluence rate, 85 mW/cm{sup 2}). Treatment of glioma-bearing mice with the peptide-conjugated PS agent, followed by the optimized PDT condition showed a statistically significant improvement in delaying tumor growth compared with animals who received the PDT with the nonconjugated PS agent. Conclusions: Response surface methodology appears to be a useful experimental approach for rapid testing of different treatment conditions and determination of optimal values of PDT factors for any PS agent.

  19. Calibration methodology for proportional counters applied to yield measurements of a neutron burst

    SciTech Connect

    Tarifeño-Saldivia, Ariel E-mail: atarisal@gmail.com; Pavez, Cristian; Soto, Leopoldo; Mayer, Roberto E.

    2014-01-15

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
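The charge-to-counts step can be sketched as a compound-Poisson estimate: the burst yield is the accumulated charge divided by the mean single-event charge, with an uncertainty combining Poisson counting and the single-event charge spread. The calibration constants below are illustrative, not from the paper:

```python
import math

# q_mean and q_std would come from a pulse-mode calibration with single
# neutron events; the values here are illustrative.
q_mean = 1.2e-12   # mean charge per detected neutron (C)
q_std = 0.4e-12    # spread of the single-event charge distribution (C)

def detected_events(q_total):
    """Point estimate and ~1-sigma uncertainty for the number of events.

    For a compound-Poisson process, Var(Q) = N * (q_std^2 + q_mean^2):
    Poisson fluctuation of N plus the per-event charge spread.
    """
    n = q_total / q_mean
    sigma = math.sqrt(n * (q_std**2 + q_mean**2)) / q_mean
    return n, sigma

n, s = detected_events(6.0e-9)   # hypothetical integrated burst charge
print(f"{n:.0f} +/- {s:.0f} detected neutrons")
```

The total neutron yield then follows by dividing the detected count by the calibrated detection efficiency.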

  20. A methodology for hard/soft information fusion in the condition monitoring of aircraft

    NASA Astrophysics Data System (ADS)

    Bernardo, Joseph T.

    2013-05-01

Condition-based maintenance (CBM) refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. Traditionally, CBM involves equipping machinery with electronic sensors that continuously monitor components and collect data for analysis. The addition of the multisensory capability of human cognitive functions (i.e., sensemaking, problem detection, planning, adaptation, coordination, naturalistic decision making) to traditional CBM may create a fuller picture of machinery condition. Cognitive systems engineering techniques provide an opportunity to utilize a dynamic resource—people acting as soft sensors. The literature is extensive on techniques to fuse data from electronic sensors, but little work exists on fusing data from humans with that from electronic sensors (i.e., hard/soft fusion). The purpose of my research is to explore, observe, investigate, analyze, and evaluate the fusion of pilot and maintainer knowledge, experiences, and sensory perceptions with digital maintenance resources. Hard/soft information fusion has the potential to increase problem detection capability, improve flight safety, and increase mission readiness. This proposed project consists of the creation of a methodology based upon the Living Laboratories framework, a research methodology built upon cognitive engineering principles [1]. This study performs a critical assessment of concept, which will support development of activities to demonstrate hard/soft information fusion in operationally relevant scenarios of aircraft maintenance. It consists of fieldwork and knowledge elicitation to inform a simulation and a prototype.

  1. A methodological framework for linking bioreactor function to microbial communities and environmental conditions.

    PubMed

    de los Reyes, Francis L; Weaver, Joseph E; Wang, Ling

    2015-06-01

    In the continuing quest to relate microbial communities in bioreactors to function and environmental and operational conditions, engineers and biotechnologists have adopted the latest molecular and 'omic methods. Despite the large amounts of data generated, gaining mechanistic insights and using the data for predictive and practical purposes is still a huge challenge. We present a methodological framework that can guide experimental design, and discuss specific issues that can affect how researchers generate and use data to elucidate the relationships. We also identify, in general terms, bioreactor research opportunities that appear promising. PMID:25710123

  2. Measurement-based auralization methodology for the assessment of noise mitigation measures

    NASA Astrophysics Data System (ADS)

    Thomas, Pieter; Wei, Weigang; Van Renterghem, Timothy; Botteldooren, Dick

    2016-09-01

    The effect of noise mitigation measures is generally expressed by noise levels only, neglecting the listener's perception. In this study, an auralization methodology is proposed that enables an auditive preview of noise abatement measures for road traffic noise, based on the direction dependent attenuation of a priori recordings made with a dedicated 32-channel spherical microphone array. This measurement-based auralization has the advantage that all non-road traffic sounds that create the listening context are present. The potential of this auralization methodology is evaluated through the assessment of the effect of an L-shaped mound. The angular insertion loss of the mound is estimated by using the ISO 9613-2 propagation model, the Pierce barrier diffraction model and the Harmonoise point-to-point model. The realism of the auralization technique is evaluated by listening tests, indicating that listeners had great difficulty in differentiating between a posteriori recordings and auralized samples, which shows the validity of the followed approaches.
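The core attenuation step, weighting each directional signal by its angular insertion loss before re-synthesis, can be sketched as follows; an imaginary 8-direction decomposition and made-up insertion-loss values stand in for the actual 32-channel array processing:

```python
import numpy as np

rng = np.random.default_rng(3)

# Per-direction signals from a (hypothetical) spherical-array decomposition.
n_dirs, n_samples = 8, 48000
channels = rng.normal(0, 1, (n_dirs, n_samples))
il_db = np.array([2, 5, 9, 14, 14, 9, 5, 2], float)  # insertion loss, dB

gains = 10.0 ** (-il_db / 20.0)                      # dB -> amplitude factor
mitigated = (gains[:, None] * channels).sum(axis=0)  # auralized mix

level_drop = 20 * np.log10(channels.sum(axis=0).std() / mitigated.std())
print(f"overall level change ~ {level_drop:.1f} dB")
```

In the actual methodology the per-direction insertion losses come from propagation models such as ISO 9613-2 or Harmonoise, not from arbitrary constants.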

  3. Optimization of hydrolysis conditions for bovine plasma protein using response surface methodology.

    PubMed

    Seo, Hyun-Woo; Jung, Eun-Young; Go, Gwang-Woong; Kim, Gap-Don; Joo, Seon-Tea; Yang, Han-Sul

    2015-10-15

The purpose of this study was to establish optimal conditions for the hydrolysis of bovine plasma protein. Response surface methodology was used to model and optimize responses [degree of hydrolysis (DH), 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical-scavenging activity and Fe(2+)-chelating activity]. Hydrolysis conditions, such as hydrolysis temperature (46.6-63.4 °C), hydrolysis time (98-502 min), and hydrolysis pH (6.32-9.68) were selected as the main processing conditions in the hydrolysis of bovine plasma protein. Optimal conditions for maximum DH (%), DPPH radical-scavenging activity (%) and Fe(2+)-chelating activity (%) of the hydrolyzed bovine plasma protein were respectively established. We discovered the following three conditions for optimal hydrolysis of bovine plasma: pH of 7.82-8.32, temperature of 54.1 °C, and time of 338.4-398.4 min. We consequently succeeded in hydrolyzing bovine plasma protein under these conditions and confirmed the various desirable properties of optimal hydrolysis. PMID:25952847

  4. Alcohol Measurement Methodology in Epidemiology: Recent Advances and Opportunities

    PubMed Central

    Greenfield, Thomas K.; Kerr, William C.

    2009-01-01

    Aim To review and discuss measurement issues in survey assessment of alcohol consumption for epidemiological studies. Methods The following areas are considered: implications of cognitive studies of question answering like self-referenced schemata of drinking, reference period and retrospective recall, as well as the assets and liabilities of types of current (e.g., food frequency, quantity frequency, graduated frequencies, and heavy drinking indicators) and lifetime drinking measures. Finally we consider units of measurement and improving measurement by detailing the ethanol content of drinks in natural settings. Results and conclusions Cognitive studies suggest inherent limitations in the measurement enterprise, yet diary studies show promise of broadly validating methods that assess a range of drinking amounts per occasion; improvements in survey measures of drinking in the life course are indicated; attending in detail to on and off-premise drink pour sizes and ethanol concentrations of various beverages shows promise of narrowing the coverage gap plaguing survey alcohol measurement. PMID:18422826
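A graduated-frequencies tally, one of the current measures named above, can be sketched as follows; the bracket midpoints, the 14 g standard drink, and the responses are all illustrative assumptions:

```python
# Graduated-frequencies estimate of annual ethanol intake: for each
# quantity bracket the respondent reports how often they drink that amount.
GRAMS_PER_DRINK = 14.0   # assumed "standard drink" of ethanol

# (drinks-per-occasion midpoint, occasions per year) -- hypothetical answers
gf_responses = [
    (12.0, 2),     # 11+ drinks: twice a year
    (9.5, 6),      # 8-10 drinks
    (6.0, 12),     # 5-7 drinks
    (3.5, 52),     # 3-4 drinks: weekly
    (1.5, 104),    # 1-2 drinks: twice a week
]

annual_grams = sum(q * f for q, f in gf_responses) * GRAMS_PER_DRINK
print(f"~{annual_grams/1000:.1f} kg ethanol/year, "
      f"{annual_grams/365/GRAMS_PER_DRINK:.2f} drinks/day")
```

The coverage-gap point in the abstract is precisely that the assumed grams-per-drink constant is unreliable unless pour sizes and beverage ethanol concentrations are measured in the field.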

  5. Methodology of the Westinghouse dynamic rod worth measurement technique

    SciTech Connect

    Chao, Y.A.; Chapman, D.M.; Easter, M.E.; Hill, D.J.; Hoerner, J.A. ); Kurtz, P.N. )

    1992-01-01

    During zero-power physics testing, plant operations personnel use one of various techniques to measure the reactivity worth of the control rods to confirm shutdown margin. A simple and fast procedure for measuring rod worths called dynamic rod worth measurement (DRWM) has been developed at Westinghouse. This procedure was tested at the recent startups of Point Beach Nuclear Power Plant Unit 1 cycle 20 and Unit 2 cycle 18. The results of these tests show that DRWM measures rod worths with accuracy comparable to that of both boron dilution and rod bank exchange measurements. The DRWM procedure is a fast process of measuring the reactivity worth of individual banks by inserting and withdrawing the bank continuously at the maximum stepping speed without changing the boron concentration and recording the signals of the ex-core detectors.

  6. Causality analysis in business performance measurement system using system dynamics methodology

    NASA Astrophysics Data System (ADS)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of this causality assumption have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, the Granger causality test, on 45 data points. However, the insufficiency of well-established causality models was found, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations of insufficient historical data. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts, using 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. The existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology were developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity, and extreme-condition tests were conducted on the developed SD model to ensure its ability to mimic reality, and its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very limited work has been done.
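The Granger step can be sketched as a restricted-versus-unrestricted lag-regression F-test; the two synthetic series below (with n = 45, matching the study's data points) are illustrative, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Two synthetic series in which x "Granger-causes" y with a one-step lag.
n = 45
x = rng.normal(0, 1, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.3 * y[t-1] + 0.8 * x[t-1] + rng.normal(0, 0.5)

def granger_f(y, x, p=1):
    """F-statistic: do p lags of x improve prediction of y beyond lags of y?"""
    Y = y[p:]
    ylags = np.column_stack([y[p-k:-k] for k in range(1, p+1)])
    xlags = np.column_stack([x[p-k:-k] for k in range(1, p+1)])
    ones = np.ones((len(Y), 1))
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0])**2)
    rss_r = rss(np.hstack([ones, ylags]))            # restricted model
    rss_u = rss(np.hstack([ones, ylags, xlags]))     # unrestricted model
    dof = len(Y) - (1 + 2 * p)
    return ((rss_r - rss_u) / p) / (rss_u / dof)

print("F(x -> y) =", round(granger_f(y, x), 2))  # large => x helps predict y
print("F(y -> x) =", round(granger_f(x, y), 2))  # reverse direction
```

With only 45 points, such tests lose power quickly as the lag order grows, which is consistent with the weak empirical support reported above.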

  7. Measurement-based reliability prediction methodology. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Linn, Linda Shen

    1991-01-01

    In the past, analytical and measurement based models were developed to characterize computer system behavior. An open issue is how these models can be used, if at all, for system design improvement. The issue is addressed here. A combined statistical/analytical approach to use measurements from one environment to model the system failure behavior in a new environment is proposed. A comparison of the predicted results with the actual data from the new environment shows a close correspondence.
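A minimal sketch of the combined statistical/analytical idea, assuming an exponential failure model fitted to measurements in one environment and a hypothetical proportional load factor for the new environment (the thesis's actual models are richer than this):

```python
import math

# Observations in the measured environment A (hypothetical numbers):
failures, hours = 23, 14_000.0
lam_a = failures / hours              # MLE failure rate (per hour)

# Analytical step: rescale for a new environment B assumed to stress
# the system 2.5x harder (proportional-hazards style assumption).
load_ratio = 2.5
lam_b = lam_a * load_ratio

def reliability(lam, t):
    """Probability of surviving t hours without failure (exponential model)."""
    return math.exp(-lam * t)

print(f"MTBF A: {1/lam_a:.0f} h, predicted MTBF B: {1/lam_b:.0f} h")
print(f"R_B(100 h) = {reliability(lam_b, 100):.3f}")
```

Comparing such predictions against failure data actually collected in environment B is the validation step the abstract describes.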

  8. A Methodology for Robust Multiproxy Paleoclimate Reconstructions and Modeling of Temperature Conditional Quantiles

    PubMed Central

    Janson, Lucas; Rajaratnam, Bala

    2014-01-01

    Great strides have been made in the field of reconstructing past temperatures based on models relating temperature to temperature-sensitive paleoclimate proxies. One of the goals of such reconstructions is to assess if current climate is anomalous in a millennial context. These regression based approaches model the conditional mean of the temperature distribution as a function of paleoclimate proxies (or vice versa). Some of the recent focus in the area has considered methods which help reduce the uncertainty inherent in such statistical paleoclimate reconstructions, with the ultimate goal of improving the confidence that can be attached to such endeavors. A second important scientific focus in the subject area is the area of forward models for proxies, the goal of which is to understand the way paleoclimate proxies are driven by temperature and other environmental variables. One of the primary contributions of this paper is novel statistical methodology for (1) quantile regression with autoregressive residual structure, (2) estimation of corresponding model parameters, (3) development of a rigorous framework for specifying uncertainty estimates of quantities of interest, yielding (4) statistical byproducts that address the two scientific foci discussed above. We show that by using the above statistical methodology we can demonstrably produce a more robust reconstruction than is possible by using conditional-mean-fitting methods. Our reconstruction shares some of the common features of past reconstructions, but we also gain useful insights. More importantly, we are able to demonstrate a significantly smaller uncertainty than that from previous regression methods. In addition, the quantile regression component allows us to model, in a more complete and flexible way than least squares, the conditional distribution of temperature given proxies. This relationship can be used to inform forward models relating how proxies are driven by temperature. PMID:25587203
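The quantile-regression component rests on the check (pinball) loss, whose minimizer is the desired quantile; a minimal intercept-only sketch (no proxies, no autoregressive residuals, synthetic data):

```python
import numpy as np

rng = np.random.default_rng(11)
temps = rng.normal(0, 1, 5000)      # a stand-in "temperature" sample

def pinball_loss(q, data, tau):
    """Average check (pinball) loss of candidate quantile q at level tau."""
    u = data - q
    return np.mean(np.where(u >= 0, tau * u, (tau - 1) * u))

tau = 0.9
grid = np.linspace(-3, 3, 601)
losses = [pinball_loss(q, temps, tau) for q in grid]
q_hat = grid[int(np.argmin(losses))]

# The minimizer of the pinball loss recovers the tau-quantile:
print(q_hat, np.quantile(temps, tau))
```

Full quantile regression replaces the constant q with a linear function of the proxies and minimizes the same loss, which is how the conditional distribution of temperature given proxies is modeled beyond its mean.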

  9. Optimization of hull-less pumpkin seed roasting conditions using response surface methodology.

    PubMed

    Vujasinović, Vesna; Radočaj, Olga; Dimić, Etelka

    2012-05-01

    Response surface methodology (RSM) was applied to optimize hull-less pumpkin seed roasting conditions before seed pressing to maximize the biochemical composition and antioxidant capacity of the virgin pumpkin oils obtained using a hydraulic press. Hull-less pumpkin seeds were roasted for various lengths of time (30 to 70 min) at various roasting temperatures (90 to 130 °C), resulting in 9 different oil samples, while the responses were phospholipids content, total phenols content, α- and γ-tocopherols, and antioxidative activity [by 2,2-diphenyl-1-picrylhydrazyl (DPPH) free-radical assay]. Mathematical models have shown that roasting conditions influenced all dependent variables at P < 0.05. The higher roasting temperatures had a significant effect (P < 0.05) on phospholipids, phenols, and α-tocopherols contents, while longer roasting time had a significant effect (P < 0.05) on γ-tocopherol content and antioxidant capacity, among the samples prepared under different roasting conditions. The optimum conditions for roasting the hull-less pumpkin seeds were 120 °C for duration of 49 min, which resulted in these oil concentrations: phospholipids 0.29%, total phenols 23.06 mg/kg, α-tocopherol 5.74 mg/100 g, γ-tocopherol 24.41 mg/100 g, and an antioxidative activity (EC(50)) of 27.18 mg oil/mg DPPH. PMID:23163936

  10. Optimizing ultrasonic ellagic acid extraction conditions from infructescence of Platycarya strobilacea using response surface methodology.

    PubMed

    Zhang, Liang-Liang; Xu, Man; Wang, Yong-Mei; Wu, Dong-Mei; Chen, Jia-Hong

    2010-11-01

    The infructescence of Platycarya strobilacea is a rich source of ellagic acid (EA) which has shown antioxidant, anticancer and antimutagen properties. Response surface methodology (RSM) was used to optimize the conditions for ultrasonic extraction of EA from infructescence of P. strobilacea. A central composite design (CCD) was used for experimental design and analysis of the results to obtain the optimal processing parameters. The content of EA in the extracts was determined by HPLC with UV detection. Three independent variables such as ultrasonic extraction temperature (°C), liquid:solid ratio (mL/g), and ultrasonic extraction time (min) were investigated. The experimental data obtained were fitted to a quadratic equation using multiple regression analysis and also analyzed by appropriate statistical methods. The 3-D response surface and the contour plots derived from the mathematical models were applied to determine the optimal conditions. The optimum ultrasonic extraction conditions were as follows: ultrasonic extraction temperature 70 °C, liquid:solid ratio 22.5, and ultrasonic extraction time 40 min. Under these conditions, the experimental percentage value was 1.961%, which is in close agreement with the value predicted by the model. PMID:21060299

  11. Aqueduct: a methodology to measure and communicate global water risks

    NASA Astrophysics Data System (ADS)

    Gassert, Francis; Reig, Paul

    2013-04-01

    , helping non-expert audiences better understand and evaluate risks facing water users. This presentation will discuss the methodology used to combine the indicator values into aggregated risk scores and lessons learned from working with diverse audiences in academia, development institutions, and the public and private sectors.

  12. Measuring Individual Differences in Decision Biases: Methodological Considerations.

    PubMed

    Aczel, Balazs; Bago, Bence; Szollosi, Aba; Foldes, Andrei; Lukacs, Bence

    2015-01-01

    Individual differences in people's susceptibility to heuristics and biases (HB) are often measured by multiple-bias questionnaires consisting of one or a few items for each bias. This research approach relies on the assumptions that (1) different versions of a decision bias task measure are interchangeable as they measure the same cognitive failure; and (2) that some combination of these tasks measures the same underlying construct. Based on these assumptions, in Study 1 we developed two versions of a new decision bias survey for which we modified 13 HB tasks to increase their comparability, construct validity, and the participants' motivation. The analysis of the responses (N = 1279) showed weak internal consistency within the surveys and a great level of discrepancy between the extracted patterns of the underlying factors. To explore these inconsistencies, in Study 2 we used three original examples of HB tasks for each of seven biases. We created three decision bias surveys by allocating one version of each HB task to each survey. The participants' responses (N = 527) showed a similar pattern as in Study 1, questioning the assumption that the different examples of the HB tasks are interchangeable and that they measure the same underlying construct. These results emphasize the need to understand the domain-specificity of cognitive biases as well as the effect of the wording of the cover story and the response mode on bias susceptibility before employing them in multiple-bias questionnaires. PMID:26635677
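
Internal consistency of a multi-item survey of this kind is typically summarized by Cronbach's alpha. A self-contained sketch with made-up response data, not the study's:

```python
# Cronbach's alpha for a multi-item survey.
# rows = respondents, columns = items; the scores below are invented.

def cronbach_alpha(scores):
    k = len(scores[0])                      # number of items
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Perfectly consistent items (every respondent answers all items alike)
# must yield alpha = 1.0:
consistent = [[1, 1, 1], [3, 3, 3], [5, 5, 5], [2, 2, 2]]
print(cronbach_alpha(consistent))

mixed = [[1, 4, 2], [3, 2, 5], [5, 1, 3], [2, 5, 1]]
print(round(cronbach_alpha(mixed), 3))
```

Items that disagree across respondents can drive alpha far below 1 (even negative), the kind of weak internal consistency the study reports.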

  14. Multi-Attribute Decision Theory methodology for pollution control measure analysis

    SciTech Connect

    Barrera Roldan, A.S.; Corona Juarez, A.; Hardie, R.W.; Thayer, G.R.

    1992-12-31

    A methodology based on Multi-Attribute Decision Theory was developed to prioritize air pollution control measures and strategies (sets of measures) for the Mexico City Metropolitan Area (MCMA). We developed a framework that takes into account economic, technical, environmental, social, political, and institutional factors to evaluate pollution mitigation measures and strategies through a decision analysis process. In a series of meetings with a panel of air pollution experts from different offices of the Mexican government, we developed General and Specific criteria for a decision analysis tree. With these tools, measures or strategies can be graded and a figure of merit assigned to each, so that they can be ranked. Two pollution mitigation measures were analyzed to test the methodology, and the results are presented. This methodology was developed specifically for Mexico City, though the experience gained in this work can be used to develop similar methodologies for other metropolitan areas throughout the world.
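
The grading-and-ranking step can be sketched as a weighted sum: each measure is graded on each criterion, the criterion weights sum to one, and the figure of merit is the weighted total. Criteria, grades and weights below are invented for illustration, not the MCMA panel's values:

```python
# Weighted-sum multi-attribute scoring and ranking of control measures.
criteria = ["economic", "feasibility", "environmental", "social"]
weights = {"economic": 0.3, "feasibility": 0.2,
           "environmental": 0.35, "social": 0.15}

measures = {
    "vehicle inspection": {"economic": 7, "feasibility": 8,
                           "environmental": 6, "social": 5},
    "fuel reformulation": {"economic": 4, "feasibility": 6,
                           "environmental": 9, "social": 7},
}

def figure_of_merit(grades):
    """Weighted sum of criterion grades (grades on a 0-10 scale)."""
    return sum(weights[c] * grades[c] for c in criteria)

ranked = sorted(measures, key=lambda m: figure_of_merit(measures[m]),
                reverse=True)
for name in ranked:
    print(name, round(figure_of_merit(measures[name]), 2))
```

Real multi-attribute analyses add value functions per criterion and sensitivity analysis on the weights; this sketch shows only the aggregation step.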

  16. [Methodological challenges in measurements of functional ability in gerontological research].

    PubMed

    Avlund, E K

    1997-10-20

    This article addresses the advantages and disadvantages of different methods in measuring functional ability with its main focus on frame of reference, operationalization, practical procedure, validity, discriminatory power, and responsiveness. When measuring functional ability it is recommended: 1) Always to consider the theoretical frame of reference as part of the validation process. 2) Always to assess the content validity of items before they are combined into an index and before performing tests for construct validity. 3) Not to combine mobility, PADL and IADL in the same index/scale. 4) Not to use IADL as a health-related functional ability measure or, if used, to ask whether problems with IADL or non-performance of IADL are caused by health-related factors. 5) Always to analyse functional ability separately for men and women. 6) To exclude the dead in analyses of change in functional ability if the focus is on predictors of deterioration in functional ability. PMID:9411957

  17. Conceptual and methodological advances in child-reported outcomes measurement

    PubMed Central

    Bevans, Katherine B; Riley, Anne W; Moon, JeanHee; Forrest, Christopher B

    2011-01-01

    Increasingly, clinical, pharmaceutical and translational research studies use patient-reported outcomes as primary and secondary end points. Obtaining this type of information from children themselves is now possible, but effective assessment requires developmentally sensitive conceptual models of child health and an appreciation for the rapid change in children’s cognitive capacities. To overcome these barriers, outcomes researchers have capitalized on innovations in modern measurement theory, qualitative methods for instrument development and new computerized technologies to create reliable and valid methods for obtaining self-reported health data among 8–17-year-old children. This article provides a developmentally focused framework for selecting child-report health assessment instruments. Several generic health-related quality of life instruments and the assessment tools developed by the NIH-sponsored Patient-Reported Outcome Measurement Information System network are discussed to exemplify advances in the measurement of children’s self-reported health, illness, wellbeing and quality of life. PMID:20715916

  18. Effective speed and agility conditioning methodology for random intermittent dynamic type sports.

    PubMed

    Bloomfield, Jonathan; Polman, Remco; O'Donoghue, Peter; McNaughton, Lars

    2007-11-01

    Different coaching methods are often used to improve performance. This study compared the effectiveness of 2 methodologies for speed and agility conditioning for random, intermittent, and dynamic activity sports (e.g., soccer, tennis, hockey, basketball, rugby, and netball) and the necessity for specialized coaching equipment. Two groups were delivered either a programmed method (PC) or a random method (RC) of conditioning with a third group receiving no conditioning (NC). PC participants used the speed, agility, quickness (SAQ) conditioning method, and RC participants played supervised small-sided soccer games. PC was also subdivided into 2 groups where participants either used specialized SAQ equipment or no equipment. A total of 46 (25 males and 21 females) untrained participants received (mean +/- SD) 12.2 +/- 2.1 hours of physical conditioning over 6 weeks between a battery of speed and agility parameter field tests. Two-way analysis of variance results indicated that both conditioning groups showed a significant decrease in body mass and body mass index, although PC achieved significantly greater improvements on acceleration, deceleration, leg power, dynamic balance, and the overall summation of % increases when compared to RC and NC (p < 0.05). PC in the form of SAQ exercises appears to be a superior method for improving speed and agility parameters; however, this study found that specialized SAQ equipment was not a requirement to observe significant improvements. Further research is required to establish whether these benefits transfer to sport-specific tasks as well as to the underlying mechanisms resulting in improved performance. PMID:18076227

  19. Methodology for reliability based condition assessment. Application to concrete structures in nuclear plants

    SciTech Connect

    Mori, Y.; Ellingwood, B.

    1993-08-01

    Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period.
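
The core of such a time-dependent reliability simulation can be sketched in a few lines: sample an initial strength, degrade it over time, draw an extreme load each year, and record the first year the load exceeds the strength. The distributions and degradation rate here are invented stand-ins for the report's models:

```python
# Monte Carlo sketch of time-dependent structural reliability.
import random

def cumulative_failure_prob(years, n_sims=5000, seed=42):
    rng = random.Random(seed)
    failures = [0] * (years + 1)        # failures[t] = count failed by year t
    for _ in range(n_sims):
        r0 = rng.gauss(100.0, 10.0)             # initial strength (invented)
        failed_at = None
        for t in range(1, years + 1):
            strength = r0 * (1.0 - 0.005 * t)   # 0.5%/yr linear degradation
            load = rng.lognormvariate(3.7, 0.3) # annual extreme load (invented)
            if load > strength:
                failed_at = t
                break
        if failed_at is not None:
            for t in range(failed_at, years + 1):
                failures[t] += 1
    return [f / n_sims for f in failures]

pf = cumulative_failure_prob(40)
print(round(pf[10], 4), round(pf[40], 4))
```

The cumulative failure probability is non-decreasing by construction; inspection/maintenance strategies would be compared by how they keep this curve below a target value.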

  20. Sensible organizations: technology and methodology for automatically measuring organizational behavior.

    PubMed

    Olguin Olguin, Daniel; Waber, Benjamin N; Kim, Taemie; Mohan, Akshay; Ara, Koji; Pentland, Alex

    2009-02-01

    We present the design, implementation, and deployment of a wearable computing platform for measuring and analyzing human behavior in organizational settings. We propose the use of wearable electronic badges capable of automatically measuring the amount of face-to-face interaction, conversational time, physical proximity to other people, and physical activity levels in order to capture individual and collective patterns of behavior. Our goal is to understand how patterns of behavior shape individuals and organizations. By using on-body sensors in large groups of people for extended periods of time in naturalistic settings, we have been able to identify, measure, and quantify social interactions, group behavior, and organizational dynamics. We deployed this wearable computing platform in a group of 22 employees working in a real organization over a period of one month. Using these automatic measurements, we were able to predict employees' self-assessments of job satisfaction and their own perceptions of group interaction quality by combining data collected with our platform and e-mail communication data. In particular, the total amount of communication was predictive of both of these assessments, and betweenness in the social network exhibited a high negative correlation with group interaction satisfaction. We also found that physical proximity and e-mail exchange had a negative correlation of r = -0.55 (p < 0.01), which has far-reaching implications for past and future research on social networks. PMID:19150759
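
The reported correlations are plain Pearson coefficients over per-person measurements. A pure-Python sketch with dummy data, not the study's badge measurements:

```python
# Pearson correlation between two per-person measurements
# (e.g. physical proximity vs. e-mail volume).
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data with an inverse relationship, echoing the negative
# proximity/e-mail correlation reported above:
proximity = [5, 9, 2, 7, 4, 8]
email     = [30, 12, 41, 20, 35, 15]
print(round(pearson_r(proximity, email), 3))
```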

  1. Methodologies for measuring residual stress distributions in epitaxial thin films

    NASA Astrophysics Data System (ADS)

    Liu, M.; Ruan, H. H.; Zhang, L. C.

    2013-01-01

    Residual stresses in a thin film deposited on a dissimilar substrate can bring about various interface or subsurface damages, such as delamination, dislocation, twinning and cracking. In high-performance integrated circuits and MEMS, too high a residual stress can significantly alter electronic properties. A proper residual stress characterization needs the description of full stress tensors and their variation with thickness. The difficulty is that film thickness measurement requires separate means, and direct measurement techniques for the task are not straightforward. This paper provides a simple method using X-ray diffraction (XRD) and Raman scattering for the measurement of residual stresses and their thickness dependence. Using an epitaxial silicon film on a sapphire substrate as an example, the paper demonstrates that the improved XRD technique can exploit multiple diffraction peaks to yield a highly accurate stress tensor. The co-existence of silicon and sapphire peaks in a Raman spectrum then allows simultaneous measurement of film thickness from the peak intensity ratio and of residual stress from the peak shift. The paper also establishes the relation between film thickness and residual stress.
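
The stress-from-peak-shift step reduces to a linear calibration, σ = Δω/k. The calibration factor depends on the material and the stress state; the value below is a placeholder for illustration, not a literature constant:

```python
# Raman peak shift to stress via a linear calibration (sigma = delta/k).
K_CM1_PER_GPA = -2.0   # hypothetical shift-per-stress calibration factor

def stress_from_shift(peak_cm1, reference_cm1, k=K_CM1_PER_GPA):
    """Stress in GPa from the measured vs. stress-free peak position."""
    return (peak_cm1 - reference_cm1) / k

# With this sign convention an upshifted peak implies compressive stress:
print(stress_from_shift(521.3, 520.5))
```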

  2. Conceptual and Methodological Issues in Treatment Integrity Measurement

    ERIC Educational Resources Information Center

    McLeod, Bryce D.; Southam-Gerow, Michael A.; Weisz, John R.

    2009-01-01

    This special series focused on treatment integrity in the child mental health and education field is timely. The articles do a laudable job of reviewing (a) the current status of treatment integrity research and measurement, (b) existing conceptual models of treatment integrity, and (c) the limitations of prior research. Overall, this thoughtful…

  3. Measure of Landscape Heterogeneity by Agent-Based Methodology

    NASA Astrophysics Data System (ADS)

    Wirth, E.; Szabó, Gy.; Czinkóczky, A.

    2016-06-01

    With the rapid increase of the world's population, efficient food production is one of the key factors of human survival. Since biodiversity and heterogeneity are the basis of sustainable agriculture, the authors set out to measure the heterogeneity of a chosen landscape. The EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity; nevertheless, exact measurements and calculations, apart from statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. This is achieved with so-called agent-based modelling, where randomly dispatched dynamic scouts record the observed land cover parameters and sum up the features of a new type of land. During the simulation the agents collect a Monte Carlo integral as a diversity landscape potential, which can be considered the unit of the 'greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.
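
The scout idea can be sketched as Monte Carlo sampling of a land-cover grid followed by a Shannon-type diversity index over the sampled class counts. The grid and sampling scheme here are invented stand-ins for the paper's ABM:

```python
# Monte Carlo "scouts": agents sample random cells of a land-cover grid;
# a Shannon index over sampled class proportions measures heterogeneity.
import math
import random
from collections import Counter

def sampled_shannon(grid, n_scouts, seed=1):
    rng = random.Random(seed)
    rows, cols = len(grid), len(grid[0])
    counts = Counter(grid[rng.randrange(rows)][rng.randrange(cols)]
                     for _ in range(n_scouts))
    total = sum(counts.values())
    # sum of -p*log(p); a single class gives a zero index
    return sum(-(c / total) * math.log(c / total) for c in counts.values())

uniform = [["maize"] * 10 for _ in range(10)]           # homogeneous cover
mosaic = [[("maize", "meadow", "orchard")[(r + c) % 3]  # mixed cover
           for c in range(10)] for r in range(10)]

print(sampled_shannon(uniform, 500))
print(round(sampled_shannon(mosaic, 500), 3))
```

The sampled index converges to the exact landscape value as the number of scouts grows, which is the Monte Carlo integral idea described above.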

  4. MEASUREMENT OF CARDIOPULMONARY FUNCTION BY REBREATHING METHODOLOGY IN PIGLETS

    EPA Science Inventory

    The use of a multiple gas rebreathing method for the measurement of cardiopulmonary function in mechanically ventilated neonates was evaluated. The following indices of cardiopulmonary function were assessed in 20 piglets (mean weight, 2.3 kg): (1) pulmonary capillary blood flow ...

  5. Post-retrieval extinction as reconsolidation interference: methodological issues or boundary conditions?

    PubMed Central

    Auber, Alessia; Tedesco, Vincenzo; Jones, Carolyn E.; Monfils, Marie-H.; Chiamulera, Christian

    2013-01-01

    Memories that are emotionally arousing generally promote the survival of species; however, the systems that modulate emotional learning can go awry, resulting in pathological conditions such as post-traumatic stress disorders, phobias, and addiction. Understanding the conditions under which emotional memories can be targeted is a major research focus as the potential to translate these methods into clinical populations carries important implications. It has been demonstrated that both fear and drug-related memories can be destabilised at their retrieval and require reconsolidation to be maintained. Therefore, memory reconsolidation offers a potential target period during which the aberrant memories underlying psychiatric disorders can be disrupted. Monfils et al. in 2009 have shown for the first time that safe information provided through an extinction session after retrieval (during the reconsolidation window) may update the original memory trace and prevent the return of fear in rats. In recent years several authors have then tested the effect of post-retrieval extinction on reconsolidation of either fear or drug related memories in both laboratory animals and humans. In this article we review the literature on post-reactivation extinction, discuss the differences across studies on the methodological ground, and review the potential boundary conditions that may explain existing discrepancies and limit the potential application of post-reactivation extinction approaches. PMID:23404065

  6. Optimization of culture conditions for flexirubin production by Chryseobacterium artocarpi CECT 8497 using response surface methodology.

    PubMed

    Venil, Chidambaram Kulandaisamy; Zakaria, Zainul Akmar; Ahmad, Wan Azlina

    2015-01-01

    Flexirubins are a unique type of bacterial pigment produced by bacteria of the genus Chryseobacterium; they are used in the treatment of chronic skin diseases such as eczema and may serve as a chemotaxonomic marker. Chryseobacterium artocarpi CECT 8497, a yellowish-orange pigment-producing strain, was investigated for maximum pigment production by optimizing medium composition using response surface methodology (RSM). Culture conditions affecting pigment production were optimized statistically in shake flask experiments. Lactose, l-tryptophan and KH2PO4 were the most significant variables affecting pigment production. A Box-Behnken design (BBD) and RSM analysis were adopted to investigate the interactions between variables and determine the optimal values for maximum pigment production. Evaluation of the experimental results showed that the optimum conditions for maximum pigment production (521.64 mg/L) in a 50 L bioreactor were lactose 11.25 g/L, l-tryptophan 6 g/L and KH2PO4 650 ppm. Production under optimized conditions increased 7.23-fold compared with production prior to optimization. The results of this study show that statistical optimization of medium composition and interaction effects enables shortlisting of the significant factors influencing maximum pigment production by Chryseobacterium artocarpi CECT 8497. In addition, this is the first report optimizing the process parameters for flexirubin-type pigment production from Chryseobacterium artocarpi CECT 8497. PMID:25979288

  7. A methodological approach of estimating resistance to flow under unsteady flow conditions

    NASA Astrophysics Data System (ADS)

    Mrokowska, M. M.; Rowiński, P. M.; Kalinowska, M. B.

    2015-10-01

    This paper presents an evaluation and analysis of resistance parameters (friction slope, friction velocity and the Manning coefficient) in unsteady flow. A methodology is proposed to enhance the evaluation of resistance through relations derived from the flow equations. Its main points are (1) to choose a resistance relation with regard to the channel shape and (2) the type of wave, (3) to choose an appropriate method to evaluate the slope of the water depth, and (4) to assess the uncertainty of the result. In addition to a critical analysis of existing methods, new approaches are presented: formulae for resistance parameters in a trapezoidal channel, and a translation method instead of Jones' formula to evaluate the gradient of flow depth. Measurements obtained from artificial dam-break flood waves in a small lowland watercourse made it possible to apply the method and to analyse to what extent resistance parameters vary in unsteady flow. The study demonstrates that friction slope and friction velocity results are more sensitive to the use of simplified formulae than the Manning coefficient (n). n is adequate as a flood-routing parameter but may be misleading when the trend of resistance with flow rate is crucial; friction slope or friction velocity is then the better choice.
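
The friction-slope relation for unsteady flow follows from the Saint-Venant momentum balance, and friction velocity and Manning's n follow from it. A sketch with hypothetical finite-difference inputs; the paper's channel-specific formulations differ in detail:

```python
# Friction slope from the dynamic (Saint-Venant) momentum balance:
#   S_f = S_0 - dh/dx - (V/g)*dV/dx - (1/g)*dV/dt
# then u* = sqrt(g*R*S_f) and Manning n = R^(2/3)*sqrt(S_f)/V.
import math

G = 9.81  # gravitational acceleration, m/s^2

def friction_slope(s0, dh_dx, v, dv_dx, dv_dt):
    return s0 - dh_dx - (v / G) * dv_dx - dv_dt / G

def manning_n(v, hydraulic_radius, s_f):
    return hydraulic_radius ** (2.0 / 3.0) * math.sqrt(s_f) / v

# Steady uniform flow: all gradients vanish and S_f equals the bed slope.
sf_uniform = friction_slope(s0=0.001, dh_dx=0.0, v=0.8, dv_dx=0.0, dv_dt=0.0)

# An unsteady snapshot with rising stage (made-up finite-difference values):
sf_wave = friction_slope(s0=0.001, dh_dx=-2.0e-4, v=0.8,
                         dv_dx=-1.0e-4, dv_dt=5.0e-4)
u_star = math.sqrt(G * 0.6 * sf_wave)   # hydraulic radius R = 0.6 m assumed
print(sf_uniform, round(sf_wave, 6), round(u_star, 4),
      round(manning_n(0.8, 0.6, sf_wave), 4))
```

Dropping the acceleration terms (the "simplified formulae" above) changes S_f and u* directly, while n, depending on the square root of S_f, is less affected.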

  8. [Measuring nursing care times--methodologic and documentation problems].

    PubMed

    Bartholomeyczik, S; Hunstein, D

    2001-08-01

    The time needed for nursing care is an important measurement as a basis for financing care. In Germany the Long Term Care Insurance (LTCI) reimburses nursing care depending on the time family caregivers need to complete selected activities. The LTCI recommends certain time ranges for these activities, which are wholly compensatory, as a basis for assessment. The purpose is to enhance assessment justice and comparability. Using the example of a German research project that investigated the duration of these activities and the reasons for differences, questions are raised about definition and interpretation problems. Definition problems arise because caring activities, especially in private households, are almost never performed as clearly defined modules; moreover, different activities are often performed simultaneously. The most important question, however, is what exactly time figures can say about the essentials of nursing care. PMID:12385262

  9. Optimization of microwave-assisted hot air drying conditions of okra using response surface methodology.

    PubMed

    Kumar, Deepak; Prasad, Suresh; Murthy, Ganti S

    2014-02-01

    Okra (Abelmoschus esculentus) was dried to a moisture level of 0.1 g water/g dry matter using a microwave-assisted hot air dryer. Response surface methodology was used to optimize the drying conditions based on specific energy consumption and quality of dried okra. The drying experiments were performed using a central composite rotatable design for three variables: air temperature (40-70 °C), air velocity (1-2 m/s) and microwave power level (0.5-2.5 W/g). The quality of dried okra was determined in terms of color change, rehydration ratio and hardness of texture. A second-order polynomial model was well fitted to all responses and high R(2) values (>0.8) were observed in all cases. The color change of dried okra was found higher at high microwave power and air temperatures. Rehydration properties were better for okra samples dried at higher microwave power levels. Specific energy consumption decreased with increase in microwave power due to decrease in drying time. The drying conditions of 1.51 m/s air velocity, 52.09 °C air temperature and 2.41 W/g microwave power were found optimum for product quality and minimum energy consumption for microwave-convective drying of okra. PMID:24493879

  10. Optimization of osmotic dehydration conditions of peach slices in sucrose solution using response surface methodology.

    PubMed

    Yadav, Baljeet Singh; Yadav, Ritika B; Jatain, Monika

    2012-10-01

    Osmotic dehydration (OD) conditions of peach slices were optimized using response surface methodology (RSM) with respect to sucrose concentration (50-70°B), immersion time (2-4 h) and process temperature (35-55 °C) for maximum water loss (WL), minimum solute gain (SG) and maximum rehydration ratio (RR) as response variables. A central composite rotatable design (CCRD) was used as experimental design. The models developed for all responses were significant. All model terms were significant in WL except the quadratic levels of sucrose concentration and temperature whereas in SG, linear terms of time and linear and quadratic terms of temperature were significant. All the terms except linear term of time and interaction term of time and sucrose concentration, were significant in RR. The optimized conditions were sucrose concentration = 69.9°B, time = 3.97 h and temperature = 37.63 °C in order to obtain WL of 28.42 (g/100 g of fresh weight), SG of 8.39 (g/100 g of fresh weight) and RR of 3.38. PMID:24082265
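
Optimizing several responses at once (maximize WL, minimize SG, maximize RR) is commonly handled with Derringer-type desirability functions combined by a geometric mean. A sketch with invented bounds, evaluated at the reported optimum responses:

```python
# Derringer-style desirability: map each response to [0, 1], combine by
# geometric mean. The lo/hi bounds below are invented for illustration.

def d_max(y, lo, hi):
    """1 at/above hi, 0 at/below lo, linear between (maximize y)."""
    return min(1.0, max(0.0, (y - lo) / (hi - lo)))

def d_min(y, lo, hi):
    """1 at/below lo, 0 at/above hi, linear between (minimize y)."""
    return min(1.0, max(0.0, (hi - y) / (hi - lo)))

def overall(wl, sg, rr):
    d1 = d_max(wl, 20.0, 30.0)   # water loss, g/100 g
    d2 = d_min(sg, 6.0, 12.0)    # solute gain, g/100 g
    d3 = d_max(rr, 2.5, 3.5)     # rehydration ratio
    return (d1 * d2 * d3) ** (1.0 / 3.0)

print(round(overall(28.42, 8.39, 3.38), 3))  # the reported optimum point
print(overall(30.0, 6.0, 3.5))               # ideal corner gives 1.0
```

An RSM optimizer then searches the fitted response models for the factor settings that maximize this overall desirability.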

  11. Optimization of conditions for anthocyanin and phenolic content extraction from purple sweet potato using response surface methodology.

    PubMed

    Ahmed, Maruf; Akter, Mst Sorifa; Eun, Jong-Bang

    2011-02-01

    Purple sweet potato flour could be used to enhance bioactive components such as phenolic compounds and anthocyanin content, which might serve as nutraceutical ingredients for formulated foods. Optimization of the anthocyanin and phenolic contents of purple sweet potato was investigated using response surface methodology. A face-centered cube design was used to investigate the effects of three independent variables: drying temperature (55-65°C), citric acid concentration (1-3% w/v) and soaking time (1-3 min). The optimal conditions for anthocyanin and phenolic contents were 62.91°C, 1.38%, 2.53 min and 60.94°C, 1.04%, 2.24 min, respectively. However, optimal conditions for anthocyanin content were not apparent. The experimental value of anthocyanin content was 19.78 mg/100 g and total phenolic content was 61.55 mg/g. These data showed that the experimental responses were reasonably close to the predicted responses. Therefore, the results showed that treated flours could be used to enhance the antioxidant activities of functional foods. PMID:20858156

  12. A combined linear optimisation methodology for water resources allocation in Alfeios River Basin (Greece) under uncertain and vague system conditions

    NASA Astrophysics Data System (ADS)

    Bekri, Eleni; Yannopoulos, Panayotis; Disse, Markus

    2013-04-01

    In the present study, a combined linear programming methodology, based on Li et al. (2010) and Bekri et al. (2012), is employed for optimizing water allocation under uncertain system conditions in the Alfeios River Basin, in Greece. The Alfeios River is a water resources system of great natural, ecological, social and economic importance for Western Greece, since it has the longest and highest flow rate watercourse in the Peloponnisos region. Moreover, the river basin was exposed in the last decades to a plethora of environmental stresses (e.g. hydrogeological alterations, intensively irrigated agriculture, surface and groundwater overexploitation and infrastructure developments), resulting in the degradation of its quantitative and qualitative characteristics. As in most Mediterranean countries, water resource management in Alfeios River Basin has been focused up to now on an essentially supply-driven approach. It is still characterized by a lack of effective operational strategies. Authority responsibility relationships are fragmented, and law enforcement and policy implementation are weak. The present regulated water allocation puzzle entails a mixture of hydropower generation, irrigation, drinking water supply and recreational activities. Under these conditions its water resources management is characterised by high uncertainty and by vague and imprecise data. The considered methodology has been developed in order to deal with uncertainties expressed as either probability distributions, or/and fuzzy boundary intervals, derived by associated α-cut levels. In this framework a set of deterministic submodels is studied through linear programming. The ad hoc water resources management and alternative management patterns in an Alfeios subbasin are analyzed and evaluated under various scenarios, using the above mentioned methodology, aiming to promote a sustainable and equitable water management. Li, Y.P., Huang, G.H. and S.L., Nie, (2010), Planning water resources
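
Each α-cut of such a fuzzy/interval formulation yields a deterministic linear program. A toy two-user water-allocation LP, solved by enumerating candidate vertices (valid here because a linear objective attains its optimum at a vertex of the feasible region); all numbers are illustrative:

```python
# Toy deterministic submodel: allocate water to two users to maximize
# total benefit, subject to an availability cap and per-user bounds.
from itertools import product

BENEFIT = (3.0, 2.0)          # benefit per unit of water, users 1 and 2
CAP = 10.0                    # total water available at this alpha-cut
BOUNDS = ((0.0, 6.0), (0.0, 8.0))

def feasible(x1, x2):
    return (BOUNDS[0][0] <= x1 <= BOUNDS[0][1]
            and BOUNDS[1][0] <= x2 <= BOUNDS[1][1]
            and x1 + x2 <= CAP + 1e-9)

# Candidate vertices: bound corners plus intersections with the cap line.
candidates = set(product((0.0, 6.0), (0.0, 8.0)))
candidates.update({(6.0, CAP - 6.0), (CAP - 8.0, 8.0)})

best = max((p for p in candidates if feasible(*p)),
           key=lambda p: BENEFIT[0] * p[0] + BENEFIT[1] * p[1])
print(best, BENEFIT[0] * best[0] + BENEFIT[1] * best[1])
```

In the interval/fuzzy setting, the same submodel is re-solved with the cap and benefits set to the lower and upper bounds of each α-cut, producing an interval of optimal allocations.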

  13. Optimization extraction conditions for improving phenolic content and antioxidant activity in Berberis asiatica fruits using response surface methodology (RSM).

    PubMed

    Belwal, Tarun; Dhyani, Praveen; Bhatt, Indra D; Rawal, Ranbeer Singh; Pande, Veena

    2016-09-15

    This study was designed, for the first time, to optimize the extraction of phenolic compounds and the antioxidant potential of Berberis asiatica fruits using response surface methodology (RSM). Solvent selection was based on preliminary experiments and a five-factor, three-level central composite design (CCD). Extraction temperature (X1), sample-to-solvent ratio (X3) and solvent concentration (X5) significantly affected the response variables. The quadratic model fitted well for all responses. Under optimal extraction conditions, the dried fruit sample was mixed with 80% methanol (pH 3.0) at a ratio of 1:50 and the mixture was heated at 80 °C for 30 min; the measured parameters were found to be in accordance with the predicted values. High Performance Liquid Chromatography (HPLC) analysis at the optimized condition revealed 6 phenolic compounds. The results suggest that optimization of the extraction conditions is critical for accurate quantification of phenolics and antioxidants in Berberis asiatica fruits, which may further be utilized in industrial extraction procedures. PMID:27080887

  14. Tracking MOV operability under degraded voltage condition by periodic test measurements

    SciTech Connect

    Hussain, B.; Behera, A.K.; Alsammarae, A.J.

    1996-12-31

    The purpose of this paper is to develop a methodology for evaluating the operability of Alternating Current (AC) Motor Operated Valves (MOVs) under degraded voltage conditions, based on the seating parameter measured during surveillance/testing. This approach will help resolve the Nuclear Regulatory Commission's (NRC's) concern on verifying the AC MOV's design basis capability through periodic testing.
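
The degraded-voltage check rests on induction-motor torque scaling roughly with the square of terminal voltage. A simplified sketch of the margin comparison with illustrative numbers; this is not the paper's or a regulatory method:

```python
# Torque available at degraded voltage vs. demand inferred from a
# periodic-test seating measurement (simplified sketch).

def available_torque(rated_torque, voltage, rated_voltage):
    """Induction-motor torque scales roughly as (V / V_rated)^2."""
    return rated_torque * (voltage / rated_voltage) ** 2

def operable(measured_seating_torque, rated_torque,
             degraded_voltage, rated_voltage):
    """True if torque at degraded voltage still covers the measured demand."""
    return available_torque(rated_torque, degraded_voltage,
                            rated_voltage) >= measured_seating_torque

# At 80% voltage (368 V on a 460 V motor) only 64% of rated torque remains:
print(available_torque(100.0, 368.0, 460.0))
print(operable(60.0, 100.0, 368.0, 460.0))
```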

  15. Measuring and Targeting Internal Conditions for School Effectiveness in the Free State of South Africa

    ERIC Educational Resources Information Center

    Kgaile, Abraham; Morrison, Keith

    2006-01-01

    A questionnaire-based methodology for constructing an overall index of school effectiveness, currently in use in the Free State of South Africa, is reported, focusing on within-school conditions. The article reports the construction and use of a straightforward instrument for measuring the effectiveness of key aspects of the…

  16. Mathematical simulation of power conditioning systems. Volume 1: Simulation of elementary units. Report on simulation methodology

    NASA Technical Reports Server (NTRS)

    Prajous, R.; Mazankine, J.; Ippolito, J. C.

    1978-01-01

    Methods and algorithms used for the simulation of elementary power conditioning units (buck, boost, and buck-boost converters, as well as shunt PWM) are described. Definitions are given of similar converters and reduced parameters. The various parts of the simulation to be carried out are dealt with: local stability, corrective networks, measurement of input-output impedance, and global stability. A simulation example is given.

  17. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology

    PubMed Central

    Arulmathi, P.; Elangovan, G.; Begum, A. Farjana

    2015-01-01

    The distillery industry is recognized as one of the most polluting industries in India, with a large amount of annual effluent production. In the present study, the optimization of electrochemical treatment process variables for removing the color and chemical oxygen demand (COD) of distillery spent wash, using Ti/Pt as an anode in batch mode, is reported. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operating variables, and COD and color removal efficiency were considered as response variables for optimization using response surface methodology. The indirect electrochemical-oxidation process variables were optimized using a Box-Behnken response surface design (BBD). The results showed that the electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm2, electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L. PMID:26491716
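
    A Box-Behnken design for four factors like the ones above (pH, current density, electrolysis time, electrolyte dose) places runs at the midpoints of the design cube's edges plus center points. The generator below is a generic sketch, not the authors' design matrix, and the center-point count is an assumption.

```python
from itertools import combinations, product

def box_behnken(k, n_center=1):
    """Coded Box-Behnken points: every pair of factors at all
    +/-1 combinations with the remaining factors at 0, plus
    n_center center points."""
    pts = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1.0, 1.0), repeat=2):
            pt = [0.0] * k
            pt[i], pt[j] = a, b
            pts.append(pt)
    return pts + [[0.0] * k for _ in range(n_center)]

runs = box_behnken(4, n_center=3)
print(len(runs))  # 4 * C(4,2) + 3 = 27 runs
```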

  18. Methodology of Blade Unsteady Pressure Measurement in the NASA Transonic Flutter Cascade

    NASA Technical Reports Server (NTRS)

    Lepicovsky, J.; McFarland, E. R.; Capece, V. R.; Jett, T. A.; Senyitko, R. G.

    2002-01-01

    In this report the methodology adopted to measure unsteady pressures on blade surfaces in the NASA Transonic Flutter Cascade under conditions of simulated blade flutter is described. The previous work done in this cascade reported that the oscillating cascade produced waves, which for some interblade phase angles reflected off the wind tunnel walls back into the cascade, interfered with the cascade unsteady aerodynamics, and contaminated the acquired data. To alleviate the problems with data contamination due to the back wall interference, a method of influence coefficients was selected for the future unsteady work in this cascade. In this approach only one blade in the cascade is oscillated at a time. The majority of the report is concerned with the experimental technique used and the experimental data generated in the facility. The report presents a list of all test conditions for the small amplitude of blade oscillations, and shows examples of some of the results achieved. The report does not discuss data analysis procedures like ensemble averaging, frequency analysis, and unsteady blade loading diagrams reconstructed using the influence coefficient method. Finally, the report presents the lessons learned from this phase of the experimental effort, and suggests the improvements and directions of the experimental work for tests to be carried out for large oscillation amplitudes.

  19. Technical Report on Preliminary Methodology for Enhancing Risk Monitors with Integrated Equipment Condition Assessment

    SciTech Connect

    Ramuhalli, Pradeep; Coles, Garill A.; Coble, Jamie B.; Hirt, Evelyn H.

    2013-09-17

    Small modular reactors (SMRs) generally include reactors with electric output of ~350 MWe or less (this cutoff varies somewhat but is substantially less than full-size plant output of 700 MWe or more). Advanced SMRs (AdvSMRs) refer to a specific class of SMRs based on modularization of advanced reactor concepts. AdvSMRs may provide a longer-term alternative to traditional light-water reactors (LWRs) and to SMRs based on the integral pressurized water reactor concepts currently being considered. Enhancing the affordability of AdvSMRs will be critical to ensuring wider deployment. AdvSMRs suffer from the loss of economies of scale inherent in small reactors when compared to large (greater than ~600 MWe output) reactors. Some of this loss can be recovered through reduced capital costs from smaller size, fewer components, modular fabrication processes, and the opportunity for modular construction. However, the controllable day-to-day costs of AdvSMRs will be dominated by operation and maintenance (O&M) costs. Technologies that help characterize real-time risk are important for controlling O&M costs. Risk monitors are used in current nuclear power plants to provide a point-in-time estimate of the system risk given the current plant configuration (e.g., equipment availability, operational regime, and environmental conditions). However, current risk monitors cannot support these capability requirements, as they do not take into account plant-specific normal, abnormal, and deteriorating states of active components and systems. This report documents technology developments that are a step towards enhancing risk monitors that, if integrated with supervisory plant control systems, can provide these capabilities and meet the goals of controlling O&M costs. The report describes research results from an initial methodology for enhanced risk monitors that integrates real-time information about equipment condition and probability of failure (POF) into risk monitors.

  20. An Updated Methodology for Enhancing Risk Monitors with Integrated Equipment Condition Assessment

    SciTech Connect

    Ramuhalli, Pradeep; Hirt, Evelyn H.; Coles, Garill A.; Bonebrake, Christopher A.; Ivans, William J.; Wootan, David W.; Mitchell, Mark R.

    2014-07-18

    Small modular reactors (SMRs) generally include reactors with electric output of ~350 MWe or less (this cutoff varies somewhat but is substantially less than full-size plant output of 700 MWe or more). Advanced SMRs (AdvSMRs) refer to a specific class of SMRs based on modularization of advanced reactor concepts. Enhancing the affordability of AdvSMRs will be critical to ensuring wider deployment, as AdvSMRs suffer from the loss of economies of scale inherent in small reactors when compared to large (greater than ~600 MWe output) reactors, and the controllable day-to-day costs of AdvSMRs will be dominated by operation and maintenance (O&M) costs. Technologies that help characterize real-time risk are important for controlling O&M costs. Risk monitors are used in current nuclear power plants to provide a point-in-time estimate of the system risk given the current plant configuration (e.g., equipment availability, operational regime, and environmental conditions). However, current risk monitors cannot support these capability requirements, as they do not take into account plant-specific normal, abnormal, and deteriorating states of active components and systems. This report documents technology developments towards enhancing risk monitors that, if integrated with supervisory plant control systems, can provide these capabilities and meet the goals of controlling O&M costs. The report describes research results on augmenting an initial methodology for enhanced risk monitors that integrate real-time information about equipment condition and probability of failure (POF) into risk monitors. Methods to propagate uncertainty through the enhanced risk monitor are evaluated. Available data to quantify the level of uncertainty and the POF of key components are examined for their relevance, and a status update of this data evaluation is described. Finally, we describe potential targets for developing new risk metrics that may be useful for studying trade-offs for economic…

  1. Measuring Soil Nitrogen Mineralization under Field Conditions

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Land application of animal manure is known to alter rates of nitrogen (N) mineralization in soils, but quantitative information concerning intensity and duration of these effects has been difficult to obtain under field conditions. We estimated net effects of manure on N mineralization in soils unde...

  2. The Methodology of Doppler-Derived Central Blood Flow Measurements in Newborn Infants

    PubMed Central

    de Waal, Koert A.

    2012-01-01

    Central blood flow (CBF) measurements are measurements of blood flow in and around the heart. They incorporate cardiac output, but also measurements of cardiac input and assessment of intra- and extracardiac shunts. CBF can be measured in the central circulation as right or left ventricular output (RVO or LVO) and/or as cardiac input measured at the superior vena cava (SVC flow). Assessment of shunts incorporates evaluation of the ductus arteriosus and the foramen ovale. This paper describes the methodology of CBF measurements in newborn infants. It provides a brief overview of the evolution of Doppler ultrasound blood flow measurements, the basic principles of Doppler ultrasound, and an overview of the methodologies used in the literature. A general guide for interpretation and normal values with suggested cutoffs of CBF are provided for clinical use. PMID:22291718
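
    Ventricular output is conventionally derived from the Doppler velocity-time integral, the outflow-tract cross-sectional area and the heart rate; the sketch below uses that standard formula with illustrative (not reference) input values, and any clinical cutoffs should come from the paper itself.

```python
import math

def ventricular_output(vti_cm, diameter_cm, hr_bpm, weight_kg):
    """Ventricular output in mL/kg/min from the velocity-time
    integral (cm), outflow-tract diameter (cm), heart rate (bpm)
    and body weight (kg)."""
    area = math.pi * (diameter_cm / 2.0) ** 2  # cm^2
    stroke_volume = vti_cm * area              # mL per beat
    return stroke_volume * hr_bpm / weight_kg  # mL/kg/min

# Illustrative values for a term newborn (assumed, not normative)
print(round(ventricular_output(10.0, 0.8, 150, 3.0)))  # 251 mL/kg/min
```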

  3. Methodological Conditions of Congruent Factors: A Comparison of EEG Frequency Structure Between Hemispheres.

    PubMed

    Andresen, B; Stemmler, G; Thom, E; Irrgang, E

    1984-01-01

    A study was conducted to examine congruence relations of EEG frequency factors between hemispheres. The central questions were aimed at procedural details of the factoring process which have the greatest "destructive" influence on congruence of factor patterns. On the basis of serially-rotated oblique solutions and evaluative simple structure criteria, two methodological results are reported: (a) the original principal components of two lead placements show a marked departure from an ideal congruence pattern, but can be forced into high congruence beyond 0.90 for the first twelve components by orthogonal Procrustes rotation; (b) after this matching of principal components the expected optimum of congruence is only achieved in rotated solutions showing an optimal simple structure in terms of the Index of Factorial Simplicity (Kaiser, 1974). Additionally, the study leads to two substantive electroencephalographic results: (c) the factorial frequency structure of spontaneous EEG from two homologous positions (C3, C4) of the hemispheres is highly congruent, so an identical measurement rationale for EEG frequency bands can be justified in asymmetry research; (d) the spectral dimensions of the alpha region show good correspondence to a recently published synoptic model of factor-analytically defined EEG bands (Andresen, Thom, Irrgang, & Stemmler, 1982). The weight system of the three alpha components in this model can be recommended for psychophysiologically oriented EEG research. PMID:26776065
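
    The congruence values cited in result (a) are typically Tucker's coefficient of congruence between factor loading vectors; a minimal sketch follows, with invented loadings for the two homologous leads.

```python
import math

def congruence(x, y):
    """Tucker's coefficient of congruence between two loading
    vectors; 1.0 means perfectly proportional patterns."""
    num = sum(a * b for a, b in zip(x, y))
    den = math.sqrt(sum(a * a for a in x) * sum(b * b for b in y))
    return num / den

c3 = [0.8, 0.7, 0.1, 0.0]  # invented loadings at lead C3
c4 = [0.7, 0.8, 0.0, 0.1]  # invented loadings at lead C4
print(round(congruence(c3, c4), 3))  # 0.982, i.e. "beyond 0.90"
```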

  4. A measurement methodology for dynamic angle of sight errors in hardware-in-the-loop simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Wen-pan; Wu, Jun-hui; Gan, Lin; Zhao, Hong-peng; Liang, Wei-wei

    2015-10-01

    In order to precisely measure the dynamic angle of sight for hardware-in-the-loop simulation, a dynamic measurement methodology was established and a measurement system was built. The errors and drifts, such as synchronization delay, CCD measurement error and drift, laser spot error on the diffuse reflection plane, and optical axis drift of the laser, were measured and analyzed. First, by analyzing and measuring the synchronization between the laser and the control data timing, an error control method was devised that lowered the synchronization delay to 21 μs. Then, the relationship between the CCD device and the laser spot position was calibrated precisely and fitted by two-dimensional surface fitting; CCD measurement error and drift were controlled below 0.26 mrad. Next, the angular resolution was calculated, and the laser spot error on the diffuse reflection plane was estimated to be 0.065 mrad. Finally, the optical axis drift of the laser was analyzed and measured and did not exceed 0.06 mrad. The measurement results indicate that the maximum of the combined errors and drifts of the measurement methodology is less than 0.275 mrad. The methodology can support dynamic angle-of-sight measurements of higher precision and larger scale.
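
    The stated total of 0.275 mrad is consistent with combining the three reported error terms in root-sum-square fashion under an independence assumption (our reading, not an explicit statement in the abstract):

```python
import math

def rss(errors):
    """Root-sum-square combination of independent error terms."""
    return math.sqrt(sum(e * e for e in errors))

# Reported terms (mrad): CCD error/drift, spot error, axis drift
print(round(rss([0.26, 0.065, 0.06]), 3))  # 0.275 mrad
```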

  5. Validation of a CFD Methodology for Variable Speed Power Turbine Relevant Conditions

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.; Giel, Paul W.; McVetta, Ashlie B.

    2013-01-01

    Analysis tools are needed to investigate aerodynamic performance of Variable-Speed Power Turbines (VSPT) for rotorcraft applications. The VSPT operates at low Reynolds numbers (transitional flow) and over a wide range of incidence. Previously, the capability of a published three-equation turbulence model to predict accurately the transition location for three-dimensional heat transfer problems was assessed. In this paper, the results of a post-diction exercise using a three-dimensional flow in a transonic linear cascade comprising VSPT blading are presented. The measured blade pressure distributions and exit total pressure and flow angles for two incidence angles corresponding to cruise (i = 5.8deg) and takeoff (i = -36.7deg) were used for this study. For the higher loading condition of cruise and the negative incidence condition of takeoff, overall agreement with data may be considered satisfactory but areas of needed improvement are also indicated.

  6. A coupled Bayesian and fault tree methodology to assess future groundwater conditions in light of climate change

    NASA Astrophysics Data System (ADS)

    Huang, J. J.; Du, M.; McBean, E. A.; Wang, H.; Wang, J.

    2014-08-01

    Maintaining acceptable groundwater levels while protecting ecosystems, particularly in arid areas, is a key measure against desertification. Due to complicated hydrological processes and their inherent uncertainties, investigations of groundwater recharge conditions are challenging, particularly in arid areas under changing climatic conditions. To assist planning to protect against desertification, a fault tree methodology, in conjunction with fuzzy logic and Bayesian data mining, is applied to Minqin Oasis, a highly vulnerable regime in northern China. A set of risk factors is employed within the fault tree framework, with fuzzy logic translating qualitative risk data into probabilities. Bayesian data mining is used to quantify the contribution of each risk factor to the final aggregated risk. The implications of both historical and future climate trends in temperature, precipitation and potential evapotranspiration (PET) are employed to assess water table changes under various future scenarios. The findings indicate that the water table will continue to drop at a rate of 0.6 m/yr in the future when climatic effects alone are considered, if agricultural and industrial production capacity remain at 2004 levels.
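
    The fuzzy-to-probability translation and fault tree aggregation can be sketched as follows. The rating table is invented for illustration; the OR-gate formula for independent basic events is the standard one.

```python
def rating_to_probability(rating):
    """Map a qualitative risk rating to a probability via
    membership midpoints (illustrative values, not the paper's)."""
    return {"low": 0.1, "medium": 0.3, "high": 0.6}[rating]

def or_gate(probs):
    """Top-event probability for independent basic events under
    an OR gate: P = 1 - prod(1 - p_i)."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

ratings = ("low", "medium", "high")
print(round(or_gate([rating_to_probability(r) for r in ratings]), 3))
```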

  7. Operant place conditioning measures examined using two nondrug reinforcers.

    PubMed

    Crowder, W F; Hutto, C W

    1992-04-01

    Detection of water and social play reinforcers by a place conditioning method based on instrumental conditioning was investigated in rats and compared to detection by conditioned place preference, a method currently used primarily to measure drug reinforcement. Operant place conditioning measures of reinforcement were choices between the reward and nonreward chambers during an apparatus exploration test and during a discrete-trials choice test and also, in some experiments, choices and latencies of chamber entry during training. Three of these four measures showed larger reinforcement effects than did the conditioned place preference measure of relative time spent in the reward chamber. By all reinforcement measures, conditioned place preference training was effective with water reinforcement but was ineffective with social reinforcement. Operant place conditioning was effective with both reinforcers by all measures. PMID:1594650

  8. Analytical Methodology Used To Assess/Refine Observatory Thermal Vacuum Test Conditions For the Landsat 8 Data Continuity Mission

    NASA Technical Reports Server (NTRS)

    Fantano, Louis

    2015-01-01

    Thermal and Fluids Analysis Workshop, Silver Spring, MD, NCTS 21070-15. The Landsat 8 Data Continuity Mission, which is part of the United States Geological Survey (USGS), launched February 11, 2013. A Landsat environmental test requirement mandated that test conditions bound worst-case flight thermal environments. This paper describes a rigorous analytical methodology applied to assess and refine proposed thermal vacuum test conditions, and the issues encountered in attempting to satisfy this requirement.

  9. Integrating Multi-Tiered Measurement Outcomes for Special Education Eligibility with Sequential Decision-Making Methodology

    ERIC Educational Resources Information Center

    Decker, Scott L.; Englund, Julia; Albritton, Kizzy

    2012-01-01

    Changes to federal guidelines for the identification of children with disabilities have supported the use of multi-tiered models of service delivery. This study investigated the impact of measurement methodology as used across numerous tiers in determining special education eligibility. Four studies were completed using a sample of inner-city…

  10. Computer Science and Technology: Measurement of Interactive Computing: Methodology and Application.

    ERIC Educational Resources Information Center

    Cotton, Ira W.

    This dissertation reports the development and application of a new methodology for the measurement and evaluation of interactive computing, applied to either the users of an interactive computing system or to the system itself, including the service computer and any communications network through which the service is delivered. The focus is on the…

  11. A BOUNDARY LAYER SAMPLING METHODOLOGY FOR MEASURING GASEOUS EMISSIONS FROM CAFO'S

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Various methodologies have been employed to measure CAFO emissions (e.g. flux chambers, lasers, and stationary towers), but these methods are usually limited in their ability to fully characterize the emission plume from a heterogeneous farm, and thus are limited in their ability to quantify total e...

  12. Measuring College Students' Alcohol Consumption in Natural Drinking Environments: Field Methodologies for Bars and Parties

    ERIC Educational Resources Information Center

    Clapp, John D.; Holmes, Megan R.; Reed, Mark B.; Shillington, Audrey M.; Freisthler, Bridget; Lange, James E.

    2007-01-01

    In recent years researchers have paid substantial attention to the issue of college students' alcohol use. One limitation to the current literature is an over reliance on retrospective, self-report survey data. This article presents field methodologies for measuring college students' alcohol consumption in natural drinking environments.…

  13. Extended Axiomatic Conjoint Measurement: A Solution to a Methodological Problem in Studying Fertility-Related Behaviors.

    ERIC Educational Resources Information Center

    Nickerson, Carol A.; McClelland, Gary H.

    1988-01-01

    A methodology is developed based on axiomatic conjoint measurement to accompany a fertility decision-making model. The usefulness of the model is then demonstrated via an application to a study of contraceptive choice (N=100 male and female family-planning clinic clients). Finally, the validity of the model is evaluated. (TJH)

  14. Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?

    ERIC Educational Resources Information Center

    Brondani, Mario; He, Sarah

    2013-01-01

    Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…

  15. 39 CFR 3055.5 - Changes to measurement systems, service standards, service goals, or reporting methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 39 Postal Service 1 2012-07-01 2012-07-01 false Changes to measurement systems, service standards, service goals, or reporting methodologies. 3055.5 Section 3055.5 Postal Service POSTAL REGULATORY COMMISSION PERSONNEL SERVICE PERFORMANCE AND CUSTOMER SATISFACTION REPORTING Annual Reporting of Service Performance Achievements § 3055.5 Changes...

  16. 39 CFR 3055.5 - Changes to measurement systems, service standards, service goals, or reporting methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 39 Postal Service 1 2014-07-01 2014-07-01 false Changes to measurement systems, service standards, service goals, or reporting methodologies. 3055.5 Section 3055.5 Postal Service POSTAL REGULATORY COMMISSION PERSONNEL SERVICE PERFORMANCE AND CUSTOMER SATISFACTION REPORTING Annual Reporting of...

  17. 39 CFR 3055.5 - Changes to measurement systems, service standards, service goals, or reporting methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 39 Postal Service 1 2013-07-01 2013-07-01 false Changes to measurement systems, service standards, service goals, or reporting methodologies. 3055.5 Section 3055.5 Postal Service POSTAL REGULATORY COMMISSION PERSONNEL SERVICE PERFORMANCE AND CUSTOMER SATISFACTION REPORTING Annual Reporting of...

  18. 42 CFR 486.318 - Condition: Outcome measures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SUPPLIERS Requirements for Certification and Designation and Conditions for Coverage: Organ Procurement Organizations Organ Procurement Organization Outcome Requirements § 486.318 Condition: Outcome measures. (a... of organs transplanted per standard criteria donor, including pancreata used for islet...

  19. 42 CFR 486.318 - Condition: Outcome measures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SUPPLIERS Requirements for Certification and Designation and Conditions for Coverage: Organ Procurement Organizations Organ Procurement Organization Outcome Requirements § 486.318 Condition: Outcome measures. (a... of organs transplanted per standard criteria donor, including pancreata used for islet...

  20. 42 CFR 486.318 - Condition: Outcome measures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... SUPPLIERS Requirements for Certification and Designation and Conditions for Coverage: Organ Procurement Organizations Organ Procurement Organization Outcome Requirements § 486.318 Condition: Outcome measures. (a... transplantation; (ii) The number of organs transplanted per expanded criteria donor, including pancreata used...

  1. 42 CFR 486.318 - Condition: Outcome measures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... SUPPLIERS Requirements for Certification and Designation and Conditions for Coverage: Organ Procurement Organizations Organ Procurement Organization Outcome Requirements § 486.318 Condition: Outcome measures. (a... transplantation; (ii) The number of organs transplanted per expanded criteria donor, including pancreata used...

  2. The development and application of an automatic boundary segmentation methodology to evaluate the vaporizing characteristics of diesel spray under engine-like conditions

    NASA Astrophysics Data System (ADS)

    Ma, Y. J.; Huang, R. H.; Deng, P.; Huang, S.

    2015-04-01

    Studying the vaporizing characteristics of diesel spray could greatly help to reduce engine emissions and improve performance. The high-speed schlieren imaging method is an important optical technique for investigating the macroscopic vaporizing morphological evolution of liquid fuel, and pre-combustion constant-volume combustion bombs are often used to simulate the high-pressure and high-temperature conditions occurring in diesel engines. Complicated background schlieren noise makes it difficult to segment the spray region in schlieren spray images. To tackle this problem, this paper develops a vaporizing spray boundary segmentation methodology based on an automatic threshold determination algorithm. The methodology was also used to quantify the macroscopic characteristics of vaporizing sprays, including tip penetration, near-field and far-field angles, and projected spray area and spray volume. The spray boundary segmentation methodology was realized in a MATLAB-based program. Comparisons were made between the spray characteristics obtained using the program method and those acquired using a manual method and the Hiroyasu prediction model. It is demonstrated that the methodology can segment and measure vaporizing sprays precisely and efficiently. Furthermore, the experimental results show that the spray angles were only slightly affected by the injection pressure at high temperature and high pressure and under inert conditions. A higher injection pressure leads to longer spray tip penetration and a larger projected area and volume, while elevating the ambient temperature can significantly promote the evaporation of cold fuel.
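
    The abstract does not spell out the automatic threshold determination algorithm; a common choice for bimodal intensity histograms like those of schlieren spray images is Otsu's method, sketched here in pure Python on a toy 10-level histogram.

```python
def otsu_threshold(hist):
    """Otsu's automatic threshold on a grey-level histogram:
    return the level maximizing between-class variance."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0.0   # weight of the class at or below the threshold
    sum0 = 0.0
    for t, h in enumerate(hist):
        w0 += h
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * h
        w1 = total - w0
        m0 = sum0 / w0                  # mean of the lower class
        m1 = (sum_all - sum0) / w1      # mean of the upper class
        var = w0 * w1 * (m0 - m1) ** 2  # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Bimodal toy histogram: dark background vs. bright spray region
hist = [8, 9, 7, 1, 0, 0, 1, 6, 9, 8]
print(otsu_threshold(hist))  # 3
```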

  3. ASSESSING THE CONDITION OF SOUTH CAROLINA'S ESTUARIES: A NEW APPROACH INVOLVING INTEGRATED MEASURES OF CONDITION

    EPA Science Inventory

    The South Carolina Estuarine and Coastal Assessment Program (SCECAP) was initiated in 1999 to assess the condition of the state's coastal habitats using multiple measures of water quality, sediment quality, and biological condition. Sampling has subsequently been expanded to incl...

  4. A system and methodology for measuring volatile organic compounds produced by hydroponic lettuce in a controlled environment

    NASA Technical Reports Server (NTRS)

    Charron, C. S.; Cantliffe, D. J.; Wheeler, R. M.; Manukian, A.; Heath, R. R.

    1996-01-01

    A system and methodology were developed for the nondestructive qualitative and quantitative analysis of volatile emissions from hydroponically grown 'Waldmann's Green' leaf lettuce (Lactuca sativa L.). Photosynthetic photon flux (PPF), photoperiod, and temperature were automatically controlled and monitored in a growth chamber modified for the collection of plant volatiles. The lipoxygenase pathway products (Z)-3-hexenal, (Z)-3-hexenol, and (Z)-3-hexenyl acetate were emitted by lettuce plants after the transition from the light period to the dark period. The volatile collection system developed in this study enabled measurements of volatiles emitted by intact plants, from planting to harvest, under controlled environmental conditions.

  5. An inverse problem for a class of conditional probability measure-dependent evolution equations

    NASA Astrophysics Data System (ADS)

    Mirzaev, Inom; Byrne, Erin C.; Bortz, David M.

    2016-09-01

    We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by partial differential equation models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach.

  6. Inferential Conditions in the Statistical Detection of Measurement Bias.

    ERIC Educational Resources Information Center

    Millsap, Roger E.; Meredith, William

    1992-01-01

    Inferential conditions in the statistical detection of measurement bias are discussed in the contexts of differential item functioning and predictive bias in educational and employment settings. It is concluded that bias measures that rely strictly on observed measures are not generally diagnostic of measurement bias or lack of bias. (SLD)

  7. Major methodological constraints to the assessment of environmental status based on the condition of benthic communities

    NASA Astrophysics Data System (ADS)

    Medeiros, João Paulo; Pinto, Vanessa; Sá, Erica; Silva, Gilda; Azeda, Carla; Pereira, Tadeu; Quintella, Bernardo; Raposo de Almeida, Pedro; Lino Costa, José; José Costa, Maria; Chainho, Paula

    2014-05-01

    The Marine Strategy Framework Directive (MSFD) was published in 2008 and requires Member States to take the necessary measures to achieve or maintain good environmental status in aquatic ecosystems by the year 2020. The MSFD indicates 11 qualitative descriptors for environmental status assessment, including seafloor integrity, using the condition of the benthic community as an assessment indicator. Member States will have to define monitoring programs for each of the MSFD descriptors based on those indicators in order to understand which areas are in a good environmental status and what measures need to be implemented to improve the status of areas that fail to achieve that major objective. Coastal and offshore marine waters are not frequently monitored in Portugal, and assessment tools have only been developed very recently with the implementation of the Water Framework Directive (WFD). The lack of historical data and knowledge on the constraints of benthic indicators in coastal areas requires the development of specific studies addressing this issue. The major objective of the current study was to develop and test an experimental design to assess the impacts of offshore projects. The experimental design consisted of the seasonal and interannual assessment of benthic invertebrate communities in the area of future implementation of the structures (impact) and two potential control areas 2 km from the impact area. Seasonal benthic samples were collected at nine random locations within the impact and control areas in two consecutive years. Metrics included in the Portuguese benthic assessment tool (P-BAT) were calculated, since this multimetric tool was proposed for the assessment of ecological status in Portuguese coastal areas under the WFD. Results indicated a high taxonomic richness in this coastal area, and no significant differences were found between impact and control areas, indicating the feasibility of establishing adequate control areas in marine

  8. Methodology for prediction of ecological impacts under real conditions in Mexico

    NASA Astrophysics Data System (ADS)

    Bojórquez-Tapia, Luis Antonio

    1989-09-01

    Most environmental impact assessments in Mexico are short-duration studies that tend to be descriptive rather than analytical and predictive. Therefore, a general methodology has been developed for the prediction and communication of ecological impacts in situations where time, funds, and information are major limitations. The approach involves two stages: (1) environmental characterization and (2) prediction and analysis of impacts. In stage 1, the assessment team is divided into autonomous groups; the groups work on the same land systems, which are characterized based on literature, aerial photography, and short field observations. In stage 2, the groups meet to evaluate impacts by means of interaction matrices and nonnumerical simulation models. This methodology has proved valuable for performing environmental impact statements for large engineering projects in Mexico.

  9. A Practical Methodology to Measure Unbiased Gas Chromatographic Retention Factor vs. Temperature Relationships

    PubMed Central

    Peng, Baijie; Kuo, Mei-Yi; Yang, Panhia; Hewitt, Joshua T.; Boswell, Paul G.

    2014-01-01

    Compound identification continues to be a major challenge. Gas chromatography-mass spectrometry (GC-MS) is a primary tool used for this purpose, but the GC retention information it provides is underutilized because existing retention databases are experimentally restrictive and unreliable. A methodology called “retention projection” has the potential to overcome these limitations, but it requires the retention factor (k) vs. T relationship of a compound to calculate its retention time. Direct methods of measuring k vs. T relationships from a series of isothermal runs are tedious and time-consuming. Instead, a series of temperature programs can be used to quickly measure the k vs. T relationships, but they are generally not as accurate when measured this way because they are strongly biased by non-ideal behavior of the GC system in each of the runs. In this work, we overcome that problem by using the retention times of 25 n-alkanes to back-calculate the effective temperature profile and hold-up time vs. T profiles produced in each of six temperature programs. When the profiles were measured this way and taken into account, the k vs. T relationships measured from each of two different GC-MS instruments were nearly as accurate as the ones measured isothermally, showing less than 2-fold more error. Furthermore, temperature-programmed retention times calculated in five other labs from the new k vs. T relationships had the same distribution of error as when they were calculated from k vs. T relationships measured isothermally. Free software was developed to make the methodology easy to use. The new methodology potentially provides a relatively fast and easy way to measure unbiased k vs. T relationships. PMID:25496658
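    The k vs. T relationship at the heart of retention projection can be illustrated with a toy calculation. This is a minimal sketch, not the paper's procedure: the two-parameter form ln k = A + B/T, the temperatures, and the hold-up time are all assumptions chosen for illustration.

```python
import math

# Retention factor from an isothermal run: k = (t_R - t_M) / t_M.
def retention_factor(t_r, t_m):
    return (t_r - t_m) / t_m

# Ordinary least-squares line y = a + b*x.
def linfit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Synthetic isothermal measurements for a hypothetical compound, assuming
# the two-parameter form ln k = A + B/T.
T = [400.0, 420.0, 440.0, 460.0]   # column temperature, K
t_m = 1.0                          # hold-up time, min
A_true, B_true = -10.0, 5500.0
t_r = [t_m * (1.0 + math.exp(A_true + B_true / Ti)) for Ti in T]

# Fit ln k against 1/T to recover the k vs. T relationship.
x = [1.0 / Ti for Ti in T]
y = [math.log(retention_factor(tr, t_m)) for tr in t_r]
A_fit, B_fit = linfit(x, y)
print(round(A_fit, 3), round(B_fit, 1))   # → -10.0 5500.0
```

    Once A and B are known, k (and hence a temperature-programmed retention time) can be predicted at any temperature in the calibrated range.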

  10. Validation of a Monte Carlo Based Depletion Methodology Using HFIR Post-Irradiation Measurements

    SciTech Connect

    Chandler, David; Maldonado, G Ivan; Primm, Trent

    2009-11-01

    Post-irradiation uranium isotopic atomic densities within the core of the High Flux Isotope Reactor (HFIR) were calculated and compared to uranium mass spectrographic data measured in the late 1960s and early 1970s [1]. This study was performed in order to validate a Monte Carlo based depletion methodology for calculating the burn-up dependent nuclide inventory, specifically the post-irradiation uranium

  11. Methodologies for measuring travelers' risk perception of infectious diseases: A systematic review.

    PubMed

    Sridhar, Shruti; Régner, Isabelle; Brouqui, Philippe; Gautret, Philippe

    2016-01-01

    Numerous studies in the past have stressed the importance of travelers' psychology and perception in the implementation of preventive measures. The aim of this systematic review was to identify the methodologies used in studies reporting on travelers' risk perception of infectious diseases. A systematic search for relevant literature was conducted according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. There were 39 studies identified. In 35 of 39 studies, the methodology used was that of a knowledge, attitude and practice (KAP) survey based on questionnaires. One study used a combination of questionnaires and a visual psychometric measuring instrument called the 'pictorial representation of illness and self-measurement' (PRISM). One study used a self-representation model (SRM) method. Two studies measured psychosocial factors. Valuable information was obtained from KAP surveys showing an overall lack of knowledge among travelers about the most frequent travel-associated infections and associated preventive measures. This methodological approach, however, is mainly descriptive, addressing knowledge, attitudes, and practices separately and lacking an examination of the interrelationships between these three components. Another limitation of the KAP method is underestimating psychosocial variables that have proved influential in health related behaviors, including perceived benefits and costs of preventive measures, perceived social pressure, perceived personal control, unrealistic optimism and risk propensity. Future risk perception studies in travel medicine should consider psychosocial variables with inferential and multivariate statistical analyses. The use of implicit measurements of attitudes could also provide new insights in the field of travelers' risk perception of travel-associated infectious diseases. PMID:27238906

  12. A methodology to determine boundary conditions from forced convection experiments using liquid crystal thermography

    NASA Astrophysics Data System (ADS)

    Jakkareddy, Pradeep S.; Balaji, C.

    2016-05-01

    This paper reports the results of an experimental study to estimate the heat flux and convective heat transfer coefficient using liquid crystal thermography and Bayesian inference in a heat generating sphere, enclosed in a cubical Teflon block. The geometry considered for the experiments comprises a heater inserted in a hollow hemispherical aluminium ball, resulting in a volumetric heat generation source that is placed at the center of the Teflon block. Calibrated thermochromic liquid crystal sheets are used to capture the temperature distribution at the front face of the Teflon block. The forward model is the three dimensional conduction equation which is solved within the Teflon block to obtain steady state temperatures, using COMSOL. Match up experiments are carried out for various velocities by minimizing the residual between TLC and simulated temperatures for every assumed loss coefficient, to obtain a correlation of average Nusselt number against Reynolds number. This is used for prescribing the boundary condition for the solution to the forward model. A surrogate model obtained by artificial neural network built upon the data from COMSOL simulations is used to drive a Markov Chain Monte Carlo based Metropolis Hastings algorithm to generate the samples. Bayesian inference is adopted to solve the inverse problem for determination of heat flux and heat transfer coefficient from the measured temperature field. Point estimates of the posterior like the mean, maximum a posteriori and standard deviation of the retrieved heat flux and convective heat transfer coefficient are reported. Additionally the effect of number of samples on the performance of the estimation process has been investigated.
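    The Metropolis-Hastings step described above can be sketched in miniature. Everything here is hypothetical: a one-parameter linear surrogate stands in for the COMSOL/ANN forward model, and the sensitivities, noise level and prior range are invented for illustration.

```python
import math, random

random.seed(0)

# Hypothetical linearized forward model: sensor temperature rise is
# proportional to the applied heat flux q (a stand-in for the surrogate).
def forward(q):
    return [25.0 + q * r for r in (0.08, 0.10, 0.12)]   # assumed sensitivities

q_true, sigma = 50.0, 0.5                 # "true" flux and measurement noise
data = [t + random.gauss(0.0, sigma) for t in forward(q_true)]

def log_post(q):
    if not 0.0 < q < 200.0:               # flat prior on a plausible range
        return -math.inf
    return -sum((d - m) ** 2 for d, m in zip(data, forward(q))) / (2 * sigma ** 2)

# Random-walk Metropolis-Hastings over q.
q, lp, samples = 100.0, log_post(100.0), []
for _ in range(20000):
    q_new = q + random.gauss(0.0, 2.0)
    lp_new = log_post(q_new)
    if math.log(random.random()) < lp_new - lp:
        q, lp = q_new, lp_new
    samples.append(q)

burn = samples[5000:]                     # discard burn-in
mean = sum(burn) / len(burn)
sd = math.sqrt(sum((s - mean) ** 2 for s in burn) / len(burn))
print(round(mean, 1), round(sd, 2))       # posterior mean near 50, small spread
```

    The posterior mean and standard deviation correspond to the point estimates reported in the paper; a real run would sample heat flux and heat transfer coefficient jointly.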

  13. Testing VOC emission measurement techniques in wood-coating industrial processes and developing a cost-effective measurement methodology.

    PubMed

    Ojala, S; Lassi, U; Keiski, R L

    2006-01-01

    The availability of reliable measurements of concentrated volatile organic compound (VOC) emissions is of great significance in facilitating the selection of a feasible emission abatement technique. There are numerous methods that can be used to measure VOC emissions; however, no single method allows sampling of the whole range of volatile organics. In addition, research efforts are usually directed to the development of methods for measuring VOCs in diluted concentrations. Therefore, there is a need for a novel measurement method that can give reliable results while entailing simple operations and low costs. This paper presents the development of such a reliable measurement procedure. A methodology is proposed and used to measure solvent emissions from coating processes. PMID:15893795

  14. A methodology to condition distorted acoustic emission signals to identify fracture timing from human cadaver spine impact tests.

    PubMed

    Arun, Mike W J; Yoganandan, Narayan; Stemper, Brian D; Pintar, Frank A

    2014-12-01

    While acoustic sensors have been used to determine fracture initiation time in biomechanical studies, a systematic procedure for processing acoustic signals has not been established. The objective of the study was to develop a methodology to condition distorted acoustic emission data using signal processing techniques to identify fracture initiation time. The methodology was developed from testing a human cadaver lumbar spine column. Acoustic sensors were glued to all vertebrae, high-rate impact loading was applied, load-time histories were recorded (load cell), and fracture was documented using CT. Compression fracture occurred to L1 while the other vertebrae were intact. FFTs of the raw voltage-time traces were used to determine an optimum frequency range associated with high decibel levels. Signals were bandpass filtered in this range. A bursting pattern was found in the fractured vertebra, while signals from the other vertebrae were silent. The bursting time was associated with the time of fracture initiation. The force at fracture was determined using this time and the force-time data. The methodology is independent of parameters selected a priori, such as fixed voltage levels or bandpass frequencies, and does not rely on the force-time signal alone; it allows determination of force based on the time identified during signal processing. The methodology can be used for different body regions in cadaver experiments. PMID:25241279
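    The burst-detection step (locating the onset of bursting in a bandpass-filtered channel) can be approximated with a short-window energy threshold. This sketch uses a synthetic, already band-limited trace; the sampling rate, burst time, decay rate and threshold factor are all assumptions, not values from the study.

```python
import math, random

random.seed(1)

fs = 100_000              # sampling rate, Hz (assumed)
n = 2000
t_burst = 0.012           # "true" fracture-related burst onset, s

# Synthetic trace: low-level noise plus a decaying high-frequency burst,
# standing in for one bandpass-filtered acoustic-emission channel.
signal = []
for i in range(n):
    t = i / fs
    s = random.gauss(0.0, 0.02)
    if t >= t_burst:
        s += math.exp(-(t - t_burst) * 2000.0) * math.sin(2 * math.pi * 30_000 * t)
    signal.append(s)

# Short-window RMS energy; the burst onset is the first window whose RMS
# crosses a noise-based threshold (5x the RMS of the leading quiet segment).
win = 20
noise_rms = math.sqrt(sum(v * v for v in signal[:200]) / 200)
onset = None
for i in range(n - win):
    rms = math.sqrt(sum(v * v for v in signal[i:i + win]) / win)
    if rms > 5.0 * noise_rms:
        onset = i / fs
        break
print(onset)   # close to 0.012 s; force at fracture is then read off the load-time trace
```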

  15. A novel methodology for interpreting air quality measurements from urban streets using CFD modelling

    NASA Astrophysics Data System (ADS)

    Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming

    2011-09-01

    In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocity and direction. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios, by applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients being proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary, the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid understanding of long-term air quality measurements, and help assess the representativeness of monitoring locations for population
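    Step (iii), combining the CFD fields by scenario frequency, reduces to a weighted average. A minimal sketch with invented scenario fields and frequencies (four wind sectors, three receptor points):

```python
# Normalised concentrations at 3 receptor points for 4 hypothetical wind
# scenarios (values are illustrative, not from the Stratford Road study).
fields = {
    "N": [1.8, 0.9, 0.4],
    "S": [0.5, 1.2, 2.0],
    "E": [1.1, 1.0, 0.8],
    "W": [0.7, 1.4, 1.1],
}
# Observed frequency of each scenario over the averaging period (sums to 1).
freq = {"N": 0.40, "S": 0.25, "E": 0.20, "W": 0.15}

# Long-term mean at each receptor = frequency-weighted sum over scenarios.
n_points = 3
avg = [sum(freq[s] * fields[s][p] for s in fields) for p in range(n_points)]
print([round(c, 3) for c in avg])   # → [1.17, 1.07, 0.985]
```

    The same weighting applies whether the averaging window is a month or a year; only the frequencies change.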

  16. Optimisation of Ultrasound-Assisted Extraction Conditions for Phenolic Content and Antioxidant Capacity from Euphorbia tirucalli Using Response Surface Methodology

    PubMed Central

    Vuong, Quan V.; Goldsmith, Chloe D.; Dang, Trung Thanh; Nguyen, Van Tang; Bhuyan, Deep Jyoti; Sadeqzadeh, Elham; Scarlett, Christopher J.; Bowyer, Michael C.

    2014-01-01

    Euphorbia tirucalli (E. tirucalli) is now widely distributed around the world and is well known as a source of traditional medicine in many countries. This study aimed to utilise response surface methodology (RSM) to optimise ultrasound-assisted extraction (UAE) conditions for total phenolic compounds (TPC) and antioxidant capacity from E. tirucalli leaf. The results showed that ultrasonic temperature, time and power affected TPC and antioxidant capacity; however, the effects varied. Ultrasonic power had the strongest influence on TPC, whereas ultrasonic temperature had the greatest impact on antioxidant capacity. Ultrasonic time had the least impact on both TPC and antioxidant capacity. The optimum UAE conditions were determined to be 50 °C, 90 min and 200 W. Under these conditions, the E. tirucalli leaf extract yielded 2.93 mg GAE/g FW of TPC and exhibited potent antioxidant capacity. These conditions can be utilised for further isolation and purification of phenolic compounds from E. tirucalli leaf. PMID:26785074
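    A one-factor slice of such an RSM fit can be sketched as quadratic least squares whose stationary point gives the optimum level. The temperature levels and responses below are illustrative, not the study's data.

```python
# One-factor slice of a response surface: fit y = a0 + a1*x + a2*x^2 by
# least squares, then locate the stationary point (hypothetical values).
levels = [30.0, 40.0, 50.0, 60.0, 70.0]   # ultrasonic temperature, °C
tpc    = [2.10, 2.55, 2.90, 2.80, 2.35]   # hypothetical TPC response

# Normal equations M a = v for the quadratic model.
X = [[1.0, x, x * x] for x in levels]
M = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
v = [sum(r[i] * y for r, y in zip(X, tpc)) for i in range(3)]

# Tiny Gaussian elimination with partial pivoting.
for col in range(3):
    piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
    M[col], M[piv] = M[piv], M[col]
    v[col], v[piv] = v[piv], v[col]
    for r in range(col + 1, 3):
        f = M[r][col] / M[col][col]
        for c in range(col, 3):
            M[r][c] -= f * M[col][c]
        v[r] -= f * v[col]
a = [0.0, 0.0, 0.0]
for r in (2, 1, 0):
    a[r] = (v[r] - sum(M[r][c] * a[c] for c in range(r + 1, 3))) / M[r][r]

x_opt = -a[1] / (2 * a[2])   # stationary point of the fitted quadratic
print(round(x_opt, 1))       # → 52.3
```

    A full central composite design does the same thing with three factors and their interaction terms, and the optimum is the stationary point of the fitted surface.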

  17. Sensitive Measures of Condition Change in EEG Data

    SciTech Connect

    Hively, L.M.; Gailey, P.C.; Protopopescu, V.

    1999-03-10

    We present a new, robust, model-independent technique for measuring condition change in nonlinear data. We define indicators of condition change by comparing distribution functions (DF) defined on the attractor for time windowed data sets via L1-distance and χ² statistics. The new measures are applied to EEG data with the objective of detecting the transition between non-seizure and epileptic brain activity in an accurate and timely manner. We find a clear superiority of the new metrics in comparison to traditional nonlinear measures as discriminators of condition change.
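    The two dissimilarity measures can be sketched directly on binned distribution functions. The symmetric χ² form (a−b)²/(a+b) used here is one common variant (an assumption, since the abstract does not spell out the exact statistic), and the bin values are invented.

```python
# Dissimilarity between two binned distribution functions (DFs), as used to
# flag condition change between a reference window and a test window.
def l1_distance(p, q):
    return sum(abs(a - b) for a, b in zip(p, q))

def chi2(p, q):
    # Symmetric chi-squared over bins; skips empty bin pairs.
    return sum((a - b) ** 2 / (a + b) for a, b in zip(p, q) if a + b > 0)

base = [0.10, 0.25, 0.30, 0.25, 0.10]   # DF from a reference (base) window
test = [0.05, 0.15, 0.30, 0.30, 0.20]   # DF from a later test window
print(round(l1_distance(base, test), 2), round(chi2(base, test), 4))  # → 0.3 0.0795
```

    In practice the indicators are tracked over successive windows, and a sustained rise above the base-period fluctuation signals condition change.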

  18. A Methodology to Measure Synergy Among Energy-Efficiency Programs at the Program Participant Level

    SciTech Connect

    Tonn, B.E.

    2003-11-14

    This paper presents a methodology designed to measure synergy among energy-efficiency programs at the program participant level (e.g., households, firms). Three different definitions of synergy are provided: strong, moderate, and weak. Data to measure synergy can be collected through simple survey questions. Straightforward mathematical techniques can be used to estimate the three types of synergy and explore relative synergistic impacts of different subsets of programs. Empirical research is needed to test the concepts and methods and to establish quantitative expectations about synergistic relationships among programs. The market for new energy-efficient motors is the context used to illustrate all the concepts and methods in this paper.

  19. Structural acoustic control of plates with variable boundary conditions: design methodology.

    PubMed

    Sprofera, Joseph D; Cabell, Randolph H; Gibbs, Gary P; Clark, Robert L

    2007-07-01

    A method for optimizing a structural acoustic control system subject to variations in plate boundary conditions is provided. The assumed modes method is used to build a plate model with varying levels of rotational boundary stiffness to simulate the dynamics of a plate with uncertain edge conditions. A transducer placement scoring process, involving Hankel singular values, is combined with a genetic optimization routine to find spatial locations robust to boundary condition variation. Predicted frequency response characteristics are examined, and theoretically optimized results are discussed in relation to the range of boundary conditions investigated. Modeled results indicate that it is possible to minimize the impact of uncertain boundary conditions in active structural acoustic control by optimizing the placement of transducers with respect to those uncertainties. PMID:17614487

  20. An in-situ soil structure characterization methodology for measuring soil compaction

    NASA Astrophysics Data System (ADS)

    Dobos, Endre; Kriston, András; Juhász, András; Sulyok, Dénes

    2016-04-01

    The agricultural cultivation has several direct and indirect effects on the soil properties, among which the soil structure degradation is the best known and most detectable one. Soil structure degradation leads to several water and nutrient management problems, which reduce the efficiency of agricultural production. There are several innovative technological approaches aiming to reduce these negative impacts on the soil structure. The tests, validation and optimization of these methods require an adequate technology to measure the impacts on the complex soil system. This study aims to develop an in-situ soil structure and root development testing methodology, which can be used in field experiments and which allows one to follow the real time changes in the soil structure - evolution / degradation and its quantitative characterization. The method is adapted from remote sensing image processing technology. A specifically transformed A/4 size scanner is placed into the soil into a safe depth that cannot be reached by the agrotechnical treatments. Only the scanner USB cable comes to the surface to allow the image acquisition without any soil disturbance. Several images from the same place can be taken throughout the vegetation season to follow the soil consolidation and structure development after the last tillage treatment for the seedbed preparation. The scanned image of the soil profile is classified using supervised image classification, namely the maximum likelihood classification algorithm. The resulting image has two principal classes, soil matrix and pore space and other complementary classes to cover the occurring thematic classes, like roots, stones. The calculated data is calibrated with filed sampled porosity data. As the scanner is buried under the soil with no changes in light conditions, the image processing can be automated for better temporal comparison. 
Besides the total porosity, each pore size fraction and its distribution can be calculated for
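    After classification, the porosity calculation reduces to counting class labels in the image. A toy sketch on a hypothetical label grid (the class codes and the convention of excluding roots and stones are assumptions):

```python
# Porosity from a supervised classification of the scanned soil profile:
# 0 = soil matrix, 1 = pore, 2 = root, 3 = stone (hypothetical 6x8 grid).
image = [
    [0, 0, 1, 0, 0, 0, 1, 0],
    [0, 1, 1, 0, 2, 0, 0, 0],
    [0, 0, 0, 0, 2, 0, 1, 1],
    [3, 0, 0, 1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0, 0, 1, 0],
]
pixels = [p for row in image for p in row]

# Report porosity relative to the soil matrix + pore area, excluding the
# complementary classes (roots, stones); other conventions are possible.
soil_or_pore = [p for p in pixels if p in (0, 1)]
porosity = soil_or_pore.count(1) / len(soil_or_pore)
print(round(porosity, 3))   # → 0.222
```

    Repeating the count on images scanned through the season, and calibrating against the field-sampled porosity data, gives the temporal consolidation curve described above.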

  1. [Methodological aspects of measuring injuries from traffic accidents at the site of occurrence].

    PubMed

    Híjar-Medina, M C; López-López, M V; Flores-Aldana, M; Anaya, R

    1997-02-01

    Traffic accidents are a well-known public health problem worldwide. In Mexico, research into risk factors for motor vehicle accidents and their consequences has only recently been undertaken. The relevant literature does not normally describe the methodological aspects involved in the collection of primary data, since most studies have used secondary data whose quality and validity are assumed. This paper seeks to discuss and share with researchers in this field some of the methodological aspects to be considered when attempting to recreate the scene of the accident and obtain information that approximates reality. The in situ measurement of such traffic accident variables as injury, seat belt use, speed and alcohol intake is discussed. PMID:9430931

  2. Optimization of meat level and processing conditions for development of chicken meat noodles using response surface methodology.

    PubMed

    Khare, Anshul Kumar; Biswas, Asim Kumar; Balasubramanium, S; Chatli, Manish Kumar; Sahoo, Jhari

    2015-06-01

    Response surface methodology (RSM) is a mathematical and statistical technique for testing multiple process variables and their interactive, linear and quadratic effects, and is useful in simultaneously solving multivariable equations obtained from experiments. In the present study, the optimum meat level and processing conditions for the development of shelf-stable chicken meat noodles were determined using a central composite design of RSM. The effects of meat level (110-130 g) and processing conditions such as steaming time (12-18 min) and drying time (7-9 h) on the water activity, yield, water absorption index, water solubility index, hardness, overall acceptability and total colour change of chicken noodles were investigated. The coefficients of determination (R²) of all the response variables were higher than 0.8. Based on the response surface and superimposed plots, the optimum conditions of 60 % meat level, 12 min steaming time and 9 h drying time for the development of chicken noodles with the desired sensory quality were obtained. PMID:26028756

  3. Methodology for the assessment of measuring uncertainties of articulated arm coordinate measuring machines

    NASA Astrophysics Data System (ADS)

    Romdhani, Fekria; Hennebelle, François; Ge, Min; Juillion, Patrick; Coquet, Richard; François Fontaine, Jean

    2014-12-01

    Articulated Arm Coordinate Measuring Machines (AACMMs) have gradually evolved and are increasingly used in the mechanical industry. At present, measurement uncertainties relating to the use of these devices are not yet well quantified. The work carried out consists of determining the measurement uncertainties of a mechanical part measured by an AACMM. The studies aiming to develop a model of measurement uncertainty are based on the Monte Carlo method described in Supplement 1 of the Guide to the Expression of Uncertainty in Measurement [1], as well as on identifying and characterizing the main sources of uncertainty. A multi-level Monte Carlo approach has been developed which characterizes the possible evolution of the AACMM during the measurement at the first level and quantifies the uncertainty of the considered measurand at the second level. The first Monte Carlo level is the most complex and is thus divided into three sub-levels, namely characterization of the positioning error of a point, estimation of calibration errors and evaluation of fluctuations of the 'localization point'. The global method is presented and the results of the first sub-level are developed in particular. The main sources of uncertainty, including AACMM deformations, are presented.
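    The core idea of Monte Carlo uncertainty propagation per GUM Supplement 1 can be sketched on a hypothetical two-joint planar arm: perturb the joint readings, push each draw through the forward kinematics, and read the point-coordinate spread. A real AACMM model has six or seven joints plus calibration and deformation terms; the lengths, pose and encoder noise below are invented.

```python
import math, random

random.seed(2)

# Hypothetical 2-joint planar arm standing in for an AACMM.
L1, L2 = 600.0, 500.0    # segment lengths, mm (assumed)
th1, th2 = 0.6, -0.4     # nominal joint angles, rad (assumed)
sigma = 1e-4             # assumed encoder noise, rad (1 sigma)

def tip(a1, a2):
    # Forward kinematics: probed point position.
    x = L1 * math.cos(a1) + L2 * math.cos(a1 + a2)
    y = L1 * math.sin(a1) + L2 * math.sin(a1 + a2)
    return x, y

# Monte Carlo propagation of joint-level noise to the probed point.
pts = [tip(th1 + random.gauss(0, sigma), th2 + random.gauss(0, sigma))
       for _ in range(50_000)]
mx = sum(p[0] for p in pts) / len(pts)
my = sum(p[1] for p in pts) / len(pts)
sx = math.sqrt(sum((p[0] - mx) ** 2 for p in pts) / len(pts))
sy = math.sqrt(sum((p[1] - my) ** 2 for p in pts) / len(pts))
print(round(sx, 3), round(sy, 3))   # per-axis standard uncertainty, mm
```

    The multi-level structure in the paper amounts to nesting such loops: an outer level draws the machine's state (calibration, deformation), an inner level draws the per-point errors.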

  4. Methodology for the calibration of and data acquisition with a six-degree-of-freedom acceleration measurement device

    NASA Astrophysics Data System (ADS)

    Lee, Harvey; Plank, Gordon; Weinstock, Herbert; Coltman, Michael

    1989-06-01

    Described here is a methodology for calibrating and gathering data with a six-degree-of-freedom acceleration measurement device that is intended to measure head acceleration of anthropomorphic dummies and human volunteers in automotive crash testing and head impact trauma studies. Error models (system equations) were developed for systems using six accelerometers in a coplanar (3-2-1) configuration, nine accelerometers in a coplanar (3-3-3) configuration and nine accelerometers in a non-coplanar (3-2-2-2) configuration and the accuracy and stability of these systems were compared. The model was verified under various input and computational conditions. Results of parametric sensitivity analyses which included parameters such as system geometry, coordinate system location, data sample rate and accelerometer cross axis sensitivities are presented. Recommendations to optimize data collection and reduction are given. Complete source listings of all of the software developed are presented.

  5. Assessing the Practical Equivalence of Conversions when Measurement Conditions Change

    ERIC Educational Resources Information Center

    Liu, Jinghua; Dorans, Neil J.

    2012-01-01

    At times, the same set of test questions is administered under different measurement conditions that might affect the psychometric properties of the test scores enough to warrant different score conversions for the different conditions. We propose a procedure for assessing the practical equivalence of conversions developed for the same set of test…

  6. An innovative methodology for measurement of stress distribution of inflatable membrane structures

    NASA Astrophysics Data System (ADS)

    Zhao, Bing; Chen, Wujun; Hu, Jianhui; Chen, Jianwen; Qiu, Zhenyu; Zhou, Jinyu; Gao, Chengjun

    2016-02-01

    The inflatable membrane structure has been widely used in the fields of civil building, industrial building, airship, super pressure balloon and spacecraft. It is important to measure the stress distribution of the inflatable membrane structure because it influences the safety of the structural design. This paper presents an innovative methodology for the measurement and determination of the stress distribution of the inflatable membrane structure under different internal pressures, combining photogrammetry and the force-finding method. The shape of the inflatable membrane structure is maintained by the use of pressurized air, and the internal pressure is controlled and measured by means of an automatic pressure control system. The 3D coordinates of the marking points pasted on the membrane surface are acquired by three photographs captured from three cameras based on photogrammetry. After digitizing the markings on the photographs, the 3D curved surfaces are rebuilt. The continuous membrane surfaces are discretized into quadrilateral mesh and simulated by membrane links to calculate the stress distributions using the force-finding method. The internal pressure is simplified to the external node forces in the normal direction according to the contributory area of the node. Once the geometry x, the external force r and the topology C are obtained, the unknown force densities q in each link can be determined. Therefore, the stress distributions of the inflatable membrane structure can be calculated, combining the linear adjustment theory and the force density method based on the force equilibrium of inflated internal pressure and membrane internal force without considering the mechanical properties of the constitutive material. As the use of the inflatable membrane structure is attractive in the field of civil building, an ethylene-tetrafluoroethylene (ETFE) cushion is used with the measurement model to validate the proposed methodology. 
The comparisons between the

  7. The Epidemiology of Lead Toxicity in Adults: Measuring Dose and Consideration of Other Methodologic Issues

    PubMed Central

    Hu, Howard; Shih, Regina; Rothenberg, Stephen; Schwartz, Brian S.

    2007-01-01

    We review several issues of broad relevance to the interpretation of epidemiologic evidence concerning the toxicity of lead in adults, particularly regarding cognitive function and the cardiovascular system, which are the subjects of two systematic reviews that are also part of this mini-monograph. Chief among the recent developments in methodologic advances has been the refinement of concepts and methods for measuring individual lead dose in terms of appreciating distinctions between recent versus cumulative doses and the use of biological markers to measure these parameters in epidemiologic studies of chronic disease. Attention is focused particularly on bone lead levels measured by K-shell X-ray fluorescence as a relatively new biological marker of cumulative dose that has been used in many recent epidemiologic studies to generate insights into lead’s impact on cognition and risk of hypertension, as well as the alternative method of estimating cumulative dose using available repeated measures of blood lead to calculate an individual’s cumulative blood lead index. We review the relevance and interpretation of these lead biomarkers in the context of the toxico-kinetics of lead. In addition, we also discuss methodologic challenges that arise in studies of occupationally and environmentally exposed subjects and those concerning race/ethnicity and socioeconomic status and other important covariates. PMID:17431499
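    The cumulative blood lead index mentioned above is simply a time-integral of repeated blood lead measurements. A minimal sketch using the trapezoidal rule on invented values:

```python
# Cumulative blood lead index (CBLI): area under the blood lead vs. time
# curve, here by the trapezoidal rule (illustrative values, not study data).
years = [0.0, 2.0, 5.0, 9.0, 14.0]    # time of each blood draw, years
pb    = [12.0, 15.0, 11.0, 8.0, 6.0]  # blood lead at each draw, ug/dL

cbli = sum((pb[i] + pb[i + 1]) / 2 * (years[i + 1] - years[i])
           for i in range(len(pb) - 1))
print(cbli)   # → 139.0 (ug/dL-years)
```

    CBLI serves as an estimate of cumulative dose when bone lead measurements by K-shell X-ray fluorescence are unavailable.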

  8. Ultrasonic Technique for Density Measurement of Liquids in Extreme Conditions

    PubMed Central

    Kazys, Rymantas; Sliteris, Reimondas; Rekuviene, Regina; Zukauskas, Egidijus; Mazeika, Liudas

    2015-01-01

    An ultrasonic technique, invariant to temperature changes, for a density measurement of different liquids under in situ extreme conditions is presented. The influence of geometry and material parameters of the measurement system (transducer, waveguide, matching layer) on measurement accuracy and reliability is analyzed theoretically along with experimental results. The proposed method is based on measurement of the amplitude of the ultrasonic wave, reflected from the interface of the solid/liquid medium under investigation. In order to enhance sensitivity, the use of a quarter wavelength acoustic matching layer is proposed. Therefore, the sensitivity of the measurement system increases significantly. Density measurements quite often must be performed in extreme conditions at high temperature (up to 220 °C) and high pressure. In this case, metal waveguides between piezoelectric transducer and the measured liquid are used in order to protect the conventional transducer from the influence of high temperature and to avoid depolarization. The presented ultrasonic density measurement technique is suitable for density measurement in different materials, including liquids and polymer melts in extreme conditions. A new calibration algorithm was proposed. The metrological evaluation of the measurement method was performed. The expanded measurement uncertainty Uρ = 7.4 × 10−3 g/cm3 (1%). PMID:26262619
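    The physics behind the amplitude-based measurement can be sketched with the textbook normal-incidence reflection coefficient R = (Z_liq − Z_sol)/(Z_liq + Z_sol), with acoustic impedance Z = ρc. The impedance and sound-speed values are illustrative, and the paper's matching layer and calibration algorithm are not modelled here.

```python
# Invert the normal-incidence reflection coefficient for liquid density:
# R = (Z_liq - Z_sol) / (Z_liq + Z_sol), Z = rho * c.
Z_sol = 45.0e6   # waveguide acoustic impedance, kg/(m^2 s) (steel-like, assumed)
c_liq = 1480.0   # sound speed in the liquid, m/s (water-like, assumed known)

def density_from_reflection(R):
    Z_liq = Z_sol * (1 + R) / (1 - R)   # solve R for Z_liq
    return Z_liq / c_liq                # rho = Z / c

# Forward check with a water-like liquid (rho = 1000 kg/m^3):
Z_w = 1000.0 * c_liq
R_w = (Z_w - Z_sol) / (Z_w + Z_sol)
print(round(density_from_reflection(R_w), 1))   # → 1000.0
```

    Because Z_sol is much larger than Z_liq, R is close to −1 and insensitive to the liquid; the quarter-wavelength matching layer in the paper exists precisely to boost this sensitivity.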

  10. The methodologies and instruments of vehicle particulate emission measurement for current and future legislative regulations

    NASA Astrophysics Data System (ADS)

    Otsuki, Yoshinori; Nakamura, Hiroshi; Arai, Masataka; Xu, Min

    2015-09-01

    Since the health risks associated with fine particles whose aerodynamic diameters are smaller than 2.5 μm were first proven, regulations restricting particulate matter (PM) mass emissions from internal combustion engines have become increasingly severe. Accordingly, the gravimetric method of PM mass measurement is approaching its lower limit of detection as emissions from vehicles are further reduced. For example, variation in the adsorption of gaseous components such as hydrocarbons from unburned fuel and lubricant oil, and the presence of agglomerated particles, which are not directly generated in engine combustion but are re-entrained from the walls of sampling pipes, can cause measurement uncertainty. PM mass measurement systems and methodologies have been continuously refined in order to improve measurement accuracy. As an alternative metric, the Particle Measurement Programme (PMP) within the United Nations Economic Commission for Europe (UNECE) developed a solid particle number measurement method in order to improve the sensitivity of particulate emission measurement from vehicles. Consequently, particle number (PN) limits were implemented in European regulations from 2011. Recently, portable emission measurement systems (PEMS) for in-use vehicle emission measurement have also been attracting attention, currently in North America and Europe, and real-time PM mass and PN instruments are under evaluation.

  11. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates tuning uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source-code-level modifications. Measurements on a single processor are initially used to identify parts of the code where cache utilization improvements may significantly impact overall performance. Cache simulation based on trace-driven techniques can then be carried out without gathering detailed address traces. The minimal runtime information needed for modeling the cache performance of a selected code block includes the base virtual addresses of arrays, the virtual addresses of variables, and the loop bounds for that block; the rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for exploring various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program, and then apply it to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
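The core idea — synthesizing an address stream from base addresses and loop bounds alone, then feeding it to a cache model — can be sketched as follows. This is not the paper's tool; the direct-mapped cache, matrix size, and base addresses are illustrative assumptions.

```python
# Minimal sketch of trace-driven cache simulation for matrix-matrix multiply:
# the access stream is synthesized from base addresses and loop bounds only,
# then run through a direct-mapped cache model. All parameters are illustrative.

def simulate_direct_mapped(addresses, cache_lines=256, line_size=64):
    tags = [None] * cache_lines          # one resident block tag per cache line
    misses = 0
    for addr in addresses:
        block = addr // line_size        # memory block containing this address
        index = block % cache_lines      # direct-mapped placement
        if tags[index] != block:         # miss: fill the line
            tags[index] = block
            misses += 1
    return misses

def matmul_addresses(n, base_a=0x10000, base_b=0x40000, base_c=0x70000, elem=8):
    """Synthesize the access stream of C[i][j] += A[i][k] * B[k][j]."""
    for i in range(n):
        for j in range(n):
            for k in range(n):
                yield base_a + (i * n + k) * elem
                yield base_b + (k * n + j) * elem
                yield base_c + (i * n + j) * elem

misses = simulate_direct_mapped(matmul_addresses(32))
print("misses:", misses)
```

Changing the loop order or base addresses and re-running the model is exactly the kind of "what-if" exploration the abstract describes, without collecting a real address trace.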

  12. Hyper-X Mach 10 Engine Flowpath Development: Fifth Entry Test Conditions and Methodology

    NASA Technical Reports Server (NTRS)

    Bakos, R. J.; Tsai, C.-Y.; Rogers, R. C.; Shih, A. T.

    2001-01-01

    A series of Hyper-X Mach 10 flowpath ground tests are underway to obtain engine performance and operation data and to confirm and refine the flowpath design methods. The model used is a full-scale height, partial-width replica of the Hyper-X Research Vehicle propulsive flowpath with truncated forebody and aftbody. This is the fifth test entry for this model in the NASA-HYPULSE facility at GASL. For this entry the facility nozzle and model forebody were modified to better simulate the engine inflow conditions at the target flight conditions. The forebody was modified to be a wide flat plate with no flow fences, the facility nozzle Mach number was increased, and the model was positioned to be tested in a semi-direct-connect arrangement. This paper presents a review of the test conditions, model calibrations, and a description of steady flow confirmation. The test series included runs using hydrogen fuel, and a silane-in-hydrogen fuel mixture. Other test parameters included the model mounting angle (relative to the tunnel flow), and the test gas oxygen fraction to account for the presence of [NO] in the test gas at the M10 conditions.

  13. Measurement of multijunction cells under close-match conditions

    SciTech Connect

    Wilkinson, V.A.; Goodbody, C.; Williams, W.G.

    1997-12-31

    This paper presents details of a new close-match solar simulator developed for DERA's Space Power Laboratory for the accurate characterization of multijunction solar cells. The authors present data on simulator measurements of dual- and triple-junction cells. The measurements are compared with those made under less ideal spectral conditions.

  14. Measurement of Lubricating Condition between Swashplate and Shoe in Swashplate Compressors under Practical Operating Conditions

    NASA Astrophysics Data System (ADS)

    Suzuki, Hisashi; Fukuta, Mitsuhiro; Yanagisawa, Tadashi

    In this paper, the lubricating condition between a swashplate and a shoe in a swashplate compressor for automotive air conditioners is investigated experimentally. The condition is measured with an electric resistance method that uses the swashplate and the shoe as electrodes. The instrumented compressor is connected to an experimental cycle with R134a and operated under various conditions of pressure and rotational speed. An improved measurement technique, together with the application of a solid contact ratio to the measurement results, makes it possible to measure the lubricating condition at high rotational speed (more than 8000 rpm) and to predict the occurrence of scuffing between the swashplate and the shoe, and therefore enables a detailed study of lubricating characteristics. The measurements show that the voltage of the contact signal decreases, indicating a better lubricating condition, with decreasing compression pressure and with increasing rotational speed from 1000 rpm through 5000 rpm. The lubricating condition tends to worsen above 5000 rpm. Furthermore, it is confirmed that the lubricating condition under transient operation is clearly worse than under steady-state operation.

  15. A method of measuring dynamic strain under electromagnetic forming conditions.

    PubMed

    Chen, Jinling; Xi, Xuekui; Wang, Sijun; Lu, Jun; Guo, Chenglong; Wang, Wenquan; Liu, Enke; Wang, Wenhong; Liu, Lin; Wu, Guangheng

    2016-04-01

    Dynamic strain measurement is rather important for the characterization of mechanical behaviors in the electromagnetic forming process, but it has been hindered by high strain rates and serious electromagnetic interference for years. In this work, a simple and effective strain measuring technique for physical and mechanical behavior studies in the electromagnetic forming process has been developed. High-resolution (∼5 ppm) strain curves of a bulging aluminum tube in a pulsed electromagnetic field have been successfully measured using this technique. The measured strain rate is about 10⁵ s⁻¹, depending on the discharging conditions, nearly one order of magnitude higher than that under conventional split Hopkinson pressure bar loading conditions (∼10⁴ s⁻¹). It has been found that the dynamic fracture toughness of an aluminum alloy is significantly enhanced during electromagnetic forming, which explains why the formability is much larger under electromagnetic forming conditions than in conventional forging processes. PMID:27131683

  16. Absolute Radiation Measurements in Earth and Mars Entry Conditions

    NASA Technical Reports Server (NTRS)

    Cruden, Brett A.

    2014-01-01

    This paper reports on the measurement of radiative heating for shock heated flows which simulate conditions for Mars and Earth entries. Radiation measurements are made in NASA Ames' Electric Arc Shock Tube at velocities from 3-15 km/s in mixtures of N2/O2 and CO2/N2/Ar. The technique and limitations of the measurement are summarized in some detail. The absolute measurements will be discussed in regards to spectral features, radiative magnitude and spatiotemporal trends. Via analysis of spectra it is possible to extract properties such as electron density, and rotational, vibrational and electronic temperatures. Relaxation behind the shock is analyzed to determine how these properties relax to equilibrium and are used to validate and refine kinetic models. It is found that, for some conditions, some of these values diverge from non-equilibrium indicating a lack of similarity between the shock tube and free flight conditions. Possible reasons for this are discussed.

  17. A method of measuring dynamic strain under electromagnetic forming conditions

    NASA Astrophysics Data System (ADS)

    Chen, Jinling; Xi, Xuekui; Wang, Sijun; Lu, Jun; Guo, Chenglong; Wang, Wenquan; Liu, Enke; Wang, Wenhong; Liu, Lin; Wu, Guangheng

    2016-04-01

    Dynamic strain measurement is rather important for the characterization of mechanical behaviors in the electromagnetic forming process, but it has been hindered by high strain rates and serious electromagnetic interference for years. In this work, a simple and effective strain measuring technique for physical and mechanical behavior studies in the electromagnetic forming process has been developed. High-resolution (˜5 ppm) strain curves of a bulging aluminum tube in a pulsed electromagnetic field have been successfully measured using this technique. The measured strain rate is about 10⁵ s⁻¹, depending on the discharging conditions, nearly one order of magnitude higher than that under conventional split Hopkinson pressure bar loading conditions (˜10⁴ s⁻¹). It has been found that the dynamic fracture toughness of an aluminum alloy is significantly enhanced during electromagnetic forming, which explains why the formability is much larger under electromagnetic forming conditions than in conventional forging processes.

  18. Dielectric Barrier Discharge (DBD) Plasma Actuators Thrust-Measurement Methodology Incorporating New Anti-Thrust Hypothesis

    NASA Technical Reports Server (NTRS)

    Ashpis, David E.; Laun, Matthew C.

    2014-01-01

    We discuss thrust measurements of Dielectric Barrier Discharge (DBD) plasma actuators, devices used for aerodynamic active flow control. After a review of our experience with conventional thrust measurement and the significant non-repeatability of its results, we devised a suspended-actuator test setup and now present a methodology of thrust measurement with decreased uncertainty. The methodology consists of frequency scans at constant voltages: the frequency is increased step-wise from several Hz to the maximum frequency of several kHz, then decreased back down to the starting frequency. This sequence is performed first at the highest voltage of interest, then repeated at lower voltages. The data in the descending-frequency direction are more consistent and are selected for reporting. Sample results show a strong dependence of thrust on humidity, which also affects the consistency and fluctuations of the measurements. We also observed negative values of thrust, or "anti-thrust", at low frequencies between 4 Hz and 64 Hz. The anti-thrust is proportional to the mean-squared voltage and is frequency independent. Departures from the parabolic anti-thrust curve are correlated with the appearance of visible plasma discharges. We propose the anti-thrust hypothesis: the measured thrust is a sum of plasma thrust and anti-thrust, and the anti-thrust is assumed to exist at all frequencies and voltages. The anti-thrust depends on actuator geometry and materials and on the test installation. The hypothesis enables the separation of the plasma thrust from the measured total thrust, which allows more meaningful comparisons between actuators at different installations and laboratories. The dependence on test installation was validated by surrounding the actuator with a large-diameter, grounded metal sleeve.
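The separation step implied by the anti-thrust hypothesis is simple arithmetic: fit the frequency-independent, voltage-squared anti-thrust at low frequency, then subtract it from each measured total. The coefficient and data below are invented for illustration only.

```python
# Sketch of the anti-thrust separation: anti-thrust is modeled as a
# frequency-independent term proportional to the mean-squared voltage,
# fitted at low frequency, and subtracted from the measured total.
# The coefficient k_anti and the sample point are illustrative, not measured.

k_anti = -2.0e-12            # anti-thrust per volt squared (N/V^2), fitted at low f

def plasma_thrust(total_mn, voltage_v):
    """Measured total = plasma thrust + anti-thrust; recover the plasma part (mN)."""
    anti_mn = k_anti * voltage_v**2 * 1e3      # convert N to mN
    return total_mn - anti_mn

# e.g. 0.35 mN measured at 10 kV: the recovered plasma thrust is larger,
# because the (negative) anti-thrust was masking part of it.
print(round(plasma_thrust(0.35, 10e3), 3), "mN")
```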

  19. A smartphone-driven methodology for estimating physical activities and energy expenditure in free living conditions.

    PubMed

    Guidoux, Romain; Duclos, Martine; Fleury, Gérard; Lacomme, Philippe; Lamaudière, Nicolas; Manenq, Pierre-Henri; Paris, Ludivine; Ren, Libo; Rousset, Sylvie

    2014-12-01

    This paper introduces a function dedicated to the estimation of total energy expenditure (TEE) of daily activities based on data from accelerometers integrated into smartphones. The use of mass-market sensors such as accelerometers offers a promising solution for the general public due to the growing smartphone market over the last decade. The TEE estimation function quality was evaluated using data from intensive numerical experiments based, first, on 12 volunteers equipped with a smartphone and two research sensors (Armband and Actiheart) in controlled conditions (CC) and, then, on 30 other volunteers in free-living conditions (FLC). The TEE given by these two sensors in both conditions and estimated from the metabolic equivalent tasks (MET) in CC served as references during the creation and evaluation of the function. The TEE mean gap in absolute value between the function and the three references was 7.0%, 16.4% and 2.7% in CC, and 17.0% and 23.7% according to Armband and Actiheart, respectively, in FLC. This is the first step in the definition of a new feedback mechanism that promotes self-management and daily-efficiency evaluation of physical activity as part of an information system dedicated to the prevention of chronic diseases. PMID:25048352
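The MET-based reference used in controlled conditions amounts to accumulating MET × body mass × duration over recognized activity bouts. The sketch below is not the paper's estimation function; the activity labels, MET values, and `tee_kcal` helper are standard compendium-style assumptions for illustration.

```python
# Illustrative MET-based TEE accumulation: 1 MET ~ 1 kcal per kg of body
# mass per hour. Activity labels and MET values are generic assumptions.

MET = {"sitting": 1.3, "walking": 3.5, "running": 8.0}

def tee_kcal(bouts, weight_kg):
    """bouts: list of (activity, minutes); returns total energy expenditure in kcal."""
    return sum(MET[act] * weight_kg * (minutes / 60.0) for act, minutes in bouts)

# A made-up day for a 70 kg subject: 10 h sitting, 90 min walking, 30 min running.
day = [("sitting", 600), ("walking", 90), ("running", 30)]
print(round(tee_kcal(day, weight_kg=70.0), 1), "kcal")
```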

  20. Optimization of fermentation conditions for 1,3-propanediol production by marine Klebsiella pneumoniae HSL4 using response surface methodology

    NASA Astrophysics Data System (ADS)

    Li, Lili; Zhou, Sheng; Ji, Huasong; Gao, Ren; Qin, Qiwei

    2014-09-01

    The industrially important organic compound 1,3-propanediol (1,3-PDO) is mainly used as a building block for the production of various polymers. In the present study, a response surface methodology protocol was followed to determine and optimize fermentation conditions for the maximum production of 1,3-PDO using marine-derived Klebsiella pneumoniae HSL4. Four nutritional supplements together with three independent culture conditions were optimized as follows: 29.3 g/L glycerol, 8.0 g/L K2HPO4, 7.6 g/L (NH4)2SO4, 3.0 g/L KH2PO4, pH 7.1, cultivation at 35°C for 12 h. Under the optimal conditions, a maximum 1,3-PDO concentration of 14.5 g/L, a productivity of 1.21 g/(L·h) and a glycerol conversion of 0.49 g/g were obtained. In comparison with the control conditions, fermentation under the optimized conditions achieved an increase of 38.8% in 1,3-PDO concentration, 39.0% in productivity and 25.7% in glycerol conversion in flask culture. This enhancement trend was further confirmed when the fermentation was conducted in a 5-L fermentor. The optimized fermentation conditions could be an important basis for developing low-cost, large-scale methods for industrial production of 1,3-PDO in the future.
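The reported productivity and glycerol conversion follow directly from the optimized values quoted in the abstract, which makes for a quick consistency check:

```python
# Sanity check on the abstract's figures: productivity = titer / time,
# conversion = g product per g of initial glycerol.

pdo_g_per_l  = 14.5   # final 1,3-PDO titer (g/L)
time_h       = 12.0   # cultivation time (h)
glycerol_g_l = 29.3   # initial glycerol (g/L)

productivity = pdo_g_per_l / time_h          # g/(L·h)
conversion   = pdo_g_per_l / glycerol_g_l    # g/g

print(round(productivity, 2), round(conversion, 2))   # matches the reported 1.21 and 0.49
```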

  1. A Uniform Methodology for Measuring Parental Ability to Pay: Implications for the College Scholarship Service in 1975-76.

    ERIC Educational Resources Information Center

    Bowman, James L.

    The movement toward a uniform methodology of determining parental ability to pay to be used over time by all institutions and agencies awarding financial aid funds is consistent with the goals and objectives of the College Scholarship Service (CSS). This paper describes a proposed system for a uniform methodology for measuring parental ability to…

  2. Technical Guide for "Measuring Up 2006": Documenting Methodology, Indicators, and Data Sources. National Center #06-6

    ERIC Educational Resources Information Center

    National Center for Public Policy and Higher Education, 2006

    2006-01-01

    This "Technical Guide" describes the methodology and concepts used to measure and grade the performance of the 50 states in the higher education arena. Part I presents the methodology for grading states and provides information on data collection and reporting. Part II explains the indicators that comprise each of the graded categories.…

  3. Reducing ultraviolet radiation exposure to prevent skin cancer methodology and measurement.

    PubMed

    Glanz, Karen; Mayer, Joni A

    2005-08-01

    Skin cancer is the most common type of cancer, and is also one of the most preventable. This paper builds on an evidence review of skin cancer prevention interventions that was conducted for the Guide to Community Preventive Services (n=85 studies), and summarizes the state of knowledge about research methodology and measurement in studies of the effectiveness of interventions to reduce ultraviolet radiation (UVR) exposure. As this field advances, researchers should strive to minimize threats to validity in their study designs, as well as to consider the balance between internal and external validity. There is a need for more longer-duration interventions, and follow-up periods that make possible conclusions about the potential of these interventions to affect intermediate markers of skin cancer or at least sustained behavior change. Also, more work is needed to minimize attrition and characterize nonresponders and study dropouts. Verbal report measures of behavior are the most widely used measures of solar protection behavior. Given their limitations, investigators should routinely collect data about reliability and validity of those measures. They should also increase efforts to complement verbal data with objective measures including observations, skin reflectance, personal dosimetry, skin swabbing, and inspection of moles. Measures of environments and policies should incorporate observations, documentation, and direct measures of ambient UVR and shade. This article places the data derived from the evidence review in the context of needs and recommendations for future research in skin cancer prevention. PMID:16005810

  4. Refinement of current monitoring methodology for electroosmotic flow assessment under low ionic strength conditions.

    PubMed

    Saucedo-Espinosa, Mario A; Lapizco-Encinas, Blanca H

    2016-05-01

    Current monitoring is a well-established technique for the characterization of electroosmotic (EO) flow in microfluidic devices. The method relies on monitoring the time response of the electric current as a test buffer solution is displaced by an auxiliary solution using EO flow. In this scheme, each solution has a different ionic concentration (and electric conductivity). The difference between the ionic concentrations of the two solutions defines the dynamic time response of the electric current and, hence, the current signal to be measured: larger concentration differences result in larger measurable signals. A small concentration difference is needed, however, to avoid dispersion at the interface between the two solutions, which can result in undesired pressure-driven flow that conflicts with the EO flow. Additional challenges arise as the conductivity of the test solution decreases, leading to a reduced electric current signal that may be masked by noise during the measuring process, making accurate estimation of the EO mobility difficult. This contribution presents a new scheme for current monitoring that employs multiple channels arranged in parallel, producing an increase in the signal-to-noise ratio of the electric current to be measured and increasing the estimation accuracy. This parallel approach is particularly useful for estimating the EO mobility in systems where low-conductivity media are required, such as insulator-based dielectrophoresis devices. PMID:27375813
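The mobility extraction that current monitoring enables is typically a one-line calculation: the time for the current to reach its new plateau is the time the auxiliary solution takes to traverse the channel, so the EO velocity and mobility follow from the channel length and applied voltage. The function and numbers below are a generic sketch, not this paper's multichannel scheme.

```python
# Typical current-monitoring analysis: plateau time t gives velocity v = L/t,
# the field is E = V/L, and mu_EO = v/E = L**2 / (V * t). Values illustrative.

def eo_mobility(channel_length_m, applied_voltage_v, plateau_time_s):
    """mu_EO = (L / t) / (V / L) = L**2 / (V * t), in m^2/(V*s)."""
    return channel_length_m**2 / (applied_voltage_v * plateau_time_s)

# e.g. a 1 cm channel at 500 V with the current stabilizing after 40 s:
mu = eo_mobility(0.01, 500.0, 40.0)
print(f"{mu:.2e} m^2/(V*s)")
```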

  5. Deception detection with behavioral, autonomic, and neural measures: Conceptual and methodological considerations that warrant modesty.

    PubMed

    Meijer, Ewout H; Verschuere, Bruno; Gamer, Matthias; Merckelbach, Harald; Ben-Shakhar, Gershon

    2016-05-01

    The detection of deception has attracted increased attention among psychological researchers, legal scholars, and ethicists during the last decade. Much of this has been driven by the possibility of using neuroimaging techniques for lie detection. Yet, neuroimaging studies addressing deception detection are clouded by lack of conceptual clarity and a host of methodological problems that are not unique to neuroimaging. We review the various research paradigms and the dependent measures that have been adopted to study deception and its detection. In doing so, we differentiate between basic research designed to shed light on the neurocognitive mechanisms underlying deceptive behavior and applied research aimed at detecting lies. We also stress the distinction between paradigms attempting to detect deception directly and those attempting to establish involvement by detecting crime-related knowledge, and discuss the methodological difficulties and threats to validity associated with each paradigm. Our conclusion is that the main challenge of future research is to find paradigms that can isolate cognitive factors associated with deception, rather than the discovery of a unique (brain) correlate of lying. We argue that the Comparison Question Test currently applied in many countries has weak scientific validity, which cannot be remedied by using neuroimaging measures. Other paradigms are promising, but the absence of data from ecologically valid studies poses a challenge for legal admissibility of their outcomes. PMID:26787599

  6. Gas concentration measurement by optical similitude absorption spectroscopy: methodology and experimental demonstration.

    PubMed

    Anselmo, Christophe; Welschinger, Jean-Yves; Cariou, Jean-Pierre; Miffre, Alain; Rairoux, Patrick

    2016-06-13

    We propose a new methodology to measure gas concentration by light-absorption spectroscopy when the light source spectrum is broader than the spectral width of one or several molecular gas absorption lines. We named it optical similitude absorption spectroscopy (OSAS), as the gas concentration is derived from a similitude between the light source and target gas spectra. The main OSAS novelty lies in the development of a robust inversion methodology, based on the Newton-Raphson algorithm, which retrieves the target gas concentration from spectrally integrated differential light-absorption measurements. As a proof of concept, OSAS is applied in the laboratory to the 2ν3 methane absorption band at 1.66 µm, with uncertainties assessed using the Allan variance. OSAS has also been applied to non-dispersive infrared and optical correlation spectroscopy arrangements. This all-optical gas concentration retrieval does not require a gas calibration cell and opens new tracks for monitoring atmospheric gas pollution and greenhouse gas sources. PMID:27410280
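The inversion problem OSAS solves can be illustrated with a toy model: with a broadband source, the spectrally integrated transmission is a nonlinear function of concentration, and Newton-Raphson finds the concentration that reproduces the measurement. The Gaussian spectra below are synthetic placeholders, not the methane 2ν3 band data, and the retrieval loop is a generic sketch rather than the paper's algorithm.

```python
# Toy OSAS-style inversion: solve transmission(c) = t_meas for c with
# Newton-Raphson, using a numerical derivative. Spectra are synthetic.

import math

wl = [i * 0.01 for i in range(-100, 101)]               # relative wavelength grid
source = [math.exp(-x**2) for x in wl]                  # broad source spectrum
sigma  = [0.5 * math.exp(-(x / 0.1)**2) for x in wl]    # narrow absorption line

def transmission(c):
    """Spectrally integrated transmission for concentration c (Beer-Lambert)."""
    num = sum(s * math.exp(-c * k) for s, k in zip(source, sigma))
    return num / sum(source)

def retrieve(t_meas, c0=1.0, tol=1e-10):
    """Newton-Raphson on f(c) = transmission(c) - t_meas."""
    c = c0
    for _ in range(50):
        f = transmission(c) - t_meas
        h = 1e-6
        df = (transmission(c + h) - transmission(c - h)) / (2 * h)
        step = f / df
        c -= step
        if abs(step) < tol:
            break
    return c

true_c = 2.5
print(round(retrieve(transmission(true_c)), 6))
```

Because the integrated transmission is smooth and monotonically decreasing in concentration, the Newton iteration converges quickly, which is the robustness property the abstract attributes to the OSAS inversion.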

  7. Advanced haptic sensor for measuring human skin conditions

    NASA Astrophysics Data System (ADS)

    Tsuchimi, Daisuke; Okuyama, Takeshi; Tanaka, Mami

    2009-12-01

    This paper is concerned with the development of a tactile sensor using PVDF (polyvinylidene fluoride) film as the sensory receptor to evaluate the softness, smoothness, and stickiness of human skin. Tactile sense is, along with eyesight, one of the most important senses of the human body, and we can examine skin condition quickly using it. However, its subjectivity and ambiguity make it difficult to quantify skin conditions. Therefore, a measurement device that can evaluate skin conditions easily and objectively is demanded by dermatologists, the cosmetics industry, and others. In this paper, an advanced haptic sensor system that can measure multiple aspects of skin condition in various parts of the human body is developed. The application of the sensor system to evaluating the softness, smoothness, and stickiness of skin is investigated through two experiments.

  8. Advanced haptic sensor for measuring human skin conditions

    NASA Astrophysics Data System (ADS)

    Tsuchimi, Daisuke; Okuyama, Takeshi; Tanaka, Mami

    2010-01-01

    This paper is concerned with the development of a tactile sensor using PVDF (polyvinylidene fluoride) film as the sensory receptor to evaluate the softness, smoothness, and stickiness of human skin. Tactile sense is, along with eyesight, one of the most important senses of the human body, and we can examine skin condition quickly using it. However, its subjectivity and ambiguity make it difficult to quantify skin conditions. Therefore, a measurement device that can evaluate skin conditions easily and objectively is demanded by dermatologists, the cosmetics industry, and others. In this paper, an advanced haptic sensor system that can measure multiple aspects of skin condition in various parts of the human body is developed. The application of the sensor system to evaluating the softness, smoothness, and stickiness of skin is investigated through two experiments.

  9. Intercomparison of magnetic field measurements near MV/LV transformer substations: methodological planning and results.

    PubMed

    Violanti, S; Fraschetta, M; Adda, S; Caputo, E

    2009-12-01

    Within the framework of the activities of the Environmental Agencies system, coordinated by ISPRA (Superior Institute for Environmental Protection and Research), an intercomparison of measurements was designed and carried out in order to examine measurement problems in depth and to evaluate the magnetic field at power frequencies. The measurements were taken near medium-voltage/low-voltage transformer substations. The project was developed with the contribution of several experts from different Regional Agencies. In three regions, substations having specific international standard characteristics were chosen; a measurement and data analysis protocol was then arranged. Data analysis showed a good level of coherence among the results obtained by different laboratories. However, a range of problems emerged, both during the preparation of the protocol and the definition of the data analysis procedure, and during the execution of the measurements and data reprocessing, because of the spatial and temporal variability of the magnetic field. These problems are of particular interest for establishing a correct measurement methodology whose purpose is comparison with exposure limits, attention values and quality targets. PMID:19843547

  10. Methodology for quality control when conditioning radioactive waste regarding the flow chart procedure

    SciTech Connect

    Ham, U.; Kunz, W.

    1995-12-31

    Radioactive waste that is to be placed in an interim and/or a final storage facility has to fulfill several quality requirements, which result from the safety analysis for the individual storage facility. Fulfillment of these requirements has to be proven before the waste package is stored in the designated facility. In co-operation with the responsible authorities and experts, the Gesellschaft fuer Nuklear-Service mbH (GNS), Essen, Germany, has developed the flow chart procedure for low- and medium-active waste for proving and documenting product quality when conditioning radioactive waste. This flow chart procedure is now part of the "Technical Acceptance Criteria" for interim storage facilities and of the "Requirements for Final Storage" for final storage facilities for low- and medium-active waste in Germany. The procedure can also be used for high-active vitrified waste from the reprocessing of irradiated fuel.

  11. Self-calibration methodology by normalized intensity for wavelength modulation spectroscopy measurement

    NASA Astrophysics Data System (ADS)

    Shao, Jie; Guo, Jie; Wang, Liming; Ying, Chaofu; Zhou, Zhen

    2015-02-01

    A self-calibration methodology for concentration measurement based on wavelength modulation absorption spectroscopy has been developed, using a normalized fixed intensity. Experimental results show that this simple self-calibration method not only effectively improves calibration accuracy but also greatly improves stability and reliability compared with the popular 1f-normalization self-calibration method. In addition, the proposed system does not need any additional equipment compared with a traditional wavelength modulation absorption spectrometer. The standard deviation of concentration measurements with different optical intensities on the detector was found to be below 1.0%. The dependence of the concentration assessment on laser intensity fluctuation has also been investigated, which shows that the self-calibration method could be applied in the field, especially where dust is easily splattered on the windows of the detector.
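The general effect that any intensity-normalization scheme targets can be shown in a few lines: dividing the demodulated absorption signal by a reference intensity removes the dependence on received laser power, so the same gas line gives the same normalized amplitude at different power levels. This is a generic sketch of the principle with synthetic signals, not the paper's fixed-intensity scheme.

```python
# Generic intensity normalization in WMS-style measurements: the demodulated
# signal scales with received power, so dividing by a reference intensity
# makes the result power-independent. Signals here are synthetic.

def normalized_signal(raw, intensity):
    return [s / i for s, i in zip(raw, intensity)]

# The same synthetic line shape measured at two received power levels
# yields identical normalized amplitudes:
line_shape = [0.0, 0.2, 1.0, 0.2, 0.0]
low  = normalized_signal([0.5 * x * 0.01 for x in line_shape], [0.5] * 5)
high = normalized_signal([2.0 * x * 0.01 for x in line_shape], [2.0] * 5)
print(low == high)
```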

  12. Methodology for application of field rainfall simulator to revise c-factor database for conditions of the Czech Republic

    NASA Astrophysics Data System (ADS)

    Neumann, Martin; Dostál, Tomáš; Krása, Josef; Kavka, Petr; Davidová, Tereza; Brant, Václav; Kroulík, Milan; Mistr, Martin; Novotný, Ivan

    2016-04-01

    The presentation will introduce a methodology for determining the crop and cover management factor (C-factor) of the universal soil loss equation (USLE) using a field rainfall simulator. The aim of the project is to determine C-factor values for the different phenophases of the main crops of the central European region, while also taking into account different agrotechnical methods. Using the field rainfall simulator, it is possible to perform the measurements in specific phenophases, which is otherwise difficult to do owing to the variability and fortuity of natural rainfall. Because of the number of measurements needed, two identical simulators will be used, operated by two independent teams with a coordinated methodology. The methodology will mainly specify the length of simulation, the rainfall intensity, and the sampling technique. The presentation includes a more detailed account of the selected methods. Owing to the wide range of variable crops and soils, it is not possible to execute measurements for all possible combinations; we therefore decided to perform the measurements for previously selected combinations of soils, crops and agrotechnologies that are the most common in the Czech Republic. During the experiments, the volume of surface runoff and the amount of sediment will be measured in their temporal distribution, along with several other important parameters. The key values of the 3D matrix of combinations of crop, agrotechnique and soil will be determined experimentally; the remaining values will be determined by interpolation or by model analogy. Several methods are used for C-factor calculation from measured experimental data, some of which are not suitable for the type of data gathered here. The presentation will discuss the benefits and drawbacks of these methods, as well as the final design of the method used.
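The quantity these experiments ultimately yield is conventionally defined as the ratio of soil loss from the cropped, managed plot to the loss from the standard bare-fallow condition under the same (here simulated) rainfall. A trivial sketch with invented sediment totals:

```python
# Conventional C-factor definition: soil loss from the managed plot divided
# by soil loss from the bare-fallow reference. Values are invented.

def c_factor(loss_cropped_kg, loss_fallow_kg):
    return loss_cropped_kg / loss_fallow_kg

# Sediment collected during identical simulator runs (illustrative values):
print(round(c_factor(1.8, 12.0), 3))
```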

  13. A new metric for measuring condition in large predatory sharks.

    PubMed

    Irschick, D J; Hammerschlag, N

    2014-09-01

    A simple metric (span condition analysis; SCA) is presented for quantifying the condition of sharks based on four measurements of body girth relative to body length. Data on 104 live sharks from four species that vary in body form, behaviour and habitat use (Carcharhinus leucas, Carcharhinus limbatus, Ginglymostoma cirratum and Galeocerdo cuvier) are given. Condition shows similar levels of variability among individuals within each species. Carcharhinus leucas showed a positive relationship between condition and body size, whereas the other three species showed no relationship. There was little evidence for strong differences in condition between males and females, although more male sharks are needed for some species (e.g. G. cuvier) to verify this finding. SCA is potentially viable for other large marine or terrestrial animals that are captured live and then released. PMID:25130454
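A girth-relative-to-length score of this kind can be sketched in a few lines. The four girth sites, the equal weighting, and the `span_condition` helper below are placeholders for illustration; the paper's exact SCA formulation may differ.

```python
# Sketch of a span-condition-style metric: four girth measurements normalized
# by body length and averaged into one score. Sites and weighting are
# placeholder assumptions, not the published SCA definition.

def span_condition(girths_cm, length_cm):
    """Mean of four girth measurements relative to body length."""
    if len(girths_cm) != 4:
        raise ValueError("expected four girth measurements")
    return sum(g / length_cm for g in girths_cm) / 4.0

# Invented measurements for a 220 cm shark:
score = span_condition([95.0, 110.0, 70.0, 40.0], length_cm=220.0)
print(round(score, 3))
```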

  14. Mass measuring instrument for use under microgravity conditions

    SciTech Connect

    Fujii, Yusaku; Yokota, Masayuki; Hashimoto, Seiji; Sugita, Yoichi; Ito, Hitomi; Shimada, Kazuhito

    2008-05-15

    A prototype instrument for measuring astronaut body mass under microgravity conditions has been developed and its performance evaluated by parabolic flight tests. The instrument, referred to as the space scale, is applied as follows. The subject astronaut is connected to the space scale with a rubber cord. A force transducer measures the force acting on the subject, and an optical interferometer measures the subject's velocity. The subject's mass is then calculated as the impulse divided by the velocity change, i.e., M = ∫F dt / Δv. Parabolic flight using a jet aircraft produces a zero-gravity condition lasting approximately 20 s. The performance of the prototype space scale was evaluated during such a flight by measuring the mass of a sample object.
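
    The mass determination M = ∫F dt / Δv described above can be sketched numerically; the sampling rate, force record and velocity change below are illustrative, not flight data:

```python
# Numerical sketch of M = integral(F dt) / delta_v; sampling rate, force
# record and velocity change are illustrative, not flight data.

def mass_from_impulse(force_n, dt_s, delta_v):
    """Trapezoidal impulse of the force record divided by velocity change."""
    impulse = sum((f0 + f1) * dt_s / 2.0 for f0, f1 in zip(force_n, force_n[1:]))
    return impulse / delta_v

force = [50.0] * 101     # 50 N pull sampled at 100 Hz for 1 s
mass = mass_from_impulse(force, 0.01, 0.625)   # 50 N*s / 0.625 m/s = 80 kg
```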

  15. Study on optical measurement conditions for noninvasive blood glucose sensing

    NASA Astrophysics Data System (ADS)

    Xu, Kexin; Chen, Wenliang; Jiang, Jingying; Qiu, Qingjun

    2004-05-01

    Utilizing near-infrared (NIR) spectroscopy for non-invasive glucose concentration sensing has been a focus of biomedical optics applications. In this paper, measurement conditions for spectroscopy on the human body are studied and a series of experiments on glucose concentration sensing are conducted. First, the Monte Carlo method is applied to simulate and calculate the photons' penetration depth within skin tissues at 1600 nm. The simulation results indicate that, with our designed optical probe, the detected photons can penetrate the epidermis of the palm and meet the glucose-sensing requirements within the dermis. Second, we analyze the influence on the measured spectrum of variations in the measured position and of the contact pressure between the optical fiber probe and the measured position, and a measurement-conditions reproduction system is introduced to enhance measurement repeatability. Furthermore, through a series of transmittance experiments on glucose aqueous solutions, from simple to complex, we found that although some absorption variation information of glucose can be obtained from NIR spectroscopic measurements, under the same measuring conditions and with the same modeling method the choice of measurable components narrows as the complexity of the mixture increases, which lowers prediction accuracy. Finally, OGTT experiments were performed, and a PLS (Partial Least Squares) mathematical model for a single experiment was built. We obtain a prediction error, expressed as RMSEP (Root Mean Square Error of Prediction), of 0.5-0.8 mmol/dl. However, the model's extended application and reliability need more investigation.
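
    The RMSEP figure quoted above is a standard error metric; a minimal sketch of its computation on hypothetical OGTT values:

```python
# RMSEP between model predictions and reference values; numbers are
# hypothetical, chosen to land in the 0.5-0.8 mmol/dl range quoted above.
from math import sqrt

def rmsep(predicted, reference):
    """Root mean square error of prediction."""
    n = len(predicted)
    return sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n)

pred = [4.1, 5.6, 7.9, 6.4]   # model output [mmol/dl]
ref  = [4.6, 5.0, 8.5, 7.0]   # reference glucose [mmol/dl]
err = rmsep(pred, ref)        # ~0.58 here
```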

  16. Determination of backscattering cross section of individual particles from cytometric measurements: a new methodology.

    PubMed

    Duforêt-Gaurier, Lucile; Moutier, William; Guiselin, Natacha; Thyssen, Mélilotus; Dubelaar, George; Mériaux, Xavier; Courcot, Lucie; Dessailly, David; Loisel, Hubert

    2015-11-30

    A methodology is developed to derive the backscattering cross section of individual particles as measured with the CytoSense (CytoBuoy b.v., NL). This in situ flow cytometer detects light scatter in forward and sideward directions and fluorescence in various spectral bands for a wide range of particles. First, the weighting functions are determined for the forward and sideward detectors to take into account their instrumental response as a function of the scattering angle. The CytoSense values are converted into forward and sideward scattering cross sections. The CytoSense estimates of uniform polystyrene microspheres from 1 to 90 μm are compared with Mie computations. The mean absolute relative differences ΔE are around 33.7% and 23.9% for forward and sideward scattering, respectively. Then, a theoretical relationship is developed to convert sideward scattering into backscattering cross section, from a synthetic database of 495,900 simulations including homogeneous and multi-layered spheres. The relationship follows a power law with a coefficient of determination of 0.95. To test the methodology, a laboratory experiment is carried out on a suspension of silica beads to compare backscattering cross section as measured by the WET Labs ECO-BB9 and derived from CytoSense. Relative differences are between 35% and 60%. They are of the same order of magnitude as the instrumental variability. Differences can be partly explained by the fact that the two instruments do not measure exactly the same parameter: the cross section of individual particles for the CytoSense and the bulk cross section for the ECO-BB9. PMID:26698775
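
    The sideward-to-backscattering conversion described above follows a power law fitted over a synthetic database; a sketch of such a log-log power-law fit (with made-up cross sections, not the paper's 495,900-simulation database):

```python
# Log-log power-law fit sigma_bb = a * sigma_side**k, as a stand-in for the
# paper's conversion relationship (data below are made up).
from math import exp, log

def fit_power_law(x, y):
    """Least-squares fit of y = a * x**k in log-log space; returns (a, k, r2)."""
    lx, ly = [log(v) for v in x], [log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    k = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = exp(my - k * mx)
    ss_res = sum((v - (log(a) + k * u)) ** 2 for u, v in zip(lx, ly))
    ss_tot = sum((v - my) ** 2 for v in ly)
    return a, k, 1.0 - ss_res / ss_tot

side = [1.0, 2.0, 4.0, 8.0]            # sideward cross sections (a.u.)
back = [0.010, 0.017, 0.030, 0.052]    # backscattering cross sections (a.u.)
a, k, r2 = fit_power_law(side, back)
```

Fitting in log space turns the power law into a straight line, which is why the paper can report a coefficient of determination for it.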

  17. Methodology for Intraoperative Laser Doppler Vibrometry Measurements of Ossicular Chain Reconstruction

    PubMed Central

    Sokołowski, Jacek; Lachowska, Magdalena; Bartoszewicz, Robert; Niemczyk, Kazimierz

    2016-01-01

    Objectives Despite the increasing number of studies concerning applications of Laser Doppler Vibrometry (LDV) in medicine, its usefulness is still under discussion. The aim of this study is to present a methodology developed in our Department for the intraoperative LDV assessment of ossicular chain reconstruction. Methods Ten patients who underwent “second look” tympanoplasty were involved in the study. The measurements of the acoustic conductivity of the middle ear were performed using the LDV system. Tone bursts with carrier frequencies of 500, 1,000, 2,000, and 4,000 Hz set the ossicular chain in motion. The study was divided into four experiments that examined the intra- and interindividual reproducibility, the utility of the posterior tympanotomy, the impact of changes in the laser beam angle, and the influence of reflective tape presence on measurements. Results There were no statistically significant differences between the two measurements performed in the same patient. However, interindividual differences were significant. In all cases, posterior tympanotomy proved to be useful for LDV measurements of ossicular prosthesis vibrations. In most cases, changing the laser beam angle decreased signal amplitude by about 1.5% (a nonsignificant change). The reflective tape was necessary to achieve adequate reflection of the laser beam. Conclusion LDV proved to be a valuable noncontact intraoperative tool for measuring the mobility of the middle ear conductive system, with very good intraindividual repeatability. Neither a small change in the angle of the laser beam nor performing the measurements through posterior tympanotomy had a significant influence on the results. Reflective tape was necessary to obtain good-quality responses in LDV measurements. PMID:27090282

  18. Assessing Long-Term Wind Conditions by Combining Different Measure-Correlate-Predict Algorithms: Preprint

    SciTech Connect

    Zhang, J.; Chowdhury, S.; Messac, A.; Hodge, B. M.

    2013-08-01

    This paper significantly advances the hybrid measure-correlate-predict (MCP) methodology, enabling it to account for variations of both wind speed and direction. The advanced hybrid MCP method uses the recorded data of multiple reference stations to estimate the long-term wind condition at a target wind plant site. The results show that the accuracy of the hybrid MCP method is highly sensitive to the combination of the individual MCP algorithms and reference stations. It was also found that the best combination of MCP algorithms varies based on the length of the correlation period.
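
    The core of most MCP algorithms is a regression of concurrent target-site measurements on reference-station data, which is then applied to the reference station's long-term record. A minimal linear-MCP sketch on invented speeds (the paper's hybrid method also accounts for wind direction and multiple references, which are omitted here):

```python
# Minimal linear measure-correlate-predict (MCP): regress concurrent target
# speeds on a reference station, then project the long-term reference mean.
# All speeds are invented; the hybrid method above also handles direction.

def fit_mcp(ref_speeds, target_speeds):
    """OLS fit target = slope*ref + intercept over the concurrent period."""
    n = len(ref_speeds)
    mx, my = sum(ref_speeds) / n, sum(target_speeds) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(ref_speeds, target_speeds))
             / sum((x - mx) ** 2 for x in ref_speeds))
    return slope, my - slope * mx

ref    = [5.0, 6.2, 7.1, 5.8, 6.6, 7.4]         # concurrent reference means [m/s]
target = [6.0, 7.44, 8.52, 6.96, 7.92, 8.88]    # concurrent target means [m/s]
slope, intercept = fit_mcp(ref, target)
long_term_target = slope * 6.5 + intercept      # from reference long-term mean
```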

  19. The Harmonizing Outcome Measures for Eczema (HOME) roadmap: a methodological framework to develop core sets of outcome measurements in dermatology.

    PubMed

    Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C

    2015-01-01

    Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcome Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the agreed COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care. PMID:25186228

  20. Quadratic Measurement and Conditional State Preparation in an Optomechanical System

    NASA Astrophysics Data System (ADS)

    Brawley, George; Vanner, Michael; Bowen, Warwick; Schmid, Silvan; Boisen, Anja

    2014-03-01

    An important requirement in the study of quantum systems is the ability to measure non-linear observables at the level of quantum fluctuations. Such measurements enable the conditional preparation of highly non-classical states. Nonlinear measurement, although achieved in a variety of quantum systems including microwave cavity modes and optical fields, remains an outstanding problem in both electromechanical and optomechanical systems. To the best of our knowledge, previous experimental efforts to achieve nonlinear measurement of mechanical motion have yielded neither strong coupling nor the observation of quadratic mechanical motion. Here, using a new technique based on the intrinsic nonlinearity of the optomechanical interaction, we experimentally observe for the first time a position-squared (x²) measurement of the room-temperature Brownian motion of a nanomechanical oscillator. We utilize this measurement to conditionally prepare non-Gaussian bimodal states, which are the high-temperature classical analogue of quantum macroscopic superposition states, or cat states. In the future, with the aid of cryogenics and state-of-the-art optical cavities, our approach will provide a viable method of generating quantum superposition states of mechanical oscillators. This research was funded by the ARC Center of Excellence for Engineered Quantum Systems.

  1. Optimization of Fermentation Conditions for Recombinant Human Interferon Beta Production by Escherichia coli Using the Response Surface Methodology

    PubMed Central

    Morowvat, Mohammad Hossein; Babaeipour, Valiollah; Rajabi Memari, Hamid; Vahidi, Hossein

    2015-01-01

    Background: The periplasmic overexpression of recombinant human interferon beta (rhIFN-β)-1b using a synthetic gene in Escherichia coli BL21 (DE3) was optimized in shake flasks using Response Surface Methodology (RSM) based on the Box-Behnken Design (BBD). Objectives: This study aimed to predict and develop the optimal fermentation conditions for periplasmic expression of rhIFN-β-1b in shake flasks while keeping acetate excretion to a minimum, and to apply the best conditions for rhIFN-β production in a bench-top bioreactor. Materials and Methods: The process variables studied were the concentration of glucose as carbon source, cell density prior to induction (OD 600 nm) and induction temperature. A three-factor, three-level BBD was employed during the optimization process. The rhIFN-β production and acetate excretion served as the evaluated responses. Results: The proposed optimum fermentation condition consisted of 7.81 g L-1 glucose, OD 600 nm prior to induction of 1.66 and induction temperature of 30.27°C. The model prediction of 0.267 g L-1 of rhIFN-β and 0.961 g L-1 of acetate at the optimum conditions was verified experimentally as 0.255 g L-1 of rhIFN-β and 0.981 g L-1 of acetate. This agreement between the predicted and observed values confirmed the precision of the applied method for predicting the optimum conditions. Conclusions: It can be concluded that RSM is an effective method for the optimization of recombinant protein expression using synthetic genes in E. coli. PMID:26034535
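
    RSM fits a second-order polynomial to the measured responses and locates its stationary point. A one-factor slice with assumed, illustrative coefficients (not the paper's fitted model):

```python
# One-factor slice of a fitted second-order (Box-Behnken) response model.
# Coefficients below are assumed for illustration only.

def stationary_point(b1, b11):
    """Stationary point of y = b0 + b1*x + b11*x**2 (maximum if b11 < 0)."""
    return -b1 / (2.0 * b11)

# Hypothetical response of rhIFN-beta titre [g/L] to induction temperature [°C]:
b0, b1, b11 = -3.0, 0.22, -0.00363
x_opt = stationary_point(b1, b11)               # optimum temperature
y_opt = b0 + b1 * x_opt + b11 * x_opt ** 2      # predicted titre at optimum
```

In the full three-factor BBD the same idea applies, with cross terms and a simultaneous optimum over all factors.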

  2. Application of nitric oxide measurements in clinical conditions beyond asthma

    PubMed Central

    Malinovschi, Andrei; Ludviksdottir, Dora; Tufvesson, Ellen; Rolla, Giovanni; Bjermer, Leif; Alving, Kjell; Diamant, Zuzana

    2015-01-01

    Fractional exhaled nitric oxide (FeNO) is a convenient, non-invasive method for the assessment of active, mainly Th2-driven, airway inflammation, which is sensitive to treatment with standard anti-inflammatory therapy. Consequently, FeNO serves as a valued tool to aid diagnosis and monitoring in several asthma phenotypes. More recently, FeNO has been evaluated in several other respiratory, infectious, and/or immunological conditions. In this short review, we provide an overview of several clinical studies and discuss the status of potential applications of NO measurements in clinical conditions beyond asthma. PMID:26672962

  3. Moving to Capture Children's Attention: Developing a Methodology for Measuring Visuomotor Attention.

    PubMed

    Hill, Liam J B; Coats, Rachel O; Mushtaq, Faisal; Williams, Justin H G; Aucott, Lorna S; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child's development. However, methodological limitations currently make large-scale assessment of children's attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of 'Visual Motor Attention' (VMA)-a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method's core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults' attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action). PMID:27434198

  4. Moving to Capture Children’s Attention: Developing a Methodology for Measuring Visuomotor Attention

    PubMed Central

    Coats, Rachel O.; Mushtaq, Faisal; Williams, Justin H. G.; Aucott, Lorna S.; Mon-Williams, Mark

    2016-01-01

    Attention underpins many activities integral to a child’s development. However, methodological limitations currently make large-scale assessment of children’s attentional skill impractical, costly and lacking in ecological validity. Consequently we developed a measure of ‘Visual Motor Attention’ (VMA)—a construct defined as the ability to sustain and adapt visuomotor behaviour in response to task-relevant visual information. In a series of experiments, we evaluated the capability of our method to measure attentional processes and their contributions in guiding visuomotor behaviour. Experiment 1 established the method’s core features (ability to track stimuli moving on a tablet-computer screen with a hand-held stylus) and demonstrated its sensitivity to principled manipulations in adults’ attentional load. Experiment 2 standardised a format suitable for use with children and showed construct validity by capturing developmental changes in executive attention processes. Experiment 3 tested the hypothesis that children with and without coordination difficulties would show qualitatively different response patterns, finding an interaction between the cognitive and motor factors underpinning responses. Experiment 4 identified associations between VMA performance and existing standardised attention assessments and thereby confirmed convergent validity. These results establish a novel approach to measuring childhood attention that can produce meaningful functional assessments that capture how attention operates in an ecologically valid context (i.e. attention's specific contribution to visuomanual action). PMID:27434198

  5. A new methodology for measurement of sludge residence time distribution in a paddle dryer using X-ray fluorescence analysis.

    PubMed

    Charlou, Christophe; Milhé, Mathieu; Sauceau, Martial; Arlabosse, Patricia

    2015-02-01

    Drying is a necessary step before energetic valorization of sewage sludge, and paddle dryers allow working with such a complex material. However, little is known about sludge flow in this kind of process. This study sets up an original methodology for measuring sludge residence time distribution (RTD) in a continuous paddle dryer, based on the detection of mineral tracers by X-ray fluorescence. This accurate analytical technique offers a linear response to tracer concentration in dry sludge, and the protocol gives good repeatability of RTD measurements. Its equivalence to RTD measurement by NaCl conductivity in sludge leachates is assessed. Moreover, it is shown that tracer solubility has no influence on RTD: the liquid and solid phases have the same flow pattern. Applying this technique to sludge with different storage durations at 4 °C emphasizes the influence of this parameter on sludge RTD, and thus on paddle dryer performance: the mean residence time in a paddle dryer almost doubles between 24 and 48 h of storage under identical operating conditions. PMID:25463926
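
    An RTD is obtained by normalising the tracer-response curve to unit area; the mean residence time is its first moment. A sketch on invented tracer data, assuming uniform sampling:

```python
# Residence time distribution (RTD) from a tracer pulse response, assuming a
# uniformly sampled concentration signal (all numbers are invented).

def rtd(times_s, tracer_conc):
    """Normalise the tracer curve to an RTD E(t) (rectangle rule) and
    return (E, mean residence time)."""
    dt = times_s[1] - times_s[0]
    area = sum(tracer_conc) * dt
    e = [c / area for c in tracer_conc]
    t_mean = sum(t * ei for t, ei in zip(times_s, e)) * dt
    return e, t_mean

times = [0, 600, 1200, 1800, 2400, 3000]      # s
conc  = [0.0, 0.2, 0.9, 0.6, 0.25, 0.05]      # tracer signal (arbitrary units)
e_curve, t_mean = rtd(times, conc)            # t_mean = 1515 s here
```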

  6. Phoretic and Radiometric Force Measurements on Microparticles in Microgravity Conditions

    NASA Technical Reports Server (NTRS)

    Davis, E. James

    1996-01-01

    Thermophoretic, diffusiophoretic and radiometric forces on microparticles are being measured over a wide range of gas phase and particle conditions using electrodynamic levitation of single particles to simulate microgravity conditions. The thermophoretic force, which arises when a particle exists in a gas having a temperature gradient, is measured by levitating an electrically charged particle between heated and cooled plates mounted in a vacuum chamber. The diffusiophoretic force arising from a concentration gradient in the gas phase is measured in a similar manner except that the heat exchangers are coated with liquids to establish a vapor concentration gradient. These phoretic forces and the radiation pressure force acting on a particle are measured directly in terms of the change in the dc field required to levitate the particle with and without the force applied. The apparatus developed for the research and the experimental techniques are discussed, and results obtained by thermophoresis experiments are presented. The determination of the momentum and energy accommodation coefficients associated with molecular collisions between gases molecules and particles and the measurement of the interaction between electromagnetic radiation and small particles are of particular interest.

  7. Wall Conditioning and Impurity Measurements in the PEGASUS Experiment

    NASA Astrophysics Data System (ADS)

    Ono, M.; Fonck, R.; Toonen, R.; Thorson, T.; Tritz, K.; Winz, G.

    1999-11-01

    Wall conditioning and impurity effects on plasma evolution are increasingly relevant to the PEGASUS program. Surface conditioning consists of hydrogen glow discharge cleaning (GDC) to remove water and oxides, followed by He GDC to reduce the hydrogen inventory. Isotope exchange measurements indicate that periodic He GDC almost eliminates uncontrolled fueling from gas desorbed from the limiting surfaces. Additional wall conditioning will include Ti gettering and/or boronization. Impurity monitoring is provided by the recent installation of a SPRED multichannel VUV spectrometer (wavelength range = 10-110 nm; 1 msec time resolution), several interference filter (IF) monochromators, and a multichannel Ross-filter SXR diode assembly (for CV, CVI, OVII, and OVIII). The IF monitors indicate increased C radiation upon contact of the plasma with the upper and lower limiters for highly elongated plasmas. This radiation appears correlated with a subsequent rollover in the plasma current, and motivates an upgrade to the poloidal limiters to provide better plasma-wall interaction control.

  8. A novel methodology for online measurement of thoron using Lucas scintillation cell

    NASA Astrophysics Data System (ADS)

    Eappen, K. P.; Sapra, B. K.; Mayya, Y. S.

    2007-03-01

    The use of the Lucas scintillation cell (LSC) technique for thoron estimation requires a modified methodology compared with radon estimation. Whereas for radon the α counting is performed after a delay varying from a few hours to a few days, for thoron the α counting has to be carried out immediately after sampling owing to its short half-life (55 s). This is best achieved with an online LSC sampling and counting system. However, because the half-life of the thoron decay product 212Pb is 10.6 h, background accumulates in the LSC during online measurements, and subsequent use of the LSC is erroneous until the normal background level is restored. This problem can be circumvented by correcting for the average background counts accumulated during the counting period, which may be estimated theoretically. In this study, a methodology has been developed to estimate the true counts due to thoron. A linear regression between the counts obtained experimentally and the fractional decay in regular intervals of time is used to obtain the actual thoron concentration. The novelty of this approach is that the background of the cell is automatically estimated as the intercept of the regression graph. The results obtained by this technique compare well with the two-filter method and with the thoron concentration produced by a standard thoron source. However, the LSC as such cannot be used for environmental samples, because its minimum detection level is comparable to the thoron concentrations prevailing in the normal atmosphere.
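
    The regression described above recovers the accumulated background as the intercept of counts versus fractional decay. A sketch with invented count data:

```python
# Recovering the LSC background as the regression intercept, as described
# above. Count data are invented for illustration.

def linfit(x, y):
    """Ordinary least-squares fit y = slope*x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

decay_fraction = [0.53, 0.25, 0.12, 0.05]     # thoron decay fraction per interval
gross_counts   = [570.0, 290.0, 160.0, 90.0]  # alpha counts per interval
slope, background = linfit(decay_fraction, gross_counts)
# Here background = 40 counts; net thoron counts are gross minus background.
```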

  9. [Measuring quality in the German Guideline Programme in Oncology (GGPO)—methodology and implementation].

    PubMed

    Nothacker, Monika; Muche-Borowski, Cathleen; Kopp, Ina B

    2014-01-01

    The German Guideline Programme in Oncology (GGPO) is a joint initiative between the German Cancer Society, the Association of the Scientific Medical Societies in Germany and German Cancer Aid. In accordance with the aims of the German National Cancer Plan, the GGPO supports the systematic development of high-quality guidelines. To enhance implementation and evaluation, the suggestion of performance measures (PMs) derived from guideline recommendations following a standardised methodology is obligatory within the GGPO. For this purpose, PM teams are convened representing the multidisciplinary guideline development groups, including clinical experts, methodologists and patient representatives as well as those organisations that take an active part in and share responsibility for documentation and quality improvement, i.e., clinical cancer registries, certified cancer centres and, if appropriate, the institution responsible for external quality assurance according to the German Social Code (SGB). The primary selection criteria for PMs include strength of the underlying recommendation (strong, grade A), existing potential for improvement of care and measurability. The premises of data economy and standardised documentation are taken into account. Between May 2008 and July 2014, 12 guidelines with suggestions for 100 PMs were published. The majority of the suggested performance measures are captured by the specific documentation requirements of the clinical cancer registries and certified cancer centres. This creates a solid basis for active quality management and re-evaluation of the suggested PMs. In addition, the suspension of measures should be considered if improvement has been achieved on a broad scale and for a longer period, in order to concentrate on quality-oriented, economic documentation. PMID:25523845

  10. Analysis and methodology for measuring oxygen concentration in liquid sodium with a plugging meter

    SciTech Connect

    Nollet, B. K.; Hvasta, M.; Anderson, M.

    2012-07-01

    Oxygen concentration in liquid sodium is a critical measurement in assessing the potential for corrosion damage in sodium-cooled fast reactors (SFRs). There has been little recent work on sodium reactors and oxygen detection, so the technical expertise dealing with oxygen measurements within sodium is no longer readily available in the U.S. Two methods of oxygen detection that have been investigated are the plugging meter and the galvanic cell. One of the overall goals of the Univ. of Wisconsin's sodium research program is to develop an affordable, reliable galvanic cell oxygen sensor. Accordingly, attention must first be dedicated to a well-known standard, the plugging meter. A sodium loop has therefore been constructed on campus in an effort to develop the plugging meter technique and gain experience working with liquid metal. The loop contains both a galvanic cell test section and a plugging meter test section. Consistent plugging results have been achieved below 20 [wppm], and a detailed process for achieving effective plugging has been developed. This paper will focus both on an accurate methodology for obtaining oxygen concentrations from a plugging meter, and on how to easily control the oxygen concentration of sodium in a test loop. Details of the design, materials, manufacturing, and operation will be presented. Data interpretation will also be discussed, since a modern discussion of plugging data interpretation does not currently exist. (authors)
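
    A plugging meter infers oxygen content from the temperature at which oxide begins to precipitate and plug a cooled orifice; that saturation temperature is converted to a concentration with an oxygen-solubility correlation. A sketch using the commonly quoted Noden coefficients (treat these as an assumption and verify against the relevant design standard before use):

```python
# Convert a measured plugging (saturation) temperature to an oxygen
# concentration via an oxygen-solubility correlation. The Noden coefficients
# below are as commonly quoted and are an assumption; verify before use.

def oxygen_wppm(plugging_temp_k):
    """Oxygen saturation concentration [wppm] at the plugging temperature."""
    return 10.0 ** (6.2571 - 2444.5 / plugging_temp_k)

c_low = oxygen_wppm(430.0)    # a low-ppm reading, consistent with <20 wppm
c_high = oxygen_wppm(460.0)   # hotter plugging onset -> more dissolved oxygen
```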

  11. Extremely low frequency electromagnetic field measurements at the Hylaty station and methodology of signal analysis

    NASA Astrophysics Data System (ADS)

    Kulak, Andrzej; Kubisz, Jerzy; Klucjasz, Slawomir; Michalec, Adam; Mlynarczyk, Janusz; Nieckarz, Zenon; Ostrowski, Michal; Zieba, Stanislaw

    2014-06-01

    We present the Hylaty geophysical station, a high-sensitivity and low-noise facility for extremely low frequency (ELF, 0.03-300 Hz) electromagnetic field measurements, which enables a variety of geophysical and climatological research related to atmospheric, ionospheric, magnetospheric, and space weather physics. The first systematic observations of ELF electromagnetic fields at the Jagiellonian University were undertaken in 1994. At the beginning the measurements were carried out sporadically, during expeditions to sparsely populated areas of the Bieszczady Mountains in the southeast of Poland. In 2004, an automatic Hylaty ELF station was built there, in a very low electromagnetic noise environment, which enabled continuous recording of the magnetic field components of the ELF electromagnetic field in the frequency range below 60 Hz. In 2013, after 8 years of successful operation, the station was upgraded by extending its frequency range up to 300 Hz. In this paper we show the station's technical setup, and how it has changed over the years. We discuss the design of ELF equipment, including antennas, receivers, the time control circuit, and power supply, as well as antenna and receiver calibration. We also discuss the methodology we developed for observations of the Schumann resonance and wideband observations of ELF field pulses. We provide examples of various kinds of signals recorded at the station.

  12. Optimization of conditions for probiotic curd formulation by Enterococcus faecium MTCC 5695 with probiotic properties using response surface methodology.

    PubMed

    Ramakrishnan, Vrinda; Goveas, Louella Concepta; Prakash, Maya; Halami, Prakash M; Narayan, Bhaskar

    2014-11-01

    Enterococcus faecium MTCC 5695, which possesses potential probiotic properties as well as enterocin-producing ability, was used as the starter culture. The effects of time (12-24 h) and inoculum level (3-7% v/v) on cell growth, bacteriocin production, antioxidant property, titrable acidity and pH of curd were studied by response surface methodology (RSM). The optimized conditions were 26.48 h and 2.17% v/v inoculum, and the second-order model was validated. Co-cultivation studies revealed that the formulated product had the ability to prevent growth of foodborne pathogens that affect the keeping quality of the product during storage. The results indicated that application of E. faecium MTCC 5695 under the optimized conditions yielded a highly consistent, well-set curd with bioactive and bioprotective properties. The formulated curd with potential probiotic attributes can be used as a therapeutic agent for the treatment of foodborne diseases such as traveler's diarrhea and gastroenteritis, thereby helping to improve bowel health. PMID:26396297

  13. Statistical optimization of ultraviolet irradiate conditions for vitamin D₂ synthesis in oyster mushrooms (Pleurotus ostreatus) using response surface methodology.

    PubMed

    Wu, Wei-Jie; Ahn, Byung-Yong

    2014-01-01

    Response surface methodology (RSM) was used to determine the optimum vitamin D2 synthesis conditions in oyster mushrooms (Pleurotus ostreatus). Ultraviolet B (UV-B) was selected as the most efficient irradiation source for the preliminary experiment, in addition to the levels of three independent variables, which included ambient temperature (25-45°C), exposure time (40-120 min), and irradiation intensity (0.6-1.2 W/m2). The statistical analysis indicated that, for the range which was studied, irradiation intensity was the most critical factor that affected vitamin D2 synthesis in oyster mushrooms. Under optimal conditions (ambient temperature of 28.16°C, UV-B intensity of 1.14 W/m2, and exposure time of 94.28 min), the experimental vitamin D2 content of 239.67 µg/g (dry weight) was in very good agreement with the predicted value of 245.49 µg/g, which verified the practicability of this strategy. Compared to fresh mushrooms, the lyophilized mushroom powder can synthesize remarkably higher level of vitamin D2 (498.10 µg/g) within much shorter UV-B exposure time (10 min), and thus should receive attention from the food processing industry. PMID:24736742

  14. Statistical Optimization of Ultraviolet Irradiate Conditions for Vitamin D2 Synthesis in Oyster Mushrooms (Pleurotus ostreatus) Using Response Surface Methodology

    PubMed Central

    Wu, Wei-Jie; Ahn, Byung-Yong

    2014-01-01

    Response surface methodology (RSM) was used to determine the optimum vitamin D2 synthesis conditions in oyster mushrooms (Pleurotus ostreatus). Ultraviolet B (UV-B) was selected as the most efficient irradiation source for the preliminary experiment, in addition to the levels of three independent variables, which included ambient temperature (25–45°C), exposure time (40–120 min), and irradiation intensity (0.6–1.2 W/m2). The statistical analysis indicated that, for the range which was studied, irradiation intensity was the most critical factor that affected vitamin D2 synthesis in oyster mushrooms. Under optimal conditions (ambient temperature of 28.16°C, UV-B intensity of 1.14 W/m2, and exposure time of 94.28 min), the experimental vitamin D2 content of 239.67 µg/g (dry weight) was in very good agreement with the predicted value of 245.49 µg/g, which verified the practicability of this strategy. Compared to fresh mushrooms, the lyophilized mushroom powder can synthesize remarkably higher level of vitamin D2 (498.10 µg/g) within much shorter UV-B exposure time (10 min), and thus should receive attention from the food processing industry. PMID:24736742

  15. H/L ratio as a measurement of stress in laying hens - methodology and reliability.

    PubMed

    Lentfer, T L; Pendl, H; Gebhardt-Henrich, S G; Fröhlich, E K F; Von Borell, E

    2015-04-01

    Measuring the ratio of heterophils to lymphocytes (H/L) in response to different stressors is a standard tool for assessing long-term stress in laying hens, but detailed information on the reliability of measurements, measurement techniques and methods, and absolute cell counts is often lacking. Laying hens offered nest boxes at different sites and at different ages were compared in a two-treatment crossover experiment to provide detailed information on the measurement procedure and the difficulties of interpreting H/L ratios under commercial conditions. H/L ratios were pen-specific and depended on age and aviary system. There was no effect of nest position. Heterophil and lymphocyte counts were not correlated within individuals. Absolute cell counts differed in the numbers of heterophils and lymphocytes and in H/L ratios, whereas absolute leucocyte counts were similar between individuals. The reliability of the method using relative cell counts was good, yielding a correlation coefficient between double counts of r > 0.9. It was concluded that population-based reference values may not be sensitive enough to detect individual stress reactions, that the H/L ratio as an indicator of stress under commercial conditions may not be useful because of confounding factors, and that other, non-invasive, measurements should be adopted. PMID:25622692
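    The two reported statistics are straightforward to reproduce. A small pure-Python sketch with invented differential counts; the r > 0.9 double-count reliability corresponds to a Pearson correlation between repeated counts of the same smears.

```python
def hl_ratio(heterophils, lymphocytes):
    """H/L ratio from a differential count."""
    return heterophils / lymphocytes

def pearson(xs, ys):
    """Pearson correlation coefficient, pure stdlib."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# First and second counts of the same blood smears (heterophils per 100
# leucocytes; values invented for illustration).
count1 = [28, 35, 22, 41, 30, 26, 38]
count2 = [30, 34, 21, 43, 29, 27, 37]
r = pearson(count1, count2)   # double-count reliability of the relative counts

# H/L ratios, crudely assuming the remaining cells are lymphocytes.
ratios = [hl_ratio(h, 100 - h) for h in count1]
```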

  16. Measurement of laminar burning speeds and Markstein lengths using a novel methodology

    SciTech Connect

    Tahtouh, Toni; Halter, Fabien; Mounaim-Rousselle, Christine

    2009-09-15

    Three different methodologies used for the extraction of laminar information are compared and discussed. Starting from an asymptotic analysis assuming a linear relation between the propagation speed and the stretch acting on the flame front, temporal radius evolutions of spherically expanding laminar flames are postprocessed to obtain laminar burning velocities and Markstein lengths. The first methodology fits the temporal radius evolution with a polynomial function, while the new methodology proposed uses the exact solution of the linear relation linking the flame speed and the stretch as a fit. The last methodology consists of an analytical solution of the problem. To test the different methodologies, experiments were carried out in a stainless steel combustion chamber with methane/air mixtures at atmospheric pressure and ambient temperature. The equivalence ratio was varied from 0.55 to 1.3. The classical shadowgraph technique was used to detect the reaction zone. The new methodology has proven to be the most robust and provides the most accurate results, while the polynomial methodology induces some errors due to the differentiation process. As original radii are used in the analytical methodology, it is more affected by the experimental radius determination. Finally, laminar burning velocity and Markstein length values determined with the new methodology are compared with results reported in the literature. (author)
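    The underlying linear stretch relation can be sketched end to end: generate a radius history consistent with dR/dt = S_b - L_b*kappa (stretch kappa = (2/R)·dR/dt), then recover the burning speed and Markstein length by regression, as the differentiation-based methodology does. All numerical values below are illustrative, not the paper's.

```python
# Illustrative flame parameters (not from the paper): burned-gas flame speed
# S_b (m/s), Markstein length L_b (m), initial radius R0 (m), time step (s).
S_b, L_b, R0, dt = 2.0, 1.0e-3, 5.0e-3, 1.0e-6

# Forward-integrate the linear relation solved for dR/dt:
# dR/dt = S_b - L_b*(2/R)*dR/dt  =>  dR/dt = S_b * R / (R + 2*L_b).
R, radii = R0, []
for _ in range(10_000):
    radii.append(R)
    R += dt * S_b * R / (R + 2.0 * L_b)

# Post-process: central-difference flame speeds and stretch rates.
speeds, stretches = [], []
for i in range(1, len(radii) - 1):
    v = (radii[i + 1] - radii[i - 1]) / (2.0 * dt)
    speeds.append(v)
    stretches.append(2.0 * v / radii[i])

# Least-squares line v = a + b*kappa: intercept a -> S_b, slope b -> -L_b.
n = len(speeds)
mk, mv = sum(stretches) / n, sum(speeds) / n
b = sum((k - mk) * (v - mv) for k, v in zip(stretches, speeds)) / \
    sum((k - mk) ** 2 for k in stretches)
a = mv - b * mk   # a ~ S_b, -b ~ L_b
```

    On noisy experimental radii the finite-difference step amplifies error, which is exactly the weakness of the polynomial/differentiation approach that the paper's new fitting methodology avoids.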

  17. [Methodological Approaches to the Organization of Countermeasures Taking into Account Landscape Features of Radioactively Contaminated Territories].

    PubMed

    Kuznetsov, V K; Sanzharova, N I

    2016-01-01

    Methodological approaches to the organization of countermeasures are considered, taking into account the landscape features of radioactively contaminated territories. The current status of, and new requirements for, the organization of countermeasures in contaminated agricultural areas are analyzed. The basic principles, objectives and problems of designing countermeasures with regard to the landscape characteristics of a territory are presented, and the organization and optimization of countermeasures in radioactively contaminated agricultural landscapes are substantiated. PMID:27245009

  18. Extension of laboratory-measured soil spectra to field conditions

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.; Weismiller, R. A.; Biehl, L. L.; Robinson, B. F.

    1982-01-01

    Spectral responses of two glaciated soils, Chalmers silty clay loam and Fincastle silt loam, formed under prairie grass and forest vegetation, respectively, were measured with an Exotech Model 20C spectroradiometer in the laboratory, under artificial illumination and controlled moisture equilibria. The same spectroradiometer was used outdoors under solar illumination to obtain spectral responses from dry and moistened field plots, with and without corn residue cover, representing the two soils. Results indicate that the laboratory-measured spectra of a moist soil are directly proportional to the field-measured spectral response of the same moist bare soil over the 0.52 to 1.75 micrometer wavelength range. The differences in spectral response between identically treated Chalmers and Fincastle soils are greatest in the 0.6 to 0.8 micrometer transition region between the visible and near infrared, regardless of the field condition or laboratory preparation studied.

  19. Thermophysical Properties Measurement of High-Temperature Liquids Under Microgravity Conditions in Controlled Atmospheric Conditions

    NASA Technical Reports Server (NTRS)

    Watanabe, Masahito; Ozawa, Shumpei; Mizuno, Akotoshi; Hibiya, Taketoshi; Kawauchi, Hiroya; Murai, Kentaro; Takahashi, Suguru

    2012-01-01

    Microgravity conditions offer advantages for measuring the surface tension and viscosity of metallic liquids by the oscillating drop method with an electromagnetic levitation (EML) device. We are therefore preparing thermophysical property measurements using the Materials Science Laboratory ElectroMagnetic Levitator (MSL-EML) facilities on the International Space Station (ISS). Recently, it has been recognized that the dependence of surface tension on oxygen partial pressure (Po2) must be considered for industrial application of surface tension values, and that the effect of Po2 on the surface oscillations can distort viscosity values derived from the damping oscillation model. Therefore, surface tension and viscosity must be measured simultaneously under the same atmospheric conditions. Moreover, the effect of the electromagnetic force (EMF) on the surface oscillations must be clarified to obtain the ideal surface oscillation, because the EMF acts as an external force on the oscillating liquid droplets, and an excessive EMF makes the apparent viscosity values large. Using the parabolic flight levitation experimental facilities (PFLEX), our group systematically investigated the effects of Po2 and external EMF on the surface oscillation of levitated liquid droplets, in preparation for precise measurements of the surface tension and viscosity of high-temperature liquids in future ISS experiments. We observed the surface oscillations of levitated liquid alloys using PFLEX in flight experiments on board a Gulfstream II (G-II) airplane operated by DAS, under controlled Po2 and suitable EMF conditions. From these experiments we obtained density, viscosity and surface tension values for liquid Cu; we compare these with reported data and discuss how the surface oscillations change with the EMF conditions.

  20. Assessment of hygienic conditions of ground pepper (Piper nigrum L.) on the market in Sao Paulo City, by means of two methodologies for detecting the light filth

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Pepper should be collected, processed, and packed under optimum conditions to avoid the presence of foreign matter. The hygienic conditions of ground pepper marketed in São Paulo city were assessed by determining the presence of foreign matter using two extraction methodologies. This study...

  1. Classification of heart valve condition using acoustic measurements

    SciTech Connect

    Clark, G.

    1994-11-15

    Prosthetic heart valves and the many great strides in valve design have been responsible for extending the life spans of many people with serious heart conditions. Even though prosthetic valves are extremely reliable, they are eventually susceptible to the long-term fatigue and structural failure effects expected of mechanical devices operating over long periods of time. The purpose of our work is to classify the condition of in vivo Bjork-Shiley Convexo-Concave (BSCC) heart valves by processing acoustic measurements of heart valve sounds. The structural failure of interest for BSCC valves is called single leg separation (SLS). SLS can occur if the outlet strut cracks and separates from the main structure of the valve. We measure acoustic opening and closing sounds (waveforms) using high-sensitivity contact microphones on the patient`s thorax. For our analysis, we focus our processing and classification efforts on the opening sounds because they yield direct information about outlet strut condition with minimal distortion caused by energy radiated from the valve disc.

  2. Detection of Chamber Conditioning Through Optical Emission and Impedance Measurements

    NASA Technical Reports Server (NTRS)

    Cruden, Brett A.; Rao, M. V. V. S.; Sharma, Surendra P.; Meyyappan, Meyya

    2001-01-01

    During oxide etch processes, buildup of fluorocarbon residues on reactor sidewalls can cause run-to-run drift and will necessitate some time for conditioning and seasoning of the reactor. Though diagnostics can be applied to study and understand these phenomena, many of them are not practical for use in an industrial reactor. For instance, measurements of ion fluxes and energy by mass spectrometry show that the buildup of insulating fluorocarbon films on the reactor surface will cause a shift in both ion energy and current in an argon plasma. However, such a device cannot be easily integrated into a processing system. The shift in ion energy and flux will be accompanied by an increase in the capacitance of the plasma sheath. The shift in sheath capacitance can be easily measured by a common commercially available impedance probe placed on the inductive coil. A buildup of film on the chamber wall is expected to affect the production of fluorocarbon radicals, and thus the presence of such species in the optical emission spectrum of the plasma can be monitored as well. These two techniques are employed on a GEC (Gaseous Electronics Conference) Reference Cell to assess the validity of optical emission and impedance monitoring as a metric of chamber conditioning. These techniques are applied to experimental runs with CHF3 and CHF3/O2/Ar plasmas, with intermediate monitoring of pure argon plasmas as a reference case for chamber conditions.

  3. What about temperature? Measuring permeability at magmatic conditions.

    NASA Astrophysics Data System (ADS)

    Kushnir, Alexandra R. L.; Martel, Caroline; Champallier, Rémi; Reuschlé, Thierry

    2015-04-01

    The explosive potential of volcanoes is intimately linked to permeability, which is governed by the connectivity of the porous structure of the magma and surrounding edifice. As magma ascends, volatiles exsolve from the melt and expand, creating a gas phase within the conduit. In the absence of a permeable structure capable of dissipating these gases, the propulsive force of an explosive eruption arises from the gas expansion and the buildup of subsurface overpressures. Thus, characterizing the permeability of volcanic rocks under in situ conditions (high temperature and pressure) allows us to better understand the outgassing potential and explosivity of volcanic systems. Current studies of the permeability of volcanic rocks generally measure permeability at room temperature using gas permeameters, or model permeability using analytic imaging. Our goal is to perform and assess permeability measurements made at high temperature and high pressure in the interest of approaching the permeability of the samples at magmatic conditions. We measure the permeability of andesitic samples expelled during the 2010 Mt. Merapi eruption. We employ and compare two protocols for measuring permeability at high temperature and under high pressure using argon gas in an internally heated Paterson apparatus with an isolated pore fluid system. We first use the pulse decay method to measure the permeability of our samples, then compare these values to permeability measurements performed under steady state flow. We consider the steady state flow method the more rigorous of the two protocols, as we are better able to account for the temperature gradient within the entire pore fluid system. At temperatures in excess of 700°C and pressures of 100 MPa, permeability values plummet by several orders of magnitude. These values are significantly lower than those commonly reported for room temperature permeameter measurements. The reduction in permeability at high temperature is a
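    For the pulse decay method mentioned above, permeability is commonly extracted from the exponential decay constant of the up/downstream pressure difference (a Brace-type analysis). A hedged sketch follows; the decay constant, geometry, and fluid properties are all invented, and the simple liquid-type formulation is used only as an illustration of the analysis step.

```python
def pulse_decay_permeability(decay, mu, beta, L, A, v_up, v_down):
    """Brace-type pulse-decay analysis: with dP(t) = dP0 * exp(-decay * t),
    k = decay * mu * beta * L / (A * (1/v_up + 1/v_down)).
    mu: fluid viscosity (Pa*s), beta: fluid compressibility (1/Pa),
    L, A: sample length (m) and cross-section (m2), v_*: reservoir volumes (m3)."""
    return decay * mu * beta * L / (A * (1.0 / v_up + 1.0 / v_down))

# Illustrative numbers only (not measurements from the study):
k = pulse_decay_permeability(
    decay=1.0e-3,                 # 1/s, slope fitted from ln(dP) vs t
    mu=2.2e-5,                    # Pa*s, approximate argon viscosity when hot
    beta=1.0e-8,                  # 1/Pa, approximate gas compressibility at pore pressure
    L=0.02, A=3.1e-4,             # sample length and cross-sectional area
    v_up=5.0e-6, v_down=5.0e-6,   # upstream/downstream reservoir volumes
)
# k comes out in m2; tight volcanic samples sit many orders below 1e-12 m2 (1 darcy).
```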

  4. Safety Assessment for a Surface Repository in the Chernobyl Exclusion Zone - Methodology for Assessing Disposal under Intervention Conditions - 13476

    SciTech Connect

    Haverkamp, B.; Krone, J.; Shybetskyi, I.

    2013-07-01

    The Radioactive Waste Disposal Facility (RWDF) Buryakovka was constructed in 1986 as part of the intervention measures after the accident at Chernobyl NPP (ChNPP). Today, RWDF Buryakovka is still being operated, but its maximum capacity is nearly reached. Plans for enlargement of the facility have existed for more than 10 years but have not yet been implemented. In the framework of a European Commission project, DBE Technology GmbH prepared a safety analysis report for the facility in its current state (SAR) and a preliminary safety analysis report (PSAR) based on the planned enlargement. Due to its history, RWDF Buryakovka does not fully comply with today's best international practices or the latest Ukrainian regulations in this area. The most critical aspects are its inventory of long-lived radionuclides and the non-existent multi-barrier waste confinement system. A significant part of the project was therefore dedicated to developing a methodology for the safety assessment that takes into consideration the facility's special situation, and to reaching an agreement with all stakeholders involved in the later review and approval procedure for the safety analysis reports. The main feature of the agreed methodology was to analyze safety based not strictly on regulatory requirements but on an assessment of the actual situation of the facility, including its location within the Exclusion Zone. For both safety analysis reports, SAR and PSAR, the assessment of long-term safety led to results that were either within regulatory limits or within limits allowing for a situation-specific evaluation by the regulator. (authors)

  5. The application of conditioning paradigms in the measurement of pain

    PubMed Central

    Li, Jun-Xu

    2013-01-01

    Pain is a private experience that involves both sensory and emotional components. Pain in animals can only be inferred from their responses, and therefore measurements of reflexive responses have dominated the pain literature for nearly a century. It has been argued that although reflexive responses are important for unveiling the sensory nature of pain in organisms, pain affect is equally important but largely ignored in pain studies, primarily due to the lack of validated animal models. One strategy for beginning to understand pain affect is to use conditioning principles to indirectly reveal the affective component of pain. This review critically analyzes several procedures that are thought to measure affective learning of pain. Current knowledge of these procedures, their applications, and their advantages and disadvantages in pain research are discussed. It is proposed that these procedures should be combined with traditional reflex-based pain measurements in future studies of pain, which could greatly benefit both the understanding of the neural underpinnings of pain and the preclinical assessment of novel analgesics. PMID:23500202

  6. Anonymous indexing of health conditions for a similarity measure.

    PubMed

    Song, Insu; Marsh, Nigel V

    2012-07-01

    A health social network is an online information service which facilitates information sharing between closely related members of a community with the same or a similar health condition. Over the years, many automated recommender systems have been developed for social networking in order to help users find their communities of interest. For health social networking, the ideal source of information for measuring similarities of patients is the medical information of the patients. However, it is not desirable that such sensitive and private information be shared over the Internet. This is also true for many other security-sensitive domains. A new information-sharing scheme is developed where each patient is represented as a small number of (possibly disjoint) d-words (discriminant words) and the d-words are used to measure similarities between patients without revealing sensitive personal information. The d-words are simple words like "food," and thus do not contain identifiable personal information. This makes our method an effective one-way hashing of patient assessments for a similarity measure. The d-words can be easily shared on the Internet to find peers who might have similar health conditions. PMID:22531815
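    The privacy argument rests on comparing only the shared d-word sets. A minimal sketch of one plausible set-overlap similarity (Jaccard); the abstract does not specify the exact measure, and the d-word sets below are invented.

```python
def dword_similarity(a, b):
    """Jaccard similarity between two d-word sets: |intersection| / |union|.
    One plausible choice of overlap measure; the paper's exact measure may differ."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Each patient is represented only by non-identifying d-words (invented here),
# so this comparison leaks no sensitive medical record content.
patient_a = {"food", "sleep", "fatigue"}
patient_b = {"food", "fatigue", "mood"}
sim = dword_similarity(patient_a, patient_b)   # 2 shared words / 4 distinct words
```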

  7. Current psychometric and methodological issues in the measurement of overgeneral autobiographical memory.

    PubMed

    Griffith, James W; Sumner, Jennifer A; Raes, Filip; Barnhofer, Thorsten; Debeer, Elise; Hermans, Dirk

    2012-12-01

    Autobiographical memory is a multifaceted construct that is related to psychopathology and other difficulties in functioning. Across many studies, a variety of methods have been used to study autobiographical memory. The relationship between overgeneral autobiographical memory (OGM) and psychopathology has been of particular interest, and many studies of this cognitive phenomenon rely on the Autobiographical Memory Test (AMT) to assess it. In this paper, we examine several methodological approaches to studying autobiographical memory, and focus primarily on methodological and psychometric considerations in OGM research. We pay particular attention to what is known about the reliability, validity, and methodological variations of the AMT. The AMT has adequate psychometric properties, but there is great variability in methodology across studies that use it. Methodological recommendations and suggestions for future studies are presented. PMID:23200427

  8. Development and validation of a continuous measure of patient condition using the Electronic Medical Record.

    PubMed

    Rothman, Michael J; Rothman, Steven I; Beals, Joseph

    2013-10-01

    Patient condition is a key element in communication between clinicians. However, there is no generally accepted definition of patient condition that is independent of diagnosis and that spans acuity levels. We report the development and validation of a continuous measure of general patient condition that is independent of diagnosis, and that can be used for medical-surgical as well as critical care patients. A survey of Electronic Medical Record data identified common, frequently collected non-static candidate variables as the basis for a general, continuously updated patient condition score. We used a new methodology to estimate in-hospital risk associated with each of these variables. A risk function for each candidate input was computed by comparing the final pre-discharge measurements with 1-year post-discharge mortality. Step-wise logistic regression of the variables against 1-year mortality was used to determine the importance of each variable. The final set of selected variables consisted of 26 clinical measurements from four categories: nursing assessments, vital signs, laboratory results and cardiac rhythms. We then constructed a heuristic model quantifying patient condition (overall risk) by summing the single-variable risks. The model's validity was assessed against outcomes from 170,000 medical-surgical and critical care patients, using data from three US hospitals. Outcome validation across hospitals yields an area under the receiver operating characteristic curve (AUC) of ≥0.92 when separating hospice/deceased from all other discharge categories, an AUC of ≥0.93 when predicting 24-h mortality and an AUC of 0.62 when predicting 30-day readmissions. Correspondence with outcomes reflective of patient condition across the acuity spectrum indicates utility in both medical-surgical units and critical care units. The model output, which we call the Rothman Index, may provide clinicians with a longitudinal view of patient condition to help address known
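    The heuristic structure described (overall risk as a sum of independently estimated single-variable risks) can be sketched as follows. The three risk functions are invented placeholders for illustration, not the published Rothman Index excess-risk curves, which span 26 variables.

```python
def risk_heart_rate(hr):
    """Hypothetical U-shaped risk: lowest near 70 bpm."""
    return 0.002 * (hr - 70) ** 2

def risk_respiratory_rate(rr):
    """Hypothetical risk rising away from a normal rate of 16/min."""
    return 0.01 * abs(rr - 16)

def risk_nursing_assessment(failed_assessments):
    """Hypothetical risk per failed standard-of-care nursing assessment."""
    return 0.5 * failed_assessments

def condition_score(hr, rr, failed_assessments):
    """Overall risk = sum of single-variable risks (higher = worse).
    The published score flips and scales this so higher means healthier."""
    return (risk_heart_rate(hr)
            + risk_respiratory_rate(rr)
            + risk_nursing_assessment(failed_assessments))

stable = condition_score(hr=72, rr=16, failed_assessments=0)
deteriorating = condition_score(hr=120, rr=28, failed_assessments=3)
```

    The design choice worth noting is additivity: each risk function is calibrated on its own against 1-year post-discharge mortality, so the model needs no joint refit when a variable is missing or added.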

  9. High Accuracy Temperature Measurements Using RTDs with Current Loop Conditioning

    NASA Technical Reports Server (NTRS)

    Hill, Gerald M.

    1997-01-01

    To measure temperatures with a greater degree of accuracy than is possible with thermocouples, RTDs (Resistive Temperature Detectors) are typically used. Calibration standards use specialized high-precision RTD probes with accuracies approaching 0.001 F. These are extremely delicate devices, and far too costly to be used in test facility instrumentation. Less costly sensors, designed for aeronautical wind tunnel testing, are available and can be readily adapted to probes, rakes, and test rigs. With proper signal conditioning of the sensor, temperature accuracies of 0.1 F are obtainable. For reasons that will be explored in this paper, the Anderson current loop is the preferred method of signal conditioning. This scheme has been used in NASA Lewis Research Center's 9 x 15 Low Speed Wind Tunnel, and is detailed here.
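    For a platinum RTD the signal chain reduces to two steps: recover resistance from the current-loop reading (V/I, insensitive to lead resistance when voltage is sensed at the element), then invert the Callendar-Van Dusen polynomial. A sketch using the standard IEC 60751 Pt100 coefficients, valid above 0 °C; the excitation values are illustrative, not from the paper.

```python
from math import sqrt

R0 = 100.0       # ohms at 0 °C (Pt100)
A = 3.9083e-3    # IEC 60751 Callendar-Van Dusen coefficient, 1/°C
B = -5.775e-7    # IEC 60751 Callendar-Van Dusen coefficient, 1/°C^2

def resistance_from_loop(v_sense, i_loop):
    """In a current loop the element resistance is simply V/I: the same
    excitation current flows through every series element, so lead and
    connector resistances drop out of the sensed voltage."""
    return v_sense / i_loop

def temperature_from_resistance(r):
    """Invert R = R0*(1 + A*T + B*T**2) for T >= 0 °C (quadratic formula)."""
    return (-A + sqrt(A * A - 4.0 * B * (1.0 - r / R0))) / (2.0 * B)

# Example: 1 mA excitation, 138.5055 mV sensed across the element.
r = resistance_from_loop(0.1385055, 1.0e-3)   # -> 138.5055 ohm
t = temperature_from_resistance(r)            # -> close to 100 °C
```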

  10. Innovative Methodologies for thermal Energy Release Measurement: case of La Solfatara volcano (Italy)

    NASA Astrophysics Data System (ADS)

    Marfe`, Barbara; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Marotta, Enrica; Peluso, Rosario

    2015-04-01

    This work is devoted to improving knowledge of the parameters that control the heat flux anomalies associated with the diffuse degassing processes of volcanic and hydrothermal areas. The methodologies currently used to measure heat flux (i.e., CO2 flux or temperature gradient) are either inefficient or ineffective, and are unable to detect short- to medium-term (days to months) variation trends in the heat flux. A new method, based on the use of thermal imaging cameras, has been applied to estimate the heat flux and its time variations. This approach allows faster heat flux measurement than already accredited methods, improving the definition of the activity state of a volcano and allowing a better assessment of the related hazard and risk mitigation. The idea is to extrapolate the heat flux from the ground surface temperature, which, in a purely conductive regime, is directly correlated to the shallow temperature gradient. We use thermal imaging cameras at short distances (meters to hundreds of meters) to quickly map areas with thermal anomalies and measure their temperature. Preliminary studies have been carried out throughout the whole of the La Solfatara crater in order to investigate a possible correlation between the surface temperature and the shallow thermal gradient. We used a FLIR SC640 thermal camera and K-type thermocouples to make the two measurements at the same time. Results suggest a good correlation between the shallow temperature gradient ΔTs and the background-corrected surface temperature Ts, and although the campaigns took place over a period of a few years, this correlation seems to be stable over time. This is an extremely motivating result for the further development of a measurement method based only on the use of a short-range thermal imaging camera. 
    Surveys with thermal cameras may be done manually, using a tripod to take thermal images of small contiguous areas and then joining
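    The proposed workflow reduces to a calibration regression plus a conductive flux estimate: regress the thermocouple-measured shallow gradient on the background-corrected camera temperature, then map each pixel to q = k * dT/dz. The calibration pairs and ground conductivity below are invented for illustration.

```python
k_ground = 1.2   # W/(m*K), assumed thermal conductivity of the crater ground

# Paired calibration data (invented): background-corrected surface temperature
# from the camera (°C) vs thermocouple-measured shallow gradient (K/m).
surface_dT = [2.0, 5.0, 9.0, 14.0, 20.0]
gradient = [40.0, 95.0, 180.0, 285.0, 400.0]

# Least-squares line: gradient = intercept + slope * surface_dT.
n = len(surface_dT)
mx = sum(surface_dT) / n
my = sum(gradient) / n
slope = sum((x - mx) * (y - my) for x, y in zip(surface_dT, gradient)) / \
        sum((x - mx) ** 2 for x in surface_dT)
intercept = my - slope * mx

def heat_flux(surface_dt_pixel):
    """Conductive heat flux (W/m2) estimated for one camera pixel, assuming
    the purely conductive regime described in the abstract."""
    grad = intercept + slope * surface_dt_pixel
    return k_ground * grad

q = heat_flux(10.0)   # flux at a pixel 10 °C above background
```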

  11. Three-dimensional force measurements on oral implants: a methodological study.

    PubMed

    Duyck, J; Van Oosterwyck, H; De Cooman, M; Puers, R; Vander Sloten, J; Naert, I

    2000-09-01

    This paper describes a methodology that allows in vitro and in vivo quantification and qualification of forces on oral implants. Strain gauges are attached to the outer surface of 5.5 and 7 mm standard abutments (Brånemark System, Nobel Biocare, Sweden). The readings of the strain gauges are transformed into a numerical representation of the normal force and the bending moments around the X- and Y-axes. The hardware and software of the 3D measuring device, based on strain gauge technology, are explained, and its accuracy and reliability are tested. The accuracy levels for axial forces and bending moments are 9.72 N and 2.5 N x cm, respectively, based on the current techniques for strain-gauged abutments. As an example, an in vivo force analysis was performed in a patient with a full fixed prosthesis in the mandible. Since axial loads of 450 N and bending moments of 70 N x cm were recorded, it was concluded that the accuracy of the device falls well within the scope of our needs. Nevertheless, more in vivo research is needed before well-defined conclusions can be drawn and strategies developed to improve the biomechanics of oral implants. PMID:11012848
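    The strain-to-load transformation can be illustrated for four gauges placed at 0°, 90°, 180° and 270° around a tubular abutment wall: the mean strain carries the axial force, and the difference of opposite gauges carries each bending moment. The geometry, modulus, and gauge layout below are assumptions for illustration; the paper's actual device is calibrated empirically.

```python
from math import pi

E = 110e9                    # Pa, elastic modulus (titanium, assumed)
r_o, r_i = 2.0e-3, 1.0e-3    # m, assumed outer/inner abutment radii
A = pi * (r_o ** 2 - r_i ** 2)         # cross-sectional area
I = pi * (r_o ** 4 - r_i ** 4) / 4.0   # second moment of area

def forces_from_strains(e0, e90, e180, e270):
    """Axial force N and bending moments Mx, My from four surface strains.

    Axial strain is the mean of the four gauges; the bending strain about
    each axis is half the difference of opposite gauges, and the surface
    bending strain relates to moment via eps = M * r_o / (E * I)."""
    e_axial = (e0 + e90 + e180 + e270) / 4.0
    N = E * A * e_axial
    Mx = E * I * (e0 - e180) / (2.0 * r_o)
    My = E * I * (e90 - e270) / (2.0 * r_o)
    return N, Mx, My

# Pure axial load: all gauges read the same strain, so both moments vanish.
N, Mx, My = forces_from_strains(1e-4, 1e-4, 1e-4, 1e-4)
```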

  12. Measurement error and outcome distributions: Methodological issues in regression analyses of behavioral coding data

    PubMed Central

    Holsclaw, Tracy; Hallgren, Kevin A.; Steyvers, Mark; Smyth, Padhraic; Atkins, David C.

    2015-01-01

    Behavioral coding is increasingly used for studying mechanisms of change in psychosocial treatments for substance use disorders (SUDs). However, behavioral coding data typically include features that can be problematic in regression analyses, including measurement error in independent variables, non-normal distributions of count outcome variables, and conflation of predictor and outcome variables with third variables, such as session length. Methodological research in econometrics has shown that these issues can lead to biased parameter estimates, inaccurate standard errors, and increased type-I and type-II error rates, yet these statistical issues are not widely known within SUD treatment research, or more generally, within psychotherapy coding research. Using minimally-technical language intended for a broad audience of SUD treatment researchers, the present paper illustrates the nature in which these data issues are problematic. We draw on real-world data and simulation-based examples to illustrate how these data features can bias estimation of parameters and interpretation of models. A weighted negative binomial regression is introduced as an alternative to ordinary linear regression that appropriately addresses the data characteristics common to SUD treatment behavioral coding data. We conclude by demonstrating how to use and interpret these models with data from a study of motivational interviewing. SPSS and R syntax for weighted negative binomial regression models is included in supplementary materials. PMID:26098126

  13. Measurement error and outcome distributions: Methodological issues in regression analyses of behavioral coding data.

    PubMed

    Holsclaw, Tracy; Hallgren, Kevin A; Steyvers, Mark; Smyth, Padhraic; Atkins, David C

    2015-12-01

    Behavioral coding is increasingly used for studying mechanisms of change in psychosocial treatments for substance use disorders (SUDs). However, behavioral coding data typically include features that can be problematic in regression analyses, including measurement error in independent variables, non-normal distributions of count outcome variables, and conflation of predictor and outcome variables with third variables, such as session length. Methodological research in econometrics has shown that these issues can lead to biased parameter estimates, inaccurate standard errors, and increased Type I and Type II error rates, yet these statistical issues are not widely known within SUD treatment research, or more generally, within psychotherapy coding research. Using minimally technical language intended for a broad audience of SUD treatment researchers, the present paper illustrates the nature in which these data issues are problematic. We draw on real-world data and simulation-based examples to illustrate how these data features can bias estimation of parameters and interpretation of models. A weighted negative binomial regression is introduced as an alternative to ordinary linear regression that appropriately addresses the data characteristics common to SUD treatment behavioral coding data. We conclude by demonstrating how to use and interpret these models with data from a study of motivational interviewing. SPSS and R syntax for weighted negative binomial regression models is included in online supplemental materials. PMID:26098126
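    As an illustration of the recommended model class, a weighted negative binomial regression with a single predictor and a fixed, known dispersion can be fit by Fisher scoring in a few lines of pure Python. This is a minimal stand-in for the SPSS/R routines the authors supply (which also estimate the dispersion); the counts and weights below are invented.

```python
from math import exp, log

def fit_weighted_negbin(xs, ys, ws, alpha=0.5, iters=50):
    """Fisher-scoring (IRLS) fit of mean mu = exp(b0 + b1*x) under a negative
    binomial likelihood with known dispersion alpha and case weights w.
    Per-case score factor: w*(y - mu)/(1 + alpha*mu);
    per-case Fisher weight: w*mu/(1 + alpha*mu)."""
    b0, b1 = log(sum(ys) / len(ys)), 0.0   # start at the weighted-mean model
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y, w in zip(xs, ys, ws):
            mu = exp(b0 + b1 * x)
            s = w * (y - mu) / (1.0 + alpha * mu)   # score contribution
            f = w * mu / (1.0 + alpha * mu)         # Fisher information weight
            g0 += s
            g1 += s * x
            h00 += f
            h01 += f * x
            h11 += f * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det           # solve the 2x2 scoring step
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Session-level counts near mu = exp(2.0 + 0.25*x); the weights crudely
# down-weight the longer sessions (a stand-in for session-length adjustment).
xs = list(range(11))
ys = [round(exp(2.0 + 0.25 * x)) for x in xs]
ws = [1.0] * 6 + [0.5] * 5
b0, b1 = fit_weighted_negbin(xs, ys, ws)
```

    The `(1 + alpha*mu)` factor is what distinguishes this from weighted Poisson regression: it shrinks the influence of high-mean observations, reflecting the overdispersion typical of behavioral count data.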

  14. Thermal-Diffusivity Measurements of Mexican Citrus Essential Oils Using Photoacoustic Methodology in the Transmission Configuration

    NASA Astrophysics Data System (ADS)

    Muñoz, G. A. López; González, R. F. López; López, J. A. Balderas; Martínez-Pérez, L.

    2011-05-01

    Photoacoustic methodology in the transmission configuration (PMTC) was used to study the thermophysical properties of Mexican citrus essential oils and their relation to composition, demonstrating the viability of photothermal techniques for quality control and for authentication of oils and detection of their adulteration. Linear relations for the amplitude (on a semi-log scale) and phase, as functions of sample thickness, were obtained by fitting a theoretical model for the PMTC to the experimental data, yielding thermal-diffusivity measurements for centrifuged Mexican orange, pink grapefruit, mandarin, and lime type A essential oils, and for Mexican distilled lime essential oil. Gas chromatography of the distilled lime essential oil and the centrifuged lime essential oil type A is reported to complement the study. Experimental results showed similar thermal-diffusivity values among the Mexican citrus essential oils obtained by centrifugation, but a significant difference between distilled lime oil and the corresponding oil obtained by centrifugation, owing to the different chemical compositions resulting from the two extraction processes.
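    The thermal-diffusivity extraction follows from the stated linear amplitude relation: for a thermally thick sample in transmission, ln(amplitude) falls with thickness at slope -sqrt(pi*f/alpha), so alpha = pi*f/slope**2. A sketch with synthetic data; the modulation frequency and diffusivity value are assumptions, not the paper's measurements.

```python
from math import pi, exp, log, sqrt

f = 1.0              # Hz, modulation frequency (assumed)
alpha_true = 8.0e-8  # m2/s, a typical order of magnitude for oils (assumed)

# Synthetic photoacoustic amplitudes at several sample thicknesses (m),
# following amplitude ~ exp(-l * sqrt(pi*f/alpha)).
thicknesses = [50e-6, 100e-6, 150e-6, 200e-6, 250e-6]
amps = [exp(-l * sqrt(pi * f / alpha_true)) for l in thicknesses]

# Linear fit of ln(amplitude) vs thickness.
xs, ys = thicknesses, [log(a) for a in amps]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

alpha = pi * f / slope ** 2   # recovered thermal diffusivity, m2/s
```

    The phase fit offers the same slope and serves as an internal consistency check on the amplitude-based value.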

  15. An analysis of the pilot point methodology for automated calibration of an ensemble of conditionally simulated transmissivity fields

    USGS Publications Warehouse

    Cooley, R.L.

    2000-01-01

    An analysis of the pilot point method for automated calibration of an ensemble of conditionally simulated transmissivity fields was conducted on the basis of the simplifying assumption that the flow model is a linear function of log transmissivity. The analysis shows that the pilot point and conditional simulation method of model calibration and uncertainty analysis can produce accurate uncertainty measures if it can be assumed that errors of unknown origin in the differences between observed and model-computed water pressures are small. When this assumption is not met, the method could yield significant errors from overparameterization and the neglect of potential sources of model inaccuracy. The conditional simulation part of the method is also shown to be a variant of the percentile bootstrap method, so that when applied to a nonlinear model, the method is subject to bootstrap errors. These sources of error must be considered when using the method.

  16. A METHODOLOGY TO INTEGRATE MAGNETIC RESONANCE AND ACOUSTIC MEASUREMENTS FOR RESERVOIR CHARACTERIZATION

    SciTech Connect

    Jorge O. Parra; Chris L. Hackert; Lorna L. Wilson

    2002-09-20

    The work reported herein represents the third year of development efforts on a methodology to interpret magnetic resonance and acoustic measurements for reservoir characterization. In this last phase of the project we characterize a vuggy carbonate aquifer in the Hillsboro Basin, Palm Beach County, South Florida, using two data sets--the first generated by velocity tomography and the second generated by reflection tomography. First, we integrate optical macroscopic (OM), scanning electron microscope (SEM) and x-ray computed tomography (CT) images, as well as petrography, as a first step in characterizing the aquifer pore system. This pore scale integration provides information with which to evaluate nuclear magnetic resonance (NMR) well log signatures for NMR well log calibration, interpret ultrasonic data, and characterize flow units at the field scale between two wells in the aquifer. Saturated and desaturated NMR core measurements estimate the irreducible water in the rock and the variable T{sub 2} cut-offs for the NMR well log calibration. These measurements establish empirical equations to extract permeability from NMR well logs. Velocity and NMR-derived permeability and porosity relationships integrated with velocity tomography (based on crosswell seismic measurements recorded between two wells 100 m apart) capture two flow units that are supported with pore scale integration results. Next, we establish a more detailed picture of the complex aquifer pore structures and the critical role they play in water movement, which aids in our ability to characterize not only carbonate aquifers, but reservoirs in general. We analyze petrography and cores to reveal relationships between the rock physical properties that control the compressional and shear wave velocities of the formation. 
A digital thin section analysis provides the pore size distributions of the rock matrix, which allows us to relate pore structure to permeability and to characterize flow units at the

  17. A Novel Fiber Bragg Grating Based Sensing Methodology for Direct Measurement of Surface Strain on Body Muscles during Physical Exercises

    NASA Astrophysics Data System (ADS)

    Prasad Arudi Subbarao, Guru; Subbaramajois Narasipur, Omkar; Kalegowda, Anand; Asokan, Sundarrajan

    2012-07-01

    The present work proposes a new sensing methodology, which uses Fiber Bragg Gratings (FBGs) to measure in vivo the surface strain and strain rate on calf muscles while performing certain exercises. Two simple exercises, namely ankle dorsi-flexion and ankle plantar-flexion, have been considered and the strain induced on the medial head of the gastrocnemius muscle while performing these exercises has been monitored. The real time strain generated has been recorded and the results are compared with those obtained using a commercial Color Doppler Ultrasound (CDU) system. It is found that the proposed sensing methodology is promising for surface strain measurements in biomechanical applications.

  18. Methodological issues in ethnic and racial identity research with ethnic minority populations: theoretical precision, measurement issues, and research designs.

    PubMed

    Schwartz, Seth J; Syed, Moin; Yip, Tiffany; Knight, George P; Umaña-Taylor, Adriana J; Rivas-Drake, Deborah; Lee, Richard M

    2014-01-01

    This article takes stock of research methods employed in the study of racial and ethnic identity with ethnic minority populations. The article is presented in three parts. The first section reviews theories, conceptualizations, and measurement of ethnic and racial identity (ERI) development. The second section reviews theories, conceptualizations, and measurement of ERI content. The final section reviews key methodological and analytic principles that are important to consider for both ERI development and content. The article concludes with suggestions for future research addressing key methodological limitations when studying ERI. PMID:24490892

  19. A simple methodology to calibrate local meltrate function from ground observation in absence of density measurements

    NASA Astrophysics Data System (ADS)

    Mariani, Davide; Guyennon, Nicolas; Maggioni, Margherita; Salerno, Franco; Romano, Emanuele

    2015-04-01

    Snowpack melting can represent a major contribution to the seasonal variability of the surface and ground water budget at the basin scale: the snowpack, acting as a natural reservoir, stores water during the winter season and releases it during spring and summer. The radiative budget driving the melting process depends on numerous variables that may be affected by ongoing climatic changes. As a result, a shift in the timing of spring and summer discharge may significantly affect surface water management at the basin scale. For this reason, a reliable model able to quantitatively describe the snow melting processes is also very important for management purposes. The estimation of the snowmelt rate requires a full energy (mass) balance assessment of the snowpack. The limited availability of the necessary data often does not allow implementing a radiative (mass) balance model, and snow density observations are often missing. As an alternative, we propose here a simple methodology to reconstruct the daily snowmelt and the associated melt rate function based only on solid precipitation, mean air temperature, and snowpack depth measurements. The model differentiates between the melting and compaction processes based on a daily mean temperature threshold (i.e., above or below the freezing point) and the snowpack state. The snowpack is described by a two-layer model, each layer with its own depth and density. The first layer is a fresh snow surface layer whose density is a constant parameter; it is modulated by the daily snowfall/melting budget, or it can be compacted and embedded within the second layer. The second layer is the ripe snow, whose density is a depth-weighted average of the antecedent snowpack and any contribution from the first layer. The two snow layers allow melting to start at the top of the snowpack, where the density is lower, while most of the water is released when the ripe snow starts melting during the late melting season. Finally, we estimate the associated degree-day and
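
    The degree-day bookkeeping that such a model relies on can be sketched as follows; the degree-day factor, temperature threshold, and fresh-snow density below are hypothetical placeholder values, not the calibrated values of the study:

    ```python
    # Illustrative degree-day snowmelt step (all parameter values hypothetical).
    DDF = 3.0          # degree-day factor, mm w.e. per degC per day
    T_THRESHOLD = 0.0  # freezing-point threshold, degC
    RHO_FRESH = 100.0  # constant fresh-snow density, kg/m^3

    def daily_melt(mean_temp_c, snow_water_equivalent_mm):
        """Daily melt (mm w.e.) from a classical degree-day relation,
        capped by the water actually stored in the snowpack."""
        if mean_temp_c <= T_THRESHOLD:
            return 0.0
        potential = DDF * (mean_temp_c - T_THRESHOLD)
        return min(potential, snow_water_equivalent_mm)

    print(daily_melt(2.5, 500.0))   # 7.5 mm of melt on a +2.5 degC day
    print(daily_melt(-4.0, 500.0))  # 0.0: no melt below freezing
    ```

    The two-layer scheme of the abstract would apply such a step to the fresh layer first, transferring compacted snow into the ripe layer on sub-freezing days.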

  20. Reverse micellar extraction of lectin from black turtle bean (Phaseolus vulgaris): optimisation of extraction conditions by response surface methodology.

    PubMed

    He, Shudong; Shi, John; Walid, Elfalleh; Zhang, Hongwei; Ma, Ying; Xue, Sophia Jun

    2015-01-01

    Lectin from black turtle bean (Phaseolus vulgaris) was extracted and purified by a reverse micellar extraction (RME) method. Response surface methodology (RSM) was used to optimise the processing parameters for both forward and backward extraction. Hemagglutinating activity analysis, SDS-PAGE, RP-HPLC and FTIR techniques were used to characterise the lectin. The optimum extraction conditions were determined as 77.59 mM NaCl, pH 5.65, and 127.44 mM sodium bis(2-ethylhexyl) sulfosuccinate (AOT) for the forward extraction, and 592.97 mM KCl, pH 8.01 for the backward extraction. The yield was 63.21 ± 2.35 mg protein/g bean meal with a purification factor of 8.81 ± 0.17. The efficiency of RME was confirmed by SDS-PAGE and RP-HPLC. FTIR analysis indicated there were no significant changes in the secondary protein structure. Comparison with a conventional chromatographic method confirmed that the RME method could be used for the purification of lectin from the crude extract. PMID:25053033
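
    The core of an RSM optimisation is fitting a second-order model to designed experiments and locating its stationary point. A minimal one-factor sketch with synthetic data (illustrative only, not the lectin data above) looks like this:

    ```python
    import numpy as np

    # Synthetic one-factor response surface: y = b0 + b1*x + b2*x^2
    # with an interior maximum at x = -b1/(2*b2) = 0.25 (true values below).
    x = np.linspace(-2, 2, 9)
    rng = np.random.default_rng(1)
    y = 5.0 + 1.0 * x - 2.0 * x**2 + rng.normal(0, 0.05, x.size)

    # Fit the second-order model used in RSM and find the stationary point.
    b2, b1, b0 = np.polyfit(x, y, 2)
    x_opt = -b1 / (2.0 * b2)
    print(round(x_opt, 2))  # close to the true optimum at x = 0.25
    ```

    A real RSM study such as this one fits the same kind of quadratic in several coded factors (NaCl, pH, AOT) and solves for the joint stationary point.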

  1. Non-invasive ultrasound based temperature measurements at reciprocating screw plastication units: Methodology and applications

    NASA Astrophysics Data System (ADS)

    Straka, Klaus; Praher, Bernhard; Steinbichler, Georg

    2015-05-01

    Previous attempts to accurately measure the real polymer melt temperature in the screw chamber as well as in the screw channels have failed on account of the challenging metrological boundary conditions (high pressure, high temperature, rotational and axial screw movement). We developed a novel ultrasound system - based on reflection measurements - for the online determination of these important process parameters. Using available pressure-volume-temperature (pvT) data for a polymer, it is possible to estimate the density and adiabatic compressibility of the material and therefore the pressure- and temperature-dependent longitudinal ultrasound velocity. From the measured ultrasonic reflection times from the screw root and barrel wall, together with the pressure, it is possible to calculate the mean temperature in the screw channel or in the chamber in front of the screw (in contrast to flush-mounted infrared or thermocouple probes). By means of the system described above, we are able to measure axial profiles of the mean temperature in the screw chamber. The data gathered by the measurement system can be used to develop control strategies for the plastication process to reduce temperature gradients within the screw chamber, or as input data for injection moulding simulation.
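
    A minimal sketch of the temperature inversion described above, assuming a hypothetical linearized velocity-temperature relation at fixed pressure (the constants below are placeholders, not the pvT data of the paper):

    ```python
    # Hypothetical linearized model: c(T) = C0 + DC_DT * (T - T0) at fixed pressure.
    C0 = 1000.0   # m/s at the reference temperature (placeholder melt value)
    T0 = 200.0    # degC, reference temperature
    DC_DT = -2.5  # m/s per degC (velocity drops as the melt heats)

    def mean_temperature(channel_depth_m, two_way_time_s):
        """Infer mean melt temperature from the two-way ultrasound travel time."""
        c_measured = 2.0 * channel_depth_m / two_way_time_s  # path: down and back
        return T0 + (c_measured - C0) / DC_DT

    # Example: 5 mm channel depth, 10.2 microseconds two-way reflection time.
    print(round(mean_temperature(0.005, 10.2e-6), 1))  # -> 207.8
    ```

    In the real system the velocity-temperature relation additionally depends on pressure, so the measured pressure enters the inversion as well.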

  2. Practical Experience of Discharge Measurement in Flood Conditions with ADP

    NASA Astrophysics Data System (ADS)

    Vidmar, A.; Brilly, M.; Rusjan, S.

    2009-04-01

    Accurate discharge estimation is important for efficient river basin management and especially for flood forecasting. The traditional way of estimating discharge in hydrological practice is to measure the water stage and convert the recorded water stage values into discharge by using a single-valued rating curve. The stage-discharge relationship of the rating curve for extreme events is usually extrapolated by different mathematical methods rather than being directly measured. Our practice shows that by using an Acoustic Doppler Profiler (ADP) we can very successfully record the actual relation between water stage and flow velocity during flood waves. Measurement in flood conditions is not an easy task, because of high water surface velocities, large amounts of sediment in the water, and floating objects on the surface such as branches, bushes, trees, and piles, which can also easily damage the ADP instrument. We made several measurements during such extreme events on the Sava River downstream of the Krško nuclear power plant, where we have installed a fixed cableway. During several measurements with the traditional "moving-boat" technique, a moving-bed phenomenon was clearly seen. To measure flow accurately using an ADP with the "moving-boat" technique, the system needs a reference against which to relate water velocities. This reference is the river bed, and it must not move. During flood events we had difficulty finding a static bed surface against which to relate the water velocities. This is caused by motion of the surface layer of bed material, or by sediment suspended very densely in the water near the bed. Thus the traditional "moving-boat" measurement techniques that we normally use fail completely. Using the stationary measurement method to make individual velocity profile measurements with an Acoustic Doppler Profiler (ADP) at fixed locations across the width of a stream at certain times gave
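
    For context, a conventional velocity-area (mid-section) discharge computation from stationary vertical profiles can be sketched as follows, with hypothetical station data:

    ```python
    # Mid-section method: Q = sum over verticals of depth * mean velocity * width slice.
    def midsection_discharge(stations, depths, velocities):
        """stations: distances across the section (m), in increasing order;
        depths (m) and mean velocities (m/s) at each station."""
        q = 0.0
        n = len(stations)
        for i in range(n):
            left = stations[i] - (stations[i - 1] if i > 0 else stations[0])
            right = (stations[i + 1] if i < n - 1 else stations[-1]) - stations[i]
            width = (left + right) / 2.0  # half-distance to each neighbour
            q += depths[i] * velocities[i] * width
        return q

    # Hypothetical cross-section: 5 verticals over an 8 m wide stream.
    stations = [0.0, 2.0, 4.0, 6.0, 8.0]
    depths = [0.0, 1.2, 2.0, 1.5, 0.0]
    velocities = [0.0, 0.8, 1.1, 0.9, 0.0]
    print(round(midsection_discharge(stations, depths, velocities), 2))  # -> 9.02 m^3/s
    ```

    Stationary ADP profiles provide the per-vertical mean velocities for exactly this kind of computation when the moving-boat bed reference fails.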

  3. Development of a system to measure local measurement conditions around textile electrodes.

    PubMed

    Kim, Saim; Oliveira, Joana; Roethlingshoefer, Lisa; Leonhard, Steffen

    2010-01-01

    The three main factors influencing the interface between a textile electrode and the skin are temperature, contact pressure, and relative humidity. This paper presents the first results from a prototype that measures these local measurement conditions around textile electrodes. The wearable prototype is a data acquisition system based on a microcontroller with a flexible sensor sleeve. Validation measurements included variation of ambient temperature, contact pressure, and sleeve material. The results show a good correlation with data found in the literature. PMID:21096676

  4. Lab measurements to support modeling terahertz propagation in brownout conditions

    NASA Astrophysics Data System (ADS)

    Fiorino, Steven T.; Grice, Phillip M.; Krizo, Matthew J.; Bartell, Richard J.; Haiducek, John D.; Cusumano, Salvatore J.

    2010-04-01

    Brownout, the loss of visibility caused by dust and debris introduced into the atmosphere by the downwash of a helicopter, currently represents a serious challenge to U.S. military operations in Iraq and Afghanistan, where it has been cited as a factor in the majority of helicopter accidents. Brownout not only reduces visibility, but can create visual illusions for the pilot and difficult conditions for crew beneath the aircraft. Terahertz imaging may provide one solution to this problem. Terahertz frequency radiation readily propagates through the dirt aerosols present in brownout, and therefore can provide an imaging capability to improve effective visibility for pilots, helping prevent the associated accidents. To properly model the success of such systems, it is necessary to determine the optical properties of such obscurants in the terahertz regime. This research attempts to empirically determine, and measure in the laboratory, the full complex index of refraction optical properties of dirt aerosols representative of brownout conditions. These properties are incorporated into the AFIT/CDE Laser Environmental Effects Definition and Reference (LEEDR) software, allowing this program to more accurately assess the propagation of terahertz radiation under brownout conditions than was done in the past with estimated optical properties.

  5. Inference of human affective states from psychophysiological measurements extracted under ecologically valid conditions

    PubMed Central

    Betella, Alberto; Zucca, Riccardo; Cetnarski, Ryszard; Greco, Alberto; Lanatà, Antonio; Mazzei, Daniele; Tognetti, Alessandro; Arsiwalla, Xerxes D.; Omedas, Pedro; De Rossi, Danilo; Verschure, Paul F. M. J.

    2014-01-01

    Compared to standard laboratory protocols, the measurement of psychophysiological signals in real world experiments poses technical and methodological challenges due to external factors that cannot be directly controlled. To address this problem, we propose a hybrid approach based on an immersive and human accessible space called the eXperience Induction Machine (XIM), that incorporates the advantages of a laboratory within a life-like setting. The XIM integrates unobtrusive wearable sensors for the acquisition of psychophysiological signals suitable for ambulatory emotion research. In this paper, we present results from two different studies conducted to validate the XIM as a general-purpose sensing infrastructure for the study of human affective states under ecologically valid conditions. In the first investigation, we recorded and classified signals from subjects exposed to pictorial stimuli corresponding to a range of arousal levels, while they were free to walk and gesticulate. In the second study, we designed an experiment that follows the classical conditioning paradigm, a well-known procedure in the behavioral sciences, with the additional feature that participants were free to move in the physical space, as opposed to similar studies measuring physiological signals in constrained laboratory settings. Our results indicate that, by using our sensing infrastructure, it is indeed possible to infer human event-elicited affective states through measurements of psychophysiological signals under ecological conditions. PMID:25309310

  6. A METHODOLOGY FOR ESTIMATING ARMY TRAINING AND TESTING AREA CARRYING CAPACITY (ATTACC) VEHICLE SEVERITY FACTORS AND LOCAL CONDITION FACTORS

    EPA Science Inventory

    The Army Training and Testing Area Carrying Capacity (ATTACC) program is a methodology for estimating training and testing land carrying capacity. The methodology is used to determine land rehabilitation and maintenance costs associated with land-based training. ATTACC is part of...

  7. "MARK I" MEASUREMENT METHODOLOGY FOR POLLUTION PREVENTION PROGRESS OCCURRING AS A RESULT OF PRODUCT DECISIONS

    EPA Science Inventory

    A methodology for assessing progress in pollution prevention resulting from product redesign, reformulation or replacement is described. The method compares the pollution generated by the original product with that from the modified or replacement product, taking into account, if...

  8. Suspended matter concentrations in coastal waters: Methodological improvements to quantify individual measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Röttgers, Rüdiger; Heymann, Kerstin; Krasemann, Hajo

    2014-12-01

    Measurements of total suspended matter (TSM) concentration and the discrimination of the particulate inorganic (PIM) and organic matter fractions by the loss-on-ignition method are susceptible to significant and contradictory bias errors from (a) retention of sea salt on the filter (despite washing with deionized water) and (b) loss of filter material during the washing and combustion procedures. Several methodological procedures have been described to avoid or correct the errors associated with these biases, but no analysis of the final uncertainty of the overall mass concentration determination has yet been performed. Typically, the exact values of these errors are unknown and can only be estimated. Measurements were performed in coastal and estuarine waters of the German Bight that allowed the individual error for each sample to be determined with respect to a systematic mass offset. This was achieved by filtering different volumes of the same sample and analyzing the mass-versus-volume relationship by linear regression. The results showed that the variation in the mass offset is much larger than expected (mean mass offset: 0.85 ± 0.84 mg, range: -2.4 to 7.5 mg) and that it often leads to rather large relative errors even when TSM concentrations are high. Similarly large variations were found in the mass offset for PIM measurements. Correction with a mean offset determined from procedural control filters reduced the maximum error to <60%. The determination error for the TSM concentration was <40% when three different volumes were used, and for the majority of the samples the error was <10%. When six different volumes were used and outliers were removed, the error was always <25%, and very often errors of only a few percent were obtained. The approach proposed here can determine the individual determination error for each sample, is independent of bias errors, can be used for TSM and PIM determination, and allows individual quality control for samples from coastal and estuarine waters. It should
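
    The mass-versus-volume regression described above can be sketched in a few lines; the sample numbers are hypothetical:

    ```python
    import numpy as np

    # Illustrative TSM filtration series: several volumes of the same sample
    # (hypothetical numbers). Model: mass = offset + concentration * volume.
    volumes_l = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0])
    masses_mg = np.array([5.8, 10.9, 15.7, 21.1, 30.8, 41.2])

    # Linear regression: the slope is the TSM concentration (mg/L);
    # the intercept is the per-filter mass offset (mg), i.e. the bias term.
    slope, intercept = np.polyfit(volumes_l, masses_mg, 1)
    print(f"concentration ~ {slope:.2f} mg/L, mass offset ~ {intercept:.2f} mg")
    ```

    Because the offset is estimated per sample rather than assumed from control filters, each sample carries its own determination error.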

  9. Optimization of Extraction Conditions for Maximal Phenolic, Flavonoid and Antioxidant Activity from Melaleuca bracteata Leaves Using the Response Surface Methodology.

    PubMed

    Hou, Wencheng; Zhang, Wei; Chen, Guode; Luo, Yanping

    2016-01-01

    Melaleuca bracteata is a yellow-leaved tree belonging to the Melaleuca genus. Species from this genus are known to be good sources of natural antioxidants; for example, the "tea tree oil" derived from M. alternifolia is used in food processing to extend the shelf life of products. In order to determine whether M. bracteata contains novel natural antioxidants, the components of M. bracteata ethanol extracts were analyzed by gas chromatography-mass spectrometry. Total phenolic and flavonoid contents were extracted and the antioxidant activities of the extracts evaluated. Single-factor experiments, a central composite rotatable design (CCRD), and response surface methodology (RSM) were used to optimize the extraction conditions for total phenolic content (TPC) and total flavonoid content (TFC). Ferric reducing power (FRP) and 1,1-diphenyl-2-picrylhydrazyl radical (DPPH·) scavenging capacity were used as the evaluation indices of antioxidant activity. The results showed that the main components of M. bracteata ethanol extracts are methyl eugenol (86.86%) and trans-cinnamic acid methyl ester (6.41%). The single-factor experiments revealed that ethanol concentration is the key factor determining the TPC, TFC, FRP and DPPH· scavenging capacity. RSM results indicated that the optimum for all four evaluation indices was achieved by extracting for 3.65 days at 53.26°C in 34.81% ethanol. Under these conditions, the TPC, TFC, FRP and DPPH· scavenging capacity reached 88.6 ± 1.3 mg GAE/g DW, 19.4 ± 0.2 mg RE/g DW, 2.37 ± 0.01 mM Fe2+/g DW and 86.0 ± 0.3%, respectively, higher than those of the positive control, methyl eugenol (FRP 0.97 ± 0.02 mM, DPPH· scavenging capacity 58.6 ± 0.7%) at comparable concentrations. Therefore, the extracts of M. bracteata leaves have higher antioxidant activity, which is not attributable solely to methyl eugenol. Further research could lead to the development of a potent new natural antioxidant. PMID

  10. Nuclear Structure Measurements of Fermium-254 and Advances in Target Production Methodologies

    NASA Astrophysics Data System (ADS)

    Gothe, Oliver Ralf

    The Berkeley Gas-filled Separator (BGS) has been upgraded with a new gas control system that allows accurate control of hydrogen and helium gas mixtures. This greatly increases the capabilities of the separator by reducing background signals in the focal plane detector for asymmetric nuclear reactions. It has also been shown that gas mixtures can be used to focus the desired reaction products into a smaller area, thereby increasing the experimental efficiency. A new electrodeposition cell has been developed to produce metal oxide targets for experiments at the BGS. The new cell has been characterized and was used to produce americium targets for the production of element 115 in the reaction 243Am(48Ca,3n)288115. Additionally, a new method of producing targets for nuclear reactions was explored. A procedure for producing targets via Polymer Assisted Deposition (PAD) was developed, and targets produced via this method were tested using the nuclear reaction 208Pb(40Ar,4n)244Fm to determine their in-beam performance. It was determined that the silicon nitride backings used in this procedure are not feasible due to their crystal structure, and alternative backing materials have been tested and proposed. A previously unknown level in 254Fm has been identified at 985.7 keV utilizing a newly developed low-background coincidence apparatus. 254No was produced in the reaction 208Pb(48Ca,n)254No, and the reaction products were guided to the two-clover low-background detector setup via a recoil transfer chamber. The new level has been assigned a spin of 2- and has tentatively been identified as the octupole vibration in 254Fm. Transporting evaporation residues to a two-clover, low-background detector setup can effectively be used to perform gamma-spectroscopy measurements of nuclei that are not accessible by current common methodologies. This technique provides an excellent addition to previously available tools such as in-beam spectroscopy and gamma-ray tracking arrays.

  11. Measurement in Learning Games Evolution: Review of Methodologies Used in Determining Effectiveness of "Math Snacks" Games and Animations

    ERIC Educational Resources Information Center

    Trujillo, Karen; Chamberlin, Barbara; Wiburg, Karin; Armstrong, Amanda

    2016-01-01

    This article captures the evolution of research goals and methodologies used to assess the effectiveness and impact of a set of mathematical educational games and animations for middle-school aged students. The researchers initially proposed using a mixed model research design of formative and summative measures, such as user-testing,…

  12. Use of Response Surface Methodology to Optimize Culture Conditions for Hydrogen Production by an Anaerobic Bacterial Strain from Soluble Starch

    NASA Astrophysics Data System (ADS)

    Kieu, Hoa Thi Quynh; Nguyen, Yen Thi; Dang, Yen Thi; Nguyen, Binh Thanh

    2016-05-01

    Biohydrogen is a clean source of energy that produces no harmful byproducts during combustion, being a potential sustainable energy carrier for the future. Therefore, biohydrogen produced by anaerobic bacteria via dark fermentation has attracted attention worldwide as a renewable energy source. However, the hydrogen production capability of these bacteria depends on major factors such as substrate, iron-containing hydrogenase, reduction agent, pH, and temperature. In this study, the response surface methodology (RSM) with central composite design (CCD) was employed to improve the hydrogen production by an anaerobic bacterial strain isolated from animal waste in Phu Linh, Soc Son, Vietnam (PL strain). The hydrogen production process was investigated as a function of three critical factors: soluble starch concentration (8 g L-1 to 12 g L-1), ferrous iron concentration (100 mg L-1 to 200 mg L-1), and l-cysteine concentration (300 mg L-1 to 500 mg L-1). RSM analysis showed that all three factors significantly influenced hydrogen production. Among them, the ferrous iron concentration presented the greatest influence. The optimum hydrogen concentration of 1030 mL L-1 medium was obtained with 10 g L-1 soluble starch, 150 mg L-1 ferrous iron, and 400 mg L-1 l-cysteine after 48 h of anaerobic fermentation. The hydrogen concentration produced by the PL strain was doubled after using RSM. The obtained results indicate that RSM with CCD can be used as a technique to optimize culture conditions for enhancement of hydrogen production by the selected anaerobic bacterial strain. Hydrogen production from low-cost organic substrates such as soluble starch using anaerobic fermentation methods may be one of the most promising approaches.

  13. Feasibility of density and viscosity measurements under ammonothermal conditions

    NASA Astrophysics Data System (ADS)

    Steigerwald, Thomas G.; Alt, Nicolas S. A.; Hertweck, Benjamin; Schluecker, Eberhard

    2014-10-01

    With an eye on numerical simulations of ammonothermal growth of group III-V bulk single crystals, precise data for viscosity and density are strongly needed. In this work, changes in viscosity with temperature and pressure are traced in the ball viscometer developed here, in which the falling time is detected by acquiring the acoustic signal of the ball using a high-temperature structure-borne-noise acceleration sensor. The results for the viscosity of pure ammonia at ammonothermal conditions already show good accuracy. The apparatus is designed to measure the density in addition to the viscosity by substituting the material of the rolling ball in later experiments. This is important because the density of the flowing fluid is not constant, owing to the change in solubility of GaN in ammonia caused by the mineralizers that are obligatory in the ammonothermal process.

  14. Theoretical prediction of lung nodule measurement accuracy under different acquisition and reconstruction conditions

    NASA Astrophysics Data System (ADS)

    Hsieh, Jiang; Karau, Kelly

    2004-04-01

    Utilization of computed tomography (CT) for lung cancer screening has attracted significant research interests in recent years. Images reconstructed from CT studies are used for lung nodule characterization and three-dimensional lung lesion sizing. Methodologies have been developed to automatically identify and characterize lung nodules. In this paper, we analyze the impact of acquisition and reconstruction parameters on the accuracy of quantitative lung nodule characterization. The two major data acquisition parameters that impact the accuracy of the lung nodule measurement are acquisition mode and slice aperture. Acquisition mode includes both axial and helical scans. The investigated reconstruction parameters are the reconstruction filters and field-of-view. We first develop theoretical models that predict the system response under various acquisition and reconstruction conditions. These models allow clinicians to compare results under different conditions and make appropriate acquisition and reconstruction decisions. To validate our model, extensive phantom experiments are conducted. Experiments have demonstrated that our analytical models accurately predict the performance parameters under various conditions. Our study indicates that acquisition and reconstruction parameters can significantly impact the accuracy of the nodule volume measurement. Consequently, when conducting quantitative analysis on lung nodules, especially in sequential growth studies, it is important to make appropriate adjustment and correction to maintain the desired accuracy and to ensure effective patient management.

  15. A novel methodology to measure methane bubble sizes in the water column

    NASA Astrophysics Data System (ADS)

    Hemond, H.; Delwiche, K.; Senft-Grupp, S.; Manganello, T.

    2014-12-01

    The fate of methane ebullition from lake sediments is dependent on initial bubble size. Rising bubbles are subject to dissolution, reducing the fraction of methane that ultimately enters the atmosphere while increasing concentrations of aqueous methane. Smaller bubbles not only rise more slowly, but also dissolve more rapidly than larger bubbles. Thus, understanding methane bubble size distributions in the water column is critical to predicting atmospheric methane emissions from ebullition. However, current methods of measuring methane bubble sizes in situ are resource-intensive, typically requiring divers, video equipment, sonar, or hydroacoustic instruments. The complexity and cost of these techniques point to the strong need for a simple, autonomous device that can measure bubble size distributions and be deployed unattended over long periods of time. We describe a bubble sizing device that can be moored in the subsurface and can intercept and measure the size of bubbles as they rise. The instrument uses a novel optical measurement technique with infrared LEDs and IR-sensitive photodetectors combined with a custom-designed printed circuit board. An on-board microcomputer handles the raw optical signals and stores the relevant information needed to calculate bubble volume. The electronics are housed within a pressure case fabricated from standard PVC fittings and are powered by size C alkaline batteries. The bill-of-materials cost is less than $200, allowing us to deploy multiple sensors at various locations within Upper Mystic Lake, MA. This novel device will provide information on how methane bubble sizes may vary both spatially and temporally. We present data from tests under controlled laboratory conditions and from deployments in Upper Mystic Lake.

  16. Measurements of optical underwater turbulence under controlled conditions

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.; Gladysz, S.; Almeida de Sá Barros, R.; Matt, S.; Nootz, G. A.; Josset, D. B.; Hou, W.

    2016-05-01

    Laser beam propagation underwater is becoming an important research topic because of high demand for its potential applications. Namely, the ability to image underwater at long distances is highly desired for scientific and military purposes, including submarine awareness, diver visibility, and mine detection. Optical communication in the ocean can provide covert data transmission at much higher rates than are available with acoustic techniques, and it is now desired for certain military and scientific applications that involve sending large quantities of data. Unfortunately, the underwater environment presents serious challenges for the propagation of laser beams. Even in clean ocean water, extinction due to absorption and scattering theoretically limits the useful range to a few attenuation lengths. However, extending the laser light propagation range to this theoretical limit leads to significant beam distortions due to optical underwater turbulence. Experiments show that the magnitude of the distortions caused by water temperature and salinity fluctuations can significantly exceed the magnitude of beam distortions due to atmospheric turbulence, even for relatively short propagation distances. We present direct measurements of optical underwater turbulence under the controlled conditions of a laboratory water tank, using two separate techniques involving a wavefront sensor and an LED array. These independent approaches will enable development of an underwater turbulence power spectrum model based directly on spatial-domain measurements and will lead to accurate predictions of underwater beam propagation.

  17. The cost of a hospital ward in Europe: is there a methodology available to accurately measure the costs?

    PubMed

    Negrini, D; Kettle, A; Sheppard, L; Mills, G H; Edbrooke, D L

    2004-01-01

    Costing health care services has become a major requirement due to an increase in demand for health care and technological advances. Several studies have been published describing the computation of the costs of hospital wards. The objective of this article is to examine the methodologies utilised, in order to describe the basic components of a standardised method which could be applied throughout Europe. Cost measurement, however, is a complex matter, and a lack of clarity exists in the terminology and the cost concepts utilised. The methods discussed in this review make it evident that there is a lack of standardised methodologies for the determination of accurate costs of hospital wards. A standardised costing methodology would facilitate comparisons, encourage economic evaluation within the ward and hence assist in the decision-making process with regard to the efficient allocation of resources. PMID:15366283

  18. A methodological evaluation of volumetric measurement techniques including three-dimensional imaging in breast surgery.

    PubMed

    Hoeffelin, H; Jacquemin, D; Defaweux, V; Nizet, J L

    2014-01-01

    Breast surgery currently remains very subjective and each intervention depends on the ability and experience of the operator. To date, no objective measurement of this anatomical region can codify surgery. In this light, we wanted to compare and validate a new technique for 3D scanning (LifeViz 3D) and its clinical application. We tested the use of the 3D LifeViz system (Quantificare) to perform volumetric calculations in various settings (in situ in cadaveric dissection, of control prostheses, and in clinical patients) and we compared this system to other techniques (CT scanning and Archimedes' principle) under the same conditions. We were able to identify the benefits (feasibility, safety, portability, and low patient stress) and limitations (underestimation of the in situ volume, subjectivity of contouring, and patient selection) of the LifeViz 3D system, concluding that the results are comparable with other measurement techniques. The prospects of this technology seem promising in numerous applications in clinical practice to limit the subjectivity of breast surgery. PMID:24511536

  19. Measuring Sense of Community: A Methodological Interpretation of the Factor Structure Debate

    ERIC Educational Resources Information Center

    Peterson, N. Andrew; Speer, Paul W.; Hughey, Joseph

    2006-01-01

    Instability in the factor structure of the Sense of Community Index (SCI) was tested as a methodological artifact. Confirmatory factor analyses, tested with two data sets, supported neither the proposed one-factor nor the four-factor (needs fulfillment, group membership, influence, and emotional connection) SCI. Results demonstrated that the SCI…

  20. The Measure of a Nation: The USDA and the Rise of Survey Methodology

    ERIC Educational Resources Information Center

    Mahoney, Kevin T.; Baker, David B.

    2007-01-01

    Survey research has played a major role in American social science. An outgrowth of efforts by the United States Department of Agriculture in the 1930s, the Division of Program Surveys (DPS) played an important role in the development of survey methodology. The DPS was headed by the ambitious and entrepreneurial Rensis Likert, populated by young…

  1. Measuring Team Shared Understanding Using the Analysis-Constructed Shared Mental Model Methodology

    ERIC Educational Resources Information Center

    Johnson, Tristan E.; O'Connor, Debra L.

    2008-01-01

    Teams are an essential part of successful performance in learning and work environments. Analysis-constructed shared mental model (ACSMM) methodology is a set of techniques where individual mental models are elicited and sharedness is determined not by the individuals who provided their mental models but by an analytical procedure. This method…

  2. Defining and Measuring Engagement and Learning in Science: Conceptual, Theoretical, Methodological, and Analytical Issues

    ERIC Educational Resources Information Center

    Azevedo, Roger

    2015-01-01

    Engagement is one of the most widely misused and overgeneralized constructs found in the educational, learning, instructional, and psychological sciences. The articles in this special issue represent a wide range of traditions and highlight several key conceptual, theoretical, methodological, and analytical issues related to defining and measuring…

  3. What Do We Measure? Methodological versus Institutional Validity in Student Surveys

    ERIC Educational Resources Information Center

    Johnson, Jeffrey Alan

    2011-01-01

    This paper examines the tension in the process of designing student surveys between the methodological requirements of good survey design and the institutional needs for survey data. Building on the commonly used argumentative approach to construct validity, I build an interpretive argument for student opinion surveys that allows assessment of the…

  4. The Continued Salience of Methodological Issues for Measuring Psychiatric Disorders in International Surveys

    ERIC Educational Resources Information Center

    Tausig, Mark; Subedi, Janardan; Broughton, Christopher; Pokimica, Jelena; Huang, Yinmei; Santangelo, Susan L.

    2011-01-01

    We investigated the extent to which methodological concerns explicitly addressed by the designers of the World Mental Health Surveys persist in the results that were obtained using the WMH-CIDI instrument. We compared rates of endorsement of mental illness symptoms in the United States (very high) and Nepal (very low) as they were affected by…

  5. Measuring the Regional Economic Importance of Early Care and Education: The Cornell Methodology Guide

    ERIC Educational Resources Information Center

    Ribeiro, Rosaria; Warner, Mildred

    2004-01-01

    This methodology guide is designed to help study teams answer basic questions about how to conduct a regional economic analysis of the child care sector. Specific examples are drawn from a local study, Tompkins County, NY, and two state studies, Kansas and New York, which the Cornell team conducted. Other state and local studies are also…

  6. Water depression storage under different tillage conditions: measuring and modelling

    NASA Astrophysics Data System (ADS)

    Giménez, R.; Campo, M. A.; González-Audicana, M.; Álvarez-Mozos, J.; Casalí, J.

    2012-04-01

    Water storage in surface depressions (DS) is an important process which affects infiltration, runoff and erosion. Since DS is driven by micro relief, in agricultural soils DS is much affected by tillage and by the direction of tillage rows in relation to the main slope. A direct and accurate measurement of DS requires making the soil surface waterproof -soil is very permeable, especially under tillage- while preserving all details of the soil roughness, including aggregates over the soil surface (micro-roughness). All this is a very laborious and time-consuming task. That is why hydrological and erosion models for DS estimation normally use either empirical relationships based on some roughness index or numerical approaches. The aim of this work was (i) to measure directly in the field the DS of a soil under different tillage conditions and (ii) to assess the performance of existing empirical 2D models and of a numerical 2D algorithm for DS estimation. Three types of tillage classes (mouldboard+roller, roller compacted and chisel) in 2 tillage directions (parallel and perpendicular to the main slope) were assessed in an experimental hillslope (10% slope), which then defines 6 treatments. Experiments were carried out in 12, 1-m2 micro-plots delimited by metal sheets; that is, two replicates per treatment. In each plot, the soil surface was gently impregnated with a waterproof, white paint but without altering micro-roughness. A known amount of water (stained with a blue dye) was poured all over the surface with a measuring cup. The excess water was captured in a gutter and measured. Soon after finishing the experiment, pictures of the surface were taken in order to analyze the water storage pattern (from the stained water) by image processing. In addition, longitudinal height profiles were measured using a laser profilometer. Finally, infiltration rate was measured near the plot using a double ring infiltrometer. For all the treatments, DS ranged from 2 mm to 17 mm. For the
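The field procedure above implies a simple water balance: with the plot surface waterproofed, depression storage is the water poured minus the water recovered in the gutter, per unit area. A minimal sketch of that arithmetic (our reading of the procedure, assuming negligible infiltration on the painted surface):

```python
def depression_storage_mm(poured_l: float, excess_l: float, area_m2: float = 1.0) -> float:
    """Depression storage depth (mm) from a poured-water experiment.

    1 litre spread over 1 m^2 corresponds to a 1 mm water depth,
    so L/m^2 converts directly to mm.
    """
    return (poured_l - excess_l) / area_m2

# Example: 20 L poured on a 1 m^2 micro-plot, 8 L captured in the gutter
ds = depression_storage_mm(20.0, 8.0)  # 12 mm, within the 2-17 mm range reported
```

The quantities in the example are illustrative; the abstract reports only the resulting DS range (2-17 mm), not the poured volumes.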

  7. A supervised vibration-based statistical methodology for damage detection under varying environmental conditions & its laboratory assessment with a scale wind turbine blade

    NASA Astrophysics Data System (ADS)

    Gómez González, A.; Fassois, S. D.

    2016-03-01

    The problem of vibration-based damage detection under varying environmental conditions and uncertainty is considered, and a novel, supervised, PCA-type statistical methodology is postulated. The methodology employs vibration data records from the healthy and damaged states of a structure under various environmental conditions. Unlike standard PCA-type methods in which a feature vector corresponding to the least important eigenvalues is formed in a single step, the postulated methodology uses supervised learning in which damaged-state data records are employed to sequentially form a feature vector by appending a transformed scalar element at a time under the condition that it optimally, among all remaining elements, improves damage detectability. This leads to the formulation of feature vectors with optimized sensitivity to damage, and thus high damage detectability. Within this methodology three particular methods, two non-parametric and one parametric, are formulated. These are validated and comparatively assessed via a laboratory case study focusing on damage detection on a scale wind turbine blade under varying temperature and the potential presence of sprayed water. Damage detection performance is shown to be excellent based on a single vibration response sensor and a limited frequency bandwidth.
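The sequential, supervised feature construction described above can be illustrated loosely as follows. This is a simplified sketch, not the authors' exact algorithm: it builds a PCA basis from healthy-state data only, then greedily appends one principal-component score at a time, keeping whichever remaining component most improves a simple healthy-versus-damaged separability score. The data, the score, and the dimensions are all synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
healthy = rng.normal(size=(200, 6))
damaged = healthy + rng.normal(scale=0.1, size=(200, 6))
damaged[:, 4] += 1.5                      # damage shifts mainly one direction

# PCA basis estimated from the healthy state only
_, _, Vt = np.linalg.svd(healthy - healthy.mean(axis=0), full_matrices=False)
h_proj, d_proj = healthy @ Vt.T, damaged @ Vt.T

def separability(cols):
    """Toy detectability score: mean shift over pooled spread, summed over cols."""
    return sum(abs(h_proj[:, c].mean() - d_proj[:, c].mean())
               / (h_proj[:, c].std() + d_proj[:, c].std()) for c in cols)

# Greedy supervised selection: append the component that best improves detectability
selected, remaining = [], list(range(6))
for _ in range(3):
    best = max(remaining, key=lambda c: separability(selected + [c]))
    selected.append(best)
    remaining.remove(best)
```

Unlike standard PCA-type methods that take the least important eigen-directions in a single step, the supervised loop uses damaged-state records to rank candidate components by their actual contribution to damage detectability.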

  8. A methodological frame for assessing benzene induced leukemia risk mitigation due to policy measures.

    PubMed

    Karakitsios, Spyros P; Sarigiannis, Dimosthenis Α; Gotti, Alberto; Kassomenos, Pavlos A; Pilidis, Georgios A

    2013-01-15

    The study relies on the development of a methodology for assessing the determinants that comprise the overall leukemia risk due to benzene exposure and how these are affected by outdoor and indoor air quality regulation. An integrated modeling environment was constructed comprising traffic emissions, dispersion models, human exposure models and a coupled internal dose/biology-based dose-response risk assessment model, in order to assess the benzene-imposed leukemia risk, as well as the impact of traffic fleet renewal and smoking bans on these levels. Regarding traffic fleet renewal, several "what if" scenarios were tested. The detailed full-chain methodology was applied in a South-Eastern European urban setting in Greece and a limited version of the methodology in Helsinki. The non-smoking population runs an average risk equal to 4.1·10(-5) compared to 23.4·10(-5) for smokers. The estimated lifetime risk for the examined occupational groups was higher than the one estimated for the general public by 10-20%. Active smoking constitutes a dominant parameter for benzene-attributable leukemia risk, much stronger than any related activity, occupational or not. From the assessment of mitigation policies it was found that the associated leukemia risk in the optimum traffic fleet scenario could be reduced by up to 85% for non-smokers and up to 8% for smokers. By contrast, smoking bans provided smaller gains (7% for non-smokers, 1% for smokers), while for Helsinki, smoking policies were found to be more efficient than traffic fleet renewal. The methodology proposed above provides a general framework for assessing aggregated exposure and the consequent leukemia risk from benzene (incorporating mechanistic data), capturing exposure and internal dosimetry dynamics, and translating changes in exposure determinants to actual changes in population risk, providing a valuable tool for risk management evaluation and consequently for policy support. PMID:23220388
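The quoted figures can be combined in a back-of-envelope check. This sketch assumes (our reading) that the reported percentage reductions apply directly to the baseline lifetime risks:

```python
# Baseline lifetime leukemia risks quoted in the abstract
baseline = {"non_smoker": 4.1e-5, "smoker": 23.4e-5}

# Maximum reductions under the optimum traffic fleet renewal scenario
fleet_reduction = {"non_smoker": 0.85, "smoker": 0.08}

# Residual risk after fleet renewal, assuming the reduction applies multiplicatively
residual = {g: baseline[g] * (1 - fleet_reduction[g]) for g in baseline}

# Smokers start from a ~5.7x higher baseline, which is why a large relative
# cut in traffic-related exposure barely moves their total risk
ratio = baseline["smoker"] / baseline["non_smoker"]
```

The arithmetic makes the paper's qualitative point concrete: because active smoking dominates total benzene exposure, even an 85% cut in the traffic-related component leaves smokers' aggregate risk nearly unchanged.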

  9. A new methodology of determining directional albedo models from Nimbus 7 ERB scanning radiometer measurements

    NASA Technical Reports Server (NTRS)

    House, Frederick B.

    1986-01-01

    The Nimbus 7 Earth Radiation Budget (ERB) data set is reviewed to examine its strong and weak points. In view of the timing of this report relative to the processing schedule of Nimbus 7 ERB observations, emphasis is placed on the methodology of interpreting the scanning radiometer data to develop directional albedo models. These findings enhance the value of the Nimbus 7 ERB data set and can be applied to the interpretation of both the scanning and nonscanning radiometric observations.

  10. Martian dust threshold measurements: Simulations under heated surface conditions

    NASA Technical Reports Server (NTRS)

    White, Bruce R.; Greeley, Ronald; Leach, Rodman N.

    1991-01-01

    Diurnal changes in solar radiation on Mars set up a cycle of cooling and heating of the planetary boundary layer; this effect strongly influences the wind field. The stratification of the air layer is stable in early morning, since the ground is cooler than the air above it. When the ground is heated and becomes warmer than the air, its heat is transferred to the air above it. Heated parcels of air near the surface will, in effect, increase the near-surface wind speed, raising the aeolian shear stress the wind exerts upon the surface compared to an unheated or cooled surface. This means that for the same wind speed at a fixed height above the surface, ground-level shear stress will be greater over a heated surface than over an unheated one. Thus, it is possible to reach saltation threshold conditions at lower mean wind speeds when the surface is heated. Even though the mean wind speed is less when the surface is heated, the surface shear stress required to initiate particle movement remains the same in both cases. To investigate this phenomenon, low-density surface dust aeolian threshold measurements have been made in the MARSWIT wind tunnel located at NASA Ames Research Center, Moffett Field, California. The first series of tests examined threshold values of the 100 micron sand material. At 13 mb surface pressure, the unheated surface had a threshold friction speed of 2.93 m/s (corresponding approximately to a velocity of 41.4 m/s at a height of 1 meter), while the heated surface, at an equivalent bulk Richardson number of -0.02, yielded a threshold friction speed of 2.67 m/s (corresponding approximately to a velocity of 38.0 m/s at a height of 1 meter). This change represents an 8.8 percent decrease in threshold conditions for the heated case. These velocities are well within the threshold range observed by Arvidson et al. (1983). As the surface was heated, the threshold decreased. At a value of bulk Richardson number equal to -0.02 the threshold
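The fractional decrease in threshold friction speed quoted above follows directly from the two measured values. A quick check of the arithmetic (the variable names are ours, not part of the MARSWIT instrumentation):

```python
# Threshold friction speeds measured at 13 mb surface pressure
u_star_unheated = 2.93   # m/s, unheated surface
u_star_heated = 2.67     # m/s, heated surface (bulk Richardson number -0.02)

# Fractional decrease in threshold friction speed due to surface heating
decrease = (u_star_unheated - u_star_heated) / u_star_unheated

# The corresponding 1-m wind speeds quoted (41.4 vs 38.0 m/s) decrease by a
# similar fraction, consistent with the two quantities scaling together
wind_decrease = (41.4 - 38.0) / 41.4
```

Both ratios come out near 8-9%, consistent with the abstract's stated 8.8 percent decrease in threshold conditions for the heated case.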