Sample records for universal robust analysis

  1. A Merged IQC/SOS Theory for Analysis and Synthesis of Nonlinear Control Systems

    DTIC Science & Technology

    2015-06-23

    constraints. As mentioned previously, this enables new applications of IQCs to analyze the robustness of time-varying and nonlinear systems. ... This section considers the analysis of nonlinear systems ... AFRL-AFOSR-VA-TR-2016-0008, A Merged IQC/SOS Theory for Analysis and Synthesis of Nonlinear Control Systems, Gary Balas, REGENTS OF THE UNIVERSITY OF

  2. Logging in and Dropping out: Exploring Student Non-Completion in Higher Education Using Electronic Footprint Analysis

    ERIC Educational Resources Information Center

    Buglear, John

    2009-01-01

    Student retention in higher education might be prioritised by funding authorities and universities but robust measurement of non-completion is elusive. This investigation explores untapped data sources to enrich understanding of non-completion. The analysis features the main undergraduate course in a part of a large UK university with retention…

  3. GWAR: robust analysis and meta-analysis of genome-wide association studies.

    PubMed

    Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G

    2017-05-15

    In the context of genome-wide association studies (GWAS), a variety of statistical techniques is available to conduct the analysis, but in most cases the underlying genetic model is unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize the power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and has never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under a recessive, additive and dominant model of inheritance, as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic and the MIN2, were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration, resulting in a great gain in computational time without losing accuracy. All the aforementioned approaches were employed in a fixed- or a random-effects meta-analysis setting using summary data with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web-server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online.
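
The classical CATT that GWAR's robust procedures are benchmarked against is straightforward to compute from a 2 x k genotype table. A minimal sketch in the standard textbook form (not GWAR's Stata code); the weights encode the assumed inheritance model, e.g. (0, 1, 2) for additive:

```python
import math

def catt_z(cases, controls, weights):
    """Cochran-Armitage trend test Z statistic for a 2 x k genotype table.
    cases/controls hold per-genotype counts; weights set the genetic model."""
    k = len(cases)
    col = [cases[i] + controls[i] for i in range(k)]   # column totals
    r1, r2 = sum(cases), sum(controls)
    n = r1 + r2
    # Trend statistic and its variance under the null of no association
    t = sum(weights[i] * (cases[i] * r2 - controls[i] * r1) for i in range(k))
    var = sum(weights[i] ** 2 * col[i] * (n - col[i]) for i in range(k))
    var -= 2 * sum(weights[i] * weights[j] * col[i] * col[j]
                   for i in range(k) for j in range(i + 1, k))
    var *= r1 * r2 / n
    return t / math.sqrt(var)   # asymptotically standard normal
```

A balanced table (cases equal to controls in every genotype class) gives Z = 0, while an excess of cases at high-weight genotypes gives a positive Z; GWAR's MAX and MIN2 statistics address the case where the right choice of weights is unknown.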

  4. Robust Bounded Influence Tests in Linear Models

    DTIC Science & Technology

    1988-11-01

    sensitivity analysis and bounded influence estimation. In: Evaluation of Econometric Models, J. Kmenta and J.B. Ramsey (eds.), Academic Press, New York ... Robust Bounded Influence Tests in Linear Models ... November 1988. ROBUST BOUNDED INFLUENCE TESTS IN LINEAR MODELS. Marianthi Markatou, The University of Iowa, and Thomas P. Hettmansperger*, The Pennsylvania

  5. Douglas Gagne | NREL

    Science.gov Websites

    Renewable energy project analyst with a robust understanding of solar photovoltaic project costs and analysis for a variety of solar photovoltaic and wind turbine cost benchmarking studies. University of Denver. Featured Publications: Mexico's Regulatory Engagement in Bulk Electric Power System

  6. [A cost-effectiveness analysis on universal infant rotavirus vaccination strategy in China].

    PubMed

    Sun, S L; Gao, Y Q; Yin, J; Zhuang, G H

    2016-02-01

    To evaluate the cost-effectiveness of the current universal infant rotavirus vaccination strategy in China. By constructing a decision tree-Markov model, we simulated rotavirus diarrhea-associated costs and health outcomes for newborns in 2012 under three programs: no vaccination, Rotarix vaccination and Rotateq vaccination. We determined the optimal program by comparing the incremental cost-effectiveness ratio (ICER) against China's 2012 per capita gross domestic product (GDP). Compared with the non-vaccination group, the Rotarix and Rotateq vaccination groups had to pay 3 760 Yuan and 7 578 Yuan (both less than the 2012 GDP per capita) to avert one disability-adjusted life year (DALY) lost, respectively; sensitivity analysis indicated that both results were robust. Compared with the Rotarix program, the Rotateq program had to pay an extra 81 068 Yuan (between 1 and 3 times GDP per capita) to avert one DALY lost, and sensitivity analysis indicated that this result was not robust. From the perspective of health economics, both the two-dose Rotarix and the three-dose Rotateq programs were highly cost-effective compared with non-vaccination, so it is appropriate to integrate rotavirus vaccine into the routine immunization program. However, given the large extra cost of the Rotateq program, the lack of robustness of that result in sensitivity analysis, and the additional dose that makes Rotateq harder to deliver in practice, the Rotarix vaccine appears the better choice at the current stage.
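
The decision rule in this abstract follows the common GDP-per-capita willingness-to-pay bands: an ICER below one GDP per capita is "highly cost-effective", below three is "cost-effective". A minimal sketch of that arithmetic; the GDP figure below is illustrative, not the study's value:

```python
def icer(extra_cost, dalys_averted):
    """Incremental cost-effectiveness ratio: extra cost per DALY averted."""
    return extra_cost / dalys_averted

def gdp_threshold_class(icer_value, gdp_per_capita):
    """GDP-per-capita willingness-to-pay bands used in the abstract."""
    if icer_value < gdp_per_capita:
        return "highly cost-effective"
    if icer_value < 3 * gdp_per_capita:
        return "cost-effective"
    return "not cost-effective"

GDP = 40000  # illustrative per-capita GDP in Yuan, NOT the study's figure
```

With this illustrative GDP, the abstract's 3 760 Yuan per DALY falls in the "highly cost-effective" band and the 81 068 Yuan incremental figure in the "cost-effective" band, matching the stated "between 1 and 3 times GDP per capita".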

  7. Literacy Achievement in India: A Demographic Evaluation

    ERIC Educational Resources Information Center

    Shukla, Vachaspati; Mishra, Udaya S.

    2017-01-01

    This article evaluates the progress in literacy among the Indian states, from an age-cohort perspective. It argues that age-cohort analysis offers a robust understanding of the dynamics of literacy progress. The article clearly brings out the fact that, despite the accomplishment of universal elementary education, achieving the goal of full…

  8. On the Use of Interactive Texts in Undergraduate Chemical Reaction Engineering Courses: A Pedagogical Experience

    ERIC Educational Resources Information Center

    Asensio, Daniela A.; Barassi, Francisca J.; Zambon, Mariana T.; Mazza, Germán D.

    2010-01-01

    This paper describes the results of a pedagogical experience carried out at the University of Comahue, Argentina, with an interactive text (IT) concerning Homogeneous Chemical Reactors Analysis. The IT was built on the frame of the "Mathematica" software with the aim of providing students with a robust computational tool. Students'…

  9. Author name recognition in degraded journal images

    NASA Astrophysics Data System (ADS)

    de Bodard de la Jacopière, Aliette; Likforman-Sulem, Laurence

    2006-01-01

    A method for extracting names in degraded documents is presented in this article. The documents targeted are images of photocopied scientific journals from various scientific domains. Due to the degradation, there is poor OCR recognition, and pieces of other articles appear on the sides of the image. The proposed approach relies on the combination of a low-level textual analysis and an image-based analysis. The textual analysis extracts robust typographic features, while the image analysis selects image regions of interest through anchor components. We report results on the University of Washington benchmark database.

  10. A concept analysis of optimality in perinatal health.

    PubMed

    Kennedy, Holly Powell

    2006-01-01

    This analysis was conducted to describe the concept of optimality and its appropriateness for perinatal health care. The concept was identified in 24 scientific disciplines. Across all disciplines, the universal definition of optimality is the robust, efficient, and cost-effective achievement of best possible outcomes within a rule-governed framework. Optimality, specifically defined for perinatal health care, is the maximal perinatal outcome with minimal intervention placed against the context of the woman's social, medical, and obstetric history.

  11. The Robust Learning Model (RLM): A Comprehensive Approach to a New Online University

    ERIC Educational Resources Information Center

    Neumann, Yoram; Neumann, Edith F.

    2010-01-01

    This paper outlines the components of the Robust Learning Model (RLM) as a conceptual framework for creating a new online university offering numerous degree programs at all degree levels. The RLM is a multi-factorial model based on the basic belief that successful learning outcomes depend on multiple factors employed together in a holistic…

  12. Stability and Performance Robustness Assessment of Multivariable Control Systems

    DTIC Science & Technology

    1993-04-01

    STABILITY AND PERFORMANCE ROBUSTNESS ASSESSMENT OF MULTIVARIABLE CONTROL SYSTEMS. Asok Ray, Jenny I. Shen, and Chen-Kuo Weng, Mechanical ... Office of Naval Research, Grant No. N00014-90-J-1513 ... The Pennsylvania State University, University Park, PA 16802 ... Naval Postgraduate School

  13. A Bayesian blind survey for cold molecular gas in the Universe

    NASA Astrophysics Data System (ADS)

    Lentati, L.; Carilli, C.; Alexander, P.; Walter, F.; Decarli, R.

    2014-10-01

    A new Bayesian method for performing an image domain search for line-emitting galaxies is presented. The method uses both spatial and spectral information to robustly determine the source properties, employing either simple Gaussian, or other physically motivated models whilst using the evidence to determine the probability that the source is real. In this paper, we describe the method, and its application to both a simulated data set, and a blind survey for cold molecular gas using observations of the Hubble Deep Field-North taken with the Plateau de Bure Interferometer. We make a total of six robust detections in the survey, five of which have counterparts in other observing bands. We identify the most secure detections found in a previous investigation, while finding one new probable line source with an optical ID not seen in the previous analysis. This study acts as a pilot application of Bayesian statistics to future searches to be carried out both for low-J CO transitions of high-redshift galaxies using the Jansky Very Large Array (JVLA), and at millimetre wavelengths with Atacama Large Millimeter/submillimeter Array (ALMA), enabling the inference of robust scientific conclusions about the history of the molecular gas properties of star-forming galaxies in the Universe through cosmic time.

  14. Fair Use Education for the Twenty-First Century: A Comparative Study of Students' Use of an Interactive Tool to Guide Decision Making

    ERIC Educational Resources Information Center

    Greenhow, Christine; Walker, J. D.; Donnelly, Dan; Cohen, Brad

    2008-01-01

    Christine Greenhow, J. D. Walker, Dan Donnelly, and Brad Cohen describe the implementation and evaluation of the University of Minnesota's Fair Use Analysis (FUA) tool, an interactive online application intended to educate users and foster defensible fair use practice in accordance with copyright law by guiding users through a robust,…

  15. The National Wind Energy Skills Assessment and Preparing for the Future Wind Workforce; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tegen, Suzanne

    2015-07-10

    A robust workforce is essential to growing domestic wind manufacturing capabilities. This presentation provides an overview of an NREL analysis of wind-focused education at American colleges and universities. The second part of the presentation discusses DOE/NREL workforce-related projects, such as the Wind Career Map, the Collegiate Wind Competition, and the Wind for Schools project.

  16. Robust Segmentation of Planar and Linear Features of Terrestrial Laser Scanner Point Clouds Acquired from Construction Sites.

    PubMed

    Maalek, Reza; Lichti, Derek D; Ruwanpura, Janaka Y

    2018-03-08

    Automated segmentation of planar and linear features of point clouds acquired from construction sites is essential for the automatic extraction of building construction elements such as columns, beams and slabs. However, many planar and linear segmentation methods use scene-dependent similarity thresholds that may not provide generalizable solutions for all environments. In addition, outliers exist in construction site point clouds due to data artefacts caused by moving objects, occlusions and dust. To address these concerns, a novel method for robust classification and segmentation of planar and linear features is proposed. First, coplanar and collinear points are classified through a robust principal components analysis procedure. The classified points are then grouped using a new robust clustering method, the robust complete linkage method. A robust method is also proposed to extract the points of flat-slab floors and/or ceilings independent of the aforementioned stages to improve computational efficiency. The applicability of the proposed method is evaluated in eight datasets acquired from a complex laboratory environment and two construction sites at the University of Calgary. The precision, recall, and accuracy of the segmentation at both construction sites were 96.8%, 97.7% and 95%, respectively. These results demonstrate the suitability of the proposed method for robust segmentation of planar and linear features of contaminated datasets, such as those collected from construction sites.
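
The eigenvalue reasoning behind the coplanar/collinear classification can be sketched compactly: for a neighbourhood of 3-D points, one negligible covariance eigenvalue indicates a plane, two indicate a line. The sketch below uses ordinary PCA for illustration; the paper's contribution is a robust variant that tolerates the outliers described above, and the tolerance value here is an arbitrary placeholder:

```python
import numpy as np

def classify_neighbourhood(pts, tol=0.01):
    """Label a 3-D point neighbourhood 'linear', 'planar' or 'volumetric'
    from the normalized eigenvalues of its covariance matrix."""
    pts = np.asarray(pts, dtype=float)
    centred = pts - pts.mean(axis=0)
    evals = np.linalg.eigvalsh(centred.T @ centred / len(pts))  # ascending
    evals = evals / evals.sum()
    if evals[1] < tol:      # two negligible spread directions -> line
        return "linear"
    if evals[0] < tol:      # one negligible spread direction -> plane
        return "planar"
    return "volumetric"
```

A flat grid of points classifies as planar, points along an axis as linear, and a filled cube as volumetric; the robust version replaces the plain covariance with an outlier-resistant estimate so that dust and moving-object artefacts do not distort the eigenvalues.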

  17. Robust Segmentation of Planar and Linear Features of Terrestrial Laser Scanner Point Clouds Acquired from Construction Sites

    PubMed Central

    Maalek, Reza; Lichti, Derek D; Ruwanpura, Janaka Y

    2018-01-01

    Automated segmentation of planar and linear features of point clouds acquired from construction sites is essential for the automatic extraction of building construction elements such as columns, beams and slabs. However, many planar and linear segmentation methods use scene-dependent similarity thresholds that may not provide generalizable solutions for all environments. In addition, outliers exist in construction site point clouds due to data artefacts caused by moving objects, occlusions and dust. To address these concerns, a novel method for robust classification and segmentation of planar and linear features is proposed. First, coplanar and collinear points are classified through a robust principal components analysis procedure. The classified points are then grouped using a new robust clustering method, the robust complete linkage method. A robust method is also proposed to extract the points of flat-slab floors and/or ceilings independent of the aforementioned stages to improve computational efficiency. The applicability of the proposed method is evaluated in eight datasets acquired from a complex laboratory environment and two construction sites at the University of Calgary. The precision, recall, and accuracy of the segmentation at both construction sites were 96.8%, 97.7% and 95%, respectively. These results demonstrate the suitability of the proposed method for robust segmentation of planar and linear features of contaminated datasets, such as those collected from construction sites. PMID:29518062

  18. Development of a Prokaryotic Universal Primer for Simultaneous Analysis of Bacteria and Archaea Using Next-Generation Sequencing

    PubMed Central

    Takahashi, Shunsuke; Tomita, Junko; Nishioka, Kaori; Hisada, Takayoshi; Nishijima, Miyuki

    2014-01-01

    For the analysis of microbial community structure based on 16S rDNA sequence diversity, sensitive and robust PCR amplification of 16S rDNA is a critical step. To obtain accurate microbial composition data, PCR amplification must be free of bias; however, amplifying all 16S rDNA species with equal efficiency from a sample containing a large variety of microorganisms remains challenging. Here, we designed a universal primer based on the V3-V4 hypervariable region of prokaryotic 16S rDNA for the simultaneous detection of Bacteria and Archaea in fecal samples from crossbred pigs (Landrace×Large white×Duroc) using an Illumina MiSeq next-generation sequencer. In-silico analysis showed that the newly designed universal prokaryotic primers matched approximately 98.0% of Bacteria and 94.6% of Archaea rRNA gene sequences in the Ribosomal Database Project database. For each sequencing reaction performed with the prokaryotic universal primer, an average of 69,330 (±20,482) reads were obtained, of which archaeal rRNA genes comprised approximately 1.2% to 3.2% of all prokaryotic reads. In addition, the detection frequency of Bacteria belonging to the phylum Verrucomicrobia, including members of the classes Verrucomicrobiae and Opitutae, was higher in the NGS analysis using the prokaryotic universal primer than that performed with the bacterial universal primer. Importantly, this new prokaryotic universal primer set had markedly lower bias than that of most previously designed universal primers. Our findings demonstrate that the prokaryotic universal primer set designed in the present study will permit the simultaneous detection of Bacteria and Archaea, and will therefore allow for a more comprehensive understanding of microbial community structures in environmental samples. PMID:25144201

  19. Input-output Transfer Function Analysis of a Photometer Circuit Based on an Operational Amplifier.

    PubMed

    Hernandez, Wilmar

    2008-01-09

    In this paper an input-output transfer function analysis based on the frequency response of a photometer circuit based on an operational amplifier (op amp) is carried out. Op amps are universally used in monitoring photodetectors and there are a variety of amplifier connections for this purpose. However, the electronic circuits that are usually used to carry out the signal treatment in photometer circuits introduce some limitations in the performance of the photometers that influence the selection of the op amps and other electronic devices. For example, the bandwidth, slew rate, noise, input impedance and gain, among other characteristics of the op amp, are often the performance-limiting factors of photometer circuits. For this reason, in this paper a comparative analysis between two photodiode amplifier circuits is carried out. One circuit is based on a conventional current-to-voltage converter connection and the other circuit is based on a robust current-to-voltage converter connection. The results are satisfactory and show that the photodiode amplifier performance can be improved by using robust control techniques.
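
The bandwidth limitation mentioned above is the standard transimpedance trade-off: the feedback resistor sets the gain, while that resistor, the input capacitance and the op amp's gain-bandwidth product bound the closed-loop bandwidth. A minimal sketch of the textbook approximation (component values are illustrative, not taken from the paper):

```python
import math

def transimpedance_bandwidth(r_f, c_in, gbw):
    """Approximate -3 dB bandwidth of a photodiode transimpedance amplifier:
    f_3dB ~ sqrt(GBW / (2*pi*Rf*Cin)), the usual single-pole estimate."""
    return math.sqrt(gbw / (2 * math.pi * r_f * c_in))

# Illustrative values: 1 Mohm feedback, 10 pF total input capacitance,
# 10 MHz gain-bandwidth product -> roughly 400 kHz closed-loop bandwidth.
f3db = transimpedance_bandwidth(1e6, 10e-12, 10e6)
```

The estimate makes the selection pressure on the op amp concrete: raising the gain (larger Rf) or using a slower op amp (smaller GBW) both shrink the usable photometer bandwidth.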

  20. Cost-effectiveness analysis of personalized antiplatelet therapy in patients with acute coronary syndrome.

    PubMed

    Jiang, Minghuan; You, Joyce Hs

    2016-05-01

    This study aimed to compare the clinical and economic outcomes of pharmacogenetic-guided (PG-guided) and platelet reactivity testing-guided antiplatelet therapy for patients with acute coronary syndrome undergoing percutaneous coronary intervention. A decision-analytic model was simulated including four antiplatelet strategies: universal clopidogrel 75 mg daily, universal alternative P2Y12 inhibitor (prasugrel or ticagrelor), PG-guided therapy, and platelet reactivity testing-guided therapy. PG-guided therapy was the preferred option with lowest cost (US$75,208) and highest quality-adjusted life years gained (7.6249 quality-adjusted life years). The base-case results were robust in sensitivity analysis. PG-guided antiplatelet therapy showed the highest probability to be preferred antiplatelet strategy for acute coronary syndrome patients with percutaneous coronary intervention.

  1. Revisiting QRS detection methodologies for portable, wearable, battery-operated, and wireless ECG systems.

    PubMed

    Elgendi, Mohamed; Eskofier, Björn; Dokos, Socrates; Abbott, Derek

    2014-01-01

    Cardiovascular diseases are the number one cause of death worldwide. Currently, portable battery-operated systems such as mobile phones with wireless ECG sensors have the potential to be used in continuous cardiac function assessment that can be easily integrated into daily life. These portable point-of-care diagnostic systems can therefore help unveil and treat cardiovascular diseases. The basis for ECG analysis is a robust detection of the prominent QRS complex, as well as other ECG signal characteristics. However, it is not clear from the literature which ECG analysis algorithms are suited for an implementation on a mobile device. We investigate current QRS detection algorithms based on three assessment criteria: 1) robustness to noise, 2) parameter choice, and 3) numerical efficiency, in order to target a universal fast-robust detector. Furthermore, existing QRS detection algorithms may provide an acceptable solution only on small segments of ECG signals, within a certain amplitude range, or amid particular types of arrhythmia and/or noise. These issues are discussed in the context of a comparison with the most conventional algorithms, followed by future recommendations for developing reliable QRS detection schemes suitable for implementation on battery-operated mobile devices.
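
The three assessment criteria above can be made concrete with the skeleton that most energy-based QRS detectors share: differentiate, square, integrate over a moving window, then threshold with a refractory period. A deliberately simplified sketch (fixed threshold, no adaptive noise tracking, so not any one published algorithm), exercised on a synthetic strip:

```python
import math

def detect_qrs(sig, fs, refractory=0.2):
    """Toy energy-based QRS detector: differentiate, square,
    moving-window integrate (~150 ms), fixed threshold + refractory."""
    energy = [0.0] + [(sig[i] - sig[i - 1]) ** 2 for i in range(1, len(sig))]
    w = max(1, int(0.15 * fs))
    feat, run = [], 0.0
    for i, e in enumerate(energy):      # trailing moving average
        run += e
        if i >= w:
            run -= energy[i - w]
        feat.append(run / w)
    thr = 0.3 * max(feat)               # fixed threshold: a simplification
    peaks, last = [], -len(sig)
    for i in range(1, len(feat) - 1):
        if feat[i] > thr and feat[i - 1] <= feat[i] >= feat[i + 1]:
            if i - last > refractory * fs:
                peaks.append(i)
                last = i
    return peaks

# Synthetic strip: slow baseline wander plus four sharp "QRS" spikes.
fs = 250
centers = [125, 325, 525, 725]
sig = [0.2 * math.sin(2 * math.pi * 0.5 * i / fs) for i in range(1000)]
for c in centers:
    for k in range(-5, 6):
        sig[c + k] += 1 - abs(k) / 5.0
peaks = detect_qrs(sig, fs)   # one detection near each spike
```

The abstract's point is precisely where this sketch falls short: the fixed threshold fails across amplitude ranges and noise types, and numerical efficiency of each stage decides whether the detector fits on a battery-operated device.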

  2. Revisiting QRS Detection Methodologies for Portable, Wearable, Battery-Operated, and Wireless ECG Systems

    PubMed Central

    Elgendi, Mohamed; Eskofier, Björn; Dokos, Socrates; Abbott, Derek

    2014-01-01

    Cardiovascular diseases are the number one cause of death worldwide. Currently, portable battery-operated systems such as mobile phones with wireless ECG sensors have the potential to be used in continuous cardiac function assessment that can be easily integrated into daily life. These portable point-of-care diagnostic systems can therefore help unveil and treat cardiovascular diseases. The basis for ECG analysis is a robust detection of the prominent QRS complex, as well as other ECG signal characteristics. However, it is not clear from the literature which ECG analysis algorithms are suited for an implementation on a mobile device. We investigate current QRS detection algorithms based on three assessment criteria: 1) robustness to noise, 2) parameter choice, and 3) numerical efficiency, in order to target a universal fast-robust detector. Furthermore, existing QRS detection algorithms may provide an acceptable solution only on small segments of ECG signals, within a certain amplitude range, or amid particular types of arrhythmia and/or noise. These issues are discussed in the context of a comparison with the most conventional algorithms, followed by future recommendations for developing reliable QRS detection schemes suitable for implementation on battery-operated mobile devices. PMID:24409290

  3. A Design Tool for Robust Composite Structures

    DTIC Science & Technology

    2010-06-01

    UNIVERSITY OF CAMBRIDGE. FINAL REPORT: A Design Tool for Robust Composite Structures. Frank Zok, Materials Department, University of ... organic fibers, especially Dyneema®. The principal objectives of the present study were to ascertain the fundamental mechanical properties of Dyneema ... composites increases by a factor of 2 and the ductility by almost a factor of 3 over the strain rate range 10^-3 s^-1 to 10^4 s^-1. One consequence is

  4. A Program in Air Transportation Technology (Joint University Program)

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1996-01-01

    The Joint University Program on Air Transportation Technology was conducted at Princeton University from 1971 to 1995. Our vision was to further understanding of the design and operation of transport aircraft, of the effects of atmospheric environment on aircraft flight, and of the development and utilization of the National Airspace System. As an adjunct, the program emphasized the independent research of both graduate and undergraduate students. Recent principal goals were to develop and verify new methods for design and analysis of intelligent flight control systems, aircraft guidance logic for recovery from wake vortex encounter, and robust flight control systems. Our research scope subsumed problems associated with multidisciplinary aircraft design synthesis and analysis based on flight physics, providing a theoretical basis for developing innovative control concepts that enhance aircraft performance and safety. Our research focus was of direct interest not only to NASA but to manufacturers of aircraft and their associated systems. Our approach, metrics, and future directions are described in the remainder of the report.

  5. Biological robustness.

    PubMed

    Kitano, Hiroaki

    2004-11-01

    Robustness is a ubiquitously observed property of biological systems. It is considered to be a fundamental feature of complex evolvable systems. It is attained by several underlying principles that are universal to both biological organisms and sophisticated engineering systems. Robustness facilitates evolvability and robust traits are often selected by evolution. Such a mutually beneficial process is made possible by specific architectural features observed in robust systems. But there are trade-offs between robustness, fragility, performance and resource demands, which explain system behaviour, including the patterns of failure. Insights into inherent properties of robust systems will provide us with a better understanding of complex diseases and a guiding principle for therapy design.

  6. 2016 Summer Series - Ophir Frieder - Searching Harsh Environments

    NASA Image and Video Library

    2016-07-12

    Analysis of selective data that fits our investigative tools may lead to erroneous or limited conclusions. The universe consists of multiple states, and our recording of them adds complexity. By finding methods to increase the robustness of our digital data collection and applying likely-relationship search methods that can handle all the data, we will increase the resolution of our conclusions. Frieder will present methods to increase our ability to capture and search digital data.

  7. Education in a Research University

    ERIC Educational Resources Information Center

    Arrow, Kenneth J. Ed.; And Others

    This collection of 30 essays on the character, administration, and management of research universities emphasizes the perspective of statistics and operations research. The essays include: "A Robust Faculty Planning Model" (Frederick Biedenweg); "Looking Back at Computer Models Employed in the Stanford University…

  8. Universal Faraday Rotation in HgTe Wells with Critical Thickness.

    PubMed

    Shuvaev, A; Dziom, V; Kvon, Z D; Mikhailov, N N; Pimenov, A

    2016-09-09

    The universal value of the Faraday rotation angle close to the fine structure constant (α≈1/137) is experimentally observed in thin HgTe quantum wells with a thickness on the border between trivial insulating and the topologically nontrivial Dirac phases. The quantized value of the Faraday angle remains robust in the broad range of magnetic fields and gate voltages. Dynamic Hall conductivity of the holelike carriers extracted from the analysis of the transmission data shows a theoretically predicted universal value of σ_{xy}=e^{2}/h, which is consistent with the doubly degenerate Dirac state. On shifting the Fermi level by the gate voltage, the effective sign of the charge carriers changes from positive (holes) to negative (electrons). The electronlike part of the dynamic response does not show quantum plateaus and is well described within the classical Drude model.
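
The quoted values are "universal" because they are fixed by fundamental constants alone. A quick numerical check with the SI-exact CODATA values, using α = e²/(2ε₀hc) and σ_xy = e²/h:

```python
# 2018 CODATA (SI-exact) values
e = 1.602176634e-19      # elementary charge, C
h = 6.62607015e-34       # Planck constant, J*s
c = 2.99792458e8         # speed of light in vacuum, m/s
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

alpha = e**2 / (2 * eps0 * h * c)   # fine structure constant, ~1/137
sigma_xy = e**2 / h                 # Hall conductivity quantum, siemens
```

The doubly degenerate Dirac state reported in the abstract contributes one conductance quantum e²/h in total, and the Faraday angle plateaus at approximately alpha radians.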

  9. Random Bits Forest: a Strong Classifier/Regressor for Big Data

    NASA Astrophysics Data System (ADS)

    Wang, Yi; Li, Yi; Pu, Weilin; Wen, Kathryn; Shugart, Yin Yao; Xiong, Momiao; Jin, Li

    2016-07-01

    Efficiency, memory consumption, and robustness are common problems with many popular methods for data analysis. As a solution, we present Random Bits Forest (RBF), a classification and regression algorithm that integrates neural networks (for depth), boosting (for width), and random forests (for prediction accuracy). Through a gradient boosting scheme, it first generates and selects ~10,000 small, 3-layer random neural networks. These networks are then fed into a modified random forest algorithm to obtain predictions. Testing with datasets from the UCI (University of California, Irvine) Machine Learning Repository shows that RBF outperforms other popular methods in both accuracy and robustness, especially with large datasets (N > 1000). The algorithm also performed highly in testing with an independent data set, a real psoriasis genome-wide association study (GWAS).

  10. Robust design of a 2-DOF GMV controller: a direct self-tuning and fuzzy scheduling approach.

    PubMed

    Silveira, Antonio S; Rodríguez, Jaime E N; Coelho, Antonio A R

    2012-01-01

    This paper presents a study on self-tuning control strategies with generalized minimum variance control in a fixed two-degree-of-freedom structure, or simply GMV2DOF, within two adaptive perspectives: one from the process-model point of view, using a recursive least squares estimator algorithm for direct self-tuning design, and another using a Mamdani fuzzy GMV2DOF parameter-scheduling technique based on analytical and physical interpretations from a robustness analysis of the system. Both strategies are assessed in simulation and in real-plant experiments with a damped pendulum and a wind tunnel under development at the Department of Automation and Systems of the Federal University of Santa Catarina.

  11. Development of a universal measure of quadrupedal forelimb-hindlimb coordination using digital motion capture and computerised analysis.

    PubMed

    Hamilton, Lindsay; Franklin, Robin J M; Jeffery, Nick D

    2007-09-18

    Clinical spinal cord injury in domestic dogs provides a model population in which to test the efficacy of putative therapeutic interventions for human spinal cord injury. To achieve this potential a robust method of functional analysis is required so that statistical comparison of numerical data derived from treated and control animals can be achieved. In this study we describe the use of digital motion capture equipment combined with mathematical analysis to derive a simple quantitative parameter - 'the mean diagonal coupling interval' - to describe coordination between forelimb and hindlimb movement. In normal dogs this parameter is independent of size, conformation, speed of walking or gait pattern. We show here that mean diagonal coupling interval is highly sensitive to alterations in forelimb-hindlimb coordination in dogs that have suffered spinal cord injury, and can be accurately quantified, but is unaffected by orthopaedic perturbations of gait. Mean diagonal coupling interval is an easily derived, highly robust measurement that provides an ideal method to compare the functional effect of therapeutic interventions after spinal cord injury in quadrupeds.
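
Once footfall onset times are extracted from the motion-capture data, a coupling measure of this kind reduces to simple interval arithmetic. The sketch below is one plausible formalisation for a single diagonal limb pair, not necessarily the authors' exact definition of the mean diagonal coupling interval:

```python
def mean_diagonal_coupling(fore_times, hind_times):
    """Average interval from each hindlimb footfall back to the nearest
    preceding footfall of the diagonal forelimb (a hypothetical
    formalisation). Times are in seconds, sorted ascending."""
    intervals = []
    for ht in hind_times:
        preceding = [ft for ft in fore_times if ft <= ht]
        if preceding:
            intervals.append(ht - preceding[-1])
    return sum(intervals) / len(intervals) if intervals else float("nan")
```

For a perfectly regular gait the measure is constant regardless of walking speed, which is consistent with the abstract's claim that the parameter is independent of size, conformation, speed, or gait pattern in normal dogs.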

  12. Factors Influencing University Research Performance

    ERIC Educational Resources Information Center

    Edgar, Fiona; Geare, Alan

    2013-01-01

    This research extends our understanding of research productivity by examining features of managerial practice and culture within university departments. Adopting a robust comparative research design, capturing both interview and survey data sourced from multiple stakeholders from New Zealand universities, we seek to identify factors associated…

  13. Robust control of dielectric elastomer diaphragm actuator for human pulse signal tracking

    NASA Astrophysics Data System (ADS)

    Ye, Zhihang; Chen, Zheng; Asmatulu, Ramazan; Chan, Hoyin

    2017-08-01

    Human pulse signal tracking is an emerging technology needed in traditional Chinese medicine. Tracking a human pulse signal, however, requires soft actuation with multi-frequency tracking capability. Dielectric elastomer (DE) is one type of soft actuator with great potential for human pulse signal tracking. In this paper, a DE diaphragm actuator was designed and fabricated to track a human pulse pressure signal. A physics-based, control-oriented model was developed to capture the dynamic behavior of the DE diaphragm actuator. Using this physical model, an H-infinity robust controller was designed for the actuator to reject high-frequency sensing noises and disturbances. The controller was then implemented in real time to track a multi-frequency signal, verifying the tracking capability and robustness of the control system. In the human pulse signal tracking test, a human pulse signal was measured at the City University of Hong Kong and then tracked by the DE actuator at Wichita State University in the US. Experimental results verified that the DE actuator with its robust control is capable of tracking a human pulse signal.

  14. Glomerular structural-functional relationship models of diabetic nephropathy are robust in type 1 diabetic patients.

    PubMed

    Mauer, Michael; Caramori, Maria Luiza; Fioretto, Paola; Najafian, Behzad

    2015-06-01

    Studies of structural-functional relationships have improved understanding of the natural history of diabetic nephropathy (DN). However, in order to consider structural end points for clinical trials, the robustness of the resultant models needs to be verified. This study examined whether structural-functional relationship models derived from a large cohort of type 1 diabetic (T1D) patients with a wide range of renal function are robust. The predictability of models derived from multiple regression analysis and piecewise linear regression analysis was also compared. T1D patients (n = 161) with research renal biopsies were divided into two equal groups matched for albumin excretion rate (AER). Models to explain AER and glomerular filtration rate (GFR) by classical DN lesions in one group (T1D-model, or T1D-M) were applied to the other group (T1D-test, or T1D-T) and regression analyses were performed. T1D-M-derived models explained 70 and 63% of AER variance and 32 and 21% of GFR variance in T1D-M and T1D-T, respectively, supporting the substantial robustness of the models. Piecewise linear regression analyses substantially improved predictability of the models with 83% of AER variance and 66% of GFR variance explained by classical DN glomerular lesions alone. These studies demonstrate that DN structural-functional relationship models are robust, and if appropriate models are used, glomerular lesions alone explain a major proportion of AER and GFR variance in T1D patients. © The Author 2014. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  15. Potential cost-effectiveness of universal access to modern contraceptives in Uganda.

    PubMed

    Babigumira, Joseph B; Stergachis, Andy; Veenstra, David L; Gardner, Jacqueline S; Ngonzi, Joseph; Mukasa-Kivunike, Peter; Garrison, Louis P

    2012-01-01

    Over two-thirds of women who need contraception in Uganda lack access to modern effective methods. This study was conducted to estimate the potential cost-effectiveness of achieving universal access to modern contraceptives in Uganda by implementing a hypothetical new contraceptive program (NCP) from both societal and governmental (Ministry of Health (MoH)) perspectives. A Markov model was developed to compare the NCP to the status quo or current contraceptive program (CCP). The model followed a hypothetical cohort of 15-year-old girls over a lifetime horizon. Data were obtained from the Uganda National Demographic and Health Survey and from published and unpublished sources. Costs, life expectancy, disability-adjusted life expectancy, pregnancies, fertility and incremental cost-effectiveness measured as cost per life-year (LY) gained, cost per disability-adjusted life-year (DALY) averted, cost per pregnancy averted and cost per unit of fertility reduction were calculated. Univariate and probabilistic sensitivity analyses were performed to examine the robustness of results. Mean discounted life expectancy and disability-adjusted life expectancy (DALE) were higher under the NCP vs. CCP (28.74 vs. 28.65 years and 27.38 vs. 27.01, respectively). Mean pregnancies and live births per woman were lower under the NCP (9.51 vs. 7.90 and 6.92 vs. 5.79, respectively). Mean lifetime societal costs per woman were lower for the NCP from the societal perspective ($1,949 vs. $1,987) and the MoH perspective ($636 vs. $685). In the incremental analysis, the NCP dominated the CCP, i.e. it was both less costly and more effective. The results were robust to univariate and probabilistic sensitivity analysis. Universal access to modern contraceptives in Uganda appears to be highly cost-effective. Increasing contraceptive coverage should be considered among Uganda's public health priorities.
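The incremental analysis described above can be sketched in a few lines. The figures are the abstract's societal-perspective costs and disability-adjusted life expectancies; the `incremental_analysis` helper is our illustrative naming, not the authors' model code.

```python
# Sketch of an incremental cost-effectiveness comparison. When the new
# program is both cheaper and more effective it "dominates" and no ratio
# is needed; otherwise the incremental cost-effectiveness ratio is returned.

def incremental_analysis(cost_new, eff_new, cost_old, eff_old):
    d_cost, d_eff = cost_new - cost_old, eff_new - eff_old
    if d_cost <= 0 and d_eff > 0:
        return "dominant"      # less costly and more effective
    if d_cost > 0 and d_eff <= 0:
        return "dominated"     # more costly and no more effective
    return d_cost / d_eff      # cost per extra unit of effect (e.g. per DALE year)

# NCP vs. CCP, societal perspective: $1,949 vs. $1,987; DALE 27.38 vs. 27.01.
print(incremental_analysis(1949, 27.38, 1987, 27.01))  # dominant
```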

  16. Vehicle System Integration, Optimization, and Robustness

    Science.gov Websites

    Thrust Area 5: Vehicle System Integration, Optimization, and Robustness. This thrust area addresses not only optimal design of the vehicle components, but also an optimization of the interactions between

  17. A Robust Alternative to the Normal Distribution.

    DTIC Science & Technology

    1982-07-07

    Approved for any purpose of the United States Government. Department of Statistics, Stanford University, Stanford, California. A Robust Alternative to the... Stanford University Technical Report No. 3. [5] Bhattacharya, S. K. (1966). A Modified Bessel Function Model in Life Testing. Metrika 10, 133-144.

  18. Universal Faraday Rotation in HgTe Wells with Critical Thickness

    NASA Astrophysics Data System (ADS)

    Shuvaev, A.; Dziom, V.; Kvon, Z. D.; Mikhailov, N. N.; Pimenov, A.

    2016-09-01

    The universal value of the Faraday rotation angle, close to the fine structure constant (α ≈ 1/137), is experimentally observed in thin HgTe quantum wells with a thickness on the border between the trivially insulating and the topologically nontrivial Dirac phases. The quantized value of the Faraday angle remains robust over a broad range of magnetic fields and gate voltages. The dynamic Hall conductivity of the holelike carriers extracted from the analysis of the transmission data shows a theoretically predicted universal value of σxy = e²/h, which is consistent with the doubly degenerate Dirac state. On shifting the Fermi level by the gate voltage, the effective sign of the charge carriers changes from positive (holes) to negative (electrons). The electronlike part of the dynamic response does not show quantum plateaus and is well described within the classical Drude model.

  19. Sixty-Five Years of University Education in Nigeria: Some Key Cross Cutting Issues

    ERIC Educational Resources Information Center

    Ejoigu, Aloy; Sule, Sheidu

    2012-01-01

    This paper traces briefly the history and development of university education in Nigeria from one university in 1948 to a total of 118 universities as at the time of writing the paper. Besides the chronicle, the paper examines some cross-cutting issues that tend to scuttle the otherwise good intentions and robust programme initiatives of the…

  20. BigDataScript: a scripting language for data pipelines.

    PubMed

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.

  1. Precise calculation of a bond percolation transition and survival rates of nodes in a complex network.

    PubMed

    Kawamoto, Hirokazu; Takayasu, Hideki; Jensen, Henrik Jeldtoft; Takayasu, Misako

    2015-01-01

    Through precise numerical analysis, we reveal a new type of universal loopless percolation transition in randomly removed complex networks. As an example of a real-world network, we apply our analysis to a business relation network consisting of approximately 3,000,000 links among 300,000 firms and observe the transition with critical exponents close to the mean-field values taking into account the finite size effect. We focus on the largest cluster at the critical point, and introduce survival probability as a new measure characterizing the robustness of each node. We also discuss the relation between survival probability and k-shell decomposition.
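As an illustration of the survival-probability idea (our toy reconstruction, not the authors' code for the 300,000-firm network), the sketch below runs repeated bond-percolation trials on a small graph and records how often each node ends up in the largest cluster.

```python
import random

# Illustrative sketch: bond percolation on a small graph, estimating each
# node's "survival probability" -- the fraction of random link-removal trials
# in which the node remains part of the largest cluster.

def largest_cluster(nodes, edges):
    parent = {v: v for v in nodes}          # union-find forest
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v
    for a, b in edges:
        parent[find(a)] = find(b)
    clusters = {}
    for v in nodes:
        clusters.setdefault(find(v), set()).add(v)
    return max(clusters.values(), key=len)

def survival_probability(nodes, edges, keep_prob, trials=2000, seed=1):
    rng = random.Random(seed)
    hits = {v: 0 for v in nodes}
    for _ in range(trials):
        kept = [e for e in edges if rng.random() < keep_prob]
        for v in largest_cluster(nodes, kept):
            hits[v] += 1
    return {v: hits[v] / trials for v in nodes}

nodes = list(range(6))
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5)]
p = survival_probability(nodes, edges, keep_prob=0.7)
# A node in the well-connected triangle survives far more often than the
# tail of the dangling chain.
print(p[2] > p[5])  # True
```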

  2. Robust diagnosis of non-Hodgkin lymphoma phenotypes validated on gene expression data from different laboratories.

    PubMed

    Bhanot, Gyan; Alexe, Gabriela; Levine, Arnold J; Stolovitzky, Gustavo

    2005-01-01

    A major challenge in cancer diagnosis from microarray data is the need for robust, accurate classification models which are independent of the analysis techniques used and can combine data from different laboratories. We propose such a classification scheme, originally developed for phenotype identification from mass spectrometry data. The method uses a robust multivariate gene selection procedure and combines the results of several machine learning tools trained on raw and pattern data to produce an accurate meta-classifier. We illustrate and validate our method by applying it to gene expression datasets: the oligonucleotide HuGeneFL microarray dataset of Shipp et al. (www.genome.wi.mit.du/MPR/lymphoma) and the Hu95Av2 Affymetrix dataset (DallaFavera's laboratory, Columbia University). Our pattern-based meta-classification technique achieves higher predictive accuracies than each of the individual classifiers, is robust against data perturbations and provides subsets of related predictive genes. Our techniques predict that combinations of some genes in the p53 pathway are highly predictive of phenotype. In particular, we find that in 80% of DLBCL cases the mRNA level of at least one of the three genes p53, PLK1 and CDK2 is elevated, while in 80% of FL cases, the mRNA level of at most one of them is elevated.

  3. University Roles in Technological Innovation in California. Research & Occasional Paper Series: CSHE.6.07

    ERIC Educational Resources Information Center

    King, C. Judson

    2007-01-01

    California has achieved considerable economic success through technological innovation and the formation of businesses based upon those technologies. This paper addresses some of the roles of universities in that success story. It starts with some measures of the contributions of innovation and a robust university structure to the California…

  4. Strengthening research governance for sustainable research: experiences from three Zimbabwean universities.

    PubMed

    Mashaah, Thokozile; Hakim, James; Chidzonga, Midion; Kangwende, Rugare A; Naik, Yogeshkumar; Federspiel, Nancy; Fiorillo, Suzanne; Scott, Jim; Gomo, Exnevia

    2014-08-01

    A robust research system requires a robust governance framework. As part of the Medical Education Partnership Initiative, three Zimbabwean universities partnered with two U.S. universities in a project to strengthen research governance in the Zimbabwean universities. The project aimed at (1) developing research policies, (2) strengthening central research management offices, (3) developing a research administration curriculum, and (4) enhancing awareness about the role and relevance of research administration in other universities and research institutions in Zimbabwe. Through the efforts of the partners, a generic research policy was developed and successfully adapted by the institutions. A curriculum was drafted, and module development experts are helping to finalize the curriculum to meet university requirements for accreditation of training research administrators. The Association of Research Managers of Zimbabwe was established to promote information sharing and professionalize research administration. The consortium approach enabled rapid and smooth development and adoption of research policies in the institutions. It also helped researchers and managers accept research administration as an essential structure and function. The experiences and lessons learned are reported here to benefit other institutions and consortia.

  5. Strengthening Research Governance for Sustainable Research: Experiences from Three Zimbabwean Universities

    PubMed Central

    Mashaah, Thokozile; Hakim, James; Chidzonga, Midion; Kangwende, Rugare A.; Naik, Yogeshkumar; Federspiel, Nancy; Fiorillo, Suzanne; Scott, Jim; Gomo, Exnevia

    2014-01-01

    A robust research system requires a robust governance framework. As part of the Medical Education Partnership Initiative, three Zimbabwean universities partnered with two US universities in a project to strengthen research governance in the Zimbabwean universities. The project aimed at (1) developing research policies; (2) strengthening central research management offices; (3) developing a research administration curriculum; and (4) enhancing awareness about the role and relevance of research administration in other universities and research institutions in Zimbabwe. Through the efforts of the partners, a generic research policy was developed and successfully adapted by the institutions. A curriculum was drafted, and module development experts are helping to finalize the curriculum to meet university requirements for accreditation of training research administrators. The Association of Research Managers of Zimbabwe was established to promote information sharing and professionalize research administration. The consortium approach enabled rapid and smooth development and adoption of research policies in the institutions. It also helped researchers and managers accept research administration as an essential structure and function. The experiences and lessons learned are reported here to benefit other institutions and consortia. PMID:25072583

  6. Epoch of Reionization: An Investigation of the Semi-Analytic 21CMMC Code

    NASA Astrophysics Data System (ADS)

    Miller, Michelle

    2018-01-01

    After the Big Bang the universe was filled with neutral hydrogen that began to cool and collapse into the first structures. These first stars and galaxies emitted radiation that eventually ionized all of the neutral hydrogen in the universe. 21CMMC is a semi-numerical code that takes simulated boxes of this ionized universe from another code called 21cmFAST. Mock measurements are taken from the simulated boxes produced by 21cmFAST, fed into 21CMMC, and used to determine three major parameters of the simulated universe: virial temperature, mean free path, and ionization efficiency. My project tests the robustness of 21CMMC on universe simulations other than 21cmFAST, to see whether 21CMMC can properly reconstruct early-universe parameters given a mock “measurement” in the form of power spectra. We determine that while two of the three EoR parameters (virial temperature and ionization efficiency) are somewhat reconstructable, the mean free path parameter is the least robust. This calls for further development of the 21CMMC code.

  7. Joint Sparsity-Based Robust Multimodal Biometrics Recognition

    DTIC Science & Technology

    2012-10-07

    William Marsh Rice University, Office of Sponsored Research, Houston, TX 77005. ...Shekhar1, Vishal M. Patel1, Nasser M. Nasrabadi2, and Rama Chellappa1; 1 University of Maryland, College Park, USA; 2 Army Research Lab, Adelphi, USA. ...authentication. Unfortunately these systems often have to deal with some of the following inevitable problems [1]: (a) noisy data; (b) non-universality

  8. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-off on Phenotype Robustness in Biological Networks Part I: Gene Regulatory Networks in Systems and Evolutionary Biology

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be enough to confer intrinsic robustness in order to tolerate intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of the biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey on biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was discovered that the phenotype robustness criterion for biological networks at different levels relies upon intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this is true, the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through their corresponding phenotype robustness criterion from the systematic point of view. PMID:23515240
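The inequality in the abstract can be read as a simple feasibility check. The sketch below is only a numerical illustration of that criterion, with arbitrary made-up robustness values.

```python
# Minimal numerical illustration of the abstract's phenotype robustness
# criterion: phenotype robustness holds when the sum of the intrinsic,
# genetic and environmental robustness demands does not exceed the network
# robustness. The numbers below are arbitrary, for illustration only.

def phenotype_robust(intrinsic, genetic, environmental, network):
    return intrinsic + genetic + environmental <= network

print(phenotype_robust(0.2, 0.3, 0.1, 0.7))  # True: criterion satisfied
print(phenotype_robust(0.4, 0.3, 0.2, 0.7))  # False: criterion violated
```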

  9. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-off on Phenotype Robustness in Biological Networks Part I: Gene Regulatory Networks in Systems and Evolutionary Biology.

    PubMed

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be enough to confer intrinsic robustness in order to tolerate intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of the biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey on biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was discovered that the phenotype robustness criterion for biological networks at different levels relies upon intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this is true, the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through their corresponding phenotype robustness criterion from the systematic point of view.

  10. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    PubMed

    Xu, Maoqi; Chen, Liang

    2018-01-01

    The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice analytical power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomics studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Robust and High Order Computational Method for Parachute and Air Delivery and MAV System

    DTIC Science & Technology

    2017-11-01

    Report: Robust and High Order Computational Method for Parachute and Air Delivery and MAV System. The views, opinions and/or findings contained in this... Report Term: 0-Other. Email: xiaolin.li... ...model coupled with an incompressible fluid solver through the impulse method. Our approach to simulating the parachute system is based on the front

  12. A psychometric evaluation of the University of Auckland General Practice Report of Educational Environment: UAGREE.

    PubMed

    Eggleton, Kyle; Goodyear-Smith, Felicity; Henning, Marcus; Jones, Rhys; Shulruf, Boaz

    2017-03-01

    The aim of this study was to develop an instrument (University of Auckland General Practice Report of Educational Environment: UAGREE) with robust psychometric properties that measured the educational environment of undergraduate primary care. The questions were designed to incorporate measurements of the teaching of cultural competence. Following a structured consensus process and an initial pilot, a list of 55 questions was developed. All Year 5 and 6 students completing a primary care attachment at the University of Auckland were invited to complete the questionnaire. The results were analysed using exploratory factor analysis and confirmatory factor analysis, resulting in a 16-item instrument. Three factors were identified, explaining 53% of the variance. The items' reliability within the factors was high (Learning: 0.894; Teaching: 0.871; Cultural competence: 0.857). Multiple-group analyses by gender, and separately across ethnic groups, did not find significant differences between groups. UAGREE is a specific instrument measuring the undergraduate primary care educational environment. Its questions fit within established theoretical educational environment frameworks, and the incorporation of cultural competence questions reflects the importance of teaching cultural competence within medicine. The psychometric properties of UAGREE suggest that it is a reliable and valid measure of the primary care education environment.

  13. Maternal Serologic Screening to Prevent Congenital Toxoplasmosis: A Decision-Analytic Economic Model

    PubMed Central

    Stillwaggon, Eileen; Carrier, Christopher S.; Sautter, Mari; McLeod, Rima

    2011-01-01

    Objective To determine a cost-minimizing option for congenital toxoplasmosis in the United States. Methodology/Principal Findings A decision-analytic and cost-minimization model was constructed to compare monthly maternal serological screening, prenatal treatment, and post-natal follow-up and treatment according to the current French (Paris) protocol, versus no systematic screening or perinatal treatment. Costs are based on published estimates of lifetime societal costs of developmental disabilities and current diagnostic and treatment costs. Probabilities are based on published results and clinical practice in the United States and France. One- and two-way sensitivity analyses are used to evaluate robustness of results. Universal monthly maternal screening for congenital toxoplasmosis with follow-up and treatment, following the French protocol, is found to be cost-saving, with savings of $620 per child screened. Results are robust to changes in test costs, value of statistical life, seroprevalence in women of childbearing age, fetal loss due to amniocentesis, and to bivariate analysis of test costs and incidence of primary T. gondii infection in pregnancy. Given the parameters in this model and a maternal screening test cost of $12, screening is cost-saving for rates of congenital infection above 1 per 10,000 live births. If universal testing generates economies of scale in diagnostic tools—lowering test costs to about $2 per test—universal screening is cost-saving at rates of congenital infection well below the lowest reported rates in the United States of 1 per 10,000 live births. Conclusion/Significance Universal screening according to the French protocol is cost saving for the US population within broad parameters for costs and probabilities. PMID:21980546

  14. The Secular University and Its Critics

    ERIC Educational Resources Information Center

    Jobani, Yuval

    2016-01-01

    Universities in the USA have become bastions of secularity in a distinctly religious society. As such, they are subjected to a variety of robust and rigorous religious critiques. In this paper I do not seek to engage in the debate between the supporters of the secular university and its opponents. Furthermore, I do not claim to summarize the…

  15. Three-Component Soliton States in Spinor F =1 Bose-Einstein Condensates

    NASA Astrophysics Data System (ADS)

    Bersano, T. M.; Gokhroo, V.; Khamehchi, M. A.; D'Ambroise, J.; Frantzeskakis, D. J.; Engels, P.; Kevrekidis, P. G.

    2018-02-01

    Dilute-gas Bose-Einstein condensates are an exceptionally versatile test bed for the investigation of novel solitonic structures. While matter-wave solitons in one- and two-component systems have been the focus of intense research efforts, an extension to three components has never been attempted in experiments. Here, we experimentally demonstrate the existence of robust dark-bright-bright (DBB) and dark-dark-bright solitons in a multicomponent F =1 condensate. We observe lifetimes on the order of hundreds of milliseconds for these structures. Our theoretical analysis, based on a multiscale expansion method, shows that small-amplitude solitons of these types obey universal long-short wave resonant interaction models, namely, Yajima-Oikawa systems. Our experimental and analytical findings are corroborated by direct numerical simulations highlighting the persistence of, e.g., the DBB soliton states, as well as their robust oscillations in the trap.

  16. Three-Component Soliton States in Spinor F=1 Bose-Einstein Condensates.

    PubMed

    Bersano, T M; Gokhroo, V; Khamehchi, M A; D'Ambroise, J; Frantzeskakis, D J; Engels, P; Kevrekidis, P G

    2018-02-09

    Dilute-gas Bose-Einstein condensates are an exceptionally versatile test bed for the investigation of novel solitonic structures. While matter-wave solitons in one- and two-component systems have been the focus of intense research efforts, an extension to three components has never been attempted in experiments. Here, we experimentally demonstrate the existence of robust dark-bright-bright (DBB) and dark-dark-bright solitons in a multicomponent F=1 condensate. We observe lifetimes on the order of hundreds of milliseconds for these structures. Our theoretical analysis, based on a multiscale expansion method, shows that small-amplitude solitons of these types obey universal long-short wave resonant interaction models, namely, Yajima-Oikawa systems. Our experimental and analytical findings are corroborated by direct numerical simulations highlighting the persistence of, e.g., the DBB soliton states, as well as their robust oscillations in the trap.

  17. Comparison as a Universal Learning Action

    ERIC Educational Resources Information Center

    Merkulova, T. V.

    2016-01-01

    This article explores "comparison" as a universal metasubject learning action, a key curricular element envisaged by the Russian Federal State Educational Standards. Representing the modern learner's fundamental pragmatic skill embedding such core capacities as information processing, critical thinking, robust decision-making, and…

  18. A stochastic model of firm growth

    NASA Astrophysics Data System (ADS)

    Bottazzi, Giulio; Secchi, Angelo

    2003-06-01

    Analyses of different databases have recently shown the tent shape of the distribution of firm growth rates to be a robust and universal characteristic of the time evolution of corporate firms. We add new evidence on this topic and present a new stochastic model that, under rather general assumptions, provides a robust explanation for the observed regularity.
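The "tent shape" on a semi-logarithmic plot is the signature of a Laplace (double-exponential) density of growth rates. As a hedged illustration of that reading (not the authors' model), the sketch below samples Laplace growth rates, using the fact that the difference of two i.i.d. exponential variates is Laplace-distributed, and confirms the heavy tails via the excess kurtosis (3 for a Laplace vs. 0 for a Gaussian).

```python
import random

# Illustrative sketch: Laplace-distributed growth rates produce the
# tent-shaped log-density and an excess kurtosis of 3, unlike Gaussian
# growth rates (excess kurtosis 0).

def laplace_sample(rng, scale=0.05):
    # Difference of two i.i.d. exponential variates is Laplace(0, scale).
    return rng.expovariate(1 / scale) - rng.expovariate(1 / scale)

rng = random.Random(42)
growth = [laplace_sample(rng) for _ in range(100_000)]
mean = sum(growth) / len(growth)
var = sum((x - mean) ** 2 for x in growth) / len(growth)
excess_kurtosis = sum((x - mean) ** 4 for x in growth) / len(growth) / var ** 2 - 3
print(excess_kurtosis > 1.5)  # heavy tails, far above the Gaussian value 0
```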

  19. Composition-spread Growth and the Robust Topological Surface State of Kondo Insulator SmB6 Thin Films

    DTIC Science & Technology

    2014-01-01

    1 Center for Nanophysics & Advanced Materials, University of Maryland, College Park, Maryland 20742, USA; 2 Department of Physics, University of Maryland, College Park, Maryland 20742, USA; 3 Department of Mechanical Engineering and Materials Science, Duke University, Durham, NC 27708; 4 Department of Materials Science and Engineering, University of Michigan, Ann Arbor, Michigan 48109, USA; 5 Department of Materials Science & Engineering

  20. A Review of Some Aspects of Robust Inference for Time Series.

    DTIC Science & Technology

    1984-09-01

    A Review of Some Aspects of Robust Inference for Time Series, by R. D. Martin. Technical Report No. 53, September 1984, Department of Statistics, University of Washington, Seattle. One cannot hope to have a good method for dealing with outliers in time series by using only an instantaneous nonlinear transformation of the data.

  1. SU-E-T-625: Robustness Evaluation and Robust Optimization of IMPT Plans Based on Per-Voxel Standard Deviation of Dose Distributions.

    PubMed

    Liu, W; Mohan, R

    2012-06-01

    Proton dose distributions, IMPT in particular, are highly sensitive to setup and range uncertainties. We report a novel method, based on per-voxel standard deviation (SD) of dose distributions, to evaluate the robustness of proton plans and to robustly optimize IMPT plans to render them less sensitive to uncertainties. For each optimization iteration, nine dose distributions are computed - the nominal one, and one each for ± setup uncertainties along the x, y and z axes and for ± range uncertainty. The SD of dose in each voxel is used to create an SD-volume histogram (SVH) for each structure. The SVH may be considered a quantitative representation of the robustness of the dose distribution. For optimization, the desired robustness may be specified in terms of an SD-volume (SV) constraint on the CTV and incorporated as a term in the objective function. Results of optimization with and without this constraint were compared in terms of plan optimality and robustness using the so-called 'worst case' dose distributions, which are obtained by assigning the lowest among the nine doses to each voxel in the clinical target volume (CTV) and the highest to normal tissue voxels outside the CTV. The SVH curve and the area under it for each structure were used as quantitative measures of robustness. The penalty parameter of the SV constraint may be varied to control the tradeoff between robustness and plan optimality. We applied these methods to one case each of H&N and lung. In both cases, we found that imposing the SV constraint improved plan robustness but at the cost of normal tissue sparing. SVH-based optimization and evaluation is an effective tool for robustness evaluation and robust optimization of IMPT plans. Studies need to be conducted to test the methods for larger cohorts of patients and for other sites.
This research is supported by National Cancer Institute (NCI) grant P01CA021239, the University Cancer Foundation via the Institutional Research Grant program at the University of Texas MD Anderson Cancer Center, and MD Anderson’s cancer center support grant CA016672. © 2012 American Association of Physicists in Medicine.
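The per-voxel SD and SVH machinery described above can be sketched in a few lines. This is a toy illustration of the general idea, not the authors' implementation; the dose values and thresholds below are invented:

```python
import numpy as np

# Nine dose distributions: nominal, +/- setup shifts along x, y, z, and
# +/- range uncertainty, reduced to a per-voxel SD, then summarized as an
# SD-volume histogram (SVH). Toy data in Gy; not real treatment-plan doses.
rng = np.random.default_rng(0)
n_voxels = 1000
doses = 60.0 + rng.normal(0.0, 1.5, size=(9, n_voxels))  # doses[i, v]

sd_per_voxel = doses.std(axis=0)  # per-voxel SD over the nine scenarios

def svh(sd, thresholds):
    """Fraction of voxels whose dose SD exceeds each threshold
    (an analogue of a dose-volume histogram, but over SD)."""
    return np.array([(sd >= t).mean() for t in thresholds])

thresholds = np.linspace(0.0, 3.0, 31)
curve = svh(sd_per_voxel, thresholds)
# Area under the SVH curve: one scalar robustness summary (smaller = more robust)
auc = float(np.sum(0.5 * (curve[1:] + curve[:-1]) * np.diff(thresholds)))
```

The `auc` value mirrors the "area under the SVH curve" measure the abstract uses to compare plans.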

  2. Universal distribution of mutational effects on protein stability, uncoupling of protein robustness from sequence evolution and distinct evolutionary modes of prokaryotic and eukaryotic proteins

    NASA Astrophysics Data System (ADS)

    Faure, Guilhem; Koonin, Eugene V.

    2015-05-01

    Robustness to destabilizing effects of mutations is thought of as a key factor of protein evolution. The connections between two measures of robustness, the relative core size and the computationally estimated effect of mutations on protein stability (ΔΔG), protein abundance and the selection pressure on protein-coding genes (dN/dS) were analyzed for the organisms with a large number of available protein structures including four eukaryotes, two bacteria and one archaeon. The distribution of the effects of mutations in the core on protein stability is universal and indistinguishable in eukaryotes and bacteria, centered at slightly destabilizing amino acid replacements, and with a heavy tail of more strongly destabilizing replacements. The distribution of mutational effects in the hyperthermophilic archaeon Thermococcus gammatolerans is significantly shifted toward strongly destabilizing replacements, which is indicative of stronger constraints that are imposed on proteins in hyperthermophiles. The median effect of mutations is strongly and positively correlated with the relative core size, evidencing the congruence between the two measures of protein robustness. However, both measures show only limited correlations to the expression level and selection pressure on protein-coding genes. Thus, the degree of robustness reflected in the universal distribution of mutational effects appears to be a fundamental, ancient feature of globular protein folds whereas the observed variations are largely neutral and uncoupled from short term protein evolution. A weak anticorrelation between protein core size and selection pressure is observed only for surface residues in prokaryotes but a stronger anticorrelation is observed for all residues in eukaryotic proteins. This substantial difference between proteins of prokaryotes and eukaryotes is likely to stem from the demonstrable higher compactness of prokaryotic proteins.

  3. Robust and sparse correlation matrix estimation for the analysis of high-dimensional genomics data.

    PubMed

    Serra, Angela; Coretto, Pietro; Fratello, Michele; Tagliaferri, Roberto; Stegle, Oliver

    2018-02-15

    Microarray technology can be used to study the expression of thousands of genes across a number of different experimental conditions, usually hundreds. The underlying principle is that genes sharing similar expression patterns, across different samples, can be part of the same co-expression system, or they may share the same biological functions. Groups of genes are usually identified based on cluster analysis. Clustering methods rely on the similarity matrix between genes. A common choice to measure similarity is to compute the sample correlation matrix. Dimensionality reduction is another popular data analysis task which is also based on covariance/correlation matrix estimates. Unfortunately, covariance/correlation matrix estimation suffers from the intrinsic noise present in high-dimensional data. Sources of noise include sampling variation, the presence of outlying sample units, and the fact that in most cases the number of genes is much larger than the number of sample units. In this paper, we propose a robust correlation matrix estimator that is regularized based on adaptive thresholding. The resulting method jointly tames the effects of high dimensionality and data contamination. Computations are easy to implement and do not require hand tuning. Both simulated and real data are analyzed. A Monte Carlo experiment shows that the proposed method performs remarkably well. Our correlation metric is more robust to outliers than the existing alternatives in two gene expression datasets. It is also shown how the regularization allows spurious correlations to be automatically detected and filtered. The same regularization is also extended to other less robust correlation measures. Finally, we apply the ARACNE algorithm to the SyNTreN gene expression data. Sensitivity and specificity of the reconstructed network are compared with the gold standard. We show that ARACNE performs better when it takes the proposed correlation matrix estimator as input.
The R software is available at https://github.com/angy89/RobustSparseCorrelation. aserra@unisa.it or robtag@unisa.it. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
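The two ingredients the abstract combines, robustness to outliers and sparsification by thresholding, can be illustrated with a much simpler stand-in than the paper's estimator: a rank-based (Spearman) correlation plus hard thresholding at the common sqrt(log p / n) level. This is an assumption-laden sketch of the general idea, not the RobustSparseCorrelation method itself:

```python
import numpy as np

def rank_corr(X):
    """Spearman correlation: Pearson correlation of per-column ranks,
    which is far less sensitive to gross outliers than raw Pearson."""
    ranks = X.argsort(axis=0).argsort(axis=0).astype(float)
    return np.corrcoef(ranks, rowvar=False)

def threshold_corr(R, n):
    """Hard-threshold off-diagonal entries below sqrt(log p / n),
    a common default level in sparse covariance/correlation estimation."""
    p = R.shape[0]
    t = np.sqrt(np.log(p) / n)
    S = np.where(np.abs(R) >= t, R, 0.0)
    np.fill_diagonal(S, 1.0)
    return S

rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=n)  # one strong co-expression pair
X[0, 2] = 50.0                                # a gross outlier in one sample
R = threshold_corr(rank_corr(X), n)
```

The strong pair survives thresholding while most spurious noise correlations are set to zero, which is the filtering behavior the abstract describes.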

  4. A robust optimization methodology for preliminary aircraft design

    NASA Astrophysics Data System (ADS)

    Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.

    2016-05-01

    This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.
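The reduction used above, from an uncertain linear program to a tractable deterministic "robust counterpart", can be sketched for the simplest case of interval (box) uncertainty: a constraint a^T x <= b that must hold for every a with |a_i - a0_i| <= delta_i reduces, for x >= 0, to (a0 + delta)^T x <= b. The numbers below are toy values, not from the aircraft study:

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([-1.0, -1.2])    # maximize x1 + 1.2*x2 (linprog minimizes)
a0 = np.array([3.0, 4.0])     # nominal constraint coefficients
delta = np.array([0.3, 0.5])  # interval uncertainty on each coefficient
b = 12.0

# Nominal problem: a0^T x <= b
nom = linprog(c, A_ub=[a0], b_ub=[b], bounds=[(0, None)] * 2)
# Robust counterpart: worst-case coefficients (a0 + delta)^T x <= b
rob = linprog(c, A_ub=[a0 + delta], b_ub=[b], bounds=[(0, None)] * 2)
```

The robust optimum is never better than the nominal one; the gap is the "price of robustness" traded for feasibility under every admissible coefficient realization.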

  5. Precise Calculation of a Bond Percolation Transition and Survival Rates of Nodes in a Complex Network

    PubMed Central

    Kawamoto, Hirokazu; Takayasu, Hideki; Jensen, Henrik Jeldtoft; Takayasu, Misako

    2015-01-01

    Through precise numerical analysis, we reveal a new type of universal loopless percolation transition in randomly removed complex networks. As an example of a real-world network, we apply our analysis to a business relation network consisting of approximately 3,000,000 links among 300,000 firms and observe the transition with critical exponents close to the mean-field values taking into account the finite size effect. We focus on the largest cluster at the critical point, and introduce survival probability as a new measure characterizing the robustness of each node. We also discuss the relation between survival probability and k-shell decomposition. PMID:25885791
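The survival-probability measure introduced above can be mimicked on a toy network: keep each link independently with probability p, find the largest connected cluster, and record how often each node lands in it over many trials. A minimal stdlib sketch (union-find for clusters; the hub-plus-ring network is invented for illustration):

```python
import random
from collections import defaultdict

def largest_cluster(n, edges):
    """Largest connected component of an n-node graph via union-find."""
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    clusters = defaultdict(set)
    for node in range(n):
        clusters[find(node)].add(node)
    return max(clusters.values(), key=len)

def survival_probability(n, edges, p, trials=200, seed=0):
    """Fraction of bond-percolation trials in which each node belongs
    to the largest cluster (each link kept with probability p)."""
    rng = random.Random(seed)
    hits = [0] * n
    for _ in range(trials):
        kept = [e for e in edges if rng.random() < p]
        for node in largest_cluster(n, kept):
            hits[node] += 1
    return [h / trials for h in hits]

# Ring of 20 nodes with a hub (node 0) linked to most others:
n = 20
edges = [(i, (i + 1) % n) for i in range(n)] + [(0, i) for i in range(2, n - 1)]
surv = survival_probability(n, edges, p=0.5)
```

As expected, the high-degree hub survives in the largest cluster far more often than peripheral ring nodes, which is the sense in which survival probability characterizes per-node robustness.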

  6. Compactness and robustness: Applications in the solution of integral equations for chemical kinetics and electromagnetic scattering

    NASA Astrophysics Data System (ADS)

    Zhou, Yajun

    This thesis employs the topological concept of compactness to deduce robust solutions to two integral equations arising from chemistry and physics: the inverse Laplace problem in chemical kinetics and the vector wave scattering problem in dielectric optics. The inverse Laplace problem occurs in the quantitative understanding of biological processes that exhibit complex kinetic behavior: different subpopulations of transition events from the "reactant" state to the "product" state follow distinct reaction rate constants, which results in a weighted superposition of exponential decay modes. Reconstruction of the rate constant distribution from kinetic data is often critical for mechanistic understandings of chemical reactions related to biological macromolecules. We devise a "phase function approach" to recover the probability distribution of rate constants from decay data in the time domain. The robustness (numerical stability) of this reconstruction algorithm builds upon the continuity of the transformations connecting the relevant function spaces that are compact metric spaces. The robust "phase function approach" not only is useful for the analysis of heterogeneous subpopulations of exponential decays within a single transition step, but also is generalizable to the kinetic analysis of complex chemical reactions that involve multiple intermediate steps. A quantitative characterization of light scattering is central to many meteorological, optical, and medical applications. We give a rigorous treatment to electromagnetic scattering on arbitrarily shaped dielectric media via the Born equation: an integral equation with a strongly singular convolution kernel that corresponds to a non-compact Green operator. By constructing a quadratic polynomial of the Green operator that cancels out the kernel singularity and satisfies the compactness criterion, we reveal the universality of a real resonance mode in dielectric optics.
Meanwhile, exploiting the properties of compact operators, we outline the geometric and physical conditions that guarantee a robust solution to the light scattering problem, and devise an asymptotic solution to the Born equation of electromagnetic scattering for arbitrarily shaped dielectric in a non-perturbative manner.

  7. Quantitative analysis of sitagliptin using the (19)F-NMR method: a universal technique for fluorinated compound detection.

    PubMed

    Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya

    2015-01-07

    To expand the application scope of nuclear magnetic resonance (NMR) technology in quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG). Ciprofloxacin (Cipro) was used as the internal standard (IS). Influential factors impacting the accuracy and precision of spectral data, including the relaxation delay time (d1) and pulse angle, are systematically optimized. Method validation has been carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of the (19)F-NMR technology in quantitative analysis of pharmaceutical analytes, the assay result has been compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate that there is no significant difference between these two methods. Due to the advantages of (19)F-NMR, such as higher resolution and suitability for biological samples, it can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
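Internal-standard qNMR quantitation of the kind described above rests on one relation: the analyte content follows from the ratio of signal integrals, scaled by the number of contributing nuclei, molar masses, and weighed amounts. A sketch with the standard formula; all numeric inputs below are illustrative, not assay data from the paper:

```python
def qnmr_purity(I_a, I_s, N_a, N_s, M_a, M_s, m_a, m_s, purity_s):
    """Analyte purity (fraction) from 19F signal integrals.
    I: integrals; N: fluorine nuclei per molecule behind each signal;
    M: molar masses (g/mol); m: weighed masses (same units); purity_s:
    certified purity of the internal standard."""
    return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * purity_s

# Toy values (hypothetical integrals, nuclei counts, and masses):
purity = qnmr_purity(I_a=1.50, I_s=1.00, N_a=3, N_s=1,
                     M_a=523.3, M_s=331.3, m_a=12.0, m_s=10.0, purity_s=0.998)
```

Because the masses enter only as a ratio, any consistent unit (mg, g) works; this is why qNMR needs no analyte-specific calibration curve.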

  8. Ready, Fire, Aim: The College Campus Gun Fight

    ERIC Educational Resources Information Center

    Birnbaum, Robert

    2013-01-01

    The question of whether guns should be permitted on college and university campuses in the United States reflects the tension between two competing perspectives. America has both a robust gun culture and an equally robust (if less well known) gun-control culture. The gun culture is as American as apple pie: There may be as many as 300 million…

  9. Moving Faces

    ERIC Educational Resources Information Center

    Journal of College Science Teaching, 2005

    2005-01-01

    A recent study by Zara Ambadar and Jeffrey F. Cohn of the University of Pittsburgh and Jonathan W. Schooler of the University of British Columbia, examined how motion affects people's judgment of subtle facial expressions. Two experiments demonstrated robust effects of motion in facilitating the perception of subtle facial expressions depicting…

  10. Optimization of Robust HPLC Method for Quantitation of Ambroxol Hydrochloride and Roxithromycin Using a DoE Approach.

    PubMed

    Patel, Rashmin B; Patel, Nilay M; Patel, Mrunali R; Solanki, Ajay B

    2017-03-01

    The aim of this work was to develop and optimize a robust HPLC method for the separation and quantitation of ambroxol hydrochloride and roxithromycin utilizing Design of Experiment (DoE) approach. The Plackett-Burman design was used to assess the impact of independent variables (concentration of organic phase, mobile phase pH, flow rate and column temperature) on peak resolution, USP tailing and number of plates. A central composite design was utilized to evaluate the main, interaction, and quadratic effects of independent variables on the selected dependent variables. The optimized HPLC method was validated based on ICH Q2R1 guideline and was used to separate and quantify ambroxol hydrochloride and roxithromycin in tablet formulations. The findings showed that DoE approach could be effectively applied to optimize a robust HPLC method for quantification of ambroxol hydrochloride and roxithromycin in tablet formulations. Statistical comparison between results of proposed and reported HPLC method revealed no significant difference; indicating the ability of proposed HPLC method for analysis of ambroxol hydrochloride and roxithromycin in pharmaceutical formulations. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Stakeholder perspectives on implementing a universal Lynch syndrome screening program: a qualitative study of early barriers and facilitators.

    PubMed

    Schneider, Jennifer L; Davis, James; Kauffman, Tia L; Reiss, Jacob A; McGinley, Cheryl; Arnold, Kathleen; Zepp, Jamilyn; Gilmore, Marian; Muessig, Kristin R; Syngal, Sapna; Acheson, Louise; Wiesner, Georgia L; Peterson, Susan K; Goddard, Katrina A B

    2016-02-01

    Evidence-based guidelines recommend that all newly diagnosed colon cancer be screened for Lynch syndrome (LS), but best practices for implementing universal tumor screening have not been extensively studied. We interviewed a range of stakeholders in an integrated health-care system to identify initial factors that might promote or hinder the successful implementation of a universal LS screening program. We conducted interviews with health-plan leaders, managers, and staff. Interviews were audio-recorded and transcribed. Thematic analysis began with a grounded approach and was also guided by the Practical Robust Implementation and Sustainability Model (PRISM). We completed 14 interviews with leaders/managers and staff representing involved clinical and health-plan departments. Although stakeholders supported the concept of universal screening, they identified several internal (organizational) and external (environment) factors that promote or hinder implementation. Facilitating factors included perceived benefits of screening for patients and organization, collaboration between departments, and availability of organizational resources. Barriers were also identified, including: lack of awareness of guidelines, lack of guideline clarity, staffing and program "ownership" concerns, and cost uncertainties. Analysis also revealed nine important infrastructure-type considerations for successful implementation. We found that clinical, laboratory, and administrative departments supported universal tumor screening for LS. Requirements for successful implementation may include interdepartmental collaboration and communication, patient and provider/staff education, and significant infrastructure and resource support related to laboratory processing and systems for electronic ordering and tracking.

  12. Robust Methods for Moderation Analysis with a Two-Level Regression Model.

    PubMed

    Yang, Miao; Yuan, Ke-Hai

    2016-01-01

    Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
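The second robust method above, M-estimation with Huber-type weights, is usually computed by iteratively reweighted least squares (IRLS). A minimal plain-regression sketch of that machinery (not the paper's two-level moderation model; tuning constant k = 1.345 is the conventional choice for 95% efficiency under normality):

```python
import numpy as np

def huber_irls(X, y, k=1.345, n_iter=50):
    """Huber M-estimator of regression coefficients via IRLS."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale (MAD)
        u = r / max(s, 1e-12)
        w = np.where(np.abs(u) <= k, 1.0, k / np.abs(u))  # Huber weights
        sw = np.sqrt(w)  # weighted LS: minimize sum w_i * r_i^2
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta

rng = np.random.default_rng(2)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(scale=0.5, size=n)
y[:5] += 30.0  # a few gross outliers
beta = huber_irls(X, y)
```

Unlike OLS, whose intercept is dragged upward by the contaminated observations, the Huber fit recovers coefficients close to the true (1, 2) because large residuals receive weights proportional to k/|u|.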

  13. Potential Cost-Effectiveness of Universal Access to Modern Contraceptives in Uganda

    PubMed Central

    Babigumira, Joseph B.; Stergachis, Andy; Veenstra, David L.; Gardner, Jacqueline S.; Ngonzi, Joseph; Mukasa-Kivunike, Peter; Garrison, Louis P.

    2012-01-01

    Background Over two thirds of women who need contraception in Uganda lack access to modern effective methods. This study was conducted to estimate the potential cost-effectiveness of achieving universal access to modern contraceptives in Uganda by implementing a hypothetical new contraceptive program (NCP) from both societal and governmental (Ministry of Health (MoH)) perspectives. Methodology/Principal Findings A Markov model was developed to compare the NCP to the status quo or current contraceptive program (CCP). The model followed a hypothetical cohort of 15-year old girls over a lifetime horizon. Data were obtained from the Uganda National Demographic and Health Survey and from published and unpublished sources. Costs, life expectancy, disability-adjusted life expectancy, pregnancies, fertility and incremental cost-effectiveness measured as cost per life-year (LY) gained, cost per disability-adjusted life-year (DALY) averted, cost per pregnancy averted and cost per unit of fertility reduction were calculated. Univariate and probabilistic sensitivity analyses were performed to examine the robustness of results. Mean discounted life expectancy and disability-adjusted life expectancy (DALE) were higher under the NCP vs. CCP (28.74 vs. 28.65 years and 27.38 vs. 27.01 respectively). Mean pregnancies and live births per woman were lower under the NCP than the CCP (7.90 vs. 9.51 pregnancies and 5.79 vs. 6.92 live births, respectively). Mean lifetime costs per woman were lower for the NCP from the societal perspective ($1,949 vs. $1,987) and the MoH perspective ($636 vs. $685). In the incremental analysis, the NCP dominated the CCP, i.e. it was both less costly and more effective. The results were robust to univariate and probabilistic sensitivity analysis. Conclusion/Significance Universal access to modern contraceptives in Uganda appears to be highly cost-effective. Increasing contraceptive coverage should be considered among Uganda's public health priorities. PMID:22363480
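The "dominance" conclusion in the incremental analysis above is simple arithmetic on the reported societal-perspective figures: the NCP costs less and yields more disability-adjusted life expectancy, so no incremental cost-effectiveness ratio needs to be computed. Using the numbers from the abstract:

```python
# Societal-perspective values reported in the abstract (per woman):
cost_ccp, cost_ncp = 1987.0, 1949.0  # mean lifetime cost, USD
dale_ccp, dale_ncp = 27.01, 27.38    # disability-adjusted life expectancy, years

d_cost = cost_ncp - cost_ccp     # negative: the NCP saves money
d_effect = dale_ncp - dale_ccp   # positive: the NCP adds healthy years
dominant = d_cost < 0 and d_effect > 0  # cheaper AND more effective
```

When an intervention is dominant like this, an ICER (d_cost / d_effect) would be negative and is conventionally not reported.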

  14. Task-sharing or public finance for the expansion of surgical access in rural Ethiopia: an extended cost-effectiveness analysis.

    PubMed

    Shrime, Mark G; Verguet, Stéphane; Johansson, Kjell Arne; Desalegn, Dawit; Jamison, Dean T; Kruk, Margaret E

    2016-07-01

    Despite a high burden of surgical disease, access to surgical services in low- and middle-income countries is often limited. In line with the World Health Organization's current focus on universal health coverage and equitable access to care, we examined how policies to expand access to surgery in rural Ethiopia would impact health, impoverishment and equity. An extended cost-effectiveness analysis was performed. Deterministic and stochastic models of surgery in rural Ethiopia were constructed, utilizing pooled estimates of costs and probabilities from national surveys and published literature. Model calibration and validation were performed against published estimates, with sensitivity analyses on model assumptions to check for robustness. Outcomes of interest were the number of deaths averted, the number of cases of poverty averted and the number of cases of catastrophic expenditure averted for each policy, divided across wealth quintiles. Health benefits, financial risk protection and equity appear to be in tension in the expansion of access to surgical care in rural Ethiopia. Health benefits from each of the examined policies accrued primarily to the poor. However, without travel vouchers, many policies also induced impoverishment in the poor while providing financial risk protection to the rich, calling into question the equitable distribution of benefits by these policies. Adding travel vouchers removed the impoverishing effects of a policy but decreased the health benefit that could be bought per dollar spent. These results were robust to sensitivity analyses. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  15. Multi-complexity ensemble measures for gait time series analysis: application to diagnostics, monitoring and biometrics.

    PubMed

    Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina

    2015-01-01

    Previously, we proposed using complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed the robustness of such multi-complexity measures for heart rate variability analysis with the emphasis on detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such an ensemble-based approach could also be effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have significantly wider application scope ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.

  16. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so phenotype stability of biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view. 
Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances, is also proposed, together with a simulation example. PMID:23515190

  17. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology.

    PubMed

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so phenotype stability of biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view. 
Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances, is also proposed, together with a simulation example.

  18. A comparative study of multivariable robustness analysis methods as applied to integrated flight and propulsion control

    NASA Technical Reports Server (NTRS)

    Schierman, John D.; Lovell, T. A.; Schmidt, David K.

    1993-01-01

    Three multivariable robustness analysis methods are compared and contrasted. The focus of the analysis is on system stability and performance robustness to uncertainty in the coupling dynamics between two interacting subsystems. Of particular interest are interacting airframe and engine subsystems, and an example airframe/engine vehicle configuration is utilized in the demonstration of these approaches. The singular value (SV) and structured singular value (SSV) analysis methods are compared to a method especially well suited for analysis of robustness to uncertainties in subsystem interactions. This approach is referred to here as the interacting subsystem (IS) analysis method. This method has been used previously to analyze airframe/engine systems, emphasizing the study of stability robustness. However, performance robustness is also investigated here, and a new measure of allowable uncertainty for acceptable performance robustness is introduced. The IS methodology does not require plant uncertainty models to measure the robustness of the system, and is shown to yield valuable information regarding the effects of subsystem interactions. In contrast, the SV and SSV methods allow for the evaluation of the robustness of the system to particular models of uncertainty, and do not directly indicate how the airframe (engine) subsystem interacts with the engine (airframe) subsystem.

  19. Effect of interaction strength on robustness of controlling edge dynamics in complex networks

    NASA Astrophysics Data System (ADS)

    Pang, Shao-Peng; Hao, Fei

    2018-05-01

    Robustness plays a critical role in the controllability of complex networks to withstand failures and perturbations. Recent advances in the edge controllability show that the interaction strength among edges plays a more important role than network structure. Therefore, we focus on the effect of interaction strength on the robustness of edge controllability. Using three categories of all edges to quantify the robustness, we develop a universal framework to evaluate and analyze the robustness in complex networks with arbitrary structures and interaction strengths. Applying our framework to a large number of model and real-world networks, we find that the interaction strength is a dominant factor for the robustness in undirected networks. Meanwhile, the strongest robustness and the optimal edge controllability in undirected networks can be achieved simultaneously. Different from the case of undirected networks, the robustness in directed networks is determined jointly by the interaction strength and the network's degree distribution. Moreover, a stronger robustness is usually associated with a larger number of driver nodes required to maintain full control in directed networks. This prompts us to provide an optimization method by adjusting the interaction strength to optimize the robustness of edge controllability.

  20. Accounting for technical noise in differential expression analysis of single-cell RNA sequencing data.

    PubMed

    Jia, Cheng; Hu, Yu; Kelly, Derek; Kim, Junhyong; Li, Mingyao; Zhang, Nancy R

    2017-11-02

    Recent technological breakthroughs have made it possible to measure RNA expression at the single-cell level, thus paving the way for exploring expression heterogeneity among individual cells. Current single-cell RNA sequencing (scRNA-seq) protocols are complex and introduce technical biases that vary across cells, which can bias downstream analysis without proper adjustment. To account for cell-to-cell technical differences, we propose a statistical framework, TASC (Toolkit for Analysis of Single Cell RNA-seq), an empirical Bayes approach to reliably model the cell-specific dropout rates and amplification bias by use of external RNA spike-ins. TASC incorporates the technical parameters, which reflect cell-to-cell batch effects, into a hierarchical mixture model to estimate the biological variance of a gene and detect differentially expressed genes. More importantly, TASC is able to adjust for covariates to further eliminate confounding that may originate from cell size and cell cycle differences. In simulation and real scRNA-seq data, TASC achieves accurate Type I error control and displays competitive sensitivity and improved robustness to batch effects in differential expression analysis, compared to existing methods. TASC is programmed to be computationally efficient, taking advantage of multi-threaded parallelization. We believe that TASC will provide a robust platform for researchers to leverage the power of scRNA-seq. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
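
    A minimal sketch of the spike-in idea (hypothetical counts, not the TASC model itself): external RNA spike-ins with known input amounts allow a per-cell log-log regression whose slope and intercept stand in for amplification bias and capture efficiency.

```python
import numpy as np

# Hypothetical spike-in data for one cell; TASC's actual hierarchical model
# is more elaborate, this only illustrates the cell-specific calibration idea.
known_molecules = np.array([10.0, 100.0, 1000.0, 10000.0])  # spiked-in input amounts
observed_counts = np.array([2.0, 21.0, 180.0, 2100.0])      # observed read counts

# Regress observed log-counts on known log-amounts: the slope reflects
# amplification bias and the intercept reflects capture efficiency.
slope, intercept = np.polyfit(np.log(known_molecules), np.log(observed_counts), 1)
print(slope, intercept)  # slope near 1, intercept near log(0.2) for this cell
```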

  1. To sling or not to sling at time of abdominal sacrocolpopexy: a cost-effectiveness analysis.

    PubMed

    Richardson, Monica L; Elliott, Christopher S; Shaw, Jonathan G; Comiter, Craig V; Chen, Bertha; Sokol, Eric R

    2013-10-01

    We compare the cost-effectiveness of 3 strategies for the use of a mid urethral sling to prevent occult stress urinary incontinence in patients undergoing abdominal sacrocolpopexy. Using decision analysis modeling we compared cost-effectiveness during a 1-year postoperative period of 3 treatment approaches including 1) abdominal sacrocolpopexy alone with deferred option for mid urethral sling, 2) abdominal sacrocolpopexy with universal concomitant mid urethral sling and 3) preoperative urodynamic study for selective mid urethral sling. Using published data we modeled probabilities of stress urinary incontinence after abdominal sacrocolpopexy with or without mid urethral sling, the predictive value of urodynamic study to detect occult stress urinary incontinence and the likelihood of complications after mid urethral sling. Costs were derived from Medicare 2010 reimbursement rates. The main outcome modeled was incremental cost-effectiveness ratio per quality adjusted life-years gained. In addition to base case analysis, 1-way sensitivity analyses were performed. In our model, universally performing mid urethral sling at abdominal sacrocolpopexy was the most cost-effective approach with an incremental cost per quality adjusted life-year gained of $2,867 compared to abdominal sacrocolpopexy alone. Preoperative urodynamic study was more costly and less effective than universally performing intraoperative mid urethral sling. The cost-effectiveness of abdominal sacrocolpopexy plus mid urethral sling was robust to sensitivity analysis with a cost-effectiveness ratio consistently below $20,000 per quality adjusted life-year. Universal concomitant mid urethral sling is the most cost-effective prophylaxis strategy for occult stress urinary incontinence in women undergoing abdominal sacrocolpopexy. The use of preoperative urodynamic study to guide mid urethral sling placement at abdominal sacrocolpopexy is not cost-effective. 
Copyright © 2013 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
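
    The decision-analytic comparison rests on the incremental cost-effectiveness ratio (ICER): extra cost divided by extra quality-adjusted life-years. A generic sketch with hypothetical figures (not the study's numbers):

```python
def icer(cost_ref, qaly_ref, cost_new, qaly_new):
    """Incremental cost per quality-adjusted life-year gained by the new strategy."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical example: the new strategy costs $500 more and adds 0.25 QALYs
print(icer(10_000, 0.50, 10_500, 0.75))  # 2000.0 dollars per QALY gained
```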

  2. CNN universal machine as classification platform: an ART-like clustering algorithm.

    PubMed

    Bálya, David

    2003-12-01

    Fast and robust classification of feature vectors is a crucial task in a number of real-time systems. A cellular neural/nonlinear network universal machine (CNN-UM) can be very efficient as a feature detector. The next step is to post-process the results for object recognition. This paper shows how a robust classification scheme based on adaptive resonance theory (ART) can be mapped to the CNN-UM. Moreover, this mapping is general enough to include different types of feed-forward neural networks. The designed analogic CNN algorithm is capable of classifying the extracted feature vectors while keeping the advantages of the ART networks, such as robust, plastic and fault-tolerant behavior. An analogic algorithm is presented for unsupervised classification with tunable sensitivity and automatic new-class creation. The algorithm is extended for supervised classification. The presented binary feature vector classification is implemented on existing standard CNN-UM chips for fast classification. The experimental evaluation shows promising performance, achieving 100% accuracy on the training set.
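
    The resonance-and-match logic of an ART-style classifier can be sketched in plain Python (a simplified ART-1-like procedure for binary vectors, not the analogic CNN-UM implementation): the vigilance parameter is the tunable sensitivity, and an input matching no stored prototype automatically creates a new class.

```python
def art_cluster(vectors, vigilance=0.7):
    """Simplified ART-1-style clustering of non-empty binary vectors.

    Each input either resonates with an existing prototype (match ratio
    >= vigilance) or spawns a new class; resonance updates the prototype
    by elementwise AND, mimicking ART's fast learning."""
    prototypes, labels = [], []
    for v in vectors:
        for j, w in enumerate(prototypes):
            overlap = sum(a & b for a, b in zip(v, w))
            if overlap / sum(v) >= vigilance:          # resonance test
                prototypes[j] = tuple(a & b for a, b in zip(v, w))
                labels.append(j)
                break
        else:                                          # no match: new class
            prototypes.append(tuple(v))
            labels.append(len(prototypes) - 1)
    return labels, prototypes

labels, protos = art_cluster([(1, 1, 0, 0), (1, 1, 0, 0), (0, 0, 1, 1)])
print(labels)  # [0, 0, 1]
```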

  3. Universal non-adiabatic geometric manipulation of pseudo-spin charge qubits

    NASA Astrophysics Data System (ADS)

    Azimi Mousolou, Vahid

    2017-01-01

    Reliable quantum information processing requires high-fidelity universal manipulation of quantum systems within the characteristic coherence times. Non-adiabatic holonomic quantum computation offers a promising approach to implement fast, universal, and robust quantum logic gates, particularly useful in nano-fabricated solid-state architectures, which typically have short coherence times. Here, we propose an experimentally feasible scheme to realize high-speed universal geometric quantum gates in nano-engineered pseudo-spin charge qubits. We use a system of three coupled quantum dots containing a single electron, where two computational states of a double quantum dot charge qubit interact through an intermediate quantum dot. The additional degree of freedom introduced into the qubit makes it possible to create a geometric model system, which allows robust and efficient single-qubit rotations through careful control of the inter-dot tunneling parameters. We demonstrate that a capacitive coupling between two charge qubits permits a family of non-adiabatic holonomic controlled two-qubit entangling gates, and thus provides a promising procedure to maintain entanglement in charge qubits and a pathway toward fault-tolerant universal quantum computation. We assess the feasibility of the proposed structure by analyzing the gate fidelities.

  4. Genonets server-a web server for the construction, analysis and visualization of genotype networks.

    PubMed

    Khalid, Fahad; Aguilar-Rodríguez, José; Wagner, Andreas; Payne, Joshua L

    2016-07-08

    A genotype network is a graph in which vertices represent genotypes that have the same phenotype. Edges connect vertices if their corresponding genotypes differ in a single small mutation. Genotype networks are used to study the organization of genotype spaces. They have shed light on the relationship between robustness and evolvability in biological systems as different as RNA macromolecules and transcriptional regulatory circuits. Despite the importance of genotype networks, no tool exists for their automatic construction, analysis and visualization. Here we fill this gap by presenting the Genonets Server, a tool that provides the following features: (i) the construction of genotype networks for categorical and univariate phenotypes from DNA, RNA, amino acid or binary sequences; (ii) analyses of genotype network topology and how it relates to robustness and evolvability, as well as analyses of genotype network topography and how it relates to the navigability of a genotype network via mutation and natural selection; (iii) multiple interactive visualizations that facilitate exploratory research and education. The Genonets Server is freely available at http://ieu-genonets.uzh.ch. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
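
    The construction rule in the abstract (vertices with a shared phenotype, edges between genotypes one mutation apart) is easy to sketch; the genotypes below are hypothetical binary sequences, not Genonets input:

```python
from itertools import combinations

def genotype_network(genotypes):
    """Edges join equal-length genotype strings differing at exactly one position."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    return [(a, b) for a, b in combinations(genotypes, 2) if hamming(a, b) == 1]

# Hypothetical genotypes assumed to share one phenotype
print(genotype_network(["000", "001", "011", "111", "100"]))
# [('000', '001'), ('000', '100'), ('001', '011'), ('011', '111')]
```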

  5. CFD Analysis of the Oscillating Flow within a Stirling Engine with an Additively Manufactured Foil Type Regenerator

    NASA Astrophysics Data System (ADS)

    Qiu, Songgang; Solomon, Laura

    2017-11-01

    The simple design, fuel independence, and robustness of Stirling convertors make them an ideal choice for use in solar power and combined heat and power (CHP) applications. A lack of moving parts and the use of novel flexure bearings allow free-piston type Stirling engines to run in excess of ten years without degradation or maintenance. The key component to their overall efficiency is the regenerator. While a foil type regenerator outperforms a sintered random fiber regenerator, limitations in manufacturing and in keeping uniform spacing between the foils have limited their overall use. However, with the advent of additive manufacturing, a robust foil type regenerator can be cheaply manufactured without traditional limitations. A CFD analysis of the oscillating internal flow within the novel design was conducted to evaluate the flow losses within the system, particularly the pressure drop across the regenerator in comparison to a traditionally used random fiber regenerator. Additionally, the heat transfer and flow over the tubular heater head were evaluated. The results of the investigation will be used to optimize the operation of the next generation of additively manufactured Stirling convertors. This research was supported by ARPA-E and West Virginia University.

  6. A "Politically Robust" Experimental Design for Public Policy Evaluation, with Application to the Mexican Universal Health Insurance Program

    ERIC Educational Resources Information Center

    King, Gary; Gakidou, Emmanuela; Ravishankar, Nirmala; Moore, Ryan T.; Lakin, Jason; Vargas, Manett; Tellez-Rojo, Martha Maria; Avila, Juan Eugenio Hernandez; Avila, Mauricio Hernandez; Llamas, Hector Hernandez

    2007-01-01

    We develop an approach to conducting large-scale randomized public policy experiments intended to be more robust to the political interventions that have ruined some or all parts of many similar previous efforts. Our proposed design is insulated from selection bias in some circumstances even if we lose observations; our inferences can still be…

  7. SU-F-R-51: Radiomics in CT Perfusion Maps of Head and Neck Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nesteruk, M; Riesterer, O; Veit-Haibach, P

    2016-06-15

    Purpose: The aim of this study was to test the predictive value of radiomics features of CT perfusion (CTP) for tumor control, based on a preselection of radiomics features in a robustness study. Methods: 11 patients with head and neck cancer (HNC) and 11 patients with lung cancer were included in the robustness study to preselect stable radiomics parameters. Data from 36 HNC patients treated with definitive radiochemotherapy (median follow-up 30 months) was used to build a predictive model based on these parameters. All patients underwent pre-treatment CTP. 315 texture parameters were computed for three perfusion maps: blood volume, blood flow and mean transit time. The variability of texture parameters was tested with respect to non-standardizable perfusion computation factors (noise level and artery contouring) using intraclass correlation coefficients (ICC). The parameter with the highest ICC in the correlated group of parameters (inter-parameter Spearman correlations) was tested for its predictive value. The final model to predict tumor control was built using multivariate Cox regression analysis with backward selection of the variables. For comparison, a predictive model based on tumor volume was created. Results: Ten parameters were found to be stable in both HNC and lung cancer regarding potentially non-standardizable factors after the correction for inter-parameter correlations. In the multivariate backward selection of the variables, blood flow entropy showed a highly significant impact on tumor control (p=0.03) with concordance index (CI) of 0.76. Blood flow entropy was significantly lower in the patient group with controlled tumors at 18 months (p<0.1). The new model showed a higher concordance index compared to the tumor volume model (CI=0.68). Conclusion: The preselection of variables in the robustness study allowed building a predictive radiomics-based model of tumor control in HNC despite a small patient cohort. This model was found to be superior to the volume-based model. The project was supported by the KFSP Tumor Oxygenation of the University of Zurich, by a grant of the Center for Clinical Research, University and University Hospital Zurich and by a research grant from Merck (Schweiz) AG.
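
    The stability screen described above relies on the intraclass correlation coefficient. As an assumed simplification (the study may use a different ICC variant), a one-way ICC(1,1) for a parameter recomputed under perturbed settings looks like:

```python
import numpy as np

def icc_oneway(x):
    """One-way random-effects ICC(1,1) for an n-subjects x k-repeats matrix."""
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)
    msb = k * np.sum((row_means - grand) ** 2) / (n - 1)         # between-subject mean square
    msw = np.sum((x - row_means[:, None]) ** 2) / (n * (k - 1))  # within-subject mean square
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical texture parameter computed under two perturbation settings
stable = np.array([[1.0, 1.1], [2.0, 2.1], [3.0, 3.1], [4.0, 4.1]])
print(round(icc_oneway(stable), 3))  # 0.997: nearly all variance is between patients
```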

  8. Robust Ambiguity Estimation for an Automated Analysis of the Intensive Sessions

    NASA Astrophysics Data System (ADS)

    Kareinen, Niko; Hobiger, Thomas; Haas, Rüdiger

    2016-12-01

    Very Long Baseline Interferometry (VLBI) is a unique space-geodetic technique that can directly determine the Earth's phase of rotation, namely UT1. The daily estimates of the difference between UT1 and Coordinated Universal Time (UTC) are computed from one-hour-long VLBI Intensive sessions. These sessions are essential for providing timely UT1 estimates for satellite navigation systems. To produce timely UT1 estimates, efforts have been made to completely automate the analysis of VLBI Intensive sessions. This requires automated processing of X- and S-band group delays. These data often contain an unknown number of integer ambiguities in the observed group delays. In an automated analysis with the c5++ software, the standard approach to resolving the ambiguities is to perform a simplified parameter estimation using a least-squares adjustment (L2-norm minimization). We implement the robust L1-norm as an alternative estimation method in c5++. The implemented method is used to automatically estimate the ambiguities in VLBI Intensive sessions for the Kokee-Wettzell baseline. The results are compared to an analysis setup where the ambiguity estimation is computed using the L2-norm. Additionally, we investigate three alternative weighting strategies for the ambiguity estimation. The results show that in automated analysis the L1-norm resolves ambiguities better than the L2-norm. The use of the L1-norm leads to a significantly higher number of good-quality UT1-UTC estimates with each of the three weighting strategies.
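
    The robustness gain from the L1-norm can be seen in a toy sketch: for a constant-offset model the L2-norm estimate is the mean and the L1-norm estimate is the median, so a single ambiguity jump barely moves the L1 solution (illustrative numbers only, not c5++ output).

```python
import numpy as np

# Hypothetical post-fit delay residuals (ns); the 30 ns value mimics an
# unresolved integer ambiguity acting as a gross outlier.
residuals_ns = np.array([0.1, -0.2, 0.05, 0.0, 30.0])

l2_estimate = residuals_ns.mean()       # L2-norm (least-squares) minimizer
l1_estimate = np.median(residuals_ns)   # L1-norm minimizer

print(l2_estimate)  # ~5.99, dragged toward the outlier
print(l1_estimate)  # 0.05, essentially unaffected
```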

  9. Can We Talk? Employing Conversation to Ameliorate Undergraduate Distress at Catholic Colleges and Universities

    ERIC Educational Resources Information Center

    Petro, Susannah J. P.

    2017-01-01

    This article addresses students' need for robust relationships to counteract the epidemic of loneliness, anxiety, and depression pervading contemporary undergraduate life, and proposes that Catholic colleges and universities can find in Catholic theological anthropology a warrant for recognizing relationship-building as central to their mission.…

  10. Combining Research, Outreach and Student Learning: A New Model in Rhode Island

    ERIC Educational Resources Information Center

    Grossman-Garber, Deborah; Gold, Arthur; Husband, Thomas

    2001-01-01

    American research universities are renowned for applying cutting-edge science to the improvement of the world's health and environmental systems. Indeed, as a society, people have come to expect this type of intellectual leadership from their great universities. Less appreciated is the robust opportunity for state and local governments to harness…

  11. Feature Matching of Historical Images Based on Geometry of Quadrilaterals

    NASA Astrophysics Data System (ADS)

    Maiwald, F.; Schneider, D.; Henze, F.; Münster, S.; Niebling, F.

    2018-05-01

    This contribution shows an approach to match historical images from the photo library of the Saxon State and University Library Dresden (SLUB) in the context of a historical three-dimensional city model of Dresden. In comparison to recent images, historical photography presents diverse factors which make automatic image analysis (feature detection, feature matching and relative orientation of images) difficult. Due to e.g. film grain, dust particles or the digitization process, historical images are often covered by noise interfering with the image signal needed for robust feature matching. The presented approach uses quadrilaterals in image space as these are commonly available in man-made structures and façade images (windows, stones, claddings). It is explained how quadrilaterals can generally be detected in images. Consequently, the properties of the quadrilaterals as well as the relationships to neighbouring quadrilaterals are used for the description and matching of feature points. The results show that most of the matches are robust and correct but still small in number.

  12. Burnout among Finnish and Chinese university students.

    PubMed

    Hernesniemi, Elina; Räty, Hannu; Kasanen, Kati; Cheng, Xuejiao; Hong, Jianzhong; Kuittinen, Matti

    2017-10-01

    In this study the levels of experienced burnout of Finnish and Chinese university students are compared using School Burnout Inventory (SBI). This study is motivated by earlier studies, which suggest that the level of student burnout is different in the culturally distinct Finnish and Chinese university systems, but which are based on different research instruments for the two groups. The sample studied consisted of 3,035 Finnish students and 2,309 Chinese students. Because of the cross-cultural nature of this study the level of structural equivalence of SBI between the cultural groups was examined and the effect of different response styles on the results was taken into account. Both standard and robust statistical methods were used for the analyses. The results showed that SBI with two extracted components is suitable for cross-cultural analysis between Finnish and Chinese university students. Virtually no difference was found in experienced overall burnout between the Finnish and Chinese students, which means that both university systems contain factors causing similar levels of student burnout. This study also verified that controlling for the response styles is important in cross-cultural studies as it was found to have a distinct effect on the results obtained from mean-level comparisons. © 2017 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  13. Comparing Networks from a Data Analysis Perspective

    NASA Astrophysics Data System (ADS)

    Li, Wei; Yang, Jing-Yu

    To probe network characteristics, two predominant ways of network comparison are global property statistics and subgraph enumeration. However, they suffer from limited information and exhaustive computation. Here, we present an approach to compare networks from the perspective of data analysis. Initially, the approach projects each node of the original network as a high-dimensional data point, so that the network is seen as a cloud of data points. Then the dispersion information of the principal component analysis (PCA) projection of the generated data clouds can be used to distinguish networks. We applied this node projection method to the yeast protein-protein interaction networks and the Internet Autonomous System networks, two types of networks with several similar higher-order properties. The method can efficiently distinguish one from the other. Identical results on different datasets from independent sources also indicate that the method is a robust and universal framework.
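
    The node-projection idea can be sketched directly (toy networks, not the paper's datasets): treat each adjacency row as a data point, and use the PCA explained-variance profile of the resulting cloud as the network's dispersion signature.

```python
import numpy as np

def pca_dispersion(adj):
    """Explained-variance ratios of the PCA projection of the node cloud."""
    X = adj - adj.mean(axis=0)                     # center the node 'data cloud'
    _, s, _ = np.linalg.svd(X, full_matrices=False)
    var = s ** 2
    return var / var.sum()

# Hypothetical 6-node networks: a ring versus a star
n = 6
ring = np.zeros((n, n))
for i in range(n):
    ring[i, (i + 1) % n] = ring[(i + 1) % n, i] = 1.0
star = np.zeros((n, n))
star[0, 1:] = star[1:, 0] = 1.0

print(pca_dispersion(ring))  # the two dispersion signatures differ
print(pca_dispersion(star))
```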

  14. Impact of the macroeconomic factors on university budgeting the US and Russia

    NASA Astrophysics Data System (ADS)

    Bogomolova, Arina; Balk, Igor; Ivachenko, Natalya; Temkin, Anatoly

    2017-10-01

    This paper discusses the impact of macroeconomic factors on university budgeting. Modern developments in the areas of data science and machine learning have made it possible to utilise automated techniques to address problems ranging from genetic engineering and particle physics to sociology and economics. This paper is a first step toward a robust toolkit that will help universities withstand macroeconomic challenges by utilising modern predictive analytics techniques.

  15. Overview of NASA's Universe of Learning: An Integrated Astrophysics STEM Learning and Literacy Program

    NASA Astrophysics Data System (ADS)

    Smith, Denise; Lestition, Kathleen; Squires, Gordon; Biferno, Anya A.; Cominsky, Lynn; Manning, Colleen; NASA's Universe of Learning Team

    2018-01-01

    NASA's Universe of Learning creates and delivers science-driven, audience-driven resources and experiences designed to engage and immerse learners of all ages and backgrounds in exploring the universe for themselves. The project is the result of a unique partnership between the Space Telescope Science Institute, Caltech/IPAC, Jet Propulsion Laboratory, Smithsonian Astrophysical Observatory, and Sonoma State University, and is one of 27 competitively-selected cooperative agreements within the NASA Science Mission Directorate STEM Activation program. The NASA's Universe of Learning team draws upon cutting-edge science and works closely with Subject Matter Experts (scientists and engineers) from across the NASA Astrophysics Physics of the Cosmos, Cosmic Origins, and Exoplanet Exploration themes. Together we develop and disseminate data tools and participatory experiences, multimedia and immersive experiences, exhibits and community programs, and professional learning experiences that meet the needs of our audiences, with attention to underserved and underrepresented populations. In doing so, scientists and educators from the partner institutions work together as a collaborative, integrated Astrophysics team to support NASA objectives to enable STEM education, increase scientific literacy, advance national education goals, and leverage efforts through partnerships. Robust program evaluation is central to our efforts, and utilizes portfolio analysis, process studies, and studies of reach and impact. This presentation will provide an overview of NASA's Universe of Learning, our direct connection to NASA Astrophysics, and our collaborative work with the NASA Astrophysics science community.

  16. Robust logistic regression to narrow down the winner's curse for rare and recessive susceptibility variants.

    PubMed

    Kesselmeier, Miriam; Lorenzo Bermejo, Justo

    2017-11-01

    Logistic regression is the most common technique used for genetic case-control association studies. A disadvantage of standard maximum likelihood estimators of the genotype relative risk (GRR) is their strong dependence on outlier subjects, for example, patients diagnosed at unusually young age. Robust methods are available to constrain outlier influence, but they are scarcely used in genetic studies. This article provides a non-intimidating introduction to robust logistic regression, and investigates its benefits and limitations in genetic association studies. We applied the bounded Huber and extended the R package 'robustbase' with the re-descending Hampel functions to down-weight outlier influence. Computer simulations were carried out to assess the type I error rate, mean squared error (MSE) and statistical power according to major characteristics of the genetic study and investigated markers. Simulations were complemented with the analysis of real data. Both standard and robust estimation controlled type I error rates. Standard logistic regression showed the highest power but standard GRR estimates also showed the largest bias and MSE, in particular for associated rare and recessive variants. For illustration, a recessive variant with a true GRR=6.32 and a minor allele frequency=0.05 investigated in a 1000 case/1000 control study by standard logistic regression resulted in power=0.60 and MSE=16.5. The corresponding figures for Huber-based estimation were power=0.51 and MSE=0.53. Overall, Hampel- and Huber-based GRR estimates did not differ much. Robust logistic regression may represent a valuable alternative to standard maximum likelihood estimation when the focus lies on risk prediction rather than identification of susceptibility variants. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
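
    The down-weighting at the heart of the bounded Huber approach can be shown with its weight function (c = 1.345 is a common tuning constant, not necessarily the 'robustbase' default used in the study):

```python
def huber_weight(r, c=1.345):
    """Weight 1 for small standardized residuals, decaying as c/|r| beyond c;
    used inside iteratively reweighted estimation to bound outlier influence."""
    return 1.0 if abs(r) <= c else c / abs(r)

print(huber_weight(0.5))  # 1.0: an ordinary observation keeps full weight
print(huber_weight(6.0))  # ~0.22: an outlier (e.g. unusually young case) is down-weighted
```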

  17. Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares

    NASA Technical Reports Server (NTRS)

    Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.

    2012-01-01

    A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.

  18. What Lies Beneath: Saddam’s Legacy and the Roots of Resistance in Iraq

    DTIC Science & Technology

    2005-12-01

    Third World Politics: An Introduction, (Madison, WI: University of Wisconsin Press, 1985), 48-49. 10 Eva Bellin, “The Robustness of...464. 12 Bellin, “The Robustness of Authoritarianism,” 145-150. 6 of building layered security structures,13 as well as the extension of control...of reasons. Bellin points out that strong state institutions, relatively high economic development, ethnic homogeneity, historical experience of

  19. Evaluation of Ares-I Control System Robustness to Uncertain Aerodynamics and Flex Dynamics

    NASA Technical Reports Server (NTRS)

    Jang, Jiann-Woei; VanTassel, Chris; Bedrossian, Nazareth; Hall, Charles; Spanos, Pol

    2008-01-01

    This paper discusses the application of robust control theory to evaluate robustness of the Ares-I control systems. Three techniques for estimating upper and lower bounds of uncertain parameters which yield stable closed-loop response are used here: (1) Monte Carlo analysis, (2) mu analysis, and (3) characteristic frequency response analysis. All three methods are used to evaluate stability envelopes of the Ares-I control systems with uncertain aerodynamics and flex dynamics. The results show that characteristic frequency response analysis is the most effective of these methods for assessing robustness.
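
    Method (1) can be sketched in a few lines: sample the uncertain parameters, form the closed-loop state matrix, and check eigenvalue locations (the model below is a hypothetical second-order system, not the Ares-I dynamics).

```python
import numpy as np

def is_stable(a):
    # Continuous-time stability: every eigenvalue strictly in the left half-plane
    return bool(np.all(np.linalg.eigvals(a).real < 0))

def closed_loop(k):
    # Hypothetical closed-loop state matrix with uncertain stiffness k
    return np.array([[0.0, 1.0], [-k, -0.8]])

rng = np.random.default_rng(0)
samples = rng.uniform(0.2, 5.0, size=1000)   # assumed uncertainty range for k
stable_fraction = np.mean([is_stable(closed_loop(k)) for k in samples])
print(stable_fraction)  # 1.0: the whole sampled range is stable
```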

  20. Recommendations for Methicillin-Resistant Staphylococcus aureus Prevention in Adult ICUs: A Cost-Effectiveness Analysis.

    PubMed

    Whittington, Melanie D; Atherly, Adam J; Curtis, Donna J; Lindrooth, Richard C; Bradley, Cathy J; Campbell, Jonathan D

    2017-08-01

    Patients in the ICU are at the greatest risk of contracting healthcare-associated infections like methicillin-resistant Staphylococcus aureus. This study calculates the cost-effectiveness of methicillin-resistant S aureus prevention strategies and recommends specific strategies based on screening test implementation. A cost-effectiveness analysis using a Markov model from the hospital perspective was conducted to determine if the implementation costs of methicillin-resistant S aureus prevention strategies are justified by associated reductions in methicillin-resistant S aureus infections and improvements in quality-adjusted life years. Univariate and probabilistic sensitivity analyses determined the influence of input variation on the cost-effectiveness. ICU. Hypothetical cohort of adults admitted to the ICU. Three prevention strategies were evaluated, including universal decolonization, targeted decolonization, and screening and isolation. Because prevention strategies have a screening component, the screening test in the model was varied to reflect commonly used screening test categories, including conventional culture, chromogenic agar, and polymerase chain reaction. Universal and targeted decolonization are less costly and more effective than screening and isolation. This is consistent for all screening tests. When compared with targeted decolonization, universal decolonization is cost-saving to cost-effective, with maximum cost savings occurring when a hospital uses more expensive screening tests like polymerase chain reaction. Results were robust to sensitivity analyses. As compared with screening and isolation, the current standard practice in ICUs, targeted decolonization, and universal decolonization are less costly and more effective. This supports updating the standard practice to a decolonization approach.

  1. Creating More "Elbow Room" for Collaborative Reflective Practice in the Competitive, Performative Culture of Today's University

    ERIC Educational Resources Information Center

    Kennelly, Robert; McCormack, Coralie

    2015-01-01

    We live in "a world of clashing interests" [Zinn, H. (1991). "Declarations of independence: Cross-examining American ideology." Toronto: Harper Collins, p. xx]. In a grapple for survival, universities choose to spend less money and time on teaching and learning, less time on robust evaluation of student learning and…

  2. Towards a Theory of University Entrepreneurship: Developing a Theoretical Model

    ERIC Educational Resources Information Center

    Woollard, David

    2010-01-01

    This paper sets out to develop a robust theory in a largely atheoretical field of study. The increasing importance of entrepreneurship in delivering the "Third Mission" calls for an enhanced understanding of the university entrepreneurship phenomenon, not solely as a subject of academic interest but also to guide the work of practitioners in the…

  3. The Ecology of Arts and Humanities Education: Bridging the Worlds of Universities and Museums

    ERIC Educational Resources Information Center

    Salazar-Porzio, Margaret

    2015-01-01

    In recent years, colleges and universities have been talking seriously about civic learning, but other stakeholders, particularly public arts, culture, and humanities institutions, must be part of the conversation in order to create a context for learning that develops the skills of graduates in robust ways that reflect the full promise of liberal…

  4. Hierarchical CoP/Ni5P4/CoP microsheet arrays as a robust pH-universal electrocatalyst for efficient hydrogen generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, Ishwar Kumar; Zhou, Haiqing; Sun, Jingying

    Exceptional Pt-like electrocatalytic activity was achieved in a sandwich-like catalyst of CoP/Ni5P4/CoP microsheet arrays for pH-universal hydrogen evolution through simply wrapping Ni5P4 nanosheet arrays in CoP nanoparticles.

  5. The Death of the Large Lecture Hall, the Rise of Peer-to-Peer Course Delivery?

    ERIC Educational Resources Information Center

    Navarro, Peter

    2015-01-01

    This article reports the results of a pilot project conducted at the University of California--Irvine (UCI) involving the simultaneous online delivery of a course to both University of California undergraduates and enrollees on the Coursera Massive Open Online Course (MOOC) platform. Survey results from a robust sampling of UCI undergraduates…

  6. Cultural Navigators: International Faculty Fathers in the U.S. Research University

    ERIC Educational Resources Information Center

    Sallee, Margaret; Hart, Jeni

    2015-01-01

    Based on interviews with 16 international tenure-track and tenured faculty fathers from collectivist cultures at 2 U.S. research universities, this study explores how these men reconcile the demands of parenting with those of the academic career. Adding to a robust body of literature on the concerns of domestic faculty parents, this study focuses…

  7. Hierarchical CoP/Ni5P4/CoP microsheet arrays as a robust pH-universal electrocatalyst for efficient hydrogen generation

    DOE PAGES

    Mishra, Ishwar Kumar; Zhou, Haiqing; Sun, Jingying; ...

    2018-01-01

    Exceptional Pt-like electrocatalytic activity was achieved in a sandwich-like catalyst of CoP/Ni5P4/CoP microsheet arrays for pH-universal hydrogen evolution by simply wrapping Ni5P4 nanosheet arrays in CoP nanoparticles.

  8. A simple solid phase, peptide-based fluorescent assay for the efficient and universal screening of HRV 3C protease inhibitors.

    PubMed

    Schünemann, Katrin; Connelly, Stephen; Kowalczyk, Renata; Sperry, Jonathan; Wilson, Ian A; Fraser, John D; Brimble, Margaret A

    2012-08-01

    With over 100 different serotypes, the human rhinovirus (HRV) is the major aetiological agent for the common cold, for which only symptomatic treatment is available. HRV maturation and replication are entirely dependent on the activity of a virally encoded 3C protease, which represents an attractive target for the development of therapeutics to treat the common cold. Although a variety of small molecules and peptidomimetics have been found to inhibit HRV 3C protease, no universally compatible assay exists to reliably quantify the activity of the enzyme in vitro. Herein we report the development of a universal and robust solid phase peptide assay that utilizes the full HRV-14 3C protease recognition sequence and the release of 5(6)-carboxyfluorescein to sensitively quantify protease activity. This novel assay overcomes several limitations of existing assays, allowing for the simple and efficient analysis of HRV-14 3C protease activity and facilitating both high-throughput screening and the accurate kinetic study of HRV-14 3C protease inhibitors. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Southeast Offshore Storage Resource Assessment (SOSRA): Evaluation of CO2 Storage Potential on the Continental Shelf from North Carolina to Florida

    NASA Astrophysics Data System (ADS)

    Knapp, J. H.; Knapp, C. C.; Brantley, D.; Lakshmi, V.; Howard, S.

    2016-12-01

    The Southeast Offshore Storage Resource Assessment (SOSRA) project is part of a major new program, funded by the U.S. Department of Energy for the next two and a half years, to evaluate the Atlantic and Gulf of Mexico offshore margins of the United States for geologic storage capacity of CO2. Collaborating organizations include the Southern States Energy Board, Virginia Polytechnic Institute, University of South Carolina, Oklahoma State University, Virginia Department of Mines, Minerals, and Energy, South Carolina Geological Survey, and Geological Survey of Alabama. Team members from South Carolina are focused on the Atlantic offshore, from North Carolina to Florida. Geologic sequestration of CO2 is a major research focus globally, and requires robust knowledge of the porosity and permeability distribution in upper crustal sediments. Using legacy seismic reflection, refraction, and well data from a previous phase of offshore petroleum exploration on the Atlantic margin, we are analyzing the rock physics characteristics of the offshore Mesozoic and Cenozoic stratigraphy on a regional scale from North Carolina to Florida. Major features of the margin include the Carolina Trough, the Southeast Georgia Embayment, the Blake Plateau basin, and the Blake Outer Ridge. Previous studies indicate sediment accumulations on this margin may be as thick as 12-15 km. The study will apply a diverse suite of data analysis techniques designed to meet the goal of predicting storage capacity to within ±30%. Synthetic seismograms and checkshot surveys will be used to tie well and seismic data. Seismic interpretation and geophysical log analysis will employ leading-edge software technology and state-of-the-art techniques for stratigraphic and structural interpretation and the definition of storage units and their physical and chemical properties.
This approach will result in a robust characterization of offshore CO2 storage opportunities, as well as a volumetric analysis that is consistent with established procedures.

  10. FATSLiM: a fast and robust software to analyze MD simulations of membranes.

    PubMed

    Buchoux, Sébastien

    2017-01-01

    When studying biological membranes, Molecular Dynamics (MD) simulations prove to be quite complementary to experimental techniques. Because the simulated systems keep increasing both in size and complexity, the analysis of MD trajectories needs to be computationally efficient while being robust enough to handle membranes that may be curved or deformed due to their size and/or protein-lipid interactions. This work presents a new software named FATSLiM ('Fast Analysis Toolbox for Simulations of Lipid Membranes') that can extract physical properties from MD simulations of membranes (with or without interacting proteins). Because it relies on the calculation of local normals, FATSLiM does not depend on the bilayer morphology and can thus handle, for instance, vesicles with the same accuracy. Thanks to an efficiency-driven development, it is also fast and consumes a rather low amount of memory. FATSLiM (http://fatslim.github.io) is a stand-alone software written in Python. Source code is released under the GNU GPLv3 and is freely available at https://github.com/FATSLiM/fatslim. A complete online documentation including instructions for platform-independent installation is available at http://pythonhosted.org/fatslim. CONTACT: sebastien.buchoux@u-picardie.fr. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
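
    The abstract's key design point is that FATSLiM computes local normals rather than assuming a flat bilayer. As a rough illustration of that idea only (not FATSLiM's actual algorithm; all names are hypothetical), the sketch below estimates a local membrane normal by least-squares plane fitting over neighboring headgroup positions:

```python
import math

def local_normal(points):
    """Estimate a local normal by least-squares plane fitting
    z = a*x + b*y + c (assumes the local patch is not vertical)."""
    n = len(points)
    # Accumulate the normal equations A^T A u = A^T z for u = (a, b, c).
    sxx = sum(p[0] * p[0] for p in points); sxy = sum(p[0] * p[1] for p in points)
    syy = sum(p[1] * p[1] for p in points); sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points); sz = sum(p[2] for p in points)
    sxz = sum(p[0] * p[2] for p in points); syz = sum(p[1] * p[2] for p in points)
    m = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]

    def det3(mm):
        return (mm[0][0] * (mm[1][1] * mm[2][2] - mm[1][2] * mm[2][1])
                - mm[0][1] * (mm[1][0] * mm[2][2] - mm[1][2] * mm[2][0])
                + mm[0][2] * (mm[1][0] * mm[2][1] - mm[1][1] * mm[2][0]))

    d = det3(m)

    def repl(col):  # Cramer's rule: replace one column by the RHS
        mm = [row[:] for row in m]
        for i in range(3):
            mm[i][col] = rhs[i]
        return det3(mm)

    a, b = repl(0) / d, repl(1) / d
    norm = math.sqrt(a * a + b * b + 1.0)
    return (-a / norm, -b / norm, 1.0 / norm)
```

    For a perfectly flat horizontal patch this returns (0, 0, 1); on a curved surface it is evaluated patch by patch.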

  11. Robust Long-Range Coordination of Spontaneous Neural Activity in Waking, Sleep and Anesthesia.

    PubMed

    Liu, Xiao; Yanagawa, Toru; Leopold, David A; Fujii, Naotaka; Duyn, Jeff H

    2015-09-01

    Although the emerging field of functional connectomics relies increasingly on the analysis of spontaneous fMRI signal covariation to infer the spatial fingerprint of the brain's large-scale functional networks, the nature of the underlying neuro-electrical activity remains incompletely understood. In part, this lack of understanding owes to the invasiveness of electrophysiological acquisition, the difficulty of recording simultaneously over large cortical areas, and the absence of fully established methods for unbiased extraction of network information from these data. Here, we demonstrate a novel, data-driven approach to analyze spontaneous signal variations in electrocorticographic (ECoG) recordings from nearly entire hemispheres of macaque monkeys. Based on both broadband analysis and analysis of specific frequency bands, the ECoG signals were found to co-vary in patterns that resembled the fMRI networks reported in previous studies. The extracted patterns were robust against changes in consciousness associated with sleep and anesthesia, despite profound changes in intrinsic characteristics of the raw signals, including their spectral signatures. These results suggest that the spatial organization of large-scale brain networks results from neural activity with a broadband spectral feature and is a core aspect of the brain's physiology that does not depend on the state of consciousness. Published by Oxford University Press 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  12. Universal heat conduction in Ce1-xYbxCoIn5: Evidence for robust nodal d-wave superconducting gap

    DOE PAGES

    Xu, Y.; Petrovic, C.; Dong, J. K.; ...

    2016-02-01

    In the heavy-fermion superconductor Ce1-xYbxCoIn5, Yb doping was reported to cause a possible change from nodal d-wave superconductivity to a fully gapped d-wave molecular superfluid of composite pairs near x ≈ 0.07 (nominal value xnom = 0.2). Here we present systematic thermal conductivity measurements on Ce1-xYbxCoIn5 (x = 0.013, 0.084, and 0.163) single crystals. The observed finite residual linear term κ0/T is insensitive to Yb doping, verifying the universal heat conduction of the nodal d-wave superconducting gap in Ce1-xYbxCoIn5. Similar universal heat conduction is also observed in the CeCo(In1-yCdy)5 system. Furthermore, these results reveal a robust nodal d-wave gap in CeCoIn5 upon Yb or Cd doping.

  13. Investigation of air transportation technology at Princeton University, 1990-1991

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1991-01-01

    The Air Transportation Technology Program at Princeton University is a program that emphasizes graduate and undergraduate student research. The program proceeded along six avenues during the past year: microburst hazards to aircraft, intelligent failure tolerant control, computer-aided heuristics for piloted flight, stochastic robustness of flight control systems, neural networks for flight control, and computer-aided control system design.

  14. Closed-loop torque feedback for a universal field-oriented controller

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Doncker, R.W.A.A.; King, R.D.; Sanza, P.C.

    A torque feedback system is employed in a universal field-oriented (UFO) controller to tune a torque-producing current command and a slip frequency command in order to achieve robust torque control of an induction machine even in the event of current regulator errors and during transitions between pulse width modulated (PWM) and square wave modes of operation. 1 figure.

  15. Closed-loop torque feedback for a universal field-oriented controller

    DOEpatents

    De Doncker, R.W.A.A.; King, R.D.; Sanza, P.C.; Haefner, K.B.

    1992-11-24

    A torque feedback system is employed in a universal field-oriented (UFO) controller to tune a torque-producing current command and a slip frequency command in order to achieve robust torque control of an induction machine even in the event of current regulator errors and during transitions between pulse width modulated (PWM) and square wave modes of operation. 1 figure.

  16. Closed-loop torque feedback for a universal field-oriented controller

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Doncker, Rik W. A. A.; King, Robert D.; Sanza, Peter C.

    A torque feedback system is employed in a universal field-oriented (UFO) controller to tune a torque-producing current command and a slip frequency command in order to achieve robust torque control of an induction machine even in the event of current regulator errors and during transitions between pulse width modulated (PWM) and square wave modes of operation.
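
    The three patent records above describe closed-loop torque feedback that tunes a torque-producing current command so torque control stays robust despite current regulator errors. A minimal, hedged sketch of that general idea only (a plain discrete PI loop on a crude first-order torque model, not the patented universal field-oriented scheme; the slip-frequency command is omitted, and all parameters are invented):

```python
def run_torque_loop(torque_ref=10.0, gain_error=0.8, kp=0.5, ki=2.0,
                    dt=1e-3, steps=5000):
    """Discrete PI torque-feedback loop.

    The 'plant' is a deliberately crude stand-in: produced torque follows
    the current command through an erroneous static gain (gain_error,
    standing in for a current regulator error) and a first-order lag.
    The PI term trims the command so torque still converges to torque_ref.
    """
    tau = 0.05          # plant lag time constant [s]
    torque = 0.0        # produced (measured) torque
    integ = 0.0         # integrator state
    for _ in range(steps):
        err = torque_ref - torque
        integ += err * dt
        i_cmd = kp * err + ki * integ        # trimmed current command
        # first-order torque response through the erroneous gain
        torque += dt / tau * (gain_error * i_cmd - torque)
    return torque
```

    The integral action is what absorbs the gain error: after a few seconds of simulated time the produced torque settles at the reference even though the static gain is 20% off.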

  17. The Immigrant's University: A Study of Academic Performance and the Experiences of Recent Immigrant Groups at the University of California

    ERIC Educational Resources Information Center

    Douglass, John Aubrey; Thomson, Gregg

    2010-01-01

    One of the major characteristics of globalization is the large influx of immigrant groups moving largely from underdeveloped regions to developed economies. California offers one of the most robust examples of a large-scale, postmodern demographic transition that includes a great racial, ethnic, and cultural diversity of immigrant groups, many of…

  18. Reliability and validity of the Positive Mental Health Questionnaire in a sample of Spanish university students.

    PubMed

    Roldán-Merino, J; Lluch-Canut, M T; Casas, I; Sanromà-Ortíz, M; Ferré-Grau, C; Sequeira, C; Falcó-Pegueroles, A; Soares, D; Puig-Llobet, M

    2017-03-01

    WHAT IS KNOWN ON THE SUBJECT?: In general, the current studies of positive mental health use questionnaires or parts thereof. However, while these questionnaires evaluate aspects of positive mental health, they fail to measure the construct itself. WHAT DOES THIS PAPER ADD TO EXISTING KNOWLEDGE?: The widespread use and the lack of specific questionnaires for evaluating the positive mental health construct justify the need to measure the robustness of the Positive Mental Health Questionnaire. Also six factors are proposed to measure positive mental health. WHAT ARE THE IMPLICATIONS FOR PRACTICE?: The availability of a good questionnaire to measure positive mental health in university students is useful not only to promote mental health but also to strengthen the curricula of future professionals. Introduction Nursing has a relevant role in managing mental health. It is important to identify and thereafter to enhance positive aspects of mental health among university nursing students. Aim The aim of the present study was to analyse the psychometric properties of the Positive Mental Health Questionnaire (PMHQ) in terms of reliability and validity using confirmatory factor analysis in a sample of university students. Method A cross-sectional study was carried out in a sample of 1091 students at 4 nursing schools in Catalonia, Spain. The reliability of the PMHQ was measured by means of Cronbach's alpha coefficient, and the test-retest stability was measured with the intraclass correlation coefficient (ICC). Confirmatory factor analysis was used to determine the validity of the factorial structure. Results Cronbach's alpha coefficient was satisfactory (>0.70) for four of the six subscales or dimensions and ranged from 0.54 to 0.79. ICC analysis was satisfactory for the six subscales or dimensions. 
The hypothesis was confirmed in the analysis of the correlations between the subscales and the overall scale, with the strongest correlations being found between the majority of the subscales and the overall scale. Confirmatory factor analysis showed that the model proposed for the factors fit the data satisfactorily. Discussion This scale is a valid and reliable instrument for evaluating positive mental health in university students. Implications for Practice A good questionnaire to measure positive mental health in university students is useful not only to promote mental health but also to strengthen the curricula of future professionals. © 2017 John Wiley & Sons Ltd.
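
    For readers unfamiliar with the reliability statistic used above, Cronbach's alpha follows directly from the item-score variances: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch (the data in the test are made up, not the study's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k items scored by the same n respondents.

    items: list of k lists, each one item's scores across respondents.
    Uses population variances throughout.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Total score of each respondent across all items.
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(col) for col in items) / var(totals))
```
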

  19. scEpath: Energy landscape-based inference of transition probabilities and cellular trajectories from single-cell transcriptomic data.

    PubMed

    Jin, Suoqin; MacLean, Adam L; Peng, Tao; Nie, Qing

    2018-02-05

    Single-cell RNA-sequencing (scRNA-seq) offers unprecedented resolution for studying cellular decision-making processes. Robust inference of cell state transition paths and probabilities is an important yet challenging step in the analysis of these data. Here we present scEpath, an algorithm that calculates energy landscapes and probabilistic directed graphs in order to reconstruct developmental trajectories. We quantify the energy landscape using "single-cell energy" and distance-based measures, and find that the combination of these enables robust inference of the transition probabilities and lineage relationships between cell states. We also identify marker genes and gene expression patterns associated with cell state transitions. Our approach produces pseudotemporal orderings that are, in combination, more robust and accurate than current methods, and offers higher resolution dynamics of the cell state transitions, leading to new insight into key transition events during differentiation and development. Moreover, scEpath is robust to variation in the size of the input gene set, and is broadly unsupervised, requiring few parameters to be set by the user. Applications of scEpath led to the identification of a cell-cell communication network implicated in early human embryo development, and novel transcription factors important for myoblast differentiation. scEpath allows us to identify common and specific temporal dynamics and transcription factor programs along branched lineages, as well as the transition probabilities that control cell fates. A MATLAB package of scEpath is available at https://github.com/sqjin/scEpath. qnie@uci.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.

  20. Universal non-adiabatic holonomic quantum computation in decoherence-free subspaces with quantum dots inside a cavity

    NASA Astrophysics Data System (ADS)

    Liu, Jun; Dong, Ping; Zhou, Jian; Cao, Zhuo-Liang

    2017-05-01

    A scheme for implementing the non-adiabatic holonomic quantum computation in decoherence-free subspaces is proposed with the interactions between a microcavity and quantum dots. A universal set of quantum gates can be constructed on the encoded logical qubits with high fidelities. The current scheme can suppress both local and collective noises, which is very important for achieving universal quantum computation. Discussions about the gate fidelities with the experimental parameters show that our schemes can be implemented in current experimental technology. Therefore, our scenario offers a method for universal and robust solid-state quantum computation.

  1. Is the Universe transparent?

    NASA Astrophysics Data System (ADS)

    Liao, Kai; Avgoustidis, A.; Li, Zhengxiang

    2015-12-01

    We present our study on cosmic opacity, which relates to changes in photon number as photons travel from the source to the observer. Cosmic opacity may be caused by absorption or scattering due to matter in the Universe, or by extragalactic magnetic fields that can turn photons into unobserved particles (e.g., light axions, chameleons, gravitons, Kaluza-Klein modes), and it is crucial to account for it when interpreting astronomical photometric measurements such as type Ia supernovae observations. On the other hand, the expansion rate at different epochs, i.e., the observational Hubble parameter data H(z), are obtained from differential ageing of passively evolving galaxies or from baryon acoustic oscillations and thus are not affected by cosmic opacity. In this work, we first construct opacity-free luminosity distances from H(z) determinations, taking into consideration correlations between different redshifts in our error analysis. Moreover, we leave the light-curve fitting parameters, which account for distance estimation in type Ia supernovae observations, free to ensure that our analysis is genuinely cosmological-model independent and gives a robust result. Any nonzero residual between these two kinds of luminosity distances can be taken as an indication of the existence of cosmic opacity. While a transparent Universe is currently consistent with the data, our results show that strong constraints on opacity (and consequently on physical mechanisms that could cause it) can be obtained in a cosmological-model-independent fashion.
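
    In a spatially flat universe, the opacity-free luminosity distance described above follows from D_L(z) = (1 + z) c ∫₀^z dz'/H(z'). A hedged numerical sketch of that construction (simple trapezoidal integration of tabulated H(z); not the authors' pipeline, which also propagates correlated errors):

```python
C_KM_S = 299792.458  # speed of light [km/s]

def luminosity_distance(zs, hs):
    """Opacity-free luminosity distance at the last tabulated redshift.

    zs: redshift grid starting at 0; hs: H(z) samples [km/s/Mpc].
    Assumes spatial flatness; integrates 1/H(z) with the trapezoidal
    rule and returns D_L in Mpc.
    """
    integral = 0.0
    for i in range(1, len(zs)):
        integral += 0.5 * (1.0 / hs[i] + 1.0 / hs[i - 1]) * (zs[i] - zs[i - 1])
    return (1.0 + zs[-1]) * C_KM_S * integral
```

    As a sanity check, for a constant H(z) = H0 this reduces to D_L = (1 + z) c z / H0.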

  2. Emergence of robustness in networks of networks

    NASA Astrophysics Data System (ADS)

    Roth, Kevin; Morone, Flaviano; Min, Byungjoon; Makse, Hernán A.

    2017-06-01

    A model of interdependent networks of networks (NONs) was introduced recently [Proc. Natl. Acad. Sci. (USA) 114, 3849 (2017), 10.1073/pnas.1620808114] in the context of brain activation to identify the neural collective influencers in the brain NON. Here we investigate the emergence of robustness in such a model, and we develop an approach to derive an exact expression for the random percolation transition in Erdös-Rényi NONs of this kind. Analytical calculations are in agreement with numerical simulations, and highlight the robustness of the NON against random node failures, which thus presents a new robust universality class of NONs. The key aspect of this robust NON model is that a node can be activated even if it does not belong to the giant mutually connected component, thus allowing the NON to be built from below the percolation threshold, which is not possible in previous models of interdependent networks. Interestingly, the phase diagram of the model unveils particular patterns of interconnectivity for which the NON is most vulnerable, thereby marking the boundary above which the robustness of the system improves with increasing dependency connections.
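
    The exact NON percolation result discussed above generalizes ordinary random percolation in a single Erdős-Rényi graph, where a giant component emerges once the mean degree exceeds 1. A hedged single-network baseline sketch (not the interdependent NON model itself; edge count fixed at its mean for simplicity):

```python
import random

def giant_fraction(n, c, seed=0):
    """Fraction of nodes in the largest component of an Erdos-Renyi
    graph with n nodes and mean degree c, using union-find."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(x):  # path-halving find
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    # G(n, p) with p = c/n has on average c*(n-1)/2 edges; as a sketch,
    # place exactly that many uniformly random edges.
    for _ in range(int(c * (n - 1) / 2)):
        union(rng.randrange(n), rng.randrange(n))

    sizes = {}
    for v in range(n):
        r = find(v)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / n
```

    Above the threshold (c = 2) roughly 80% of nodes join the giant component; below it (c = 0.5) the largest component is vanishingly small.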

  3. Robustness Analysis of Integrated LPV-FDI Filters and LTI-FTC System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Khong, Thuan H.; Shin, Jong-Yeob

    2007-01-01

    This paper proposes an analysis framework for robustness analysis of a nonlinear dynamic system that can be represented by a polynomial linear parameter varying (PLPV) system with constant bounded uncertainty. The proposed analysis framework contains three key tools: 1) a function substitution method, which can convert a nonlinear system in polynomial form into a PLPV system; 2) a matrix-based linear fractional transformation (LFT) modeling approach, which can convert a PLPV system into an LFT system whose delta block includes the key uncertainty and scheduling parameters; and 3) μ-analysis, a well-known robustness analysis tool for linear systems. The proposed analysis framework is applied to evaluating the performance of the LPV fault detection and isolation (FDI) filters in the closed-loop system of a transport aircraft in the presence of unmodeled actuator dynamics and sensor gain uncertainty. The robustness analysis results are compared with nonlinear time simulations.
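
    For a single complex uncertainty block, μ-analysis reduces to a small-gain test: robust stability holds iff the weighted complementary sensitivity stays below 1 at every frequency. A toy sketch of that special case only (a first-order plant under proportional control with a constant multiplicative uncertainty weight; all parameters are invented, and this is far simpler than the paper's PLPV/LFT machinery):

```python
def robustly_stable(kp, tau=1.0, wu=1.5, n=2000):
    """Small-gain robust-stability check for P(s) = 1/(tau*s + 1) with
    proportional gain kp and multiplicative uncertainty weight wu.

    For one complex Delta block, robust stability holds iff
    sup_w |wu * T(jw)| < 1, where T = kp*P / (1 + kp*P)."""
    peak = 0.0
    for i in range(n):
        w = 10 ** (-3 + 6 * i / (n - 1))     # log grid, 1e-3..1e3 rad/s
        p = 1.0 / (1j * tau * w + 1.0)       # plant frequency response
        t = kp * p / (1.0 + kp * p)          # complementary sensitivity
        peak = max(peak, abs(wu * t))
    return peak < 1.0
```

    Here the peak sits at low frequency (|T| ≈ kp/(1+kp)), so with wu = 1.5 the loop is robustly stable only for kp below 2.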

  4. Acquire: an open-source comprehensive cancer biobanking system.

    PubMed

    Dowst, Heidi; Pew, Benjamin; Watkins, Chris; McOwiti, Apollo; Barney, Jonathan; Qu, Shijing; Becnel, Lauren B

    2015-05-15

    The probability of effective treatment of cancer with a targeted therapeutic can be improved for patients with defined genotypes containing actionable mutations. To this end, many human cancer biobanks are integrating more tightly with genomic sequencing facilities and with those creating and maintaining patient-derived xenografts (PDX) and cell lines to provide renewable resources for translational research. To support the complex data management needs and workflows of several such biobanks, we developed Acquire. It is a robust, secure, web-based, database-backed open-source system that supports all major needs of a modern cancer biobank. Its modules allow for i) up-to-the-minute 'scoreboard' and graphical reporting of collections; ii) end user roles and permissions; iii) specimen inventory through caTissue Suite; iv) shipping forms for distribution of specimens to pathology, genomic analysis and PDX/cell line creation facilities; v) robust ad hoc querying; vi) molecular and cellular quality control metrics to track specimens' progress and quality; vii) public researcher requests; viii) resource allocation committee distribution request review and oversight; and ix) linkage to available derivatives of specimens. © The Author 2015. Published by Oxford University Press.

  5. Identification of novel and robust internal control genes from Volvariella volvacea that are suitable for RT-qPCR in filamentous fungi.

    PubMed

    Tao, Yongxin; van Peer, Arend Frans; Huang, Qianhui; Shao, Yanping; Zhang, Lei; Xie, Bin; Jiang, Yuji; Zhu, Jian; Xie, Baogui

    2016-07-12

    The selection of appropriate internal control genes (ICGs) is a crucial step in the normalization of real-time quantitative PCR (RT-qPCR) data. Housekeeping genes are habitually selected for this purpose, despite accumulating evidence on their instability. We screened for novel, robust ICGs in the mushroom forming fungus Volvariella volvacea. Nine commonly used and five newly selected ICGs were evaluated for expression stability using RT-qPCR data in eight different stages of the life cycle of V. volvacea. Three different algorithms consistently determined that three novel ICGs (SPRYp, Ras and Vps26) exhibited the highest expression stability in V. volvacea. Subsequent analysis of ICGs in twenty-four expression profiles from nine filamentous fungi revealed that Ras was the most stable ICG amongst the Basidiomycetous samples, followed by SPRYp, Vps26 and ACTB. Vps26 was expressed most stably within the analyzed data of Ascomycetes, followed by HH3 and β-TUB. No ICG was universally stable for all fungal species, or for all experimental conditions within a species. Ultimately, the choice of an ICG will depend on a specific set of experiments. This study provides novel, robust ICGs for Basidiomycetes and Ascomycetes. Together with the presented guiding principles, this enables the efficient selection of suitable ICGs for RT-qPCR.
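
    Stability rankings like the one above are commonly produced by algorithms such as geNorm, which scores each candidate reference gene by its average pairwise variation: the standard deviation of log2 expression ratios against every other candidate. A hedged geNorm-style sketch (not necessarily one of the three algorithms this study applied; the data in the test are made up):

```python
import math

def genorm_stability(expr):
    """geNorm-style stability measure M for candidate reference genes.

    expr: dict gene -> list of linear-scale expression values across the
    same samples. For each pair of genes, take the sample-wise standard
    deviation of their log2 expression ratios; M(gene) is the average of
    these pairwise variations. Lower M means more stable expression.
    """
    def sd(xs):  # sample standard deviation
        m = sum(xs) / len(xs)
        return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

    genes = list(expr)
    out = {}
    for g in genes:
        variations = []
        for h in genes:
            if h == g:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expr[g], expr[h])]
            variations.append(sd(ratios))
        out[g] = sum(variations) / len(variations)
    return out
```

    Two genes whose expression rises and falls in proportion get a pairwise variation of zero, so a gene that tracks the others closely ends up with a low M.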

  6. PDBStat: a universal restraint converter and restraint analysis software package for protein NMR.

    PubMed

    Tejero, Roberto; Snyder, David; Mao, Binchen; Aramini, James M; Montelione, Gaetano T

    2013-08-01

    The heterogeneous array of software tools used in the process of protein NMR structure determination presents organizational challenges in the structure determination and validation processes, and creates a learning curve that limits the broader use of protein NMR in biology. These challenges, including accurate use of data in different data formats required by software carrying out similar tasks, continue to confound the efforts of novices and experts alike. These important issues need to be addressed robustly in order to standardize protein NMR structure determination and validation. PDBStat is a C/C++ computer program originally developed as a universal coordinate and protein NMR restraint converter. Its primary function is to provide a user-friendly tool for interconverting between protein coordinate and protein NMR restraint data formats. It also provides an integrated set of computational methods for protein NMR restraint analysis and structure quality assessment, relabeling of prochiral atoms with correct IUPAC names, as well as multiple methods for analysis of the consistency of atomic positions indicated by their convergence across a protein NMR ensemble. In this paper we provide a detailed description of the PDBStat software, and highlight some of its valuable computational capabilities. As an example, we demonstrate the use of the PDBStat restraint converter for restrained CS-Rosetta structure generation calculations, and compare the resulting protein NMR structure models with those generated from the same NMR restraint data using more traditional structure determination methods. These results demonstrate the value of a universal restraint converter in allowing the use of multiple structure generation methods with the same restraint data for consensus analysis of protein NMR structures and the underlying restraint data.
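
    The core of a restraint converter is mechanical format translation between equivalent representations. A hedged miniature of that task (not PDBStat's code): converting a CYANA-style upper-limit distance line into an XPLOR/CNS-style assign statement, using the common convention that the upper limit becomes the target distance with a fixed 1.8 Å lower bound.

```python
def upl_to_xplor(line, lower=1.8):
    """Convert one CYANA-style upper-limit restraint line
    ('res# resname atom  res# resname atom  dist') into an XPLOR/CNS
    assign statement: target d, dminus = d - lower, dplus = 0.0."""
    f = line.split()
    r1, a1, r2, a2, d = int(f[0]), f[2], int(f[3]), f[5], float(f[6])
    return ("assign (resid %d and name %s) (resid %d and name %s) "
            "%.2f %.2f %.2f" % (r1, a1, r2, a2, d, d - lower, 0.0))
```

    A real converter must additionally handle pseudoatoms, prochiral naming, and ambiguous restraints, which is exactly the bookkeeping the abstract says PDBStat centralizes.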

  7. PDBStat: A Universal Restraint Converter and Restraint Analysis Software Package for Protein NMR

    PubMed Central

    Tejero, Roberto; Snyder, David; Mao, Binchen; Aramini, James M.; Montelione, Gaetano T

    2013-01-01

    The heterogeneous array of software tools used in the process of protein NMR structure determination presents organizational challenges in the structure determination and validation processes, and creates a learning curve that limits the broader use of protein NMR in biology. These challenges, including accurate use of data in different data formats required by software carrying out similar tasks, continue to confound the efforts of novices and experts alike. These important issues need to be addressed robustly in order to standardize protein NMR structure determination and validation. PDBStat is a C/C++ computer program originally developed as a universal coordinate and protein NMR restraint converter. Its primary function is to provide a user-friendly tool for interconverting between protein coordinate and protein NMR restraint data formats. It also provides an integrated set of computational methods for protein NMR restraint analysis and structure quality assessment, relabeling of prochiral atoms with correct IUPAC names, as well as multiple methods for analysis of the consistency of atomic positions indicated by their convergence across a protein NMR ensemble. In this paper we provide a detailed description of the PDBStat software, and highlight some of its valuable computational capabilities. As an example, we demonstrate the use of the PDBStat restraint converter for restrained CS-Rosetta structure generation calculations, and compare the resulting protein NMR structure models with those generated from the same NMR restraint data using more traditional structure determination methods. These results demonstrate the value of a universal restraint converter in allowing the use of multiple structure generation methods with the same restraint data for consensus analysis of protein NMR structures and the underlying restraint data. PMID:23897031

  8. A Universal and Robust Integrated Platform for the Scalable Production of Human Cardiomyocytes From Pluripotent Stem Cells.

    PubMed

  9. A Universal and Robust Integrated Platform for the Scalable Production of Human Cardiomyocytes From Pluripotent Stem Cells

    PubMed Central

    Fonoudi, Hananeh; Ansari, Hassan; Abbasalizadeh, Saeed; Larijani, Mehran Rezaei; Kiani, Sahar; Hashemizadeh, Shiva; Zarchi, Ali Sharifi; Bosman, Alexis; Blue, Gillian M.; Pahlavan, Sara; Perry, Matthew; Orr, Yishay; Mayorchak, Yaroslav; Vandenberg, Jamie; Talkhabi, Mahmood; Winlaw, David S.; Harvey, Richard P.; Aghdami, Nasser

    2015-01-01

    Recent advances in the generation of cardiomyocytes (CMs) from human pluripotent stem cells (hPSCs), in conjunction with the promising outcomes from preclinical and clinical studies, have raised new hopes for cardiac cell therapy. We report the development of a scalable, robust, and integrated differentiation platform for large-scale production of hPSC-CM aggregates in a stirred suspension bioreactor as a single-unit operation. Precise modulation of the differentiation process by small molecule activation of WNT signaling, followed by inactivation of transforming growth factor-β and WNT signaling and activation of sonic hedgehog signaling in hPSCs as size-controlled aggregates led to the generation of approximately 100% beating CM spheroids containing virtually pure (∼90%) CMs in 10 days. Moreover, the developed differentiation strategy was universal, as demonstrated by testing multiple hPSC lines (5 human embryonic stem cell and 4 human induced PSC lines) without cell sorting or selection. The produced hPSC-CMs successfully expressed canonical lineage-specific markers and showed high functionality, as demonstrated by microelectrode array and electrophysiology tests. This robust and universal platform could become a valuable tool for the mass production of functional hPSC-CMs as a prerequisite for realizing their promising potential for therapeutic and industrial applications, including drug discovery and toxicity assays. Significance: Recent advances in the generation of cardiomyocytes (CMs) from human pluripotent stem cells (hPSCs) and the development of novel cell therapy strategies using hPSC-CMs (e.g., cardiac patches), in conjunction with promising preclinical and clinical studies, have raised new hopes for patients with end-stage cardiovascular disease, which remains the leading cause of morbidity and mortality globally.
In this study, a simplified, scalable, robust, and integrated differentiation platform was developed to generate clinical grade hPSC-CMs as cell aggregates under chemically defined culture conditions. This approach resulted in approximately 100% beating CM spheroids with virtually pure (∼90%) functional cardiomyocytes in 10 days from multiple hPSC lines. This universal and robust bioprocessing platform can provide sufficient numbers of hPSC-CMs for companies developing regenerative medicine technologies to rescue, replace, and help repair damaged heart tissues and for pharmaceutical companies developing advanced biologics and drugs for regeneration of lost heart tissue using high-throughput technologies. It is believed that this technology can expedite clinical progress in these areas to achieve a meaningful impact on improving clinical outcomes, cost of care, and quality of life for those patients disabled and experiencing heart disease. PMID:26511653

  10. The means/side-effect distinction in moral cognition: A meta-analysis.

    PubMed

    Feltz, Adam; May, Joshua

    2017-09-01

    Experimental research suggests that people draw a moral distinction between bad outcomes brought about as a means versus a side effect (or byproduct). Such findings have informed multiple psychological and philosophical debates about moral cognition, including its computational structure, its sensitivity to the famous Doctrine of Double Effect, its reliability, and its status as a universal and innate mental module akin to universal grammar. But some studies have failed to replicate the means/byproduct effect, especially in the absence of other factors, such as personal contact. So we aimed to determine how robust the means/byproduct effect is by conducting a meta-analysis of both published and unpublished studies (k=101; 24,058 participants). We found that while there is an overall small difference between moral judgments of means and byproducts (standardized mean difference=0.87, 95% CI 0.67-1.06; standardized mean change=0.57, 95% CI 0.44-0.69; log odds ratio=1.59, 95% CI 1.15-2.02), the mean effect size is primarily moderated by whether the outcome is brought about by personal contact, which typically involves the use of personal force. Copyright © 2017 Elsevier B.V. All rights reserved.
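
    Pooled estimates of the kind quoted above are typically obtained with a random-effects model. The sketch below (Python, DerSimonian-Laird estimator, with made-up effect sizes and variances rather than the paper's data) shows the computation:

```python
import numpy as np

def random_effects_meta(y, v):
    """Pool effect sizes y with within-study variances v using the
    DerSimonian-Laird random-effects estimator."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                           # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - mu_fe) ** 2)      # Cochran's Q heterogeneity statistic
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)         # between-study variance estimate
    w_re = 1.0 / (v + tau2)               # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, se, tau2

# Hypothetical standardized mean differences from a handful of studies
y = [0.9, 0.7, 1.1, 0.5, 0.8]
v = [0.04, 0.06, 0.05, 0.08, 0.03]
mu, se, tau2 = random_effects_meta(y, v)
print(f"pooled SMD = {mu:.2f}, 95% CI = [{mu-1.96*se:.2f}, {mu+1.96*se:.2f}]")
```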

  11. Modern CACSD using the Robust-Control Toolbox

    NASA Technical Reports Server (NTRS)

    Chiang, Richard Y.; Safonov, Michael G.

    1989-01-01

    The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools like singular values and structured singular values, robust synthesis tools like continuous/discrete H(exp 2)/H infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H infinity loop-shaping and large space structure model reduction.
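
    The singular-value (sigma) plot mentioned above evaluates the singular values of the frequency response G(jw) = C(jwI - A)^{-1}B + D over a frequency grid. A plain-NumPy sketch of the same computation (the 2-input/2-output state-space model here is illustrative, not one of the toolbox's examples):

```python
import numpy as np

# Hypothetical 2x2 state-space model (A, B, C, D) of a lightly damped mode
A = np.array([[0.0, 1.0], [-4.0, -0.8]])
B = np.eye(2)
C = np.eye(2)
D = np.zeros((2, 2))

freqs = np.logspace(-1, 2, 200)          # frequency grid, rad/s
sv = np.empty((len(freqs), 2))
for k, w in enumerate(freqs):
    # Frequency response G(jw) = C (jwI - A)^{-1} B + D
    G = C @ np.linalg.inv(1j * w * np.eye(2) - A) @ B + D
    sv[k] = np.linalg.svd(G, compute_uv=False)   # descending singular values

print("peak maximum singular value:", sv[:, 0].max())
```

The largest singular value at each frequency bounds the worst-case gain over all input directions, which is what makes the sigma plot useful for robustness analysis.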

  12. Empathy from the client's perspective: A grounded theory analysis.

    PubMed

    MacFarlane, Peter; Anderson, Timothy; McClintock, Andrew S

    2017-03-01

    Although empathy is one of the most robust predictors of client outcome, there is little consensus about how best to conceptualize this construct. The aim of the present research was to investigate clients' perceptions and in-session experiences of empathy. Semi-structured, video-assisted interpersonal process recall interviews were used to collect data from nine clients receiving individual psychotherapy at a university psychology clinic. Grounded theory analysis yielded a model consisting of three clusters: (1) relational context of empathy (i.e., personal relationship and professional relationship), (2) types of empathy (i.e., psychotherapists' cognitive empathy, psychotherapists' emotional empathy, and client attunement to psychotherapist), and (3) utility of empathy (i.e., process-related benefits and client-related benefits). These results suggest that empathy is a multi-dimensional, interactional process that affects, and is affected by, the broader relationship between client and psychotherapist.

  13. Microbial Communities as Experimental Units

    PubMed Central

    DAY, MITCH D.; BECK, DANIEL; FOSTER, JAMES A.

    2011-01-01

    Artificial ecosystem selection is an experimental technique that treats microbial communities as though they were discrete units by applying selection on community-level properties. Highly diverse microbial communities associated with humans and other organisms can have significant impacts on the health of the host. It is difficult to find correlations between microbial community composition and community-associated diseases, in part because it may be impossible to define a universal and robust species concept for microbes. Microbial communities are composed of potentially thousands of unique populations that evolved in intimate contact, so it is appropriate in many situations to view the community as the unit of analysis. This perspective is supported by recent discoveries using metagenomics and pangenomics. Artificial ecosystem selection experiments can be costly, but they bring the logical rigor of biological model systems to the emerging field of microbial community analysis. PMID:21731083

  14. DNATCO: assignment of DNA conformers at dnatco.org.

    PubMed

    Černý, Jiří; Božíková, Paulína; Schneider, Bohdan

    2016-07-08

    The web service DNATCO (dnatco.org) classifies local conformations of DNA molecules beyond their traditional sorting to A, B and Z DNA forms. DNATCO provides an interface to robust algorithms that assign conformation classes, called NtC, to dinucleotides extracted from DNA-containing structures uploaded in PDB format version 3.1 or above. The assigned dinucleotide NtC classes are further grouped into the DNA structural alphabet NtA, to the best of our knowledge the first DNA structural alphabet. The results are presented at two levels: in the form of user-friendly visualization and analysis of the assignment, and in the form of a downloadable, more detailed table for further analysis offline. The website is free and open to all users and there is no login requirement. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. Computer models and the evidence of anthropogenic climate change: An epistemology of variety-of-evidence inferences and robustness analysis.

    PubMed

    Vezér, Martin A

    2016-04-01

    To study climate change, scientists employ computer models, which approximate target systems with various levels of skill. Given the imperfection of climate models, how do scientists use simulations to generate knowledge about the causes of observed climate change? Addressing a similar question in the context of biological modelling, Levins (1966) proposed an account grounded in robustness analysis. Recent philosophical discussions dispute the confirmatory power of robustness, raising the question of how the results of computer modelling studies contribute to the body of evidence supporting hypotheses about climate change. Expanding on Staley's (2004) distinction between evidential strength and security, and Lloyd's (2015) argument connecting variety-of-evidence inferences and robustness analysis, I address this question with respect to recent challenges to the epistemology of robustness analysis. Applying this epistemology to case studies of climate change, I argue that, despite imperfections in climate models, and epistemic constraints on variety-of-evidence reasoning and robustness analysis, this framework accounts for the strength and security of evidence supporting climatological inferences, including the finding that global warming is occurring and its primary causes are anthropogenic. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Robust Mediation Analysis Based on Median Regression

    PubMed Central

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
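
    The substitution of median regression for mean regression in the mediation paths can be sketched as follows. This is an illustrative Python implementation, not the authors' code: it approximates L1 (median) regression with iteratively reweighted least squares, and the data-generating parameters (a = 0.5, b = 0.7, heavy-tailed t(2) errors) are hypothetical:

```python
import numpy as np

def lad_fit(X, y, iters=200, eps=1e-6):
    """Approximate median (L1) regression coefficients via iteratively
    reweighted least squares with weights 1/|residual|."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = np.maximum(np.abs(y - X @ beta), eps)   # floor avoids division by zero
        W = 1.0 / r
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * y))
    return beta

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                       # predictor
m = 0.5 * x + rng.standard_t(df=2, size=n)   # mediator, heavy-tailed errors
y = 0.7 * m + 0.2 * x + rng.standard_t(df=2, size=n)

ones = np.ones(n)
a = lad_fit(np.column_stack([ones, x]), m)[1]        # path x -> m
b = lad_fit(np.column_stack([ones, x, m]), y)[2]     # path m -> y given x
print("estimated indirect (mediated) effect a*b:", a * b)   # truth: 0.35
```

Under heavy-tailed errors like these, ordinary least squares can be badly distorted by outliers, while the median-regression paths remain stable, which is the motivation the abstract describes.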

  17. Bayesian Inference and Application of Robust Growth Curve Models Using Student's "t" Distribution

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Lai, Keke; Lu, Zhenqiu; Tong, Xin

    2013-01-01

    Despite the widespread popularity of growth curve analysis, few studies have investigated robust growth curve models. In this article, the "t" distribution is applied to model heavy-tailed data and contaminated normal data with outliers for growth curve analysis. The derived robust growth curve models are estimated through Bayesian…

  18. Gauging the cosmic acceleration with recent type Ia supernovae data sets

    NASA Astrophysics Data System (ADS)

    Velten, Hermano; Gomes, Syrios; Busti, Vinicius C.

    2018-04-01

    We revisit a model-independent estimator for cosmic acceleration based on type Ia supernovae distance measurements. This approach does not rely on any specific theory of gravity, energy content, or parametrization for the scale factor or deceleration parameter, and is based on falsifying the null hypothesis that the Universe never expanded in an accelerated way. By generating mock catalogs of known cosmologies, we test the robustness of this estimator, establishing its limits of applicability. We detail the pros and cons of such an approach. For example, we find that there are specific counterexamples in which the estimator wrongly provides evidence against acceleration in accelerating cosmologies. The dependence of the estimator on the H0 value is also discussed. Finally, we update the evidence for acceleration using the recent UNION2.1 and Joint Light-Curve Analysis samples. Contrary to recent claims, available data strongly favor an accelerated expansion of the Universe in complete agreement with the standard ΛCDM model.

  19. Transit Photometry of Recently Discovered Hot Jupiters

    NASA Astrophysics Data System (ADS)

    McCloat, Sean Peter

    The University of North Dakota Space Studies Internet Observatory was used to observe the transits of hot Jupiter exoplanets. Targets for this research were selected from the list of currently confirmed exoplanets using the following criteria: radius > 0.5 Rjup, discovered since 2011, orbiting stars with apparent magnitude > 13. Eleven transits were observed, distributed across nine targets, with the goal of performing differential photometry for parameter refinement and transit timing variation analysis if data quality allowed. Data quality was ultimately insufficient for robust parameter refinement, but tentative calculations of mid-transit times were made for three of the observed transits. Mid-transit times for WASP-103b and WASP-48b were consistent with predictions and the existing database.

  20. Community Perspectives on Drug/Alcohol Use, Concerns, Needs and Resources In Four Washington State Tribal Communities

    PubMed Central

    Radin, Sandra M.; Kutz, Stephen H.; LaMarr, June; Vendiola, Diane; Vendiola, Michael; Wilbur, Brian; Thomas, Lisa Rey; Donovan, Dennis M.

    2016-01-01

    Community-university teams investigated substance use, abuse, and dependence (SUAD) and related concerns, needs, strengths, and resources in four Washington State Tribal communities. 153 key community members shared their perspectives through 43 semi-structured interviews and 19 semi-structured focus groups. Qualitative data analysis revealed robust themes: prescription medications and alcohol were perceived as most prevalent and concerning; family and peer influences and emotional distress were prominent perceived risk factors; and SUAD intervention resources varied across communities. Findings may guide future research and the development of much needed strength-based, culturally appropriate, and effective SUAD interventions for American Indians, Alaska Natives, and their communities. PMID:25560464

  1. Robust variance estimation with dependent effect sizes: practical considerations including a software tutorial in Stata and spss.

    PubMed

    Tanner-Smith, Emily E; Tipton, Elizabeth

    2014-03-01

    Methodologists have recently proposed robust variance estimation as one way to handle dependent effect sizes in meta-analysis. Software macros for robust variance estimation in meta-analysis are currently available for Stata (StataCorp LP, College Station, TX, USA) and SPSS (IBM, Armonk, NY, USA), yet there is little guidance for authors regarding the practical application and implementation of those macros. This paper provides a brief tutorial on the implementation of the Stata and SPSS macros and discusses practical issues meta-analysts should consider when estimating meta-regression models with robust variance estimates. Two example databases are used in the tutorial to illustrate the use of meta-analysis with robust variance estimates. Copyright © 2013 John Wiley & Sons, Ltd.
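
    The core of robust variance estimation is a cluster-robust (sandwich) standard error computed over studies, so that dependent effect sizes within a study do not inflate precision. The Python sketch below is not the Stata/SPSS macro: it uses simplified study-level weights w_j = 1/(k_j · v̄_j), an intercept-only model, and an entirely hypothetical database of seven effect sizes nested in four studies:

```python
import numpy as np

def rve_mean(es, v, study):
    """Weighted mean effect size with a cluster-robust (sandwich)
    standard error; clusters are defined by the study labels."""
    es, v, study = np.asarray(es, float), np.asarray(v, float), np.asarray(study)
    labels = np.unique(study)
    w = np.empty_like(es)
    for s in labels:
        idx = study == s
        # simplified RVE weight: 1 / (effects-per-study * mean within-study variance)
        w[idx] = 1.0 / (idx.sum() * v[idx].mean())
    mu = np.sum(w * es) / np.sum(w)
    resid = es - mu
    # sandwich "meat": squared weighted residual sums, one term per cluster
    meat = sum((w[study == s] @ resid[study == s]) ** 2 for s in labels)
    se = np.sqrt(meat) / np.sum(w)
    return mu, se

# Hypothetical database: 4 studies contributing 1-2 dependent effect sizes each
es    = [0.30, 0.35, 0.10, 0.20, 0.25, 0.50, 0.40]
v     = [0.02, 0.02, 0.03, 0.04, 0.04, 0.05, 0.03]
study = [1, 1, 2, 3, 3, 4, 4]
mu, se = rve_mean(es, v, study)
print(f"mean effect = {mu:.3f}, robust SE = {se:.3f}")
```

Because the "meat" term sums over whole studies rather than individual effect sizes, correlated effects from the same study cannot spuriously shrink the standard error.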

  2. Health Technology Assessment: Global Advocacy and Local Realities Comment on "Priority Setting for Universal Health Coverage: We Need Evidence-Informed Deliberative Processes, Not Just More Evidence on Cost-Effectiveness".

    PubMed

    Chalkidou, Kalipso; Li, Ryan; Culyer, Anthony J; Glassman, Amanda; Hofman, Karen J; Teerawattananon, Yot

    2016-08-29

    Cost-effectiveness analysis (CEA) can help countries attain and sustain universal health coverage (UHC), as long as it is context-specific and considered within deliberative processes at the country level. Institutionalising robust deliberative processes requires significant time and resources, however, and countries often begin by demanding evidence (including local CEA evidence as well as evidence about local values), whilst striving to strengthen the governance structures and technical capacities with which to generate, consider and act on such evidence. In low- and middle-income countries (LMICs), such capacities could be developed initially around a small technical unit in the health ministry or health insurer. The role of networks, development partners, and global norm setting organisations is crucial in supporting the necessary capacities. © 2017 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  3. Oasis: online analysis of small RNA deep sequencing data.

    PubMed

    Capece, Vincenzo; Garcia Vizcaino, Julio C; Vidal, Ramon; Rahman, Raza-Ur; Pena Centeno, Tonatiuh; Shomroni, Orr; Suberviola, Irantzu; Fischer, Andre; Bonn, Stefan

    2015-07-01

    Oasis is a web application that allows for the fast and flexible online analysis of small-RNA-seq (sRNA-seq) data. It was designed for the end user in the lab, providing an easy-to-use web frontend including video tutorials, demo data and best practice step-by-step guidelines on how to analyze sRNA-seq data. Oasis' exclusive selling points are a differential expression module that allows for the multivariate analysis of samples, a classification module for robust biomarker detection and an advanced programming interface that supports the batch submission of jobs. Both modules include the analysis of novel miRNAs, miRNA targets and functional analyses including GO and pathway enrichment. Oasis generates downloadable interactive web reports for easy visualization, exploration and analysis of data on a local system. Finally, Oasis' modular workflow enables the rapid (re-)analysis of data. Oasis is implemented in Python, R, Java, PHP, C++ and JavaScript. It is freely available at http://oasis.dzne.de. stefan.bonn@dzne.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  4. Scene analysis in the natural environment

    PubMed Central

    Lewicki, Michael S.; Olshausen, Bruno A.; Surlykke, Annemarie; Moss, Cynthia F.

    2014-01-01

    The problem of scene analysis has been studied in a number of different fields over the past decades. These studies have led to important insights into problems of scene analysis, but not all of these insights are widely appreciated, and there remain critical shortcomings in current approaches that hinder further progress. Here we take the view that scene analysis is a universal problem solved by all animals, and that we can gain new insight by studying the problems that animals face in complex natural environments. In particular, the jumping spider, songbird, echolocating bat, and electric fish, all exhibit behaviors that require robust solutions to scene analysis problems encountered in the natural environment. By examining the behaviors of these seemingly disparate animals, we emerge with a framework for studying scene analysis comprising four essential properties: (1) the ability to solve ill-posed problems, (2) the ability to integrate and store information across time and modality, (3) efficient recovery and representation of 3D scene structure, and (4) the use of optimal motor actions for acquiring information to progress toward behavioral goals. PMID:24744740

  5. Self-Stabilizing and Efficient Robust Uncertainty Management

    DTIC Science & Technology

    2011-10-01

    Group decision making in honey bee swarms. American Scientist 94:220-229. Frisch, Karl von (1967) The Dance Language and Orientation of Bees. Cambridge, Mass.: The Belknap Press of Harvard University Press. Thom et al. (21 August 2007) The Scent of the Waggle Dance. PLoS Biology.

  6. Robustness, Diagnostics, Computing and Graphics in Statistics

    DTIC Science & Technology

    1990-01-01

    Lewis, Cornell University; Keaing Lu, Georgia Institute of Technology; Mary Silber, UC Berkeley; Matthew W. Stafford, Loyola University; Mary Lou Zeeman, UC ... wavefronts in excitable media are determined by the manner of recovery to the rest state. The distance between a pair of wavefronts tends to lock at one of

  7. Compressive Oversampling for Robust Data Transmission in Sensor Networks

    DTIC Science & Technology

    2010-01-01

    Mani B. Srivastava, University of California, Los Angeles (Los Angeles, CA 90095); Ting He and Chatschik Bisdikian, IBM T. J. Watson Research Center. Abstract: Data loss in...

  8. Exploration of robust operating conditions in inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Tromp, John W.; Pomares, Mario; Alvarez-Prieto, Manuel; Cole, Amanda; Ying, Hai; Salin, Eric D.

    2003-11-01

    'Robust' conditions, as defined by Mermet and co-workers for inductively coupled plasma (ICP)-atomic emission spectrometry, minimize matrix effects on analyte signals, and are obtained by increasing power and reducing nebulizer gas flow. In ICP-mass spectrometry (MS), it is known that reduced nebulizer gas flow usually leads to more robust conditions such that matrix effects are reduced. In this work, robust conditions for ICP-MS have been determined by optimizing for accuracy in the determination of analytes in a multi-element solution with various interferents (Al, Ba, Cs, K, Na), by varying power, nebulizer gas flow, sample introduction rate and ion lens voltage. The goal of the work was to determine which operating parameters were the most important in reducing matrix effects, and whether different interferents yielded the same robust conditions. Reduction in nebulizer gas flow and in sample input rate led to a significantly decreased interference, while an increase in power seemed to have a lesser effect. Once the other parameters had been adjusted to their robust values, there was no additional improvement in accuracy attainable by adjusting the ion lens voltage. The robust conditions were universal, since, for all the interferents and analytes studied, the optimum was found at the same operating conditions. One drawback to the use of robust conditions was the slightly reduced sensitivity; however, in the context of 'intelligent' instruments, the concept of 'robust conditions' is useful in many cases.
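
    The parameter optimization described above amounts to a grid search for the operating point whose analyte recovery is closest to ideal. A minimal sketch of that search (Python; the response model below is entirely hypothetical and is not fitted to the paper's measurements):

```python
import itertools

# Hypothetical matrix-effect model: recovery (measured/true) as a function of
# RF power (W) and nebulizer gas flow (L/min). Illustrative only: recovery
# improves with higher power and lower nebulizer flow, as the abstract reports.
def recovery(power, neb_flow):
    return 1.0 - 0.5 * (neb_flow - 0.7) - 0.0002 * (1500 - power)

powers = [1100, 1300, 1500]   # candidate RF power settings, W
flows = [0.7, 0.85, 1.0]      # candidate nebulizer gas flows, L/min

# Pick the operating point whose recovery is closest to 1 (best accuracy)
best = min(itertools.product(powers, flows),
           key=lambda p: abs(recovery(*p) - 1.0))
print("most robust conditions (power, flow):", best)
```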

  9. MPLEx: a Robust and Universal Protocol for Single-Sample Integrative Proteomic, Metabolomic, and Lipidomic Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.

    2016-05-03

    Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical). IMPORTANCE: In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type.
Here, we adapted an organic solvent-based extraction method that demonstrated broad applicability and robustness, which enabled comprehensive proteomics, metabolomics, and lipidomics analyses from the same sample.

  10. Image analysis and modeling in medical image computing. Recent developments and advances.

    PubMed

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice, e.g., to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the degree of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present the latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis.
Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in future.

  11. Engineering the quantum states of light in a Kerr-nonlinear resonator by two-photon driving

    NASA Astrophysics Data System (ADS)

    Puri, Shruti; Boutin, Samuel; Blais, Alexandre

    2017-04-01

    Photonic cat states stored in high-Q resonators show great promise for hardware efficient universal quantum computing. We propose an approach to efficiently prepare such cat states in a Kerr-nonlinear resonator by the use of a two-photon drive. Significantly, we show that this preparation is robust against single-photon loss. An outcome of this observation is that a two-photon drive can eliminate undesirable phase evolution induced by a Kerr nonlinearity. By exploiting the concept of transitionless quantum driving, we moreover demonstrate how non-adiabatic initialization of cat states is possible. Finally, we present a universal set of quantum logical gates that can be performed on the engineered eigenspace of such a two-photon driven resonator and discuss a possible realization using superconducting circuits. The robustness of the engineered subspace to higher-order circuit nonlinearities makes this implementation favorable for scalable quantum computation.
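
    The mechanism behind this robustness can be stated compactly. Assuming the standard notation for a two-photon-driven Kerr resonator in the rotating frame (K the Kerr coefficient, E_p the two-photon drive amplitude):

```latex
% Two-photon-driven Kerr resonator; coherent states |±α⟩ are degenerate
% eigenstates when α = sqrt(E_p / K), since a²|±α⟩ = α²|±α⟩.
H = -K\,\hat{a}^{\dagger 2}\hat{a}^{2}
    + \mathcal{E}_p \left( \hat{a}^{\dagger 2} + \hat{a}^{2} \right),
\qquad
H\,|{\pm\alpha}\rangle = \frac{\mathcal{E}_p^{2}}{K}\,|{\pm\alpha}\rangle
\quad \text{for } \alpha = \sqrt{\mathcal{E}_p / K}.
```

    The even and odd cat states, proportional to |α⟩ ± |−α⟩, therefore span a degenerate two-dimensional eigenspace; single-photon loss maps one cat onto the other rather than out of the subspace, which is why the preparation tolerates it.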

  12. No evidence for a significant AGN contribution to cosmic hydrogen reionization

    NASA Astrophysics Data System (ADS)

    Parsa, Shaghayegh; Dunlop, James S.; McLure, Ross J.

    2018-03-01

    We reinvestigate a claimed sample of 22 X-ray detected active galactic nuclei (AGN) at redshifts z > 4, which has reignited the debate as to whether young galaxies or AGN reionized the Universe. These sources lie within the Great Observatories Origins Deep Survey-South (GOODS-S)/Cosmic Assembly Near-Infrared Deep Extragalactic Legacy Survey (CANDELS) field, and we examine both the robustness of the claimed X-ray detections (within the Chandra 4Ms imaging) and perform an independent analysis of the photometric redshifts of the optical/infrared counterparts. We confirm the reality of only 15 of the 22 reported X-ray detections, and moreover find that only 12 of the 22 optical/infrared counterpart galaxies actually lie robustly at z > 4. Combining these results we find convincing evidence for only seven X-ray AGN at z > 4 in the GOODS-S field, of which only one lies at z > 5. We recalculate the evolving far-ultraviolet (1500 Å) luminosity density produced by AGN at high redshift, and find that it declines rapidly from z ≃ 4 to z ≃ 6, in agreement with several other recent studies of the evolving AGN luminosity function. The associated rapid decline in inferred hydrogen ionizing emissivity contributed by AGN falls an order-of-magnitude short of the level required to maintain hydrogen ionization at z ≃ 6. We conclude that all available evidence continues to favour a scenario in which young galaxies reionized the Universe, with AGN making, at most, a very minor contribution to cosmic hydrogen reionization.

  13. The causal effect of increased primary schooling on child mortality in Malawi: Universal primary education as a natural experiment.

    PubMed

    Makate, Marshall; Makate, Clifton

    2016-11-01

    The primary objective of this analysis is to investigate the causal effect of a mother's schooling on under-five health - and the pathways through which schooling operates - by exploiting the exogenous variability in schooling prompted by the 1994 universal primary schooling program in Malawi. This education policy, which eliminated tuition fees across all primary schooling grades, creates an ideal setting for observing the causal influence of improved primary school enrollment on the under-five mortality rates of the subsequent generation. Our analysis uses data from three waves of the nationally representative Malawi Demographic and Health Surveys conducted in 2000, 2004/05, and 2010. To address the potential endogeneity of schooling, we employ the mother's age at the implementation of the tuition-free primary school policy in 1994 as an instrumental variable for the likelihood of finishing primary education. The results suggest that spending one year in school translated to a 3.22 percentage point reduction in mortality for infants and a 6.48 percent reduction for children under age five years. For mothers younger than 19 years, mortality was reduced by 5.95 percentage points. These figures remained approximately the same even after adjusting for potential confounders. However, we failed to find any statistically meaningful effect of the mother's education on neonatal survival. The child mortality estimates are only weakly robust across several robustness checks. We also explored the potential mechanisms by which increased maternal schooling might help enhance child survival. The findings indicate that an added year of maternal schooling considerably improves the likelihood of prenatal care use, literacy levels, and the father's educational level, and alters fertility behavior. 
Our results suggest that increasing primary schooling prospects for young women might help reduce under-five mortality in less-industrialized regions with high under-five mortality, such as sub-Saharan Africa. Copyright © 2016 Elsevier Ltd. All rights reserved.
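    The identification strategy can be illustrated with a toy instrumental-variables computation. The data below are entirely synthetic and not the authors' model: Z mimics policy exposure (being young enough in 1994 to benefit from free schooling), U is an unobserved confounder, and the Wald/2SLS estimator recovers the true effect of schooling S on outcome Y where naive OLS does not:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Synthetic data (illustrative only)
U = rng.normal(size=n)                       # unobserved confounder
Z = rng.binomial(1, 0.5, size=n)             # instrument: policy exposure
S = 1.0 * Z + 0.8 * U + rng.normal(size=n)   # schooling, shifted by the instrument
Y = -0.3 * S + 1.0 * U + rng.normal(size=n)  # outcome; true causal effect is -0.3

# Naive OLS slope is biased by the confounder U
ols = np.cov(S, Y)[0, 1] / np.var(S)

# Wald / 2SLS estimator: cov(Z, Y) / cov(Z, S)
iv = np.cov(Z, Y)[0, 1] / np.cov(Z, S)[0, 1]

print(f"OLS estimate: {ols:+.3f} (biased upward by U)")
print(f"IV  estimate: {iv:+.3f} (close to the true -0.3)")
```

The instrument is valid here by construction (it affects Y only through S); in the study this is the substantive assumption defended for the 1994 policy.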

  14. Usability Analysis within The DataONE Network of Collaborators

    NASA Astrophysics Data System (ADS)

    Budden, A. E.; Frame, M. T.; Tenopir, C.; Volentine, R.

    2014-12-01

    DataONE was conceived as a 10-year project to enable new science and knowledge creation through universal access to data about life on Earth and the environment that sustains it. In Phase I (2009-2014) more than 300 DataONE participants designed, developed and deployed a robust cyberinfrastructure (CI) with innovative services, and directly engaged and educated a broad stakeholder community. DataONE provides a resilient, scalable infrastructure using Member Nodes (data repositories), Coordinating Nodes, and an Investigator Toolkit to support the data access and data management needs of biological, Earth, and environmental science researchers in the U.S. and across the globe. DataONE collaborators, such as the U.S. Geological Survey, the University of New Mexico, and the University of Tennessee, perform research to measure both the current data practices and opinions of DataONE stakeholders and the usability of DataONE for these stakeholders. Stakeholders include scientists, data managers, librarians, and educators, among others. The DataONE Usability and Assessment Working Group, which includes members from multiple sectors, conducts research, development, and implementation projects on DataONE processes, systems, and methods. These projects are essential to ensure that DataONE products and services meet network goals, include appropriate community involvement, and demonstrate the progress and achievements of DataONE. This poster will provide an overview of DataONE's usability analysis and assessment methodologies, benefits to DataONE and its collaborators, and current tools/techniques being utilized by the participants.

  15. A generalized association test based on U statistics.

    PubMed

    Wei, Changshuai; Lu, Qing

    2017-07-01

    Second-generation sequencing technologies are being increasingly used for genetic association studies, where the main research interest is to identify sets of genetic variants that contribute to various phenotypes. The phenotype can be univariate disease status, multivariate responses or even high-dimensional outcomes. Considering the genotype and phenotype as two complex objects, this also poses the general statistical problem of testing association between complex objects. We here propose a similarity-based test, generalized similarity U (GSU), that can test the association between complex objects. We first study the theoretical properties of the test in a general setting and then focus on the application of the test to sequencing association studies. Based on theoretical analysis, we propose to use Laplacian-kernel-based similarity for GSU to boost power and enhance robustness. Through simulation, we find that GSU does have advantages over existing methods in terms of power and robustness. We further performed a whole-genome sequencing (WGS) scan of Alzheimer's Disease Neuroimaging Initiative data, identifying three genes, APOE, APOC1 and TOMM40, associated with the imaging phenotype. We developed a C++ package for analysis of WGS data using GSU. The source code can be downloaded at https://github.com/changshuaiwei/gsu. weichangshuai@gmail.com; qlu@epi.msu.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
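    The kernel-similarity idea can be sketched compactly. The toy below is not the authors' GSU implementation (which handles multivariate/high-dimensional phenotypes and analytic p-values); it only shows a Laplacian-kernel similarity statistic for a univariate phenotype with a permutation p-value, on synthetic genotypes:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 150, 4

# Synthetic minor-allele counts; phenotype driven by the first variant
G = rng.integers(0, 3, size=(n, p)).astype(float)
y = 2.0 * G[:, 0] + 0.5 * rng.normal(size=n)

def laplacian_gram(X, scale):
    # pairwise L1 distances, then a Laplacian kernel exp(-d / scale)
    X = X.reshape(len(X), -1)
    d = np.abs(X[:, None, :] - X[None, :, :]).sum(-1)
    return np.exp(-d / scale)

Kg = laplacian_gram(G, scale=p)    # genotype similarity (bandwidths illustrative)
Ky = laplacian_gram(y, scale=2.0)  # phenotype similarity

iu = np.triu_indices(n, k=1)       # each unordered pair enters the statistic once
def ustat(Kg, Ky):
    a = Kg[iu] - Kg[iu].mean()
    b = Ky[iu] - Ky[iu].mean()
    return float((a * b).mean())

obs = ustat(Kg, Ky)

# Permutation null: relabel individuals in the phenotype similarity matrix
null = []
for _ in range(199):
    idx = rng.permutation(n)
    null.append(ustat(Kg, Ky[np.ix_(idx, idx)]))
pval = (1 + sum(u >= obs for u in null)) / 200
print(f"similarity U = {obs:.4f}, permutation p = {pval:.3f}")
```

Positive cross-similarity between the two Gram matrices indicates that genotypically similar individuals are also phenotypically similar, which is the association being tested.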

  16. Robust Variance Estimation with Dependent Effect Sizes: Practical Considerations Including a Software Tutorial in Stata and SPSS

    ERIC Educational Resources Information Center

    Tanner-Smith, Emily E.; Tipton, Elizabeth

    2014-01-01

    Methodologists have recently proposed robust variance estimation as one way to handle dependent effect sizes in meta-analysis. Software macros for robust variance estimation in meta-analysis are currently available for Stata (StataCorp LP, College Station, TX, USA) and SPSS (IBM, Armonk, NY, USA), yet there is little guidance for authors regarding…

  17. Investigation of Air Transportation Technology at Princeton University, 1989-1990

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1990-01-01

    The Air Transportation Technology Program at Princeton University proceeded along six avenues during the past year: microburst hazards to aircraft; machine-intelligent, fault tolerant flight control; computer aided heuristics for piloted flight; stochastic robustness for flight control systems; neural networks for flight control; and computer aided control system design. These topics are briefly discussed, and an annotated bibliography of publications that appeared between January 1989 and June 1990 is given.

  18. Recruitment, selection and retention of nursing and midwifery students in Scottish Universities.

    PubMed

    Rodgers, Sheila; Stenhouse, Rosie; McCreaddie, May; Small, Pauline

    2013-11-01

    High attrition rates from pre-registration nursing and midwifery programmes have been reported both in the UK and in other countries. A study was conducted to identify best practice in recruitment, selection and retention across Scottish universities providing pre-registration programmes. A survey of all 10 universities providing pre-registration nursing and/or midwifery programmes in Scotland was conducted; all agreed to take part. Semi-structured face-to-face and telephone interviews (18 in total) were conducted with key personnel in each university, and documentary evidence was collected to supplement the interview data and evidence recruitment, selection and retention practices. All data were subject to thematic analysis. Universities are predominantly concerned with recruiting to the institution and not to the professions. Interviews are widely used, and are a requirement in the United Kingdom; however, there is no evidence base within the literature that they have predictive validity, despite universities creating scales and scoring systems, which are largely unvalidated. The study identified initiatives aimed at addressing attrition/retention, but most had not been evaluated, often due to the multi-factorial nature of attrition/retention and difficulties with measurement. Recruitment, selection and retention initiatives were rarely evaluated and, where they were, the evaluation adopted a relatively superficial approach. Evidence from existing studies to support practices was mostly weakly supportive or absent. The study highlights the need for a coordinated approach, supporting the development of a robust evidence base through the evaluation of local initiatives and of new strategies. 
Evaluation strategies must take account of the local context to facilitate transferability of findings across different settings. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Adiabatic gate teleportation.

    PubMed

    Bacon, Dave; Flammia, Steven T

    2009-09-18

    The difficulty in producing precisely timed and controlled quantum gates is a significant source of error in many physical implementations of quantum computers. Here we introduce a simple universal primitive, adiabatic gate teleportation, which is robust to timing errors and many control errors and maintains a constant energy gap throughout the computation above a degenerate ground state space. This construction allows for geometric robustness based upon the control of two independent qubit interactions. Further, our piecewise adiabatic evolution easily relates to the quantum circuit model, enabling the use of standard methods from fault-tolerance theory for establishing thresholds.

  20. Finite-time robust control of uncertain fractional-order Hopfield neural networks via sliding mode control

    NASA Astrophysics Data System (ADS)

    Xi, Yangui; Yu, Yongguang; Zhang, Shuo; Hai, Xudong

    2018-01-01

    Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11371049 and 61772063) and the Fundamental Research Funds for the Central Universities, China (Grant No. 2016JBM070).

  1. Robustness analysis of non-ordinary Petri nets for flexible assembly systems

    NASA Astrophysics Data System (ADS)

    Hsieh, Fu-Shiung

    2010-05-01

    Non-ordinary controlled Petri nets (NCPNs) have the advantage of modelling flexible assembly systems in which multiple identical resources may be required to perform an operation. However, existing studies on NCPNs are still limited; for example, their robustness properties have not been studied. This motivates us to develop an analysis method for NCPNs. Robustness analysis concerns the ability of a system to maintain operation in the presence of uncertainties, and provides a way to analyse a perturbed system without reanalysis. In our previous research, we analysed the robustness properties of several subclasses of ordinary controlled Petri nets. To study the robustness properties of NCPNs, we augment them with an uncertainty model that specifies an upper bound on the uncertainties for each reachable marking; the resulting models are called non-ordinary controlled Petri nets with uncertainties (NCPNU). Based on NCPNU, the problem is to characterise the maximal tolerable uncertainties for each reachable marking. The computational complexity of this characterisation grows exponentially with the size of the net. Instead of considering general NCPNU, we therefore limit our scope to a subclass for assembly systems, non-ordinary controlled flexible assembly Petri nets with uncertainties (NCFAPNU), and study their robustness. We identify two types of uncertainties under which the liveness of NCFAPNU can be maintained.
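    A minimal illustration of the non-ordinary (weighted-arc) setting and marking-level robustness, using a toy net that is not from the paper: a transition needing two identical robots stays enabled only while the perturbed marking retains at least two free robots, so the maximal tolerable uncertainty on the resource place is one token here:

```python
import numpy as np

# Toy non-ordinary Petri net: one assembly step needing 2 identical robots.
# Places: p0 = parts waiting, p1 = free robots, p2 = assembled parts.
# Pre/Post matrices (rows = places, cols = transitions); the weight-2 arc
# on the robot place is what makes the net non-ordinary.
Pre  = np.array([[1],   # t0 consumes 1 part
                 [2],   # t0 consumes 2 robots (weighted arc)
                 [0]])
Post = np.array([[0],
                 [2],   # robots are returned after assembly
                 [1]])

def enabled(marking, t):
    return bool(np.all(marking >= Pre[:, t]))

def fire(marking, t):
    assert enabled(marking, t)
    return marking - Pre[:, t] + Post[:, t]

m = np.array([3, 3, 0])              # 3 parts, 3 free robots
# Uncertainty: remove robots from the marking and check liveness of t0
for lost in range(4):
    m_pert = m - np.array([0, lost, 0])
    print(f"robots lost = {lost}: t0 enabled -> {enabled(m_pert, 0)}")
```

With the marking above, t0 survives the loss of one robot but not two, which is exactly the kind of per-marking tolerable-uncertainty bound the paper's analysis characterises.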

  2. Study of archaeological coins of different dynasties using libs coupled with multivariate analysis

    NASA Astrophysics Data System (ADS)

    Awasthi, Shikha; Kumar, Rohit; Rai, G. K.; Rai, A. K.

    2016-04-01

    Laser-Induced Breakdown Spectroscopy (LIBS) is an atomic emission spectroscopic technique with the unique capability of serving as an in-situ monitoring tool for detecting and quantifying the elements present in artifacts. Archaeological coins collected from the G.R. Sharma Memorial Museum, University of Allahabad, India, obtained from the excavation of Kausambi, Uttar Pradesh, India, were analyzed using the LIBS technique. A LIBS system assembled in the laboratory (Nd:YAG laser, 532 nm, 4 ns pulse width FWHM, with an Ocean Optics LIBS 2000+ spectrometer) was employed for spectral acquisition. Spectral lines of Ag, Cu, Ca, Sn, Si, Fe and Mg were identified in the LIBS spectra of the different coins. LIBS combined with multivariate analysis plays an effective role in classifying the coins from the contributions of their spectral lines. The discrimination between five coins of archaeological interest was carried out using Principal Component Analysis (PCA). The results show the relevance of this methodology for elemental identification and classification of artifacts with high accuracy and robustness.
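    The PCA step can be sketched on synthetic LIBS-like spectra. The Cu (324.7/327.4 nm) and Ag (328.1/338.3 nm) positions are standard emission lines, but everything else below (line widths, class compositions, noise level) is illustrative and not the coin data:

```python
import numpy as np

rng = np.random.default_rng(0)
wl = np.linspace(200, 900, 500)          # wavelength axis in nm (illustrative grid)

def line(center, width=2.0):
    # Gaussian emission-line profile
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

def spectrum(cu, ag):
    # Two synthetic "alloy classes" differing in their Cu/Ag line ratios
    s = cu * (line(324.7) + line(327.4)) + ag * (line(328.1) + line(338.3))
    return s + 0.02 * rng.normal(size=wl.size)

X = np.array([spectrum(1.0, 0.2) for _ in range(10)] +
             [spectrum(0.3, 1.0) for _ in range(10)])

# PCA via SVD of the mean-centered spectra
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                   # projection onto the first two PCs

print("PC1 scores, class A:", np.round(scores[:10, 0], 2))
print("PC1 scores, class B:", np.round(scores[10:, 0], 2))
```

The two composition classes separate cleanly along the first principal component, which is the mechanism behind the coin discrimination reported in the abstract.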

  3. Adherence to infection control guidelines in surgery on MRSA positive patients : A cost analysis.

    PubMed

    Saegeman, V; Schuermans, A

    2016-09-01

    In surgical units, as in other healthcare departments, guidelines are used to curb transmission of methicillin-resistant Staphylococcus aureus (MRSA). The aim of this study was to calculate the extra material costs and extra working hours required for compliance with MRSA infection control guidelines in the operating rooms of a University Hospital. The study was based on observations of surgeries on MRSA-positive patients. The average cost per surgery was calculated using local information on unit costs, and the robustness of the calculations was evaluated with a sensitivity analysis. The total extra cost of adherence to MRSA infection control guidelines averaged € 340.46 per surgical procedure (range € 207.76-€ 473.15). A sensitivity analysis based on a standardized operating room hourly rate reached a cost of € 366.22. The extra costs of adherence to infection control guidelines are considerable. To reduce costs, the logistical planning of surgeries could be improved, for instance by using a dedicated operating room.

  4. Comparison of software packages for detecting differential expression in RNA-seq studies.

    PubMed

    Seyednasrollah, Fatemeh; Laiho, Asta; Elo, Laura L

    2015-01-01

    RNA-sequencing (RNA-seq) has rapidly become a popular tool to characterize transcriptomes. A fundamental research problem in many RNA-seq studies is the identification of reliable molecular markers that show differential expression between distinct sample groups. Together with the growing popularity of RNA-seq, a number of data analysis methods and pipelines have already been developed for this task. Currently, however, there is no clear consensus about the best practices yet, which makes the choice of an appropriate method a daunting task especially for a basic user without a strong statistical or computational background. To assist the choice, we perform here a systematic comparison of eight widely used software packages and pipelines for detecting differential expression between sample groups in a practical research setting and provide general guidelines for choosing a robust pipeline. In general, our results demonstrate how the data analysis tool utilized can markedly affect the outcome of the data analysis, highlighting the importance of this choice. © The Author 2013. Published by Oxford University Press.

  5. Advancements in robust algorithm formulation for speaker identification of whispered speech

    NASA Astrophysics Data System (ADS)

    Fan, Xing

    Whispered speech is an alternative speech production mode to neutral speech, used intentionally by talkers in natural conversational scenarios to protect privacy and to avoid certain content being overheard or made public. Due to the profound differences between whispered and neutral speech in the production mechanism, and the absence of whispered adaptation data, the performance of speaker identification systems trained on neutral speech degrades significantly. This dissertation therefore focuses on developing a robust closed-set speaker recognition system for whispered speech using no or limited whispered adaptation data from non-target speakers. It proposes the concept of "High"/"Low" performance whispered data for the purpose of speaker identification, and identifies a variety of acoustic properties that contribute to the quality of whispered data. An acoustic analysis is also conducted to compare the phoneme/speaker dependency of the differences between whispered and neutral data in the feature domain. The observations from these acoustic analyses are new in this area and serve as guidance for developing robust speaker identification systems for whispered speech. The dissertation further proposes two systems for speaker identification of whispered speech. One focuses on front-end processing: a two-dimensional feature space is proposed to search for "Low"-quality whispered utterances, and separate feature mapping functions are applied to vowels and consonants in order to retain the speaker information shared between whispered and neutral speech. The other focuses on speech-mode-independent model training: the proposed method generates pseudo-whispered features from neutral features using the statistical information contained in a whispered Universal Background Model (UBM) trained on additional whispered data collected from non-target speakers. 
Four modeling methods are proposed for the transformation estimation used to generate the pseudo-whispered features. Both systems demonstrate a significant improvement over the baseline system on the evaluation data. This dissertation has therefore contributed a scientific understanding of the differences between whispered and neutral speech, as well as improved front-end processing and modeling methods for speaker identification of whispered speech. Such advancements will ultimately help improve the robustness of speech processing systems.

  6. GPS baseline configuration design based on robustness analysis

    NASA Astrophysics Data System (ADS)

    Yetkin, M.; Berber, M.

    2012-11-01

    The robustness analysis results obtained from a Global Positioning System (GPS) network are dramatically influenced by the configuration of the observed baselines. The selection of optimal GPS baselines may allow for a cost-effective survey campaign and a sufficiently robust network. Furthermore, using the approach described in this paper, the required number of sessions, the baselines to be observed, and the significance levels for statistical testing and robustness analysis can be determined even before the GPS campaign starts. In this study, we propose a robustness criterion for the optimal design of geodetic networks, and present a very simple and efficient algorithm based on this criterion for the selection of optimal GPS baselines. We also show the relationship between the number of sessions and the non-centrality parameter. Finally, a numerical example is given to verify the efficacy of the proposed approach.
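    Screening a candidate observation plan for reliability can be sketched via redundancy numbers, r_i = diag(I - A(AᵀPA)⁻¹AᵀP): observations with r_i near zero are barely checked by the rest of the network and weaken its robustness. The toy design matrix below (a small leveling-style network, not a GPS session plan from the paper) illustrates the computation:

```python
import numpy as np

# Toy network: 3 unknown heights (A, B, C), 4 observed height differences.
# Rows of the design matrix correspond to the observations (illustrative).
A = np.array([
    [ 1,  0,  0],   # fixed benchmark -> A
    [-1,  1,  0],   # A -> B
    [ 0, -1,  1],   # B -> C
    [ 1,  0, -1],   # C -> A (loop-closing observation)
], dtype=float)
P = np.eye(len(A))               # equal weights for simplicity

Nmat = A.T @ P @ A               # normal matrix
R = np.eye(len(A)) - A @ np.linalg.inv(Nmat) @ A.T @ P
r = np.diag(R)                   # redundancy numbers; their sum equals m - n
print("redundancy numbers:", np.round(r, 3), " sum =", round(float(r.sum()), 3))
```

Adding a baseline that raises the minimum r_i improves internal reliability, which is the kind of pre-campaign screening the proposed criterion formalizes.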

  7. Gap-metric-based robustness analysis of nonlinear systems with full and partial feedback linearisation

    NASA Astrophysics Data System (ADS)

    Al-Gburi, A.; Freeman, C. T.; French, M. C.

    2018-06-01

    This paper uses gap metric analysis to derive robustness and performance margins for feedback linearising controllers. Distinct from previous robustness analysis, it incorporates the case of output unstructured uncertainties, and is shown to yield general stability conditions which can be applied to both stable and unstable plants. It then expands on existing feedback linearising control schemes by introducing a more general robust feedback linearising control design which classifies the system nonlinearity into stable and unstable components and cancels only the unstable plant nonlinearities. This is done in order to preserve the stabilising action of the inherently stabilising nonlinearities. Robustness and performance margins are derived for this control scheme, and are expressed in terms of bounds on the plant nonlinearities and the accuracy of the cancellation of the unstable plant nonlinearity by the controller. Case studies then confirm reduced conservatism compared with standard methods.

  8. Universality of the Peregrine Soliton in the Focusing Dynamics of the Cubic Nonlinear Schrödinger Equation

    NASA Astrophysics Data System (ADS)

    Tikan, Alexey; Billet, Cyril; El, Gennady; Tovbis, Alexander; Bertola, Marco; Sylvestre, Thibaut; Gustave, Francois; Randoux, Stephane; Genty, Goëry; Suret, Pierre; Dudley, John M.

    2017-07-01

    We report experimental confirmation of the universal emergence of the Peregrine soliton predicted to occur during pulse propagation in the semiclassical limit of the focusing nonlinear Schrödinger equation. Using an optical fiber based system, measurements of temporal focusing of high power pulses reveal both intensity and phase signatures of the Peregrine soliton during the initial nonlinear evolution stage. Experimental and numerical results are in very good agreement, and show that the universal mechanism that yields the Peregrine soliton structure is highly robust and can be observed over a broad range of parameters.
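    The semiclassical temporal focusing itself is easy to reproduce qualitatively with a split-step Fourier integration of the focusing NLSE (normalized units; the pulse and grid parameters below are illustrative, and this sketch only shows the peak growth of nonlinear focusing, not the full Peregrine characterization):

```python
import numpy as np

# First-order split-step Fourier method for  i u_z + (1/2) u_tt + |u|^2 u = 0
nt, T = 1024, 40.0
t = np.linspace(-T / 2, T / 2, nt, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(nt, d=T / nt)

u = 2.0 * np.exp(-t**2 / 8)          # high-power input pulse (illustrative)
dz, nz = 1e-3, 1200

peak = [np.abs(u).max()]
for _ in range(nz):
    u = np.fft.ifft(np.exp(-0.5j * w**2 * dz) * np.fft.fft(u))  # linear step
    u = u * np.exp(1j * np.abs(u)**2 * dz)                      # nonlinear step
    peak.append(np.abs(u).max())

print(f"peak amplitude grew from {peak[0]:.2f} to {max(peak):.2f} during propagation")
```

The strong growth of the temporal peak is the focusing stage in which the experiment's intensity and phase measurements reveal the local Peregrine soliton structure.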

  9. Investigation of air transportation technology at Princeton University, 1992-1993

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1994-01-01

    The Air Transportation Research Program at Princeton University proceeded along five avenues during the past year: (1) Flight Control System Robustness; (2) Microburst Hazards to Aircraft; (3) Wind Rotor Hazards to Aircraft; (4) Intelligent Aircraft/Airspace Systems; and (5) Aerospace Optical Communications. This research resulted in a number of publications, including theses, archival papers, and conference papers. An annotated bibliography of publications that appeared between June 1992 and June 1993 is included. The research that these papers describe was supported in whole or in part by the Joint University Program, including work that was completed prior to the reporting period.

  10. Robust Control Systems.

    DTIC Science & Technology

    1981-12-01

    time control system algorithms that will perform adequately (i.e., at least maintain closed-loop system stability) when uncertain parameters in the...system design models vary significantly. Such a control algorithm is said to have stability robustness, or more simply is said to be "robust". This...cases above, the performance is analyzed using a covariance analysis. The development of all the controllers and the performance analysis algorithms is
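    Covariance analysis of this kind can be sketched for a discrete-time closed loop x+ = (A - BK)x + w: iterate the Lyapunov recursion to the steady-state state covariance, then sweep an uncertain plant parameter to see how stability and performance degrade. All numbers below are illustrative, not from the report:

```python
import numpy as np

# Closed loop x_{k+1} = (A - B K) x_k + w_k, with w_k ~ N(0, W)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
K = np.array([[4.0, 3.0]])      # a fixed state-feedback gain (illustrative)
W = 0.01 * np.eye(2)

def steady_state_cov(Acl, W, iters=5000):
    # Lyapunov recursion P <- Acl P Acl' + W converges when Acl is stable
    P = np.zeros_like(W)
    for _ in range(iters):
        P = Acl @ P @ Acl.T + W
    return P

# Robustness check: sweep an uncertain input gain and watch the state variance
for b_uncertain in (0.08, 0.10, 0.12):
    Bp = np.array([[0.0], [b_uncertain]])
    Acl = A - Bp @ K
    stable = np.max(np.abs(np.linalg.eigvals(Acl))) < 1
    rms = np.sqrt(steady_state_cov(Acl, W)[0, 0]) if stable else float("inf")
    print(f"b = {b_uncertain}: stable = {stable}, rms(x1) = {rms:.3f}")
```

The sweep mirrors the report's theme: the gain must keep the loop stable, and the covariance acceptable, across the uncertain parameter range rather than only at the nominal value.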

  11. Universal Influenza B Virus Genomic Amplification Facilitates Sequencing, Diagnostics, and Reverse Genetics

    PubMed Central

    Zhou, Bin; Lin, Xudong; Wang, Wei; Halpin, Rebecca A.; Bera, Jayati; Stockwell, Timothy B.; Barr, Ian G.

    2014-01-01

    Although human influenza B virus (IBV) is a significant human pathogen, its great genetic diversity has limited our ability to universally amplify the entire genome for subsequent sequencing or vaccine production. The generation of sequence data via next-generation approaches and the rapid cloning of viral genes are critical for basic research, diagnostics, antiviral drugs, and vaccines to combat IBV. To overcome the difficulty of amplifying the diverse and ever-changing IBV genome, we developed and optimized techniques that amplify the complete segmented negative-sense RNA genome from any IBV strain in a single tube/well (IBV genomic amplification [IBV-GA]). Amplicons for >1,000 diverse IBV genomes from different sample types (e.g., clinical specimens) were generated and sequenced using this robust technology. These approaches are sensitive, robust, and sequence independent (i.e., universally amplify past, present, and future IBVs), which facilitates next-generation sequencing and advanced genomic diagnostics. Importantly, special terminal sequences engineered into the optimized IBV-GA2 products also enable ligation-free cloning to rapidly generate reverse-genetics plasmids, which can be used for the rescue of recombinant viruses and/or the creation of vaccine seed stock. PMID:24501036

  12. Robust Flutter Margin Analysis that Incorporates Flight Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Martin J.

    1998-01-01

    An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.
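    The worst-case idea can be illustrated numerically. The sketch below is a crude sampled search, not the structured singular value computation: for a second-order aeroelastic-style mode with uncertain stiffness and damping (all values illustrative), it grows the uncertainty radius until a sampled perturbation destabilizes the mode, approximating the worst-case margin (0.05 here, set by the damping entry):

```python
import numpy as np

# Uncertain second-order mode: x' = A(d1, d2) x, where d1 perturbs stiffness
# and d2 perturbs damping (illustrative model, not flight data)
def A(d1, d2):
    return np.array([[0.0, 1.0],
                     [-(1.0 + d1), -(0.05 + d2)]])

def stable(M):
    return np.max(np.linalg.eigvals(M).real) < 0

# Sampled worst-case search: smallest uncertainty radius containing a
# destabilizing direction among the samples (optimistic vs. the true worst case,
# which mu would bound rigorously)
margin = None
thetas = np.linspace(0, 2 * np.pi, 64, endpoint=False)
for r in np.linspace(0, 0.2, 201):
    if any(not stable(A(r * np.cos(th), r * np.sin(th))) for th in thetas):
        margin = r
        break
print(f"sampled worst-case margin = {margin:.3f} (true margin is 0.05 for this model)")
```

The point of mu is to compute such worst-case margins exactly for structured uncertainty, where sampling can only ever be optimistic.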

  13. New development of the image matching algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoqiang; Feng, Zhao

    2018-04-01

    To study image matching algorithms, the four elements of an algorithm are described, i.e., similarity measurement, feature space, search space and search strategy. Four common indexes for evaluating image matching algorithms are also described, i.e., matching accuracy, matching efficiency, robustness and universality. The paper then describes the principles of image matching algorithms based on gray values, features, frequency-domain analysis, neural networks and semantic recognition, and analyzes their characteristics and latest research achievements. Finally, the development trend of image matching algorithms is discussed. This study is significant for algorithm improvement, new algorithm design and algorithm selection in practice.
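    A concrete instance of the gray-value family, with the four elements visible: normalized cross-correlation as the similarity measurement, raw intensities as the feature space, all integer offsets as the search space, and exhaustive scanning as the search strategy (synthetic image, illustrative sizes):

```python
import numpy as np

rng = np.random.default_rng(0)

image = rng.normal(size=(64, 64))
top, left = 20, 33                        # ground-truth template location
template = image[top:top + 16, left:left + 16].copy()

def ncc(patch, tmpl):
    # normalized cross-correlation: invariant to affine intensity changes
    a = patch - patch.mean()
    b = tmpl - tmpl.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

h, w = template.shape
scores = np.array([[ncc(image[i:i + h, j:j + w], template)
                    for j in range(image.shape[1] - w + 1)]
                   for i in range(image.shape[0] - h + 1)])
best = np.unravel_index(scores.argmax(), scores.shape)
print("recovered offset:", best, " NCC =", round(float(scores[best]), 3))
```

Exhaustive NCC is robust to brightness/contrast changes but not to rotation or scale, which is exactly where the feature-based and learned approaches surveyed here take over.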

  14. Developing Optimized Trajectories Derived from Mission and Thermo-Structural Constraints

    NASA Technical Reports Server (NTRS)

    Lear, Matthew H.; McGrath, Brian E.; Anderson, Michael P.; Green, Peter W.

    2008-01-01

    In conjunction with NASA and the Department of Defense, the Johns Hopkins University Applied Physics Laboratory (JHU/APL) has been investigating analytical techniques to address many of the fundamental issues associated with solar exploration spacecraft and high-speed atmospheric vehicle systems. These issues include: thermo-structural response including the effects of thermal management via the use of surface optical properties for high-temperature composite structures; aerodynamics with the effects of non-equilibrium chemistry and gas radiation; and aero-thermodynamics with the effects of material ablation for a wide range of thermal protection system (TPS) materials. The need exists to integrate these discrete tools into a common framework that enables the investigation of interdisciplinary interactions (including analysis tool, applied load, and environment uncertainties) to provide high fidelity solutions. In addition to developing robust tools for the coupling of aerodynamically induced thermal and mechanical loads, JHU/APL has been studying the optimal design of high-speed vehicles as a function of their trajectory. Under traditional design methodology the optimization of system level mission parameters such as range and time of flight is performed independently of the optimization for thermal and mechanical constraints such as stress and temperature. A truly optimal trajectory should optimize over the entire range of mission and thermo-mechanical constraints. Under this research, a framework for the robust analysis of high-speed spacecraft and atmospheric vehicle systems has been developed. It has been built around a generic, loosely coupled framework such that a variety of readily available analysis tools can be used. The methodology immediately addresses many of the current analysis inadequacies and allows for future extension in order to handle more complex problems.

  15. Using cost-effectiveness analysis to evaluate targeting strategies: the case of vitamin A supplementation.

    PubMed

    Loevinsohn, B P; Sutter, R W; Costales, M O

    1997-03-01

    Given the demonstrated efficacy of vitamin A supplements in reducing childhood mortality, health officials now have to decide whether it would be efficient to target the supplements to high-risk children. Decisions about targeting are complex because they depend on a number of factors: the degree of clustering of preventable deaths, the cost of the intervention, the side-effects of the intervention, the cost of identifying the high-risk group, and the accuracy of the 'diagnosis' of risk. A cost-effectiveness analysis was used in the Philippines to examine whether vitamin A supplements should be given universally to all children aged 6-59 months, targeted broadly to children suffering from mild, moderate, or severe malnutrition, or targeted narrowly to pre-schoolers with moderate and severe malnutrition. The first-year average cost of the universal approach was US$67.21 per death averted, compared to $144.12 and $257.20 for the broad and narrow targeting approaches respectively. When subjected to sensitivity analysis, the conclusion about the most cost-effective strategy was robust to changes in underlying assumptions such as the efficacy of supplements, clustering of deaths, and toxicity. Targeting vitamin A supplements to high-risk children is not an efficient use of resources. Based on the results of this cost-effectiveness analysis and a consideration of alternative strategies, it is apparent that vitamin A, like immunization, should be provided to all pre-schoolers in the developing world. Questions about targeting public health interventions can usefully be addressed by cost-effectiveness analysis.
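    The core calculation is simply cost per death averted, compared across strategies and stress-tested. In the sketch below the three ratios approximately match the abstract's reported figures, but the cost/deaths decompositions behind them, and the halved-efficacy scenario, are invented for illustration:

```python
# Cost-effectiveness ratio: programme cost divided by deaths averted
def cer(total_cost, deaths_averted):
    return total_cost / deaths_averted

# Decompositions chosen so the ratios land near the reported US$ figures
# (67.21, 144.12, 257.20); the absolute costs and death counts are assumptions.
strategies = {
    "universal": cer(100_000, 1488),
    "broad":     cer(80_000, 555),
    "narrow":    cer(60_000, 233),
}
best = min(strategies, key=strategies.get)
print("most cost-effective strategy:", best)

# One-way sensitivity sketch: even if universal delivery averted only half as
# many deaths (a pessimistic efficacy assumption), its ratio would still beat
# broad targeting, echoing the abstract's robustness finding.
pessimistic_universal = cer(100_000, 1488 / 2)
print(f"universal (halved efficacy): ${pessimistic_universal:.2f} "
      f"vs broad ${strategies['broad']:.2f} per death averted")
```

The sensitivity step is the decision-relevant part: a ranking that survives pessimistic assumptions is what justifies the universal recommendation.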

  16. Robust Bonding of Tough Double Network Hydrogel to Bone

    NASA Astrophysics Data System (ADS)

    Nonoyama, Takayuki; Wada, Susumu; Kiyama, Ryuji; Kitamura, Nobuto; Kurokawa, Takayuki; Nakajima, Tasuku; Yasuda, Kazunori; Gong, Jian Ping

    Tough double-network (DN) hydrogels are promising candidates for next-generation artificial cartilage from the viewpoints of low friction, water storage capability and toughness. For practical use, the hydrogel must be strongly fixed at the joint. However, strong fixation of such hydrogels to other materials (tissues) has not been achieved, because the surface property of a hydrogel is almost equal to that of water due to its high water content; robust adhesion for fixation and low friction for lithe motion are therefore in a trade-off relation. Here, we report robust fixation of hydroxyapatite (HAp)-mineralized DN hydrogel to bone without any toxicity. HAp is the main inorganic component of bone tissue and has osteoconductive capability. After 4 weeks of implantation of the HAp/DN gel into the rabbit femoral groove, robust fixation between bone and HAp/DN gel, exceeding the strength of the gel matrix itself, was achieved. The methodology is universal for new biomaterials that must be fixed to bone, such as ligament and tendon systems.

  17. Robust interface between flying and topological qubits

    PubMed Central

    Xue, Zheng-Yuan; Gong, Ming; Liu, Jia; Hu, Yong; Zhu, Shi-Liang; Wang, Z. D.

    2015-01-01

    Hybrid architectures, consisting of conventional and topological qubits, have recently attracted much attention due to their capability of consolidating the robustness of topological qubits and the universality of conventional qubits. However, these two kinds of qubits are normally constructed at significantly different energy scales, so the energy mismatch is a major obstacle to the coupling that would support the exchange of quantum information between them. Here we propose a microwave photonic quantum bus for a strong direct coupling between the topological and conventional qubits, where the energy mismatch is compensated by an external driving field. In the framework of tight-binding simulation and a perturbation approach, we show that the energy splitting of Majorana fermions in a finite-length nanowire, which we use to define topological qubits, is still robust against local perturbations due to the topology of the system. Therefore, the present scheme realizes a rather robust interface between the flying and topological qubits. Finally, we demonstrate that this quantum bus can also be used to generate multipartite entangled states with the topological qubits. PMID:26216201

  18. Cost-effectiveness analysis of different types of labor for singleton pregnancy: real life data.

    PubMed

    Lakić, Dragana; Petrović, Branko; Petrova, Guenka

    2014-01-01

    Views on the conduct of labor have changed over time, and significant differences exist between obstetric centers. The aim was to assess the cost, clinical outcomes and cost-effectiveness of different types of labor in singleton pregnancies. A decision model was used to compare vaginal labor, induced labor and planned cesarean section. All data were taken from the Book of Labor of the University Hospital for Gynecology and Obstetrics "Narodni Front", Belgrade, Serbia for labors conducted during a one-month period in 2011. Successful delivery (i.e. labor that began up to 42 gestation weeks, without maternal mortality and with newborn Apgar scores greater than or equal to seven in the fifth minute of life) was considered the outcome of the cost-effectiveness analysis. To test the robustness of this definition, probabilistic sensitivity analysis was performed. From a total of 667 births, vaginal labor was conducted in 98 cases and induced vaginal labor in 442, while planned cesarean section was performed 127 times. The rate of emergency cesarean section as a complication was much higher in the vaginal labor cohort than in the induced vaginal cohort (OR=17.374; 95% CI: 8.522 to 35.418; p<0.001). The least costly type of labor was induced vaginal labor: average cost 461 euro, with an effectiveness of 98.17%. Both vaginal and planned cesarean labor were dominated by induced labor. The results were robust. Elective induction of labor was associated with the lowest cost compared to other types of labor, with favorable maternal and neonatal outcomes.
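    The reported association (OR=17.374; 95% CI: 8.522 to 35.418) is a standard odds ratio with a Wald confidence interval computed from a 2x2 table. A minimal sketch with hypothetical counts, not the study's actual data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = events/non-events in the exposed group,
    c/d = events/non-events in the unexposed group."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 20 emergency sections out of 98 vaginal labors
# versus 12 out of 442 induced labors.
or_, lo, hi = odds_ratio_ci(20, 78, 12, 430)
print(f"OR={or_:.2f}, 95% CI: {lo:.2f} to {hi:.2f}")
```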

  19. Quantum logic between remote quantum registers

    NASA Astrophysics Data System (ADS)

    Yao, N. Y.; Gong, Z.-X.; Laumann, C. R.; Bennett, S. D.; Duan, L.-M.; Lukin, M. D.; Jiang, L.; Gorshkov, A. V.

    2013-02-01

    We consider two approaches to dark-spin-mediated quantum computing in hybrid solid-state spin architectures. First, we review the notion of eigenmode-mediated unpolarized spin-chain state transfer and extend the analysis to various experimentally relevant imperfections: quenched disorder, dynamical decoherence, and uncompensated long-range coupling. In finite-length chains, the interplay between disorder-induced localization and decoherence yields a natural optimal channel fidelity, which we calculate. Long-range dipolar couplings induce a finite intrinsic lifetime for the mediating eigenmode; extensive numerical simulations of dipolar chains of lengths up to L=12 show remarkably high fidelity despite these decay processes. We further briefly consider the extension of the protocol to bosonic systems of coupled oscillators. Second, we introduce a quantum mirror based architecture for universal quantum computing that exploits all of the dark spins in the system as potential qubits. While this dramatically increases the number of qubits available, the composite operations required to manipulate dark-spin qubits significantly raise the error threshold for robust operation. Finally, we demonstrate that eigenmode-mediated state transfer can enable robust long-range logic between spatially separated nitrogen-vacancy registers in diamond; disorder-averaged numerics confirm that high-fidelity gates are achievable even in the presence of moderate disorder.

  20. The cost-effectiveness of targeted or universal screening for vasa praevia at 18-20 weeks of gestation in Ontario.

    PubMed

    Cipriano, L E; Barth, W H; Zaric, G S

    2010-08-01

    To estimate the cost-effectiveness of targeted and universal screening for vasa praevia at 18-20 weeks of gestation in singleton and twin pregnancies. Cost-utility analysis based on a decision-analytic model comparing relevant strategies and life-long outcomes for mother and infant(s). Ontario, Canada. A cohort of pregnant women in 1 year. We constructed a decision-analytic model to estimate the lifetime incremental costs and benefits of screening for vasa praevia. Inputs were estimated from the literature. Costs were collected from the London Health Sciences Centre, the Ontario Health Insurance Program, and other sources. We used one-way, scenario and probabilistic sensitivity analyses to determine the robustness of the results. Incremental costs, life expectancy, quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratio (ICER). Universal transvaginal ultrasound screening of twin pregnancies has an ICER of $5488 per QALY gained. Screening all singleton pregnancies with the risk factors low-lying placenta, in vitro fertilisation (IVF) conception, accessory placental lobes, or velamentous cord insertion has an ICER of $15,764 per QALY gained, even though identifying some of these risk factors requires routine use of colour Doppler during transabdominal examinations. Screening women with a marginal cord insertion costs an additional $27,603 per QALY gained. Universal transvaginal screening for vasa praevia in singleton pregnancies costs $579,164 per QALY compared with targeted screening. Compared with current practice, screening all twin pregnancies for vasa praevia with transvaginal ultrasound is cost-effective. Among the alternatives considered, the use of colour Doppler at all transabdominal ultrasound examinations of singleton pregnancies, and targeted use of transvaginal ultrasound for IVF pregnancies or when the placenta has been found to be associated with one or more risk factors, is cost-effective. Universal screening of singleton pregnancies is not cost-effective compared with targeted screening.
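    The ICERs quoted above are incremental cost-effectiveness ratios: the extra cost of one strategy over a comparator divided by the extra QALYs it yields. A minimal sketch with made-up cost and QALY figures:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio in dollars per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical: a screening strategy costs $500 more per pregnancy on
# average and yields 0.09 additional QALYs relative to current practice.
print(round(icer(1500.0, 0.59, 1000.0, 0.50)))  # roughly 5556 dollars/QALY
```

    A strategy is "dominated" when a comparator is both cheaper and more effective, in which case the ratio is not meaningful and the strategy is excluded.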

  1. ALMA-SZ Detection of a Galaxy Cluster Merger Shock at Half the Age of the Universe

    NASA Astrophysics Data System (ADS)

    Basu, K.; Sommer, M.; Erler, J.; Eckert, D.; Vazza, F.; Magnelli, B.; Bertoldi, F.; Tozzi, P.

    2016-10-01

    We present ALMA measurements of a merger shock using the thermal Sunyaev-Zel'dovich (SZ) effect signal, at the location of a radio relic in the famous El Gordo galaxy cluster at z ≈ 0.9. Multi-wavelength analysis in combination with the archival Chandra data and a high-resolution radio image provides a consistent picture of the thermal and non-thermal signal variation across the shock front and helps to put robust constraints on the shock Mach number as well as the relic magnetic field. We employ a Bayesian analysis technique for modeling the SZ and X-ray data self-consistently, illustrating respective parameter degeneracies. Combined results indicate a shock with Mach number M = 2.4 (+1.3, -0.6), which in turn suggests a high value of the magnetic field (of the order of 4-10 μG) to account for the observed relic width at 2 GHz. At roughly half the current age of the universe, this is the highest-redshift direct detection of a cluster shock to date, and one of the first instances of an ALMA-SZ observation in a galaxy cluster. It shows the tremendous potential for future ALMA-SZ observations to detect merger shocks and other cluster substructures out to the highest redshifts.

  2. Model reference tracking control of an aircraft: a robust adaptive approach

    NASA Astrophysics Data System (ADS)

    Tanyer, Ilker; Tatlicioglu, Enver; Zergeroglu, Erkan

    2017-05-01

    This work presents the design and the corresponding analysis of a nonlinear robust adaptive controller for model reference tracking of an aircraft that has parametric uncertainties in its system matrices and additive state- and/or time-dependent nonlinear disturbance-like terms in its dynamics. Specifically, robust integral of the sign of the error feedback term and an adaptive term is fused with a proportional integral controller. Lyapunov-based stability analysis techniques are utilised to prove global asymptotic convergence of the output tracking error. Extensive numerical simulations are presented to illustrate the performance of the proposed robust adaptive controller.

  3. Robust-mode analysis of hydrodynamic flows

    NASA Astrophysics Data System (ADS)

    Roy, Sukesh; Gord, James R.; Hua, Jia-Chen; Gunaratne, Gemunu H.

    2017-04-01

    The emergence of techniques to extract high-frequency high-resolution data introduces a new avenue for modal decomposition to assess the underlying dynamics, especially of complex flows. However, this task requires the differentiation of robust, repeatable flow constituents from noise and other irregular features of a flow. Traditional approaches involving low-pass filtering and principal component analysis have shortcomings. The approach outlined here, referred to as robust-mode analysis, is based on Koopman decomposition. Three applications to (a) a counter-rotating cellular flame state, (b) variations in financial markets, and (c) turbulent injector flows are provided.
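    Koopman-based modal decomposition of snapshot data is commonly approximated in practice by dynamic mode decomposition (DMD). The sketch below is a generic exact-DMD implementation, not the authors' robust-mode algorithm:

```python
import numpy as np

def dmd_modes(snapshots):
    """Minimal exact DMD: estimate Koopman eigenvalues and modes from
    flow snapshots stacked as the columns of `snapshots`."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    r = int((s > 1e-10 * s[0]).sum())      # truncate to numerical rank
    U, s, Vh = U[:, :r], s[:r], Vh[:r]
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)    # discrete-time mode eigenvalues
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes

# Sanity check on a known linear map x[k+1] = A x[k]:
A = np.array([[0.9, 0.2], [0.0, 0.5]])
snaps = np.column_stack([np.linalg.matrix_power(A, k) @ np.array([1.0, 1.0])
                         for k in range(12)])
eigvals, _ = dmd_modes(snaps)
# The recovered eigenvalues approximate the true ones, 0.9 and 0.5.
```

    Robust-mode analysis adds a further step of retaining only modes that recur consistently across repeated data segments, separating them from noise-driven modes.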

  4. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16-month period.

  5. RepExplore: addressing technical replicate variance in proteomics and metabolomics data analysis.

    PubMed

    Glaab, Enrico; Schneider, Reinhard

    2015-07-01

    High-throughput omics datasets often contain technical replicates included to account for technical sources of noise in the measurement process. Although summarizing these replicate measurements by using robust averages may help to reduce the influence of noise on downstream data analysis, the information on the variance across the replicate measurements is lost in the averaging process and is therefore typically disregarded in subsequent statistical analyses. We introduce RepExplore, a web-service dedicated to exploiting the information captured in the technical replicate variance to provide more reliable and informative differential expression and abundance statistics for omics datasets. The software builds on previously published statistical methods, which have been applied successfully to biomedical omics data but are difficult to use without prior experience in programming or scripting. RepExplore facilitates the analysis by providing fully automated data processing and interactive ranking tables, whisker plots, heat maps and principal component analysis visualizations to interpret omics data and derived statistics. Freely available at http://www.repexplore.tk enrico.glaab@uni.lu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  6. Transportation Infrastructure Robustness : Joint Engineering and Economic Analysis

    DOT National Transportation Integrated Search

    2017-11-01

    The objectives of this study are to develop a methodology for assessing the robustness of transportation infrastructure facilities and to assess the effect of damage to such facilities on travel demand and the welfare of the facilities' users. The robustness...

  7. Implementing Data Definition Consistency for Emergency Department Operations Benchmarking and Research.

    PubMed

    Yiadom, Maame Yaa A B; Scheulen, James; McWade, Conor M; Augustine, James J

    2016-07-01

    The objective was to obtain a commitment to adopt a common set of definitions for emergency department (ED) demographic, clinical process, and performance metrics among the ED Benchmarking Alliance (EDBA), the ED Operations Study Group (EDOSG), and the Academy of Academic Administrators of Emergency Medicine (AAAEM) by 2017. A retrospective cross-sectional analysis of available data from three ED operations benchmarking organizations supported a negotiation to use a set of common metrics with identical definitions. During a 1.5-day meeting, structured according to social change theories of information exchange, self-interest, and interdependence, common definitions were identified and negotiated using the EDBA's published definitions as a starting point for discussion. Methods of process analysis theory were used in the 8 weeks following the meeting to achieve official consensus on definitions. The resulting lists were submitted to the organizations' leadership for implementation approval. A total of 374 unique measures were identified, of which 57 (15%) were shared by at least two organizations. Fourteen (4%) were common to all three organizations. In addition to agreement on definitions for the 14 measures used by all three organizations, agreement was reached on universal definitions for 17 of the 57 measures shared by at least two organizations. The negotiation outcome was a list of 31 measures with universal definitions to be adopted by each organization by 2017. The use of negotiation, social change, and process analysis theories achieved the adoption of universal definitions among the EDBA, EDOSG, and AAAEM. This will impact performance benchmarking for nearly half of US EDs. It initiates a formal commitment to utilize standardized metrics, and it transitions consistency in reporting ED operations metrics from consensus to implementation. This work advances our ability to more accurately characterize variation in ED care delivery models, resource utilization, and performance.
In addition, it permits future aggregation of these three data sets, thus facilitating the creation of more robust ED operations research data sets unified by a universal language. Negotiation, social change, and process analysis principles can be used to advance the adoption of additional definitions. © 2016 by the Society for Academic Emergency Medicine.

  8. Enabling Rapid and Robust Structural Analysis During Conceptual Design

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.; Padula, Sharon L.; Li, Wu

    2015-01-01

    This paper describes a multi-year effort to add a structural analysis subprocess to a supersonic aircraft conceptual design process. The desired capabilities include parametric geometry, automatic finite element mesh generation, static and aeroelastic analysis, and structural sizing. The paper discusses implementation details of the new subprocess, captures lessons learned, and suggests future improvements. The subprocess quickly compares concepts and robustly handles large changes in wing or fuselage geometry. The subprocess can rank concepts with regard to their structural feasibility and can identify promising regions of the design space. The automated structural analysis subprocess is deemed robust and rapid enough to be included in multidisciplinary conceptual design and optimization studies.

  9. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
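    The core idea can be illustrated with the Hamming (7,4) code: a 7-bit source block is treated as an error pattern and compressed to its 3-bit syndrome, which decompresses without loss whenever the source emits only low-weight patterns. A minimal sketch of this idea, not the paper's construction:

```python
import itertools
import numpy as np

# Parity-check matrix of the Hamming (7,4) code: column i is the binary
# representation of i+1.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def compress(block):
    """Treat a 7-bit source block as an 'error pattern'; its 3-bit
    syndrome is the compressed representation."""
    return tuple((H @ np.asarray(block)) % 2)

# Decompression maps each syndrome to its coset leader (the minimum-weight
# pattern with that syndrome): lossless whenever the source emits only
# patterns of Hamming weight <= 1.
coset_leader = {}
for weight in (0, 1):
    for ones in itertools.combinations(range(7), weight):
        pattern = [1 if i in ones else 0 for i in range(7)]
        coset_leader.setdefault(compress(pattern), pattern)

def decompress(syndrome):
    return coset_leader[tuple(syndrome)]

# Round trip: a sparse 7-bit block is recovered from only 3 bits.
block = [0, 0, 0, 0, 1, 0, 0]
assert decompress(compress(block)) == block
```

    Rare higher-weight blocks decompress to the wrong coset leader, which is why the achievable compression rate is tied to the source entropy.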

  10. Human Fear Chemosignaling: Evidence from a Meta-Analysis.

    PubMed

    de Groot, Jasper H B; Smeets, Monique A M

    2017-10-01

    Alarm pheromones are widely used in the animal kingdom. Notably, there are 26 published studies (N = 1652) highlighting a human capacity to communicate fear, stress, and anxiety via body odor from one person (66% males) to another (69% females). The question is whether the findings of this literature reflect a true effect, and what the average effect size is. These questions were answered by combining traditional meta-analysis with novel meta-analytical tools, p-curve analysis and p-uniform-techniques that could indicate whether findings are likely to reflect a true effect based on the distribution of P-values. A traditional random-effects meta-analysis yielded a small-to-moderate effect size (Hedges' g: 0.36, 95% CI: 0.31-0.41), p-curve analysis showed evidence diagnostic of a true effect (ps < 0.0001), and there was no evidence for publication bias. This meta-analysis did not assess the internal validity of the current studies; yet, the combined results illustrate the statistical robustness of a field in human olfaction dealing with the human capacity to communicate certain emotions (fear, stress, anxiety) via body odor. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
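    A random-effects meta-analysis of the kind reported here is commonly computed with the DerSimonian-Laird estimator. The sketch below uses hypothetical per-study Hedges' g values and variances, not the 26 studies analyzed:

```python
import math

def random_effects_meta(effects, variances, z=1.96):
    """DerSimonian-Laird random-effects pooling of per-study effect sizes
    (e.g. Hedges' g) with their sampling variances."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - z * se, pooled + z * se)

# Hypothetical per-study Hedges' g values and variances:
g, ci = random_effects_meta([0.30, 0.45, 0.25, 0.50],
                            [0.02, 0.03, 0.025, 0.04])
```

    The p-curve and p-uniform techniques mentioned in the abstract go further, diagnosing whether the distribution of significant p-values is consistent with a true underlying effect rather than selective reporting.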

  11. Micro-Structured Sapphire Fiber Sensors for Simultaneous Measurements of High-T and Dynamic Gas Pressure in Harsh Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, Hai; Tsai, Hai-Lung; Dong, Junhang

    2014-09-30

    This is the final report for the program "Micro-Structured Sapphire Fiber Sensors for Simultaneous Measurements of High Temperature and Dynamic Gas Pressure in Harsh Environments", funded by NETL and performed by Missouri University of Science and Technology, Clemson University and the University of Cincinnati from October 1, 2009 to September 30, 2014. Securing a sustainable energy economy by developing affordable and clean energy from coal and other fossil fuels is a central element of the mission of the U.S. Department of Energy's (DOE) National Energy Technology Laboratory (NETL). To further this mission, NETL funds research and development of novel sensor technologies that can function under the extreme operating conditions often found in advanced power systems. The main objective of this research program is to conduct fundamental and applied research that will lead to successful development and demonstration of robust, multiplexed, microstructured silica and single-crystal sapphire fiber sensors to be deployed into the hot zones of advanced power and fuel systems for simultaneous measurements of high temperature and gas pressure. The specific objectives of this research program include: 1) design, fabrication and demonstration of multiplexed, robust silica and sapphire fiber temperature and dynamic gas pressure sensors that can survive and remain fully operational in high-temperature harsh environments; 2) development and demonstration of a novel method to demodulate the multiplexed interferograms for simultaneous measurements of temperature and gas pressure in harsh environments; and 3) development and demonstration of novel sapphire fiber cladding and low numerical aperture (NA) excitation techniques to assure high signal integrity and sensor robustness.

  12. Thermochromic Artificial Nacre Based on Montmorillonite.

    PubMed

    Peng, Jingsong; Cheng, Yiren; Tomsia, Antoni P; Jiang, Lei; Cheng, Qunfeng

    2017-07-26

    Nacre-inspired nanocomposites have attracted a great deal of attention in recent years because of their special mechanical properties and universality of the underlying principles of materials engineering. The ability to respond to external stimuli will augment the high toughness and high strength of artificial nacre-like composites and open new technological horizons for these materials. Herein, we fabricated robust artificial nacre based on montmorillonite (MMT) that combines robustness with reversible thermochromism. Our artificial nacre shows great potential in various fields such as aerospace and sensors and opens an avenue to fabricate artificial nacre responsive to other external stimuli in the future.

  13. Robust one-step catalytic machine for high fidelity anticloning and W-state generation in a multiqubit system.

    PubMed

    Olaya-Castro, Alexandra; Johnson, Neil F; Quiroga, Luis

    2005-03-25

    We propose a physically realizable machine which can either generate multiparticle W-like states, or implement high-fidelity 1→M (M = 1, 2, ..., ∞) anticloning of an arbitrary qubit state, in a single step. This universal machine acts as a catalyst in that it is unchanged after either procedure, effectively resetting itself for its next operation. It possesses an inherent immunity to decoherence. Most importantly in terms of practical multiparty quantum communication, the machine's robustness in the presence of decoherence actually increases as the number of qubits M increases.

  14. Anytime synthetic projection: Maximizing the probability of goal satisfaction

    NASA Technical Reports Server (NTRS)

    Drummond, Mark; Bresina, John L.

    1990-01-01

    A projection algorithm is presented for incremental control rule synthesis. The algorithm synthesizes an initial set of goal achieving control rules using a combination of situation probability and estimated remaining work as a search heuristic. This set of control rules has a certain probability of satisfying the given goal. The probability is incrementally increased by synthesizing additional control rules to handle 'error' situations the execution system is likely to encounter when following the initial control rules. By using situation probabilities, the algorithm achieves a computationally effective balance between the limited robustness of triangle tables and the absolute robustness of universal plans.

  15. Syndrome source coding and its universal generalization

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1975-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A universal generalization of syndrome-source-coding is formulated which provides robustly effective, distortionless coding of source ensembles.

  16. Universal Quantum Noise in Adiabatic Pumping

    NASA Astrophysics Data System (ADS)

    Herasymenko, Yaroslav; Snizhko, Kyrylo; Gefen, Yuval

    2018-06-01

    We consider charge pumping in a system of parafermions, implemented at fractional quantum Hall edges. Our pumping protocol leads to a noisy behavior of the pumped current. As the adiabatic limit is approached, not only does the noisy behavior persist but the counting statistics of the pumped current becomes robust and universal. In particular, the resulting Fano factor is given in terms of the system's topological degeneracy and the pumped quasiparticle charge. Our results are also applicable to the more conventional Majorana fermions.
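    The Fano factor mentioned above is the variance-to-mean ratio of the transferred-charge counting statistics. A minimal numerical sketch with a made-up counting record:

```python
def fano_factor(counts):
    """Fano factor F = Var(n) / E[n] of a record of per-cycle counts.
    F = 1 for Poisson statistics; deviations signal sub- or
    super-Poissonian noise."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean

# Illustrative, made-up counting record:
print(fano_factor([3, 5, 4, 4, 3, 5, 4, 4]))  # prints 0.125
```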

  17. Analysis and improvements of Adaptive Particle Refinement (APR) through CPU time, accuracy and robustness considerations

    NASA Astrophysics Data System (ADS)

    Chiron, L.; Oger, G.; de Leffe, M.; Le Touzé, D.

    2018-02-01

    While smoothed-particle hydrodynamics (SPH) simulations are usually performed using uniform particle distributions, local particle refinement techniques have been developed to concentrate fine spatial resolutions in identified areas of interest. Although the formalism of this method is relatively easy to implement, its robustness at coarse/fine interfaces can be problematic. Analysis performed in [16] shows that the radius of refined particles should be greater than half the radius of unrefined particles to ensure robustness. In this article, the basics of an Adaptive Particle Refinement (APR) technique, inspired by AMR in mesh-based methods, are presented. This approach ensures robustness with alleviated constraints. Simulations applying the new formalism proposed achieve accuracy comparable to fully refined spatial resolutions, together with robustness, low CPU times and maintained parallel efficiency.

  18. Hyperspectral Image Classification via Multitask Joint Sparse Representation and Stepwise MRF Optimization.

    PubMed

    Yuan, Yuan; Lin, Jianzhe; Wang, Qi

    2016-12-01

    Hyperspectral image (HSI) classification is a crucial issue in remote sensing. Accurate classification benefits a large number of applications such as land use analysis and marine resource utilization. But high data correlation brings difficulty to reliable classification, especially for HSI with abundant spectral information. Furthermore, the traditional methods often fail to adequately consider the spatial coherency of HSI, which also limits the classification performance. To address these inherent obstacles, a novel spectral-spatial classification scheme is proposed in this paper. The proposed method mainly focuses on multitask joint sparse representation (MJSR) and a stepwise Markov random field (MRF) framework, which are claimed to be the two main contributions of this procedure. First, the MJSR not only reduces the spectral redundancy, but also retains necessary correlation in the spectral field during classification. Second, the stepwise optimization further explores the spatial correlation, which significantly enhances the classification accuracy and robustness. In terms of several universal quality evaluation indexes, the experimental results on the Indian Pines and Pavia University datasets demonstrate the superiority of our method compared with state-of-the-art competitors.

  19. HALOE Algorithm Improvements for Upper Tropospheric Sounding

    NASA Technical Reports Server (NTRS)

    McHugh, Martin J.; Gordley, Larry L.; Russell, James M., III; Hervig, Mark E.

    1999-01-01

    This report details the ongoing efforts by GATS, Inc., in conjunction with Hampton University and University of Wyoming, in NASA's Mission to Planet Earth UARS Science Investigator Program entitled "HALOE Algorithm Improvements for Upper Tropospheric Soundings." The goal of this effort is to develop and implement major inversion and processing improvements that will extend HALOE measurements further into the troposphere. In particular, O3, H2O, and CH4 retrievals may be extended into the middle troposphere, and NO, HCl and possibly HF into the upper troposphere. Key areas of research being carried out to accomplish this include: pointing/tracking analysis; cloud identification and modeling; simultaneous multichannel retrieval capability; forward model improvements; high vertical-resolution gas filter channel retrievals; a refined temperature retrieval; robust error analyses; long-term trend reliability studies; and data validation. The current (first-year) effort concentrates on the pointer/tracker correction algorithms, cloud filtering and validation, and multi-channel retrieval development. However, these areas are all highly coupled, so progress in one area benefits from and sometimes depends on work in others.

  20. HALOE Algorithm Improvements for Upper Tropospheric Sounding

    NASA Technical Reports Server (NTRS)

    Thompson, Robert Earl; McHugh, Martin J.; Gordley, Larry L.; Hervig, Mark E.; Russell, James M., III; Douglass, Anne (Technical Monitor)

    2001-01-01

    This report details the ongoing efforts by GATS, Inc., in conjunction with Hampton University and University of Wyoming, in NASA's Mission to Planet Earth Upper Atmospheric Research Satellite (UARS) Science Investigator Program entitled 'HALOE Algorithm Improvements for Upper Tropospheric Sounding.' The goal of this effort is to develop and implement major inversion and processing improvements that will extend Halogen Occultation Experiment (HALOE) measurements further into the troposphere. In particular, O3, H2O, and CH4 retrievals may be extended into the middle troposphere, and NO, HCl and possibly HF into the upper troposphere. Key areas of research being carried out to accomplish this include: pointing/tracking analysis; cloud identification and modeling; simultaneous multichannel retrieval capability; forward model improvements; high vertical-resolution gas filter channel retrievals; a refined temperature retrieval; robust error analyses; long-term trend reliability studies; and data validation. The current (first year) effort concentrates on the pointer/tracker correction algorithms, cloud filtering and validation, and multichannel retrieval development. However, these areas are all highly coupled, so progress in one area benefits from and sometimes depends on work in others.

  1. Robustness analysis of multirate and periodically time varying systems

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1991-01-01

    A new method for analyzing the stability and robustness of multirate and periodically time varying systems is presented. It is shown that a multirate or periodically time varying system can be transformed into an equivalent time invariant system. For a SISO system, traditional gain and phase margins can be found by direct application of the Nyquist criterion to this equivalent time invariant system. For a MIMO system, structured and unstructured singular values can be used to determine the system's robustness. The limitations and implications of utilizing this equivalent time invariant system for calculating gain and phase margins, and for estimating robustness via singular value analysis are discussed.
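    The time-invariant reformulation described above can be sketched for a discrete-time periodically time-varying system: propagating the state over one full period yields a monodromy (lifted) matrix whose spectral radius decides stability. A minimal sketch of that standard construction, not the paper's full margin analysis:

```python
import numpy as np

def lifted_system(A_seq):
    """Lift a linear periodically time-varying system x[k+1] = A_k x[k]
    (period N) to an equivalent time-invariant one: the monodromy matrix
    propagates the state across one full period."""
    Phi = np.eye(A_seq[0].shape[0])
    for A in A_seq:                 # A_0, A_1, ..., A_{N-1}
        Phi = A @ Phi
    return Phi

def is_stable(A_seq):
    """The periodic system is asymptotically stable iff the monodromy
    matrix has spectral radius < 1."""
    return max(abs(np.linalg.eigvals(lifted_system(A_seq)))) < 1.0

# A scalar rate-switching example: the per-step gain alternates.
assert is_stable([np.array([[1.5]]), np.array([[0.5]])])      # 0.75 < 1
assert not is_stable([np.array([[1.5]]), np.array([[0.8]])])  # 1.2 > 1
```

    Gain and phase margins, or singular values for the MIMO case, are then evaluated on this lifted time-invariant model exactly as for any ordinary discrete-time system.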

  2. Gaining Cyber Dominance

    DTIC Science & Technology

    2015-01-01

    Robust team exercise and simulation • Air-gapped; isolation from production networks • “Train as you fight” scenarios • Advanced user and Internet...Security Onion • SIFT (Linux/Windows) • Kali • Rucksack • Docker • VTS 18 GCD Overview January 2015 © 2014 Carnegie Mellon University TEXN Architecture

  3. Universality and robustness of revivals in the transverse field XY model

    NASA Astrophysics Data System (ADS)

    Häppölä, Juho; Halász, Gábor B.; Hamma, Alioscia

    2012-03-01

    We study the structure of the revivals in an integrable quantum many-body system, the transverse field XY spin chain, after a quantum quench. The time evolutions of the Loschmidt echo, the magnetization, and the single-spin entanglement entropy are calculated. We find that the revival times for all of these observables are given by integer multiples of Trev≃L/vmax, where L is the linear size of the system and vmax is the maximal group velocity of quasiparticles. This revival structure is universal in the sense that it does not depend on the initial state and the size of the quench. Applying nonintegrable perturbations to the XY model, we observe that the revivals are robust against such perturbations: they are still visible at time scales much larger than the quasiparticle lifetime. We therefore propose a generic connection between the revival structure and the locality of the dynamics, where the quasiparticle speed vmax generalizes into the Lieb-Robinson speed vLR.
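As a worked illustration of the revival-time estimate Trev ≃ L/vmax, the maximal quasiparticle group velocity can be computed numerically from the XY-chain dispersion. The sketch below assumes the common convention ε_k = sqrt((h − cos k)² + (γ sin k)²) in units where the coupling J = 1; prefactor conventions differ between references, so treat the numbers as illustrative:

```python
import math

def dispersion(k, h, gamma):
    # Quasiparticle energy of the transverse-field XY chain (J = 1 units;
    # prefactor conventions vary between references).
    return math.sqrt((h - math.cos(k)) ** 2 + (gamma * math.sin(k)) ** 2)

def v_max(h, gamma, n=20000):
    """Maximal group velocity |d eps / dk| estimated by central differences."""
    dk = math.pi / n
    best = 0.0
    for i in range(1, n):
        k = i * dk
        v = abs(dispersion(k + dk, h, gamma) - dispersion(k - dk, h, gamma)) / (2 * dk)
        best = max(best, v)
    return best

L = 100                      # chain length
h, gamma = 1.5, 1.0          # field and anisotropy (transverse Ising line)
Trev = L / v_max(h, gamma)   # revival time scale T_rev ~ L / v_max
print(round(Trev, 2))
```

For the transverse-field Ising line (γ = 1) in these units, vmax = min(h, 1), so the paramagnetic example above gives Trev ≈ L.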

  4. Neural-network-designed pulse sequences for robust control of singlet-triplet qubits

    NASA Astrophysics Data System (ADS)

    Yang, Xu-Chen; Yung, Man-Hong; Wang, Xin

    2018-04-01

Composite pulses are essential for universal manipulation of singlet-triplet spin qubits. In the absence of noise, they are required to perform arbitrary single-qubit operations due to the special control constraint of a singlet-triplet qubit, while in a noisy environment, more complicated sequences have been developed to dynamically correct the error. Tailoring these sequences typically requires numerically solving a set of nonlinear equations. Here we demonstrate that these pulse sequences can be generated by a well-trained, double-layer neural network. For sequences designed for the noise-free case, the trained neural network is capable of producing almost exactly the same pulses known in the literature. For more complicated noise-correcting sequences, the neural network produces pulses with slightly different line shapes, but the robustness against noise remains comparable. These results indicate that the neural network can be a judicious and powerful alternative to existing techniques in developing pulse sequences for universal fault-tolerant quantum computation.

  5. Robust Optimization and Sensitivity Analysis with Multi-Objective Genetic Algorithms: Single- and Multi-Disciplinary Applications

    DTIC Science & Technology

    2007-01-01

multi-disciplinary optimization with uncertainty. Robust optimization and sensitivity analysis is usually used when an optimization model has...formulation is introduced in Section 2.3. We briefly discuss several definitions used in the sensitivity analysis in Section 2.4. Following in...2.5. 2.4 SENSITIVITY ANALYSIS In this section, we discuss several definitions used in Chapter 5 for Multi-Objective Sensitivity Analysis. Inner

  6. The "universal" behavior of the Breakthrough Curve in 3D aquifer transport and the validity of the First-Order solution

    NASA Astrophysics Data System (ADS)

    Jankovic, Igor; Maghrebi, Mahdi; Fiori, Aldo; Zarlenga, Antonio; Dagan, Gedeon

    2017-04-01

We examine the impact of permeability structures on the Breakthrough Curve (BTC) of solute, at a distance x from the injection plane, under mean uniform flow of mean velocity U. The study is carried out through accurate 3D numerical simulations, rather than the 2D models adopted in most previous works. All structures share the same univariate distribution of the logconductivity Y = lnK and autocorrelation function ρY, but differ in higher-order statistics. The main finding is that the BTC of ergodic plumes for the different examined structures is quite robust, displaying a seemingly "universal" behavior. This result is at variance with similar analyses carried out in the past for 2D permeability structures. The basic parameters (i.e. the geometric mean, the logconductivity variance σY2 and the horizontal integral scale I) have to be identified from field data (e.g. core analysis, pumping tests or other methods). However, prediction requires knowledge of U, and the results suggest that improvement of the BTC prediction in applications can be achieved by independent estimates of the mean velocity U, e.g. by pumping tests, rather than attempting to characterize the permeability structure beyond its second-order statistics. The BTC prediction made by the Inverse Gaussian (IG) distribution, adopting the macrodispersion coefficient estimated by the First Order approximation αL = σY2I, is also quite robust, providing a simple and effective solution to be employed in applications. The consequences of the latter result are further explored by modeling the mass distribution observed at the MADE-1 natural gradient experiment, for which we show that most of the plume features are adequately captured by the simple First Order approach.
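The Inverse Gaussian BTC is easy to evaluate. A minimal sketch under standard first-order assumptions: mean arrival time μ = x/U, travel-time variance 2αLx/U² with αL = σY²I, hence an IG shape parameter λ = μ³/variance. All field values below are hypothetical, chosen only for illustration:

```python
import math

def ig_pdf(t, mu, lam):
    """Inverse Gaussian density with mean mu and shape parameter lam."""
    if t <= 0.0:
        return 0.0
    return math.sqrt(lam / (2.0 * math.pi * t ** 3)) * \
        math.exp(-lam * (t - mu) ** 2 / (2.0 * mu ** 2 * t))

# Hypothetical field values (not from the paper)
x = 20.0              # distance from the injection plane [m]
U = 0.5               # mean velocity [m/d]
sigY2, I = 1.0, 2.0   # logconductivity variance, horizontal integral scale [m]

alphaL = sigY2 * I              # first-order macrodispersivity alpha_L = sigY2 * I
mu = x / U                      # mean arrival time
var = 2.0 * alphaL * x / U**2   # first-order travel-time variance
lam = mu**3 / var               # IG shape parameter

# Sanity check: the BTC integrates to ~1 and has mean arrival time ~x/U.
dt = 0.05
mass = mean = 0.0
for i in range(1, 200000):
    t = i * dt
    p = ig_pdf(t, mu, lam) * dt
    mass += p
    mean += t * p
print(round(mass, 3), round(mean, 1))
```

The appeal of this parameterization in applications is that it needs only U, σY² and I, exactly the quantities the abstract argues should be estimated independently from field data.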

  7. The cellular distribution of fluorescently labeled arrestins provides a robust, sensitive, and universal assay for screening G protein-coupled receptors.

    PubMed

    Oakley, Robert H; Hudson, Christine C; Cruickshank, Rachael D; Meyers, Diane M; Payne, Richard E; Rhem, Shay M; Loomis, Carson R

    2002-11-01

    G protein-coupled receptors (GPCRs) have proven to be a rich source of therapeutic targets; therefore, finding compounds that regulate these receptors is a critical goal in drug discovery. The Transfluor technology utilizes the redistribution of fluorescently labeled arrestins from the cytoplasm to agonist-occupied receptors at the plasma membrane to monitor quantitatively the activation or inactivation of GPCRs. Here, we show that the Transfluor technology can be quantitated on the INCell Analyzer system (INCAS) using the vasopressin V(2) receptor (V(2)R), which binds arrestin with high affinity, and the beta(2)-adrenergic receptor (beta(2)AR), which binds arrestin with low affinity. U2OS cells stably expressing an arrestin-green fluorescent protein conjugate and either the V(2)R or the beta(2)AR were plated in 96-well plastic plates and analyzed by the INCAS at a screening rate of 5 min per plate. Agonist dose-response and antagonist dose-inhibition curves revealed signal-to-background ratios of approximately 25:1 and 8:1 for the V(2)R and beta(2)AR, respectively. EC(50) values agreed closely with K(d) values reported in the literature for the different receptor agonists. In addition, small amounts of arrestin translocation induced by sub-EC(50) doses of agonist were distinguished from the background noise of untreated cells. Furthermore, differences in the magnitude of arrestin translocation distinguished partial agonists from full agonists, and Z' values for these ligands were >0.5. These data show that the Transfluor technology, combined with an automated image analysis system, provides a direct, robust, and universal assay for high throughput screening of known and orphan GPCRs.
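The Z' values cited above refer to the standard Zhang assay-quality statistic, Z' = 1 − 3(σ_pos + σ_neg)/|μ_pos − μ_neg|, where values above 0.5 indicate an excellent screening window. A small sketch with hypothetical plate-reader signals (not data from the paper):

```python
from statistics import mean, stdev

def z_prime(pos, neg):
    """Z'-factor assay-quality metric: 1 - 3*(sd_pos + sd_neg)/|mean_pos - mean_neg|."""
    return 1.0 - 3.0 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

# Hypothetical plate-reader signals (arbitrary fluorescence units)
agonist_wells = [250, 243, 258, 247, 252, 249]   # positive controls
untreated_wells = [10, 12, 9, 11, 10, 13]        # negative controls

zp = z_prime(agonist_wells, untreated_wells)
print(zp > 0.5)   # a Z' above 0.5 indicates a robust screening assay
```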

  8. Is cosmic acceleration proven by local cosmological probes?

    NASA Astrophysics Data System (ADS)

    Tutusaus, I.; Lamine, B.; Dupays, A.; Blanchard, A.

    2017-06-01

    Context. The cosmological concordance model (ΛCDM) matches the cosmological observations exceedingly well. This model has become the standard cosmological model with the evidence for an accelerated expansion provided by the type Ia supernovae (SNIa) Hubble diagram. However, the robustness of this evidence has been addressed recently with somewhat diverging conclusions. Aims: The purpose of this paper is to assess the robustness of the conclusion that the Universe is indeed accelerating if we rely only on low-redshift (z ≲ 2) observations, that is to say with SNIa, baryonic acoustic oscillations, measurements of the Hubble parameter at different redshifts, and measurements of the growth of matter perturbations. Methods: We used the standard statistical procedure of minimizing the χ2 function for the different probes to quantify the goodness of fit of a model for both ΛCDM and a simple nonaccelerated low-redshift power law model. In this analysis, we do not assume that supernovae intrinsic luminosity is independent of the redshift, which has been a fundamental assumption in most previous studies that cannot be tested. Results: We have found that, when SNIa intrinsic luminosity is not assumed to be redshift independent, a nonaccelerated low-redshift power law model is able to fit the low-redshift background data as well as, or even slightly better, than ΛCDM. When measurements of the growth of structures are added, a nonaccelerated low-redshift power law model still provides an excellent fit to the data for all the luminosity evolution models considered. Conclusions: Without the standard assumption that supernovae intrinsic luminosity is independent of the redshift, low-redshift probes are consistent with a nonaccelerated universe.

  9. RSRE: RNA structural robustness evaluator

    PubMed Central

    Shu, Wenjie; Zheng, Zhiqiang; Wang, Shengqi

    2007-01-01

    Biological robustness, defined as the ability to maintain stable functioning in the face of various perturbations, is an important and fundamental topic in current biology, and has become a focus of numerous studies in recent years. Although structural robustness has been explored in several types of RNA molecules, the origins of robustness are still controversial. Computational analysis results are needed to make up for the lack of evidence of robustness in natural biological systems. The RNA structural robustness evaluator (RSRE) web server presented here provides a freely available online tool to quantitatively evaluate the structural robustness of RNA based on the widely accepted definition of neutrality. Several classical structure comparison methods are employed; five randomization methods are implemented to generate control sequences; sub-optimal predicted structures can be optionally utilized to mitigate the uncertainty of secondary structure prediction. With a user-friendly interface, the web application is easy to use. Intuitive illustrations are provided along with the original computational results to facilitate analysis. The RSRE will be helpful in the wide exploration of RNA structural robustness and will catalyze our understanding of RNA evolution. The RSRE web server is freely available at http://biosrv1.bmi.ac.cn/RSRE/ or http://biotech.bmi.ac.cn/RSRE/. PMID:17567615

  10. Determinants of health-related lifestyles among university students.

    PubMed

    Aceijas, Carmen; Waldhäusl, Sabrina; Lambert, Nicky; Cassar, Simon; Bello-Corassa, Rafael

    2017-07-01

The aim of this study was to investigate students' health-related lifestyles and to identify barriers and social determinants of healthier lifestyles. Data were collected via an online survey, two focus groups and three in-depth interviews across 2014/2015. A random sample (n = 468) of university students, stratified by school size, answered a 67-item questionnaire comprising six scales: Rapid Assessment of Physical Activity, Rapid Eating and Activity Assessment for Patients-Short Version, CAGE, Fagerström Test for Nicotine Dependence, Warwick-Edinburgh Mental Wellbeing Scale short version, and an ad hoc scale for drug use/misuse. Stratified by gender, χ2 tests were run to test associations and estimate risks, and three multivariate logistic regression models were fitted. A thematic approach guided the analysis of qualitative data. A total of 60% of the respondents were insufficiently physically active, 47% had an unbalanced diet and 30% had low mental wellbeing. Alcohol drinkers versus abstinent were almost equally distributed. A total of 42% of alcohol drinkers reported getting drunk at least once a month. Smokers accounted for 16% of the respondents. Identified risk factors for suboptimal physical activity were as follows: being a woman, not using the university gym and smoking. Risk factors for unbalanced diet were low mental wellbeing and drug use. Poor mental wellbeing was predicted by unbalanced diet, not feeling like shopping and cooking frequently, and a lack of help-seeking behaviour in cases of distress. Qualitative analysis revealed seven thematic categories: transition to new life, university environment and systems, finances, academic pressure, health promotion on campus and recommendations. This study provides robust evidence that the health-related lifestyles of the student population are worrying and suggests that the trend in chronic diseases associated with unhealthy lifestyles sustained over years might be unlikely to change in future generations.
University students' health-related lifestyle is a concern. Nine out of the identified 10 predictors of problematic physical activity, nutrition and mental wellbeing, were environmental/societal or institutional barriers. Universities must expand corporate responsibilities to include the promotion of health as part of their core values.

  11. Association analysis using next-generation sequence data from publicly available control groups: the robust variance score statistic.

    PubMed

    Derkach, Andriy; Chiang, Theodore; Gong, Jiafen; Addis, Laura; Dobbins, Sara; Tomlinson, Ian; Houlston, Richard; Pal, Deb K; Strug, Lisa J

    2014-08-01

    Sufficiently powered case-control studies with next-generation sequence (NGS) data remain prohibitively expensive for many investigators. If feasible, a more efficient strategy would be to include publicly available sequenced controls. However, these studies can be confounded by differences in sequencing platform; alignment, single nucleotide polymorphism and variant calling algorithms; read depth; and selection thresholds. Assuming one can match cases and controls on the basis of ethnicity and other potential confounding factors, and one has access to the aligned reads in both groups, we investigate the effect of systematic differences in read depth and selection threshold when comparing allele frequencies between cases and controls. We propose a novel likelihood-based method, the robust variance score (RVS), that substitutes genotype calls by their expected values given observed sequence data. We show theoretically that the RVS eliminates read depth bias in the estimation of minor allele frequency. We also demonstrate that, using simulated and real NGS data, the RVS method controls Type I error and has comparable power to the 'gold standard' analysis with the true underlying genotypes for both common and rare variants. An RVS R script and instructions can be found at strug.research.sickkids.ca, and at https://github.com/strug-lab/RVS. lisa.strug@utoronto.ca Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
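The core idea of substituting hard genotype calls with their expected values can be illustrated in a few lines. This is a simplified sketch of the principle, not the published RVS implementation; all genotype probabilities below are hypothetical:

```python
import statistics

def expected_genotype(p0, p1, p2):
    """E[G | sequence data] from posterior genotype probabilities."""
    return 0.0 * p0 + 1.0 * p1 + 2.0 * p2

# Hypothetical posteriors (P(G=0), P(G=1), P(G=2)) at one variant site:
# cases sequenced at high depth (confident calls), controls at low depth
# (uncertain calls).
cases = [(0.98, 0.02, 0.00), (0.05, 0.90, 0.05),
         (0.90, 0.10, 0.00), (0.02, 0.96, 0.02)]
controls = [(0.70, 0.25, 0.05), (0.60, 0.35, 0.05),
            (0.80, 0.15, 0.05), (0.75, 0.20, 0.05)]

eg_case = [expected_genotype(*p) for p in cases]
eg_ctrl = [expected_genotype(*p) for p in controls]

# Score-type statistic with group-specific variance estimates, so that
# depth-driven differences in call uncertainty are not mistaken for
# association.
diff = statistics.mean(eg_case) - statistics.mean(eg_ctrl)
se = (statistics.variance(eg_case) / len(eg_case) +
      statistics.variance(eg_ctrl) / len(eg_ctrl)) ** 0.5
z = diff / se
print(round(z, 2))
```

Averaging over the genotype posterior, rather than thresholding it into a hard call, is what removes the systematic read-depth bias in the allele-frequency estimate.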

  12. Employment of telemedicine in emergency medicine. Clinical requirement analysis, system development and first test results.

    PubMed

    Czaplik, M; Bergrath, S; Rossaint, R; Thelen, S; Brodziak, T; Valentin, B; Hirsch, F; Beckers, S K; Brokmann, J C

    2014-01-01

Demographic change, rising co-morbidity and an increasing number of emergencies are the main challenges that emergency medical services (EMS) in several countries worldwide are facing. In order to improve quality in EMS, highly trained personnel and well-equipped ambulances are essential. However, several studies have shown a shortage of qualified EMS physicians. Telemedicine emerges as a complementary system in EMS that may provide expertise and improve the quality of medical treatment on the scene. Hence, our aim is to develop and test a specific teleconsultation system. During the development process several use cases were defined and technically specified by medical experts and engineers in the areas of: system administration, start-up of EMS assistance systems, audio communication, data transfer, routine tele-EMS physician activities and research capabilities. Upon completion, technical field tests were performed under realistic conditions to test system properties such as robustness, feasibility and usability, providing end-to-end measurements. Six ambulances were equipped with telemedical facilities based on the results of the requirement analysis and 55 scenarios were tested under realistic conditions in one month. The results indicate that the developed system performed well in terms of usability and robustness. The major challenges were, as expected, mobile communication and data network availability. Third-generation networks were only available in 76.4% of the cases. Although 3G (third generation), such as the Universal Mobile Telecommunications System (UMTS), provides beneficial conditions for higher bandwidth, system performance for most features was also acceptable under adequate 2G (second generation) test conditions. An innovative concept for the use of telemedicine for medical consultations in EMS was developed. Organisational and technical aspects were considered and practical requirements specified.
Since technical feasibility was demonstrated in these technical field tests, the next step would be to prove medical usefulness and technical robustness under real conditions in a clinical trial.

  13. Robust control of accelerators

    NASA Astrophysics Data System (ADS)

Johnson, W. Joel D.; Abdallah, Chaouki T.

    1991-07-01

The problem of controlling the variations in the rf power system can be effectively cast as an application of modern control theory. Two components of this theory are obtaining a model and a feedback structure. The model inaccuracies influence the choice of a particular controller structure. Because of the modelling uncertainty, one has to design either a variable, adaptive controller or a fixed, robust controller to achieve the desired objective. The adaptive control scheme usually results in very complex hardware and, therefore, will not be pursued in this research. In contrast, the robust control method leads to simpler hardware. However, robust control requires a more accurate mathematical model of the physical process than is required by adaptive control. Our research at the Los Alamos National Laboratory (LANL) and the University of New Mexico (UNM) has led to the development and implementation of a new robust rf power feedback system. In this article, we report on our research progress. In section 1, the robust control problem for the rf power system and the philosophy adopted for the beginning phase of our research are presented. In section 2, the results of our proof-of-principle experiments are presented. In section 3, we describe the actual controller configuration that is used in LANL FEL physics experiments. The novelty of our approach is that the control hardware is implemented directly in rf, without demodulating, compensating, and then remodulating.

  14. Development of a Comprehensive Digital Avionics Curriculum for the Aeronautical Engineer

    DTIC Science & Technology

    2006-03-01

able to analyze and design aircraft and missile guidance and control systems, including feedback stabilization schemes and stochastic processes, using ...Uncertainty modeling for robust control; Robust closed-loop stability and performance; Robust H-infinity control; Robustness check using mu-analysis...Controlled feedback (reduces noise) 3. Statistical group response (reduce pressure toward conformity) When used as a tool to study a complex problem

  15. The presence of a phantom field in a Randall–Sundrum scenario

    NASA Astrophysics Data System (ADS)

    Acuña-Cárdenas, Rubén O.; Astorga-Moreno, J. A.; García-Aspeitia, Miguel A.; López-Domínguez, J. C.

    2018-02-01

The presence of phantom dark energy in brane world cosmology generates important new effects, causing a premature big rip singularity as the contribution of the extra dimensions increases, and competing considerably with the other components of our Universe. This article first considers only a field with equation-of-state parameter ω < −1, and then the explicit form of a scalar field with a potential possessing a maximum (with the aim of avoiding a big rip singularity). In both cases we study the dynamics robustly through dynamical systems theory, examining in detail quantities such as the deceleration parameter q and the vector field associated with the dynamical system. Results are discussed with the purpose of treating the cosmology with a phantom field as dark energy in a Randall–Sundrum scenario.

  16. Reaction Decoder Tool (RDT): extracting features from chemical reactions.

    PubMed

    Rahman, Syed Asad; Torrance, Gilliean; Baldacci, Lorenzo; Martínez Cuesta, Sergio; Fenninger, Franz; Gopal, Nimish; Choudhary, Saket; May, John W; Holliday, Gemma L; Steinbeck, Christoph; Thornton, Janet M

    2016-07-01

    Extracting chemical features like Atom-Atom Mapping (AAM), Bond Changes (BCs) and Reaction Centres from biochemical reactions helps us understand the chemical composition of enzymatic reactions. Reaction Decoder is a robust command line tool, which performs this task with high accuracy. It supports standard chemical input/output exchange formats i.e. RXN/SMILES, computes AAM, highlights BCs and creates images of the mapped reaction. This aids in the analysis of metabolic pathways and the ability to perform comparative studies of chemical reactions based on these features. This software is implemented in Java, supported on Windows, Linux and Mac OSX, and freely available at https://github.com/asad/ReactionDecoder : asad@ebi.ac.uk or s9asad@gmail.com. © The Author 2016. Published by Oxford University Press.

  17. On the Development of a Hospital-Patient Web-Based Communication Tool: A Case Study From Norway.

    PubMed

    Granja, Conceição; Dyb, Kari; Bolle, Stein Roald; Hartvigsen, Gunnar

    2015-01-01

    Surgery cancellations are undesirable in hospital settings as they increase costs, reduce productivity and efficiency, and directly affect the patient. The problem of elective surgery cancellations in a North Norwegian University Hospital is addressed. Based on a three-step methodology conducted at the hospital, the preoperative planning process was modeled taking into consideration the narratives from different health professions. From the analysis of the generated process models, it is concluded that in order to develop a useful patient centered web-based communication tool, it is necessary to fully understand how hospitals plan and organize surgeries today. Moreover, process reengineering is required to generate a standard process that can serve as a tool for health ICT designers to define the requirements for a robust and useful system.

  18. Wavelet Filtering to Reduce Conservatism in Aeroservoelastic Robust Stability Margins

    NASA Technical Reports Server (NTRS)

    Brenner, Marty; Lind, Rick

    1998-01-01

Wavelet analysis for filtering and system identification was used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins was reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data was used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability were also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrated improved robust stability prediction by extension of the stability boundary beyond the flight regime.

  19. Modelling uncertainties and possible future trends of precipitation and temperature for 10 sub-basins in Columbia River Basin (CRB)

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, A.; Rana, A.; Qin, Y.; Moradkhani, H.

    2014-12-01

Trends and changes in future climatic parameters such as precipitation and temperature have been a central part of climate change studies. In the present work, we have analyzed the seasonal and yearly trends, and the uncertainties of prediction, in all 10 sub-basins of the Columbia River Basin (CRB) for the future period 2010-2099. The work is carried out using two different sets of statistically downscaled Global Climate Model (GCM) projection datasets: Bias Correction and Statistical Downscaling (BCSD), generated at Portland State University, and the Multivariate Adaptive Constructed Analogs (MACA), generated at the University of Idaho. The analysis is done with 10 GCM downscaled products from each CMIP5 daily dataset, totaling 40 different downscaled products for robust analysis. Summer, winter and yearly trend analysis is performed for all 10 sub-basins using linear regression (significance tested by Student's t-test) and the Mann-Kendall test (0.05 significance level) for precipitation (P), maximum temperature (Tmax) and minimum temperature (Tmin). Thereafter, all the parameters are modelled for uncertainty, across all models, in all 10 sub-basins and across the CRB for future scenario periods. Results indicate varying degrees of trend for all the sub-basins, mostly pointing towards a significant increase in all three climatic parameters, for all seasons and for the yearly analysis. Uncertainty analysis has revealed very high variability in all the parameters across models and sub-basins under consideration. Basin-wide uncertainty analysis is performed to corroborate results from the smaller, sub-basin scale. Similar trends and uncertainties are reported at the larger scale as well. Interestingly, both trends and uncertainties are higher during winter than during summer, contributing a large part of the yearly change.
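The Mann-Kendall test used for the seasonal and yearly trend analysis is straightforward to implement. A minimal sketch with a hypothetical annual series (no correction for tied values):

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test (no tie correction): returns (S, z)."""
    n = len(series)
    s = 0
    # S counts concordant minus discordant pairs over all i < j.
    for i in range(n - 1):
        for j in range(i + 1, n):
            d = series[j] - series[i]
            s += (d > 0) - (d < 0)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # Continuity-corrected normal approximation.
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Hypothetical annual-mean Tmax anomalies with an upward drift
series = [0.1, 0.0, 0.3, 0.2, 0.5, 0.4, 0.7, 0.6, 0.9, 1.1]
s, z = mann_kendall(series)
print(s, z > 1.96)   # significant increasing trend at the 0.05 level
```

Because the test depends only on the signs of pairwise differences, it is robust to outliers and non-normality, which is why it is standard alongside linear regression in hydroclimatic trend studies.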

  20. Own-Group Face Recognition Bias: The Effects of Location and Reputation

    PubMed Central

    Yan, Linlin; Wang, Zhe; Huang, Jianling; Sun, Yu-Hao P.; Judges, Rebecca A.; Xiao, Naiqi G.; Lee, Kang

    2017-01-01

In the present study, we examined whether social categorization based on university affiliation can induce an advantage in recognizing faces. Moreover, we investigated how the reputation or location of the university affected face recognition performance using an old/new paradigm. We assigned five different university labels to the faces: participants’ own university and four other universities. Among the four other university labels, we manipulated the academic reputation and geographical location of these universities relative to the participants’ own university. The results showed that an own-group face recognition bias emerged for faces with own-university labels compared with those with other-university labels. Furthermore, we found a robust own-group face recognition bias only when the other university was located in a different city far away from participants’ own university. Interestingly, we failed to find any influence of university reputation on the own-group face recognition bias. These results suggest that categorizing a face as a member of one’s own university is sufficient to enhance recognition accuracy, and that location plays a more important role than reputation in the effect of social categorization on face recognition. The results provide insight into the role of motivational factors underlying university membership in face perception. PMID:29066989

  1. Robust Mean and Covariance Structure Analysis through Iteratively Reweighted Least Squares.

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Bentler, Peter M.

    2000-01-01

    Adapts robust schemes to mean and covariance structures, providing an iteratively reweighted least squares approach to robust structural equation modeling. Each case is weighted according to its distance, based on first and second order moments. Test statistics and standard error estimators are given. (SLD)

  2. MPLEx: a Robust and Universal Protocol for Single-Sample Integrative Proteomic, Metabolomic, and Lipidomic Analyses

    PubMed Central

    Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.; Burnum-Johnson, Kristin E.; Kim, Young-Mo; Kyle, Jennifer E.; Matzke, Melissa M.; Shukla, Anil K.; Chu, Rosalie K.; Schepmoes, Athena A.; Jacobs, Jon M.; Baric, Ralph S.; Webb-Robertson, Bobbie-Jo; Smith, Richard D.

    2016-01-01

    ABSTRACT Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical). IMPORTANCE In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated broad applicability and robustness, which enabled comprehensive proteomics, metabolomics, and lipidomics analyses from the same sample. 
Author Video: An author video summary of this article is available. PMID:27822525

  3. MetaKTSP: a meta-analytic top scoring pair method for robust cross-study validation of omics prediction analysis.

    PubMed

    Kim, SungHwan; Lin, Chien-Wei; Tseng, George C

    2016-07-01

Supervised machine learning is widely applied to transcriptomic data to predict disease diagnosis, prognosis or survival. Robust and interpretable classifiers with high accuracy are usually favored for their clinical and translational potential. The top scoring pair (TSP) algorithm is an example that applies a simple rank-based algorithm to identify rank-altered gene pairs for classifier construction. Although many classification methods perform well in cross-validation within a single expression profile, performance usually drops substantially in cross-study validation (i.e. the prediction model is established in the training study and applied to an independent test study) for all machine learning methods, including TSP. The failure of cross-study validation has largely diminished the potential translational and clinical value of the models. The purpose of this article is to develop a meta-analytic top scoring pair (MetaKTSP) framework that combines multiple transcriptomic studies and generates a robust prediction model applicable to independent test studies. We proposed two frameworks, by averaging TSP scores or by combining P-values from individual studies, to select the top gene pairs for model construction. We applied the proposed methods in simulated data sets and three large-scale real applications in breast cancer, idiopathic pulmonary fibrosis and pan-cancer methylation. The results showed superior cross-study validation accuracy and biomarker selection for the new meta-analytic framework. In conclusion, combining multiple omics data sets in the public domain increases the robustness and accuracy of the classification model, which will ultimately improve disease understanding and clinical treatment decisions to benefit patients. An R package, MetaKTSP, is available online (http://tsenglab.biostat.pitt.edu/software.htm). ctseng@pitt.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
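    The score-averaging variant described above can be sketched compactly. This is a hedged illustration, not the MetaKTSP package's API: the function names and the empirical frequency estimator below are ours. Each study contributes, for every gene pair (i, j), the absolute between-class difference in the probability of a rank reversal, and the meta-analytic step averages these scores across studies before picking the top pairs.

```python
import numpy as np

def tsp_scores(X, y):
    """Pairwise TSP scores for one study.

    X: samples x genes expression matrix; y: binary labels (0/1).
    Score(i, j) = |P(X_i < X_j | y=0) - P(X_i < X_j | y=1)|.
    """
    lt = (X[:, :, None] < X[:, None, :])  # samples x genes x genes rank comparisons
    p0 = lt[y == 0].mean(axis=0)
    p1 = lt[y == 1].mean(axis=0)
    return np.abs(p0 - p1)

def meta_top_pairs(studies, k=1):
    """Average per-study TSP score matrices, return top-k gene pairs (i, j), i < j."""
    avg = np.mean([tsp_scores(X, y) for X, y in studies], axis=0)
    iu = np.triu_indices(avg.shape[0], k=1)
    order = np.argsort(avg[iu])[::-1][:k]
    return [(iu[0][o], iu[1][o]) for o in order]
```

In this sketch a pair whose within-sample ordering flips consistently between classes in every study scores 1 in each study and therefore also after averaging.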

  4. What Is Robustness?: Problem Framing Challenges for Water Systems Planning Under Change

    NASA Astrophysics Data System (ADS)

    Herman, J. D.; Reed, P. M.; Zeff, H. B.; Characklis, G. W.

    2014-12-01

    Water systems planners have long recognized the need for robust solutions capable of withstanding deviations from the conditions for which they were designed. Faced with a set of alternatives to choose from—for example, resulting from a multi-objective optimization—existing analysis frameworks offer competing definitions of robustness under change. Robustness analyses have moved from expected utility to exploratory "bottom-up" approaches in which vulnerable scenarios are identified prior to assigning likelihoods; examples include Robust Decision Making (RDM), Decision Scaling, Info-Gap, and Many-Objective Robust Decision Making (MORDM). We propose a taxonomy of robustness frameworks to compare and contrast these approaches, based on their methods of (1) alternative selection, (2) sampling of states of the world, (3) quantification of robustness measures, and (4) identification of key uncertainties using sensitivity analysis. Using model simulations from recent work in multi-objective urban water supply portfolio planning, we illustrate the decision-relevant consequences that emerge from each of these choices. Results indicate that the methodological choices in the taxonomy lead to substantially different planning alternatives, underscoring the importance of an informed definition of robustness. We conclude with a set of recommendations for problem framing: alternatives should be searched rather than prespecified; dominant uncertainties should be discovered rather than assumed; and a multivariate satisficing measure of robustness allows stakeholders to achieve their problem-specific performance requirements. This work highlights the importance of careful problem formulation and provides a common vocabulary to link the robustness frameworks widely used in the field of water systems planning.
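    The multivariate satisficing measure recommended above can be made concrete as the fraction of sampled states of the world in which an alternative meets all of its performance requirements simultaneously. The sketch below is a generic illustration under our own assumed interface, not code from the cited study.

```python
import numpy as np

def satisficing_robustness(performance, thresholds, maximize):
    """Fraction of sampled states of the world in which an alternative
    meets every performance requirement at once.

    performance: (n_states, n_criteria) array of simulated outcomes.
    thresholds:  (n_criteria,) stakeholder-specified requirements.
    maximize:    (n_criteria,) booleans; True where higher values are better.
    """
    perf = np.asarray(performance, dtype=float)
    # Criterion-wise pass/fail, flipping the inequality for minimized criteria.
    ok = np.where(maximize, perf >= thresholds, perf <= thresholds)
    # A state of the world counts only if ALL criteria pass (satisficing).
    return ok.all(axis=1).mean()
```

For example, with a reliability criterion to maximize and a cost criterion to keep below a cap, an alternative passing both requirements in one of three sampled states scores 1/3.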

  5. Effects of Text Structure, Reading Goals and Epistemic Beliefs on Conceptual Change

    ERIC Educational Resources Information Center

    Trevors, Gregory; Muis, Krista R.

    2015-01-01

    We investigated the online and offline effects of learner and instructional characteristics on conceptual change of a robust misconception in science. Fifty-nine undergraduate university students with misconceptions about evolution were identified as espousing evaluativist or non-evaluativist epistemic beliefs in science. Participants were…

  6. Robust Behavior-Based Control for Distributed Multi-Robot Collection Tasks

    DTIC Science & Technology

    2000-01-01

    Department, University of Southern California, Los Angeles, CA 90089-0781 USA (e-mail: mataric@usc.edu) For a given task environment and set of robots...Press: Cambridge, Massachusetts. [17] Richard T. Vaughan, Kasper Støy, Gaurav S. Sukhatme, and Maja J. Matarić, "Whistling in the Dark: Cooperative

  7. Partnership across Programs and Schools: Fostering Collaboration in Shared Spaces

    ERIC Educational Resources Information Center

    Han, Heejeong Sophia; Parker, Audra K.; Berson, Ilene R.

    2014-01-01

    Recent reports call for a structural transformation of teacher preparation programs with increased attention to quality field-based learning experiences for pre-service teachers. Ideally, this occurs in the context of robust university-school partnerships. The challenges lie in identifying such school sites and building meaningful, reciprocal…

  8. A new automated assessment method for contrast-detail images by applying support vector machine and its robustness to nonlinear image processing.

    PubMed

    Takei, Takaaki; Ikeda, Mitsuru; Imai, Kuniharu; Yamauchi-Kawaura, Chiyo; Kato, Katsuhiko; Isoda, Haruo

    2013-09-01

    The automated contrast-detail (C-D) analysis methods developed so far cannot be expected to work well on images processed with nonlinear methods, such as noise reduction methods. Therefore, we have devised a new automated C-D analysis method applying a support vector machine (SVM), and tested it for robustness to nonlinear image processing. We acquired the CDRAD (a commercially available C-D test object) images at a tube voltage of 120 kV and a milliampere-second product (mAs) of 0.5-5.0. A partial-diffusion-equation-based technique was used as the noise reduction method. Three radiologists and three university students participated in the observer performance study. The training data for our SVM method were the classification data scored by one radiologist for the CDRAD images acquired at 1.6 and 3.2 mAs and their noise-reduced images. We also compared the performance of our SVM method with the CDRAD Analyser algorithm. The mean C-D diagrams (that is, plots of the mean smallest visible hole diameter vs. hole depth) obtained from our SVM method agreed well with those averaged across the six human observers for both original and noise-reduced CDRAD images, whereas the mean C-D diagrams from the CDRAD Analyser algorithm disagreed with those from the human observers for both original and noise-reduced CDRAD images. In conclusion, our proposed SVM method for C-D analysis will work well for images processed with the nonlinear noise reduction method as well as for the original radiographic images.

  9. Patterns of Gray Matter Abnormalities in Schizophrenia Based on an International Mega-analysis.

    PubMed

    Gupta, Cota Navin; Calhoun, Vince D; Rachakonda, Srinivas; Chen, Jiayu; Patel, Veena; Liu, Jingyu; Segall, Judith; Franke, Barbara; Zwiers, Marcel P; Arias-Vasquez, Alejandro; Buitelaar, Jan; Fisher, Simon E; Fernandez, Guillen; van Erp, Theo G M; Potkin, Steven; Ford, Judith; Mathalon, Daniel; McEwen, Sarah; Lee, Hyo Jong; Mueller, Bryon A; Greve, Douglas N; Andreassen, Ole; Agartz, Ingrid; Gollub, Randy L; Sponheim, Scott R; Ehrlich, Stefan; Wang, Lei; Pearlson, Godfrey; Glahn, David C; Sprooten, Emma; Mayer, Andrew R; Stephen, Julia; Jung, Rex E; Canive, Jose; Bustillo, Juan; Turner, Jessica A

    2015-09-01

    Analyses of gray matter concentration (GMC) deficits in patients with schizophrenia (Sz) have identified robust changes throughout the cortex. We assessed the relationships between diagnosis, overall symptom severity, and patterns of gray matter in the largest aggregated structural imaging dataset to date. We performed both source-based morphometry (SBM) and voxel-based morphometry (VBM) analyses on GMC images from 784 Sz and 936 controls (Ct) across 23 scanning sites in Europe and the United States. After correcting for age, gender, site, and diagnosis by site interactions, SBM analyses showed 9 patterns of diagnostic differences. They comprised separate cortical, subcortical, and cerebellar regions. Seven patterns showed greater GMC in Ct than Sz, while 2 (brainstem and cerebellum) showed greater GMC for Sz. The greatest GMC deficit was in a single pattern comprising regions in the superior temporal gyrus, inferior frontal gyrus, and medial frontal cortex, which replicated over analyses of data subsets. VBM analyses identified overall cortical GMC loss and one small cluster of increased GMC in Sz, which overlapped with the SBM brainstem component. We found no significant association between the component loadings and symptom severity in either analysis. This mega-analysis confirms that the commonly found GMC loss in Sz in the anterior temporal lobe, insula, and medial frontal lobe form a single, consistent spatial pattern even in such a diverse dataset. The separation of GMC loss into robust, repeatable spatial patterns across multiple datasets paves the way for the application of these methods to identify subtle genetic and clinical cohort effects. © The Author 2014. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  10. Robust extrema features for time-series data analysis.

    PubMed

    Vemulapalli, Pramod K; Monga, Vishal; Brennan, Sean N

    2013-06-01

    The extraction of robust features for comparing and analyzing time series is a fundamentally important problem. Research efforts in this area encompass dimensionality reduction using popular signal analysis tools such as the discrete Fourier and wavelet transforms, various distance metrics, and the extraction of interest points from time series. Recently, extrema features for time-series analysis have assumed increasing significance because of their natural robustness under a variety of practical distortions, their economy of representation, and their computational benefits. Invariably, the process of encoding extrema features is preceded by filtering of the time series with an intuitively motivated filter (e.g., for smoothing), and subsequent thresholding to identify robust extrema. We define the properties of robustness, uniqueness, and cardinality as a means to identify the design choices available in each step of the feature generation process. Unlike existing methods, which utilize filters "inspired" by either domain knowledge or intuition, we explicitly optimize the filter on training time series to maximize the robustness of the extracted extrema features. We demonstrate further that the underlying filter optimization problem reduces to an eigenvalue problem and has a tractable solution. An encoding technique that enhances control over cardinality and uniqueness is also presented. Experimental results obtained for the problem of time series subsequence matching establish the merits of the proposed algorithm.
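    The reduction of filter optimization to an eigenvalue problem can be illustrated with a toy generalized Rayleigh-quotient formulation. This is our own hedged reconstruction, not the authors' algorithm: we score a candidate filter by the ratio of clean-signal patch energy to perturbation patch energy it passes, and the maximizer of that ratio is the top generalized eigenvector, recovered here with Cholesky whitening.

```python
import numpy as np

def robust_filter(clean, noisy, taps=5):
    """Toy filter design: pick weights w maximizing w'Aw / w'Bw, where
    A is the clean-signal patch covariance and B the perturbation
    (clean - noisy) patch covariance, so the filter passes the signal
    while suppressing the distortion."""
    def patch_cov(x, m):
        P = np.lib.stride_tricks.sliding_window_view(x, m)
        return P.T @ P / len(P)

    A = patch_cov(clean, taps)
    B = patch_cov(clean - noisy, taps) + 1e-8 * np.eye(taps)  # regularized
    # Whiten: with B = LL' and u = L'w, the ratio becomes u'Mu / u'u,
    # so the optimum u is the top eigenvector of M.
    L = np.linalg.cholesky(B)
    Linv = np.linalg.inv(L)
    M = Linv @ A @ Linv.T
    _, vecs = np.linalg.eigh(M)          # eigenvalues ascending
    w = Linv.T @ vecs[:, -1]             # back-transform top eigenvector
    return w / np.linalg.norm(w)
```

By construction, the returned filter achieves at least as high a clean-to-perturbation energy ratio as any fixed filter of the same length, e.g. a uniform smoothing filter.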

  11. Robust Selection Algorithm (RSA) for Multi-Omic Biomarker Discovery; Integration with Functional Network Analysis to Identify miRNA Regulated Pathways in Multiple Cancers.

    PubMed

    Sehgal, Vasudha; Seviour, Elena G; Moss, Tyler J; Mills, Gordon B; Azencott, Robert; Ram, Prahlad T

    2015-01-01

    MicroRNAs (miRNAs) play a crucial role in the maintenance of cellular homeostasis by regulating the expression of their target genes. As such, the dysregulation of miRNA expression has been frequently linked to cancer. With rapidly accumulating molecular data linked to patient outcome, the need to identify robust multi-omic molecular markers is critical in order to provide clinical impact. While previous bioinformatic tools have been developed to identify potential biomarkers in cancer, these methods do not allow for rapid classification of oncogenes versus tumor suppressors while taking into account robust differential expression, cutoffs, p-values and non-normality of the data. Here, we propose a methodology, the Robust Selection Algorithm (RSA), that addresses these important problems in big-data omics analysis. The robustness of the survival analysis is ensured by identification of optimal cutoff values of omics expression, strengthened by p-values computed through intensive random resampling that accounts for any non-normality in the data, and by integration into multi-omic functional networks. Here we have analyzed pan-cancer miRNA patient data to identify functional pathways involved in cancer progression that are associated with the miRNAs selected by RSA. Our approach demonstrates the way in which existing survival analysis techniques can be integrated with a functional network analysis framework to efficiently identify promising biomarkers and novel therapeutic candidates across diseases.
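    The cutoff-optimization-plus-resampling idea can be sketched with a deliberately simplified stand-in: a mean-difference statistic in place of a survival log-rank test, with each permutation re-optimizing the cutoff so that the selection step is accounted for in the p-value (a "maximally selected statistic" correction). All names and choices below are illustrative assumptions, not the published RSA implementation.

```python
import numpy as np

def best_cutoff_stat(expr, outcome, q=(0.2, 0.8), n_grid=25):
    """Scan candidate cutoffs between the q quantiles of `expr`; return the
    (statistic, cutoff) pair with the largest |mean outcome difference|
    between the high- and low-expression groups."""
    cuts = np.quantile(expr, np.linspace(q[0], q[1], n_grid))
    return max(
        ((abs(outcome[expr > c].mean() - outcome[expr <= c].mean()), c)
         for c in cuts),
        key=lambda t: t[0],
    )

def permutation_pvalue(expr, outcome, n_perm=200, seed=0):
    """Distribution-free p-value for the maximally selected statistic:
    the cutoff search is repeated on every label permutation, so optimizing
    the cutoff does not inflate significance."""
    rng = np.random.default_rng(seed)
    observed, _ = best_cutoff_stat(expr, outcome)
    null = [best_cutoff_stat(expr, rng.permutation(outcome))[0]
            for _ in range(n_perm)]
    return (1 + sum(s >= observed for s in null)) / (1 + n_perm)
```

Because the permutation reuses only rank information and no normality assumption, the same scheme transfers to heavier statistics (e.g. a log-rank test on censored survival times) without change to the resampling logic.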

  12. Investigation of air transportation technology at Princeton University, 1991-1992

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1993-01-01

    The Air Transportation Research Program at Princeton University proceeded along six avenues during the past year: (1) intelligent flight control; (2) computer-aided control system design; (3) neural networks for flight control; (4) stochastic robustness of flight control systems; (5) microburst hazards to aircraft; and (6) fundamental dynamics of atmospheric flight. This research has resulted in a number of publications, including archival papers and conference papers. An annotated bibliography of publications that appeared between June 1991 and June 1992 appears at the end of this report. The research that these papers describe was supported in whole or in part by the Joint University Program, including work that was completed prior to the reporting period.

  13. Smart phones: platform enabling modular, chemical, biological, and explosives sensing

    NASA Astrophysics Data System (ADS)

    Finch, Amethist S.; Coppock, Matthew; Bickford, Justin R.; Conn, Marvin A.; Proctor, Thomas J.; Stratis-Cullum, Dimitra N.

    2013-05-01

    Reliable, robust, and portable technologies are needed for the rapid identification and detection of chemical, biological, and explosive (CBE) materials. A key to addressing the persistent threat to U.S. troops in the current war on terror is the rapid detection and identification of the precursor materials used in development of improvised explosive devices, homemade explosives, and bio-warfare agents. However, a universal methodology for detection and prevention of CBE materials in the use of these devices has proven difficult. Herein, we discuss our efforts towards the development of a modular, robust, inexpensive, pervasive, archival, and compact platform (android based smart phone) enabling the rapid detection of these materials.

  14. Robustness properties of discrete-time regulators, LQG regulators and hybrid systems

    NASA Technical Reports Server (NTRS)

    Stein, G.; Athans, M.

    1979-01-01

    Robustness properties of sampled-data LQ regulators are derived which show that these regulators have fundamentally inferior uncertainty tolerances when compared to their continuous-time counterparts. Results are also presented in stability theory, multivariable frequency domain analysis, LQG robustness, and mathematical representations of hybrid systems.

  15. Benchmarking of a treatment planning system for spot scanning proton therapy: Comparison and analysis of robustness to setup errors of photon IMRT and proton SFUD treatment plans of base of skull meningioma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harding, R., E-mail: ruth.harding2@wales.nhs.uk; Trnková, P.; Lomax, A. J.

    Purpose: Base of skull meningioma can be treated with both intensity modulated radiation therapy (IMRT) and spot scanned proton therapy (PT). One of the main benefits of PT is better sparing of organs at risk, but due to the physical and dosimetric characteristics of protons, spot scanned PT can be more sensitive to the uncertainties encountered in the treatment process compared with photon treatment. Therefore, robustness analysis should be part of a comprehensive comparison between these two treatment methods in order to quantify and understand the sensitivity of the treatment techniques to uncertainties. The aim of this work was to benchmark a spot scanning treatment planning system for planning of base of skull meningioma and to compare the created plans and analyze their robustness to setup errors against the IMRT technique. Methods: Plans were produced for three base of skull meningioma cases: IMRT planned with a commercial TPS [Monaco (Elekta AB, Sweden)]; single field uniform dose (SFUD) spot scanning PT produced with an in-house TPS (PSI-plan); and SFUD spot scanning PT plan created with a commercial TPS [XiO (Elekta AB, Sweden)]. A tool for evaluating robustness to random setup errors was created and, for each plan, both a dosimetric evaluation and a robustness analysis to setup errors were performed. Results: It was possible to create clinically acceptable treatment plans for spot scanning proton therapy of meningioma with a commercially available TPS. However, since each treatment planning system uses different methods, this comparison showed different dosimetric results as well as different sensitivities to setup uncertainties. The results confirmed the necessity of an analysis tool for assessing plan robustness to provide a fair comparison of photon and proton plans. Conclusions: Robustness analysis is a critical part of plan evaluation when comparing IMRT plans with spot scanned proton therapy plans.

  16. Nonequilibrium dynamic critical scaling of the quantum Ising chain.

    PubMed

    Kolodrubetz, Michael; Clark, Bryan K; Huse, David A

    2012-07-06

    We solve for the time-dependent finite-size scaling functions of the one-dimensional transverse-field Ising chain during a linear-in-time ramp of the field through the quantum critical point. We then simulate Mott-insulating bosons in a tilted potential, an experimentally studied system in the same equilibrium universality class, and demonstrate that universality holds for the dynamics as well. We find qualitatively athermal features of the scaling functions, such as negative spin correlations, and we show that they should be robustly observable within present cold atom experiments.

  17. An observatory control system for the University of Hawai'i 2.2m Telescope

    NASA Astrophysics Data System (ADS)

    McKay, Luke; Erickson, Christopher; Mukensnable, Donn; Stearman, Anthony; Straight, Brad

    2016-07-01

    The University of Hawai'i 2.2m telescope at Maunakea has operated since 1970, and has had several controls upgrades to date. The newest system will operate as a distributed hierarchy of GNU/Linux central server, networked single-board computers, microcontrollers, and a modular motion control processor for the main axes. Rather than just a telescope control system, this new effort is towards a cohesive, modular, and robust whole observatory control system, with design goals of fully robotic unattended operation, high reliability, and ease of maintenance and upgrade.

  18. From LPF to eLISA: new approach in payload software

    NASA Astrophysics Data System (ADS)

    Gesa, Ll.; Martin, V.; Conchillo, A.; Ortega, J. A.; Mateos, I.; Torrents, A.; Lopez-Zaragoza, J. P.; Rivas, F.; Lloro, I.; Nofrarias, M.; Sopuerta, CF.

    2017-05-01

    eLISA will be the first observatory in space to explore the Gravitational Universe. It will gather revolutionary information about the dark universe. This demands robust and reliable embedded control software and hardware working together. Taking the lessons learnt from the LISA Pathfinder payload software as a baseline, this short article introduces the key concepts and new approaches our group is working on in terms of software: multiprocessor support, self-modifying-code strategies, 100% hardware and software monitoring, embedded scripting, and Time and Space Partitioning, among others.

  19. Dark energy and the cosmic microwave background radiation

    NASA Technical Reports Server (NTRS)

    Dodelson, S.; Knox, L.

    2000-01-01

    We find that current cosmic microwave background anisotropy data strongly constrain the mean spatial curvature of the Universe to be near zero, or, equivalently, the total energy density to be near critical-as predicted by inflation. This result is robust to editing of data sets, and variation of other cosmological parameters (totaling seven, including a cosmological constant). Other lines of argument indicate that the energy density of nonrelativistic matter is much less than critical. Together, these results are evidence, independent of supernovae data, for dark energy in the Universe.

  20. Dark energy and the cosmic microwave background radiation.

    PubMed

    Dodelson, S; Knox, L

    2000-04-17

    We find that current cosmic microwave background anisotropy data strongly constrain the mean spatial curvature of the Universe to be near zero, or, equivalently, the total energy density to be near critical-as predicted by inflation. This result is robust to editing of data sets, and variation of other cosmological parameters (totaling seven, including a cosmological constant). Other lines of argument indicate that the energy density of nonrelativistic matter is much less than critical. Together, these results are evidence, independent of supernovae data, for dark energy in the Universe.

  1. Robust, Causal, and Incremental Approaches to Investigating Linguistic Adaptation

    PubMed Central

    Roberts, Seán G.

    2018-01-01

    This paper discusses the maximum robustness approach for studying cases of adaptation in language. We live in an age where we have more data on more languages than ever before, and more data to link it with from other domains. This should make it easier to test hypotheses involving adaptation, and also to spot new patterns that might be explained by adaptation. However, there is not much discussion of the overall approach to research in this area. There are outstanding questions about how to formalize theories, what the criteria are for directing research and how to integrate results from different methods into a clear assessment of a hypothesis. This paper addresses some of those issues by suggesting an approach which is causal, incremental and robust. It illustrates the approach with reference to a recent claim that dry environments select against the use of precise contrasts in pitch. Study 1 replicates a previous analysis of the link between humidity and lexical tone with an alternative dataset and finds that it is not robust. Study 2 performs an analysis with a continuous measure of tone and finds no significant correlation. Study 3 addresses a more recent analysis of the link between humidity and vowel use and finds that it is robust, though the effect size is small and the robustness of the measurement of vowel use is low. Methodological robustness of the general theory is addressed by suggesting additional approaches including iterated learning, a historical case study, corpus studies, and studying individual speech. PMID:29515487

  2. Rapid neural discrimination of communicative gestures.

    PubMed

    Redcay, Elizabeth; Carlson, Thomas A

    2015-04-01

    Humans are biased toward social interaction. Behaviorally, this bias is evident in the rapid effects that self-relevant communicative signals have on attention and perceptual systems. The processing of communicative cues recruits a wide network of brain regions, including mentalizing systems. Relatively less work, however, has examined the timing of the processing of self-relevant communicative cues. In the present study, we used a multivariate pattern analysis (decoding) approach to the analysis of magnetoencephalography (MEG) data to study the processing dynamics of social-communicative actions. Twenty-four participants viewed images of a woman performing actions that varied on a continuum of communicative factors, including self-relevance (to the participant) and emotional valence, while their brain activity was recorded using MEG. Controlling for low-level visual factors, we found early discrimination of emotional valence (70 ms) and self-relevant communicative signals (100 ms). These data offer neural support for the robust and rapid effects of self-relevant communicative cues on behavior. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  3. Raman tweezers in microfluidic systems for analysis and sorting of living cells

    NASA Astrophysics Data System (ADS)

    Pilát, Zdeněk.; Ježek, Jan; Kaňka, Jan; Zemánek, Pavel

    2014-12-01

    We have devised an analytical and sorting system combining optical trapping with Raman spectroscopy in microfluidic environment, dedicated to identification and sorting of biological objects, such as living cells of various unicellular organisms. Our main goal was to create a robust and universal platform for non-destructive and non-contact sorting of micro-objects based on their Raman spectral properties. This approach allowed us to collect spectra containing information about the chemical composition of the objects, such as the presence and composition of pigments, lipids, proteins, or nucleic acids, avoiding artificial chemical probes such as fluorescent markers. The non-destructive nature of this optical analysis and manipulation allowed us to separate individual living cells of our interest in a sterile environment and provided the possibility to cultivate the selected cells for further experiments. We used a mixture of polystyrene micro-particles and algal cells to test and demonstrate the function of our analytical and sorting system. The devised system could find its use in many medical, biotechnological, and biological applications.

  4. Lokiarchaea are close relatives of Euryarchaeota, not bridging the gap between prokaryotes and eukaryotes

    PubMed Central

    Forterre, Patrick

    2017-01-01

    The eocyte hypothesis, in which Eukarya emerged from within Archaea, has been boosted by the description of a new candidate archaeal phylum, "Lokiarchaeota", from metagenomic data. Eukarya branch within Lokiarchaeota in a tree reconstructed from the concatenation of 36 universal proteins. However, individual phylogenies revealed that lokiarchaeal protein sequences have different evolutionary histories. The individual marker phylogenies revealed at least two subsets of proteins, supporting either the Woese or the eocyte tree of life. Strikingly, removal of a single protein, the elongation factor EF2, is sufficient to break the Eukaryotes-Lokiarchaea affiliation. Our analysis suggests that the three lokiarchaeal EF2 proteins have a chimeric organization that could be due to contamination and/or homologous recombination with patches of eukaryotic sequences. A robust phylogenetic analysis of RNA polymerases with a new dataset indicates that Lokiarchaeota and related phyla of the Asgard superphylum are a sister group to Euryarchaeota, not to Eukarya, and supports the monophyly of Archaea with their rooting in the branch leading to Thaumarchaeota. PMID:28604769

  5. DEsubs: an R package for flexible identification of differentially expressed subpathways using RNA-seq experiments.

    PubMed

    Vrahatis, Aristidis G; Balomenos, Panos; Tsakalidis, Athanasios K; Bezerianos, Anastasios

    2016-12-15

    DEsubs is a network-based systems biology R package that extracts disease-perturbed subpathways within a pathway network as recorded by RNA-seq experiments. It contains an extensive and customizable framework with a broad range of operation modes at all stages of the subpathway analysis, enabling a case-specific approach. The operation modes include pathway network construction and processing, subpathway extraction, visualization and enrichment analysis with regard to various biological and pharmacological features. These capabilities render DEsubs a guiding tool, for both the modeler and the experimentalist, for the identification of more robust systems-level drug targets and biomarkers for complex diseases. DEsubs is implemented as an R package following Bioconductor guidelines: http://bioconductor.org/packages/DEsubs/. Contact: tassos.bezerianos@nus.edu.sg. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Raman tweezers in microfluidic systems for analysis and sorting of living cells

    NASA Astrophysics Data System (ADS)

    Pilát, Zdenëk; Ježek, Jan; Kaňka, Jan; Zemánek, Pavel

    2014-03-01

    We have devised an analytical and sorting system combining optical trapping with Raman spectroscopy in microfluidic environment in order to identify and sort biological objects, such as living cells of various prokaryotic and eukaryotic organisms. Our main objective was to create a robust and universal platform for non-contact sorting of microobjects based on their Raman spectral properties. This approach allowed us to collect information about the chemical composition of the objects, such as the presence and composition of lipids, proteins, or nucleic acids without using artificial chemical probes such as fluorescent markers. The non-destructive and non-contact nature of this optical analysis and manipulation allowed us to separate individual living cells of our interest in a sterile environment and provided the possibility to cultivate the selected cells for further experiments. We used differently treated cells of algae to test and demonstrate the function of our analytical and sorting system. The devised system could find its use in many medical, biotechnological, and biological applications.

  7. Robustness analysis of a green chemistry-based model for the classification of silver nanoparticles synthesis processes

    EPA Science Inventory

    This paper proposes a robustness analysis based on Multiple Criteria Decision Aiding (MCDA). The ensuing model was used to assess the implementation of green chemistry principles in the synthesis of silver nanoparticles. Its recommendations were also compared to an earlier develo...

  8. Robustness of Type I Error and Power in Set Correlation Analysis of Contingency Tables.

    ERIC Educational Resources Information Center

    Cohen, Jacob; Nee, John C. M.

    1990-01-01

    The analysis of contingency tables via set correlation allows the assessment of subhypotheses involving contrast functions of the categories of the nominal scales. The robustness of such methods with regard to Type I error and statistical power was studied via a Monte Carlo experiment. (TJH)
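    A Monte Carlo check of the kind described can be sketched by simulating 2x2 contingency tables under independence and recording how often a nominal alpha = 0.05 Pearson chi-square test rejects; the empirical rejection rate estimates the Type I error. This generic sketch is ours, not the study's set-correlation procedure.

```python
import numpy as np

def type1_error_rate(n=200, p_row=0.5, p_col=0.5, n_sim=2000, seed=0):
    """Monte Carlo estimate of the Type I error of the Pearson chi-square
    test on 2x2 tables simulated under independence (so the null holds
    and the rejection rate should be close to the nominal 0.05)."""
    rng = np.random.default_rng(seed)
    crit = 3.8415  # chi-square(df=1) upper 0.05 critical value
    rejections = 0
    for _ in range(n_sim):
        rows = rng.random(n) < p_row
        cols = rng.random(n) < p_col          # independent of rows
        obs = np.array([[np.sum(rows & cols), np.sum(rows & ~cols)],
                        [np.sum(~rows & cols), np.sum(~rows & ~cols)]], float)
        # Expected counts under independence from the table margins.
        expected = obs.sum(1, keepdims=True) * obs.sum(0) / obs.sum()
        stat = np.sum((obs - expected) ** 2 / expected)
        rejections += stat > crit
    return rejections / n_sim
```

The same loop, with data generated under an association instead of independence, estimates power; the study's Monte Carlo design varies both regimes across table sizes and margins.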

  9. Identification of robust adaptation gene regulatory network parameters using an improved particle swarm optimization algorithm.

    PubMed

    Huang, X N; Ren, H P

    2016-05-13

    Robust adaptation is a critical ability of a gene regulatory network (GRN) to survive in a fluctuating environment: the system responds to an input stimulus rapidly and then returns to its pre-stimulus steady state in a timely manner. In this paper, the GRN is modeled using the Michaelis-Menten rate equations, which are highly nonlinear differential equations containing 12 undetermined parameters. Robust adaptation is quantitatively described by two conflicting indices. Identifying the parameter sets that confer robust adaptation on the GRN is a multi-variable, multi-objective, and multi-peak optimization problem for which it is difficult to acquire satisfactory solutions, especially high-quality ones. A new best-neighbor particle swarm optimization algorithm is proposed to carry out this task. The proposed algorithm employs a Latin hypercube sampling method to generate the initial population, and also uses a particle crossover operation and an elitist preservation strategy. The simulation results revealed that the proposed algorithm can identify multiple solutions in a single run. Moreover, it demonstrated superior performance compared with previous methods, detecting more high-quality solutions within an acceptable time. The proposed methodology, owing to its universality and simplicity, is useful for providing guidance in designing GRNs with superior robust adaptation.
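    The initialization-plus-swarm scheme can be illustrated with a minimal sketch: a Latin hypercube start followed by a plain particle swarm loop. This is a generic PSO under assumed coefficients, not the paper's best-neighbor variant (which further adds particle crossover and elitist preservation on top of such a loop).

```python
import numpy as np

def latin_hypercube(n, d, lo, hi, rng):
    """Latin hypercube sample: exactly one point per stratum in every dimension."""
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T  # (n, d)
    u = (strata + rng.random((n, d))) / n                           # jitter in stratum
    return lo + u * (hi - lo)

def pso_minimize(f, lo, hi, n_particles=30, iters=100, seed=0):
    """Minimal particle swarm minimization with a Latin hypercube start."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    d = lo.size
    x = latin_hypercube(n_particles, d, lo, hi, rng)
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    for _ in range(iters):
        g = pbest[np.argmin(pval)]                       # swarm-best position
        r1, r2 = rng.random((2, n_particles, d))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                       # stay inside the box
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
    return pbest[np.argmin(pval)], float(pval.min())
```

The stratified start matters for multi-peak landscapes like the 12-parameter GRN problem: unlike uniform random initialization, no region of any single parameter's range is left unsampled.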

  10. The brain functional connectome is robustly altered by lack of sleep.

    PubMed

    Kaufmann, Tobias; Elvsåshagen, Torbjørn; Alnæs, Dag; Zak, Nathalia; Pedersen, Per Ø; Norbom, Linn B; Quraishi, Sophia H; Tagliazucchi, Enzo; Laufs, Helmut; Bjørnerud, Atle; Malt, Ulrik F; Andreassen, Ole A; Roussos, Evangelos; Duff, Eugene P; Smith, Stephen M; Groote, Inge R; Westlye, Lars T

    2016-02-15

    Sleep is a universal phenomenon necessary for maintaining homeostasis and function across a range of organs. Lack of sleep has severe health-related consequences affecting whole-body functioning, yet no other organ is as severely affected as the brain. The neurophysiological mechanisms underlying these deficits are poorly understood. Here, we characterize the dynamic changes in brain connectivity profiles inflicted by sleep deprivation and how they deviate from regular daily variability. To this end, we obtained functional magnetic resonance imaging data from 60 young, adult male participants, scanned in the morning and evening of the same day and again the following morning. 41 participants underwent total sleep deprivation before the third scan, whereas the remainder had another night of regular sleep. Sleep deprivation strongly altered the connectivity of several resting-state networks, including dorsal attention, default mode, and hippocampal networks. Multivariate classification based on connectivity profiles predicted deprivation state with high accuracy, corroborating the robustness of the findings on an individual level. Finally, correlation analysis suggested that morning-to-evening connectivity changes were reverted by sleep (control group), a pattern that did not occur after deprivation. We conclude that both a day of waking and a night of sleep deprivation dynamically alter the brain functional connectome. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Caius: Synthetic Observations Using a Robust End-to-End Radiative Transfer Pipeline

    NASA Astrophysics Data System (ADS)

    Simeon Barrow, Kirk Stuart; Wise, John H.; O'Shea, Brian; Norman, Michael L.; Xu, Hao

    2018-01-01

    We present synthetic observations for the first generations of galaxies in the Universe and make predictions for future deep field observations for redshifts greater than 6. Due to the strong impact of nebular emission lines and the relatively compact scale of HII regions, high resolution cosmological simulations and a robust suite of analysis tools are required to properly simulate spectra. We created a software pipeline consisting of FSPS, Yggdrasil, Hyperion, Cloudy and our own tools to generate synthetic IR observations from a fully three-dimensional arrangement of gas, dust, and stars. Our prescription allows us to include emission lines for a complete chemical network and tackle the effect of dust extinction and scattering in the various lines of sight. We provide spectra, 2-D binned photon imagery for both HST and JWST IR filters, luminosity relationships, and emission line strengths for a large sample of high redshift galaxies in the Renaissance Simulations (Xu et al. 2013). We also pay special attention to contributions from Population III stars and high-mass X-ray binaries and explore a direct-collapse black hole simulation (Aykutalp et al. 2014). Our resulting synthetic spectra show high variability between galactic halos with a strong dependence on stellar mass, viewing angle, metallicity, gas mass fraction, and formation history.

  12. Teachable, high-content analytics for live-cell, phase contrast movies.

    PubMed

    Alworth, Samuel V; Watanabe, Hirotada; Lee, James S J

    2010-09-01

    CL-Quant is a new solution platform for broad, high-content, live-cell image analysis. Powered by novel machine learning technologies and teach-by-example interfaces, CL-Quant provides a platform for the rapid development and application of scalable, high-performance, and fully automated analytics for a broad range of live-cell microscopy imaging applications, including label-free phase contrast imaging. The authors used CL-Quant to teach off-the-shelf universal analytics, called standard recipes, for cell proliferation, wound healing, cell counting, and cell motility assays using phase contrast movies collected on the BioStation CT and BioStation IM platforms. Similar to application modules, standard recipes are intended to work robustly across a wide range of imaging conditions without requiring customization by the end user. The authors validated the performance of the standard recipes by comparing their performance with truth created manually, or by custom analytics optimized for each individual movie (and therefore yielding the best possible result for the image), and validated by independent review. The validation data show that the standard recipes' performance is comparable with the validated truth with low variation. The data validate that the CL-Quant standard recipes can provide robust results without customization for live-cell assays in broad cell types and laboratory settings.

  13. Dimensionality-varied deep convolutional neural network for spectral-spatial classification of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Qu, Haicheng; Liang, Xuejian; Liang, Shichao; Liu, Wanjun

    2018-01-01

    Many methods of hyperspectral image classification have been proposed recently, and the convolutional neural network (CNN) achieves outstanding performance. However, spectral-spatial classification of CNN requires an excessively large model, tremendous computations, and complex network, and CNN is generally unable to use the noisy bands caused by water-vapor absorption. A dimensionality-varied CNN (DV-CNN) is proposed to address these issues. There are four stages in DV-CNN and the dimensionalities of spectral-spatial feature maps vary with the stages. DV-CNN can reduce the computation and simplify the structure of the network. All feature maps are processed by more kernels in higher stages to extract more precise features. DV-CNN also improves the classification accuracy and enhances the robustness to water-vapor absorption bands. The experiments are performed on data sets of Indian Pines and Pavia University scene. The classification performance of DV-CNN is compared with state-of-the-art methods, which contain the variations of CNN, traditional, and other deep learning methods. The experiment of performance analysis about DV-CNN itself is also carried out. The experimental results demonstrate that DV-CNN outperforms state-of-the-art methods for spectral-spatial classification and it is also robust to water-vapor absorption bands. Moreover, reasonable parameters selection is effective to improve classification accuracy.

  14. The connection between globular cluster systems and their host galaxy and environment: a case study of the isolated elliptical NGC 821

    NASA Astrophysics Data System (ADS)

    Spitler, Lee R.; Forbes, Duncan A.; Strader, Jay; Brodie, Jean P.; Gallagher, Jay S.

    2008-03-01

    In an effort to probe the globular cluster (GC) system of an isolated elliptical galaxy, a comprehensive analysis of the NGC 821 GC system was performed. New imaging from the WIYN Mini-Mosaic imager, supplemented with Hubble Space Telescope (HST) WFPC2 images reveals a GC system similar to those found in counterpart ellipticals located in high-density environments. To put these results into the context of galaxy formation, a robustly determined census of GC systems is presented and analysed for galaxies spanning a wide range of masses (> M*), morphologies and environments. Results from this meta-study: (1) confirm previous findings that the number of GCs normalized by host galaxy stellar mass increases with host stellar mass. Spiral galaxies in the sample show smaller relative GC numbers than those of massive ellipticals, suggesting the GC systems of massive ellipticals were not formed from major spiral-spiral mergers; (2) indicate that GC system numbers per unit galaxy baryon mass increases with host baryon mass and that GC formation efficiency may not be universal as previously thought; (3) suggest previously reported trends with environment may be incorrect due to sample bias or the use of galaxy stellar masses to normalize GC numbers. Thus claims for environmentally dependent GC formation efficiencies should be revisited; (4) in combination with weak-lensing halo mass estimates, suggest that GCs formed in direct proportion to the halo mass; (5) are consistent with theoretical predictions whereby the local epoch of reionization did not vary significantly with environment or host galaxy type. Based upon data from the WIYN Observatory, which is a joint facility of the University of Wisconsin-Madison, Indiana University, Yale University and the National Optical Astronomy Observatories. Also includes analysis of observations made with the Hubble Space Telescope obtained from the ESO/ST-ECF Science Archive Facility. E-mail: lspitler@astro.swin.edu.au

  15. Rapid Analysis of Carbohydrates in Bioprocess Samples: An Evaluation of the CarboPac SA10 for HPAE-PAD Analysis by Interlaboratory Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevcik, R. S.; Hyman, D. A.; Basumallich, L.

    2013-01-01

    A technique for carbohydrate analysis of bioprocess samples has been developed, providing reduced analysis time compared to current practice in the biofuels R&D community. The Thermo Fisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose, and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current practice method utilizing an HPX-87P/RID. While these two methods showed good agreement, a statistical comparison found significant quantitation differences between them, highlighting the difference between selective and universal detection modes.

  16. Robust optimization in lung treatment plans accounting for geometric uncertainty.

    PubMed

    Zhang, Xin; Rong, Yi; Morrill, Steven; Fang, Jian; Narayanasamy, Ganesh; Galhardo, Edvaldo; Maraboyina, Sanjay; Croft, Christopher; Xia, Fen; Penagaricano, Jose

    2018-05-01

    Robust optimization generates scenario-based plans by a minimax optimization method to find an optimal scenario for the trade-off between target coverage robustness and organ-at-risk (OAR) sparing. In this study, 20 lung cancer patients with tumors located at various anatomical regions within the lungs were selected, and robust optimization photon treatment plans, including intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) plans, were generated. Plan robustness was analyzed using perturbed doses with setup error boundaries of ±3 mm in the anterior/posterior (AP), ±3 mm in the left/right (LR), and ±5 mm in the inferior/superior (IS) directions from the isocenter. Perturbed doses for D99, D98, and D95 were computed from six shifted-isocenter plans to evaluate plan robustness. A dosimetric study was performed to compare the internal target volume-based robust optimization plans (ITV-IMRT and ITV-VMAT) and conventional PTV margin-based plans (PTV-IMRT and PTV-VMAT). The dosimetric comparison parameters were: ITV target mean dose (Dmean), R95 (D95/Dprescription), Paddick's conformity index (CI), homogeneity index (HI), monitor units (MU), and OAR doses including lung (Dmean, V20Gy, and V15Gy), chest wall, heart, esophagus, and maximum cord doses. A comparison of optimization results showed that the robust optimization plans had better ITV dose coverage, better CI, worse HI, and lower OAR doses than the conventional PTV margin-based plans. Plan robustness evaluation showed that the perturbed D99, D98, and D95 all satisfied the criterion that at least 99% of the ITV received 95% of the prescription dose. It was also observed that PTV margin-based plans had higher MU than robust optimization plans. The results also showed robust optimization can generate plans that offer increased OAR sparing, especially for normal lungs and OARs near or abutting the target.
Weak correlation was found between normal lung dose and target size, and no other correlation was observed in this study. © 2018 University of Arkansas for Medical Sciences. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
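    The perturbed-dose evaluation described in this record can be sketched generically: given voxel doses recomputed for each shifted-isocenter scenario, report the worst-case D99/D98/D95 as a fraction of prescription. This is an illustrative reconstruction under assumptions, not the study's planning-system workflow; the helper names and the percentile definition of Dx are hypothetical.

```python
import numpy as np

def dose_at_volume(dose, volume_fraction):
    """D_x: minimum dose received by the hottest fraction x of the target,
    i.e. the (1 - x) quantile of the voxel-dose distribution."""
    return float(np.quantile(dose, 1.0 - volume_fraction))

def worst_case_coverage(perturbed_doses, prescription, levels=(0.99, 0.98, 0.95)):
    """For each perturbed scenario (one voxel-dose array per isocenter shift),
    compute D99/D98/D95 and keep the worst case as a fraction of prescription."""
    out = {}
    for lv in levels:
        dx = [dose_at_volume(d, lv) for d in perturbed_doses]
        out[f"D{int(lv * 100)}"] = min(dx) / prescription
    return out
```

    Under the criterion quoted in the abstract, a plan would count as robust if the worst-case value stayed at or above 0.95 for each level.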

  17. AORN and University of Michigan School of nursing research alliance.

    PubMed

    Talsma, Akkeneel; Chard, Robin; Kleiner, Catherine; Anderson, Christine; Geun, Hyogeun

    2011-06-01

    Research related to perioperative care requires advanced training and is well suited to take place at a research-intensive university. A recent research alliance established between AORN and the University of Michigan School of Nursing, Ann Arbor, uses the strengths of both a robust perioperative professional organization and a research-intensive university to make progress toward improving patient safety and transforming the perioperative work environment. Research activities undertaken by this alliance include investigating nurse staffing characteristics and patient outcomes, as well as evaluating the congruence and definitions of data elements contained in AORN's SYNTEGRITY™ Standardized Perioperative Framework. Disseminating the findings of the alliance is expected to facilitate the communication and application of new knowledge to nursing practice and help advance the perioperative nursing profession. Copyright © 2011 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  18. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    ERIC Educational Resources Information Center

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…

  19. Risk Management: An Accountability Guide for University and College Boards

    ERIC Educational Resources Information Center

    Abraham, Janice M.

    2013-01-01

    With proven advice and practical best practices for sound risk management, this robust publication written by the CEO of United Educators identifies how engaged board members should collaborate closely with institutional leaders on a variety of operational and strategic risks. All board members, whatever their role or committee assignment, will…

  20. Residency: Can It Transform Teaching the Way It Did Medicine?

    ERIC Educational Resources Information Center

    Thorpe, Ronald

    2014-01-01

    Universal teacher residency would benefit the teaching profession and ultimately the education of our children. We have yet to work out the fine details, but there is nothing more important than developing robust residency schools where young educators go between their undergraduate preparation and their arrival in the classroom as autonomous…

  1. Leadership: Theory and Practice. Sixth Edition

    ERIC Educational Resources Information Center

    Northouse, Peter G.

    2012-01-01

    Adopted at more than 1,000 colleges and universities worldwide, the market-leading text owes its success to the unique way in which it combines an academically robust account of the major theories and models of leadership with an accessible style and practical exercises that help students apply what they learn. Each chapter of Peter…

  2. The Metropolitan Studies Institute at USC Upstate: Translational Research that Drives Community Decision-Making

    ERIC Educational Resources Information Center

    Brady, Kathleen

    2012-01-01

    The Metropolitan Studies Institute (MSI) at the University of South Carolina Upstate (USC Upstate) demonstrates a robust and unique record of community impact through community indicators research and other translational research. The MSI's work drives programmatic priorities and funding decisions, generates revenue, and increases the community's…

  3. Digital Immigrants, Digital Learning: Reaching Adults through Information Literacy Instruction Online

    ERIC Educational Resources Information Center

    Rapchak, Marcia; Behary, Robert

    2013-01-01

    As information literacy programs become more robust, finding methods of reaching students beyond the traditional undergraduate has become a priority for many institutions. At Duquesne University, efforts have been made to reach adult learners in an accelerated program targeted to nontraditional students, much of which is provided online. This…

  4. Incendiary Discourse: Reconsidering Flaming, Authority, and Democratic Subjectivity in Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Oleksiak, Timothy

    2012-01-01

    This article explores the relationship between teacher authority and flaming in asynchronous online communication. Teachers who rely on what I call stabilization and universal applicability--two concepts emerging from a liberal democratic theory--may actually be preventing a full and robust understanding of the complexities of 21st-century…

  5. Robust Research and Rapid Response: The Plum Pox Virus Story

    ERIC Educational Resources Information Center

    Alter, Theodore R.; Bridger, Jeffrey C.; Travis, James W.

    2004-01-01

    Universities are frequently criticized for being unresponsive to the needs of their stakeholders. In response to this perception, many institutions of higher learning have taken steps to become more productively engaged with the people, organizations, and communities they serve. In this article, we analyze the process of engagement by focusing on…

  6. Development of a rapid, robust, and universal picogreen-based method to titer adeno-associated vectors.

    PubMed

    Piedra, Jose; Ontiveros, Maria; Miravet, Susana; Penalva, Cristina; Monfar, Mercè; Chillon, Miguel

    2015-02-01

    Recombinant adeno-associated viruses (rAAVs) are promising vectors in preclinical and clinical assays for the treatment of diseases with gene therapy strategies. Recent technological advances in amplification and purification have allowed the production of highly purified rAAV vector preparations. Although quantitative polymerase chain reaction (qPCR) is the current method of choice for titrating rAAV genomes, it shows high variability. In this work, we report a rapid and robust rAAV titration method based on the quantitation of encapsidated DNA with the fluorescent dye PicoGreen®. This method allows detection from 3×10^10 viral genomes/ml up to 2.4×10^13 viral genomes/ml in a linear range. Contrasted with dot blot or qPCR, the PicoGreen-based assay has less intra- and interassay variability. Moreover, quantitation is rapid, does not require specific primers or probes, and is independent of the rAAV pseudotype analyzed. In summary, development of this universal rAAV-titering method may have substantive implications in rAAV technology.
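    Fluorescence-based quantitation of the kind this assay relies on reduces, computationally, to inverting a linear standard curve within the detector's linear range. The sketch below is a generic calibration illustration, not the authors' protocol; the function name and units are hypothetical.

```python
import numpy as np

def titer_from_fluorescence(std_conc, std_signal, sample_signal):
    """Fit a linear standard curve (signal = m*conc + b) from standards of
    known concentration, then invert it to convert sample fluorescence
    readings into concentrations (e.g. viral genomes/ml)."""
    m, b = np.polyfit(std_conc, std_signal, 1)
    return (np.asarray(sample_signal, float) - b) / m
```

    Readings outside the calibrated linear range would need dilution before this inversion is meaningful.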

  7. Universal filtered multi-carrier system for asynchronous uplink transmission in optical access network

    NASA Astrophysics Data System (ADS)

    Kang, Soo-Min; Kim, Chang-Hun; Han, Sang-Kook

    2016-02-01

    In passive optical networks (PONs), orthogonal frequency division multiplexing (OFDM) has been studied actively due to advantages such as high spectral efficiency (SE), dynamic resource allocation in the time or frequency domain, and dispersion robustness. However, orthogonal frequency division multiple access (OFDMA)-PON requires tight synchronization among multiple access signals; otherwise, frequency orthogonality cannot be maintained. Its sidelobes also cause inter-channel interference (ICI) to adjacent channels. To prevent ICI caused by high sidelobes, a guard band (GB) is usually used, which degrades SE. Thus, OFDMA-PON is not suitable for asynchronous uplink transmission in an optical access network. In this paper, we propose an intensity modulation/direct detection (IM/DD) based universal filtered multi-carrier (UFMC) PON for asynchronous multiple access. UFMC applies subband filtering to subsets of subcarriers. Since this filtering reduces the sidelobes of each subband, UFMC can achieve better performance than OFDM. For the experimental demonstration, different sample delays were applied to the subbands to emulate asynchronous transmission conditions. As a result, the time-synchronization robustness of UFMC was verified in an asynchronous multiple access system.

  8. Emergent phases and critical behavior in a non-Markovian open quantum system

    NASA Astrophysics Data System (ADS)

    Cheung, H. F. H.; Patil, Y. S.; Vengalattore, M.

    2018-05-01

    Open quantum systems exhibit a range of novel out-of-equilibrium behavior due to the interplay between coherent quantum dynamics and dissipation. Of particular interest in these systems are driven, dissipative transitions, the emergence of dynamical phases with novel broken symmetries, and critical behavior that lies beyond the conventional paradigm of Landau-Ginzburg phenomenology. Here, we consider a parametrically driven two-mode system in the presence of non-Markovian system-reservoir interactions. We show that the non-Markovian dynamics modifies the phase diagram of this system, resulting in the emergence of a broken symmetry phase in a universality class that has no counterpart in the corresponding Markovian system. This emergent phase is accompanied by enhanced two-mode entanglement that remains robust at finite temperatures. Such reservoir-engineered dynamical phases can potentially shed light on universal aspects of dynamical phase transitions in a wide range of nonequilibrium systems, and aid in the development of techniques for the robust generation of entanglement and quantum correlations at finite temperatures with potential applications to quantum control, state preparation, and metrology.

  9. Universality in the Self Organized Critical behavior of a cellular model of superconducting vortex dynamics

    NASA Astrophysics Data System (ADS)

    Sun, Yudong; Vadakkan, Tegy; Bassler, Kevin

    2007-03-01

    We study the universality and robustness of variants of the simple model of superconducting vortex dynamics first introduced by Bassler and Paczuski in Phys. Rev. Lett. 81, 3761 (1998). The model is a coarse-grained model that captures the essential features of plastic vortex motion. It accounts for the repulsive interaction between vortices, the pinning of vortices at quenched disordered locations in the material, and the over-damped dynamics of the vortices that leads to tearing of the flux line lattice. We report the results of extensive simulations of the critical ``Bean state'' dynamics of the model. We find a phase diagram containing four distinct phases of dynamical behavior, including two phases with distinct Self Organized Critical (SOC) behavior. Exponents describing the avalanche scaling behavior in the two SOC phases are determined using finite-size scaling. The exponents are found to be robust within each phase and for different variants of the model. The difference in the scaling behavior of the two phases is also observed in the morphology of the avalanches.

  10. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    PubMed

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2018-03-01

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, is becoming increasingly popular. An attractive feature of multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require knowledge of the within-study correlations, which are usually unavailable; this limits standard inference approaches in practice. Riley et al. proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (i.e., when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show that a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of the individual pooled estimates themselves, the standard and robust variance estimators give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (e.g., m ≥ 50).
When the sample size is relatively small, we recommend the use of the robust method under the working independence assumption. We illustrate the proposed method through 2 meta-analyses. Copyright © 2017 John Wiley & Sons, Ltd.
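    The idea of a variance estimator that stays valid under a misspecified working model can be illustrated in the simplest univariate case: an inverse-variance-weighted pooled mean with both a model-based standard error and a sandwich-style robust one. This is a generic sketch of the sandwich principle, not the authors' multivariate estimator.

```python
import numpy as np

def pooled_with_robust_se(y, v):
    """Inverse-variance pooled mean of study estimates y with working
    variances v. Returns the pooled estimate, the model-based SE (correct
    only if v is right), and a sandwich (robust) SE built from observed
    residuals, which remains valid when v is misspecified."""
    y = np.asarray(y, float)
    w = 1.0 / np.asarray(v, float)
    beta = np.sum(w * y) / np.sum(w)
    se_model = np.sqrt(1.0 / np.sum(w))
    se_robust = np.sqrt(np.sum(w**2 * (y - beta)**2)) / np.sum(w)
    return beta, se_model, se_robust
```

    When the working variances understate the true spread, `se_robust` inflates accordingly while `se_model` does not, which is exactly the coverage failure the abstract describes.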

  11. How robust is a robust policy? A comparative analysis of alternative robustness metrics for supporting robust decision analysis.

    NASA Astrophysics Data System (ADS)

    Kwakkel, Jan; Haasnoot, Marjolijn

    2015-04-01

    In response to climate and socio-economic change, there is an increasing call in various policy domains for robust plans or policies, that is, plans or policies that perform well in a very large range of plausible futures. In the literature, a wide range of alternative robustness metrics can be found. The relative merit of these alternative conceptualizations of robustness has, however, received less attention. Evidently, different robustness metrics can result in different plans or policies being adopted. This paper investigates the consequences of several robustness metrics on decision making, illustrated here by the design of a flood risk management plan. A fictitious case, inspired by a river reach in the Netherlands, is used. The performance of this system in terms of casualties, damages, and costs for flood and damage mitigation actions is explored using a time horizon of 100 years, accounting for uncertainties pertaining to climate change and land use change. A set of candidate policy options is specified up front, including dike raising, dike strengthening, creating more space for the river, and flood-proof building and evacuation options. The overarching aim is to design an effective flood risk mitigation strategy that is designed from the outset to be adapted over time in response to how the future actually unfolds. To this end, the plan is based on the dynamic adaptive policy pathway approach (Haasnoot, Kwakkel et al. 2013) being used in the Dutch Delta Program. The policy problem is formulated as a multi-objective robust optimization problem (Kwakkel, Haasnoot et al. 2014). We solve this problem using several alternative robustness metrics, including both satisficing and regret-based robustness metrics. Satisficing robustness metrics focus on the performance of candidate plans across a large ensemble of plausible futures.
    Regret-based robustness metrics compare the performance of a candidate plan with the performance of other candidate plans across a large ensemble of plausible futures. Initial results suggest that the simplest satisficing metric, inspired by the signal-to-noise ratio, results in very risk-averse solutions. Other satisficing metrics, which handle the average performance and the dispersion around the average separately, provide substantial additional insight into the trade-off between the average performance and the dispersion around this average. In contrast, the regret-based metrics enhance insight into the relative merits of candidate plans, while being less clear on the average performance or the dispersion around this performance. These results suggest that it is beneficial to use multiple robustness metrics when doing a robust decision analysis study. Haasnoot, M., J. H. Kwakkel, W. E. Walker and J. Ter Maat (2013). "Dynamic Adaptive Policy Pathways: A New Method for Crafting Robust Decisions for a Deeply Uncertain World." Global Environmental Change 23(2): 485-498. Kwakkel, J. H., M. Haasnoot and W. E. Walker (2014). "Developing Dynamic Adaptive Policy Pathways: A computer-assisted approach for developing adaptive strategies for a deeply uncertain world." Climatic Change.
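    The contrast between satisficing and regret-based metrics described above can be sketched over a policies-by-futures performance array. The specific summaries below (threshold fraction, mean-plus-dispersion, maximum regret) are illustrative stand-ins, not the exact metrics evaluated in the study.

```python
import numpy as np

def robustness_metrics(performance, threshold):
    """performance: (n_policies, n_futures) array of an objective to MINIMIZE
    (e.g. expected damages). Returns one score per policy for three metrics."""
    perf = np.asarray(performance, float)
    # satisficing: fraction of futures in which the policy meets the threshold
    satisficing = (perf <= threshold).mean(axis=1)
    # signal-to-noise style: mean plus dispersion (lower is better)
    signal_noise = perf.mean(axis=1) + perf.std(axis=1)
    # regret: shortfall vs. the best policy in each future, summarized by the max
    regret = perf - perf.min(axis=0, keepdims=True)
    max_regret = regret.max(axis=1)
    return satisficing, signal_noise, max_regret
```

    Ranking the same ensemble by each of the three scores can select different policies, which is the core point of the abstract.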

  12. Conceptual problems in detecting the evolution of dark energy when using distance measurements

    NASA Astrophysics Data System (ADS)

    Bolejko, K.

    2011-01-01

    Context. Dark energy is now one of the most important and topical problems in cosmology. The first step to revealing its nature is to detect the evolution of dark energy or to prove beyond doubt that the cosmological constant is indeed constant. However, in the standard approach to cosmology, the Universe is described by the homogeneous and isotropic Friedmann models. Aims: We aim to show that in a perturbed universe (even if the perturbations vanish when averaged over sufficiently large scales) the distance-redshift relation is not the same as in the unperturbed universe. This has serious consequences for the study of the nature of dark energy and, as shown here, can impair its analysis. Methods: The analysis is based on two methods: the linear lensing approximation and the non-linear Szekeres Swiss-Cheese model. The inhomogeneity scale is ~50 Mpc, and both models have the same density fluctuations along the line of sight. Results: The comparison between the linear and non-linear methods shows that non-linear corrections are not negligible. When inhomogeneities are present, the distance changes by several percent. To show how this change influences the measurements of dark energy, ten future observations with 2% uncertainties are generated. It is shown that, using the standard methods (i.e., under the assumption of homogeneity), systematics due to inhomogeneities can distort the analysis and may lead to the conclusion that dark energy evolves when in fact it is constant (or vice versa). Conclusions: Therefore, if future observations are analysed only within the homogeneous framework, the impact of inhomogeneities (such as voids and superclusters) can be mistaken for evolving dark energy. Since the robust distinction between evolution and non-evolution of dark energy is the first step to understanding the nature of dark energy, a proper handling of inhomogeneities is essential.

  13. Robust tracking control of a magnetically suspended rigid body

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.; Cox, David E.

    1994-01-01

    This study is an application of H-infinity and mu-synthesis to the design of robust tracking controllers for the Large Angle Magnetic Suspension Test Facility. The modeling, design, analysis, simulation, and testing of a control law that guarantees tracking performance under external disturbances and model uncertainties is investigated. The types of uncertainties considered and the tracking performance metric used are discussed. This study demonstrates the trade-off between tracking performance at low frequencies and robustness at high frequencies. Two sets of controllers were designed and tested. The first set emphasized performance over robustness, while the second set traded off performance for robustness. Comparisons of simulation and test results are also included. Current simulation and experimental results indicate that reasonably good robust tracking performance can be attained for this system using a multivariable robust control approach.

  14. Defining Tsunami Magnitude as Measure of Potential Impact

    NASA Astrophysics Data System (ADS)

    Titov, V. V.; Tang, L.

    2016-12-01

    The goal of tsunami forecasting, as a system for predicting the potential impact of a tsunami at coastlines, requires a quick estimate of tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, estimating tsunami energy from the available measurements at coastal sea-level stations has carried significant uncertainties and has been virtually impossible in real time, before a tsunami impacts coastlines. The slow process of estimating tsunami magnitude, including the collection of vast amounts of coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scale in tsunami warning operations, and the uncertainties of the estimates made tsunami magnitudes difficult to use as a universal scale for tsunami analysis. Historically, the earthquake magnitude has been used as a proxy for tsunami impact estimates, since seismic data are available for real-time processing and ample seismic data are available for elaborate post-event analysis. This measure of tsunami impact carries significant uncertainties in quantitative impact estimates, since the relation between the earthquake energy and the generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing a robust tsunami magnitude that will be useful in tsunami warning as a quick estimate of tsunami impact and in post-event analysis as a universal scale for the inter-comparison of tsunamis. We present a method for estimating the tsunami magnitude based on tsunami energy and present an application of the magnitude analysis to several historical events for inter-comparison with existing methods.
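    An energy-based magnitude scale of the kind argued for above is, structurally, a logarithmic map from tsunami energy to a magnitude number. The sketch below shows that structure only; the constants a and b are placeholders, not values from the paper.

```python
import math

def tsunami_magnitude(energy_joules, a=12.0, b=1.5):
    """Hypothetical logarithmic magnitude scale: each unit of magnitude
    corresponds to a factor of 10**b in tsunami energy. The constants a
    (zero point) and b (scale) are illustrative assumptions."""
    return (math.log10(energy_joules) - a) / b
```

    The defining property of such a scale is that a fixed energy ratio always maps to the same magnitude difference, independent of the absolute energy.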

  15. Which System Variables Carry Robust Early Signs of Upcoming Phase Transition? An Ecological Example.

    PubMed

    Negahbani, Ehsan; Steyn-Ross, D Alistair; Steyn-Ross, Moira L; Aguirre, Luis A

    2016-01-01

    Growth of critical fluctuations prior to a catastrophic state transition is generally regarded as a universal phenomenon, providing a valuable early warning signal in dynamical systems. Using an ecological fisheries model of three populations (juvenile prey J, adult prey A and predator P), a recent study reported silent early warning signals obtained from the P and A populations prior to a saddle-node (SN) bifurcation, and thus concluded that early warning signals are not universal. By performing a full eigenvalue analysis of the same system we demonstrate that while the J and P populations undergo the SN bifurcation, A does not jump to a new state, so it is not expected to carry early warning signs. In contrast with the previous study, we capture a significant increase in the noise-induced fluctuations in the P population, but only on close approach to the bifurcation point; it is not clear why the P variance initially shows a decaying trend. Here we resolve this puzzle using observability measures from control theory. By computing the observability coefficient for the system from the recordings of each population considered one at a time, we are able to quantify their ability to describe changing internal dynamics. We demonstrate that precursor fluctuations are best observed using only the J variable, and also the P variable when close to the transition. Using observability analysis we are able to explain why a poorly observable variable (P) has poor forecasting capabilities even though a full eigenvalue analysis shows that this variable undergoes a bifurcation. We conclude that observability analysis provides complementary information for identifying the variables carrying early-warning signs of an impending state transition.

  16. The Ethics of Introducing GMOs into sub-Saharan Africa: Considerations from the sub-Saharan African Theory of Ubuntu.

    PubMed

    Komparic, Ana

    2015-11-01

    A growing number of countries in sub-Saharan Africa are considering legalizing the growth of genetically modified organisms (GMOs). Furthermore, several projects are underway to develop transgenic crops tailored to the region. Given the contentious nature of GMOs and prevalent anti-GMO sentiments in Africa, a robust ethical analysis examining the concerns arising from the development, adoption, and regulation of GMOs in sub-Saharan Africa is warranted. To date, ethical analyses of GMOs in the global context have drawn predominantly on Western philosophy, dealing with Africa primarily on a material level. Yet, a growing number of scholars are articulating and engaging with ethical theories that draw upon sub-Saharan African value systems. One such theory, Ubuntu, is a well-studied sub-Saharan African communitarian morality. I propose that a robust ethical analysis of Africa's agricultural future necessitates engaging with African moral theory. I articulate how Ubuntu may lead to a novel and constructive understanding of the ethical considerations for introducing GMOs into sub-Saharan Africa. However, rather than reaching a definitive prescription, which would require significant engagement with local communities, I consider some of Ubuntu's broader implications for conceptualizing risk and engaging with local communities when evaluating GMOs. I conclude by reflecting on the implications of using local moral theory in bioethics by considering how one might negotiate between universalism and particularism in the global context. Rather than advocating for a form of ethical relativism, I suggest that local moral theories shed light on salient ethical considerations that are otherwise overlooked. © 2015 John Wiley & Sons Ltd.

  17. A New Multielement Method for LA-ICP-MS Data Acquisition from Glacier Ice Cores.

    PubMed

    Spaulding, Nicole E; Sneed, Sharon B; Handley, Michael J; Bohleber, Pascal; Kurbatov, Andrei V; Pearce, Nicholas J; Erhardt, Tobias; Mayewski, Paul A

    2017-11-21

    To answer pressing new research questions about the rate and timing of abrupt climate transitions, a robust system for ultrahigh-resolution sampling of glacier ice is needed. Here, we present a multielement method of LA-ICP-MS analysis wherein an array of chemical elements is simultaneously measured from the same ablation area. Although multielement techniques are commonplace for high-concentration materials, prior to the development of this method, all LA-ICP-MS analyses of glacier ice involved a single element per ablation pass or spot. This new method, developed using the LA-ICP-MS system at the W. M. Keck Laser Ice Facility at the University of Maine Climate Change Institute, has already been used to shed light on our flawed understanding of natural levels of Pb in Earth's atmosphere.

  18. Sieve estimation in a Markov illness-death process under dual censoring.

    PubMed

    Boruvka, Audrey; Cook, Richard J

    2016-04-01

    Semiparametric methods are well established for the analysis of a progressive Markov illness-death process observed up to a noninformative right censoring time. However, often the intermediate and terminal events are censored in different ways, leading to a dual censoring scheme. In such settings, unbiased estimation of the cumulative transition intensity functions cannot be achieved without some degree of smoothing. To overcome this problem, we develop a sieve maximum likelihood approach for inference on the hazard ratio. A simulation study shows that the sieve estimator offers improved finite-sample performance over common imputation-based alternatives and is robust to some forms of dependent censoring. The proposed method is illustrated using data from cancer trials. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Scaling and universality in the human voice.

    PubMed

    Luque, Jordi; Luque, Bartolo; Lacasa, Lucas

    2015-04-06

    Speech is a distinctive complex feature of human capabilities. In order to understand the physics underlying speech production, in this work we empirically analyse the statistics of large human speech datasets spanning several languages. We first show that during speech the energy is unevenly released and power-law distributed, reporting a universal robust Gutenberg-Richter-like law in speech. We further show that such 'earthquakes in speech' show temporal correlations, as the interevent statistics are again power-law distributed. As this feature takes place in the intraphoneme range, we conjecture that the process responsible for this complex phenomenon is not cognitive, but resides in the physiological (mechanical) mechanisms of speech production. Moreover, we show that these waiting-time distributions are scale invariant under a renormalization group transformation, suggesting that the process of speech generation is indeed operating close to a critical point. These results are put in contrast with current paradigms in speech processing, which point towards low-dimensional deterministic chaos as the origin of nonlinear traits in speech fluctuations. As these latter fluctuations are indeed the aspects that humanize synthetic speech, these findings may have an impact on future speech synthesis technologies. Results are robust and independent of the communication language or the number of speakers, pointing towards a universal pattern and yet another hint of complexity in human speech. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
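    The Gutenberg-Richter-like law reported here is a power-law tail in the released energy. As an illustration (the synthetic data and the exponent α = 2 are assumptions for the sketch, not values from the paper), the standard continuous maximum-likelihood estimator recovers such a tail exponent:

```python
import math
import random

def powerlaw_mle(samples, xmin):
    """Maximum-likelihood estimate of alpha for a continuous power-law
    tail p(x) ~ x**(-alpha), x >= xmin (the Hill-type estimator)."""
    tail = [x for x in samples if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic check: draw "energies" from p(x) ~ x**-2 by inverse transform.
random.seed(0)
alpha_true, xmin = 2.0, 1.0
samples = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
           for _ in range(50000)]
alpha_hat = powerlaw_mle(samples, xmin)
```

On real speech data one would additionally have to choose xmin, e.g. by a goodness-of-fit scan, before reading off the exponent.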

  20. Using Public Data for Comparative Proteome Analysis in Precision Medicine Programs.

    PubMed

    Hughes, Christopher S; Morin, Gregg B

    2018-03-01

    Maximizing the clinical utility of information obtained in longitudinal precision medicine programs would benefit from robust comparative analyses to known information to assess biological features of patient material toward identifying the underlying features driving their disease phenotype. Herein, the potential for utilizing publicly deposited mass-spectrometry-based proteomics data to perform inter-study comparisons of cell-line or tumor-tissue materials is investigated. To investigate the robustness of comparison between MS-based proteomics studies carried out with different methodologies, deposited data representative of label-free (MS1) and isobaric tagging (MS2 and MS3 quantification) are utilized. In-depth quantitative proteomics data acquired from analysis of ovarian cancer cell lines revealed the robust recapitulation of observable gene expression dynamics between individual studies carried out using significantly different methodologies. The observed signatures enable robust inter-study clustering of cell line samples. In addition, the ability to classify and cluster tumor samples based on observed gene expression trends when using a single patient sample is established. With this analysis, relevant gene expression dynamics are obtained from a single patient tumor, in the context of a precision medicine analysis, by leveraging a large cohort of repository data as a comparator. Together, these data establish the potential for state-of-the-art MS-based proteomics data to serve as resources for robust comparative analyses in precision medicine applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Scalable Robust Principal Component Analysis Using Grassmann Averages.

    PubMed

    Hauberg, Søren; Feragen, Aasa; Enficiaud, Raffi; Black, Michael J

    2016-11-01

    In large datasets, manual data verification is impossible, and we must expect the number of outliers to increase with data size. While principal component analysis (PCA) can reduce data size, and scalable solutions exist, it is well-known that outliers can arbitrarily corrupt the results. Unfortunately, state-of-the-art approaches for robust PCA are not scalable. We note that in a zero-mean dataset, each observation spans a one-dimensional subspace, giving a point on the Grassmann manifold. We show that the average subspace corresponds to the leading principal component for Gaussian data. We provide a simple algorithm for computing this Grassmann Average (GA), and show that the subspace estimate is less sensitive to outliers than PCA for general distributions. Because averages can be efficiently computed, we immediately gain scalability. We exploit robust averaging to formulate the Robust Grassmann Average (RGA) as a form of robust PCA. The resulting Trimmed Grassmann Average (TGA) is appropriate for computer vision because it is robust to pixel outliers. The algorithm has linear computational complexity and minimal memory requirements. We demonstrate TGA for background modeling, video restoration, and shadow removal. We show scalability by performing robust PCA on the entire Star Wars IV movie; a task beyond any current method. Source code is available online.
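    The averaging idea in the abstract can be sketched as a fixed-point iteration on the unit sphere: each zero-mean observation contributes its direction, sign-aligned with the current estimate and weighted by its norm. A minimal sketch on synthetic 2-D data (the iteration count and data are illustrative assumptions; the paper's GA/TGA implementations differ in detail):

```python
import math
import random

def grassmann_average(X, iters=20):
    """Average of the 1-D subspaces spanned by zero-mean observations X:
    sign-align each unit direction with the current estimate, weight by
    the observation norm, sum, and renormalize until convergence."""
    d = len(X[0])
    norms = [math.sqrt(sum(v * v for v in x)) for x in X]
    U = [[v / n for v in x] for x, n in zip(X, norms)]  # unit directions
    q = U[0][:]                                          # initial guess
    for _ in range(iters):
        acc = [0.0] * d
        for u, w in zip(U, norms):
            s = 1.0 if sum(a * b for a, b in zip(u, q)) >= 0 else -1.0
            for j in range(d):
                acc[j] += s * w * u[j]
        nrm = math.sqrt(sum(v * v for v in acc))
        q = [v / nrm for v in acc]
    return q

# Synthetic check: points concentrated along the x-axis plus small noise.
random.seed(1)
data = []
for _ in range(500):
    t = random.gauss(0.0, 3.0)
    data.append([t + random.gauss(0.0, 0.3), random.gauss(0.0, 0.3)])
lead = grassmann_average(data)  # should be close to (+/-1, 0)
```

The trimmed variant (TGA) would replace the plain weighted sum with a robust (e.g. trimmed) average per coordinate.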

  2. Universal Rim Thickness in Unsteady Sheet Fragmentation.

    PubMed

    Wang, Y; Dandekar, R; Bustos, N; Poulain, S; Bourouiba, L

    2018-05-18

    Unsteady fragmentation of a fluid bulk into droplets is important for epidemiology as it governs the transport of pathogens from sneezes and coughs, or from contaminated crops in agriculture. It is also ubiquitous in industrial processes such as paint, coating, and combustion. Unsteady fragmentation is distinct from steady fragmentation on which most theoretical efforts have been focused thus far. We address this gap by studying a canonical unsteady fragmentation process: the breakup from a drop impact on a finite surface where the drop fluid is transferred to a free expanding sheet of time-varying properties and bounded by a rim of time-varying thickness. The continuous rim destabilization selects the final spray droplets, yet this process remains poorly understood. We combine theory with advanced image analysis to study the unsteady rim destabilization. We show that, at all times, the rim thickness is governed by a local instantaneous Bond number equal to unity, defined with the instantaneous, local, unsteady rim acceleration. This criterion is found to be robust and universal for a family of unsteady inviscid fluid sheet fragmentation phenomena, from impacts of drops on various surface geometries to impacts on films. We discuss under which viscous and viscoelastic conditions the criterion continues to govern the unsteady rim thickness.

  3. Universal Rim Thickness in Unsteady Sheet Fragmentation

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Dandekar, R.; Bustos, N.; Poulain, S.; Bourouiba, L.

    2018-05-01

    Unsteady fragmentation of a fluid bulk into droplets is important for epidemiology as it governs the transport of pathogens from sneezes and coughs, or from contaminated crops in agriculture. It is also ubiquitous in industrial processes such as paint, coating, and combustion. Unsteady fragmentation is distinct from steady fragmentation on which most theoretical efforts have been focused thus far. We address this gap by studying a canonical unsteady fragmentation process: the breakup from a drop impact on a finite surface where the drop fluid is transferred to a free expanding sheet of time-varying properties and bounded by a rim of time-varying thickness. The continuous rim destabilization selects the final spray droplets, yet this process remains poorly understood. We combine theory with advanced image analysis to study the unsteady rim destabilization. We show that, at all times, the rim thickness is governed by a local instantaneous Bond number equal to unity, defined with the instantaneous, local, unsteady rim acceleration. This criterion is found to be robust and universal for a family of unsteady inviscid fluid sheet fragmentation phenomena, from impacts of drops on various surface geometries to impacts on films. We discuss under which viscous and viscoelastic conditions the criterion continues to govern the unsteady rim thickness.
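    The unit-Bond-number criterion stated in the abstract, Bo = ρ b(t)² a(t)/σ = 1 with a(t) the instantaneous rim acceleration, can be inverted directly for the rim thickness b(t) = sqrt(σ/(ρ a(t))). A minimal sketch with hypothetical fluid properties and rim deceleration (the numbers are illustrative, not values from the paper):

```python
import math

def rim_thickness(sigma, rho, accel):
    """Rim thickness b implied by a unit instantaneous Bond number:
    Bo = rho * b**2 * accel / sigma = 1  =>  b = sqrt(sigma / (rho * accel))."""
    return math.sqrt(sigma / (rho * accel))

# Hypothetical values: a water-like fluid (sigma = 0.072 N/m,
# rho = 1000 kg/m^3) and a rim deceleration of 1000 m/s^2.
b = rim_thickness(0.072, 1000.0, 1000.0)  # on the order of 0.3 mm
```

In the unsteady setting both accel and hence b vary in time, so the formula would be evaluated at each instant of the sheet expansion.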

  4. Simple, quick and cost-efficient: A universal RT-PCR and sequencing strategy for genomic characterisation of foot-and-mouth disease viruses.

    PubMed

    Dill, V; Beer, M; Hoffmann, B

    2017-08-01

    Foot-and-mouth disease (FMD) is a major contributor to poverty and food insecurity in Africa and Asia, and it is one of the biggest threats to agriculture in highly developed countries. As FMD is extremely contagious, strategies for its prevention, early detection, and the immediate characterisation of outbreak strains are of great importance. The generation of whole-genome sequences enables phylogenetic characterisation, the epidemiological tracing of virus transmission pathways and is supportive in disease control strategies. This study describes the development and validation of a rapid, universal and cost-efficient RT-PCR system to generate genome sequences of FMDV, reaching from the IRES to the end of the open reading frame. The method was evaluated using twelve different virus strains covering all seven serotypes of FMDV. Additionally, samples from experimentally infected animals were tested to mimic diagnostic field samples. All primer pairs showed a robust amplification with a high sensitivity for all serotypes. In summary, the described assay is suitable for the generation of FMDV sequences from all serotypes to allow immediate phylogenetic analysis, detailed genotyping and molecular epidemiology. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Actigraphy-Derived Daily Rest-Activity Patterns and Body Mass Index in Community-Dwelling Adults.

    PubMed

    Cespedes Feliciano, Elizabeth M; Quante, Mirja; Weng, Jia; Mitchell, Jonathan A; James, Peter; Marinac, Catherine R; Mariani, Sara; Redline, Susan; Kerr, Jacqueline; Godbole, Suneeta; Manteiga, Alicia; Wang, Daniel; Hipp, J Aaron

    2017-12-01

    To examine associations between 24-hour rest-activity patterns and body mass index (BMI) among community-dwelling US adults. Rest-activity patterns provide a field method to study exposures related to circadian rhythms. Adults (N = 578) wore an actigraph on their nondominant wrist for 7 days. Intradaily variability and interdaily stability (IS), M10 (most active 10-hours), L5 (least active 5-hours), and relative amplitude (RA) were derived using nonparametric rhythm analysis. Mesor, acrophase, and amplitude were calculated from log-transformed count data using the parametric cosinor approach. Participants were 80% female and mean (standard deviation) age was 52 (15) years. Participants with higher BMI had lower values for magnitude, RA, IS, total sleep time (TST), and sleep efficiency. In multivariable analyses, less robust 24-hour rest-activity patterns as represented by lower RA were consistently associated with higher BMI: comparing the bottom quintile (least robust) to the top quintile (most robust 24-hour rest-activity pattern) of RA, BMI was 3-kg/m2 higher (p = .02). Associations were similar in magnitude to an hour less of TST (1-kg/m2 higher BMI) or a 10% decrease in sleep efficiency (2-kg/m2 higher BMI), and independent of age, sex, race, education, and the duration of rest and/or activity. Lower RA, reflecting both higher night activity and lower daytime activity, was associated with higher BMI. Independent of the duration of rest or activity during the day or night, 24-hour rest, and activity patterns from actigraphy provide aggregated measures of activity that associate with BMI in community-dwelling adults. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.
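    The nonparametric summary measures named in the abstract are simple window statistics over the 24-hour activity profile: M10 is the mean of the most active 10-hour window, L5 the mean of the least active 5-hour window, and RA = (M10 - L5)/(M10 + L5). A minimal sketch on a hypothetical day of hourly counts:

```python
def window_means(counts, width):
    """Mean activity over every wrap-around window of `width` hours."""
    n = len(counts)
    return [sum(counts[(i + j) % n] for j in range(width)) / width
            for i in range(n)]

def rest_activity_summary(hourly_counts):
    """M10 (most active 10 h), L5 (least active 5 h) and relative
    amplitude RA = (M10 - L5) / (M10 + L5) from 24 hourly counts."""
    m10 = max(window_means(hourly_counts, 10))
    l5 = min(window_means(hourly_counts, 5))
    return m10, l5, (m10 - l5) / (m10 + l5)

# Hypothetical day: active 08:00-20:00 (counts 100), quiet otherwise (counts 5).
day = [5] * 8 + [100] * 12 + [5] * 4
m10, l5, ra = rest_activity_summary(day)
```

Real actigraphy pipelines work at epoch (e.g. 1-minute) resolution and average the windows across days, but the window logic is the same.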

  6. Robust 2DPCA with non-greedy l1-norm maximization for image analysis.

    PubMed

    Wang, Rong; Nie, Feiping; Yang, Xiaojun; Gao, Feifei; Yao, Minli

    2015-05-01

    2-D principal component analysis based on the l1-norm (2DPCA-L1) is a recently developed approach for robust dimensionality reduction and feature extraction in the image domain. Normally, a greedy strategy is applied because directly solving the l1-norm maximization problem is difficult; this strategy, however, easily gets stuck in a local solution. In this paper, we propose a robust 2DPCA with non-greedy l1-norm maximization in which all projection directions are optimized simultaneously. Experimental results on face and other datasets confirm the effectiveness of the proposed approach.
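    For a single projection direction, where the greedy and non-greedy formulations coincide, the l1-norm objective sum_i ||X_i w||_1 can be maximized with the well-known sign fixed-point update. A minimal stdlib sketch on synthetic data (the data and iteration count are assumptions; the paper's contribution is the simultaneous multi-direction case, which is not shown here):

```python
import math
import random

def tdpca_l1_direction(images, iters=30):
    """One projection direction w maximizing sum_i ||X_i w||_1 via the
    standard sign fixed-point update: accumulate sign-aligned rows,
    then renormalize."""
    cols = len(images[0][0])
    w = [1.0 / math.sqrt(cols)] * cols
    for _ in range(iters):
        g = [0.0] * cols
        for X in images:
            for row in X:
                p = sum(a * b for a, b in zip(row, w))   # row projection
                s = 1.0 if p >= 0 else -1.0
                for j in range(cols):
                    g[j] += s * row[j]
        nrm = math.sqrt(sum(v * v for v in g))
        w = [v / nrm for v in g]
    return w

# Synthetic 8x2 "images" whose rows vary mostly along column 0.
random.seed(2)
imgs = [[[random.gauss(0.0, 5.0), random.gauss(0.0, 0.5)] for _ in range(8)]
        for _ in range(40)]
w = tdpca_l1_direction(imgs)  # should be close to (+/-1, 0)
```

The non-greedy algorithm of the paper updates all columns of W at once under an orthonormality constraint instead of extracting directions one by one.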

  7. Analysis and Design of Launch Vehicle Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Du, Wei; Whorton, Mark

    2008-01-01

    This paper describes the fundamental principles of launch vehicle flight control analysis and design. In particular, the classical concept of "drift-minimum" and "load-minimum" control principles is re-examined and its performance and stability robustness with respect to modeling uncertainties and a gimbal angle constraint is discussed. It is shown that an additional feedback of angle-of-attack or lateral acceleration can significantly improve the overall performance and robustness, especially in the presence of unexpected large wind disturbance. Non-minimum-phase structural filtering of "unstably interacting" bending modes of large flexible launch vehicles is also shown to be effective and robust.

  8. Analysis of Infrared Signature Variation and Robust Filter-Based Supersonic Target Detection

    PubMed Central

    Sun, Sun-Gu; Kim, Kyung-Tae

    2014-01-01

    The difficulty of small infrared target detection originates from the variations of infrared signatures. This paper presents the fundamental physics of infrared target variations and reports the results of variation analysis of infrared images acquired using a long wave infrared camera over a 24-hour period for different types of backgrounds. The detection parameters, such as signal-to-clutter ratio were compared according to the recording time, temperature and humidity. Through variation analysis, robust target detection methodologies are derived by controlling thresholds and designing a temporal contrast filter to achieve high detection rate and low false alarm rate. Experimental results validate the robustness of the proposed scheme by applying it to the synthetic and real infrared sequences. PMID:24672290

  9. Respiratory motion correction in dynamic MRI using robust data decomposition registration - application to DCE-MRI.

    PubMed

    Hamy, Valentin; Dikaios, Nikolaos; Punwani, Shonit; Melbourne, Andrew; Latifoltojar, Arash; Makanyanga, Jesica; Chouhan, Manil; Helbren, Emma; Menys, Alex; Taylor, Stuart; Atkinson, David

    2014-02-01

    Motion correction in Dynamic Contrast Enhanced (DCE-) MRI is challenging because rapid intensity changes can compromise common (intensity based) registration algorithms. In this study we introduce a novel registration technique based on robust principal component analysis (RPCA) to decompose a given time-series into a low rank and a sparse component. This allows robust separation of motion components that can be registered, from intensity variations that are left unchanged. This Robust Data Decomposition Registration (RDDR) is demonstrated on both simulated and a wide range of clinical data. Robustness to different types of motion and breathing choices during acquisition is demonstrated for a variety of imaged organs including liver, small bowel and prostate. The analysis of clinically relevant regions of interest showed both a decrease of error (15-62% reduction following registration) in tissue time-intensity curves and improved areas under the curve (AUC60) at early enhancement. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
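    The low-rank-plus-sparse split underlying RDDR can be illustrated on a toy matrix: alternate a rank-1 projection (the rank is known in this toy) with hard thresholding of the residual, so that M separates into a low-rank part L and a sparse part S. This is a didactic stand-in, not the RPCA solver used in the paper; the data, rank, and threshold are assumptions:

```python
import numpy as np

# Rank-1 "clean" component plus a sparse "outlier" component.
L0 = np.outer(np.linspace(1.0, 2.0, 20), np.linspace(1.0, 2.0, 20))
S0 = np.zeros((20, 20))
for i, j in [(1, 3), (5, 7), (10, 2), (14, 15), (19, 19)]:
    S0[i, j] = 10.0
M = L0 + S0

# Alternate a rank-1 SVD projection with hard thresholding of the residual.
S = np.zeros_like(M)
for _ in range(100):
    U, sv, Vt = np.linalg.svd(M - S, full_matrices=False)
    L = sv[0] * np.outer(U[:, 0], Vt[0])   # best rank-1 fit of M - S
    residual = M - L
    S = np.where(np.abs(residual) > 5.0, residual, 0.0)
```

In the registration setting, the columns of M would be vectorized frames of the time series: L captures the slowly varying anatomy to register, while S absorbs contrast-induced intensity changes.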

  10. Investigation on changes of modularity and robustness by edge-removal mutations in signaling networks.

    PubMed

    Truong, Cong-Doan; Kwon, Yung-Keun

    2017-12-21

    Biological networks consisting of molecular components and interactions are represented by a graph model. There have been some studies based on that model to analyze the relationship between structural characteristics and dynamical behaviors in signaling networks. However, little attention has been paid to changes of modularity and robustness in mutant networks. In this paper, we investigated the changes of modularity and robustness caused by edge-removal mutations in three signaling networks. We first observed that both the modularity and robustness increased on average in the mutant networks. However, the modularity change was negatively correlated with the robustness change. This implies that it is unlikely that both the modularity and the robustness values simultaneously increase under edge-removal mutations. Another interesting finding is that the modularity change was positively correlated with the degree, the number of feedback loops, and the edge betweenness of the removed edges, whereas the robustness change was negatively correlated with them. We note that these results were consistently observed in randomly structured networks. Additionally, we identified two groups of genes which are incident to the highly-modularity-increasing and the highly-robustness-decreasing edges with respect to the edge-removal mutations, respectively, and observed that they are likely to be central, forming a connected component of considerably large size. The gene-ontology enrichment of each of these gene groups was significantly different from that of the rest of the genes. Finally, we showed that the highly-robustness-decreasing edges can be promising edgetic drug targets, which validates the usefulness of our analysis. Taken together, the analysis of changes of robustness and modularity against edge-removal mutations can be useful to unravel novel dynamical characteristics underlying signaling networks.
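    The effect of an edge-removal mutation on modularity can be checked directly from Newman's definition Q = sum over communities c of (e_c/m - (d_c/2m)^2), where e_c is the number of intra-community edges, d_c the total degree of the community, and m the edge count. A minimal sketch on a hypothetical toy graph (two triangles joined by a bridge), where removing the bridge edge raises Q for the natural two-community partition:

```python
def modularity(edges, community):
    """Newman modularity Q of an undirected graph for a fixed partition."""
    m = len(edges)
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    q = 0.0
    for c in set(community.values()):
        nodes = {n for n, cc in community.items() if cc == c}
        e_c = sum(1 for u, v in edges if u in nodes and v in nodes)
        d_c = sum(deg[n] for n in nodes)
        q += e_c / m - (d_c / (2.0 * m)) ** 2
    return q

# Two triangles joined by a single bridge; removing the bridge is the
# "edge-removal mutation".
tri1 = [(0, 1), (1, 2), (0, 2)]
tri2 = [(3, 4), (4, 5), (3, 5)]
bridge = (2, 3)
part = {0: "a", 1: "a", 2: "a", 3: "b", 4: "b", 5: "b"}
q_before = modularity(tri1 + tri2 + [bridge], part)
q_after = modularity(tri1 + tri2, part)   # bridge removed: Q increases
```

The paper's analysis additionally re-optimizes the partition and measures dynamical robustness per mutant; this toy only shows the modularity side of the trade-off.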

  11. FAA Airport Design Competition for Universities

    NASA Technical Reports Server (NTRS)

    Sandy, Mary

    2008-01-01

    Raise awareness of the importance of airports to the National Airspace System infrastructure. Increase the involvement of the academic community in addressing airport operations and infrastructure issues and needs. Engage U.S. students in the conceptualization of applications, systems and equipment capable of addressing related challenges in a robust, reliable and comprehensive manner. Encourage U.S. undergraduate and graduate students to contribute innovative ideas and solutions to airport and runway safety issues. Provide the framework and incentives for quality educational experiences for university students. Develop an awareness of and an interest in airports as a vital and interesting area for engineering and technology careers.

  12. The wired generation: academic and social outcomes of electronic media use among university students.

    PubMed

    Jacobsen, Wade C; Forste, Renata

    2011-05-01

    Little is known about the influence of electronic media use on the academic and social lives of university students. Using time-diary and survey data, we explore the use of various types of electronic media among first-year students. Time-diary results suggest that the majority of students use electronic media to multitask. Robust regression results indicate a negative relationship between the use of various types of electronic media and first-semester grades. In addition, we find a positive association between social-networking-site use, cellular-phone communication, and face-to-face social interaction.

  13. To what degree does the missing-data technique influence the estimated growth in learning strategies over time? A tutorial example of sensitivity analysis for longitudinal data.

    PubMed

    Coertjens, Liesje; Donche, Vincent; De Maeyer, Sven; Vanthournout, Gert; Van Petegem, Peter

    2017-01-01

    Longitudinal data are almost always burdened with missing data. However, in educational and psychological research, there is a large discrepancy between methodological suggestions and research practice. The former suggests applying sensitivity analysis in order to assess the robustness of the results under varying assumptions regarding the mechanism generating the missing data. In research practice, however, participants with missing data are usually discarded by relying on listwise deletion. To help bridge the gap between methodological recommendations and applied research in the educational and psychological domain, this study provides a tutorial example of sensitivity analysis for latent growth analysis. The example data concern students' changes in learning strategies during higher education. One cohort of students in a Belgian university college was asked to complete the Inventory of Learning Styles-Short Version in three measurement waves. A substantial number of students did not participate on each occasion. Change over time in student learning strategies was assessed using eight missing-data techniques, which assume different mechanisms for missingness. The results indicated that, for some learning strategy subscales, growth estimates differed between the models. Guidelines for reporting the results from sensitivity analysis are synthesised and applied to the results from the tutorial example.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiebenga, J. H.; Atzema, E. H.; Boogaard, A. H. van den

    Robust design of forming processes using numerical simulations is gaining attention throughout the industry. In this work, it is demonstrated how robust optimization can assist in further stretching the limits of metal forming processes. A deterministic and a robust optimization study are performed, considering a stretch-drawing process of a hemispherical cup product. For the robust optimization study, both the effect of material and process scatter are taken into account. For quantifying the material scatter, samples of 41 coils of a drawing-quality forming steel have been collected. The stochastic material behavior is obtained by a hybrid approach, combining mechanical testing and texture analysis, and efficiently implemented in a metamodel-based optimization strategy. The deterministic and robust optimization results are subsequently presented and compared, demonstrating an increased process robustness and a decreased number of product rejects by application of the robust optimization approach.

  15. First Universities Allied for Essential Medicines (UAEM) Neglected Diseases and Innovation Symposium.

    PubMed

    Musselwhite, Laura W; Maciag, Karolina; Lankowski, Alex; Gretes, Michael C; Wellems, Thomas E; Tavera, Gloria; Goulding, Rebecca E; Guillen, Ethan

    2012-01-01

    Universities Allied for Essential Medicines organized its first Neglected Diseases and Innovation Symposium to address expanding roles of public sector research institutions in innovation in research and development of biomedical technologies for treatment of diseases, particularly neglected tropical diseases. Universities and other public research institutions are increasingly integrated into the pharmaceutical innovation system. Academic entities now routinely undertake robust high-throughput screening and medicinal chemistry research programs to identify lead compounds for small molecule drugs and novel drug targets. Furthermore, product development partnerships are emerging between academic institutions, non-profit entities, and biotechnology and pharmaceutical companies to create diagnostics, therapies, and vaccines for diseases of the poor. With not for profit mission statements, open access publishing standards, open source platforms for data sharing and collaboration, and a shift in focus to more translational research, universities and other public research institutions are well-placed to accelerate development of medical technologies, particularly for neglected tropical diseases.

  16. Regularization of the big bang singularity with random perturbations

    NASA Astrophysics Data System (ADS)

    Belbruno, Edward; Xue, BingKan

    2018-03-01

    We show how to regularize the big bang singularity in the presence of random perturbations modeled by Brownian motion using stochastic methods. We prove that the physical variables in a contracting universe dominated by a scalar field can be continuously and uniquely extended through the big bang as a function of time to an expanding universe only for a discrete set of values of the equation of state satisfying special co-prime number conditions. This result significantly generalizes a previous result (Xue and Belbruno 2014 Class. Quantum Grav. 31 165002) that did not model random perturbations. This result implies that the extension from a contracting to an expanding universe for the discrete set of co-prime equations of state is robust, which is a surprising result. Implications for a purely expanding universe are discussed, such as a non-smooth, randomly varying scale factor near the big bang.

  17. Robustness surfaces of complex networks

    NASA Astrophysics Data System (ADS)

    Manzano, Marc; Sahneh, Faryad; Scoglio, Caterina; Calle, Eusebi; Marzo, Jose Luis

    2014-09-01

    Although the robustness of complex networks has been extensively studied in the last decade, a unifying framework able to embrace all the proposed metrics is still lacking. In the literature there are two open issues related to this gap: (a) how to dimension several metrics to allow their summation and (b) how to weight each of the metrics. In this work we propose a solution for the two aforementioned problems by defining the R*-value and introducing the concept of the robustness surface (Ω). The rationale of our proposal is to make use of Principal Component Analysis (PCA). We first adjust the initial robustness of a network to 1. Secondly, we find the most informative robustness metric under a specific failure scenario. Then, we repeat the process for several percentages of failures and different realizations of the failure process. Lastly, we join these values to form the robustness surface, which allows the visual assessment of network robustness variability. Results show that a network presents different robustness surfaces (i.e., dissimilar shapes) depending on the failure scenario and the set of metrics. In addition, the robustness surface allows the robustness of different networks to be compared.

  18. Robustness surfaces of complex networks.

    PubMed

    Manzano, Marc; Sahneh, Faryad; Scoglio, Caterina; Calle, Eusebi; Marzo, Jose Luis

    2014-09-02

    Although the robustness of complex networks has been extensively studied in the last decade, a unifying framework able to embrace all the proposed metrics is still lacking. In the literature there are two open issues related to this gap: (a) how to dimension several metrics to allow their summation and (b) how to weight each of the metrics. In this work we propose a solution for the two aforementioned problems by defining the R*-value and introducing the concept of robustness surface (Ω). The rationale of our proposal is to make use of Principal Component Analysis (PCA). First, we normalize the initial robustness of a network to 1. Second, we find the most informative robustness metric under a specific failure scenario. Then, we repeat the process for several percentages of failures and different realizations of the failure process. Lastly, we join these values to form the robustness surface, which allows the visual assessment of network robustness variability. Results show that a network presents different robustness surfaces (i.e., dissimilar shapes) depending on the failure scenario and the set of metrics. In addition, the robustness surface allows the robustness of different networks to be compared.

  19. Robustness of Two Formulas to Correct Pearson Correlation for Restriction of Range

    ERIC Educational Resources Information Center

    Tran, Minh

    2011-01-01

    Many research studies involving Pearson correlations are conducted in settings where one of the two variables has a restricted range in the sample. For example, this situation occurs when tests are used for selecting candidates for employment or university admission. Often after selection, there is interest in correlating the selection variable,…
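    The abstract above is truncated and does not name the two correction formulas it compares. As background, one widely used option is Thorndike's Case II correction, which rescales a range-restricted correlation by the ratio of unrestricted to restricted standard deviations of the selection variable; treat this as an illustrative example rather than the study's method.

```python
import math

def thorndike_case2(r_restricted, sd_unrestricted, sd_restricted):
    """Correct a Pearson r computed in a range-restricted sample
    (e.g. only admitted candidates) using the ratio of the
    unrestricted to the restricted standard deviation of the
    selection variable."""
    u = sd_unrestricted / sd_restricted
    r = r_restricted
    return r * u / math.sqrt(1 - r * r + r * r * u * u)

# If selection halves the SD of the test score, an observed r of 0.30
# corresponds to roughly 0.53 in the unrestricted applicant pool.
corrected = thorndike_case2(0.30, 10.0, 5.0)
```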

  20. National Water Center opens on University of Alabama campus in Tuscaloosa

    Science.gov Websites

    The National Water Center will be the first-ever clearinghouse for research and operational forecasting of all water-related … The University of Alabama boasts a robust research program focused on the protection and restoration of the nation's water supply.

  1. Development in Reading and Math in Children from Different SES Backgrounds: The Moderating Role of Child Temperament

    ERIC Educational Resources Information Center

    Wang, Zhe; Soden, Brooke; Deater-Deckard, Kirby; Lukowski, Sarah L.; Schenker, Victoria J.; Willcutt, Erik G.; Thompson, Lee A.; Petrill, Stephen A.

    2017-01-01

    Socioeconomic risks (SES risks) are robust risk factors influencing children's academic development. However, it is unclear whether the effects of SES on academic development operate universally in all children equally or whether they vary differentially in children with particular characteristics. The current study aimed to explore children's…

  2. On Synchronous Distance Teaching in a Mathematics MS (Master of Science) Program

    ERIC Educational Resources Information Center

    Li, Kuiyuan; Amin, Raid; Uvah, Josaphat

    2011-01-01

    A fully online graduate program that was developed at the UWF (University of West Florida) has been successfully implemented using synchronous instruction since fall 2009. The hybrid nature of the developed model has proven to be of benefit to both face-to-face and distance students. Aside from the robustness of students' discussions and…

  3. The Female Fish Is More Responsive: Gender Moderates the BFLPE in the Domain of Science

    ERIC Educational Resources Information Center

    Plieninger, Hansjörg; Dickhäuser, Oliver

    2015-01-01

    Academic self-concept is positively related to individual achievement but negatively related to class- or school-average achievement: the big-fish--little-pond effect (BFLPE). This contrast effect results from social comparison processes. The BFLPE is known to be long-lasting, universal and robust. However, there is little evidence regarding its…

  4. Discovering Open Source Discovery: Using VuFind to Create MnPALS Plus

    ERIC Educational Resources Information Center

    Digby, Todd; Elfstrand, Stephen

    2011-01-01

    The goal of having a robust discovery system is a priority of the libraries the authors serve (both work at the Minnesota State Colleges and Universities). Given the current fiscal situation facing public higher education in their state, the current commercial systems were not affordable for most of their libraries. They decided to implement and…

  5. Class Size and Student Evaluations in Sweden

    ERIC Educational Resources Information Center

    Westerlund, Joakim

    2008-01-01

    This paper examines the effect of class size on student evaluations of the quality of an introductory mathematics course at Lund University in Sweden. In contrast to many other studies, we find a large negative, and statistically significant, effect of class size on the quality of the course. This result appears to be quite robust, as almost all…

  6. Developing Uncertainty Models for Robust Flutter Analysis Using Ground Vibration Test Data

    NASA Technical Reports Server (NTRS)

    Potter, Starr; Lind, Rick; Kehoe, Michael W. (Technical Monitor)

    2001-01-01

    A ground vibration test can be used to obtain information about structural dynamics that is important for flutter analysis. Traditionally, this information, such as natural frequencies of modes, is used to update analytical models used to predict flutter speeds. The ground vibration test can also be used to obtain uncertainty models, such as natural frequencies and their associated variations, that can update analytical models for the purpose of predicting robust flutter speeds. Analyzing test data using the ∞-norm, rather than the traditional 2-norm, is shown to lead to a minimum-size uncertainty description and, consequently, a least-conservative robust flutter speed. This approach is demonstrated using ground vibration test data for the Aerostructures Test Wing. Different norms are used to formulate uncertainty models and their associated robust flutter speeds to evaluate which norm is least conservative.
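    A minimal numerical illustration of the norm comparison discussed above: for a vector of modal-frequency deviations, the ∞-norm (worst single deviation) is never larger than the 2-norm, which is why it yields the smaller uncertainty description. The GVT numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical ground-vibration-test data: first-mode natural frequency
# (Hz) from repeated runs, scattered about the nominal model value.
nominal_hz = 14.1
measured_hz = np.array([14.05, 14.22, 13.98, 14.18, 14.10])
dev = measured_hz - nominal_hz

# Uncertainty level implied by each norm of the deviation vector.  The
# infinity-norm keeps only the worst single deviation, so it is never
# larger than the 2-norm and gives the smaller uncertainty description.
bound_inf = np.max(np.abs(dev))   # infinity-norm
bound_2 = np.linalg.norm(dev)     # 2-norm
```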

  7. Robust Mokken Scale Analysis by Means of the Forward Search Algorithm for Outlier Detection

    ERIC Educational Resources Information Center

    Zijlstra, Wobbe P.; van der Ark, L. Andries; Sijtsma, Klaas

    2011-01-01

    Exploratory Mokken scale analysis (MSA) is a popular method for identifying scales from larger sets of items. As with any statistical method, in MSA the presence of outliers in the data may result in biased results and wrong conclusions. The forward search algorithm is a robust diagnostic method for outlier detection, which we adapt here to…

  8. Topological robustness analysis of protein interaction networks reveals key targets for overcoming chemotherapy resistance in glioma

    NASA Astrophysics Data System (ADS)

    Azevedo, Hátylas; Moreira-Filho, Carlos Alberto

    2015-11-01

    Biological networks display high robustness against random failures but are vulnerable to targeted attacks on central nodes. Thus, network topology analysis represents a powerful tool for investigating network susceptibility against targeted node removal. Here, we built protein interaction networks associated with chemoresistance to temozolomide, an alkylating agent used in glioma therapy, and analyzed their modular structure and robustness against intentional attack. These networks showed functional modules related to DNA repair, immunity, apoptosis, cell stress, proliferation and migration. Subsequently, network vulnerability was assessed by means of centrality-based attacks based on the removal of node fractions in descending orders of degree, betweenness, or the product of degree and betweenness. This analysis revealed that removing nodes with high degree and high betweenness was more effective in altering networks’ robustness parameters, suggesting that their corresponding proteins may be particularly relevant to target temozolomide resistance. In silico data was used for validation and confirmed that central nodes are more relevant for altering proliferation rates in temozolomide-resistant glioma cell lines and for predicting survival in glioma patients. Altogether, these results demonstrate how the analysis of network vulnerability to topological attack facilitates target prioritization for overcoming cancer chemoresistance.
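    The centrality-based attack protocol above (remove node fractions in descending centrality order, then measure what survives) can be sketched without any graph library. Using the giant-component fraction as the robustness readout is one common choice, not necessarily the exact parameter set used in the paper.

```python
from collections import deque

def giant_component_fraction(adj, removed):
    """Fraction of surviving nodes that sit in the largest connected
    component after deleting the nodes in `removed`."""
    alive = set(adj) - removed
    seen, best = set(), 0
    for start in alive:
        if start in seen:
            continue
        seen.add(start)
        size, queue = 0, deque([start])
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v in alive and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best / len(alive) if alive else 0.0

def degree_attack(adj, fraction):
    """Remove the top `fraction` of nodes in descending degree order,
    the simplest centrality-based attack; betweenness or
    degree-times-betweenness orderings drop in the same way."""
    order = sorted(adj, key=lambda n: len(adj[n]), reverse=True)
    removed = set(order[:int(fraction * len(order))])
    return giant_component_fraction(adj, removed)
```

    On a 10-node star graph, removing the top 10% of nodes (the hub) shatters the network, while random removal of a leaf would barely change it.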

  9. Gradient descent for robust kernel-based regression

    NASA Astrophysics Data System (ADS)

    Guo, Zheng-Chu; Hu, Ting; Shi, Lei

    2018-06-01

    In this paper, we study the gradient descent algorithm generated by a robust loss function over a reproducing kernel Hilbert space (RKHS). The loss function is defined by a windowing function G and a scale parameter σ, which can include a wide range of commonly used robust losses for regression. There is still a gap between the theoretical analysis and the optimization process of empirical risk minimization based on this loss: the estimator needs to be globally optimal in the theoretical analysis, while the optimization method cannot ensure the global optimality of its solutions. In this paper, we aim to fill this gap by developing a novel theoretical analysis of the performance of estimators generated by the gradient descent algorithm. We demonstrate that with an appropriately chosen scale parameter σ, the gradient update with early stopping rules can approximate the regression function. Our elegant error analysis leads to convergence in the standard L2 norm and the strong RKHS norm, both of which are optimal in the minimax sense. We show that the scale parameter σ plays an important role in providing robustness as well as fast convergence. Numerical experiments on synthetic examples and a real data set also support our theoretical results.
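    A sketch of the algorithm class studied above: gradient descent on kernel-expansion coefficients under a robust loss from the windowing-function family. The Welsch loss is used here as one example; the kernel width, step size, and fixed iteration budget (standing in for an early-stopping rule) are all illustrative choices, not the paper's settings.

```python
import numpy as np

def gaussian_kernel(X, Y, width=0.3):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def robust_kernel_gd(X, y, sigma=1.0, step=0.5, iters=500):
    """Gradient descent on the coefficients of f = sum_i alpha_i K(x_i, .)
    under the Welsch loss l(r) = (sigma^2/2)(1 - exp(-r^2/sigma^2)).
    Its derivative l'(r) = r exp(-r^2/sigma^2) shrinks to zero for
    large residuals, so gross outliers stop driving the update."""
    K = gaussian_kernel(X, X)
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(iters):
        r = K @ alpha - y                  # residuals f(x_i) - y_i
        alpha -= step * (r * np.exp(-r ** 2 / sigma ** 2)) / n
    return alpha, K

# A constant target with one gross outlier: the fit tracks the clean
# points and simply ignores the corrupted one.
X = np.linspace(0, 1, 20)[:, None]
y = np.ones(20)
y[5] = 100.0
alpha, K = robust_kernel_gd(X, y)
f_hat = K @ alpha
```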

  10. Robustness of meta-analyses in finding gene × environment interactions

    PubMed Central

    Shi, Gang; Nehorai, Arye

    2017-01-01

    Meta-analyses that synthesize statistical evidence across studies have become important analytical tools for genetic studies. Inspired by the success of genome-wide association studies of the genetic main effect, researchers are searching for gene × environment interactions. Confounders are routinely included in genome-wide gene × environment interaction analysis as covariates; however, this does not control for any confounding effects on the results if covariate × environment interactions are present. We carried out simulation studies to evaluate the robustness to the covariate × environment confounder of meta-regression and joint meta-analysis, which are two commonly used meta-analysis methods for testing the gene × environment interaction or the genetic main effect and interaction jointly. Here we show that meta-regression is robust to the covariate × environment confounder while joint meta-analysis is subject to the confounding effect with inflated type I error rates. Given the vast sample sizes employed in genome-wide gene × environment interaction studies, non-significant covariate × environment interactions at the study level could substantially elevate the type I error rate at the consortium level. When covariate × environment confounders are present, type I errors can be controlled in joint meta-analysis by including the covariate × environment terms in the analysis at the study level. Alternatively, meta-regression can be applied, which is robust to potential covariate × environment confounders. PMID:28362796
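    Both strategies compared above build on inverse-variance pooling of per-study estimates; a minimal fixed-effects version is sketched below. The study estimates in the usage comment are made up for illustration.

```python
import math

def fixed_effects_meta(estimates, std_errors):
    """Inverse-variance fixed-effects pooling of per-study estimates
    (e.g. G x E interaction coefficients): weight each study by
    1/SE^2, so precise studies dominate the pooled value."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
    return pooled, math.sqrt(1.0 / sum(weights))

# Two hypothetical studies reporting interaction estimates 0.2 and 0.4
# with standard errors 0.1 and 0.2; the more precise study pulls the
# pooled estimate toward itself.
pooled, pooled_se = fixed_effects_meta([0.2, 0.4], [0.1, 0.2])
```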

  11. Robustness of Reconstructed Ancestral Protein Functions to Statistical Uncertainty.

    PubMed

    Eick, Geeta N; Bridgham, Jamie T; Anderson, Douglas P; Harms, Michael J; Thornton, Joseph W

    2017-02-01

    Hypotheses about the functions of ancient proteins and the effects of historical mutations on them are often tested using ancestral protein reconstruction (APR)-phylogenetic inference of ancestral sequences followed by synthesis and experimental characterization. Usually, some sequence sites are ambiguously reconstructed, with two or more statistically plausible states. The extent to which the inferred functions and mutational effects are robust to uncertainty about the ancestral sequence has not been studied systematically. To address this issue, we reconstructed ancestral proteins in three domain families that have different functions, architectures, and degrees of uncertainty; we then experimentally characterized the functional robustness of these proteins when uncertainty was incorporated using several approaches, including sampling amino acid states from the posterior distribution at each site and incorporating the alternative amino acid state at every ambiguous site in the sequence into a single "worst plausible case" protein. In every case, qualitative conclusions about the ancestral proteins' functions and the effects of key historical mutations were robust to sequence uncertainty, with similar functions observed even when scores of alternate amino acids were incorporated. There was some variation in quantitative descriptors of function among plausible sequences, suggesting that experimentally characterizing robustness is particularly important when quantitative estimates of ancient biochemical parameters are desired. The worst plausible case method appears to provide an efficient strategy for characterizing the functional robustness of ancestral proteins to large amounts of sequence uncertainty. Sampling from the posterior distribution sometimes produced artifactually nonfunctional proteins for sequences reconstructed with substantial ambiguity. © The Author 2016. 
Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
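    The "worst plausible case" construction described above can be sketched directly: at every ambiguously reconstructed site, swap in the best alternative amino acid state. The per-site posterior format and the 0.2 plausibility threshold are illustrative assumptions.

```python
def worst_plausible_case(posteriors, threshold=0.2):
    """From per-site posterior probabilities, build the maximum
    likelihood ancestral sequence and the single 'worst plausible
    case' variant carrying the best alternative state at every
    ambiguous site (runner-up posterior >= threshold)."""
    ml, worst = [], []
    for site in posteriors:
        ranked = sorted(site.items(), key=lambda kv: kv[1], reverse=True)
        ml.append(ranked[0][0])
        if len(ranked) > 1 and ranked[1][1] >= threshold:
            worst.append(ranked[1][0])   # swap in the alternative state
        else:
            worst.append(ranked[0][0])   # unambiguous site: keep ML state
    return "".join(ml), "".join(worst)
```

    For posteriors `[{'A': 0.9, 'G': 0.1}, {'L': 0.6, 'M': 0.4}, {'K': 1.0}]` this yields `'ALK'` and `'AMK'`: only the genuinely ambiguous middle site is swapped.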

  12. Investigation of progressive failure robustness and alternate load paths for damage tolerant structures

    NASA Astrophysics Data System (ADS)

    Marhadi, Kun Saptohartyadi

    Structural optimization for damage tolerance under various unforeseen damage scenarios is computationally challenging. It couples non-linear progressive failure analysis with sampling-based stochastic analysis of random damage. The goal of this research was to understand the relationship between alternate load paths available in a structure and its damage tolerance, and to use this information to develop computationally efficient methods for designing damage tolerant structures. Progressive failure of a redundant truss structure subjected to small random variability was investigated to identify features that correlate with robustness and predictability of the structure's progressive failure. The identified features were used to develop numerical surrogate measures that permit computationally efficient deterministic optimization to achieve robustness and predictability of progressive failure. Analysis of damage tolerance on designs with robust progressive failure indicated that robustness and predictability of progressive failure do not guarantee damage tolerance. Damage tolerance requires a structure to redistribute its load to alternate load paths. In order to investigate the load distribution characteristics that lead to damage tolerance in structures, designs with varying degrees of damage tolerance were generated using brute-force stochastic optimization. A method based on principal component analysis was used to describe load distributions (alternate load paths) in the structures. Results indicate that a structure that can develop alternate paths is not necessarily damage tolerant. The alternate load paths must have a required minimum load capability. Robustness analysis of damage tolerant optimum designs indicates that designs are tailored to the specified damage. A design optimized under one damage specification can be sensitive to other damage scenarios not considered.
Effectiveness of existing load path definitions and characterizations were investigated for continuum structures. A load path definition using a relative compliance change measure (U* field) was demonstrated to be the most useful measure of load path. This measure provides quantitative information on load path trajectories and qualitative information on the effectiveness of the load path. The use of the U* description of load paths in optimizing structures for effective load paths was investigated.

  13. Robust L1-norm two-dimensional linear discriminant analysis.

    PubMed

    Li, Chun-Na; Shao, Yuan-Hai; Deng, Nai-Yang

    2015-05-01

    In this paper, we propose an L1-norm two-dimensional linear discriminant analysis (L1-2DLDA) with robust performance. Different from the conventional two-dimensional linear discriminant analysis with L2-norm (L2-2DLDA), where the optimization problem is transferred to a generalized eigenvalue problem, the optimization problem in our L1-2DLDA is solved by a simple justifiable iterative technique, and its convergence is guaranteed. Compared with L2-2DLDA, our L1-2DLDA is more robust to outliers and noise since the L1-norm is used. This is supported by our preliminary experiments on a toy example and face datasets, which show the improvement of our L1-2DLDA over L2-2DLDA. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. When Can Categorical Variables Be Treated as Continuous? A Comparison of Robust Continuous and Categorical SEM Estimation Methods under Suboptimal Conditions

    ERIC Educational Resources Information Center

    Rhemtulla, Mijke; Brosseau-Liard, Patricia E.; Savalei, Victoria

    2012-01-01

    A simulation study compared the performance of robust normal theory maximum likelihood (ML) and robust categorical least squares (cat-LS) methodology for estimating confirmatory factor analysis models with ordinal variables. Data were generated from 2 models with 2-7 categories, 4 sample sizes, 2 latent distributions, and 5 patterns of category…

  15. Robust dynamic inversion controller design and analysis (using the X-38 vehicle as a case study)

    NASA Astrophysics Data System (ADS)

    Ito, Daigoro

    A new way to approach robust Dynamic Inversion controller synthesis is addressed in this paper. A Linear Quadratic Gaussian outer-loop controller improves the robustness of a Dynamic Inversion inner-loop controller in the presence of uncertainties. Desired dynamics are given by the dynamic compensator, which shapes the loop. The selected dynamics are based on both performance and stability robustness requirements. These requirements are straightforwardly formulated as frequency-dependent singular value bounds during synthesis of the controller. Performance and robustness of the designed controller is tested using a worst case time domain quadratic index, which is a simple but effective way to measure robustness due to parameter variation. Using this approach, a lateral-directional controller for the X-38 vehicle is designed and its robustness to parameter variations and disturbances is analyzed. It is found that if full state measurements are available, the performance of the designed lateral-directional control system, measured by the chosen cost function, improves by approximately a factor of four. Also, it is found that the designed system is stable up to a parametric variation of 1.65 standard deviation with the set of uncertainty considered. The system robustness is determined to be highly sensitive to the dihedral derivative and the roll damping coefficients. The controller analysis is extended to the nonlinear system where both control input displacements and rates are bounded. In this case, the considered nonlinear system is stable up to 48.1° in bank angle and 1.59° in sideslip angle variations, indicating it is more sensitive to variations in sideslip angle than in bank angle. This nonlinear approach is further extended for the actuator failure mode analysis. The results suggest that the designed system maintains a high level of stability in the event of aileron failure. 
However, only 35% or less of the original stability range is maintained for the rudder failure case. Overall, this combination of controller synthesis and robustness criteria compares well with the mu-synthesis technique. It also is readily accessible to the practicing engineer, in terms of understanding and use.

  16. No-Reference Video Quality Assessment Based on Statistical Analysis in 3D-DCT Domain.

    PubMed

    Li, Xuelong; Guo, Qun; Lu, Xiaoqiang

    2016-05-13

    It is an important task to design models for universal no-reference video quality assessment (NR-VQA) in multiple video processing and computer vision applications. However, most existing NR-VQA metrics are designed for specific distortion types, which are often not known in practical applications. A further deficiency is that the spatial and temporal information of videos is hardly considered simultaneously. In this paper, we propose a new NR-VQA metric based on the spatiotemporal natural video statistics (NVS) in the 3D discrete cosine transform (3D-DCT) domain. In the proposed method, a set of features is first extracted based on the statistical analysis of 3D-DCT coefficients to characterize the spatiotemporal statistics of videos in different views. These features are then used to predict the perceived video quality via an efficient linear support vector regression (SVR) model. The contributions of this paper are: 1) we explore the spatiotemporal statistics of videos in the 3D-DCT domain, which has an inherent spatiotemporal encoding advantage over other widely used 2D transformations; 2) we extract a small set of simple but effective statistical features for video visual quality prediction; 3) the proposed method is universal for multiple types of distortions and robust to different databases. The proposed method is tested on four widely used video databases. Extensive experimental results demonstrate that the proposed method is competitive with state-of-the-art NR-VQA metrics and the top-performing FR-VQA and RR-VQA metrics.
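    The front end of such a method, a separable 3D DCT-II over a spatiotemporal block followed by simple statistics of the coefficients, can be sketched as follows. The two toy features stand in for the paper's richer NVS feature set, and the SVR stage is omitted.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.cos(np.pi * (2 * m + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    C[0] /= np.sqrt(2.0)
    return C

def dct3d(block):
    """Separable 3D DCT-II of a (frames, height, width) block,
    applied one axis at a time."""
    out = block.astype(float)
    for axis in range(3):
        C = dct_matrix(block.shape[axis])
        out = np.moveaxis(np.tensordot(C, np.moveaxis(out, axis, 0), axes=1), 0, axis)
    return out

def toy_nvs_features(block):
    """Spread and kurtosis of the AC coefficients: two toy
    spatiotemporal statistics standing in for the paper's feature set."""
    ac = dct3d(block).ravel()[1:]          # drop the DC coefficient
    z = (ac - ac.mean()) / (ac.std() + 1e-12)
    return ac.std(), float((z ** 4).mean())
```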

  17. Robustness analysis of elastoplastic structure subjected to double impulse

    NASA Astrophysics Data System (ADS)

    Kanno, Yoshihiro; Takewaki, Izuru

    2016-11-01

    The double impulse has extensively been used to evaluate the critical response of an elastoplastic structure against a pulse-type input, including near-fault earthquake ground motions. In this paper, we propose a robustness assessment method for elastoplastic single-degree-of-freedom structures subjected to the double impulse input. Uncertainties in the initial velocity of the input, as well as the natural frequency and the strength of the structure, are considered. As fundamental properties of the structural robustness, we show monotonicity of the robustness measure with respect to the natural frequency. In contrast, we show that robustness is not necessarily improved even if the structural strength is increased. Moreover, the robustness preference between two structures with different values of structural strength can possibly reverse when the performance requirement is changed.

  18. Practical robustness measures in multivariable control system analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Lehtomaki, N. A.

    1981-01-01

    The robustness of the stability of multivariable linear time-invariant feedback control systems with respect to model uncertainty is considered using frequency domain criteria. Available robustness tests are unified under a common framework based on the nature and structure of model errors. These results are derived using a multivariable version of Nyquist's stability theorem, in which the minimum singular value of the return difference transfer matrix is shown to be the multivariable generalization of the distance to the critical point on a single-input, single-output Nyquist diagram. Using the return difference transfer matrix, a very general robustness theorem is presented from which all of the robustness tests dealing with specific model errors may be derived. The robustness tests that explicitly utilize model error structure are able to guarantee feedback system stability in the face of model errors of larger magnitude than those that do not. The robustness of linear quadratic Gaussian control systems is analyzed.
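    The central quantity above, the minimum over frequency of the smallest singular value of the return difference matrix I + L(jω), can be computed directly from state-space data. The loop structure L = KC(jωI - A)⁻¹B and the toy first-order example are assumptions for illustration.

```python
import numpy as np

def min_return_difference(A, B, C, K, freqs):
    """Minimum over a frequency grid of the smallest singular value of
    the return difference matrix I + L(jw), with loop transfer
    L(jw) = K C (jwI - A)^-1 B; larger values indicate a larger
    multivariable stability margin."""
    n = A.shape[0]
    worst = np.inf
    for w in freqs:
        G = C @ np.linalg.solve(1j * w * np.eye(n) - A, B)
        L = K @ G
        s = np.linalg.svd(np.eye(L.shape[0]) + L, compute_uv=False)
        worst = min(worst, s[-1])
    return worst
```

    For a scalar loop such as L(jw) = 1/(jw + 1), this reduces to the classical distance from the Nyquist curve to the critical point -1.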

  19. Validation of the European Cyberbullying Intervention Project Questionnaire for Colombian Adolescents.

    PubMed

    Herrera-López, Mauricio; Casas, José A; Romera, Eva M; Ortega-Ruiz, Rosario; Del Rey, Rosario

    2017-02-01

    Cyberbullying is the act of using unjustified aggression to harm or harass via digital devices. Currently regarded as a widespread problem, the phenomenon has attracted growing research interest in different measures of cyberbullying and the similarities and differences across countries and cultures. This article presents the Colombian validation of the European Cyberbullying Intervention Project Questionnaire (ECIPQ) involving 3,830 high school students (M = 13.9 years old, standard deviation = 1.61; 48.9 percent male), of which 1,931 were Colombian and 1,899 Spanish. Confirmatory factor analysis (CFA), content validation, and multigroup analysis were performed with each of the sample subgroups. The optimal fits and psychometric properties obtained confirm the robustness and suitability of the assessment instrument to jointly measure cyber-aggression and cyber-victimization. The results corroborated the theoretical construct and the two-dimensional and universal nature of cyberbullying. The multigroup analysis showed that cyberbullying dynamics are similar in both countries. The comparative analyses of prevalence revealed that Colombian students are less involved in cyberbullying. The results indicate the suitability of the instrument and the advantages of using such a tool to evaluate and guide psychoeducational interventions aimed at preventing cyberbullying in countries where few studies have been performed.

  20. A Confirmatory Factor Analysis of the Student Evidence-Based Practice Questionnaire (S-EBPQ) in an Australian sample.

    PubMed

    Beccaria, Lisa; Beccaria, Gavin; McCosker, Catherine

    2018-03-01

    It is crucial that nursing students develop skills and confidence in using Evidence-Based Practice principles early in their education. This should be assessed with valid tools; however, to date, few measures have been developed and applied to the student population. The aim was to examine the structural validity of the Student Evidence-Based Practice Questionnaire (S-EBPQ) with an Australian online nursing student cohort, using a cross-sectional design to assess construct validity. Three hundred and forty-five undergraduate nursing students from an Australian regional university were recruited across two semesters. Confirmatory Factor Analysis was used to examine the structural validity and resulted in a good-fitting model based on a revised 20-item tool. The S-EBPQ remains a psychometrically robust measure of evidence-based practice use, attitudes, and knowledge and skills, and can be applied in an online Australian student context. The findings of this study provide further evidence of the reliability and four-factor structure of the S-EBPQ. Opportunities for further refinement of the tool may result in improvements in structural validity. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. The Trumorph® system: The new universal technique for the observation and analysis of the morphology of living sperm. [corrected].

    PubMed

    Soler, C; García-Molina, A; Contell, J; Silvestre, M A; Sancho, M

    2015-07-01

    Evaluation of sperm morphology is a fundamental component of semen analysis, but its real significance has been obscured by a plethora of techniques that involve fixation and staining procedures that induce artefacts. Here we describe Trumorph®, a new method for sperm morphology assessment that is based upon examination of wet preparations of living spermatozoa immobilized by a short 60°C shock using negative phase contrast microscopy. We have observed samples from five animals of the following species: bull, boar, goat and rabbit. In every case, all the components of the sperm head and tail were perfectly defined, including the acrosome and midpiece (in all its length, including cytoplasmic droplets). A range of morphological forms was observed, similar to those found by conventional fixed and stained preparations, but other forms were found, distinguishable only by the optics used. The ease of preparation makes it a robust method applicable for analysis of living unmodified spermatozoa in a range of situations. Subsequent studies on well-characterized samples are required to describe the morphology of potentially fertilizing spermatozoa. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Dimensionality-varied convolutional neural network for spectral-spatial classification of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Liu, Wanjun; Liang, Xuejian; Qu, Haicheng

    2017-11-01

    Hyperspectral image (HSI) classification is one of the most popular topics in the remote sensing community. Traditional and deep learning-based classification methods have been proposed constantly in recent years. In order to improve classification accuracy and robustness, a dimensionality-varied convolutional neural network (DVCNN) was proposed in this paper. DVCNN was a novel deep architecture based on the convolutional neural network (CNN). The input of DVCNN was a set of 3D patches selected from the HSI which contained spectral-spatial joint information. In the following feature extraction process, each patch was transformed into several different 1D vectors by 3D convolution kernels, which were able to extract features from spectral-spatial data. The rest of DVCNN was much the same as a general CNN and processed the 2D matrix constituted by all the 1D vectors, so that DVCNN could not only extract more accurate and richer features than CNN, but also fuse spectral-spatial information to improve classification accuracy. Moreover, the robustness of the network on water-absorption bands was enhanced in the process of spectral-spatial fusion by 3D convolution, and the calculation was simplified by dimensionality-varied convolution. Experiments were performed on both the Indian Pines and Pavia University scene datasets, and the results showed that the classification accuracy of DVCNN improved by 32.87% on Indian Pines and by 19.63% on the Pavia University scene compared with spectral-only CNN. The maximum accuracy improvement of DVCNN over other state-of-the-art HSI classification methods was 13.72%, and the robustness of DVCNN to noise on water-absorption bands was demonstrated.
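    The front end described above, 3D kernels turning a spectral-spatial patch into 1D vectors, can be sketched with plain NumPy under the assumption that each kernel spans the full spatial window and slides along the spectral axis only. In the real network the kernels are learned and further CNN layers follow; the shapes below are illustrative.

```python
import numpy as np

def spectral_spatial_to_1d(patch, kernels):
    """Slide 3D kernels along the spectral axis of a (bands, h, w)
    patch.  Each kernel covers the full spatial window, so every
    kernel produces one 1D response vector; stacking the vectors
    gives the 2D matrix handed to the ordinary CNN layers."""
    bands = patch.shape[0]
    rows = []
    for k in kernels:                       # kernel shape: (kb, h, w)
        kb = k.shape[0]
        rows.append(np.array([(patch[b:b + kb] * k).sum()
                              for b in range(bands - kb + 1)]))
    return np.stack(rows)
```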

  3. The population cost-effectiveness of delivering universal and indicated school-based interventions to prevent the onset of major depression among youth in Australia.

    PubMed

    Lee, Y Y; Barendregt, J J; Stockings, E A; Ferrari, A J; Whiteford, H A; Patton, G A; Mihalopoulos, C

    2017-10-01

    School-based psychological interventions encompass: universal interventions targeting youth in the general population; and indicated interventions targeting youth with subthreshold depression. This study aimed to: (1) examine the population cost-effectiveness of delivering universal and indicated prevention interventions to youth in the population aged 11-17 years via primary and secondary schools in Australia; and (2) compare the comparative cost-effectiveness of delivering these interventions using face-to-face and internet-based delivery mechanisms. We reviewed literature on the prevention of depression to identify all interventions targeting youth that would be suitable for implementation in Australia and had evidence of efficacy to support analysis. From this, we found evidence of effectiveness for the following intervention types: universal prevention involving group-based psychological interventions delivered to all participating school students; and indicated prevention involving group-based psychological interventions delivered to students with subthreshold depression. We constructed a Markov model to assess the cost-effectiveness of delivering universal and indicated interventions in the population relative to a 'no intervention' comparator over a 10-year time horizon. A disease model was used to simulate epidemiological transitions between three health states (i.e., healthy, diseased and dead). Intervention effect sizes were based on meta-analyses of randomised control trial data identified in the aforementioned review; while health benefits were measured as Disability-adjusted Life Years (DALYs) averted attributable to reductions in depression incidence. Net costs of delivering interventions were calculated using relevant Australian data. Uncertainty and sensitivity analyses were conducted to test model assumptions. 
Incremental cost-effectiveness ratios (ICERs) were measured in 2013 Australian dollars per DALY averted, with costs and benefits discounted at 3%. Universal and indicated psychological interventions delivered face-to-face had ICERs below a threshold of $50 000 per DALY averted: $7350 per DALY averted (95% uncertainty interval (UI): dominates - 23 070) for universal prevention, and $19 550 per DALY averted (95% UI: 3081-56 713) for indicated prevention. Baseline ICERs were generally robust to changes in model assumptions. A sensitivity analysis found that internet-delivered prevention interventions were highly cost-effective when assuming intervention effect sizes of 100% and 50% relative to the effect sizes observed for face-to-face delivered interventions. These results should, however, be interpreted with caution due to the paucity of data. School-based psychological interventions appear to be cost-effective; however, realising efficiency gains in the population is ultimately dependent on ensuring successful system-level implementation.
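
    The headline figures above are incremental cost-effectiveness ratios. A minimal sketch of the arithmetic, using hypothetical numbers rather than the study's data:

    ```python
    def icer(cost_intervention, cost_comparator, dalys_averted):
        """Incremental cost-effectiveness ratio: net cost per DALY averted."""
        if dalys_averted <= 0:
            raise ValueError("intervention must avert a positive number of DALYs")
        return (cost_intervention - cost_comparator) / dalys_averted

    def discounted(value, years, rate=0.03):
        """Present value of a cost or benefit accruing `years` from now,
        discounted at 3% as in the study."""
        return value / (1 + rate) ** years

    # Hypothetical figures: the intervention costs $1.47M more than no
    # intervention and averts 200 DALYs over the time horizon.
    ratio = icer(2_470_000, 1_000_000, 200)
    # ratio = 7350.0 dollars per DALY averted, below a $50 000 threshold.
    ```

    An ICER "dominates" (as in the universal-prevention uncertainty interval) when the intervention both costs less and averts more DALYs than the comparator, so no ratio is reported.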

  4. Large-Scale Analysis of Auditory Segregation Behavior Crowdsourced via a Smartphone App.

    PubMed

    Teki, Sundeep; Kumar, Sukhbinder; Griffiths, Timothy D

    2016-01-01

The human auditory system is adept at detecting sound sources of interest from a complex mixture of several other simultaneous sounds. The ability to selectively attend to the speech of one speaker whilst ignoring other speakers and background noise is of vital biological significance: the capacity to make sense of complex 'auditory scenes' is significantly impaired in aging populations as well as those with hearing loss. We investigated this problem by designing a synthetic signal, termed the 'stochastic figure-ground' stimulus, which captures essential aspects of complex sounds in the natural environment. Previously, we showed that under controlled laboratory conditions, young listeners sampled from the university subject pool (n = 10) performed very well in detecting targets embedded in the stochastic figure-ground signal. Here, we presented a modified version of this cocktail party paradigm as a 'game' featured in a smartphone app (The Great Brain Experiment) and obtained data from a large population with diverse demographics (n = 5148). Despite differences in paradigms and experimental settings, the observed target-detection performance by users of the app was robust and consistent with our previous results from the psychophysical study. Our results highlight the potential use of smartphone apps in capturing robust large-scale auditory behavioral data from normal healthy volunteers, which can also be extended to study auditory deficits in clinical populations with hearing impairments and central auditory disorders.

  5. Characterization of primary cultures of adult human epididymis epithelial cells.

    PubMed

    Leir, Shih-Hsing; Browne, James A; Eggener, Scott E; Harris, Ann

    2015-03-01

Objective: To establish cultures of epithelial cells from all regions of the human epididymis to provide reagents for molecular approaches to functional studies of this epithelium. Design: Experimental laboratory study. Setting: University research institute. Patients: Seven patients undergoing orchiectomy for suspected testicular cancer without epididymal involvement. Intervention: Human epididymis epithelial cells harvested from adult epididymis tissue. Main outcome measure: Establishment of a robust culture protocol for adult human epididymal epithelial cells. Results: Cultures of caput, corpus, and cauda epithelial cells were established from epididymis tissue of seven donors. Cells were passaged up to eight times and maintained differentiation markers. They were also cryopreserved and recovered successfully. Androgen receptor, clusterin, and cysteine-rich secretory protein 1 were expressed in cultured cells, as shown by means of immunofluorescence, Western blot, and quantitative reverse-transcription polymerase chain reaction (qRT-PCR). The distribution of other epididymis markers was also shown by means of qRT-PCR. Cultures developed transepithelial resistance (TER), which was androgen responsive in the caput but androgen insensitive in the corpus and cauda, where unstimulated TER values were much higher. Conclusions: The results demonstrate a robust in vitro culture system for differentiated epithelial cell types in the caput, corpus, and cauda of the human epididymis. These cells will be a valuable resource for molecular analysis of epididymis epithelial function, which has a pivotal role in male fertility. Copyright © 2015 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  6. Absolute molecular weight determination of hypromellose acetate succinate by size exclusion chromatography: use of a multi angle laser light scattering detector and a mixed solvent.

    PubMed

    Chen, Raymond; Ilasi, Nicholas; Sekulic, Sonja S

    2011-12-05

Molecular weight distribution is an important quality attribute for hypromellose acetate succinate (HPMCAS), a pharmaceutical excipient used in spray-dried dispersions. Our previous study showed that neither the relative nor the universal calibration method of size exclusion chromatography (SEC) works for HPMCAS polymers. We here report our effort to develop an SEC method using a mass-sensitive multi-angle laser light scattering detector (MALLS) to determine molecular weight distributions of HPMCAS polymers. A solvent screen study reveals that a mixed solvent (60:40%, v/v, 50 mM NaH2PO4 with 0.1 M NaNO3 buffer: acetonitrile, pH* 8.0) is the best for the HPMCAS-LF and MF sub-classes. Use of a mixed solvent creates a challenging condition for a method that uses a refractive index detector. Therefore, we thoroughly evaluated the method performance and robustness. The mean weight-average molecular weight of a polyethylene oxide standard has a 95% confidence interval of (28,443-28,793) g/mol vs. 28,700 g/mol from the Certificate of Analysis. The relative standard deviations of average molecular weights for all polymers are 3-6%. These results and the Design of Experiments study demonstrate that the method is accurate and robust. Copyright © 2011 Elsevier B.V. All rights reserved.
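
    The quantity a MALLS detector reports is the weight-average molecular weight Mw, which exceeds the number average Mn for any polydisperse sample. A toy sketch of the two averages (illustrative counts, not the paper's data):

    ```python
    def molecular_weight_averages(n, M):
        """Number- and weight-average molecular weights from counts n_i of
        chains of mass M_i:
            Mn = sum(n_i * M_i) / sum(n_i)
            Mw = sum(n_i * M_i**2) / sum(n_i * M_i)
        Light scattering yields Mw directly; Mn needs a concentration-
        sensitive detector."""
        total_mass = sum(ni * Mi for ni, Mi in zip(n, M))
        Mn = total_mass / sum(n)
        Mw = sum(ni * Mi**2 for ni, Mi in zip(n, M)) / total_mass
        return Mn, Mw

    # Toy bimodal sample: equal counts of 10 kDa and 30 kDa chains.
    Mn, Mw = molecular_weight_averages([1, 1], [10_000, 30_000])
    # Mn = 20_000, Mw = 25_000; polydispersity index Mw/Mn = 1.25.
    ```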

  7. The Robustness Analysis of Wireless Sensor Networks under Uncertain Interference

    PubMed Central

    Deng, Changjian

    2013-01-01

Based on complex network theory, a robustness analysis of a condition monitoring wireless sensor network under uncertain interference is presented. In the evolution of the network topology, the density-weighted algebraic connectivity is taken into account, and the removal and repair of links and nodes in the network is discussed. Numerical simulation is conducted to explore algebraic connectivity characteristics and network robustness performance. It is found that node density affects the algebraic connectivity distribution in the random graph model; high-density nodes carry more connections, consume more throughput, and may be less reliable. Moreover, the results show that when a network is to be made more error tolerant or robust by repairing or adding nodes, medium- and large-scale wireless sensor networks should be well clustered, while small-scale networks should use a mesh topology. PMID:24363613
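
    The algebraic connectivity used above has a compact definition: the second-smallest eigenvalue of the graph Laplacian. A minimal sketch (the plain, unweighted version, not the paper's density-weighted variant):

    ```python
    import numpy as np

    def algebraic_connectivity(adj):
        """Second-smallest eigenvalue of the graph Laplacian L = D - A.
        It is positive iff the graph is connected, and larger values
        indicate a better-connected (more robust) topology."""
        A = np.asarray(adj, dtype=float)
        L = np.diag(A.sum(axis=1)) - A
        eigenvalues = np.sort(np.linalg.eigvalsh(L))
        return eigenvalues[1]

    # Path graph on 3 nodes: Laplacian eigenvalues are 0, 1, 3, so the
    # algebraic connectivity is 1. Removing an edge disconnects the graph
    # and drives the value to 0.
    path = [[0, 1, 0],
            [1, 0, 1],
            [0, 1, 0]]
    ```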

  8. A comparative robustness evaluation of feedforward neurofilters

    NASA Technical Reports Server (NTRS)

    Troudet, Terry; Merrill, Walter

    1993-01-01

    A comparative performance and robustness analysis is provided for feedforward neurofilters trained with back propagation to filter additive white noise. The signals used in this analysis are simulated pitch rate responses to typical pilot command inputs for a modern fighter aircraft model. Various configurations of nonlinear and linear neurofilters are trained to estimate exact signal values from input sequences of noisy sampled signal values. In this application, nonlinear neurofiltering is found to be more efficient than linear neurofiltering in removing the noise from responses of the nominal vehicle model, whereas linear neurofiltering is found to be more robust in the presence of changes in the vehicle dynamics. The possibility of enhancing neurofiltering through hybrid architectures based on linear and nonlinear neuroprocessing is therefore suggested as a way of taking advantage of the robustness of linear neurofiltering, while maintaining the nominal performance advantage of nonlinear neurofiltering.

  9. On-Line Robust Modal Stability Prediction using Wavelet Processing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Lind, Rick

    1998-01-01

Wavelet analysis for filtering and system identification has been used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins is reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data is used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability are also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. The F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrates improved robust stability prediction by extension of the stability boundary beyond the flight regime. Guidelines and computation times are presented to show the efficiency and practical aspects of these procedures for on-line implementation. Feasibility of the method is shown for processing flight data from time-varying nonstationary test points.

  10. Uncertainty analysis and robust trajectory linearization control of a flexible air-breathing hypersonic vehicle

    NASA Astrophysics Data System (ADS)

    Pu, Zhiqiang; Tan, Xiangmin; Fan, Guoliang; Yi, Jianqiang

    2014-08-01

Flexible air-breathing hypersonic vehicles feature significant uncertainties which pose huge challenges to robust controller designs. In this paper, four major categories of uncertainties are analyzed, that is, uncertainties associated with flexible effects, aerodynamic parameter variations, external environmental disturbances, and control-oriented modeling errors. A uniform nonlinear uncertainty model is explored for the first three uncertainties which lumps all uncertainties together and consequently is beneficial for controller synthesis. The fourth uncertainty is additionally considered in stability analysis. Based on these analyses, the starting point of the control design is to decompose the vehicle dynamics into five functional subsystems. Then a robust trajectory linearization control (TLC) scheme consisting of five robust subsystem controllers is proposed. In each subsystem controller, TLC is combined with the extended state observer (ESO) technique for uncertainty compensation. The stability of the overall closed-loop system with the four aforementioned uncertainties and additional singular perturbations is analyzed. Particularly, the stability of the nonlinear ESO is also discussed from a Liénard system perspective. Finally, simulations demonstrate the great control performance and the uncertainty rejection ability of the robust scheme.
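
    The ESO technique mentioned above can be illustrated on a toy plant. The sketch below is a generic linear extended state observer for a first-order system with a lumped disturbance; the plant, gains, and time step are illustrative assumptions, not the paper's vehicle model:

    ```python
    def simulate_eso(d=2.0, b=1.0, u=0.5, beta1=40.0, beta2=400.0,
                     dt=1e-3, T=2.0):
        """Linear ESO for the plant x' = d + b*u, where d lumps all
        uncertainties into an 'extended state'. Observer state z1 tracks x
        and z2 tracks d; a controller can then cancel d using z2. Gains
        beta1, beta2 place both observer poles at -20 (s^2 + 40s + 400)."""
        x, z1, z2 = 0.0, 0.0, 0.0
        for _ in range(int(T / dt)):
            x += dt * (d + b * u)               # true plant, constant disturbance d
            e = x - z1                          # output estimation error
            z1 += dt * (z2 + b * u + beta1 * e)
            z2 += dt * (beta2 * e)              # extended state: estimate of d
        return z2

    # After 2 s the disturbance estimate has converged to the true d = 2.0.
    ```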

  11. Thermotaxis is a Robust Mechanism for Thermoregulation in C. elegans Nematodes

    PubMed Central

    Ramot, Daniel; MacInnis, Bronwyn L.; Lee, Hau-Chen; Goodman, Miriam B.

    2013-01-01

    Many biochemical networks are robust to variations in network or stimulus parameters. Although robustness is considered an important design principle of such networks, it is not known whether this principle also applies to higher-level biological processes such as animal behavior. In thermal gradients, C. elegans uses thermotaxis to bias its movement along the direction of the gradient. Here we develop a detailed, quantitative map of C. elegans thermotaxis and use these data to derive a computational model of thermotaxis in the soil, a natural environment of C. elegans. This computational analysis indicates that thermotaxis enables animals to avoid temperatures at which they cannot reproduce, to limit excursions from their adapted temperature, and to remain relatively close to the surface of the soil, where oxygen is abundant. Furthermore, our analysis reveals that this mechanism is robust to large variations in the parameters governing both worm locomotion and temperature fluctuations in the soil. We suggest that, similar to biochemical networks, animals evolve behavioral strategies that are robust, rather than strategies that rely on fine-tuning of specific behavioral parameters. PMID:19020047

  12. Leveraging AMI data for distribution system model calibration and situational awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peppanen, Jouni; Reno, Matthew J.; Thakkar, Mohini

The many new distributed energy resources being installed at the distribution system level require increased visibility into system operations, which will be enabled by distribution system state estimation (DSSE) and situational awareness applications. Reliable and accurate DSSE requires both robust methods for managing the big data provided by smart meters and quality distribution system models. This paper presents intelligent methods for detecting and dealing with missing or inaccurate smart meter data, as well as ways to process the data for different applications. It also presents an efficient and flexible parameter estimation method based on the voltage drop equation and regression analysis to enhance distribution system model accuracy. Finally, it presents a 3-D graphical user interface for advanced visualization of the system state and events. We demonstrate these methods on a university distribution network with a state-of-the-art real-time and historical smart meter data infrastructure.
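
    The voltage-drop-based parameter estimation described above can be sketched with synthetic meter data. The linearised model dV ≈ (R·P + X·Q)/V and all numbers below are illustrative assumptions, not the paper's formulation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic AMI interval data (units illustrative): active/reactive
    # power draws and a nominal service voltage.
    P = rng.uniform(10, 100, 500)       # active power per interval
    Q = rng.uniform(5, 40, 500)         # reactive power per interval
    V = 240.0                           # nominal voltage
    R_true, X_true = 0.08, 0.05         # line resistance/reactance (assumed)

    # Linearised voltage drop plus meter noise.
    dV = (R_true * P + X_true * Q) / V + rng.normal(0, 0.001, 500)

    # Least-squares estimate of [R, X] from the metered quantities.
    A = np.column_stack([P / V, Q / V])
    (R_est, X_est), *_ = np.linalg.lstsq(A, dV, rcond=None)
    ```

    With enough metering intervals the regression recovers the line parameters despite meter noise, which is the sense in which AMI data can calibrate a feeder model.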

  13. Dynamic heterogeneity and non-Gaussian statistics for acetylcholine receptors on live cell membrane

    NASA Astrophysics Data System (ADS)

    He, W.; Song, H.; Su, Y.; Geng, L.; Ackerson, B. J.; Peng, H. B.; Tong, P.

    2016-05-01

The Brownian motion of molecules at thermal equilibrium usually has a finite correlation time and will eventually be randomized after a long delay time, so that their displacement follows Gaussian statistics. This is true even when the molecules have experienced a complex environment with a finite correlation time. Here, we report that the lateral motion of the acetylcholine receptors on live muscle cell membranes does not follow the Gaussian statistics expected for normal Brownian diffusion. From a careful analysis of a large volume of the protein trajectories obtained over a wide range of sampling rates and long durations, we find that the normalized histogram of the protein displacements shows an exponential tail, which is robust and universal for cells under different conditions. The experiment indicates that the observed non-Gaussian statistics and dynamic heterogeneity are inherently linked to the slow, active remodelling of the underlying cortical actin network.
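
    A simple diagnostic for non-Gaussian displacement statistics of the kind described above is the excess kurtosis, which vanishes for Gaussian steps and is positive for exponential-tailed (Laplace-like) steps. A hedged sketch with synthetic data, not the authors' trajectory analysis:

    ```python
    import numpy as np

    def excess_kurtosis(x):
        """Fourth standardised moment minus 3: zero for Gaussian
        displacements, positive for heavy (e.g. exponential) tails."""
        x = np.asarray(x, dtype=float)
        z = (x - x.mean()) / x.std()
        return np.mean(z**4) - 3.0

    rng = np.random.default_rng(1)
    gaussian_steps = rng.normal(size=100_000)   # normal Brownian displacements
    laplace_steps = rng.laplace(size=100_000)   # exponential-tailed displacements
    # Excess kurtosis is near 0 for the Gaussian sample and near 3
    # (the Laplace value) for the exponential-tailed sample.
    ```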

  14. Fast hydrological model calibration based on the heterogeneous parallel computing accelerated shuffled complex evolution method

    NASA Astrophysics Data System (ADS)

    Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Hong, Yang; Zuo, Depeng; Ren, Minglei; Lei, Tianjie; Liang, Ke

    2018-01-01

    Hydrological model calibration has been a hot issue for decades. The shuffled complex evolution method developed at the University of Arizona (SCE-UA) has been proved to be an effective and robust optimization approach. However, its computational efficiency deteriorates significantly when the amount of hydrometeorological data increases. In recent years, the rise of heterogeneous parallel computing has brought hope for the acceleration of hydrological model calibration. This study proposed a parallel SCE-UA method and applied it to the calibration of a watershed rainfall-runoff model, the Xinanjiang model. The parallel method was implemented on heterogeneous computing systems using OpenMP and CUDA. Performance testing and sensitivity analysis were carried out to verify its correctness and efficiency. Comparison results indicated that heterogeneous parallel computing-accelerated SCE-UA converged much more quickly than the original serial version and possessed satisfactory accuracy and stability for the task of fast hydrological model calibration.

  15. Leveraging AMI data for distribution system model calibration and situational awareness

    DOE PAGES

    Peppanen, Jouni; Reno, Matthew J.; Thakkar, Mohini; ...

    2015-01-15

The many new distributed energy resources being installed at the distribution system level require increased visibility into system operations, which will be enabled by distribution system state estimation (DSSE) and situational awareness applications. Reliable and accurate DSSE requires both robust methods for managing the big data provided by smart meters and quality distribution system models. This paper presents intelligent methods for detecting and dealing with missing or inaccurate smart meter data, as well as ways to process the data for different applications. It also presents an efficient and flexible parameter estimation method based on the voltage drop equation and regression analysis to enhance distribution system model accuracy. Finally, it presents a 3-D graphical user interface for advanced visualization of the system state and events. We demonstrate these methods on a university distribution network with a state-of-the-art real-time and historical smart meter data infrastructure.

  16. Report on the Stanford/Ames direct-link space suit prehensor

    NASA Technical Reports Server (NTRS)

    Jameson, J. W.; Leifer, Larry

    1987-01-01

Researchers at the Center for Design Research at Stanford University, in collaboration with NASA Ames at Moffett Field, California, are developing hand-powered mechanical prehensors to replace gloves for EVA spacesuits. The design and functional properties of the first-version Direct Link Prehensor (DLP) are discussed. It has a total of six degrees of freedom and is the most elaborate of three prehensors being developed for the project. The DLP has a robust design and utilizes only linkages and revolute joints for the drive system. With its anthropomorphic configuration of two fingers and a thumb, it is easy to control and is capable of all of the basic prehension patterns, such as cylindrical or lateral pinch grasps. Kinematic analysis reveals that, assuming point contacts, a grasped object can be manipulated with three degrees of freedom. Yet, in practice, more degrees of freedom are possible.

  17. Robustness surfaces of complex networks

    PubMed Central

    Manzano, Marc; Sahneh, Faryad; Scoglio, Caterina; Calle, Eusebi; Marzo, Jose Luis

    2014-01-01

Although the robustness of complex networks has been extensively studied in the last decade, a unifying framework able to embrace all the proposed metrics is still lacking. In the literature there are two open issues related to this gap: (a) how to dimension several metrics to allow their summation and (b) how to weight each of the metrics. In this work we propose a solution for the two aforementioned problems by defining the R*-value and introducing the concept of robustness surface (Ω). The rationale of our proposal is to make use of Principal Component Analysis (PCA). We first normalize the initial robustness of a network to 1. Secondly, we find the most informative robustness metric under a specific failure scenario. Then, we repeat the process for several failure percentages and different realizations of the failure process. Lastly, we join these values to form the robustness surface, which allows the visual assessment of network robustness variability. Results show that a network presents different robustness surfaces (i.e., dissimilar shapes) depending on the failure scenario and the set of metrics. In addition, the robustness surface allows the robustness of different networks to be compared. PMID:25178402
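
    The "most informative robustness metric" step can be sketched with PCA on a toy metric matrix. The selection rule below (largest absolute loading on the first principal component) is an illustrative reading of the approach, not the authors' exact procedure:

    ```python
    import numpy as np

    def most_informative_metric(X, names):
        """Given a (realizations x metrics) matrix of robustness metrics
        measured under one failure scenario, return the metric with the
        largest absolute loading on the first principal component.
        Metrics are standardised first so their scales are comparable."""
        Z = (X - X.mean(axis=0)) / X.std(axis=0)
        cov = np.cov(Z, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)   # eigh sorts ascending
        pc1 = eigvecs[:, -1]                     # dominant component
        return names[int(np.argmax(np.abs(pc1)))]

    rng = np.random.default_rng(2)
    base = rng.normal(size=(200, 1))             # latent robustness signal
    # Two toy metrics track the latent signal; the third is pure noise,
    # so PC1 loads on the first two and the noise metric is never chosen.
    X = np.hstack([base + 0.1 * rng.normal(size=(200, 1)),
                   2 * base + 0.1 * rng.normal(size=(200, 1)),
                   rng.normal(size=(200, 1))])
    ```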

  18. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis.

    PubMed

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-07-01

A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts, limit conducting multivariate tests. We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Code is available at https://github.com/aalto-ics-kepaco. Contact: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
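
    The two ingredients metaCCA combines, covariance shrinkage and canonical correlation analysis, can be sketched as follows. The shrinkage target and intensity below are illustrative, not metaCCA's actual algorithm:

    ```python
    import numpy as np

    def shrink(S, alpha=0.1):
        """Shrink a covariance/correlation matrix toward the identity so it
        stays well-conditioned; this is the role shrinkage plays when the
        matrix is assembled from summary statistics rather than raw data."""
        p = S.shape[0]
        return (1 - alpha) * S + alpha * np.eye(p)

    def canonical_correlations(Sxx, Syy, Sxy):
        """Canonical correlations from the blockwise covariance of (X, Y):
        the singular values of Sxx^{-1/2} Sxy Syy^{-1/2}."""
        def inv_sqrt(S):
            w, V = np.linalg.eigh(S)
            return V @ np.diag(w ** -0.5) @ V.T
        K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
        return np.linalg.svd(K, compute_uv=False)

    # The 1-genotype, 1-phenotype case collapses to ordinary correlation.
    ```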

  19. Robust Linear Models for Cis-eQTL Analysis.

    PubMed

    Rantalainen, Mattias; Lindgren, Cecilia M; Holmes, Christopher C

    2015-01-01

Expression Quantitative Trait Loci (eQTL) analysis enables characterisation of functional genetic variation influencing expression levels of individual genes. In outbred populations, including humans, eQTLs are commonly analysed using the conventional linear model, adjusting for relevant covariates, assuming an allelic dosage model and a Gaussian error term. However, gene expression data generally have noise that induces heavy-tailed errors relative to the Gaussian distribution and often include atypical observations, or outliers. Such departures from modelling assumptions can lead to an increased rate of type II errors (false negatives), and to some extent also type I errors (false positives). Careful model checking can reduce the risk of type I errors but often not type II errors, since it is generally too time-consuming to carefully check all models with a non-significant effect in large-scale and genome-wide studies. Here we propose the application of a robust linear model for eQTL analysis to reduce adverse effects of deviations from the assumption of Gaussian residuals. We present results from a simulation study as well as results from the analysis of real eQTL data sets. Our findings suggest that in many situations robust models have the potential to provide more reliable eQTL results compared to conventional linear models, particularly in respect to reducing type II errors due to non-Gaussian noise. Post-genomic data, such as that generated in genome-wide eQTL studies, are often noisy and frequently contain atypical observations. Robust statistical models have the potential to provide more reliable results and increased statistical power under non-Gaussian conditions. The results presented here suggest that robust models should be considered routinely alongside other commonly used methodologies for eQTL analysis.
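
    A robust linear model of the kind advocated above can be illustrated with a Huber M-estimator fitted by iteratively reweighted least squares; this is a generic sketch, not the authors' implementation:

    ```python
    import numpy as np

    def huber_regression(x, y, k=1.345, n_iter=50):
        """Huber M-estimator for simple regression via iteratively
        reweighted least squares: residuals beyond k robust-scale units are
        down-weighted instead of dominating the fit as in ordinary least
        squares. Returns [intercept, slope]."""
        X = np.column_stack([np.ones_like(x), x])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS start
        for _ in range(n_iter):
            r = y - X @ beta
            # Robust scale via the median absolute deviation (MAD).
            s = np.median(np.abs(r - np.median(r))) / 0.6745 or 1.0
            # Huber weights: 1 inside k*s, decaying as k*s/|r| outside.
            w = np.clip(k * s / np.maximum(np.abs(r), 1e-12), None, 1.0)
            W = np.sqrt(w)
            beta = np.linalg.lstsq(X * W[:, None], y * W, rcond=None)[0]
        return beta
    ```

    On data with a single gross outlier, ordinary least squares is pulled away from the true slope while the Huber fit recovers it, which is exactly the type II error reduction the abstract describes.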

  20. The Strategic Data Project: Improving Strategic and Management Decisions in Educational Agencies through the Effective Use of Data

    ERIC Educational Resources Information Center

    Kane, Thomas J.; Baxter, Andrew D.; Schooley, Korynn

    2012-01-01

    Launched in 2008, the Strategic Data Project, housed at the Center for Education Policy Research at Harvard University, seeks to bridge the divide between educational research and practice in order to transform the use of data in education to improve student achievement. Through the project, the authors build robust research partnerships with…

  1. The State & the System: NSHE Plan for Nevada's Colleges and Universities--Combining Excellence and Austerity to Attain Success

    ERIC Educational Resources Information Center

    Nevada System of Higher Education, 2011

    2011-01-01

    As Nevada struggles in this difficult time of recession and high unemployment, it is time to acknowledge that only through a robust and adequately funded education infrastructure and, in particular, higher education will the State ever achieve the diversification and growth all Nevadans need. Nevada must make a long term commitment to excellence…

  2. Addressing the Uncertain Future of Preserving the Past: Towards a Robust Strategy for Digital Archiving and Preservation. Technical Report

    ERIC Educational Resources Information Center

    Hoorens, Stijn; Rothenberg, Jeff; van Orange, Constantijn; van der Mandele, Martijn; Levitt, Ruth

    2007-01-01

    Storing and curating authentic academic literature and making it accessible for the long term has been a time-honoured task of national libraries. By guarding existing knowledge and facilitating its use to produce new insights, national and university libraries have formed an integral part of the research environment, complementing the roles of…

  3. A Retrospective Study of Academic Leadership Skill Development, Retention and Use: The Experience of the Food Systems Leadership Institute

    ERIC Educational Resources Information Center

    Fernandez, Claudia S. P.; Noble, Cheryl C.; Jensen, Elizabeth T.; Martin, Linda; Stewart, Marshall

    2016-01-01

    The Food Systems Leadership Institute (FSLI) is a 2-year leadership development program consisting of 3 intensive in-person immersion retreats, and a robust and customizable distance-based program. Participants come primarily from land-grant and public universities and learn about personal, organizational and system leadership with a focus on food…

  4. News and Views: Plain English? Government report highlights management and communication failures at STFC

    NASA Astrophysics Data System (ADS)

    2008-06-01

The report of the Innovation, Universities, Science and Skills Select Committee sets out, in robust, plain language, a damning summary of the Science and Technology Facilities Council's handling of its funding problems over the past few months, highlighting ``a poorly conceived delivery plan, lamentable communication and poor leadership, as well as major senior management misjudgements''.

  5. Microfluidic analysis of oocyte and embryo biomechanical properties to improve outcomes in assisted reproductive technologies.

    PubMed

    Yanez, Livia Z; Camarillo, David B

    2017-04-01

    Measurement of oocyte and embryo biomechanical properties has recently emerged as an exciting new approach to obtain a quantitative, objective estimate of developmental potential. However, many traditional methods for probing cell mechanical properties are time consuming, labor intensive and require expensive equipment. Microfluidic technology is currently making its way into many aspects of assisted reproductive technologies (ART), and is particularly well suited to measure embryo biomechanics due to the potential for robust, automated single-cell analysis at a low cost. This review will highlight microfluidic approaches to measure oocyte and embryo mechanics along with their ability to predict developmental potential and find practical application in the clinic. Although these new devices must be extensively validated before they can be integrated into the existing clinical workflow, they could eventually be used to constantly monitor oocyte and embryo developmental progress and enable more optimal decision making in ART. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Investigation of magnetic and magneto-transport properties of ferromagnetic-charge ordered core-shell nanostructures

    NASA Astrophysics Data System (ADS)

    Das, Kalipada

    2017-10-01

In our present study, we address in detail the magnetic and magneto-transport properties of ferromagnetic-charge ordered core-shell nanostructures. In these core-shell nanostructures, well-known half-metallic La0.67Sr0.33MnO3 nanoparticles (average particle size ˜20 nm) are wrapped by the charge ordered antiferromagnetic Pr0.67Ca0.33MnO3 (PCMO) matrix. The intrinsic properties of PCMO are markedly modified in this core-shell form. The charge ordering of the PCMO matrix becomes fragile and melts at an external magnetic field (H) of ˜20 kOe. The analysis of magneto-transport data indicates a systematic reduction of the electron-electron and electron-magnon interactions in the presence of an external magnetic field in these nanostructures. A pronounced training effect appears in this phase separated compound, which was analyzed by considering second order tunneling through the grain boundaries of the nanostructures. Additionally, the analysis of low field magnetoconductance data supports second order tunneling and yields a value close to the universal limit (˜1.33).

  7. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.

    PubMed

    Elgendi, Mohamed

    2016-11-02

Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks in order to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to here for the first time as two event-related moving averages ("TERMA"), uses event-related moving averages to detect events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. The results recommend that the window sizes of the two moving averages (W1 and W2) follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
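
    The TERMA recipe, two moving averages at event and cycle scales with blocks of interest wherever the short average exceeds the long one, can be sketched as follows. The window sizes and offset `beta` below are illustrative choices satisfying the stated inequality, not values from the paper:

    ```python
    import numpy as np

    def moving_average(x, w):
        """Centered moving average with window w."""
        return np.convolve(x, np.ones(w) / w, mode="same")

    def terma_peaks(signal, W1=11, W2=33, beta=0.0):
        """Two event-related moving averages: samples where the short
        (event-scale) average exceeds the long (cycle-scale) average by
        more than beta form candidate blocks; the maximum inside each
        block is the detected peak."""
        assert 8 * W1 >= W2 >= 2 * W1, "window sizes violate the TERMA inequality"
        ma_event = moving_average(signal, W1)
        ma_cycle = moving_average(signal, W2)
        mask = ma_event > ma_cycle + beta
        peaks, start = [], None
        for i, m in enumerate(mask):
            if m and start is None:
                start = i                       # block opens
            elif not m and start is not None:
                peaks.append(start + int(np.argmax(signal[start:i])))
                start = None                    # block closes at a peak
        if start is not None:
            peaks.append(start + int(np.argmax(signal[start:])))
        return peaks
    ```

    On a synthetic signal with two well-separated bumps, the detector returns one peak per bump at the bump centers.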

  8. Early parenting program as intervention strategy for emotional distress in first-time mothers: a propensity score analysis.

    PubMed

    Okamoto, Miwako; Ishigami, Hideaki; Tokimoto, Kumiko; Matsuoka, Megumi; Tango, Ryoko

    2013-08-01

    The purpose of this study is to evaluate the effectiveness of a single session intervention designed to reduce emotional distress in first-time mothers. We held a parenting class for first-time mothers who had given birth at a university hospital in Tokyo, Japan. The program of the class consists of lectures on infant care and group discussion, which is a common form of intervention in Japan. The effectiveness of intervention is assessed according to differences in emotional distress experienced by class participants and nonparticipants, and analyzed by the use of a propensity score method to avoid self-selection bias. In order to be more confident about our results, we employ several variations of this method. Results from statistical analysis show that although the effectiveness of the intervention was limited, it was able to alleviate subjects' loss of self-confidence as mothers. Because this outcome shows a good degree of consistency across methods, it can be considered robust. Moreover, it is roughly consistent with previous studies. Effectiveness can probably be increased by developing a program that improves upon the intervention.

  9. A metabolomics-based method for studying the effect of yfcC gene in Escherichia coli on metabolism.

    PubMed

    Wang, Xiyue; Xie, Yuping; Gao, Peng; Zhang, Sufang; Tan, Haidong; Yang, Fengxu; Lian, Rongwei; Tian, Jing; Xu, Guowang

    2014-04-15

    Metabolomics is a potent tool for identifying the function of unknown genes through analysis of metabolite changes in the context of varied genetic backgrounds. However, a universal, unbiased profiling analysis remains a major challenge. In this study, we report an optimized metabolic profiling method based on gas chromatography-mass spectrometry for Escherichia coli. It was found that physiological saline at -80°C ensured satisfactory metabolic quenching with little metabolite leakage. A solution of methanol/water (21:79, v/v) proved efficient for intracellular metabolite extraction. This method was applied to investigate the metabolome differences among wild-type E. coli and its yfcC deletion and overexpression mutants. Statistical and bioinformatic analysis of the metabolic profiling data indicated that the expression of yfcC potentially affected the metabolism of the glyoxylate shunt. This finding was further validated by real-time quantitative polymerase chain reactions showing that expression of aceA and aceB, the key genes in the glyoxylate shunt, was upregulated by yfcC. This study exemplifies the robustness of the proposed metabolic profiling strategy and its potential role in investigating unknown gene functions in view of metabolome differences. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sotiropoulos, Fotis; Marr, Jeffrey D.G.; Milliren, Christopher

    In January 2010, the University of Minnesota, along with academic and industry project partners, began work on a four-year project to establish new facilities and research in strategic areas of wind energy necessary to move the nation towards a goal of 20% wind energy by 2030. The project was funded by the U.S. Department of Energy with funds made available through the American Recovery and Reinvestment Act of 2009: $7.9M was provided by DOE and $3.1M through matching funds. The project was organized into three Project Areas. Project Area 1 focused on the design and development of a utility-scale wind energy research facility to support research and innovation. The project commissioned the Eolos Wind Research Field Station in November 2011. The site, located 20 miles from St. Paul, MN, operates a 2.5MW Clipper Liberty C-96 wind turbine, a 130-ft-tall instrumented meteorological tower and a robust sensor and data acquisition network. The site is operational and will continue to serve as a site for innovation in wind energy for the next 15 years. Project Area 2 involved research on six distinct research projects critical to the 20% Wind Energy by 2030 goals. The research collaborations involved faculty from two universities, over nine industry partners and two national laboratories. Research outcomes include new knowledge, patents, journal articles, technology advancements, new computational models and new collaborative relationships between university and industry. Project Area 3 focused on developing educational opportunities in wind energy for engineering and science students. The primary outcome is a new graduate-level course at the University of Minnesota called Wind Engineering Essentials. The seminar-style course provides a comprehensive analysis of wind energy technology, economics, and operation. The course is highly successful and will continue to be offered at the University. The vision of the U.S. DOE to establish unique, open-access research facilities and create university-industry research collaborations in wind energy was achieved through this project. The University of Minnesota, through the establishment of the Eolos Wind Energy Consortium and the Eolos Wind Research Field Station, continues to develop new research collaborations with industry partners.

  11. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis.

    PubMed

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. © The Author(s) 2016. Published by Oxford University Press.

  12. Enhancement of absorption and resistance of motion utilizing a multi-channel opto-electronic sensor to effectively monitor physiological signs during sport exercise

    NASA Astrophysics Data System (ADS)

    Alzahrani, Abdullah; Hu, Sijung; Azorin-Peris, Vicente; Barrett, Laura; Esliger, Dale; Hayes, Matthew; Akbare, Shafique; Achart, Jérôme; Kuoch, Sylvain

    2015-03-01

    This study presents an effective engineering approach to monitoring human vital signs, as increasingly demanded by personal healthcare. The aim of this work is to capture critical physiological parameters efficiently through a well-constructed electronic system and a robust multi-channel opto-electronic patch sensor (OEPS), together with wireless communication. A unique design comprising multi-wavelength illumination sources and a rapid-response photo sensor with a 3-axis accelerometer enables recovery of pulsatile features, motion compensation and an increased signal-to-noise ratio. An approved protocol with designated tests was implemented at Loughborough University, a UK leader in sport and exercise assessment. Sport physiological effects were extracted from datasets covering physical movements, i.e. sitting, standing, walking, running and cycling. t-tests, Bland-Altman and correlation analysis were applied to evaluate the performance of the OEPS system against the Acti-Graph and Mio-Alpha. There was no difference in heart rate measured using the OEPS and either the Acti-Graph or the Mio-Alpha (both p > 0.05). Strong correlations were observed between heart rate measured from the OEPS and both the Acti-Graph and Mio-Alpha (r = 0.96, p < 0.001). Bland-Altman analysis for the Acti-Graph and OEPS found a bias of 0.85 bpm, a standard deviation of 9.20 bpm, and limits of agreement (LOA) of -17.18 bpm to +18.88 bpm; for the Mio-Alpha and OEPS, the bias was 1.63 bpm, the standard deviation 8.62 bpm, and the LOA -15.27 bpm to +18.58 bpm. The OEPS demonstrates real-time, robust and remote monitoring of cardiovascular function.
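    The Bland-Altman quantities reported above follow the standard bias ± 1.96 × SD convention, which can be checked in a few lines (the helper and variable names are our own):

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurement
    series, using the conventional bias +/- 1.96*SD of the differences."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)          # sample standard deviation
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

    Applying the convention to the reported Acti-Graph figures (bias 0.85 bpm, SD 9.20 bpm) reproduces the quoted limits: 0.85 − 1.96 × 9.20 = −17.18 bpm and 0.85 + 1.96 × 9.20 = +18.88 bpm.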

  13. Origin of the invasive Arundo donax (Poaceae): a trans-Asian expedition in herbaria.

    PubMed

    Hardion, Laurent; Verlaque, Régine; Saltonstall, Kristin; Leriche, Agathe; Vila, Bruno

    2014-09-01

    The hypothesis of an ancient introduction, i.e. archaeophyte origin, is one of the most challenging questions in phylogeography. Arundo donax (Poaceae) is currently considered to be one of the worst invasive species globally, but it has also been widely utilized by man across Eurasia for millennia. Despite a lack of phylogenetic data, recent literature has often speculated on its introduction to the Mediterranean region. This study tests the hypothesis of its ancient introduction from Asia to the Mediterranean by using plastid DNA sequencing and morphometric analysis on 127 herbarium specimens collected across sub-tropical Eurasia. In addition, a bioclimatic species distribution model calibrated on 1221 Mediterranean localities was used to identify similar ecological niches in Asia. Despite analysis of several plastid DNA hypervariable sites and the identification of 13 haplotypes, A. donax was represented by a single haplotype from the Mediterranean to the Middle East. This haplotype is shared with invasive samples worldwide, and its nearest phylogenetic relatives are located in the Middle East. Morphometric data characterized this invasive clone by a robust morphotype distinguishable from all other Asian samples. The ecological niche modelling designated the southern Caspian Sea, southern Iran and the Indus Valley as the most suitable regions of origin in Asia for the invasive clone of A. donax. Using an integrative approach, an ancient dispersal of this robust, polyploid and non-fruiting clone is hypothesized from the Middle East to the west, leading to its invasion throughout the Mediterranean Basin. © The Author 2014. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Unemployment, public-sector health-care spending and breast cancer mortality in the European Union: 1990-2009.

    PubMed

    Maruthappu, Mahiben; Watkins, Johnathan A; Waqar, Mueez; Williams, Callum; Ali, Raghib; Atun, Rifat; Faiz, Omar; Zeltner, Thomas

    2015-04-01

    The global economic crisis has been associated with increased unemployment, reduced health-care spending and adverse health outcomes. Insights into the impact of economic variations on cancer mortality, however, remain limited. We used multivariate regression analysis to assess how changes in unemployment and public-sector expenditure on health care (PSEH) varied with female breast cancer mortality in the 27 European Union member states from 1990 to 2009. We then determined how the association with unemployment was modified by PSEH. Country-specific differences in infrastructure and demographic structure were controlled for, and 1-, 3-, 5- and 10-year lag analyses were conducted. Several robustness checks were also implemented. Unemployment was associated with an increase in breast cancer mortality [P < 0.0001, coefficient (R) = 0.1829, 95% confidence interval (CI) 0.0978-0.2680]. Lag analysis showed a continued increase in breast cancer mortality at 1, 3, 5 and 10 years after unemployment rises (P < 0.05). Controlling for PSEH removed this association (P = 0.063, R = 0.080, 95% CI -0.004 to 0.163). PSEH increases were associated with significant decreases in breast cancer mortality (P < 0.0001, R = -1.28, 95% CI -1.67 to -0.877). The association between unemployment and breast cancer mortality remained in all robustness checks. Rises in unemployment are associated with significant short- and long-term increases in breast cancer mortality, while increases in PSEH are associated with reductions in breast cancer mortality. Initiatives that bolster employment and maintain total health-care expenditure may help minimize increases in breast cancer mortality during economic crises. © The Author 2014. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
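    The lag analysis described above can be illustrated with a minimal sketch: regress this year's outcome on the predictor from `lag` years earlier. A real specification would add country fixed effects, PSEH and the other controls; the helper name and the toy data are ours:

```python
import numpy as np

def lagged_ols(y, x, lag):
    """OLS slope of outcome y_t on predictor x_{t-lag}
    (minimal lag-analysis sketch, intercept included)."""
    y_t = np.asarray(y[lag:], dtype=float)            # outcomes from t = lag on
    x_t = np.asarray(x[:len(x) - lag], dtype=float)   # predictor lagged by `lag`
    X = np.column_stack([np.ones_like(x_t), x_t])
    (b0, b1), *_ = np.linalg.lstsq(X, y_t, rcond=None)
    return b1
```

    Re-running the same regression for lags of 1, 3, 5 and 10 years is how a persistence of the association, as reported in the abstract, would be probed.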

  15. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    NASA Astrophysics Data System (ADS)

    McGowan, S. E.; Albertini, F.; Thomas, S. J.; Lomax, A. J.

    2015-04-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The robustness of 16 skull-base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was done by calculating the error-bar dose distribution (ebDD) for all plans and by defining metrics to aid plan assessment and the definition of protocols. Additionally, an example of how to use the robustness database clinically is given, whereby a plan with sub-optimal brainstem robustness was identified; the advantage of using different beam arrangements to improve plan robustness was analysed. Using the ebDD, it was found that range errors had a smaller effect on the dose distribution than the corresponding set-up error in a single fraction, and that organs at risk were most robust to range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in setting plan robustness aims in these volumes, resulting in the definition of site-specific robustness protocols. The use of robustness constraints allowed the identification of a specific patient who may have benefited from a more individualized treatment; a new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. This process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. Such protocols allow the planner to identify plans that, although delivering a dosimetrically adequate dose distribution, have sub-optimal robustness to these uncertainties. For these cases, the use of different beam start conditions may improve plan robustness to set-up and range uncertainties.
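    One plausible way to compute an error-bar dose distribution is the per-voxel spread of dose across the error scenarios. The max-minus-min convention and the summary metric below are our own assumptions, since the paper's exact definition is not reproduced here:

```python
import numpy as np

def error_bar_dose(scenario_doses):
    """Voxel-wise error-bar dose: the spread (here max - min, one possible
    convention) of dose over all error scenarios.
    `scenario_doses` has shape (n_scenarios, n_voxels)."""
    d = np.asarray(scenario_doses, dtype=float)
    return d.max(axis=0) - d.min(axis=0)

def robustness_metric(ebdd, structure_mask, prescription):
    """Hypothetical plan-robustness summary: mean error-bar dose inside a
    structure, as a percentage of the prescription dose."""
    return 100.0 * ebdd[structure_mask].mean() / prescription
```

    Scanning such a metric over target and organ-at-risk masks for a library of past plans is one way a site-specific robustness database could be assembled.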

  16. Identification and robust control of an experimental servo motor.

    PubMed

    Adam, E J; Guestrin, E D

    2002-04-01

    In this work, the design of a robust controller for an experimental laboratory-scale position control system based on a dc motor drive as well as the corresponding identification and robust stability analysis are presented. In order to carry out the robust design procedure, first, a classic closed-loop identification technique is applied and then, the parametrization by internal model control is used. The model uncertainty is evaluated under both parametric and global representation. For the latter case, an interesting discussion about the conservativeness of this description is presented by means of a comparison between the uncertainty disk and the critical perturbation radius approaches. Finally, conclusions about the performance of the experimental system with the robust controller are discussed using comparative graphics of the controlled variable and the Nyquist stability margin as a robustness measurement.

  17. The significance of developmental robustness for species diversity.

    PubMed

    Melzer, Rainer; Theißen, Günter

    2016-04-01

    The origin of new species and of new forms is one of the fundamental characteristics of evolution. However, the mechanisms that govern the diversity and disparity of lineages remain poorly understood. Particularly unclear are the reasons why some taxa are vastly more species-rich than others and the manner in which species diversity and morphological disparity are interrelated. Evolutionary innovations and ecological opportunities are usually cited as among the major factors promoting the evolution of species diversity. In many cases it is likely that these factors are positively reinforcing, with evolutionary innovations creating ecological opportunities that in turn foster the origin of new innovations. However, we propose that a third factor, developmental robustness, is very often essential for this reinforcement to be effective. Evolutionary innovations need to be stably and robustly integrated into the developmental genetic programme of an organism to be a suitable substrate for selection to 'explore' ecological opportunities and morphological 'design' space (morphospace). In particular, we propose that developmental robustness of the bauplan is often a prerequisite for the exploration of morphospace and to enable the evolution of further novelties built upon this bauplan. Thus, while robustness may reduce the morphological disparity at one level, it may be the basis for increased morphological disparity and for evolutionary innovations at another level, thus fostering species diversity. © The Author 2016. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Energy-Water Nexus Knowledge Discovery Framework

    NASA Astrophysics Data System (ADS)

    Bhaduri, B. L.; Foster, I.; Chandola, V.; Chen, B.; Sanyal, J.; Allen, M.; McManamay, R.

    2017-12-01

    As demand for energy grows, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. An integrated data driven modeling, analysis, and visualization capability is needed to understand, design, and develop efficient local and regional practices for the energy-water infrastructure components that can be guided with strategic (federal) policy decisions to ensure national energy resilience. To meet this need of the energy-water nexus (EWN) community, an Energy-Water Knowledge Discovery Framework (EWN-KDF) is being proposed to accomplish two objectives: Development of a robust data management and geovisual analytics platform that provides access to disparate and distributed physiographic, critical infrastructure, and socioeconomic data, along with emergent ad-hoc sensor data, to provide a powerful toolkit of analysis algorithms and compute resources to empower user-guided data analysis and inquiries; and Demonstration of knowledge generation with selected illustrative use cases for the implications of climate variability for coupled land-water-energy systems through the application of state-of-the-art data integration, analysis, and synthesis. Oak Ridge National Laboratory (ORNL), in partnership with Argonne National Laboratory (ANL) and researchers affiliated with the Center for International Earth Science Information Partnership (CIESIN) at Columbia University and State University of New York-Buffalo (SUNY), proposes to develop this Energy-Water Knowledge Discovery Framework to generate new, critical insights regarding the complex dynamics of the EWN and its interactions with climate variability and change. 
An overarching objective of this project is to integrate impacts, adaptation, and vulnerability (IAV) science with emerging data science to meet the data analysis needs of the U.S. Department of Energy and partner federal agencies with respect to the EWN.

  19. The Influence Function of Principal Component Analysis by Self-Organizing Rule.

    PubMed

    Higuchi; Eguchi

    1998-07-28

    This article is concerned with a neural network approach to principal component analysis (PCA). An algorithm for PCA by the self-organizing rule has been proposed and its robustness observed through the simulation study by Xu and Yuille (1995). In this article, the robustness of the algorithm against outliers is investigated by using the theory of influence function. The influence function of the principal component vector is given in an explicit form. Through this expression, the method is shown to be robust against any directions orthogonal to the principal component vector. In addition, a statistic generated by the self-organizing rule is proposed to assess the influence of data in PCA.
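    A classic self-organizing rule for extracting the first principal component is Oja's rule, shown here as an illustrative stand-in for the Xu-Yuille algorithm analysed in the paper (learning rate, epochs and seed are our own choices):

```python
import numpy as np

def oja_pca(X, lr=0.01, epochs=50, seed=0):
    """First principal component of data rows in X via Oja's
    self-organizing rule (online Hebbian update with decay)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                  # projection onto current estimate
            w += lr * y * (x - y * w)  # Hebbian term minus self-normalizing decay
    return w / np.linalg.norm(w)
```

    The influence-function result quoted above concerns exactly such estimates: perturbations orthogonal to the principal component vector leave the converged direction essentially unchanged.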

  20. Sample manipulation and data assembly for robust microcrystal synchrotron crystallography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Gongrui; Fuchs, Martin R.; Shi, Wuxian

    With the recent developments in microcrystal handling, synchrotron microdiffraction beamline instrumentation and data analysis, microcrystal crystallography with crystal sizes of less than 10 µm is appealing at synchrotrons. However, challenges remain in sample manipulation and data assembly for robust microcrystal synchrotron crystallography. Here, the development of micro-sized polyimide well-mounts for the manipulation of microcrystals of a few micrometres in size and the implementation of a robust data-analysis method for the assembly of rotational microdiffraction data sets from many microcrystals are described. The method demonstrates that microcrystals may be routinely utilized for the acquisition and assembly of complete data sets from synchrotron microdiffraction beamlines.

  1. Sample manipulation and data assembly for robust microcrystal synchrotron crystallography

    DOE PAGES

    Guo, Gongrui; Fuchs, Martin R.; Shi, Wuxian; ...

    2018-04-19

    With the recent developments in microcrystal handling, synchrotron microdiffraction beamline instrumentation and data analysis, microcrystal crystallography with crystal sizes of less than 10 µm is appealing at synchrotrons. However, challenges remain in sample manipulation and data assembly for robust microcrystal synchrotron crystallography. Here, the development of micro-sized polyimide well-mounts for the manipulation of microcrystals of a few micrometres in size and the implementation of a robust data-analysis method for the assembly of rotational microdiffraction data sets from many microcrystals are described. The method demonstrates that microcrystals may be routinely utilized for the acquisition and assembly of complete data sets from synchrotron microdiffraction beamlines.

  2. Efficient and robust analysis of complex scattering data under noise in microwave resonators.

    PubMed

    Probst, S; Song, F B; Bushev, P A; Ustinov, A V; Weides, M

    2015-02-01

    Superconducting microwave resonators are reliable circuits widely used for detection and as test devices for materials research. A reliable determination of their external and internal quality factors is crucial for many modern applications, which either require fast measurements or operate in the single-photon regime with small signal-to-noise ratios. Here, we use the circle-fit technique with diameter correction and provide a step-by-step guide to implementing an algorithm for robust fitting and calibration of complex resonator scattering data in the presence of noise. The speedup and robustness of the analysis are achieved by employing an algebraic rather than an iterative fit technique for the resonance circle.
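    The algebraic step the speedup relies on can be sketched as a linear least-squares problem: fitting a circle to the scattering data in the complex plane. This Kasa-style fit is one common non-iterative choice and may differ in detail from the authors' implementation:

```python
import numpy as np

def algebraic_circle_fit(z):
    """Kasa-style algebraic circle fit to complex-valued resonator data z.
    Solves x^2 + y^2 = 2*xc*x + 2*yc*y + (r^2 - xc^2 - yc^2) in the
    least-squares sense; returns the centre (complex) and radius."""
    x, y = z.real, z.imag
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx2, cy2, c0), *_ = np.linalg.lstsq(A, b, rcond=None)
    xc, yc = cx2 / 2.0, cy2 / 2.0
    r = np.sqrt(c0 + xc ** 2 + yc ** 2)
    return complex(xc, yc), r
```

    Because the fit reduces to one linear solve, it has no convergence issues and no starting-value sensitivity, which is what makes the overall analysis fast and robust under noise.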

  3. Rocket Design for the Future

    NASA Technical Reports Server (NTRS)

    Follett, William W.; Rajagopal, Raj

    2001-01-01

    The focus of the AA MDO team is to reduce product development cost through the capture and automation of best design and analysis practices and through increasing the availability of low-cost, high-fidelity analysis. Implementation of robust designs reduces costs associated with the Test-Fail-Fix cycle. RD is currently focusing on several technologies to improve the design process, including optimization and robust design, expert and rule-based systems, and collaborative technologies.

  4. Robust control charts in industrial production of olive oil

    NASA Astrophysics Data System (ADS)

    Grilo, Luís M.; Mateus, Dina M. R.; Alves, Ana C.; Grilo, Helena L.

    2014-10-01

    Acidity is one of the most important variables in the quality analysis and characterization of olive oil. During industrial production we use individuals and moving range charts to monitor this variable, which is not always normally distributed. After a brief exploratory data analysis, in which we use the bootstrap method, we construct control charts, before and after a Box-Cox transformation, and compare their robustness and performance.
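    Individuals and moving-range (I-MR) chart limits are computed from the mean moving range using fixed constants derived under normality (2.66 = 3/d2 with d2 = 1.128, and D4 = 3.267), which is why a Box-Cox transformation helps when acidity is skewed. A minimal sketch, with our own function name:

```python
import numpy as np

def individuals_mr_limits(x):
    """Control limits for individuals (I) and moving-range (MR) charts.
    Returns (LCL, centre, UCL) per chart, using the standard constants
    2.66 (= 3/d2, d2 = 1.128) and D4 = 3.267 for n = 2 moving ranges."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))     # moving ranges of consecutive observations
    mr_bar = mr.mean()
    centre = x.mean()
    return {
        "I": (centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar),
        "MR": (0.0, mr_bar, 3.267 * mr_bar),
    }
```

    The same function applied to Box-Cox-transformed acidity values gives the transformed-scale chart that the abstract compares against the untransformed one.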

  5. Closed-Loop Evaluation of an Integrated Failure Identification and Fault Tolerant Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine; Khong, Thuan

    2006-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems developed for failure detection, identification, and reconfiguration, as well as upset recovery, need to be evaluated over broad regions of the flight envelope or under extreme flight conditions, and should include various sources of uncertainty. To apply formal robustness analysis, formulation of linear fractional transformation (LFT) models of complex parameter-dependent systems is required, which represent system uncertainty due to parameter uncertainty and actuator faults. This paper describes a detailed LFT model formulation procedure from the nonlinear model of a transport aircraft by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The closed-loop system is evaluated over the entire flight envelope based on the generated LFT model which can cover nonlinear dynamics. The robustness analysis results of the closed-loop fault tolerant control system of a transport aircraft are presented. A reliable flight envelope (safe flight regime) is also calculated from the robust performance analysis results, over which the closed-loop system can achieve the desired performance of command tracking and failure detection.

  6. Contour plot assessment of existing meta-analyses confirms robust association of statin use and acute kidney injury risk.

    PubMed

    Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W

    2015-10-01

    The robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from the existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis, and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the example data considered, the pooled effect estimates and heterogeneity indices proved considerably robust to the addition of a future study. Notably, for some previously inconclusive meta-analyses, a study update might yield a statistically significant increase in kidney injury risk associated with higher statin exposure. The illustrated contour approach should become a standard tool for assessing the robustness of meta-analyses, as it can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.
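    The quantity the contour approach varies over a grid is the pooled estimate after adding one hypothetical future study. A minimal fixed-effect (inverse-variance) sketch, with our own function names; the cited work also handles heterogeneity statistics, which are omitted here:

```python
import numpy as np

def fixed_effect_pool(effects, variances):
    """Inverse-variance fixed-effect pooled estimate and its variance."""
    w = 1.0 / np.asarray(variances, dtype=float)
    est = np.sum(w * effects) / np.sum(w)
    return est, 1.0 / np.sum(w)

def updated_pool(effects, variances, new_effect, new_variance):
    """Pooled estimate after appending one hypothetical future study;
    evaluating this over a grid of (new_effect, new_variance) values
    is what produces a robustness contour plot."""
    return fixed_effect_pool(np.append(effects, new_effect),
                             np.append(variances, new_variance))
```

    A meta-analysis is "robust" in this sense when no plausible (new_effect, new_variance) pair moves the pooled estimate or its significance appreciably.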

  7. Review of LMIs, Interior Point Methods, Complexity Theory, and Robustness Analysis

    NASA Technical Reports Server (NTRS)

    Mesbahi, M.

    1996-01-01

    From the end of the introduction: ...We would like to show that for certain problems in systems and control theory, there exist algorithms for which the corresponding quantity ξ can be viewed as a certain measure of robustness, e.g., a stability margin.

  8. TH-CD-209-05: Impact of Spot Size and Spacing On the Quality of Robustly-Optimized Intensity-Modulated Proton Therapy Plans for Lung Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, W; Ding, X; Hu, Y

    Purpose: To investigate how spot size and spacing affect plan quality, especially plan robustness and the impact of the interplay effect, in robustly-optimized intensity-modulated proton therapy (IMPT) plans for lung cancer. Methods: Two robustly-optimized IMPT plans were created for 10 lung cancer patients: (1) one for a proton beam with in-air energy-dependent large spot size at isocenter (σ: 5–15 mm) and spacing (1.53σ); (2) the other for a proton beam with small spot size (σ: 2–6 mm) and spacing (5 mm). Both plans were generated on the average CTs with internal-gross-tumor-volume density overridden to irradiate the internal target volume (ITV). The root-mean-square-dose volume histograms (RVH) measured the sensitivity of the dose to uncertainties, and the areas under the RVH curves were used to evaluate plan robustness. Dose evaluation software was developed to model time-dependent spot delivery to incorporate the interplay effect with randomized starting phases of each field per fraction. Patient anatomy voxels were mapped from phase to phase via deformable image registration to score doses. Dose-volume-histogram indices, including ITV coverage, homogeneity, and organs-at-risk (OAR) sparing, were compared using Student's t-test. Results: Compared to large spots, small spots resulted in significantly better OAR sparing with comparable ITV coverage and homogeneity in the nominal plan. Plan robustness was comparable for the ITV and most OARs. With the interplay effect considered, significantly better OAR sparing with comparable ITV coverage and homogeneity was observed using smaller spots. Conclusion: Robust optimization with smaller spots significantly improves OAR sparing with comparable plan robustness and a similar impact of the interplay effect compared to larger spots. A small spot size requires the use of a larger number of spots, which gives the optimizer more freedom to render a plan more robust. The ratio between spot size and spacing was found to be more relevant than spot size alone in determining plan robustness and the impact of the interplay effect. This research was supported by the National Cancer Institute Career Developmental Award K25CA168984, by the Fraternal Order of Eagles Cancer Research Fund Career Development Award, by The Lawrence W. and Marilyn W. Matteson Fund for Cancer Research, by a Mayo Arizona State University Seed Grant, and by The Kemper Marley Foundation.

  9. Modeling the impact of the 7-valent pneumococcal conjugate vaccine in Chinese infants: an economic analysis of a compulsory vaccination.

    PubMed

    Che, Datian; Zhou, Hua; He, Jinchun; Wu, Bin

    2014-02-07

    The purpose of this study was to compare, from a Chinese societal perspective, the projected health benefits, costs, and cost-effectiveness of adding pneumococcal conjugate heptavalent vaccine (PCV-7) to the routine compulsory child immunization schedule. A decision-tree model, with data and assumptions adapted for relevance to China, was developed to project the health outcomes of PCV-7 vaccination (compared with no vaccination) over a 5-year period as well as a lifetime. The vaccinated birth cohort included 16,000,000 children in China. A 2 + 1 dose schedule at US$136.51 per vaccine dose was used in the base-case analysis. One-way sensitivity analysis was used to test the robustness of the model. The impact of a net indirect effect (herd immunity) was evaluated. Outcomes are presented in terms of the averted disease burden, costs, quality-adjusted life years (QALYs), and the incremental cost-effectiveness ratio. In a Chinese birth cohort, a PCV-7 vaccination program would reduce the number of pneumococcus-related infections by at least 32% and would prevent 2,682 deaths in the first 5 years of life, saving $1,190 million in total costs and gaining an additional 9,895 QALYs (discounted by 3%). The incremental cost per QALY was estimated to be $530,354. When herd immunity was taken into account, the cost per QALY was estimated to be $95,319. The robustness of the model was influenced mainly by the PCV-7 cost per dose, vaccine effectiveness, herd immunity, and the incidence of pneumococcal diseases. With and without herd immunity, the break-even costs in China were $29.05 and $25.87, respectively. Compulsory routine infant vaccination with PCV-7 is projected to substantially reduce pneumococcal disease morbidity, mortality, and related costs in China. However, a universal vaccination program with PCV-7 is not cost-effective at the willingness-to-pay threshold that is currently recommended for China by the World Health Organization.
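    The headline numbers above follow the standard cost-effectiveness arithmetic: an incremental cost-effectiveness ratio (ICER) is the extra cost divided by the extra QALYs gained, with future QALYs discounted (here at 3%). A minimal sketch of that arithmetic; the function names and the numbers in the example are illustrative, not the study's inputs:

    ```python
    def discounted(values, rate=0.03):
        """Present value of a per-year stream, discounted at the given annual rate."""
        return sum(v / (1.0 + rate) ** t for t, v in enumerate(values))

    def icer(delta_cost, delta_qaly):
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        return delta_cost / delta_qaly

    # Illustrative only: an intervention costing $100 more per person
    # that yields 0.002 additional discounted QALYs.
    ratio = icer(100.0, 0.002)  # 50000.0 dollars per QALY
    ```

    An intervention is then judged cost-effective by comparing this ratio against a willingness-to-pay threshold, such as the WHO-recommended threshold the study applies for China.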

  10. A robust control scheme for flexible arms with friction in the joints

    NASA Technical Reports Server (NTRS)

    Rattan, Kuldip S.; Feliu, Vicente; Brown, H. Benjamin, Jr.

    1988-01-01

    A general control scheme to control flexible arms with friction in the joints is proposed in this paper. This scheme presents the advantage of being robust in the sense that it minimizes the effects of the Coulomb friction existing in the motor and the effects of changes in the dynamic friction coefficient. A justification of the robustness properties of the scheme is given in terms of the sensitivity analysis.

  11. Empirical analysis of RNA robustness and evolution using high-throughput sequencing of ribozyme reactions.

    PubMed

    Hayden, Eric J

    2016-08-15

    RNA molecules provide a realistic but tractable model of a genotype to phenotype relationship. This relationship has been extensively investigated computationally using secondary structure prediction algorithms. Enzymatic RNA molecules, or ribozymes, offer access to genotypic and phenotypic information in the laboratory. Advancements in high-throughput sequencing technologies have enabled the analysis of sequences in the lab that now rivals what can be accomplished computationally. This has motivated a resurgence of in vitro selection experiments and opened new doors for the analysis of the distribution of RNA functions in genotype space. A body of computational experiments has investigated the persistence of specific RNA structures despite changes in the primary sequence, and how this mutational robustness can promote adaptations. This article summarizes recent approaches that were designed to investigate the role of mutational robustness during the evolution of RNA molecules in the laboratory, and presents theoretical motivations, experimental methods and approaches to data analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Universal Scaling of Robust Thermal Hot Spot and Ionic Current Enhancement by Focused Ohmic Heating in a Conic Nanopore

    NASA Astrophysics Data System (ADS)

    Pan, Zehao; Wang, Ceming; Li, Meng; Chang, Hsueh-Chia

    2016-09-01

    A stable nanoscale thermal hot spot, with temperature approaching 100 °C, is shown to be sustained by localized Ohmic heating of a focused electric field at the tip of a slender conic nanopore. The self-similar (length-independent) conic geometry allows us to match the singular heat source at the tip to the singular radial heat loss from the slender cone to obtain a self-similar steady temperature profile along the cone and the resulting ionic current conductance enhancement due to viscosity reduction. The universal scaling, which depends only on a single dimensionless parameter Z, collapses the measured conductance data and computed temperature profiles in ion-track conic nanopores and conic nanopipettes. The collapsed numerical data reveal universal values for the hot-spot location and temperature in an aqueous electrolyte.

  13. Universal Scaling of Robust Thermal Hot Spot and Ionic Current Enhancement by Focused Ohmic Heating in a Conic Nanopore.

    PubMed

    Pan, Zehao; Wang, Ceming; Li, Meng; Chang, Hsueh-Chia

    2016-09-23

    A stable nanoscale thermal hot spot, with temperature approaching 100 °C, is shown to be sustained by localized Ohmic heating of a focused electric field at the tip of a slender conic nanopore. The self-similar (length-independent) conic geometry allows us to match the singular heat source at the tip to the singular radial heat loss from the slender cone to obtain a self-similar steady temperature profile along the cone and the resulting ionic current conductance enhancement due to viscosity reduction. The universal scaling, which depends only on a single dimensionless parameter Z, collapses the measured conductance data and computed temperature profiles in ion-track conic nanopores and conic nanopipettes. The collapsed numerical data reveal universal values for the hot-spot location and temperature in an aqueous electrolyte.

  14. Fire and Explosion Hazards Expected in a Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rasool, Shireen R.; Al-Dahhan, Wedad; Al-Zuhairi, Ali Jassim

    Scientists at universities across Iraq are actively working to report actual incidents and accidents occurring in their laboratories, as well as structural improvements made to improve safety and security, to raise awareness and encourage openness, leading to widespread adoption of robust Chemical Safety and Security (CSS) practices. This manuscript is the fifth in a series of five case studies describing laboratory incidents, accidents, and laboratory improvements. In this study, we summarize unsafe practices involving the improper installation of a Gas Chromatograph (GC) at an Iraqi university which, if not corrected, could have resulted in a dangerous fire and explosion. We summarize the identified infractions and highlight lessons learned. By openly sharing the experiences at the university involved, we hope to minimize the possibility of another researcher being injured due to similarly unsafe practices in the future.

  15. Robust geostatistical analysis of spatial data

    NASA Astrophysics Data System (ADS)

    Papritz, Andreas; Künsch, Hans Rudolf; Schwierz, Cornelia; Stahel, Werner A.

    2013-04-01

    Most geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are the rule rather than the exception, particularly in environmental data sets. Outliers affect the modelling of the large-scale spatial trend, the estimation of the spatial dependence of the residual variation, and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise, because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and on ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on a robustification of the estimating equations for Gaussian REML estimation (Welsh and Richardson, 1997). Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled and non-sampled locations, and kriging variances. Apart from presenting our modelling framework, we shall present selected simulation results by which we explored the properties of the new method. This will be complemented by an analysis of a data set on heavy metal contamination of the soil in the vicinity of a metal smelter. Marchant, B.P. and Lark, R.M. 2007. Robust estimation of the variogram by residual maximum likelihood. Geoderma 140: 62-72. Richardson, A.M. and Welsh, A.H. 1995. Robust restricted maximum likelihood in mixed linear models. Biometrics 51: 1429-1439. Welsh, A.H. and Richardson, A.M. 1997. Approaches to the robust estimation of mixed models. In: Handbook of Statistics Vol. 15, Elsevier, pp. 343-384.
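    For context on the "robust estimation of the sample variogram" line of work that this abstract builds on, a classical example is the Cressie-Hawkins (1980) estimator, which averages square roots of absolute increments and raises the mean to the fourth power, damping the influence of outliers relative to the classical squared-difference average. A minimal sketch of that earlier estimator, not the robust REML method proposed in this abstract; the function name is illustrative:

    ```python
    def cressie_hawkins(pair_diffs):
        """Robust semivariogram estimate at one lag from the increments
        Z(s_i) - Z(s_j) of all point pairs separated by that lag.

        Implements 2*gamma(h) = (mean of |diff|^(1/2))^4 / (0.457 + 0.494/N),
        the Cressie-Hawkins (1980) estimator, and returns gamma(h).
        """
        n = len(pair_diffs)
        m = sum(abs(d) ** 0.5 for d in pair_diffs) / n
        return 0.5 * m ** 4 / (0.457 + 0.494 / n)
    ```

    Because each increment enters through its square root rather than its square, a single gross outlier inflates this estimate far less than it inflates the classical Matheron estimator.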

  16. Multiplex social ecological network analysis reveals how social changes affect community robustness more than resource depletion.

    PubMed

    Baggio, Jacopo A; BurnSilver, Shauna B; Arenas, Alex; Magdanz, James S; Kofinas, Gary P; De Domenico, Manlio

    2016-11-29

    Network analysis provides a powerful tool to analyze complex influences of social and ecological structures on community and household dynamics. Most network studies of social-ecological systems use simple, undirected, unweighted networks. We analyze multiplex, directed, and weighted networks of subsistence food flows collected in three small indigenous communities in Arctic Alaska potentially facing substantial economic and ecological changes. Our analysis of plausible future scenarios suggests that changes to social relations and key households have greater effects on community robustness than changes to specific wild food resources.
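    One simple way to operationalize the "community robustness" measured in studies like this is to track the share of remaining households that stay connected in the food-flow network after key nodes are removed. A stdlib-only sketch of that idea (on an undirected, unweighted graph), not the authors' multiplex, directed, weighted analysis; names are illustrative:

    ```python
    from collections import defaultdict

    def largest_component_fraction(edges, removed=frozenset()):
        """Fraction of remaining nodes in the largest connected component
        after removing a set of nodes -- a common robustness proxy."""
        nodes = set()
        for u, v in edges:
            nodes.update((u, v))
        active = nodes - set(removed)
        if not active:
            return 0.0
        graph = defaultdict(set)
        for u, v in edges:
            if u in active and v in active:
                graph[u].add(v)
                graph[v].add(u)
        seen, best = set(), 0
        for start in active:          # depth-first search per component
            if start in seen:
                continue
            stack, size = [start], 0
            seen.add(start)
            while stack:
                node = stack.pop()
                size += 1
                for nb in graph[node]:
                    if nb not in seen:
                        seen.add(nb)
                        stack.append(nb)
            best = max(best, size)
        return best / len(active)

    # A chain of five households: removing the middle one splits the network.
    chain = [(1, 2), (2, 3), (3, 4), (4, 5)]
    intact = largest_component_fraction(chain)
    after_removal = largest_component_fraction(chain, removed={3})
    ```

    Removing a well-connected "key household" fragments the network far more than removing a peripheral one, which mirrors the study's finding that changes to social relations and key households dominate community robustness.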

  17. A Universal Method for Species Identification of Mammals Utilizing Next Generation Sequencing for the Analysis of DNA Mixtures

    PubMed Central

    Tillmar, Andreas O.; Dell'Amico, Barbara; Welander, Jenny; Holmlund, Gunilla

    2013-01-01

    Species identification is of interest in a wide range of areas, for example in forensic applications, food monitoring, and archeology. The vast majority of existing DNA typing methods developed for species determination focus on a single species source. There are, however, many instances where all species from mixed sources need to be determined, even when the minority species constitutes less than 1% of the sample. The introduction of next generation sequencing opens new possibilities for such challenging samples. In this study we present a universal deep sequencing method using 454 GS Junior sequencing of a target on the mitochondrial 16S rRNA gene. The method was designed through phylogenetic analyses of DNA reference sequences from more than 300 mammal species. Experiments were performed on artificial species-species mixture samples in order to verify the method's robustness and its ability to detect all species within a mixture. The method was also tested on samples from authentic forensic casework. The results were promising, showing discrimination of over 99.9% of mammal species and the ability to detect multiple donors within a mixture, including minor components constituting as little as 1% of a mixed sample. PMID:24358309

  18. Mapping of the Available Chemical Space versus the Chemical Universe of Lead-Like Compounds.

    PubMed

    Lin, Arkadii; Horvath, Dragos; Afonina, Valentina; Marcou, Gilles; Reymond, Jean-Louis; Varnek, Alexandre

    2018-03-20

    This is, to our knowledge, the most comprehensive analysis to date based on generative topographic mapping (GTM) of fragment-like chemical space (40 million molecules with no more than 17 heavy atoms, both from the theoretically enumerated GDB-17 and real-world PubChem/ChEMBL databases). The challenge was to prove that a robust map of fragment-like chemical space can actually be built, in spite of a limited (≪10^5) maximal number of compounds ("frame set") usable for fitting the GTM manifold. An evolutionary map building strategy has been updated with a "coverage check" step, which discards manifolds failing to accommodate compounds outside the frame set. The evolved map has a good propensity to separate actives from inactives for more than 20 external structure-activity sets. It was proven to properly accommodate the entire collection of 40 million compounds. Next, it served as a library comparison tool to highlight biases of real-world molecules (PubChem and ChEMBL) versus the universe of all possible species represented by FDB-17, a fragment-like subset of GDB-17 containing 10 million molecules. Specific patterns, proper to some libraries and absent from others (diversity holes), were highlighted. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Early Onset of Type 1 Diabetes and Educational Field at Upper Secondary and University Level: Is Own Experience an Asset for a Health Care Career?

    PubMed Central

    Lovén, Ida; Steen Carlsson, Katarina

    2017-01-01

    Ill health in early life has a significant negative impact on school grades, grade repetition, educational level, and labor market outcomes. However, less is known about qualitative socio-economic consequences of a health shock in childhood or adolescence. We investigate the relationship between onset of type 1 diabetes up to age 15 and the probability of choosing and completing a health-oriented path at upper secondary and university level of education. We analyze the Swedish Childhood Diabetes Register, the National Educational Register, and other population registers in Sweden for 2756 people with type 1 diabetes and 10,020 matched population controls. Educational decisions are modeled as unsorted series of binary choices to assess the choice of educational field as a potential mechanism linking early life health to adult outcomes. The analyses reject the hypothesis of no systematic differences in choice of educational field between people with and without type 1 diabetes at both levels. The results are robust to selection on ability proxies and across sensitivity analysis. We conclude that the observed pro health-oriented educational choices among people with type 1 diabetes in our data are consistent with disease onset in childhood and adolescence having qualitative impact on life-course choices. PMID:28665347

  20. Early Onset of Type 1 Diabetes and Educational Field at Upper Secondary and University Level: Is Own Experience an Asset for a Health Care Career?

    PubMed

    Lovén, Ida; Steen Carlsson, Katarina

    2017-06-30

    Ill health in early life has a significant negative impact on school grades, grade repetition, educational level, and labor market outcomes. However, less is known about qualitative socio-economic consequences of a health shock in childhood or adolescence. We investigate the relationship between onset of type 1 diabetes up to age 15 and the probability of choosing and completing a health-oriented path at upper secondary and university level of education. We analyze the Swedish Childhood Diabetes Register, the National Educational Register, and other population registers in Sweden for 2756 people with type 1 diabetes and 10,020 matched population controls. Educational decisions are modeled as unsorted series of binary choices to assess the choice of educational field as a potential mechanism linking early life health to adult outcomes. The analyses reject the hypothesis of no systematic differences in choice of educational field between people with and without type 1 diabetes at both levels. The results are robust to selection on ability proxies and across sensitivity analysis. We conclude that the observed pro health-oriented educational choices among people with type 1 diabetes in our data are consistent with disease onset in childhood and adolescence having qualitative impact on life-course choices.

  1. Strengthening research capacity through the medical education partnership initiative: the Mozambique experience

    PubMed Central

    2013-01-01

    Background Since Mozambique’s independence, the major emphasis of its higher educational institutions has been on didactic education. Because of fiscal and human resource constraints, basic and applied research activities have been relatively modest in scope, and priorities have often been set primarily by external collaborators. These factors have compromised the scope and the relevance of locally conducted research and have limited the impact of Mozambique’s universities as major catalysts for national development. Case description We developed a multi-institutional partnership to undertake a comprehensive analysis of the research environment at Mozambique’s major public universities to identify factors that have served as barriers to the development of a robust research enterprise. Based on this analysis, we developed a multifaceted plan to reduce the impact of these barriers and to enhance research capacity within Mozambique. Interventions On the basis of our needs assessment, we have implemented a number of major initiatives within participating institutions to facilitate basic and applied research activities. These have included specialized training programmes, a reorganization of the research administration infrastructure, the development of multiple collaborative research projects that have emphasized local research priorities and a substantial investment in bioinformatics. We have established a research support centre that provides grant development and management services to Mozambique’s public universities and have developed an independent Institutional Review Board for the review of research involving human research subjects. Multiple research projects involving both communicable and non-communicable diseases have been developed and substantial external research support has been obtained to undertake these projects. A sizable investment in biomedical informatics has enhanced both connectivity and access to digital reference material. 
Active engagement with relevant entities within the Government of Mozambique has aligned institutional development with national priorities. Conclusions Although multiple challenges remain, over the past 3 years significant progress has been made towards establishing conditions within which a broad range of basic, translational and clinical and public health research can be undertaken. Ongoing development of this research enterprise will enhance capacity to address critical locally relevant research questions and will leverage resources to accelerate the development of Mozambique’s national universities. PMID:24304706

  2. How Robust is Your System Resilience?

    NASA Astrophysics Data System (ADS)

    Homayounfar, M.; Muneepeerakul, R.

    2017-12-01

    Robustness and resilience are concepts in systems thinking that have grown in importance and popularity. For many complex social-ecological systems, however, robustness and resilience are difficult to quantify, and the connections and trade-offs between them difficult to study. Most studies have either focused on qualitative approaches to discuss their connections or considered only one of them under particular classes of disturbances. In this study, we present an analytical framework to address the linkage between robustness and resilience more systematically. Our analysis is based on a stylized dynamical model that operationalizes a widely used conceptual framework for social-ecological systems. The model enables us to rigorously define robustness and resilience and consequently investigate their connections. The results reveal the tradeoffs among performance, robustness, and resilience. They also show how the nature of such tradeoffs varies with the choices of certain policies (e.g., taxation and investment in public infrastructure), internal stresses, and external disturbances.

  3. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and highlights various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis will lead to confident results that will ensure quantitative proteomics delivers.

  4. Reliability Assessment for Low-cost Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Freeman, Paul Michael

    Existing low-cost unmanned aerospace systems are unreliable, and engineers must blend reliability analysis with fault-tolerant control in novel ways. This dissertation introduces the University of Minnesota unmanned aerial vehicle flight research platform, a comprehensive simulation and flight test facility for reliability and fault-tolerance research. An industry-standard reliability assessment technique, the failure modes and effects analysis, is performed for an unmanned aircraft. Particular attention is afforded to the control surface and servo-actuation subsystem. Maintaining effector health is essential for safe flight; failures may lead to loss of control incidents. Failure likelihood, severity, and risk are qualitatively assessed for several effector failure modes. Design changes are recommended to improve aircraft reliability based on this analysis. Most notably, the control surfaces are split, providing independent actuation and dual-redundancy. The simulation models for control surface aerodynamic effects are updated to reflect the split surfaces using a first-principles geometric analysis. The failure modes and effects analysis is extended by using a high-fidelity nonlinear aircraft simulation. A trim state discovery is performed to identify the achievable steady, wings-level flight envelope of the healthy and damaged vehicle. Tolerance of elevator actuator failures is studied using familiar tools from linear systems analysis. This analysis reveals significant inherent performance limitations for candidate adaptive/reconfigurable control algorithms used for the vehicle. Moreover, it demonstrates how these tools can be applied in a design feedback loop to make safety-critical unmanned systems more reliable. Control surface impairments that do occur must be quickly and accurately detected. 
This dissertation also considers fault detection and identification for an unmanned aerial vehicle using model-based and model-free approaches and applies those algorithms to experimental faulted and unfaulted flight test data. Flight tests are conducted with actuator faults that affect the plant input and sensor faults that affect the vehicle state measurements. A model-based detection strategy is designed and uses robust linear filtering methods to reject exogenous disturbances, e.g. wind, while providing robustness to model variation. A data-driven algorithm is developed to operate exclusively on raw flight test data without physical model knowledge. The fault detection and identification performance of these complementary but different methods is compared. Together, enhanced reliability assessment and multi-pronged fault detection and identification techniques can help to bring about the next generation of reliable low-cost unmanned aircraft.

  5. Creativity and Occupational Accomplishments Among Intellectually Precocious Youths: An Age 13 to Age 33 Longitudinal Study

    ERIC Educational Resources Information Center

    Wai, Jonathan; Lubinski, David; Benbow, Camilla P.

    2005-01-01

    This study tracks intellectually precocious youths (top 1%) over 20 years. Phase 1 (N = 1,243 boys, 732 girls) examines the significance of age 13 ability differences within the top 1% for predicting doctorates, income, patents, and tenure at U.S. universities ranked within the top 50. Phase 2 (N = 323 men, 188 women) evaluates the robustness of…

  6. Prosodic Stress, Information, and Intelligibility of Speech in Noise

    DTIC Science & Technology

    2009-02-28

    across periods during which acoustic information has been suppressed. SUBJECT TERMS: Robust speech intelligibility; computational model of...Research Fellow at the Department of Computer Science at the University of Southern California). This research involved superimposing acoustic and...presented at an invitational-only session of the Acoustical Society of America's and European Acoustics Association's joint meeting in 2008. In summary, the

  7. Web-Based vs. Face-to-Face MBA Classes: A Comparative Assessment Study

    ERIC Educational Resources Information Center

    Brownstein, Barry; Brownstein, Deborah; Gerlowski, Daniel A.

    2008-01-01

    The challenges of online learning include ensuring that the learning outcomes are at least as robust as in the face-to-face sections of the same course. At the University of Baltimore, both online sections and face-to-face sections of core MBA courses are offered. Once admitted to the MBA, students are free to enroll in any combination of…

  8. Numerical Nonlinear Robust Control with Applications to Humanoid Robots

    DTIC Science & Technology

    2015-07-01

    automatically. While optimization and optimal control theory have been widely applied in humanoid robot control, it is not without drawbacks. A blind... drawback of Galerkin-based approaches is the need to successively produce discrete forms, which is difficult to implement in practice. Related...universal function approximation ability, these approaches are not without drawbacks. In practice, while a single hidden layer neural network can

  9. Addressing the Uncertain Future of Preserving the Past: Towards a Robust Strategy for Digital Archiving and Preservation. Technical Report. Executive Summary

    ERIC Educational Resources Information Center

    Hoorens, Stijn; Rothenberg, Jeff; van Oranje-Nassau, Constantijn; van der Mandele, Martin; Levitt, Ruth

    2007-01-01

    Storing and curating authentic academic literature and making it accessible for the long term has been a time-honoured task of national libraries. By guarding existing knowledge and facilitating its use to produce new insights, national and university libraries have formed an integral part of the research environment, complementing the roles of…

  10. Five Decades of Adult Education at U.C.C., 1948-1998: From Roman Catholic Social Reconstructionism to Community Partnerships and Empowerment.

    ERIC Educational Resources Information Center

    O Fathaigh, Mairtin; O'Sullivan, Denis

    Looking back over the past 5 decades of adult education at University College, Cork, one is struck by the realities of continuity and change as the guiding rationale moved from Roman Catholic reconstructionism to community partnership and empowerment. Structures put in place under President O'Rahilly's sponsorship persisted so robustly they…

  11. Universal properties of knotted polymer rings.

    PubMed

    Baiesi, M; Orlandini, E

    2012-09-01

    By performing Monte Carlo sampling of N-step self-avoiding polygons embedded on different Bravais lattices, we explore the robustness of universality in the entropic, metric, and geometrical properties of knotted polymer rings. In particular, by simulating polygons with N up to 10^5 we furnish a sharp estimate of the asymptotic values of the knot probability ratios and show their independence of the lattice type. This universal feature was previously suggested, although with different estimates of the asymptotic values. In addition, we show that the scaling behavior of the mean-squared radius of gyration of polygons depends on their knot type only through its correction to scaling. Finally, as a measure of the geometrical self-entanglement of the self-avoiding polygons, we consider the standard deviation of the writhe distribution and estimate its power-law behavior in the large-N limit. The estimates of the power exponent depend neither on the lattice nor on the knot type, strongly supporting an extension of the universality property to some features of the geometrical entanglement.

  12. A Web-Based Review of Sexual and Reproductive Health Services Available at Colleges and Universities in Georgia.

    PubMed

    Cushing, Katherine F; Carson, Anna E; Short, Tyiesha D; Kot, Stefanie N; Tschokert, Merete; Sales, Jessica M

    2018-04-13

    Although two-thirds of graduating high school seniors attend college or university in the U.S., there is a paucity of national or state-specific research regarding sexual and reproductive health (SRH) services available on or near college and university campuses. A review of websites for all colleges and universities in Georgia was conducted to evaluate sexual health services available on campuses and evidence of referral to community providers. Of 96 colleges in Georgia, 44 had campus-located health centers, with only 3 at two-year colleges. Overall SRH service provision was low, with great variation between colleges. Distances between colleges and Title X clinics ranged from 0.33 to 35.45 miles. Many students lack access to campus health centers, and information on college websites regarding SRH service availability and referrals differs dramatically between campuses. In the absence of robust campus-located services, schools should highlight where students can obtain comprehensive SRH care in the community.

  13. Implementing universal nonadiabatic holonomic quantum gates with transmons

    NASA Astrophysics Data System (ADS)

    Hong, Zhuo-Ping; Liu, Bao-Jie; Cai, Jia-Qi; Zhang, Xin-Ding; Hu, Yong; Wang, Z. D.; Xue, Zheng-Yuan

    2018-02-01

    Geometric phases are well known to be noise resilient in quantum evolutions and operations. Holonomic quantum gates provide us with a robust way towards universal quantum computation, as these quantum gates are actually induced by non-Abelian geometric phases. Here we propose and elaborate how to efficiently implement universal nonadiabatic holonomic quantum gates on simpler superconducting circuits, with a single transmon serving as a qubit. In our proposal, an arbitrary single-qubit holonomic gate can be realized in a single-loop scenario by varying the amplitudes and phase difference of two microwave fields resonantly coupled to a transmon, while nontrivial two-qubit holonomic gates may be generated with a transmission-line resonator being simultaneously coupled to the two target transmons in an effective resonant way. Moreover, our scenario may readily be scaled up to a two-dimensional lattice configuration, which is able to support large scalable quantum computation, paving the way for practically implementing universal nonadiabatic holonomic quantum computation with superconducting circuits.

  14. First Universities Allied for Essential Medicines (UAEM) Neglected Diseases and Innovation Symposium

    PubMed Central

    Musselwhite, Laura W.; Maciag, Karolina; Lankowski, Alex; Gretes, Michael C.; Wellems, Thomas E.; Tavera, Gloria; Goulding, Rebecca E.; Guillen, Ethan

    2012-01-01

    Universities Allied for Essential Medicines organized its first Neglected Diseases and Innovation Symposium to address expanding roles of public sector research institutions in innovation in research and development of biomedical technologies for treatment of diseases, particularly neglected tropical diseases. Universities and other public research institutions are increasingly integrated into the pharmaceutical innovation system. Academic entities now routinely undertake robust high-throughput screening and medicinal chemistry research programs to identify lead compounds for small molecule drugs and novel drug targets. Furthermore, product development partnerships are emerging between academic institutions, non-profit entities, and biotechnology and pharmaceutical companies to create diagnostics, therapies, and vaccines for diseases of the poor. With not-for-profit mission statements, open-access publishing standards, open-source platforms for data sharing and collaboration, and a shift in focus to more translational research, universities and other public research institutions are well placed to accelerate development of medical technologies, particularly for neglected tropical diseases. PMID:22232453

  15. Revisiting an old concept: the coupled oscillator model for VCD. Part 2: implications of the generalised coupled oscillator mechanism for the VCD robustness concept.

    PubMed

    Nicu, Valentin Paul

    2016-08-03

    Using two illustrative examples, it is shown that the generalised coupled oscillator (GCO) mechanism implies that the stability of the VCD sign computed for a given normal mode is not reflected by the magnitude of the ratio ζ between the rotational strength and the dipole strength of the respective mode, i.e., the VCD robustness criterion proposed by Góbi and Magyarfalvi. The VCD GCO analysis performed here brings further insight into the GCO mechanism and into the VCD robustness concept. First, it shows that the GCO mechanism can be interpreted as a VCD resonance enhancement mechanism: very large VCD signals can be observed when the interacting molecular fragments are in a favourable orientation. Second, it shows that the uncertainties observed in the computed VCD signs are associated with uncertainties in the relative orientation of the coupled oscillator fragments and/or in the predicted nuclear displacement vectors, i.e., not with uncertainties in the computed magnetic dipole transition moments as was originally assumed. Since it is able to identify such situations easily, the VCD GCO analysis can be used as a VCD robustness analysis.

  16. The Problem of Size in Robust Design

    NASA Technical Reports Server (NTRS)

    Koch, Patrick N.; Allen, Janet K.; Mistree, Farrokh; Mavris, Dimitri

    1997-01-01

    To facilitate the effective solution of multidisciplinary, multiobjective complex design problems, a departure from the traditional parametric design analysis and single objective optimization approaches is necessary in the preliminary stages of design. A necessary tradeoff becomes one of efficiency vs. accuracy as approximate models are sought to allow fast analysis and effective exploration of a preliminary design space. In this paper we apply a general robust design approach for efficient and comprehensive preliminary design to a large complex system: a high speed civil transport (HSCT) aircraft. Specifically, we investigate the HSCT wing configuration design, incorporating life cycle economic uncertainties to identify economically robust solutions. The approach is built on the foundation of statistical experimentation and modeling techniques and robust design principles, and is specialized through incorporation of the compromise Decision Support Problem for multiobjective design. For large problems however, as in the HSCT example, this robust design approach developed for efficient and comprehensive design breaks down with the problem of size: a combinatorial explosion in experimentation and model building with the number of variables, in which both efficiency and accuracy are sacrificed. Our focus in this paper is on identifying and discussing the implications and open issues associated with the problem of size for the preliminary design of large complex systems.
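The combinatorial explosion the authors call the "problem of size" is easy to quantify: classical designed experiments grow exponentially with the number of design variables. A minimal illustration using generic textbook run-count formulas (not taken from the paper):

```python
def full_factorial_runs(k, levels=2):
    """Runs in a full-factorial design: levels**k."""
    return levels ** k

def central_composite_runs(k):
    """Runs in a standard central composite design:
    2**k corner points + 2*k axial points + 1 center point."""
    return 2 ** k + 2 * k + 1

# Run counts explode with the number of design variables k.
for k in (2, 5, 10, 20):
    print(k, full_factorial_runs(k), central_composite_runs(k))
```

At k = 20 the corner points alone exceed one million runs, which is why response-surface model building becomes infeasible for large systems.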

  17. Robustness of fit indices to outliers and leverage observations in structural equation modeling.

    PubMed

    Yuan, Ke-Hai; Zhong, Xiaoling

    2013-06-01

    Normal-distribution-based maximum likelihood (NML) is the most widely used method in structural equation modeling (SEM), although practical data tend to be nonnormally distributed. The effect of nonnormally distributed data or data contamination on the normal-distribution-based likelihood ratio (LR) statistic is well understood due to many analytical and empirical studies. In SEM, fit indices are used as widely as the LR statistic. In addition to NML, robust procedures have been developed for more efficient and less biased parameter estimates with practical data. This article studies the effect of outliers and leverage observations on fit indices following NML and two robust methods. Analysis and empirical results indicate that good leverage observations following NML and one of the robust methods lead most fit indices to give more support to the substantive model. While outliers tend to make a good model superficially bad according to many fit indices following NML, they have little effect on those following the two robust procedures. Implications of the results to data analysis are discussed, and recommendations are provided regarding the use of estimation methods and interpretation of fit indices. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  18. Robust Global Image Registration Based on a Hybrid Algorithm Combining Fourier and Spatial Domain Techniques

    DTIC Science & Technology

    2012-09-01

    Robust global image registration based on a hybrid algorithm combining Fourier and spatial domain techniques, Peter N. Crabtree, Collin Seanor, et al. (2012). The report demonstrates the performance of a hybrid registration algorithm through analysis of a set of images of an ISO 12233 resolution chart.
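A standard Fourier-domain building block for global (translation-only) registration of the kind the title describes is phase correlation; the sketch below is a generic illustration of that component, not the report's hybrid algorithm:

```python
import numpy as np

def phase_correlation_shift(ref, moved):
    """Estimate the (row, col) translation between two same-size
    images from the peak of the normalized cross-power spectrum."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moved)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12        # keep only the phase
    corr = np.fft.ifft2(cross).real       # impulse at the shift
    return tuple(int(i) for i in np.unravel_index(np.argmax(corr), corr.shape))

rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (5, 12), axis=(0, 1))     # circular shift
print(phase_correlation_shift(img, shifted))     # -> (5, 12)
```

For a pure circular shift the normalized spectrum is a complex exponential whose inverse FFT is an exact impulse at the shift, which is what makes the method robust to global illumination changes.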

  19. Efficient Computation of Info-Gap Robustness for Finite Element Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stull, Christopher J.; Hemez, Francois M.; Williams, Brian J.

    2012-07-05

    A recent research effort at LANL proposed info-gap decision theory as a framework by which to measure the predictive maturity of numerical models. Info-gap theory explores the trade-offs between accuracy, that is, the extent to which predictions reproduce the physical measurements, and robustness, that is, the extent to which predictions are insensitive to modeling assumptions. Both accuracy and robustness are necessary to demonstrate predictive maturity. However, conducting an info-gap analysis can present a formidable challenge, from the standpoint of the required computational resources. This is because a robustness function requires the resolution of multiple optimization problems. This report offers an alternative, adjoint methodology to assess the info-gap robustness of Ax = b-like numerical models solved for a solution x. Two situations that can arise in structural analysis and design are briefly described and contextualized within the info-gap decision theory framework. The treatments of the info-gap problems, using the adjoint methodology are outlined in detail, and the latter problem is solved for four separate finite element models. As compared to statistical sampling, the proposed methodology offers highly accurate approximations of info-gap robustness functions for the finite element models considered in the report, at a small fraction of the computational cost. It is noted that this report considers only linear systems; a natural follow-on study would extend the methodologies described herein to include nonlinear systems.
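The report's adjoint formulation is not reproduced here, but the basic adjoint trick for Ax = b models can be sketched generically: for a scalar output J = c'x, a single adjoint solve yields the sensitivity of J to every entry of A and b, which is what makes repeated robustness evaluations cheap compared with re-solving the model. The matrices below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = rng.random((n, n)) + n * np.eye(n)   # well-conditioned model matrix
b = rng.random(n)
c = rng.random(n)                        # output of interest: J(x) = c @ x

x = np.linalg.solve(A, b)
lam = np.linalg.solve(A.T, c)            # one adjoint solve

dJ_db = lam                              # dJ/db_i   = lam_i
dJ_dA = -np.outer(lam, x)                # dJ/dA_ij  = -lam_i * x_j

# Finite-difference check on one entry of A
eps = 1e-6
A2 = A.copy(); A2[2, 3] += eps
fd = (c @ np.linalg.solve(A2, b) - c @ x) / eps
print(abs(fd - dJ_dA[2, 3]))             # agrees to finite-difference accuracy
```

With these gradients in hand, the worst case of J over an uncertainty ball around A and b can be approximated without rerunning the full model, at a fraction of the cost of sampling.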

  20. Trading Robustness Requirements in Mars Entry Trajectory Design

    NASA Technical Reports Server (NTRS)

    Lafleur, Jarret M.

    2009-01-01

    One of the most important metrics characterizing an atmospheric entry trajectory in preliminary design is the size of its predicted landing ellipse. Often, requirements for this ellipse are set early in design and significantly influence both the expected scientific return from a particular mission and the cost of development. Requirements typically specify a certain probability level (σ-level) for the prescribed ellipse, and frequently this latter requirement is taken at 3σ. However, searches for the justification of 3σ as a robustness requirement suggest it is an empirical rule of thumb borrowed from non-aerospace fields. This paper presents an investigation into the sensitivity of trajectory performance to varying robustness (σ-level) requirements. The treatment of robustness as a distinct objective is discussed, and an analysis framework is presented involving the manipulation of design variables to effect trades between performance and robustness objectives. The scenario for which this method is illustrated is the ballistic entry of an MSL-class Mars entry vehicle. Here, the design variable is entry flight path angle, and objectives are parachute deploy altitude performance and error ellipse robustness. Resulting plots show the sensitivities between these objectives and trends in the entry flight path angles required to design to these objectives. Relevance to the trajectory designer is discussed, as are potential steps for further development and use of this type of analysis.
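The σ-level requirement has a concrete probabilistic meaning for a two-dimensional landing ellipse, and it differs from the familiar one-dimensional 68/95/99.7 rule: in 2-D a 3σ ellipse covers about 98.9% of outcomes. A short sketch with a hypothetical landing covariance:

```python
import numpy as np
from scipy.stats import chi2

# In 2-D, the probability mass inside the k-sigma error ellipse follows
# the chi-square distribution with 2 degrees of freedom.
for k in (1, 2, 3):
    p = chi2.cdf(k ** 2, df=2)
    print(f"{k}-sigma ellipse contains {100 * p:.2f}% of landings")

# Semi-axes of the 3-sigma ellipse from a hypothetical landing
# covariance (km^2): sqrt of eigenvalues, scaled by the sigma level.
cov = np.array([[9.0, 2.0],
                [2.0, 4.0]])
evals = np.linalg.eigvalsh(cov)
print("3-sigma semi-axes (km):", 3.0 * np.sqrt(evals))
```

Raising the σ-level requirement therefore scales the required ellipse linearly in each axis, which is the robustness side of the trade studied in the paper.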

  1. Pixel-level multisensor image fusion based on matrix completion and robust principal component analysis

    NASA Astrophysics Data System (ADS)

    Wang, Zhuozheng; Deller, J. R.; Fleet, Blair D.

    2016-01-01

    Acquired digital images are often corrupted by a lack of camera focus, faulty illumination, or missing data. An algorithm is presented for fusion of multiple corrupted images of a scene using the lifting wavelet transform. The method employs adaptive fusion arithmetic based on matrix completion and self-adaptive regional variance estimation. Characteristics of the wavelet coefficients are used to adaptively select fusion rules. Robust principal component analysis is applied to low-frequency image components, and regional variance estimation is applied to high-frequency components. Experiments reveal that the method is effective for multifocus, visible-light, and infrared image fusion. Compared with traditional algorithms, the new algorithm not only increases the amount of preserved information and clarity but also improves robustness.
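The fusion rules themselves are not reproduced here, but the robust principal component analysis step the method builds on can be sketched with a standard solver: principal component pursuit via the inexact augmented Lagrange multiplier method. This is a generic implementation, not the paper's algorithm:

```python
import numpy as np

def rpca(M, lam=None, rho=1.5, tol=1e-7, max_iter=500):
    """Decompose M into low-rank L plus sparse S (principal component
    pursuit), solved by the inexact augmented Lagrange multiplier method."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    norm_M = np.linalg.norm(M, 'fro')
    mu = 1.25 / np.linalg.norm(M, 2)
    Y = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(max_iter):
        # Singular value thresholding step for the low-rank part
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # Soft-thresholding (shrinkage) step for the sparse part
        T = M - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        Z = M - L - S
        Y += mu * Z
        mu = min(mu * rho, 1e7)
        if np.linalg.norm(Z, 'fro') < tol * norm_M:
            break
    return L, S

# Synthetic check: recover a rank-2 matrix from sparse corruption
rng = np.random.default_rng(9)
L0 = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 50))
S0 = np.zeros((50, 50))
idx = rng.random((50, 50)) < 0.05
S0[idx] = 5.0 * rng.choice([-1.0, 1.0], idx.sum())
L, S = rpca(L0 + S0)
print(np.linalg.norm(L - L0, 'fro') / np.linalg.norm(L0, 'fro'))
```

In the fusion context the low-rank component plays the role of the clean low-frequency image content, while the sparse component absorbs corruption such as missing or faulty pixels.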

  2. Analysis, calculation and utilization of the k-balance attribute in interdependent networks

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Li, Qing; Wang, Dan; Xu, Mingwei

    2018-05-01

    Interdependent networks, where two networks depend on each other, are becoming more and more significant in modern systems. From previous work, it can be concluded that interdependent networks are more vulnerable than a single network, so their robustness deserves special attention. In this paper, we propose a metric of robustness from a new perspective: balance. First, we define the balance-coefficient of the interdependent system. Based on precise analysis and derivation, we prove several key theorems and provide an efficient algorithm to compute the balance-coefficient. Finally, we propose an optimal solution that reduces the balance-coefficient to enhance the robustness of the given system. Comprehensive experiments confirm the efficiency of our algorithms.

  3. Robustness of statistical tests for multiplicative terms in the additive main effects and multiplicative interaction model for cultivar trials.

    PubMed

    Piepho, H P

    1995-03-01

    The additive main effects and multiplicative interaction model is frequently used in the analysis of multilocation trials. In the analysis of such data it is of interest to decide how many of the multiplicative interaction terms are significant. Several tests for this task are available, all of which assume that errors are normally distributed with a common variance. This paper investigates the robustness of several tests (Gollob, F_GH1, F_GH2, F_R) to departures from these assumptions. It is concluded that, because of its better robustness, the F_R test is preferable. If the other tests are to be used, preliminary tests for the validity of assumptions should be performed.

  4. Robustness Regions for Dichotomous Decisions.

    ERIC Educational Resources Information Center

    Vijn, Pieter; Molenaar, Ivo W.

    1981-01-01

    In the case of dichotomous decisions, the total set of all assumptions/specifications for which the decision would have been the same is the robustness region. Inspection of this (data-dependent) region is a form of sensitivity analysis which may lead to improved decision making. (Author/BW)

  5. Multiobjective robust design of the double wishbone suspension system based on particle swarm optimization.

    PubMed

    Cheng, Xianfu; Lin, Yuqun

    2014-01-01

    The performance of the suspension system is one of the most important factors in the vehicle design. For the double wishbone suspension system, the conventional deterministic optimization does not consider any deviations of design parameters, so design sensitivity analysis and robust optimization design are proposed. In this study, the design parameters of the robust optimization are the positions of the key points, and the random factors are the uncertainties in manufacturing. A simplified model of the double wishbone suspension is established by software ADAMS. The sensitivity analysis is utilized to determine main design variables. Then, the simulation experiment is arranged and the Latin hypercube design is adopted to find the initial points. The Kriging model is employed for fitting the mean and variance of the quality characteristics according to the simulation results. Further, a particle swarm optimization method based on simple PSO is applied and the tradeoff between the mean and deviation of performance is made to solve the robust optimization problem of the double wishbone suspension system.
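The mean/variance tradeoff solved by the particle swarm can be sketched on a toy response surface; every function and constant below is a hypothetical stand-in for the ADAMS simulation and Kriging surrogate. The point of robust design is visible directly: a sharp optimum loses to a flatter one once perturbations are accounted for.

```python
import numpy as np

def f(x):
    """Toy response: a deep but narrow optimum near x = 0 and a
    shallower but flat one near x = 2."""
    return -2.0 * np.exp(-(x / 0.1) ** 2) - np.exp(-((x - 2.0) / 1.0) ** 2)

def robust_objective(x, delta=0.3):
    """Mean + standard deviation of the response under manufacturing
    perturbations, mimicking the mean/variance robust-design tradeoff."""
    vals = np.array([f(x + d) for d in (-delta, 0.0, delta)])
    return vals.mean() + vals.std()

# Basic (simple) PSO on the robust objective
rng = np.random.default_rng(3)
n, iters = 30, 200
pos = rng.uniform(-1.0, 4.0, n)
vel = np.zeros(n)
pbest = pos.copy()
pbest_val = np.array([robust_objective(p) for p in pos])
g = pbest[np.argmin(pbest_val)]
for _ in range(iters):
    r1, r2 = rng.random(n), rng.random(n)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = pos + vel
    val = np.array([robust_objective(p) for p in pos])
    improved = val < pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = val[improved]
    g = pbest[np.argmin(pbest_val)]
print(g)   # near the flat optimum at x ~ 2, not the sharp one at x ~ 0
```

The deterministic minimum of f is at x = 0, but under ±0.3 perturbations the robust objective strongly prefers the broad well near x = 2, which is the behavior the paper's Kriging-plus-PSO procedure exploits at a larger scale.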

  6. Linear, multivariable robust control with a mu perspective

    NASA Technical Reports Server (NTRS)

    Packard, Andy; Doyle, John; Balas, Gary

    1993-01-01

    The structured singular value is a linear algebra tool developed to study a particular class of matrix perturbation problems arising in robust feedback control of multivariable systems. These perturbations are called linear fractional, and are a natural way to model many types of uncertainty in linear systems, including state-space parameter uncertainty, multiplicative and additive unmodeled dynamics uncertainty, and coprime factor and gap metric uncertainty. The structured singular value theory provides a natural extension of classical SISO robustness measures and concepts to MIMO systems. The structured singular value analysis, coupled with approximate synthesis methods, make it possible to study the tradeoff between performance and uncertainty that occurs in all feedback systems. In MIMO systems, the complexity of the spatial interactions in the loop gains make it difficult to heuristically quantify the tradeoffs that must occur. This paper examines the role played by the structured singular value (and its computable bounds) in answering these questions, as well as its role in the general robust, multivariable control analysis and design problem.
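The computable bounds mentioned above can be illustrated for the simplest scaling structure: minimizing the largest singular value of D M D^-1 over positive diagonal matrices D gives an upper bound on the structured singular value, while the spectral radius is a lower bound. A generic numerical sketch (hypothetical M, diagonal scalings only):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n))

def scaled_norm(logd):
    """Largest singular value of D M D^-1 with D = diag(d), d_1 = 1."""
    d = np.exp(np.concatenate(([0.0], logd)))
    return np.linalg.norm((d[:, None] * M) / d[None, :], 2)

res = minimize(scaled_norm, np.zeros(n - 1), method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-10, "maxiter": 5000})
ub = res.fun                                   # D-scaled upper bound on mu
rho = np.max(np.abs(np.linalg.eigvals(M)))     # lower bound on mu
print(rho, "<= mu <=", ub, "<=", np.linalg.norm(M, 2))
```

Since D M D^-1 is a similarity transform, the eigenvalues (hence the spectral radius) are unchanged, while the optimized scaling can only shrink the induced 2-norm; the gap between the two bounds is what dedicated mu software tightens further.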

  7. Quality by Design: Multidimensional exploration of the design space in high performance liquid chromatography method development for better robustness before validation.

    PubMed

    Monks, K; Molnár, I; Rieger, H-J; Bogáti, B; Szabó, E

    2012-04-06

    Robust HPLC separations lead to fewer analysis failures and better method transfer as well as providing an assurance of quality. This work presents the systematic development of an optimal, robust, fast UHPLC method for the simultaneous assay of two APIs of an eye drop sample and their impurities, in accordance with Quality by Design principles. Chromatography software is employed to effectively generate design spaces (Method Operable Design Regions), which are subsequently employed to determine the final method conditions and to evaluate robustness prior to validation. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Robust model predictive control for constrained continuous-time nonlinear systems

    NASA Astrophysics Data System (ADS)

    Sun, Tairen; Pan, Yongping; Zhang, Jun; Yu, Haoyong

    2018-02-01

    In this paper, a robust model predictive control (MPC) is designed for a class of constrained continuous-time nonlinear systems with bounded additive disturbances. The robust MPC consists of a nonlinear feedback control and a continuous-time model-based dual-mode MPC. The nonlinear feedback control guarantees the actual trajectory being contained in a tube centred at the nominal trajectory. The dual-mode MPC is designed to ensure asymptotic convergence of the nominal trajectory to zero. This paper extends current results on discrete-time model-based tube MPC and linear system model-based tube MPC to continuous-time nonlinear model-based tube MPC. The feasibility and robustness of the proposed robust MPC have been demonstrated by theoretical analysis and applications to a cart-damper-spring system and a one-link robot manipulator.

  9. Robust Magnetotelluric Impedance Estimation

    NASA Astrophysics Data System (ADS)

    Sutarno, D.

    2010-12-01

    Robust magnetotelluric (MT) response function estimators are now in standard use in the induction community. Properly devised and applied, they have the ability to reduce the influence of unusual data (outliers). These estimators always yield impedance estimates that are better than conventional least squares (LS) estimation, because 'real' MT data almost never satisfy the statistical assumptions of Gaussianity and stationarity upon which normal spectral analysis is based. This paper discusses the development and application to MT data of robust estimation procedures that can be classified as M-estimators. Starting with a description of the estimators, special attention is given to the recent development of bounded-influence robust estimation, including utilization of the Hilbert transform (HT) operation on causal MT impedance functions. The resulting robust performance is illustrated using synthetic as well as real MT data.
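The M-estimation idea can be sketched in a simplified real-valued setting (actual MT impedance estimation is complex-valued and done per frequency band): iteratively reweighted least squares with Huber weights, which downweights residuals that violate the Gaussian assumption. All data below are synthetic:

```python
import numpy as np

def huber_irls(X, y, k=1.345, iters=50):
    """Huber M-estimate of y ~ X b via iteratively reweighted least
    squares; k = 1.345 is the usual 95%-efficiency tuning constant."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]          # start from OLS
    for _ in range(iters):
        r = y - X @ b
        s = np.median(np.abs(r)) / 0.6745 + 1e-12     # robust scale (MAD)
        u = np.abs(r / s)
        w = np.where(u <= k, 1.0, k / u)              # Huber weights
        sw = np.sqrt(w)
        b = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return b

# Linear "response function" with 20% gross outliers
rng = np.random.default_rng(4)
n = 200
x = rng.uniform(-1, 1, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + 0.2 * rng.standard_normal(n)
y[rng.choice(n, 40, replace=False)] += 15.0

b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
b_rob = huber_irls(X, y)
print(b_ols, b_rob)   # the robust fit stays near the true (1, 2)
```

Bounded-influence estimators of the kind discussed in the paper additionally downweight unusual predictor (leverage) values, not just unusual residuals.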

  10. Developing robust recurrence plot analysis techniques for investigating infant respiratory patterns.

    PubMed

    Terrill, Philip I; Wilson, Stephen; Suresh, Sadasivam; Cooper, David M

    2007-01-01

    Recurrence plot analysis is a useful non-linear analysis tool. There are still no well-formalised procedures for carrying out this analysis on measured physiological data, and systemising the analysis is often difficult. In this paper, recurrence-based embedding is compared to radius-based embedding by studying a logistic attractor and measured breathing data collected from sleeping human infants. Recurrence-based embedding appears to be a more robust way to carry out a recurrence analysis when attractor size is likely to differ between datasets. In the infant breathing data, the radius measure calculated at a fixed recurrence, scaled by average respiratory period, allows accurate discrimination of active sleep from quiet sleep states (AUC=0.975, Sn=0.98, Sp=0.94).
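The recurrence-based (fixed recurrence rate) approach favoured here can be sketched as deriving the threshold radius from the data's own distance distribution, so the result does not depend on the attractor size of a particular dataset. The signal and parameters below are hypothetical:

```python
import numpy as np

def embed(x, dim=3, lag=5):
    """Time-delay embedding of a scalar series."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag:i * lag + n] for i in range(dim)])

rng = np.random.default_rng(5)
t = np.linspace(0, 12 * np.pi, 800)
sig = 3.0 * np.sin(t) + 0.05 * rng.standard_normal(t.size)   # large-amplitude signal

E = embed(sig)
d = np.linalg.norm(E[:, None, :] - E[None, :, :], axis=-1)
off = d[~np.eye(len(E), dtype=bool)]          # off-diagonal distances

# Recurrence-based embedding: fix the recurrence rate (here 5%) and
# let the radius follow from the data, instead of fixing the radius.
r = np.quantile(off, 0.05)
rate = np.mean(off <= r)
print(r, rate)    # rate ~ 0.05 regardless of signal amplitude
```

Rescaling the signal rescales r proportionally but leaves the recurrence rate at 5%, which is exactly the robustness to attractor size the paper argues for.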

  11. A robust optimization model for distribution and evacuation in the disaster response phase

    NASA Astrophysics Data System (ADS)

    Fereiduni, Meysam; Shahanaghi, Kamran

    2017-03-01

    Natural disasters, such as earthquakes, affect thousands of people and can cause enormous financial loss. Therefore, an efficient response immediately following a natural disaster is vital to minimize the aforementioned negative effects. This research paper presents a network design model for humanitarian logistics which will assist in location and allocation decisions for multiple disaster periods. At first, a single-objective optimization model is presented that addresses the response phase of disaster management. This model will help the decision makers to make the most optimal choices in regard to location, allocation, and evacuation simultaneously. The proposed model also considers emergency tents as temporary medical centers. To cope with the uncertainty and dynamic nature of disasters, and their consequences, our multi-period robust model considers the values of critical input data in a set of various scenarios. Second, because of probable disruption in the distribution infrastructure (such as bridges), the Monte Carlo simulation is used for generating related random numbers and different scenarios; the p-robust approach is utilized to formulate the new network. The p-robust approach can predict possible damages along pathways and among relief bases. We present a case study of our robust optimization approach for Tehran's plausible earthquake in region 1. Sensitivity analysis experiments are proposed to explore the effects of various problem parameters. These experiments give managerial insights and can guide decision makers under a variety of conditions. Then, the performances of the "robust optimization" approach and the "p-robust optimization" approach are evaluated. Intriguing results and practical insights are demonstrated by our analysis of this comparison.

  12. Guaranteeing robustness of structural condition monitoring to environmental variability

    NASA Astrophysics Data System (ADS)

    Van Buren, Kendra; Reilly, Jack; Neal, Kyle; Edwards, Harry; Hemez, François

    2017-01-01

    Advances in sensor deployment and computational modeling have allowed significant strides to be recently made in the field of Structural Health Monitoring (SHM). One widely used SHM strategy is to perform a vibration analysis where a model of the structure's pristine (undamaged) condition is compared with vibration response data collected from the physical structure. Discrepancies between model predictions and monitoring data can be interpreted as structural damage. Unfortunately, multiple sources of uncertainty must also be considered in the analysis, including environmental variability, unknown model functional forms, and unknown values of model parameters. Not accounting for these sources of uncertainty can lead to false-positives or false-negatives in the structural condition assessment. To manage the uncertainty, we propose a robust SHM methodology that combines three technologies. A time series algorithm is trained using "baseline" data to predict the vibration response, compare predictions to actual measurements collected on a potentially damaged structure, and calculate a user-defined damage indicator. The second technology handles the uncertainty present in the problem. An analysis of robustness is performed to propagate this uncertainty through the time series algorithm and obtain the corresponding bounds of variation of the damage indicator. The uncertainty description and robustness analysis are both inspired by the theory of info-gap decision-making. Lastly, an appropriate "size" of the uncertainty space is determined through physical experiments performed in laboratory conditions. Our hypothesis is that examining how the uncertainty space changes throughout time might lead to superior diagnostics of structural damage as compared to only monitoring the damage indicator. This methodology is applied to a portal frame structure to assess if the strategy holds promise for robust SHM. 
(Publication approved for unlimited, public release on October-28-2015, LA-UR-15-28442, unclassified.)
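The info-gap robustness used above, the largest horizon of uncertainty h for which the worst-case damage indicator still satisfies its requirement, can be sketched with a toy indicator model and a bisection search. All coefficients below are hypothetical:

```python
def worst_case_indicator(h, nominal=1.0):
    """Worst value of a toy damage indicator when two bounded model
    uncertainties range over the info-gap set |u1| <= h, |u2| <= h:
    max of nominal + 0.8*u1 + 0.5*u2**2."""
    return nominal + 0.8 * h + 0.5 * h ** 2

def robustness(critical, lo=0.0, hi=10.0, tol=1e-6):
    """Info-gap robustness: the largest h whose worst case still meets
    the performance requirement, found by bisection (the worst case
    grows monotonically with h)."""
    if worst_case_indicator(hi) <= critical:
        return hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if worst_case_indicator(mid) <= critical:
            lo = mid
        else:
            hi = mid
    return lo

print(round(robustness(critical=2.0), 4))   # -> 0.8248
```

A larger robustness value means the damage indicator tolerates more un-modeled environmental variability before a false alarm; monitoring how this margin evolves over time is the diagnostic idea proposed in the paper.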

  13. Robustness analysis of non-ordinary Petri nets for flexible assembly/disassembly processes based on structural decomposition

    NASA Astrophysics Data System (ADS)

    Hsieh, Fu-Shiung

    2011-03-01

    Design of robust supervisory controllers for manufacturing systems with unreliable resources has received significant attention recently. Robustness analysis provides an alternative way to analyse a perturbed system to quickly respond to resource failures. Although we have analysed the robustness properties of several subclasses of ordinary Petri nets (PNs), analysis for non-ordinary PNs has not been done. Non-ordinary PNs have weighted arcs and have the advantage of compactly modelling operations requiring multiple parts or resources. In this article, we consider a class of flexible assembly/disassembly manufacturing systems and propose a non-ordinary flexible assembly/disassembly Petri net (NFADPN) model for this class of systems. As the class of flexible assembly/disassembly manufacturing systems can be regarded as the integration and interactions of a set of assembly/disassembly subprocesses, a bottom-up approach is adopted in this article to construct the NFADPN models. Due to the routing flexibility in NFADPN, there may exist different ways to accomplish the tasks. To characterise these, we propose the concept of completely connected subprocesses. As long as there exists a set of completely connected subprocesses for a certain type of product, the production of that type of product can still be maintained without requiring the whole NFADPN to be live. To take advantage of the alternative routes without enforcing liveness for the whole system, we generalise the previously proposed concept of persistent production to NFADPN. We propose a condition for persistent production based on the concept of completely connected subprocesses. We extend robustness analysis to NFADPN by exploiting its structure. We identify several patterns of resource failures and characterise the conditions to maintain operation in the presence of resource failures.

  14. A Novel Lung Disease Phenotype Adjusted for Mortality Attrition for Cystic Fibrosis Genetic Modifier Studies

    PubMed Central

    Taylor, Chelsea; Commander, Clayton W.; Collaco, Joseph M.; Strug, Lisa J.; Li, Weili; Wright, Fred A.; Webel, Aaron D.; Pace, Rhonda G.; Stonebraker, Jaclyn R.; Naughton, Kathleen; Dorfman, Ruslan; Sandford, Andrew; Blackman, Scott M.; Berthiaume, Yves; Paré, Peter; Drumm, Mitchell L.; Zielenski, Julian; Durie, Peter; Cutting, Garry R.; Knowles, Michael R.; Corey, Mary

    2011-01-01

    Genetic studies of lung disease in Cystic Fibrosis are hampered by the lack of a severity measure that accounts for chronic disease progression and mortality attrition. Further, combining analyses across studies requires common phenotypes that are robust to study design and patient ascertainment. Using data from the North American Cystic Fibrosis Modifier Consortium (Canadian Consortium for CF Genetic Studies, Johns Hopkins University CF Twin and Sibling Study, and University of North Carolina/Case Western Reserve University Gene Modifier Study), the authors calculated age-specific CF percentile values of FEV1 which were adjusted for CF age-specific mortality data. The phenotype was computed for 2061 patients representing the Canadian CF population, 1137 extreme phenotype patients in the UNC/Case Western study, and 1323 patients from multiple CF sib families in the CF Twin and Sibling Study. Despite differences in ascertainment and median age, our phenotype score was distributed in all three samples in a manner consistent with ascertainment differences, reflecting the lung disease severity of each individual in the underlying population. The new phenotype score was highly correlated with the previously recommended complex phenotype, but the new phenotype is more robust for shorter follow-up and for extreme ages. A disease progression and mortality adjusted phenotype reduces the need for stratification or additional covariates, increasing statistical power and avoiding possible distortions. This approach will facilitate large-scale genetic and environmental epidemiological studies which will provide targeted therapeutic pathways for the clinical benefit of patients with CF. PMID:21462361

  15. Human embryonic stem cells and good manufacturing practice: Report of a 1- day workshop held at Stem Cell Biology Research Center, Yazd, 27th April 2017.

    PubMed

    Akyash, Fatemeh; Sadeghian-Nodoushan, Fatemeh; Tahajjodi, Somayyeh Sadat; Nikukar, Habib; Farashahi Yazd, Ehsan; Azimzadeh, Mostafa; D Moore, Harry; Aflatoonian, Behrouz

    2017-05-01

    This report briefly summarizes a 1-day workshop entitled "Human embryonic stem cells (hESCs) and good manufacturing practice (GMP)," held by the Stem Cell Biology Research Center, based in Yazd Reproductive Sciences Institute at Shahid Sadoughi University of Medical Sciences, Yazd, Iran, on 27th April 2017. In this workshop, in addition to the practical sessions, Prof. Harry D. Moore from the Centre for Stem Cell Biology, University of Sheffield, UK presented the challenges and the importance of the biotechnology of clinical-grade human embryonic stem cells, from first derivation to robust defined culture for therapeutic applications.

  16. Improvement of a Chemical Storage Room Ventilation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yousif, Emad; Al-Dahhan, Wedad; Abed, Rashed Nema

    Scientists at universities across Iraq are actively working to report actual incidents and accidents occurring in their laboratories, as well as structural improvements made to improve safety and security, to raise awareness and encourage openness, leading to widespread adoption of robust Chemical Safety and Security (CSS) practices. This manuscript is the third in a series of five case studies describing laboratory incidents, accidents, and laboratory improvements. We summarize an improvement to the chemical storage room ventilation system at Al-Nahrain University to create and maintain a safe working atmosphere in an area where chemicals are stored and handled, using US and European design practices, standards, and regulations.

  17. History and Current Status of Cardiovascular Surgery at the University of Pennsylvania.

    PubMed

    Acker, Michael A; Bavaria, Joseph E; Barker, Clyde F

    2015-01-01

    The cardiothoracic surgery program at the University of Pennsylvania has enjoyed a decades long tradition of leadership and contributions to the field. Consistent with its place as a robust contributor in a major academic medical center, its focus is on the tripartite mission of clinical care, research and education, including the provision of cutting edge care delivered to patients in a multidisciplinary fashion. Faculty members' pursuit of translational research facilitates the delivery of such exceptional treatment and provision of excellent care. This foundation is ideal for the training of the outstanding surgeons of tomorrow, as evidenced by a history of such contributions. Copyright © 2015. Published by Elsevier Inc.

  18. On the universality of power laws for tokamak plasma predictions

    NASA Astrophysics Data System (ADS)

    Garcia, J.; Cambon, D.; Contributors, JET

    2018-02-01

    Significant deviations from well-established power laws for the thermal energy confinement time, such as IPB98(y,2), obtained from extensive database analysis, have recently been reported in dedicated power scans. In order to examine the adequacy, validity and universality of power laws as tools for predicting plasma performance, a simplified analysis has been carried out in the framework of a minimal model of heat transport which is nevertheless able to account for the interplay between turbulence and collinear effects of the input power, known to play a role in experiments showing significant deviations from such power laws. Whereas at low power the usual scaling laws are recovered with little influence of other plasma parameters, resulting in a robust power-law exponent, at high power the exponents obtained are extremely sensitive to the heating deposition, the q-profile, or even the sampling or number of points considered, owing to the highly non-linear behavior of the heat transport. In particular circumstances, even a minimum of the thermal energy confinement time with input power can be obtained, which means that approximating the energy confinement time as a power law may be intrinsically invalid. Therefore, plasma predictions based on a power-law approximation with a constant exponent, obtained from a regression over a broad range of powers and other plasma parameters that can non-linearly affect and suppress heat transport, can be misleading; this approach should be used cautiously and its results continuously compared with modeling that can properly capture the underlying physics, such as gyrokinetic simulations.
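The core point can be sketched numerically: a log-log regression recovers a true power-law exponent cleanly, but once a mild non-power-law (collinear) effect is present, the fitted "exponent" depends on which power window is sampled. All numbers below are hypothetical:

```python
import numpy as np

# Case 1: a genuine power law tau_E = C * P**alpha is linear in
# log-log space, so the exponent comes from a linear regression.
rng = np.random.default_rng(7)
P = np.linspace(1.0, 20.0, 40)                        # heating power (a.u.)
tau = 0.9 * P ** -0.7 * np.exp(0.02 * rng.standard_normal(P.size))
alpha, logC = np.polyfit(np.log(P), np.log(tau), 1)
print(alpha)                       # close to the true exponent -0.7

# Case 2: a mild collinear effect breaks the power-law form, and the
# fitted exponent now drifts with the sampled power window.
tau_nl = 0.9 * P ** -0.7 * (1.0 + 0.05 * P)
lo, hi = P <= 5.0, P >= 10.0
a_lo = np.polyfit(np.log(P[lo]), np.log(tau_nl[lo]), 1)[0]
a_hi = np.polyfit(np.log(P[hi]), np.log(tau_nl[hi]), 1)[0]
print(a_lo, a_hi)                  # window-dependent "exponents"
```

The window dependence in the second case is a toy version of the paper's finding that high-power scans can yield exponents very different from the database regressions.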

  19. Diagnostics of Robust Growth Curve Modeling Using Student's "t" Distribution

    ERIC Educational Resources Information Center

    Tong, Xin; Zhang, Zhiyong

    2012-01-01

    Growth curve models with different types of distributions of random effects and of intraindividual measurement errors for robust analysis are compared. After demonstrating the influence of distribution specification on parameter estimation, 3 methods for diagnosing the distributions for both random effects and intraindividual measurement errors…

  20. Robust MOE Detector for DS-CDMA Systems with Signature Waveform Mismatch

    NASA Astrophysics Data System (ADS)

    Lin, Tsui-Tsai

In this letter, a decision-directed MOE detector with excellent robustness against signature waveform mismatch is proposed for DS-CDMA systems. Both the theoretical analysis and computer simulation results demonstrate that the proposed detector provides better SINR performance than conventional detectors.

  1. Robust Feature Selection Technique using Rank Aggregation.

    PubMed

    Sarkar, Chandrima; Cooley, Sarah; Srivastava, Jaideep

    2014-01-01

Although feature selection is a well-developed research area, there is an ongoing need for methods that make classifiers more efficient. One important challenge is the lack of a universal feature selection technique that produces similar outcomes with all types of classifiers. This is because each feature selection technique has its own statistical biases, while classifiers exploit different statistical properties of the data for evaluation. In numerous situations this can leave researchers in a dilemma as to which feature selection method and which classifier to choose from a vast range of options. In this paper, we propose a technique that aggregates the consensus properties of various feature selection methods to develop a better solution. The ensemble nature of our technique makes it more robust across various classifiers; in other words, it is stable towards achieving similar, and ideally higher, classification accuracy across a wide variety of classifiers. We quantify this concept of robustness with a measure known as the Robustness Index (RI). We perform an extensive empirical evaluation of our technique on eight data sets of different dimensions, including Arrhythmia, Lung Cancer, Madelon, mfeat-fourier, internet-ads, Leukemia-3c, and Embryonal Tumor, as well as a real-world data set, Acute Myeloid Leukemia (AML). We demonstrate not only that our algorithm is more robust, but also that, compared to other techniques, it improves classification accuracy by approximately 3-4% (on data sets with fewer than 500 features) and by more than 5% (on data sets with more than 500 features), across a wide range of classifiers.
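The abstract does not spell out the aggregation rule, but one standard way to combine the rankings produced by several feature selection methods into a consensus ranking is a Borda count, sketched below (the feature names and the choice of selectors are hypothetical):

```python
from collections import defaultdict

def borda_aggregate(rankings):
    """Aggregate several feature rankings (best-first lists) by Borda count.
    Each feature scores (n_features - position) per ranking; higher total wins.
    Ties are broken alphabetically for determinism."""
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for pos, feat in enumerate(ranking):
            scores[feat] += n - pos
    return sorted(scores, key=lambda f: (-scores[f], f))

# Best-first rankings from three hypothetical selectors
# (e.g. chi-squared, information gain, ReliefF):
rankings = [
    ["f3", "f1", "f2", "f4"],
    ["f1", "f3", "f4", "f2"],
    ["f3", "f1", "f4", "f2"],
]
print(borda_aggregate(rankings))   # -> ['f3', 'f1', 'f4', 'f2']
```

Because the consensus ranking averages out the biases of any single selector, a classifier trained on its top-k features tends to behave more uniformly across classifier families, which is what the Robustness Index is meant to capture.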

  2. Designing Phononic Crystals with Wide and Robust Band Gaps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, Zian; Chen, Yanyu; Yang, Haoxiang

Here, phononic crystals (PnCs) engineered to manipulate and control the propagation of mechanical waves have enabled the design of a range of novel devices, such as waveguides, frequency modulators, and acoustic cloaks, for which wide and robust phononic band gaps are highly preferable. While numerous PnCs have been designed in recent decades, to the best of our knowledge, PnCs that possess simultaneously wide and robust band gaps (to randomness and deformations) have not yet been reported. Here, we demonstrate that by combining the band-gap formation mechanisms of Bragg scattering and local resonances (the latter being dominant), PnCs with wide and robust phononic band gaps can be established. The robustness of the phononic band gaps is then discussed from two aspects: robustness to geometric randomness (manufacturing defects) and robustness to deformations (mechanical stimuli). Analytical formulations further predict the optimal design parameters, and an uncertainty analysis quantifies the effect of randomness in each design parameter. Moreover, we show that the deformation robustness originates from a local resonance-dominant mechanism together with the suppression of structural instability. Importantly, the proposed PnCs require only a small number of layers of elements (three unit cells) to obtain broad, robust, and strong attenuation bands, which offers great potential for designing flexible and deformable phononic devices.

  3. Designing Phononic Crystals with Wide and Robust Band Gaps

    DOE PAGES

    Jia, Zian; Chen, Yanyu; Yang, Haoxiang; ...

    2018-04-16

Here, phononic crystals (PnCs) engineered to manipulate and control the propagation of mechanical waves have enabled the design of a range of novel devices, such as waveguides, frequency modulators, and acoustic cloaks, for which wide and robust phononic band gaps are highly preferable. While numerous PnCs have been designed in recent decades, to the best of our knowledge, PnCs that possess simultaneously wide and robust band gaps (to randomness and deformations) have not yet been reported. Here, we demonstrate that by combining the band-gap formation mechanisms of Bragg scattering and local resonances (the latter being dominant), PnCs with wide and robust phononic band gaps can be established. The robustness of the phononic band gaps is then discussed from two aspects: robustness to geometric randomness (manufacturing defects) and robustness to deformations (mechanical stimuli). Analytical formulations further predict the optimal design parameters, and an uncertainty analysis quantifies the effect of randomness in each design parameter. Moreover, we show that the deformation robustness originates from a local resonance-dominant mechanism together with the suppression of structural instability. Importantly, the proposed PnCs require only a small number of layers of elements (three unit cells) to obtain broad, robust, and strong attenuation bands, which offers great potential for designing flexible and deformable phononic devices.

  4. Designing Phononic Crystals with Wide and Robust Band Gaps

    NASA Astrophysics Data System (ADS)

    Jia, Zian; Chen, Yanyu; Yang, Haoxiang; Wang, Lifeng

    2018-04-01

Phononic crystals (PnCs) engineered to manipulate and control the propagation of mechanical waves have enabled the design of a range of novel devices, such as waveguides, frequency modulators, and acoustic cloaks, for which wide and robust phononic band gaps are highly preferable. While numerous PnCs have been designed in recent decades, to the best of our knowledge, PnCs that possess simultaneously wide and robust band gaps (to randomness and deformations) have not yet been reported. Here, we demonstrate that by combining the band-gap formation mechanisms of Bragg scattering and local resonances (the latter being dominant), PnCs with wide and robust phononic band gaps can be established. The robustness of the phononic band gaps is then discussed from two aspects: robustness to geometric randomness (manufacturing defects) and robustness to deformations (mechanical stimuli). Analytical formulations further predict the optimal design parameters, and an uncertainty analysis quantifies the effect of randomness in each design parameter. Moreover, we show that the deformation robustness originates from a local resonance-dominant mechanism together with the suppression of structural instability. Importantly, the proposed PnCs require only a small number of layers of elements (three unit cells) to obtain broad, robust, and strong attenuation bands, which offers great potential for designing flexible and deformable phononic devices.

  5. The comparison of robust partial least squares regression with robust principal component regression on a real

    NASA Astrophysics Data System (ADS)

    Polat, Esra; Gunay, Suleyman

    2013-10-01

One of the problems encountered in Multiple Linear Regression (MLR) is multicollinearity, which causes overestimation of the regression parameters and increases their variance. Hence, when multicollinearity is present, biased estimation procedures such as classical Principal Component Regression (CPCR) and Partial Least Squares Regression (PLSR) are performed. The SIMPLS algorithm is the leading PLSR algorithm because of its speed and efficiency, and because its results are easier to interpret. However, both CPCR and SIMPLS yield very unreliable results when the data set contains outlying observations. Therefore, Hubert and Vanden Branden (2003) presented a robust PCR (RPCR) method and a robust PLSR (RPLSR) method called RSIMPLS. In RPCR, a robust Principal Component Analysis (PCA) method for high-dimensional data is first applied to the independent variables; the dependent variables are then regressed on the scores using a robust regression method. RSIMPLS is constructed from a robust covariance matrix for high-dimensional data and robust linear regression. The purpose of this study is to demonstrate the use of the RPCR and RSIMPLS methods on an econometric data set, comparing the two methods on an inflation model of Turkey. The methods are compared in terms of predictive ability and goodness of fit using a robust Root Mean Squared Error of Cross-validation (R-RMSECV), a robust R2 value, and the Robust Component Selection (RCS) statistic.
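As a rough illustration of why a robust error criterion such as R-RMSECV is preferred when outliers are present, compare a classical RMSE with a simple trimmed variant on cross-validation residuals containing one gross outlier. The trimming rule below is illustrative only, not the paper's exact R-RMSECV definition:

```python
import math

def rmse(residuals):
    """Classical root mean squared error: dominated by large residuals."""
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

def trimmed_rmse(residuals, trim=0.1):
    """A simple robust alternative: drop the largest `trim` fraction of
    |residuals| before computing the RMSE, so gross outliers cannot
    dominate the criterion."""
    r = sorted(abs(x) for x in residuals)
    keep = r[: max(1, int(len(r) * (1 - trim)))]
    return math.sqrt(sum(x * x for x in keep) / len(keep))

residuals = [0.2, -0.1, 0.3, -0.2, 0.1, 8.0]   # one gross outlier
print(rmse(residuals))          # inflated by the single outlier
print(trimmed_rmse(residuals))  # reflects the fit quality of the bulk
```

A model-selection criterion built on the classical RMSE would reject a model purely because of the outlying observation, whereas the robust variant judges the model on the well-fitted majority of the data.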

  6. Characterizing uncertain sea-level rise projections to support investment decisions.

    PubMed

    Sriver, Ryan L; Lempert, Robert J; Wikman-Svahn, Per; Keller, Klaus

    2018-01-01

Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low-probability but large-impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. Results highlight the highly localized and context-dependent nature of applying Robust Decision Making methods to inform investment decisions.
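The scenario-based cost-benefit test at the heart of a Robust Decision Making analysis can be sketched as a scan over deeply uncertain futures. Every number below (investment cost, damage, flood probability model, horizon) is hypothetical and chosen only to show the mechanics:

```python
def passes_cost_benefit(slr_m, surge_factor,
                        invest_cost=100.0, damage_if_flood=400.0, horizon_years=50):
    """Does hardening pay off in this future? True when the expected avoided
    flood damage over the planning horizon exceeds the investment cost.
    All constants are hypothetical placeholders."""
    # Invented annual flood probability, rising with sea level and surge severity.
    annual_p = min(1.0, 0.002 * (1.0 + 10.0 * slr_m) * surge_factor)
    expected_damage = annual_p * damage_if_flood * horizon_years
    return expected_damage > invest_cost

# Scan a grid of deeply uncertain futures and record where investment passes.
passing = [(slr, sf)
           for slr in (0.0, 0.5, 1.0, 1.5)        # sea-level rise (m)
           for sf in (1.0, 2.0, 4.0)              # storm-surge severity factor
           if passes_cost_benefit(slr, sf)]
print(passing)   # the scenario region where near-term hardening is justified
```

The output of such a scan is not a recommendation by itself; as the abstract describes, decision makers then judge whether the passing scenarios are sufficiently likely, given sea-level rise projections of differing confidence, to justify the investment.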

  7. Characterizing uncertain sea-level rise projections to support investment decisions

    PubMed Central

    Lempert, Robert J.; Wikman-Svahn, Per; Keller, Klaus

    2018-01-01

Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low-probability but large-impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. Results highlight the highly localized and context-dependent nature of applying Robust Decision Making methods to inform investment decisions. PMID:29414978

  8. A Fiber Optic Ammonia Sensor Using a Universal pH Indicator

    PubMed Central

    Rodríguez, Adolfo J.; Zamarreño, Carlos R.; Matías, Ignacio R.; Arregui, Francisco. J.; Domínguez Cruz, Rene F.; May-Arrioja, Daniel. A.

    2014-01-01

A universal pH indicator is used to fabricate a fiber optic ammonia sensor. The advantage of this pH indicator is that it exhibits sensitivity to ammonia over a broad wavelength range. This provides a differential response, with a valley around 500 nm and a peak around 650 nm, which allows us to perform ratiometric measurements. The ratiometric measurements provide not only an enhanced signal, but can also eliminate any external disturbance due to humidity or temperature fluctuations. In addition, the indicator is embedded in a hydrophobic, gas-permeable polyurethane film named Tecoflex®. The film provides additional advantages to the sensor, such as operation in dry environments, efficient transport of the element to be measured to the sensitive area of the sensor, and prevention of leakage or detachment of the indicator. The combination of the universal pH indicator and Tecoflex® film provides a reliable and robust fiber optic ammonia sensor. PMID:24583969

  9. Universal Scaling of Robust Thermal Hot Spot and Ionic Current Enhancement by Focused Ohmic Heating in a Conic Nanopore

    PubMed Central

    Pan, Zehao; Wang, Ceming; Li, Meng; Chang, Hsueh-Chia

    2017-01-01

    A stable nanoscale thermal hot spot, with temperature approaching 100 °C, is shown to be sustained by localized Ohmic heating of a focused electric field at the tip of a slender conic nanopore. The self-similar (length-independent) conic geometry allows us to match the singular heat source at the tip to the singular radial heat loss from the slender cone to obtain a self-similar steady temperature profile along the cone and the resulting ionic current conductance enhancement due to viscosity reduction. The universal scaling, which depends only on a single dimensionless parameter Z, collapses the measured conductance data and computed temperature profiles in ion-track conic nanopores and conic nanopipettes. The collapsed numerical data reveal universal values for the hot-spot location and temperature in an aqueous electrolyte. PMID:27715110

  10. THE EFFECTS OF ANGULAR MOMENTUM ON HALO PROFILES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lentz, Erik W; Rosenberg, Leslie J; Quinn, Thomas R, E-mail: lentze@phys.washington.edu, E-mail: ljrosenberg@phys.washington.edu, E-mail: trq@astro.washington.edu

    2016-05-10

The near universality of DM halo density profiles provided by N-body simulations proved to be robust against changes in total mass density, power spectrum, and some forms of initial velocity dispersion. Here we study the effects of coherently spinning up an isolated DM-only progenitor on halo structure. Halos with spins within several standard deviations of the simulated mean (λ ≲ 0.20) produce profiles with negligible deviations from the universal form. Only when the spin becomes quite large (λ ≳ 0.20) do departures become evident. The angular momentum distribution also exhibits a near universal form, which is also independent of halo spin up to λ ≲ 0.20. A correlation between these epidemic profiles and the presence of a strong bar in the virialized halo is also observed. These bar structures bear resemblance to the radial orbit instability in the rotationless limit.

  11. Mapping two-dimensional polar active fluids to two-dimensional soap and one-dimensional sandblasting

    NASA Astrophysics Data System (ADS)

    Chen, Leiming; Lee, Chiu Fan; Toner, John

    2016-07-01

    Active fluids and growing interfaces are two well-studied but very different non-equilibrium systems. Each exhibits non-equilibrium behaviour distinct from that of their equilibrium counterparts. Here we demonstrate a surprising connection between these two: the ordered phase of incompressible polar active fluids in two spatial dimensions without momentum conservation, and growing one-dimensional interfaces (that is, the 1+1-dimensional Kardar-Parisi-Zhang equation), in fact belong to the same universality class. This universality class also includes two equilibrium systems: two-dimensional smectic liquid crystals, and a peculiar kind of constrained two-dimensional ferromagnet. We use these connections to show that two-dimensional incompressible flocks are robust against fluctuations, and exhibit universal long-ranged, anisotropic spatio-temporal correlations of those fluctuations. We also thereby determine the exact values of the anisotropy exponent ζ and the roughness exponents χx,y that characterize these correlations.

  12. Cosmological moduli and the post-inflationary universe: A critical review

    NASA Astrophysics Data System (ADS)

    Kane, Gordon; Sinha, Kuver; Watson, Scott

    2015-06-01

    We critically review the role of cosmological moduli in determining the post-inflationary history of the universe. Moduli are ubiquitous in string and M-theory constructions of beyond the Standard Model physics, where they parametrize the geometry of the compactification manifold. For those with masses determined by supersymmetry (SUSY) breaking this leads to their eventual decay slightly before Big Bang nucleosynthesis (BBN) (without spoiling its predictions). This results in a matter dominated phase shortly after inflation ends, which can influence baryon and dark matter genesis, as well as observations of the cosmic microwave background (CMB) and the growth of large-scale structure. Given progress within fundamental theory, and guidance from dark matter and collider experiments, nonthermal histories have emerged as a robust and theoretically well-motivated alternative to a strictly thermal one. We review this approach to the early universe and discuss both the theoretical challenges and the observational implications.

  13. Mapping two-dimensional polar active fluids to two-dimensional soap and one-dimensional sandblasting.

    PubMed

    Chen, Leiming; Lee, Chiu Fan; Toner, John

    2016-07-25

    Active fluids and growing interfaces are two well-studied but very different non-equilibrium systems. Each exhibits non-equilibrium behaviour distinct from that of their equilibrium counterparts. Here we demonstrate a surprising connection between these two: the ordered phase of incompressible polar active fluids in two spatial dimensions without momentum conservation, and growing one-dimensional interfaces (that is, the 1+1-dimensional Kardar-Parisi-Zhang equation), in fact belong to the same universality class. This universality class also includes two equilibrium systems: two-dimensional smectic liquid crystals, and a peculiar kind of constrained two-dimensional ferromagnet. We use these connections to show that two-dimensional incompressible flocks are robust against fluctuations, and exhibit universal long-ranged, anisotropic spatio-temporal correlations of those fluctuations. We also thereby determine the exact values of the anisotropy exponent ζ and the roughness exponents χx,y that characterize these correlations.

  14. The prevalence of binge eating disorder and its relationship to work and classroom productivity and activity impairment.

    PubMed

    Filipova, Anna A; Stoffel, Cheri L

    2016-07-01

    The study aimed to determine the prevalence of binge eating disorder on university campus, its associations with health risk factors, and its associations with work and classroom productivity and activity impairment, adjusted for health risk factors. The study was conducted at a public midwestern university in the United States and involved 1,165 students. Data were collected online, using preestablished instruments. Descriptive, chi-square, correlation, and robust multiple regression tests were used. About 7.8% of the participants were assessed as having binge eating disorder. Binge eating disorder was more common among obese students than nonobese students. Associations were found between moderate binge eating disorder and classroom productivity and daily activity impairment; however, sleep duration and physical activity were the strongest predictors. University students are at risk of binge eating disorder. Interventions with this population should include education, screening, and clinical consultation when warranted.

  15. Risk, Robustness and Water Resources Planning Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Borgomeo, Edoardo; Mortazavi-Naeini, Mohammad; Hall, Jim W.; Guillod, Benoit P.

    2018-03-01

Risk-based water resources planning is based on the premise that water managers should invest up to the point where the marginal benefit of risk reduction equals the marginal cost of achieving that benefit. However, this cost-benefit approach may not guarantee robustness under uncertain future conditions, for instance under climatic changes. In this paper, we expand risk-based decision analysis to explore possible ways of enhancing robustness in engineered water resources systems under different risk attitudes. Risk is measured as the expected annual cost of water use restrictions, while robustness is interpreted in the decision-theoretic sense as the ability of a water resource system to maintain performance—expressed as a tolerable risk of water use restrictions—under a wide range of possible future conditions. Linking risk attitudes with robustness allows stakeholders to explicitly trade off incremental increases in robustness against investment costs for a given level of risk. We illustrate the framework through a case study of London's water supply system using state-of-the-art regional climate simulations to inform the estimation of risk and robustness.
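The decision-theoretic notion of robustness used in the abstract, keeping the expected annual cost of restrictions within a tolerable level across many plausible futures, can be expressed as a simple fraction of scenarios that pass the tolerance. The figures below are illustrative only:

```python
def expected_annual_cost(restriction_prob, cost_per_restriction):
    """Expected annual cost of water use restrictions in one future."""
    return restriction_prob * cost_per_restriction

def robustness(option_risks, tolerable_risk):
    """Fraction of plausible futures in which an option keeps the expected
    annual restriction cost within the tolerable level."""
    ok = sum(1 for r in option_risks if r <= tolerable_risk)
    return ok / len(option_risks)

# Expected annual restriction costs of one supply option under eight
# (hypothetical) climate futures, with a tolerable risk of 4.5 cost units.
risks = [expected_annual_cost(p, 50.0)
         for p in (0.01, 0.02, 0.02, 0.05, 0.08, 0.10, 0.15, 0.30)]
print(robustness(risks, tolerable_risk=4.5))   # -> 0.625
```

Tightening the tolerable risk (a more risk-averse attitude) lowers this fraction, which is exactly the robustness-versus-investment-cost trade-off the paper exposes to stakeholders.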

  16. Robust Polymer Films: Nanoscale Stiffening as a Route to Strong Materials

    DTIC Science & Technology

    2011-10-20

    Rheological Methods," Drexel University, Philadelphia, PA, March 4, 2011. S.Xu, "Geometry and molecular architecture effects in nanobubble inflation...2007. G.B. McKenna, "The viscoelastic properties of ultrathin polymer films as measured with a novel nanobubble inflation technique.” March Meeting of...mechanical response of ultrathin polymer films using the Texas Tech nanobubble inflation technique as the means to determine the viscoelastic

  17. Medical Ultrasound Technology Research and Development at the University of Washington Center for Industrial and Medical Ultrasound

    DTIC Science & Technology

    2003-10-02

    provide a world-class, advanced research center for bioengineering development and graduate education in high-intensity, focused ultrasound ( HIFU ). This...convenient, and robust. These technological enhancements have enabled the development of HIFU arrays and image-guided ultrasound systems for greater... Ultrasound (CIMU). The many disparate facilities and technical capabilities available to CIMU staff and students were integrated and enhanced to

  18. Design for a Manufacturing Method for Memristor-Based Neuromorphic Computing Processors

    DTIC Science & Technology

    2013-03-01

    DESIGN FOR A MANUFACTURING METHOD FOR MEMRISTOR- BASED NEUROMORPHIC COMPUTING PROCESSORS UNIVERSITY OF PITTSBURGH MARCH 2013...BASED NEUROMORPHIC COMPUTING PROCESSORS 5a. CONTRACT NUMBER FA8750-11-1-0271 5b. GRANT NUMBER N/A 5c. PROGRAM ELEMENT NUMBER 62788F 6. AUTHOR(S...synapses and implemented a neuromorphic computing system based on our proposed synapse designs. The robustness of our system is also evaluated by

  19. On chirality transfer in electron donor-acceptor complexes. A prediction for the sulfinimine···BF3 system.

    PubMed

    Rode, Joanna E; Dobrowolski, Jan Cz

    2012-01-01

Stabilization energies of the electron donor-acceptor sulfinimine···BF(3) complexes calculated at either the B3LYP/aug-cc-pVTZ or the MP2/aug-cc-pVTZ level do not allow one to judge whether the N- or O-atom in sulfinimine is the stronger electron donor to BF(3). The problem seems solvable because the chirality transfer phenomenon between chiral sulfinimine and achiral BF(3) is expected to be vibrational circular dichroism (VCD) active. Moreover, the bands associated with the achiral BF(3) molecule are predicted to be the most intense in the entire spectrum. However, the VCD band robustness analyses show that most of the chirality transfer modes of BF(3) are unreliable. Conversely, the variation of VCD intensity with changes in the intermolecular distance, angle, and selected dihedrals between the complex partners can be used to establish the robustness of a chirality transfer mode. It is also necessary to determine the influence of the potential energy surface (PES) shape on the VCD intensity. At the moment, there is still no universal criterion for chirality transfer mode robustness, and conclusions formulated for one system cannot be directly transferred even to a quite similar one. However, it is certain that more attention should be focused on the relation between the PES shape and the VCD mode robustness problem. Copyright © 2011 Wiley Periodicals, Inc.

  20. Flow Cytometry: Evolution of Microbiological Methods for Probiotics Enumeration.

    PubMed

    Pane, Marco; Allesina, Serena; Amoruso, Angela; Nicola, Stefania; Deidda, Francesca; Mogna, Luca

    2018-05-14

The purpose of this trial was to verify that the analytical method ISO 19344:2015 (E)-IDF 232:2015 (E) is valid and reliable for quantifying the concentration of the probiotic Lactobacillus rhamnosus GG (ATCC 53103) in a finished product formulation. Flow cytometry assay is emerging as an alternative rapid method for microbial detection, enumeration, and population profiling. The use of flow cytometry not only permits the determination of viable cell counts but also allows for enumeration of damaged and dead cell subpopulations. Results are expressed as TFU (Total Fluorescent Units) and AFU (Active Fluorescent Units). In December 2015, the International Standard ISO 19344-IDF 232 "Milk and milk products-Starter cultures, probiotics and fermented products-Quantification of lactic acid bacteria by flow cytometry" was published. This ISO standard can be applied universally, regardless of the species of interest. Analytical method validation was conducted on 3 different industrial batches of L. rhamnosus GG according to USP39<1225>/ICH Q2R1 in terms of accuracy, precision (repeatability), intermediate precision (ruggedness), specificity, limit of quantification, linearity, range, and robustness. The data obtained on the 3 batches of finished product have demonstrated the validity and robustness of the cytofluorimetric analysis. On the basis of the results obtained, the ISO 19344:2015 (E)-IDF 232:2015 (E) "Quantification of lactic acid bacteria by flow cytometry" can be used for the enumeration of L. rhamnosus GG in a finished product formulation.

  1. Exploiting virtual sediment deposits to explore conceptual foundations

    NASA Astrophysics Data System (ADS)

    Dietze, Michael; Fuchs, Margret; Kreutzer, Sebastian

    2017-04-01

Geomorphic concepts and hypotheses are usually formulated from empirical data gathered in the field or the laboratory (deduction). After translation into models they can be applied to case study scenarios (induction). However, the other way around - expressing hypotheses explicitly as models and testing these against empirical data - is a path rarely taken. There are several models tailored to investigate the boundary conditions and processes that generate, mobilise, route and eventually deposit sediment in a landscape; the last part, sediment deposition, is usually omitted, however. Essentially, there is no model that explicitly focuses on mapping out the characteristics of sedimentary deposits - the material that is used by many disciplines to reconstruct landscape evolution. This contribution introduces the R-package sandbox, a model framework that allows creating and analysing virtual sediment sections for exploratory, explanatory, forecasting and inverse research questions. The R-package sandbox is a probabilistic, rule-based model framework suitable for a wide range of applications. The model framework is used here to discuss a set of conceptual questions revolving around geochemical and geochronological methods, such as: How do sample size and sample volume affect age uncertainty? What determines the robustness of sediment fingerprinting results? How does the prepared grain size of the material of interest affect the analysis outcomes? Most of the concepts used in the geosciences are underpinned by a set of assumptions, whose robustness and boundary conditions need to be assessed quantitatively. The R-package sandbox is a universal and flexible tool to engage with this challenge.

  2. Chymase Level Is a Predictive Biomarker of Dengue Hemorrhagic Fever in Pediatric and Adult Patients.

    PubMed

    Tissera, Hasitha; Rathore, Abhay P S; Leong, Wei Yee; Pike, Brian L; Warkentien, Tyler E; Farouk, Farouk S; Syenina, Ayesa; Eong Ooi, Eng; Gubler, Duane J; Wilder-Smith, Annelies; St John, Ashley L

    2017-11-27

Most patients with dengue experience mild disease, dengue fever (DF), while a few develop the life-threatening diseases dengue hemorrhagic fever (DHF) or dengue shock syndrome (DSS). No laboratory tests predict DHF or DSS. We evaluated whether the serum chymase level can predict DHF or DSS in adult and pediatric patients, and the influence of preexisting conditions (PECs) on chymase levels. Serum chymase levels were measured in patients presenting with undifferentiated fever to hospitals in Colombo District, Sri Lanka. The value of the serum chymase concentration and of clinical signs and symptoms as predictors of DHF and/or DSS was evaluated by multivariate analysis. We assessed the influence of age, PECs, and day after fever onset on the robustness of the chymase level as a biomarker for DHF and/or DSS. An elevated chymase level in acute-phase blood samples was highly indicative of a later diagnosis of DHF or DSS for pediatric and adult patients with dengue. No recorded PECs prevented an increase in the chymase level during DHF. However, certain PECs (obesity and cardiac or lung-associated diseases) resulted in a concomitant increase in chymase levels among adult patients with DHF. These results show that patients with acute dengue who present with high levels of serum chymase are consistently at greater risk of DHF. The chymase level is a robust prognostic biomarker of severe dengue for adult and pediatric patients. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  3. Relationship of cranial robusticity to cranial form, geography and climate in Homo sapiens.

    PubMed

    Baab, Karen L; Freidline, Sarah E; Wang, Steven L; Hanson, Timothy

    2010-01-01

    Variation in cranial robusticity among modern human populations is widely acknowledged but not well-understood. While the use of "robust" cranial traits in hominin systematics and phylogeny suggests that these characters are strongly heritable, this hypothesis has not been tested. Alternatively, cranial robusticity may be a response to differences in diet/mastication or it may be an adaptation to cold, harsh environments. This study quantifies the distribution of cranial robusticity in 14 geographically widespread human populations, and correlates this variation with climatic variables, neutral genetic distances, cranial size, and cranial shape. With the exception of the occipital torus region, all traits were positively correlated with each other, suggesting that they should not be treated as individual characters. While males are more robust than females within each of the populations, among the independent variables (cranial shape, size, climate, and neutral genetic distances), only shape is significantly correlated with inter-population differences in robusticity. Two-block partial least-squares analysis was used to explore the relationship between cranial shape (captured by three-dimensional landmark data) and robusticity across individuals. Weak support was found for the hypothesis that robusticity was related to mastication as the shape associated with greater robusticity was similar to that described for groups that ate harder-to-process diets. Specifically, crania with more prognathic faces, expanded glabellar and occipital regions, and (slightly) longer skulls were more robust than those with rounder vaults and more orthognathic faces. However, groups with more mechanically demanding diets (hunter-gatherers) were not always more robust than groups practicing some form of agriculture.

  4. Using Robust Variance Estimation to Combine Multiple Regression Estimates with Meta-Analysis

    ERIC Educational Resources Information Center

    Williams, Ryan

    2013-01-01

    The purpose of this study was to explore the use of robust variance estimation for combining commonly specified multiple regression models and for combining sample-dependent focal slope estimates from diversely specified models. The proposed estimator obviates traditionally required information about the covariance structure of the dependent…

  5. The Robustness of LISREL Estimates in Structural Equation Models with Categorical Variables.

    ERIC Educational Resources Information Center

    Ethington, Corinna A.

    1987-01-01

    This study examined the effect of type of correlation matrix on the robustness of LISREL maximum likelihood and unweighted least squares structural parameter estimates for models with categorical variables. The analysis of mixed matrices produced estimates that closely approximated the model parameters except where dichotomous variables were…

  6. IMPLICATIONS OF USING ROBUST BAYESIAN ANALYSIS TO REPRESENT DIVERSE SOURCES OF UNCERTAINTY IN INTEGRATED ASSESSMENT

    EPA Science Inventory

    In our previous research, we showed that robust Bayesian methods can be used in environmental modeling to define a set of probability distributions for key parameters that captures the effects of expert disagreement, ambiguity, or ignorance. This entire set can then be update...

  7. Worst-Case Flutter Margins from F/A-18 Aircraft Aeroelastic Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty

    1997-01-01

    An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, μ, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing them. The μ margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 SRA using uncertainty sets generated by flight data analysis. The robust margins indicate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.

  8. On the robustness of a Bayes estimate. [in reliability theory

    NASA Technical Reports Server (NTRS)

    Canavos, G. C.

    1974-01-01

    This paper examines the robustness of a Bayes estimator with respect to the assigned prior distribution. A Bayesian analysis for a stochastic scale parameter of a Weibull failure model is summarized in which the natural conjugate is assigned as the prior distribution of the random parameter. The sensitivity analysis is carried out by the Monte Carlo method in which, although an inverted gamma is the assigned prior, realizations are generated using distribution functions of varying shape. For several distributional forms and even for some fixed values of the parameter, simulated mean squared errors of Bayes and minimum variance unbiased estimators are determined and compared. Results indicate that the Bayes estimator remains squared-error superior and appears to be largely robust to the form of the assigned prior distribution.
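    The sensitivity experiment in this abstract can be sketched in Python for the simplest special case: an exponential failure model (a Weibull with known shape 1) whose scale receives an inverse-gamma conjugate prior. The prior parameters and the misspecified generating distribution below are invented for illustration and are not taken from the paper:

```python
import random

def simulate_mse(n=10, trials=4000, seed=1):
    """Simulated MSE of the Bayes estimator (inverse-gamma prior) and the
    MVU estimator (sample mean) for an exponential scale theta, when the
    true theta is drawn from a *misspecified* generating distribution."""
    rng = random.Random(seed)
    a, b = 3.0, 4.0                            # assumed inverse-gamma prior
    se_bayes = se_mvu = 0.0
    for _ in range(trials):
        theta = rng.lognormvariate(0.7, 0.3)   # not the assigned prior
        s = sum(rng.expovariate(1.0 / theta) for _ in range(n))
        bayes = (b + s) / (a + n - 1)          # posterior mean under conjugacy
        mvu = s / n                            # sample mean (MVU estimator)
        se_bayes += (bayes - theta) ** 2
        se_mvu += (mvu - theta) ** 2
    return se_bayes / trials, se_mvu / trials

mse_bayes, mse_mvu = simulate_mse()
```

    Even though the scale realizations are generated from a lognormal rather than the assigned inverse-gamma prior, the Bayes estimator typically retains a lower simulated mean squared error than the sample mean, mirroring the robustness conclusion of the paper.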

  9. Position Accuracy Analysis of a Robust Vision-Based Navigation

    NASA Astrophysics Data System (ADS)

    Gaglione, S.; Del Pizzo, S.; Troisi, S.; Angrisano, A.

    2018-05-01

    Using images to determine camera position and attitude is a consolidated method, widespread in applications such as UAV navigation. In harsh environments, where GNSS can be degraded or denied, image-based positioning is a possible candidate for an integrated or alternative system. In this paper, such a method is investigated using a system based on a single camera and 3D maps. A robust estimation method is proposed to limit the effect of blunders or noisy measurements on the position solution. The proposed approach is tested using images collected in an urban canyon, where GNSS positioning is very inaccurate. A photogrammetric survey was previously performed to build the 3D model of the tested area. The position accuracy is analysed and the effect of the proposed robust method is validated.

  10. Reciprocity Between Robustness of Period and Plasticity of Phase in Biological Clocks

    NASA Astrophysics Data System (ADS)

    Hatakeyama, Tetsuhiro S.; Kaneko, Kunihiko

    2015-11-01

    Circadian clocks exhibit the robustness of period and plasticity of phase against environmental changes such as temperature and nutrient conditions. Thus far, however, it is unclear how both are simultaneously achieved. By investigating distinct models of circadian clocks, we demonstrate reciprocity between robustness and plasticity: higher robustness in the period implies higher plasticity in the phase, where changes in period and in phase follow a linear relationship with a negative coefficient. The robustness of period is achieved by the adaptation on the limit cycle via a concentration change of a buffer molecule, whose temporal change leads to a phase shift following a shift of the limit-cycle orbit in phase space. Generality of reciprocity in clocks with the adaptation mechanism is confirmed with theoretical analysis of simple models, while biological significance is discussed.

  11. New robust statistical procedures for the polytomous logistic regression models.

    PubMed

    Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro

    2018-05-17

    This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real life examples are presented to justify the requirement of suitable robust statistical procedures in place of the likelihood based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.

  12. Robustness of reduced-order multivariable state-space self-tuning controller

    NASA Technical Reports Server (NTRS)

    Yuan, Zhuzhi; Chen, Zengqiang

    1994-01-01

    In this paper, we present a quantitative analysis of the robustness of a reduced-order pole-assignment state-space self-tuning controller for a multivariable adaptive control system in which the order of the real process is higher than that of the model used in the controller design. The stability analysis shows that, under a specific bounded modelling error, the adaptively controlled closed-loop real system via the reduced-order state-space self-tuner is BIBO stable in the presence of unmodelled dynamics.

  13. LL-37 Recruits Immunosuppressive Regulatory T Cells to Ovarian Tumors

    DTIC Science & Technology

    2009-11-01

    receptor. Western blot analysis of MSC lysates showed that ERK-1 and -2 are robustly phosphorylated beginning 10 minutes after LL-37 treatment and...Carretero, Escamez et al. 2008; von Haussen, Koczulla et al. 2008). Western blot analysis of LL-37-treated SK-OV-3 cell lysates showed the robust...mesenchymal stem cells in the treatment of gliomas." Cancer Res 65(8): 3307-18. Studeny, M., F. C. Marini, et al. (2004). "Mesenchymal stem cells: potential

  14. Robust analysis of an underwater navigational strategy in electrically heterogeneous corridors.

    PubMed

    Dimble, Kedar D; Ranganathan, Badri N; Keshavan, Jishnu; Humbert, J Sean

    2016-08-01

    Obstacles and other global stimuli provide relevant navigational cues to a weakly electric fish. In this work, robust analysis of a control strategy based on electrolocation for performing obstacle avoidance in electrically heterogeneous corridors is presented and validated. Static output feedback control is shown to achieve the desired goal of reflexive obstacle avoidance in such environments in simulation and experimentation. The proposed approach is computationally inexpensive and readily implementable on a small scale underwater vehicle, making underwater autonomous navigation feasible in real-time.

  15. Feasible logic Bell-state analysis with linear optics

    PubMed Central

    Zhou, Lan; Sheng, Yu-Bo

    2016-01-01

    We describe a feasible logic Bell-state analysis protocol that employs logic entanglement in the form of the robust concatenated Greenberger-Horne-Zeilinger (C-GHZ) state. This protocol uses only polarization beam splitters and half-wave plates, which are available with current experimental technology. We can conveniently identify two of the logic Bell states. This protocol can be easily generalized to arbitrary C-GHZ state analysis; we can also distinguish two N-logic-qubit C-GHZ states. As previous theory and experiment have both shown that the C-GHZ state is robust, this logic Bell-state analysis and C-GHZ state analysis may be essential for linear-optical quantum computation protocols whose building blocks are logic-qubit entangled states. PMID:26877208

  16. Feasible logic Bell-state analysis with linear optics.

    PubMed

    Zhou, Lan; Sheng, Yu-Bo

    2016-02-15

    We describe a feasible logic Bell-state analysis protocol that employs logic entanglement in the form of the robust concatenated Greenberger-Horne-Zeilinger (C-GHZ) state. This protocol uses only polarization beam splitters and half-wave plates, which are available with current experimental technology. We can conveniently identify two of the logic Bell states. This protocol can be easily generalized to arbitrary C-GHZ state analysis; we can also distinguish two N-logic-qubit C-GHZ states. As previous theory and experiment have both shown that the C-GHZ state is robust, this logic Bell-state analysis and C-GHZ state analysis may be essential for linear-optical quantum computation protocols whose building blocks are logic-qubit entangled states.

  17. Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr

    2010-03-24

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.

  18. Robust demarcation of basal cell carcinoma by dependent component analysis-based segmentation of multi-spectral fluorescence images.

    PubMed

    Kopriva, Ivica; Persin, Antun; Puizina-Ivić, Neira; Mirić, Lina

    2010-07-02

    This study was designed to demonstrate robust performance of the novel dependent component analysis (DCA)-based approach to demarcation of the basal cell carcinoma (BCC) through unsupervised decomposition of the red-green-blue (RGB) fluorescent image of the BCC. Robustness to intensity fluctuation is due to the scale invariance property of DCA algorithms, which exploit spectral and spatial diversities between the BCC and the surrounding tissue. The filtering-based DCA approach used here extends independent component analysis (ICA) and is necessary to account for the statistical dependence induced by spectral similarity between the BCC and the surrounding tissue. This similarity also generates weak edges, which pose a challenge for other segmentation methods as well. By comparative performance analysis with state-of-the-art image segmentation methods such as active contours (level set), K-means clustering, non-negative matrix factorization, ICA and ratio imaging, we experimentally demonstrate good performance of DCA-based BCC demarcation in two demanding scenarios in which the intensity of the fluorescent image was varied by almost two orders of magnitude. Copyright 2010 Elsevier B.V. All rights reserved.

  19. Boundedness and global robust stability analysis of delayed complex-valued neural networks with interval parameter uncertainties.

    PubMed

    Song, Qiankun; Yu, Qinqin; Zhao, Zhenjiang; Liu, Yurong; Alsaadi, Fuad E

    2018-07-01

    In this paper, the boundedness and robust stability of a class of delayed complex-valued neural networks with interval parameter uncertainties are investigated. By using the homeomorphic mapping theorem, the Lyapunov method and inequality techniques, a sufficient condition guaranteeing the boundedness of the networks and the existence, uniqueness and global robust stability of the equilibrium point is derived for the considered uncertain neural networks. The obtained robust stability criterion is expressed as a complex-valued LMI, which can be solved numerically using YALMIP with the SDPT3 solver in MATLAB. An example with simulations is supplied to show the applicability and advantages of the acquired result. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Adjustment of Adaptive Gain with Bounded Linear Stability Analysis to Improve Time-Delay Margin for Metrics-Driven Adaptive Control

    NASA Technical Reports Server (NTRS)

    Bakhtiari-Nejad, Maryam; Nguyen, Nhan T.; Krishnakumar, Kalmanje Srinvas

    2009-01-01

    This paper presents the application of the Bounded Linear Stability Analysis (BLSA) method to metrics-driven adaptive control. The BLSA method is used to analyze the stability of adaptive control models without linearizing the adaptive laws. Metrics-driven adaptive control introduces the notion that adaptation should be driven by stability metrics to achieve robustness. Through the BLSA method, the adaptive gain is adjusted during adaptation to meet certain phase margin requirements. The analysis of metrics-driven adaptive control is evaluated for a linear damaged twin-engine generic transport aircraft model. The analysis shows that the system with the adjusted adaptive gain becomes more robust to unmodeled dynamics or time delay.

  1. VCD Robustness of the Amide-I and Amide-II Vibrational Modes of Small Peptide Models.

    PubMed

    Góbi, Sándor; Magyarfalvi, Gábor; Tarczay, György

    2015-09-01

    The rotational strengths and the robustness values of amide-I and amide-II vibrational modes of For(AA)n NHMe (where AA is Val, Asn, Asp, or Cys, n = 1-5 for Val and Asn; n = 1 for Asp and Cys) model peptides with α-helix and β-sheet backbone conformations were computed by density functional methods. The robustness results verify empirical rules drawn from experiments and from computed rotational strengths linking amide-I and amide-II patterns in the vibrational circular dichroism (VCD) spectra of peptides with their backbone structures. For peptides with at least three residues (n ≥ 3) these characteristic patterns from coupled amide vibrational modes have robust signatures. For shorter peptide models many vibrational modes are nonrobust, and the robust modes can be dependent on the residues or on their side chain conformations in addition to backbone conformations. These robust VCD bands, however, provide information for the detailed structural analysis of these smaller systems. © 2015 Wiley Periodicals, Inc.

  2. Research on robust optimization of emergency logistics network considering the time dependence characteristic

    NASA Astrophysics Data System (ADS)

    WANG, Qingrong; ZHU, Changfeng; LI, Ying; ZHANG, Zhengkun

    2017-06-01

    Considering the time dependence of emergency logistics networks and the complexity of the environment in which they operate, this paper combines time-dependent network optimization theory with robust discrete optimization theory and builds a robust emergency logistics dynamic network optimization model that maximizes the timeliness of emergency logistics. On this basis, considering the complexity of the dynamic network and the time dependence of edge weights, an improved ant colony algorithm is proposed to couple the optimization algorithm with the network's time dependence and robustness. Finally, a case study was carried out to verify the validity of the robust optimization model and its algorithm, and the effect of different values of the regulation factor, which is important in solving for the optimal path, was analyzed. The results show that the model and algorithm have good timeliness and strong robustness.

  3. Principal coordinate analysis assisted chromatographic analysis of bacterial cell wall collection: A robust classification approach.

    PubMed

    Kumar, Keshav; Cava, Felipe

    2018-04-10

    In the present work, principal coordinate analysis (PCoA) is introduced to develop a robust model to classify chromatographic data sets of peptidoglycan samples. PCoA captures the heterogeneity present in the data sets by using the dissimilarity matrix as input. Thus, in principle, it can capture even subtle differences in bacterial peptidoglycan composition and can provide a more robust and fast approach for classifying bacterial collections and identifying novel cell wall targets for further biological and clinical studies. The utility of the proposed approach is successfully demonstrated by analysing two different kinds of bacterial collections. The first set comprised peptidoglycan samples belonging to different subclasses of Alphaproteobacteria. The second set, which is relatively more intricate for chemometric analysis, consists of different wild-type Vibrio cholerae strains and their mutants having subtle differences in their peptidoglycan composition. The present work clearly proposes a useful approach that can classify chromatographic data sets of peptidoglycan samples having subtle differences, and it suggests that PCoA can be a method of choice in any data analysis workflow. Copyright © 2018 Elsevier Inc. All rights reserved.
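    The core computation behind PCoA (classical multidimensional scaling) is compact enough to sketch. The following Python/NumPy version, with a toy Euclidean distance matrix standing in for a chromatographic dissimilarity matrix, is an illustration rather than the authors' implementation:

```python
import numpy as np

def pcoa(D, n_axes=2):
    """Principal coordinate analysis (classical MDS): embed samples from a
    pairwise dissimilarity matrix D via Gower double-centring."""
    D = np.asarray(D, dtype=float)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centred matrix
    evals, evecs = np.linalg.eigh(B)
    order = np.argsort(evals)[::-1]              # largest eigenvalues first
    evals, evecs = evals[order], evecs[:, order]
    kept = np.clip(evals[:n_axes], 0.0, None)    # clip negative eigenvalues
    return evecs[:, :n_axes] * np.sqrt(kept)

# toy dissimilarity matrix: Euclidean distances between four 2-D points
pts = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0], [3.0, 4.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)

coords = pcoa(D)
rec = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
```

    When the dissimilarities are Euclidean, as in this toy example, the embedded coordinates reproduce the input distances exactly; for genuinely non-Euclidean dissimilarities some eigenvalues go negative and are clipped.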

  4. Breakdown of interdependent directed networks.

    PubMed

    Liu, Xueming; Stanley, H Eugene; Gao, Jianxi

    2016-02-02

    Increasing evidence shows that real-world systems interact with one another via dependency connectivities. Failing connectivities are the mechanism behind the breakdown of interacting complex systems, e.g., blackouts caused by the interdependence of power grids and communication networks. Previous research analyzing the robustness of interdependent networks has been limited to undirected networks. However, most real-world networks are directed, their in-degrees and out-degrees may be correlated, and they are often coupled to one another as interdependent directed networks. To understand the breakdown and robustness of interdependent directed networks, we develop a theoretical framework based on generating functions and percolation theory. We find that for interdependent Erdős-Rényi networks the directionality within each network increases their vulnerability and leads to hybrid phase transitions. We also find that the percolation behavior of interdependent directed scale-free networks with and without degree correlations is so complex that two criteria are needed to quantify and compare their robustness: the percolation threshold and the integrated size of the giant component during an entire attack process. Interestingly, we find that in-degree and out-degree correlations in each network layer increase the robustness of interdependent degree-heterogeneous networks, which most real networks are, but decrease the robustness of interdependent networks with homogeneous degree distributions and strong coupling strengths. Moreover, by applying our theoretical analysis to real interdependent international trade networks, we find that the robustness of these real-world systems increases with the in-degree and out-degree correlations, confirming our theoretical analysis.
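    The directed framework in this paper relies on generating functions, but the flavor of the percolation analysis can be sketched with the classic undirected analogue: two fully interdependent Erdős-Rényi layers with mean degree k, whose mutual giant-component fraction satisfies the fixed-point equation P = p(1 - exp(-kP))². The code below illustrates that simpler case, not the directed theory itself:

```python
from math import exp

def mutual_giant(p, k, tol=1e-10, max_iter=10_000):
    """Iterate P = p*(1 - exp(-k*P))**2 to a fixed point: the mutual
    giant-component fraction of two interdependent Erdos-Renyi networks
    after a random attack that keeps a fraction p of the nodes."""
    P = p  # start from the no-cascade upper bound
    for _ in range(max_iter):
        P_next = p * (1.0 - exp(-k * P)) ** 2
        if abs(P_next - P) < tol:
            break
        P = P_next
    return P_next

k = 4.0  # mean degree; the known critical point is p_c ~ 2.4554/k ~ 0.614
sizes = {p: mutual_giant(p, k) for p in (0.5, 0.6, 0.7, 0.8, 0.9, 1.0)}
```

    The jump of the mutual giant component from zero at p = 0.6 to a finite value at p = 0.7 reflects the discontinuous (first-order) transition characteristic of interdependent networks, in contrast to the continuous transition of a single network.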

  5. Reliability of simulated robustness testing in fast liquid chromatography, using state-of-the-art column technology, instrumentation and modelling software.

    PubMed

    Kormány, Róbert; Fekete, Jenő; Guillarme, Davy; Fekete, Szabolcs

    2014-02-01

    The goal of this study was to evaluate the accuracy of simulated robustness testing using commercial modelling software (DryLab) and state-of-the-art stationary phases. For this purpose, a mixture of amlodipine and its seven related impurities was analyzed on short narrow bore columns (50×2.1mm, packed with sub-2μm particles) providing short analysis times. The performance of commercial modelling software for robustness testing was systematically compared to experimental measurements and DoE based predictions. We have demonstrated that the reliability of predictions was good, since the predicted retention times and resolutions were in good agreement with the experimental ones at the edges of the design space. In average, the retention time relative errors were <1.0%, while the predicted critical resolution errors were comprised between 6.9 and 17.2%. Because the simulated robustness testing requires significantly less experimental work than the DoE based predictions, we think that robustness could now be investigated in the early stage of method development. Moreover, the column interchangeability, which is also an important part of robustness testing, was investigated considering five different C8 and C18 columns packed with sub-2μm particles. Again, thanks to modelling software, we proved that the separation was feasible on all columns within the same analysis time (less than 4min), by proper adjustments of variables. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Using dynamic mode decomposition to extract cyclic behavior in the stock market

    NASA Astrophysics Data System (ADS)

    Hua, Jia-Chen; Roy, Sukesh; McCauley, Joseph L.; Gunaratne, Gemunu H.

    2016-04-01

    The presence of cyclic expansions and contractions in the economy has been known for over a century. The work reported here searches for similar cyclic behavior in stock valuations. The variations are subtle and can only be extracted through analysis of price variations of a large number of stocks. Koopman mode analysis is a natural approach to establish such collective oscillatory behavior. The difficulty is that even non-cyclic and stochastic constituents of a finite data set may be interpreted as a sum of periodic motions. However, deconvolution of these irregular dynamical facets may be expected to be non-robust, i.e., to depend on the specific data set. We propose an approach to differentiate robust and non-robust features in a time series; it is based on identifying robust features with reproducible Koopman modes, i.e., those that persist between distinct sub-groupings of the data. Our analysis of stock data discovered four reproducible modes, one of which has a period close to the number of trading days per year. To the best of our knowledge these cycles were not reported previously. It is particularly interesting that the cyclic behaviors persisted through the great recession even though phase relationships between stocks within the modes evolved in the intervening period.
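    A minimal sketch of the mode-extraction step: exact dynamic mode decomposition fits a linear propagator between consecutive snapshots and reads cycle periods off the eigenvalue angles. The synthetic "market" below (40 noisy series sharing one 250-step cycle) is invented for illustration and is unrelated to the authors' stock data:

```python
import numpy as np

def dmd_eigs(X, rank):
    """Exact DMD: fit the linear propagator x_{t+1} ~ A x_t on the snapshot
    matrix X (features x time) and return the leading DMD eigenvalues."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = U.conj().T @ X2 @ Vh.conj().T / s   # rank-reduced propagator
    return np.linalg.eigvals(A_tilde)

# synthetic "market": 40 noisy series sharing one 250-step cycle (invented)
rng = np.random.default_rng(0)
t = np.arange(600)
phases = rng.uniform(0.0, 2.0 * np.pi, size=40)
X = np.cos(2.0 * np.pi * t / 250.0 + phases[:, None]) \
    + 0.05 * rng.standard_normal((40, 600))

eigs = dmd_eigs(X, rank=2)
period = 2.0 * np.pi / abs(np.angle(eigs[0]))   # steps per cycle
```

    Robustness in the authors' sense would then be checked by re-estimating the modes on distinct sub-groupings of the series and keeping only eigenvalues that reproduce across groups.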

  7. Data depth based clustering analysis

    DOE PAGES

    Jeong, Myeong -Hun; Cai, Yaping; Sullivan, Clair J.; ...

    2016-01-01

    Here, this paper proposes a new algorithm for identifying patterns within data, based on data depth. Such a clustering analysis has an enormous potential to discover previously unknown insights from existing data sets. Many clustering algorithms already exist for this purpose. However, most algorithms are not affine invariant. Therefore, they must operate with different parameters after the data sets are rotated, scaled, or translated. Further, most clustering algorithms, based on Euclidean distance, can be sensitive to noises because they have no global perspective. Parameter selection also significantly affects the clustering results of each algorithm. Unlike many existing clustering algorithms, the proposed algorithm, called data depth based clustering analysis (DBCA), is able to detect coherent clusters after the data sets are affine transformed without changing a parameter. It is also robust to noises because using data depth can measure centrality and outlyingness of the underlying data. Further, it can generate relatively stable clusters by varying the parameter. The experimental comparison with the leading state-of-the-art alternatives demonstrates that the proposed algorithm outperforms DBSCAN and HDBSCAN in terms of affine invariance, and exceeds or matches the robustness to noises of DBSCAN or HDBSCAN. The robustness to parameter selection is also demonstrated through the case study of clustering twitter data.

  8. Robust Stability Analysis of the Space Launch System Control Design: A Singular Value Approach

    NASA Technical Reports Server (NTRS)

    Pei, Jing; Newsome, Jerry R.

    2015-01-01

    Classical stability analysis consists of breaking the feedback loops one at a time and determining separately how much gain or phase variation would destabilize the stable nominal feedback system. For typical launch vehicle control design, classical control techniques are generally employed. In addition to stability margins, frequency domain Monte Carlo methods are used to evaluate the robustness of the design. However, such techniques were developed for Single-Input-Single-Output (SISO) systems and do not take into consideration the off-diagonal terms in the transfer function matrix of Multi-Input-Multi-Output (MIMO) systems. Robust stability analysis techniques such as H∞ and μ are applicable to MIMO systems but have not been adopted as standard practices within the launch vehicle controls community. This paper took advantage of a simple singular-value-based MIMO stability margin evaluation method based on work done by Mukhopadhyay and Newsom and applied it to the SLS high-fidelity dynamics model. The method computes a simultaneous multi-loop gain and phase margin that can be related back to classical margins. The results presented in this paper suggest that for the SLS system, traditional SISO stability margins are similar to the MIMO margins. This additional level of verification provides confidence in the robustness of the control design.
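    In its simplest form, the singular-value idea reduces to scanning the smallest singular value of the return difference I + L(jω) over frequency; its minimum bounds simultaneous gain and phase perturbations in all loops at once. The 2×2 loop transfer matrix below is hypothetical, and the code is a simplified illustration of the idea, not the SLS analysis:

```python
import numpy as np

def min_return_difference(L_of_s, freqs):
    """Minimum over frequency of the smallest singular value of the
    return difference I + L(jw), a multivariable stability-margin measure."""
    vals = []
    for w in freqs:
        L = L_of_s(1j * w)
        vals.append(np.linalg.svd(np.eye(L.shape[0]) + L, compute_uv=False)[-1])
    return min(vals)

def L_example(s):
    """Hypothetical 2x2 loop: two integrators with rolloff, weakly coupled."""
    g = 10.0 / (s * (s + 2.0))
    return np.array([[g, 0.1 * g], [0.0 * g, g]])

freqs = np.logspace(-2, 2, 400)
alpha = min_return_difference(L_example, freqs)
```

    A larger α means the multivariable Nyquist locus stays farther from the critical point, hence larger guaranteed simultaneous multi-loop gain and phase margins.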

  9. Parenchymal texture analysis in digital mammography: robust texture feature identification and equivalence across devices.

    PubMed

    Keller, Brad M; Oustimov, Andrew; Wang, Yan; Chen, Jinbo; Acciavatti, Raymond J; Zheng, Yuanjie; Ray, Shonket; Gee, James C; Maidment, Andrew D A; Kontos, Despina

    2015-04-01

    An analytical framework is presented for evaluating the equivalence of parenchymal texture features across different full-field digital mammography (FFDM) systems using a physical breast phantom. Phantom images (FOR PROCESSING) are acquired from three FFDM systems using their automated exposure control setting. A panel of texture features, including gray-level histogram, co-occurrence, run length, and structural descriptors, is extracted. To identify features that are robust across imaging systems, a series of equivalence tests is performed on the feature distributions, in which the extent of their intersystem variation is compared to their intrasystem variation via the Hodges-Lehmann test statistic. Overall, histogram and structural features tend to be most robust across all systems, and certain features, such as edge enhancement, tend to be more robust to intergenerational differences between detectors of a single vendor than to intervendor differences. Texture features extracted from larger regions of interest and with a larger offset length, when applicable, also appear to be more robust across imaging systems. This framework and the observations from our experiments may benefit applications utilizing mammographic texture analysis on images acquired in multivendor settings, such as multicenter studies of computer-aided detection and breast cancer risk assessment.
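    The Hodges-Lehmann statistic at the heart of such equivalence tests is simply the median of all pairwise differences between two feature samples. The sketch below, with made-up feature values for two hypothetical FFDM systems and a crude MAD-based cutoff (our assumption, not the paper's protocol), illustrates comparing an intersystem shift against intrasystem spread:

```python
import numpy as np

def hodges_lehmann_shift(a, b):
    """Hodges-Lehmann location-shift estimate: median of all pairwise
    differences a_i - b_j between the two samples."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.median(a[:, None] - b[None, :]))

# made-up texture-feature values from two hypothetical FFDM systems
rng = np.random.default_rng(3)
sys_a = rng.normal(5.0, 0.2, size=50)
sys_b = rng.normal(5.6, 0.2, size=50)

shift = hodges_lehmann_shift(sys_a, sys_b)
# crude robustness call (our assumption): the intersystem shift must stay
# within twice the intrasystem spread (median absolute deviation)
intra = float(np.median(np.abs(sys_a - np.median(sys_a))))
robust_across_systems = bool(abs(shift) < 2.0 * intra)
```

    Here the simulated intersystem shift clearly exceeds the intrasystem spread, so this feature would be flagged as not equivalent across the two systems.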

  10. Scaled test statistics and robust standard errors for non-normal data in covariance structure analysis: a Monte Carlo study.

    PubMed

    Chou, C P; Bentler, P M; Satorra, A

    1991-11-01

    Research studying robustness of maximum likelihood (ML) statistics in covariance structure analysis has concluded that test statistics and standard errors are biased under severe non-normality. An estimation procedure known as asymptotic distribution free (ADF), making no distributional assumption, has been suggested to avoid these biases. Corrections to the normal theory statistics to yield more adequate performance have also been proposed. This study compares the performance of a scaled test statistic and robust standard errors for two models under several non-normal conditions and also compares these with the results from ML and ADF methods. Both ML and ADF test statistics performed rather well in one model and considerably worse in the other. In general, the scaled test statistic seemed to behave better than the ML test statistic and the ADF statistic performed the worst. The robust and ADF standard errors yielded more appropriate estimates of sampling variability than the ML standard errors, which were usually downward biased, in both models under most of the non-normal conditions. ML test statistics and standard errors were found to be quite robust to the violation of the normality assumption when data had either symmetric and platykurtic distributions, or non-symmetric and zero kurtotic distributions.

  11. [Research on K-means clustering segmentation method for MRI brain image based on selecting multi-peaks in gray histogram].

    PubMed

    Chen, Zhaoxue; Yu, Haizhong; Chen, Hao

    2013-12-01

    To solve the problem of traditional K-means clustering in which initial clustering centers are selected randomly, we proposed a new K-means segmentation algorithm based on robustly selecting 'peaks' standing for White Matter, Gray Matter and Cerebrospinal Fluid in the multi-peaks gray histogram of an MRI brain image. The new algorithm takes the gray values of the selected histogram 'peaks' as the initial K-means clustering centers and can segment the MRI brain image into the three tissue classes more effectively, accurately and stably. Extensive experiments have shown that the proposed algorithm overcomes shortcomings of the traditional K-means clustering method such as low efficiency, poor accuracy, weak robustness and long running time. The histogram 'peak' selection idea of the proposed segmentation method also has broader applicability.
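    The seeding idea can be sketched for a flattened 8-bit grayscale image: find the dominant histogram peaks and use their gray levels as initial centers for one-dimensional K-means. The smoothing window and peak-separation values below are illustrative assumptions, not the paper's exact parameters:

```python
def smoothed_histogram(pixels, bins=256, window=5):
    """Gray-level histogram smoothed with a moving sum to merge jitter."""
    hist = [0] * bins
    for p in pixels:
        hist[p] += 1
    half = window // 2
    return [sum(hist[max(0, i - half):i + half + 1]) for i in range(bins)]

def top_peaks(hist, k=3, min_separation=10):
    """Local maxima, taken greedily from the highest, kept min_separation apart."""
    candidates = sorted(
        (i for i in range(1, len(hist) - 1)
         if hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1] and hist[i] > 0),
        key=lambda i: hist[i], reverse=True)
    peaks = []
    for i in candidates:
        if all(abs(i - p) >= min_separation for p in peaks):
            peaks.append(i)
        if len(peaks) == k:
            break
    return sorted(peaks)

def kmeans_1d(values, centers, iters=25):
    """Lloyd's algorithm on scalar gray values, seeded with the peak positions."""
    centers = [float(c) for c in centers]
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            nearest = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            clusters[nearest].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers
```

    For an MRI brain slice, the three surviving peaks would play the roles of the CSF, gray matter and white matter modes.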

  12. Human region segmentation and description methods for domiciliary healthcare monitoring using chromatic methodology

    NASA Astrophysics Data System (ADS)

    Al-Temeemy, Ali A.

    2018-03-01

    A descriptor is proposed for use in domiciliary healthcare monitoring systems. The descriptor is produced from chromatic methodology to extract robust features from the monitoring system's images. It has superior discrimination capabilities, is robust to events that normally disturb monitoring systems, and requires less computational time and storage space to achieve recognition. A method of human region segmentation is also used with this descriptor. The performance of the proposed descriptor was evaluated using experimental data sets obtained through a series of experiments performed in the Centre for Intelligent Monitoring Systems, University of Liverpool. The evaluation results show high recognition performance for the proposed descriptor in comparison to traditional descriptors, such as moment invariants. The results also show the effectiveness of the proposed segmentation method regarding distortion effects associated with domiciliary healthcare systems.

  13. A New Quantum Watermarking Based on Quantum Wavelet Transforms

    NASA Astrophysics Data System (ADS)

    Heidari, Shahrokh; Naseri, Mosayeb; Gheibi, Reza; Baghfalaki, Masoud; Rasoul Pourarian, Mohammad; Farouk, Ahmed

    2017-06-01

    Quantum watermarking is a technique to embed specific information, usually the owner's identification, into quantum cover data, typically for copyright protection purposes. In this paper, a new scheme for quantum watermarking based on quantum wavelet transforms is proposed which includes scrambling, embedding and extracting procedures. The invisibility and robustness performances of the proposed watermarking method are confirmed by simulation. The invisibility of the scheme is examined by the peak signal-to-noise ratio (PSNR) and the histogram calculation. Furthermore, the robustness of the scheme is analyzed by the Bit Error Rate (BER) and the Two-Dimensional Correlation (Corr 2-D) calculation. The simulation results indicate that the proposed watermarking scheme achieves not only acceptable visual quality but also good resistance against different types of attacks. Supported by Kermanshah Branch, Islamic Azad University, Kermanshah, Iran
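    Both figures of merit named here are standard and easy to state for classical (simulated) pixel and bit sequences; a minimal sketch, with PSNR defined against an 8-bit peak value of 255:

```python
import math

def psnr(original, watermarked, max_val=255):
    """Peak signal-to-noise ratio between two equal-length pixel sequences."""
    mse = sum((a - b) ** 2 for a, b in zip(original, watermarked)) / len(original)
    return float("inf") if mse == 0 else 10.0 * math.log10(max_val ** 2 / mse)

def bit_error_rate(sent_bits, recovered_bits):
    """Fraction of watermark bits flipped, e.g. by an attack on the stego image."""
    return sum(a != b for a, b in zip(sent_bits, recovered_bits)) / len(sent_bits)
```

    Higher PSNR indicates better invisibility of the embedded mark; lower BER after an attack indicates better robustness.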

  14. The real-world navigator

    NASA Technical Reports Server (NTRS)

    Balabanovic, Marko; Becker, Craig; Morse, Sarah K.; Nourbakhsh, Illah R.

    1994-01-01

    The success of every mobile robot application hinges on the ability to navigate robustly in the real world. The problem of robust navigation is separable from the challenges faced by any particular robot application. We offer the Real-World Navigator as a solution architecture that includes a path planner, a map-based localizer, and a motion control loop that combines reactive avoidance modules with deliberate goal-based motion. Our architecture achieves a high degree of reliability by maintaining and reasoning about an explicit description of positional uncertainty. We provide two implementations of real-world robot systems that incorporate the Real-World Navigator. The Vagabond Project culminated in a robot that successfully navigated a portion of the Stanford University campus. The Scimmer project developed successful entries for the AIAA 1993 Robotics Competition, placing first in one of the two contests entered.

  15. Integrated analysis of numerous heterogeneous gene expression profiles for detecting robust disease-specific biomarkers and proposing drug targets.

    PubMed

    Amar, David; Hait, Tom; Izraeli, Shai; Shamir, Ron

    2015-09-18

    Genome-wide expression profiling has revolutionized biomedical research; vast amounts of expression data from numerous studies of many diseases are now available. Making the best use of this resource in order to better understand disease processes and treatment remains an open challenge. In particular, disease biomarkers detected in case-control studies suffer from low reliability and are only weakly reproducible. Here, we present a systematic integrative analysis methodology to overcome these shortcomings. We assembled and manually curated more than 14,000 expression profiles spanning 48 diseases and 18 expression platforms. We show that when studying a particular disease, judicious utilization of profiles from other diseases and information on disease hierarchy improves classification quality, avoids overoptimistic evaluation of that quality, and enhances disease-specific biomarker discovery. This approach yielded specific biomarkers for 24 of the analyzed diseases. We demonstrate how to combine these biomarkers with large-scale interaction, mutation and drug target data, forming a highly valuable disease summary that suggests novel directions in disease understanding and drug repurposing. Our analysis also estimates the number of samples required to reach a desired level of biomarker stability. This methodology can greatly improve the exploitation of the mountain of expression profiles for better disease analysis. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. A cost-effectiveness analysis of two different antimicrobial stewardship programs.

    PubMed

    Okumura, Lucas Miyake; Riveros, Bruno Salgado; Gomes-da-Silva, Monica Maria; Veroneze, Izelandia

    2016-01-01

    There is a lack of formal economic analysis to assess the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study to assess two different strategies of antimicrobial stewardship programs. A 30-day Markov model was developed to analyze the cost-effectiveness of a Bundled Antimicrobial Stewardship program implemented in a university hospital in Brazil. Clinical data derived from a historical cohort that compared two different antimicrobial stewardship strategies and had 30-day mortality as the main outcome. Selected costs included workload, cost of defined daily doses, length of stay, and laboratory and imaging resources used to diagnose infections. Data were analyzed by deterministic and probabilistic sensitivity analysis to assess the model's robustness, a tornado diagram and a cost-effectiveness acceptability curve. The Bundled Strategy was more expensive (cost difference US$ 2119.70) but more efficient (US$ 27,549.15 vs 29,011.46). Deterministic and probabilistic sensitivity analysis suggested that critical variables did not alter the final Incremental Cost-Effectiveness Ratio. The Bundled Strategy had a higher probability of being cost-effective, which was endorsed by the cost-effectiveness acceptability curve. As health systems call for efficient technologies, this study concludes that the Bundled Antimicrobial Stewardship Program was more cost-effective, meaning that stewardship strategies with such characteristics would be of special interest from a societal and clinical perspective. Copyright © 2016 Elsevier Editora Ltda. All rights reserved.
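    The headline comparison reduces to an incremental cost-effectiveness ratio (ICER). A minimal sketch with hypothetical numbers; the study's own effectiveness outcome is 30-day mortality, so the effect units below are purely illustrative:

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect.

    A willingness-to-pay threshold is compared against this ratio; a negative
    numerator with a positive denominator means the new strategy dominates
    (cheaper and more effective), so no threshold comparison is needed.
    """
    return (cost_new - cost_old) / (effect_new - effect_old)
```

    For example, `icer(29000, 0.95, 27000, 0.90)` prices the more effective strategy at roughly 40,000 currency units per unit of effect gained; sensitivity analysis then reruns this calculation while perturbing each input across its plausible range.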

  17. An Improved Analysis of the Sevoflurane-Benzene Structure by Chirped Pulse Ftmw Spectroscopy

    NASA Astrophysics Data System (ADS)

    Seifert, Nathan A.; Perez, Cristobal; Zaleski, Daniel P.; Neill, Justin L.; Pate, Brooks H.; Lesarri, Alberto; Vallejo, Montserrat; Cocinero, Emilio J.; Castano, Fernando; Kleiner, Isabelle

    2013-06-01

    Recent improvements to the 2-8 GHz CP-FTMW spectrometer at the University of Virginia have improved the structural and spectroscopic analysis of the sevoflurane-benzene cluster. Previously reported results, although robust, were limited to a fit of the a-type transitions of the normal species in the determination of the six-fold barrier to benzene internal rotation. Structural analysis was limited to the benzene hydrogen atom positions using benzene-d_{1}. The increased sensitivity of the new 2-8 GHz setup allows for a full internal rotation analysis of the a- and c-type transitions of the normal species, which was performed with BELGI. A fit value for V_{6} of 32.868(11) cm^{-1} is determined. Additionally, a full substitution structure of the benzene carbon atom positions was determined in natural abundance. Also, new measurements of a sevoflurane/benzene-d_{1} mixture enabled detection of 33 of the 60 possible ^{2}D / ^{13}C double isotopologues. This abundance of isotopic data, a total of 45 isotopologues, enabled a full heavy atom least-squares r_{0} structure fit for the complex, including positions for all seven fluorines in sevoflurane. N. A. Seifert, D. P. Zaleski, J. L. Neill, B. H. Pate, A. Lesarri, M. Vallejo, E. J. Cocinero, F. Castaño. 67th OSU Int. Symp. On Mol. Spectrosc., Columbus, OH, 2012, MH13.

  18. Testing Universal Relations of Neutron Stars with a Nonlinear Matter-Gravity Coupling Theory

    NASA Astrophysics Data System (ADS)

    Sham, Y.-H.; Lin, L.-M.; Leung, P. T.

    2014-02-01

    Due to our ignorance of the equation of state (EOS) beyond nuclear density, there is still no unique theoretical model for neutron stars (NSs). It is therefore surprising that universal EOS-independent relations connecting different physical quantities of NSs can exist. Lau et al. found that the frequency of the f-mode oscillation, the mass, and the moment of inertia are connected by universal relations. More recently, Yagi and Yunes discovered the I-Love-Q universal relations among the mass, the moment of inertia, the Love number, and the quadrupole moment. In this paper, we study these universal relations in the Eddington-inspired Born-Infeld (EiBI) gravity. This theory differs from general relativity (GR) significantly only at high densities due to the nonlinear coupling between matter and gravity. It thus provides us an ideal case to test how robust the universal relations of NSs are with respect to the change of the gravity theory. Due to the apparent EOS formulation of EiBI gravity developed recently by Delsate and Steinhoff, we are able to study the universal relations in EiBI gravity using the same techniques as those in GR. We find that the universal relations in EiBI gravity are essentially the same as those in GR. Our work shows that, within the currently viable coupling constant, there exists at least one modified gravity theory that is indistinguishable from GR in view of the unexpected universal relations.

  19. Testing universal relations of neutron stars with a nonlinear matter-gravity coupling theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sham, Y.-H.; Lin, L.-M.; Leung, P. T., E-mail: yhsham@phy.cuhk.edu.hk, E-mail: lmlin@phy.cuhk.edu.hk, E-mail: ptleung@phy.cuhk.edu.hk

    Due to our ignorance of the equation of state (EOS) beyond nuclear density, there is still no unique theoretical model for neutron stars (NSs). It is therefore surprising that universal EOS-independent relations connecting different physical quantities of NSs can exist. Lau et al. found that the frequency of the f-mode oscillation, the mass, and the moment of inertia are connected by universal relations. More recently, Yagi and Yunes discovered the I-Love-Q universal relations among the mass, the moment of inertia, the Love number, and the quadrupole moment. In this paper, we study these universal relations in the Eddington-inspired Born-Infeld (EiBI) gravity. This theory differs from general relativity (GR) significantly only at high densities due to the nonlinear coupling between matter and gravity. It thus provides us an ideal case to test how robust the universal relations of NSs are with respect to the change of the gravity theory. Due to the apparent EOS formulation of EiBI gravity developed recently by Delsate and Steinhoff, we are able to study the universal relations in EiBI gravity using the same techniques as those in GR. We find that the universal relations in EiBI gravity are essentially the same as those in GR. Our work shows that, within the currently viable coupling constant, there exists at least one modified gravity theory that is indistinguishable from GR in view of the unexpected universal relations.

  20. Microscopy image segmentation tool: Robust image data analysis

    NASA Astrophysics Data System (ADS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST's capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized, user-developed analyses. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  1. Robust model-based analysis of single-particle tracking experiments with Spot-On

    PubMed Central

    Grimm, Jonathan B; Lavis, Luke D

    2018-01-01

    Single-particle tracking (SPT) has become an important method to bridge biochemistry and cell biology since it allows direct observation of protein binding and diffusion dynamics in live cells. However, accurately inferring information from SPT studies is challenging due to biases in both data analysis and experimental design. To address analysis bias, we introduce ‘Spot-On’, an intuitive web-interface. Spot-On implements a kinetic modeling framework that accounts for known biases, including molecules moving out-of-focus, and robustly infers diffusion constants and subpopulations from pooled single-molecule trajectories. To minimize inherent experimental biases, we implement and validate stroboscopic photo-activation SPT (spaSPT), which minimizes motion-blur bias and tracking errors. We validate Spot-On using experimentally realistic simulations and show that Spot-On outperforms other methods. We then apply Spot-On to spaSPT data from live mammalian cells spanning a wide range of nuclear dynamics and demonstrate that Spot-On consistently and robustly infers subpopulation fractions and diffusion constants. PMID:29300163

  2. Robust model-based analysis of single-particle tracking experiments with Spot-On.

    PubMed

    Hansen, Anders S; Woringer, Maxime; Grimm, Jonathan B; Lavis, Luke D; Tjian, Robert; Darzacq, Xavier

    2018-01-04

    Single-particle tracking (SPT) has become an important method to bridge biochemistry and cell biology since it allows direct observation of protein binding and diffusion dynamics in live cells. However, accurately inferring information from SPT studies is challenging due to biases in both data analysis and experimental design. To address analysis bias, we introduce 'Spot-On', an intuitive web-interface. Spot-On implements a kinetic modeling framework that accounts for known biases, including molecules moving out-of-focus, and robustly infers diffusion constants and subpopulations from pooled single-molecule trajectories. To minimize inherent experimental biases, we implement and validate stroboscopic photo-activation SPT (spaSPT), which minimizes motion-blur bias and tracking errors. We validate Spot-On using experimentally realistic simulations and show that Spot-On outperforms other methods. We then apply Spot-On to spaSPT data from live mammalian cells spanning a wide range of nuclear dynamics and demonstrate that Spot-On consistently and robustly infers subpopulation fractions and diffusion constants. © 2018, Hansen et al.

  3. Project risk management in the construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya

    2018-03-01

    This paper shows project risk management methods which make it possible to better identify risks in the construction of high-rise buildings and to manage them throughout the life cycle of the project. One of the project risk management processes is quantitative risk analysis, which usually includes assessing the potential impact of project risks and their probabilities. This paper reviews the most popular methods of risk probability assessment and indicates the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended for the tasks of regression analysis of project data. The suggested algorithms for assessing the parameters of statistical models yield reliable estimates. A review of the theoretical problems in developing robust models built on the methodology of minimax estimates was carried out, and an algorithm for the situation of asymmetric "contamination" was developed.
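    Huber's robust approach can be illustrated with iteratively reweighted least squares for a simple linear model: residuals inside a band get full weight, larger ones are down-weighted so outlying project data points cannot dominate the fit. This is a sketch, not the paper's algorithm; in particular the tuning constant k is applied to raw residuals here, whereas a careful implementation rescales residuals by a robust spread estimate (e.g. the MAD) each iteration:

```python
def huber_weight(residual, k):
    """Huber weight: 1 inside the band |r| <= k, k/|r| outside it."""
    r = abs(residual)
    return 1.0 if r <= k else k / r

def robust_line_fit(xs, ys, k=1.5, iters=50):
    """Fit y = a + b*x by iteratively reweighted least squares (Huber weights)."""
    n = len(xs)
    w = [1.0] * n  # first pass is ordinary least squares
    a = b = 0.0
    for _ in range(iters):
        sw = sum(w)
        sx = sum(wi * x for wi, x in zip(w, xs))
        sy = sum(wi * y for wi, y in zip(w, ys))
        sxx = sum(wi * x * x for wi, x in zip(w, xs))
        sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        # solve the 2x2 weighted normal equations
        b = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
        a = (sy - b * sx) / sw
        # reweight using the residuals of the current fit
        w = [huber_weight(y - (a + b * x), k) for x, y in zip(xs, ys)]
    return a, b
```

    On a ten-point line y = 2x + 1 with one gross outlier, the robust fit stays near slope 2, while the first ordinary-least-squares pass is pulled far off it.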

  4. LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS

    PubMed Central

    Einstein, Daniel R.; Dyedov, Vladimir

    2010-01-01

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods. PMID:20628546

  5. Multilayer Perceptron for Robust Nonlinear Interval Regression Analysis Using Genetic Algorithms

    PubMed Central

    2014-01-01

    On the basis of fuzzy regression, computational models in intelligence such as neural networks have the capability to be applied to nonlinear interval regression analysis for dealing with uncertain and imprecise data. When training data are not contaminated by outliers, computational models perform well by including almost all given training data in the data interval. Nevertheless, since training data are often corrupted by outliers, robust learning algorithms employed to resist outliers for interval regression analysis have been an interesting area of research. Several approaches involving computational intelligence are effective for resisting outliers, but the required parameters for these approaches are related to whether the collected data contain outliers or not. Since it seems difficult to prespecify the degree of contamination beforehand, this paper uses a multilayer perceptron to construct the robust nonlinear interval regression model using a genetic algorithm. Outliers beyond or beneath the data interval will have only a slight effect on the determination of the data interval. Simulation results demonstrate that the proposed method performs well for contaminated datasets. PMID:25110755

  6. Multilayer perceptron for robust nonlinear interval regression analysis using genetic algorithms.

    PubMed

    Hu, Yi-Chung

    2014-01-01

    On the basis of fuzzy regression, computational models in intelligence such as neural networks have the capability to be applied to nonlinear interval regression analysis for dealing with uncertain and imprecise data. When training data are not contaminated by outliers, computational models perform well by including almost all given training data in the data interval. Nevertheless, since training data are often corrupted by outliers, robust learning algorithms employed to resist outliers for interval regression analysis have been an interesting area of research. Several approaches involving computational intelligence are effective for resisting outliers, but the required parameters for these approaches are related to whether the collected data contain outliers or not. Since it seems difficult to prespecify the degree of contamination beforehand, this paper uses a multilayer perceptron to construct the robust nonlinear interval regression model using a genetic algorithm. Outliers beyond or beneath the data interval will have only a slight effect on the determination of the data interval. Simulation results demonstrate that the proposed method performs well for contaminated datasets.

  7. What Climate Information Do Water Managers Need to Make Robust, Long-Term Plans?

    NASA Astrophysics Data System (ADS)

    Duran, R.; Lempert, R.; Groves, D.

    2008-12-01

    What climate information do water managers need to respond to the threat of climate change? Southern California's Inland Empire Utilities Agency (IEUA) completed a long-range water resource management plan in 2005 that addressed expected economic and population growth in their service region, but did not consider the potential impacts of climate change. Using a robust decision making (RDM) approach for policy under deep uncertainty, we recently worked with IEUA to conduct a climate-change vulnerability and response options analysis of the agency's long-range plans. This analysis suggests that IEUA is vulnerable to future climate change, but can significantly reduce this vulnerability by increasing its near-term conservation programs and by careful monitoring and updating to adjust its plan in the years ahead. In addition to helping IEUA, this analysis provides important guidance on the types of climate and other information that can be most useful for water managers as they attempt to take robust, near-term actions to increase their resilience to climate change.

  8. High-Throughput Histopathological Image Analysis via Robust Cell Segmentation and Hashing

    PubMed Central

    Zhang, Xiaofan; Xing, Fuyong; Su, Hai; Yang, Lin; Zhang, Shaoting

    2015-01-01

    Computer-aided diagnosis of histopathological images usually requires to examine all cells for accurate diagnosis. Traditional computational methods may have efficiency issues when performing cell-level analysis. In this paper, we propose a robust and scalable solution to enable such analysis in a real-time fashion. Specifically, a robust segmentation method is developed to delineate cells accurately using Gaussian-based hierarchical voting and repulsive balloon model. A large-scale image retrieval approach is also designed to examine and classify each cell of a testing image by comparing it with a massive database, e.g., half-million cells extracted from the training dataset. We evaluate this proposed framework on a challenging and important clinical use case, i.e., differentiation of two types of lung cancers (the adenocarcinoma and squamous carcinoma), using thousands of lung microscopic tissue images extracted from hundreds of patients. Our method has achieved promising accuracy and running time by searching among half-million cells. PMID:26599156

  9. A probabilistic approach to aircraft design emphasizing stability and control uncertainties

    NASA Astrophysics Data System (ADS)

    Delaurentis, Daniel Andrew

    In order to address identified deficiencies in current approaches to aerospace systems design, a new method has been developed. This new method for design is based on the premise that design is a decision making activity, and that deterministic analysis and synthesis can lead to poor or misguided decision making. This is due to a lack of disciplinary knowledge of sufficient fidelity about the product, to the presence of uncertainty at multiple levels of the aircraft design hierarchy, and to a failure to focus on overall affordability metrics as measures of goodness. Design solutions are desired which are robust to uncertainty and are based on the maximum knowledge possible. The new method represents advances in the two following general areas. 1. Design models and uncertainty. The research performed completes a transition from a deterministic design representation to a probabilistic one through a modeling of design uncertainty at multiple levels of the aircraft design hierarchy, including: (1) Consistent, traceable uncertainty classification and representation; (2) Concise mathematical statement of the Probabilistic Robust Design problem; (3) Variants of the Cumulative Distribution Functions (CDFs) as decision functions for Robust Design; (4) Probabilistic Sensitivities which identify the most influential sources of variability. 2. Multidisciplinary analysis and design. Embedded in the probabilistic methodology is a new approach for multidisciplinary design analysis and optimization (MDA/O), employing disciplinary analysis approximations formed through statistical experimentation and regression. These approximation models are a function of design variables common to the system level as well as other disciplines. For aircraft, it is proposed that synthesis/sizing is the proper avenue for integrating multiple disciplines. Research hypotheses are translated into a structured method, which is subsequently tested for validity.
Specifically, the implementation involves the study of relaxed static stability technology for a supersonic commercial transport aircraft. The probabilistic robust design method is exercised, resulting in a series of robust design solutions based on different interpretations of "robustness". Insightful results are obtained, and the ability of the method to expose trends in the design space is noted as a key advantage.

  10. GPU-accelerated automatic identification of robust beam setups for proton and carbon-ion radiotherapy

    NASA Astrophysics Data System (ADS)

    Ammazzalorso, F.; Bednarz, T.; Jelen, U.

    2014-03-01

    We demonstrate acceleration on graphic processing units (GPU) of automatic identification of robust particle therapy beam setups, minimizing negative dosimetric effects of Bragg peak displacement caused by treatment-time patient positioning errors. Our particle therapy research toolkit, RobuR, was extended with OpenCL support and used to implement calculation on GPU of the Port Homogeneity Index, a metric scoring irradiation port robustness through analysis of tissue density patterns prior to dose optimization and computation. Results were benchmarked against an independent native CPU implementation. Numerical results were in agreement between the GPU implementation and native CPU implementation. For 10 skull base cases, the GPU-accelerated implementation was employed to select beam setups for proton and carbon ion treatment plans, which proved to be dosimetrically robust, when recomputed in presence of various simulated positioning errors. From the point of view of performance, average running time on the GPU decreased by at least one order of magnitude compared to the CPU, rendering the GPU-accelerated analysis a feasible step in a clinical treatment planning interactive session. In conclusion, selection of robust particle therapy beam setups can be effectively accelerated on a GPU and become an unintrusive part of the particle therapy treatment planning workflow. Additionally, the speed gain opens new usage scenarios, like interactive analysis manipulation (e.g. constraining of some setup) and re-execution. Finally, through OpenCL portable parallelism, the new implementation is suitable also for CPU-only use, taking advantage of multiple cores, and can potentially exploit types of accelerators other than GPUs.

  11. Robustness Analysis and Reliable Flight Regime Estimation of an Integrated Resilient Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2008-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. As a part of the validation process, this paper describes an analysis method for determining a reliable flight regime in the flight envelope within which an integrated resilient control system can achieve the desired performance of tracking command signals and detecting additive faults in the presence of parameter uncertainty and unmodeled dynamics. To calculate a reliable flight regime, a structured singular value analysis method is applied to analyze the closed-loop system over the entire flight envelope. To use the structured singular value analysis method, a linear fractional transform (LFT) model of the transport aircraft longitudinal dynamics is developed over the flight envelope by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The developed LFT model can capture the original nonlinear dynamics over the flight envelope with the Δ block, which contains the key varying parameters (angle of attack and velocity) and the real parameter uncertainties (aerodynamic coefficient uncertainty and moment of inertia uncertainty). Using the developed LFT model and a formal robustness analysis method, a reliable flight regime is calculated for the transport aircraft closed-loop system.

  12. Cloud-based interactive analytics for terabytes of genomic variants data.

    PubMed

    Pan, Cuiping; McInnes, Gregory; Deflaux, Nicole; Snyder, Michael; Bingham, Jonathan; Datta, Somalee; Tsao, Philip S

    2017-12-01

    Large-scale genomic sequencing is now widely used to decipher questions in diverse realms such as biological function, human disease, evolution, ecosystems, and agriculture. Given the quantity and diversity these data harbor, a robust and scalable data handling and analysis solution is desired. We present interactive analytics using a cloud-based columnar database built on Dremel to perform information compression, comprehensive quality controls, and biological information retrieval in large volumes of genomic data. We demonstrate that such Big Data computing paradigms can provide orders-of-magnitude faster turnaround for common genomic analyses, transforming long-running batch jobs submitted via a Linux shell into questions that can be asked from a web browser in seconds. Using this method, we assessed a study population of 475 deeply sequenced human genomes for genomic call rate, genotype and allele frequency distribution, variant density across the genome, and pharmacogenomic information. Our analysis framework is implemented in Google Cloud Platform and BigQuery. Code is available at https://github.com/StanfordBioinformatics/mvp_aaa_codelabs. cuiping@stanford.edu or ptsao@stanford.edu. Supplementary data are available at Bioinformatics online. Published by Oxford University Press 2017. This work was written by US Government employees and is in the public domain in the US.
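
    The quality-control metrics described here run as SQL aggregations over a columnar store; a tiny in-memory analogue of one of them (allele frequency per variant site) can be sketched in Python. The record layout below is a simplifying assumption for illustration, not the paper's schema.

```python
from collections import Counter

def allele_frequencies(genotypes):
    """Compute allele frequencies from diploid genotype calls, where each
    call is a pair of allele indices (0 = reference, 1 = alternate)."""
    counts = Counter(allele for call in genotypes for allele in call)
    total = sum(counts.values())
    return {allele: n / total for allele, n in counts.items()}

# One variant site observed in four samples: 0/0, 0/1, 0/1, 1/1.
calls = [(0, 0), (0, 1), (0, 1), (1, 1)]
freqs = allele_frequencies(calls)
assert freqs == {0: 0.5, 1: 0.5}
```

    In a columnar engine the same computation is a GROUP BY over the allele column, which is why it scales to terabytes: only the columns touched by the aggregation are scanned.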

  13. Anticipated debt and financial stress in medical students.

    PubMed

    Morra, Dante J; Regehr, Glenn; Ginsburg, Shiphra

    2008-01-01

    While medical student debt is increasing, the effect of debt on student well-being and performance remains unclear. As part of a larger study examining medical student views of their future profession, data were collected to examine the role that current and anticipated debt play in predicting stress among medical students. A survey was administered to medical students in all four years at the University of Toronto. Of the 804 potential respondents across the four years of training, 549 surveys had sufficient data for inclusion in this analysis, for a response rate of 68%. Through multiple regression analysis, we evaluated the correlation between current and anticipated debt and financial stress. Although perceived financial stress correlates with both current and anticipated debt levels, anticipated debt accounted for an additional 11.5% of the variance in reported stress compared to current debt levels alone. This study demonstrates a relationship between perceived financial stress and debt levels, and suggests that anticipated debt might be a more robust metric for capturing financial burden, as it standardizes for year of training and captures future financial liabilities (future tuition and other expenses).
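
    The hierarchical regression step reported here (extra variance explained by anticipated debt over current debt alone) can be illustrated with the standard two-predictor identity for standardized variables, R² = (r_y1² + r_y2² − 2·r_y1·r_y2·r_12)/(1 − r_12²). The data below are hypothetical stand-ins, not the study's.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def r2_two_predictors(y, x1, x2):
    """R^2 of regressing y on x1 and x2 (standardized two-predictor identity)."""
    ry1, ry2, r12 = pearson(y, x1), pearson(y, x2), pearson(x1, x2)
    return (ry1**2 + ry2**2 - 2 * ry1 * ry2 * r12) / (1 - r12**2)

# Hypothetical data: stress score, current debt, anticipated debt.
stress = [2.0, 3.0, 5.0, 4.0, 6.0, 7.0]
current = [1.0, 2.0, 2.0, 3.0, 4.0, 4.0]
anticipated = [2.0, 3.0, 6.0, 5.0, 7.0, 9.0]

r2_current_only = pearson(stress, current) ** 2
r2_both = r2_two_predictors(stress, current, anticipated)
delta_r2 = r2_both - r2_current_only  # variance added by anticipated debt
assert delta_r2 > 0
```

    The quantity `delta_r2` plays the role of the 11.5% incremental variance reported in the abstract.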

  14. An Integrated Systems Genetics and Omics Toolkit to Probe Gene Function.

    PubMed

    Li, Hao; Wang, Xu; Rukina, Daria; Huang, Qingyao; Lin, Tao; Sorrentino, Vincenzo; Zhang, Hongbo; Bou Sleiman, Maroun; Arends, Danny; McDaid, Aaron; Luan, Peiling; Ziari, Naveed; Velázquez-Villegas, Laura A; Gariani, Karim; Kutalik, Zoltan; Schoonjans, Kristina; Radcliffe, Richard A; Prins, Pjotr; Morgenthaler, Stephan; Williams, Robert W; Auwerx, Johan

    2018-01-24

    Identifying genetic and environmental factors that impact complex traits and common diseases is a high biomedical priority. Here, we developed, validated, and implemented a series of multi-layered systems approaches, including (expression-based) phenome-wide association, transcriptome-/proteome-wide association, and (reverse-) mediation analysis, in an open-access web server (systems-genetics.org) to expedite the systems dissection of gene function. We applied these approaches to multi-omics datasets from the BXD mouse genetic reference population, and identified and validated associations between genes and clinical and molecular phenotypes, including previously unreported links between Rpl26 and body weight, and Cpt1a and lipid metabolism. Furthermore, through mediation and reverse-mediation analysis we established regulatory relations between genes, such as the co-regulation of BCKDHA and BCKDHB protein levels, and identified targets of transcription factors E2F6, ZFP277, and ZKSCAN1. Our multifaceted toolkit enabled the identification of gene-gene and gene-phenotype links that are robust and that translate well across populations and species, and can be universally applied to any populations with multi-omics datasets. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
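
    Mediation analysis of the kind used here tests whether a gene's effect on a phenotype passes through an intermediate molecular trait. In its simplest product-of-coefficients form, the indirect effect is a·b, where a is the slope of mediator on exposure and b is the partial slope of outcome on mediator adjusting for exposure. A toy version with hypothetical data (not the BXD datasets), computing the partial slope via Frisch-Waugh residualization:

```python
def slope(x, y):
    """OLS slope of y on x (single predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def residuals(x, y):
    """Residuals of y after regressing out x."""
    b = slope(x, y)
    a = sum(y) / len(y) - b * sum(x) / len(x)
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

def indirect_effect(exposure, mediator, outcome):
    """Product of coefficients a*b; b is the partial slope of outcome on
    mediator adjusting for exposure (Frisch-Waugh residualization)."""
    a = slope(exposure, mediator)
    b = slope(residuals(exposure, mediator), residuals(exposure, outcome))
    return a * b

# Hypothetical chain: genotype dosage -> transcript level -> body weight.
genotype = [0, 0, 1, 1, 2, 2]
transcript = [1.0, 1.2, 2.1, 1.9, 3.0, 3.2]    # rises with genotype
weight = [20.0, 21.0, 24.0, 23.5, 27.0, 28.0]  # rises with transcript

assert indirect_effect(genotype, transcript, weight) > 0
```

    Reverse mediation, as used in the paper, swaps the candidate mediator and outcome to check the direction of the regulatory relation.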

  15. Assessing the impact of sampler changes on the uncertainty related to geothermal water sampling

    NASA Astrophysics Data System (ADS)

    Wątor, Katarzyna; Mika, Anna; Sekuła, Klaudia; Kmiecik, Ewa

    2018-02-01

    The aim of this study is to assess the impact of a change of samplers on the uncertainty associated with the process of geothermal water sampling. The study was carried out on geothermal water exploited in the Podhale region, southern Poland (Małopolska province). To estimate the uncertainty associated with sampling, the results of determinations of metasilicic acid (H2SiO3) in normal and duplicate samples collected in two series were used (in each series the samples were collected by a qualified sampler). Chemical analyses were performed using the ICP-OES method in the certified Hydrogeochemical Laboratory of the Hydrogeology and Engineering Geology Department at the AGH University of Science and Technology in Krakow (Certificate of the Polish Centre for Accreditation No. AB 1050). To evaluate the uncertainty arising from sampling, an empirical approach was implemented, based on double analysis of normal and duplicate samples taken from the same well in each series of testing. The results were analyzed using the ROBAN software, which is based on robust analysis of variance (rANOVA). The research showed that, in the case of qualified and experienced samplers, the uncertainty connected with sampling can be reduced, resulting in a small overall measurement uncertainty.
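
    The empirical duplicate method behind such estimates can be illustrated simply: for n normal/duplicate pairs, the random (sampling plus analytical) standard deviation follows from the pair differences as s = sqrt(Σd_i²/(2n)). A toy version with hypothetical H2SiO3 concentrations; note that rANOVA as implemented in ROBAN additionally down-weights outlying pairs, which this sketch omits.

```python
import math

def duplicate_sd(pairs):
    """Random (sampling + analytical) standard deviation estimated from
    normal/duplicate pairs: s = sqrt(sum(d_i^2) / (2 n))."""
    n = len(pairs)
    return math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * n))

def relative_expanded_uncertainty(pairs, coverage=2.0):
    """Expanded uncertainty (coverage factor k = 2) relative to the
    mean concentration, in percent."""
    mean = sum(a + b for a, b in pairs) / (2 * len(pairs))
    return 100.0 * coverage * duplicate_sd(pairs) / mean

# Hypothetical H2SiO3 results (mg/L) for normal/duplicate sample pairs.
pairs = [(55.2, 54.8), (60.1, 59.7), (57.4, 57.9), (58.0, 58.3)]
u = relative_expanded_uncertainty(pairs)
assert 0 < u < 5  # consistent duplicates give a small relative uncertainty
```

    Comparing `u` between the two sampling series is the essence of assessing whether the change of samplers inflated the measurement uncertainty.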

  16. A data grid for imaging-based clinical trials

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng; Chao, Sander S.; Lee, Jasper; Liu, Brent; Documet, Jorge; Huang, H. K.

    2007-03-01

    Clinical trials play a crucial role in testing new drugs or devices in modern medicine. Medical imaging has also become an important tool in clinical trials because images provide a unique and fast diagnosis with visual observation and quantitative assessment. A typical imaging-based clinical trial consists of: 1) a well-defined, rigorous clinical trial protocol; 2) a radiology core that has a quality control mechanism, a biostatistics component, and a server for storing and distributing data and analysis results; and 3) many field sites that generate and send image studies to the radiology core. As the number of clinical trials increases, it becomes a challenge for a radiology core servicing multiple trials to have a server robust enough to administer and quickly distribute information to participating radiologists/clinicians worldwide. A Data Grid can satisfy these requirements of imaging-based clinical trials. In this paper, we present a Data Grid architecture for imaging-based clinical trials. A Data Grid prototype has been implemented in the Image Processing and Informatics (IPI) Laboratory at the University of Southern California to test and evaluate its performance in storing trial images and analysis results for a clinical trial. The implementation methodology and evaluation protocol of the Data Grid are presented.

  17. Super-resolution method for face recognition using nonlinear mappings on coherent features.

    PubMed

    Huang, Hua; He, Huiting

    2011-01-01

    The low resolution (LR) of face images significantly decreases the performance of face recognition. To address this problem, we present a super-resolution method that uses nonlinear mappings to infer coherent features that favor higher recognition rates by nearest neighbor (NN) classifiers for a single LR face image. Canonical correlation analysis is applied to establish coherent subspaces between the principal component analysis (PCA) based features of high-resolution (HR) and LR face images. Then, a nonlinear mapping between HR/LR features can be built by radial basis functions (RBFs), with lower regression errors in the coherent feature space than in the PCA feature space. Thus, we can compute super-resolved coherent features corresponding to an input LR image efficiently and accurately from the trained RBF model. Face identity can then be obtained by feeding these super-resolved features to a simple NN classifier. Extensive experiments on the Facial Recognition Technology, University of Manchester Institute of Science and Technology, and Olivetti Research Laboratory databases show that the proposed method outperforms state-of-the-art face recognition algorithms for a single LR image in terms of both recognition rate and robustness to facial variations of pose and expression.
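
    The final recognition step described here, a nearest-neighbor classifier over the super-resolved feature vectors, can be sketched in a few lines. The feature vectors below are hypothetical stand-ins for the coherent features, not outputs of the paper's CCA/RBF pipeline.

```python
import math

def nearest_neighbor(query, gallery):
    """Return the label of the gallery feature vector closest to the
    query under Euclidean distance, as in a 1-NN classifier."""
    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    return min(gallery, key=lambda item: dist(query, item[1]))[0]

# Hypothetical gallery of HR coherent features, one vector per identity.
gallery = [("alice", [0.9, 0.1, 0.3]),
           ("bob",   [0.2, 0.8, 0.5]),
           ("carol", [0.4, 0.4, 0.9])]

# Super-resolved coherent features inferred from an LR probe image.
probe = [0.85, 0.15, 0.25]
assert nearest_neighbor(probe, gallery) == "alice"
```

    The paper's contribution is upstream of this step: mapping LR features into a space where such a simple classifier becomes accurate.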

  18. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach

    PubMed Central

    Elgendi, Mohamed

    2016-01-01

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks in order to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to for the first time as two event-related moving averages ("TERMA"), uses event-related moving averages to detect events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. The results recommend that the window sizes of the two moving averages (W1 and W2) follow the inequality (8×W1) ≥ W2 ≥ (2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852
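
    A minimal sketch of the TERMA idea, assuming trailing moving averages and a zero offset threshold (the published framework tunes window sizes, offsets, and block-rejection rules per signal type): an "event" moving average with a short window W1 is compared against a "cycle" moving average with a longer window W2; runs where the first exceeds the second form blocks of interest, and the maximum sample in each block is taken as a peak.

```python
def moving_average(x, w):
    """Trailing moving average with window w (shorter at the start)."""
    return [sum(x[max(0, i - w + 1):i + 1]) / (i - max(0, i - w + 1) + 1)
            for i in range(len(x))]

def terma_peaks(signal, w1=3, w2=9, min_block=2):
    """Two event-related moving averages: blocks where the short-window MA
    exceeds the long-window MA yield one peak (the max sample) each."""
    ma_event = moving_average(signal, w1)
    ma_cycle = moving_average(signal, w2)
    peaks, block = [], []
    for i, (e, c) in enumerate(zip(ma_event, ma_cycle)):
        if e > c:
            block.append(i)
        elif block:
            if len(block) >= min_block:       # reject too-short blocks
                peaks.append(max(block, key=lambda j: signal[j]))
            block = []
    if len(block) >= min_block:
        peaks.append(max(block, key=lambda j: signal[j]))
    return peaks

# Synthetic signal with two "events" (bumps) in a flat baseline.
sig = [0, 0, 0, 1, 4, 1, 0, 0, 0, 0, 0, 2, 5, 2, 0, 0, 0]
assert terma_peaks(sig) == [4, 12]
```

    The window sizes respect the recommended inequality (8×W1) ≥ W2 ≥ (2×W1) with W1 = 3, W2 = 9.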

  19. Global QCD Analysis of the Nucleon Tensor Charge with Lattice QCD Constraints

    NASA Astrophysics Data System (ADS)

    Shows, Harvey, III; Melnitchouk, Wally; Sato, Nobuo

    2017-09-01

    By studying the parton distribution functions (PDFs) of a nucleon, we probe the partonic scale of nature, exploring what it means to be a nucleon. In this study, we are interested in the transversity PDF, the least studied of the three collinear PDFs. By conducting a global analysis on experimental data from semi-inclusive deep inelastic scattering (SIDIS), as well as single-inclusive e+e- annihilation (SIA), we extract the fit parameters needed to describe the transverse momentum dependent (TMD) transversity PDF, as well as the Collins fragmentation function. Once the collinear transversity PDF is obtained by integrating the extracted TMD PDF, we wish to resolve discrepancies between lattice QCD calculations and phenomenological extractions of the tensor charge from data. Here we show our results for the transversity distribution and tensor charge. Using our method of iterative Monte Carlo, we now have a more robust understanding of the transversity PDF. With these results we are able to progress in our understanding of TMD PDFs, as well as testify to the efficacy of current lattice QCD calculations. This work is made possible through support from NSF award 1659177 to Old Dominion University.

  20. Validation of a robust proteomic analysis carried out on formalin-fixed paraffin-embedded tissues of the pancreas obtained from mouse and human.

    PubMed

    Kojima, Kyoko; Bowersock, Gregory J; Kojima, Chinatsu; Klug, Christopher A; Grizzle, William E; Mobley, James A

    2012-11-01

    A number of reports have recently emerged with a focus on the extraction of proteins from formalin-fixed paraffin-embedded (FFPE) tissues for MS analysis; however, reproducibility and robustness as compared to flash-frozen controls are generally overlooked. The goal of this study was to identify and validate a practical and highly robust approach for the proteomic analysis of FFPE tissues. FFPE and matched frozen pancreatic tissues obtained from mice (n = 8) were analyzed using 1D-nanoLC-MS(MS)(2) following workup with commercially available kits. The chosen approach for FFPE tissues was found to be highly comparable to that for frozen tissues. In addition, the total numbers of unique peptides identified in the two groups were highly similar, with 958 identified for FFPE and 1070 for frozen, and protein identifications corresponded by approximately 80%. This approach was then applied to archived human FFPE pancreatic cancer specimens (n = 11) compared to uninvolved tissues (n = 8), where 47 potential pancreatic ductal adenocarcinoma markers were identified as significantly increased, of which 28 were previously reported. Further, these proteins share strongly overlapping pathway associations to pancreatic cancer that include estrogen receptor α. Together, these data support the validation of an approach for the proteomic analysis of FFPE tissues that is straightforward and highly robust, and that can also be effectively applied toward translational studies of disease. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
