Sample records for identifying robust process

  1. A P-Norm Robust Feature Extraction Method for Identifying Differentially Expressed Genes

    PubMed Central

    Liu, Jian; Liu, Jin-Xing; Gao, Ying-Lian; Kong, Xiang-Zhen; Wang, Xue-Song; Wang, Dong

    2015-01-01

In molecular biology, it is increasingly important to identify differentially expressed genes closely correlated with a key biological process from gene expression data. In this paper, based on the Schatten p-norm and the Lp-norm, a novel p-norm robust feature extraction method is proposed to identify differentially expressed genes. In our method, the Schatten p-norm is used as the regularization function to obtain a low-rank matrix, and the Lp-norm is taken as the error function to improve robustness to outliers in the gene expression data. Results on simulated data show that our method obtains higher identification accuracies than competing methods. Numerous experiments on real gene expression data sets demonstrate that our method identifies more differentially expressed genes than the alternatives. Moreover, we confirmed that the identified genes are closely correlated with the corresponding gene expression data. PMID:26201006

  2. A P-Norm Robust Feature Extraction Method for Identifying Differentially Expressed Genes.

    PubMed

    Liu, Jian; Liu, Jin-Xing; Gao, Ying-Lian; Kong, Xiang-Zhen; Wang, Xue-Song; Wang, Dong

    2015-01-01

In molecular biology, it is increasingly important to identify differentially expressed genes closely correlated with a key biological process from gene expression data. In this paper, based on the Schatten p-norm and the Lp-norm, a novel p-norm robust feature extraction method is proposed to identify differentially expressed genes. In our method, the Schatten p-norm is used as the regularization function to obtain a low-rank matrix, and the Lp-norm is taken as the error function to improve robustness to outliers in the gene expression data. Results on simulated data show that our method obtains higher identification accuracies than competing methods. Numerous experiments on real gene expression data sets demonstrate that our method identifies more differentially expressed genes than the alternatives. Moreover, we confirmed that the identified genes are closely correlated with the corresponding gene expression data.
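The low-rank step shared by this family of methods can be illustrated with the nuclear norm, i.e. the Schatten p-norm with p = 1, whose proximal operator simply soft-thresholds singular values. The sketch below uses synthetic data and is an illustration of that one ingredient, not the authors' full algorithm:

```python
import numpy as np

def schatten_prox(X, tau):
    # Proximal operator of the nuclear norm (Schatten p-norm, p = 1):
    # soft-threshold the singular values of X by tau.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

rng = np.random.default_rng(0)
L_true = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))  # rank-5 "signal"
X = L_true + 0.01 * rng.standard_normal((50, 40))                     # noisy expression matrix
L_hat = schatten_prox(X, tau=0.5)
print(np.linalg.matrix_rank(L_hat, tol=1e-6))  # noise directions shrunk away -> rank 5
```

Because the noise singular values fall below the threshold while the signal's do not, the recovered matrix is exactly low rank.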

  3. An Optimal Mean Based Block Robust Feature Extraction Method to Identify Colorectal Cancer Genes with Integrated Data.

    PubMed

    Liu, Jian; Cheng, Yuhu; Wang, Xuesong; Zhang, Lin; Liu, Hui

    2017-08-17

It is urgent to diagnose colorectal cancer at an early stage. Some feature genes that are important to colorectal cancer development have been identified. However, for early-stage colorectal cancer, less is known about the identity of specific cancer genes associated with advanced clinical stage. In this paper, we propose a feature extraction method named Optimal Mean based Block Robust Feature Extraction (OMBRFE) to identify feature genes associated with advanced clinical-stage colorectal cancer using integrated colorectal cancer data. First, based on the optimal mean and the L2,1-norm, a novel feature extraction method called Optimal Mean based Robust Feature Extraction (OMRFE) is proposed to identify feature genes. The OMBRFE method, which introduces the block ideology into OMRFE, is then put forward to process the integrated colorectal cancer data, which includes multiple genomic data types: copy number alterations, somatic mutations, methylation expression alterations, and gene expression changes. Experimental results demonstrate that OMBRFE is more effective than previous methods in identifying feature genes. Moreover, genes identified by OMBRFE are verified to be closely associated with advanced clinical-stage colorectal cancer.

  4. Robust adaptive multichannel SAR processing based on covariance matrix reconstruction

    NASA Astrophysics Data System (ADS)

    Tan, Zhen-ya; He, Feng

    2018-04-01

With the combination of digital beamforming (DBF) processing, multichannel synthetic aperture radar (SAR) systems with multiple azimuth channels show great promise for high-resolution and wide-swath imaging, whereas conventional processing methods do not take the nonuniformity of the scattering coefficient into consideration. This paper proposes a robust adaptive multichannel SAR processing method which first utilizes the Capon spatial spectrum estimator to obtain the spatial spectrum distribution over all ambiguous directions, and then reconstructs the interference-plus-noise covariance matrix from its definition to acquire the multichannel SAR processing filter. This novel method improves processing performance under nonuniform scattering coefficients and is robust against array errors. Experiments with real measured data demonstrate the effectiveness and robustness of the proposed method.
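The two-step recipe described here, a Capon spectrum over the ambiguous directions followed by covariance reconstruction from its definition, can be sketched for a generic uniform linear array. The half-wavelength spacing, angles, and powers below are illustrative assumptions, not the authors' SAR configuration:

```python
import numpy as np

def steering(theta, n):
    # Steering vector of an n-element uniform linear array,
    # half-wavelength spacing (an assumption of this sketch).
    return np.exp(1j * np.pi * np.arange(n) * np.sin(theta))

def capon_power(R_inv, theta, n):
    # Capon spatial spectrum estimate at direction theta.
    a = steering(theta, n)
    return 1.0 / np.real(a.conj() @ R_inv @ a)

def reconstructed_filter(R, interference_angles, signal_angle, n):
    # Sum the Capon spectrum over the ambiguous directions to reconstruct
    # the interference-plus-noise covariance by definition, then form the
    # distortionless (MVDR-style) weight vector.
    R_inv = np.linalg.inv(R)
    R_in = 1e-3 * np.eye(n, dtype=complex)       # small assumed noise floor
    for th in interference_angles:
        a = steering(th, n)
        R_in += capon_power(R_inv, th, n) * np.outer(a, a.conj())
    Ri = np.linalg.inv(R_in)
    a0 = steering(signal_angle, n)
    return Ri @ a0 / (a0.conj() @ Ri @ a0)

rng = np.random.default_rng(1)
n, snaps = 8, 200
a_j = steering(0.5, n)                           # ambiguous/interfering direction
X = (a_j[:, None] * (3.0 * rng.standard_normal(snaps))
     + 0.1 * (rng.standard_normal((n, snaps)) + 1j * rng.standard_normal((n, snaps))))
R = X @ X.conj().T / snaps                       # sample covariance
w = reconstructed_filter(R, np.linspace(0.3, 0.7, 21), 0.0, n)
print(abs(w.conj() @ a_j))                       # deep response null toward 0.5 rad
```

The weight vector keeps unit response toward the look direction while placing a null over the reconstructed interference sector.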

  5. Comparison of fMRI paradigms assessing visuospatial processing: Robustness and reproducibility

    PubMed Central

    Herholz, Peer; Zimmermann, Kristin M.; Westermann, Stefan; Frässle, Stefan; Jansen, Andreas

    2017-01-01

The development of brain imaging techniques, in particular functional magnetic resonance imaging (fMRI), made it possible to non-invasively study the hemispheric lateralization of cognitive brain functions in large cohorts. Comprehensive models of hemispheric lateralization are, however, still missing and should not only account for the hemispheric specialization of individual brain functions, but also for the interactions among different lateralized cognitive processes (e.g., language and visuospatial processing). This calls for robust and reliable paradigms to study hemispheric lateralization for various cognitive functions. While numerous reliable imaging paradigms have been developed for language, which represents the most prominent left-lateralized brain function, the reliability of imaging paradigms investigating typically right-lateralized brain functions, such as visuospatial processing, has received comparatively less attention. In the present study, we aimed to establish an fMRI paradigm that robustly and reliably identifies right-hemispheric activation evoked by visuospatial processing in individual subjects. In a first study, we therefore compared three frequently used paradigms for assessing visuospatial processing and evaluated their utility to robustly detect right-lateralized brain activity on a single-subject level. In a second study, we then assessed the test-retest reliability of the so-called Landmark task, the paradigm that yielded the most robust results in study 1. At the single-voxel level, we found poor reliability of the brain activation underlying visuospatial attention. This suggests that poor signal-to-noise ratios can become a limiting factor for test-retest reliability. This represents a common detriment of fMRI paradigms investigating visuospatial attention in general and therefore highlights the need for careful considerations of both the possibilities and limitations of the respective fMRI paradigm, in particular, when being

  6. On-Line Robust Modal Stability Prediction using Wavelet Processing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Lind, Rick

    1998-01-01

Wavelet analysis for filtering and system identification has been used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins is reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data is used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability are also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. The F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrates improved robust stability prediction by extension of the stability boundary beyond the flight regime. Guidelines and computation times are presented to show the efficiency and practical aspects of these procedures for on-line implementation. Feasibility of the method is shown for processing flight data from time-varying nonstationary test points.

  7. On adaptive robustness approach to Anti-Jam signal processing

    NASA Astrophysics Data System (ADS)

    Poberezhskiy, Y. S.; Poberezhskiy, G. Y.

    An effective approach to exploiting statistical differences between desired and jamming signals named adaptive robustness is proposed and analyzed in this paper. It combines conventional Bayesian, adaptive, and robust approaches that are complementary to each other. This combining strengthens the advantages and mitigates the drawbacks of the conventional approaches. Adaptive robustness is equally applicable to both jammers and their victim systems. The capabilities required for realization of adaptive robustness in jammers and victim systems are determined. The employment of a specific nonlinear robust algorithm for anti-jam (AJ) processing is described and analyzed. Its effectiveness in practical situations has been proven analytically and confirmed by simulation. Since adaptive robustness can be used by both sides in electronic warfare, it is more advantageous for the fastest and most intelligent side. Many results obtained and discussed in this paper are also applicable to commercial applications such as communications in unregulated or poorly regulated frequency ranges and systems with cognitive capabilities.

  8. Development of a Robust Identifier for NPPs Transients Combining ARIMA Model and EBP Algorithm

    NASA Astrophysics Data System (ADS)

    Moshkbar-Bakhshayesh, Khalil; Ghofrani, Mohammad B.

    2014-08-01

This study introduces a novel identification method for recognizing nuclear power plant (NPP) transients by combining the autoregressive integrated moving-average (ARIMA) model and a neural network with the error backpropagation (EBP) learning algorithm. The proposed method consists of three steps. First, an EBP-based identifier is adopted to distinguish the plant's normal states from faulty ones. In the second step, ARIMA models use the integrated (I) process to convert non-stationary data of the selected variables into stationary data. Subsequently, ARIMA processes, including autoregressive (AR), moving-average (MA), or autoregressive moving-average (ARMA), are used to forecast time series of the selected plant variables. In the third step, to identify the type of transient, the forecasted time series are fed to a modular identifier developed using the latest advances in the EBP learning algorithm. Bushehr nuclear power plant (BNPP) transients are probed to analyze the ability of the proposed identifier. Recognition of a transient is based on the similarity of its statistical properties to the reference, rather than the values of input patterns. Greater robustness against noisy data and an improved balance between memorization and generalization are salient advantages of the proposed identifier. Reduced false identification, sole dependence of identification on the sign of each output signal, selection of plant variables for transient training independently of each other, and extendibility to the identification of more transients without unfavorable effects are other merits of the proposed identifier.
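The second step, differencing a series to stationarity and then fitting and forecasting an AR/MA model, can be sketched with plain NumPy. The random-walk series and the AR(1) structure below are illustrative stand-ins for the plant variables, not the paper's models:

```python
import numpy as np

rng = np.random.default_rng(0)
# Non-stationary plant variable: a random walk with drift.
y = np.cumsum(0.5 + rng.standard_normal(300))

# "I" step: first-difference the series to make it stationary.
dy = np.diff(y)

# Fit an AR(1) with intercept to the differenced series by least squares.
X = np.column_stack([np.ones(len(dy) - 1), dy[:-1]])
(c, phi), *_ = np.linalg.lstsq(X, dy[1:], rcond=None)

# Forecast the differenced series, then integrate back to the original level.
d_hat, level, forecasts = dy[-1], y[-1], []
for _ in range(10):
    d_hat = c + phi * d_hat
    level += d_hat
    forecasts.append(level)
print(len(forecasts))  # 10 forecast steps
```

In a full ARIMA treatment the orders (p, d, q) would be chosen from the data; here they are fixed at (1, 1, 0) for brevity.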

  9. Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy

    NASA Astrophysics Data System (ADS)

    Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.

    2011-08-01

    The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
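The core move in robust optimization of this kind is to penalise the spread of the response over sampled noise variables, not just its mean. The scalar response model below is a hypothetical stand-in, not the paper's V-bending simulation or its metamodel:

```python
import numpy as np

def robust_objective(design, noise, k=3.0):
    # Hypothetical scalar quality response of a forming process:
    # one design variable plus a noise-variable contribution.
    response = (design - 1.0) ** 2 + 0.5 * design * noise
    # Robust formulation: penalise the spread caused by the noise.
    return response.mean() + k * response.std()

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 0.1, size=1000)          # sampled noise variable
candidates = np.linspace(0.0, 2.0, 41)
best = candidates[int(np.argmin([robust_objective(d, noise) for d in candidates]))]
print(best)  # shifted below the deterministic optimum of 1.0
```

The robust optimum moves away from the deterministic one because larger design values amplify the noise; a metamodel-based strategy would evaluate this objective on a surrogate instead of the full FE model.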

  10. A robust clustering algorithm for identifying problematic samples in genome-wide association studies.

    PubMed

    Bellenguez, Céline; Strange, Amy; Freeman, Colin; Donnelly, Peter; Spencer, Chris C A

    2012-01-01

High-throughput genotyping arrays provide an efficient way to survey single nucleotide polymorphisms (SNPs) across the genome in large numbers of individuals. Downstream analysis of the data, for example in genome-wide association studies (GWAS), often involves statistical models of genotype frequencies across individuals. The complexities of the sample collection process and the potential for errors in the experimental assay can lead to biases and artefacts in an individual's inferred genotypes. Rather than attempting to model these complications, it has become a standard practice to remove individuals whose genome-wide data differ from the sample at large. Here we describe a simple, but robust, statistical algorithm to identify samples with atypical summaries of genome-wide variation. Its use as a semi-automated quality control tool is demonstrated using several summary statistics, selected to identify different potential problems, and it is applied to two different genotyping platforms and sample collections. The algorithm is written in R and is freely available at www.well.ox.ac.uk/chris-spencer (contact: chris.spencer@well.ox.ac.uk). Supplementary data are available at Bioinformatics online.
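A minimal version of such a robust outlier screen centres each summary statistic at its median and scales by the MAD, so that the flagged samples themselves cannot distort the thresholds. This is an illustrative sketch on synthetic summaries, not the published R implementation:

```python
import numpy as np

def flag_outliers(stats, k=4.0):
    # Robust z-scores: centre each summary statistic at its median and
    # scale by the MAD (1.4826 makes the MAD consistent with the standard
    # deviation under normality); flag samples extreme in any column.
    med = np.median(stats, axis=0)
    mad = 1.4826 * np.median(np.abs(stats - med), axis=0)
    return np.any(np.abs(stats - med) / mad > k, axis=1)

rng = np.random.default_rng(0)
summaries = rng.standard_normal((500, 3))   # e.g. heterozygosity, missingness, ...
summaries[7] = [9.0, 0.0, 0.0]              # one contaminated sample
flags = flag_outliers(summaries)
print(np.flatnonzero(flags))                # sample 7 is flagged
```

Using the median/MAD rather than mean/standard deviation keeps the screen stable even when several samples are grossly atypical.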

  11. Processing Robustness for A Phenylethynyl Terminated Polyimide Composite

    NASA Technical Reports Server (NTRS)

    Hou, Tan-Hung

    2004-01-01

    The processability of a phenylethynyl terminated imide resin matrix (designated as PETI-5) composite is investigated. Unidirectional prepregs are made by coating an N-methylpyrrolidone solution of the amide acid oligomer (designated as PETAA-5/NMP) onto unsized IM7 fibers. Two batches of prepregs are used: one is made by NASA in-house, and the other is from an industrial source. The composite processing robustness is investigated with respect to the prepreg shelf life, the effect of B-staging conditions, and the optimal processing window. Prepreg rheology and open hole compression (OHC) strengths are found not to be affected by prolonged (i.e., up to 60 days) ambient storage. Rheological measurements indicate that the PETAA-5/NMP processability is only slightly affected over a wide range of B-stage temperatures from 250 deg C to 300 deg C. The OHC strength values are statistically indistinguishable among laminates consolidated using various B-staging conditions. An optimal processing window is established by means of the response surface methodology. IM7/PETAA-5/NMP prepreg is more sensitive to consolidation temperature than to pressure. A good consolidation is achievable at 371 deg C (700 deg F)/100 Psi, which yields an RT OHC strength of 62 Ksi. However, processability declines dramatically at temperatures below 350 deg C (662 deg F), as evidenced by the OHC strength values. The processability of the IM7/LARC(TM) PETI-5 prepreg was found to be robust.

  12. Robust Intratumor Partitioning to Identify High-Risk Subregions in Lung Cancer: A Pilot Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Jia; Gensheimer, Michael F.; Dong, Xinzhe

    2016-08-01

Purpose: To develop an intratumor partitioning framework for identifying high-risk subregions from 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) and computed tomography (CT) imaging and to test whether tumor burden associated with the high-risk subregions is prognostic of outcomes in lung cancer. Methods and Materials: In this institutional review board-approved retrospective study, we analyzed the pretreatment FDG-PET and CT scans of 44 lung cancer patients treated with radiation therapy. A novel intratumor partitioning method was developed, based on a 2-stage clustering process: first, at the patient level, each tumor was over-segmented into many superpixels by k-means clustering of integrated PET and CT images; next, tumor subregions were identified by merging previously defined superpixels via population-level hierarchical clustering. The volume associated with each of the subregions was evaluated using Kaplan-Meier analysis regarding its prognostic capability in predicting overall survival (OS) and out-of-field progression (OFP). Results: Three spatially distinct subregions were identified within each tumor that were highly robust to uncertainty in PET/CT co-registration. Among these, the volume of the most metabolically active and metabolically heterogeneous solid component of the tumor was predictive of OS and OFP on the entire cohort, with a concordance index (CI) of 0.66-0.67. When restricting the analysis to patients with stage III disease (n=32), the same subregion achieved an even higher CI of 0.75 (hazard ratio 3.93, log-rank P=.002) for predicting OS, and a CI of 0.76 (hazard ratio 4.84, log-rank P=.002) for predicting OFP. In comparison, conventional imaging markers, including tumor volume, maximum standardized uptake value, and metabolic tumor volume using a threshold of 50% of the maximum standardized uptake value, were not predictive of OS or OFP, with CI mostly below 0.60 (log-rank P>.05). Conclusion: We propose a robust

  13. Robust Intratumor Partitioning to Identify High-Risk Subregions in Lung Cancer: A Pilot Study.

    PubMed

    Wu, Jia; Gensheimer, Michael F; Dong, Xinzhe; Rubin, Daniel L; Napel, Sandy; Diehn, Maximilian; Loo, Billy W; Li, Ruijiang

    2016-08-01

To develop an intratumor partitioning framework for identifying high-risk subregions from (18)F-fluorodeoxyglucose positron emission tomography (FDG-PET) and computed tomography (CT) imaging and to test whether tumor burden associated with the high-risk subregions is prognostic of outcomes in lung cancer. In this institutional review board-approved retrospective study, we analyzed the pretreatment FDG-PET and CT scans of 44 lung cancer patients treated with radiation therapy. A novel intratumor partitioning method was developed, based on a 2-stage clustering process: first, at the patient level, each tumor was over-segmented into many superpixels by k-means clustering of integrated PET and CT images; next, tumor subregions were identified by merging previously defined superpixels via population-level hierarchical clustering. The volume associated with each of the subregions was evaluated using Kaplan-Meier analysis regarding its prognostic capability in predicting overall survival (OS) and out-of-field progression (OFP). Three spatially distinct subregions were identified within each tumor that were highly robust to uncertainty in PET/CT co-registration. Among these, the volume of the most metabolically active and metabolically heterogeneous solid component of the tumor was predictive of OS and OFP on the entire cohort, with a concordance index (CI) of 0.66-0.67. When restricting the analysis to patients with stage III disease (n=32), the same subregion achieved an even higher CI of 0.75 (hazard ratio 3.93, log-rank P=.002) for predicting OS, and a CI of 0.76 (hazard ratio 4.84, log-rank P=.002) for predicting OFP. In comparison, conventional imaging markers, including tumor volume, maximum standardized uptake value, and metabolic tumor volume using a threshold of 50% of the maximum standardized uptake value, were not predictive of OS or OFP, with CI mostly below 0.60 (log-rank P>.05). We propose a robust intratumor partitioning method to identify clinically relevant, high
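The 2-stage clustering at the core of this method, over-segmentation into superpixels followed by hierarchical merging, can be sketched with scikit-learn on toy voxel features. The two feature columns and the cluster counts below are illustrative assumptions, not the study's imaging pipeline:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans

rng = np.random.default_rng(0)
# Toy stand-in for co-registered PET/CT voxel features of one tumour:
# two columns (uptake, density) drawn from three latent tissue types.
voxels = np.vstack([rng.normal(m, 0.1, size=(200, 2)) for m in (0.2, 0.5, 0.9)])

# Stage 1 (patient level): over-segment the tumour into many superpixels.
km = KMeans(n_clusters=30, n_init=10, random_state=0).fit(voxels)

# Stage 2 (population level): merge superpixels into a few subregions by
# hierarchical clustering of the superpixel centres.
merge = AgglomerativeClustering(n_clusters=3).fit(km.cluster_centers_)
subregion_of_voxel = merge.labels_[km.labels_]
print(len(np.unique(subregion_of_voxel)))  # 3 subregions
```

Over-segmenting first and merging second is what makes the subregion labels comparable across patients, since the merge step operates at the population level.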

  14. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    NASA Astrophysics Data System (ADS)

    McGowan, S. E.; Albertini, F.; Thomas, S. J.; Lomax, A. J.

    2015-04-01

We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The robustness of 16 skull base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all plans and by defining metrics used to build protocols that aid plan assessment. Additionally, an example of how to use the resulting robustness database clinically is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve plan robustness was analysed. Using the ebDD, it was found that range errors had a smaller effect on the dose distribution than the corresponding set-up errors in a single fraction, and that organs at risk were most robust to range errors, whereas the target was more robust to set-up errors. A database was created to aid planners with plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed the identification of a specific patient who may have benefited from a more individualised treatment. A new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. This process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to identify plans that, although delivering a dosimetrically adequate dose distribution, have sub-optimal robustness to these uncertainties. For such cases, different beam start conditions may improve plan robustness to set-up and range uncertainties.

  15. Robust fusion-based processing for military polarimetric imaging systems

    NASA Astrophysics Data System (ADS)

    Hickman, Duncan L.; Smith, Moira I.; Kim, Kyung Su; Choi, Hyun-Jin

    2017-05-01

Polarisation information within a scene can be exploited in military systems to enhance automatic target detection and recognition (ATD/R) performance. However, the performance gain achieved depends strongly on factors such as geometry, viewing conditions, and the surface finish of the target. Such performance sensitivities are highly undesirable in many tactical military systems, where operational conditions can vary significantly and rapidly during a mission. Within this paper, a range of processing architectures and fusion methods is considered in terms of their practical viability and operational robustness for systems requiring ATD/R. It is shown that polarisation information can give useful performance gains but that, to retain system robustness, polarimetric processing should be introduced in a way that does not compromise other discriminatory scene information in the spectral and spatial domains. The analysis concludes that polarimetric data can be effectively integrated with conventional intensity-based ATD/R either by adapting the ATD/R processing function based on the scene polarisation or by detection-level fusion. Both approaches avoid introducing processing bottlenecks and limit the impact of processing on system latency.

  16. Improving tablet coating robustness by selecting critical process parameters from retrospective data.

    PubMed

    Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M

    2016-09-01

Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness. Up-scaling can be challenging, as minor changes in parameters can lead to varying quality results. The aim was to select critical process parameters (CPPs) using retrospective data on a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. A retrospective analysis was performed on data from 36 commercial batches. Batches were selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine the critical process parameters and propose new working ranges. This study confirms that it is possible to determine critical process parameters and create design spaces based on retrospective data from commercial batches. This type of analysis thus becomes a tool for optimising the robustness of existing processes. Our results show that a design space can be established with minimal investment in experiments, since existing commercial batch data are processed statistically.

  17. Robust global identifiability theory using potentials--Application to compartmental models.

    PubMed

    Wongvanich, N; Hann, C E; Sirisena, H R

    2015-04-01

This paper presents a global practical identifiability theory for analyzing and identifying linear and nonlinear compartmental models. The compartmental system is prolonged onto the potential jet space to formulate a set of input-output equations that are integrals in terms of the measured data, which allows for robust identification of parameters without requiring any simulation of the model differential equations. Two classes of linear and nonlinear compartmental models are considered. The theory is first applied to analyze the linear nitrous oxide (N2O) uptake model. The fitting accuracy of the models identified by the differential jet space and potential jet space identifiability theories is compared at a realistic noise level of 3%, derived from sensor noise data in the literature. The potential jet space approach gave a match well within the coefficient of variation. The differential jet space formulation was unstable and not suitable for parameter identification. The proposed theory is then applied to a nonlinear immunological model for mastitis in cows. In addition, the model formulation is extended to include an iterative method that allows initial conditions to be accurately identified. With up to 10% noise, the potential jet space theory predicts the normalized population concentration infected with pathogens to within 9% of the true curve. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel

    NASA Astrophysics Data System (ADS)

    Xie, Yanmin

    2011-08-01

Nowadays, sheet metal forming process design is not a trivial task due to the complex issues to be taken into account (conflicting design goals, complex shape forming, and so on), and proper design methods to reduce time and costs, mostly based on computer-aided procedures, have to be developed. Optimization methods have accordingly been widely applied in sheet metal forming. At the same time, variations during manufacturing may significantly influence final product quality, rendering optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. Finite element software (LS-DYNA) is used to simulate the complex sheet metal stamping process. A Kriging metamodel is adopted to map the relation between input process parameters and part quality. The robust design model for the sheet metal forming process integrates adaptive importance sampling with the Kriging model in order to minimize the impact of variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to indicate where additional training samples can be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. Final results indicate the feasibility of the proposed method for multi-response robust design.
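A Kriging (Gaussian process) metamodel of a quality response exposes a predictive standard deviation alongside the mean, which is exactly what adaptive sampling criteria use to decide where the next training simulation should be run. The one-dimensional toy response and hyperparameters below are assumptions of this sketch:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
# Small design of experiments over one normalised process parameter.
X = np.linspace(0.0, 1.0, 12).reshape(-1, 1)
y = np.sin(4.0 * X[:, 0]) + 0.05 * rng.standard_normal(12)  # simulated quality response

# Kriging metamodel; the kernel choice and noise level are assumptions.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-3).fit(X, y)
mean, std = gp.predict(np.array([[0.37]]), return_std=True)
# 'std' indicates where an extra simulation would most improve the metamodel.
print(mean.shape, std.shape)  # (1,) (1,)
```

In a sequential strategy, candidate points with large `std` (or a large expected-improvement score built from `mean` and `std`) are simulated next and appended to the training set.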

  19. A preferential design approach for energy-efficient and robust implantable neural signal processing hardware.

    PubMed

    Narasimhan, Seetharam; Chiel, Hillel J; Bhunia, Swarup

    2009-01-01

    For implantable neural interface applications, it is important to compress data and analyze spike patterns across multiple channels in real time. Such a computational task for online neural data processing requires an innovative circuit-architecture level design approach for low-power, robust and area-efficient hardware implementation. Conventional microprocessor or Digital Signal Processing (DSP) chips would dissipate too much power and are too large in size for an implantable system. In this paper, we propose a novel hardware design approach, referred to as "Preferential Design" that exploits the nature of the neural signal processing algorithm to achieve a low-voltage, robust and area-efficient implementation using nanoscale process technology. The basic idea is to isolate the critical components with respect to system performance and design them more conservatively compared to the noncritical ones. This allows aggressive voltage scaling for low power operation while ensuring robustness and area efficiency. We have applied the proposed approach to a neural signal processing algorithm using the Discrete Wavelet Transform (DWT) and observed significant improvement in power and robustness over conventional design.

  20. Robust media processing on programmable power-constrained systems

    NASA Astrophysics Data System (ADS)

    McVeigh, Jeff

    2005-03-01

    To achieve consumer-level quality, media systems must process continuous streams of audio and video data while maintaining exacting tolerances on sampling rate, jitter, synchronization, and latency. While it is relatively straightforward to design fixed-function hardware implementations to satisfy worst-case conditions, there is a growing trend to utilize programmable multi-tasking solutions for media applications. The flexibility of these systems enables support for multiple current and future media formats, which can reduce design costs and time-to-market. This paper provides practical engineering solutions to achieve robust media processing on such systems, with specific attention given to power-constrained platforms. The techniques covered in this article utilize the fundamental concepts of algorithm and software optimization, software/hardware partitioning, stream buffering, hierarchical prioritization, and system resource and power management. A novel enhancement to dynamically adjust processor voltage and frequency based on buffer fullness to reduce system power consumption is examined in detail. The application of these techniques is provided in a case study of a portable video player implementation based on a general-purpose processor running a non real-time operating system that achieves robust playback of synchronized H.264 video and MP3 audio from local storage and streaming over 802.11.
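The buffer-fullness-driven voltage/frequency adjustment described above reduces, at its simplest, to a threshold policy on the output buffer. The thresholds and frequency levels here are illustrative, not those of the paper's implementation:

```python
def select_frequency(buffer_fill, levels=(400, 800, 1200)):
    """Pick a processor clock (MHz) from output-buffer fullness in [0, 1]:
    a fuller buffer means decoding is ahead of playback, so the clock
    (and with it the voltage) can be lowered to save power."""
    if buffer_fill > 0.75:
        return levels[0]   # comfortably ahead: lowest-power state
    if buffer_fill > 0.40:
        return levels[1]   # keeping pace: intermediate state
    return levels[2]       # close to underrun: full speed

print([select_frequency(b) for b in (0.9, 0.5, 0.1)])  # [400, 800, 1200]
```

Because dynamic power scales roughly with voltage squared times frequency, even coarse policies like this yield substantial savings when the buffer is usually well filled.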

  1. Improving Robustness of Hydrologic Ensemble Predictions Through Probabilistic Pre- and Post-Processing in Sequential Data Assimilation

    NASA Astrophysics Data System (ADS)

    Wang, S.; Ancell, B. C.; Huang, G. H.; Baetz, B. W.

    2018-03-01

    Data assimilation using the ensemble Kalman filter (EnKF) has been increasingly recognized as a promising tool for probabilistic hydrologic predictions. However, little effort has been made to conduct the pre- and post-processing of assimilation experiments, posing a significant challenge in achieving the best performance of hydrologic predictions. This paper presents a unified data assimilation framework for improving the robustness of hydrologic ensemble predictions. Statistical pre-processing of assimilation experiments is conducted through the factorial design and analysis to identify the best EnKF settings with maximized performance. After the data assimilation operation, statistical post-processing analysis is also performed through the factorial polynomial chaos expansion to efficiently address uncertainties in hydrologic predictions, as well as to explicitly reveal potential interactions among model parameters and their contributions to the predictive accuracy. In addition, the Gaussian anamorphosis is used to establish a seamless bridge between data assimilation and uncertainty quantification of hydrologic predictions. Both synthetic and real data assimilation experiments are carried out to demonstrate feasibility and applicability of the proposed methodology in the Guadalupe River basin, Texas. Results suggest that statistical pre- and post-processing of data assimilation experiments provide meaningful insights into the dynamic behavior of hydrologic systems and enhance robustness of hydrologic ensemble predictions.
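
    The EnKF analysis step at the core of such an assimilation framework can be sketched compactly. The following is a minimal stochastic (perturbed-observation) EnKF update assuming a linear observation operator; the function name and interface are illustrative, not taken from the paper.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err_std, H, rng):
    """One stochastic EnKF analysis step with perturbed observations.

    ensemble: (n_members, n_state) forecast ensemble
    obs:      (n_obs,) observation vector
    H:        (n_obs, n_state) linear observation operator
    """
    n = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)       # state anomalies
    Y = X @ H.T                                # observation-space anomalies
    R = (obs_err_std ** 2) * np.eye(len(obs))  # observation error covariance
    # Kalman gain from sample covariances: K = P H^T (H P H^T + R)^-1
    K = (X.T @ Y / (n - 1)) @ np.linalg.inv(Y.T @ Y / (n - 1) + R)
    # Each member assimilates its own perturbed copy of the observations
    perturbed = obs + rng.normal(0.0, obs_err_std, size=(n, len(obs)))
    return ensemble + (perturbed - ensemble @ H.T) @ K.T
```

    With an accurate observation the analysis ensemble mean is pulled strongly toward the observed value, which is the behaviour the pre-processing step above tunes (ensemble size, inflation, and related EnKF settings).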

  2. Exploring critical pathways for urban water management to identify robust strategies under deep uncertainties.

    PubMed

    Urich, Christian; Rauch, Wolfgang

    2014-12-01

    Long-term projections for key drivers needed in urban water infrastructure planning, such as climate change, population growth, and socio-economic changes, are deeply uncertain. Traditional planning approaches rely heavily on these projections, which, if a projection goes unfulfilled, can lead to problematic infrastructure decisions causing high operational costs and/or lock-in effects. New approaches based on exploratory modelling take a fundamentally different view. Their aim is to identify an adaptation strategy that performs well under many future scenarios, instead of optimising a strategy for a handful of projections. However, a modelling tool that supports strategic planning by testing the implications of adaptation strategies under deeply uncertain conditions does not yet exist for urban water management. This paper presents a first step towards a new generation of such strategic planning tools, by combining innovative modelling tools, which co-evolve the urban environment and urban water infrastructure under many different future scenarios, with robust decision making. The developed approach is applied to the city of Innsbruck, Austria, which is spatially explicitly evolved 20 years into the future under 1000 scenarios to test the robustness of different adaptation strategies. Key findings of this paper show that: (1) such an approach can successfully identify parameter ranges of key drivers in which a desired performance criterion is not fulfilled, which is an important indicator of the robustness of an adaptation strategy; and (2) analysis of the rich dataset gives new insights into the adaptive responses of agents to key drivers in the urban system by modifying a strategy. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Data-Driven Robust RVFLNs Modeling of a Blast Furnace Iron-Making Process Using Cauchy Distribution Weighted M-Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ping; Lv, Youbin; Wang, Hong

    Optimal operation of a practical blast furnace (BF) ironmaking process depends largely on a good measurement of molten iron quality (MIQ) indices. However, measuring the MIQ online is not feasible using the available techniques. In this paper, a novel data-driven robust modeling is proposed for online estimation of MIQ using improved random vector functional-link networks (RVFLNs). Since the output weights of traditional RVFLNs are obtained by the least squares approach, a robustness problem may occur when the training dataset is contaminated with outliers. This affects the modeling accuracy of RVFLNs. To solve this problem, a Cauchy distribution weighted M-estimation based robust RVFLNs is proposed. Since the weights of different outlier data are properly determined by the Cauchy distribution, their corresponding contribution to modeling can be properly distinguished. Thus robust and better modeling results can be achieved. Moreover, given that the BF is a complex nonlinear system with numerous coupling variables, data-driven canonical correlation analysis is employed to identify the most influential components from the multitudinous factors that affect the MIQ indices, reducing the model dimension. Finally, experiments using industrial data and comparative studies have demonstrated that the obtained model produces better modeling and estimating accuracy and stronger robustness than other modeling methods.
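
    The idea of Cauchy-weighted M-estimation can be illustrated outside the RVFLN setting with an iteratively reweighted least squares (IRLS) fit, where the Cauchy weight function w(r) = 1/(1 + (r/c)^2) downweights large residuals. This is a generic sketch, not the paper's model; the tuning constant and the MAD-based scale estimate are common defaults, assumed here.

```python
import numpy as np

def cauchy_irls(X, y, c=2.385, n_iter=20):
    """Robust linear regression via IRLS with Cauchy weights.

    Large residuals receive small weights, so outliers contribute
    little to each weighted least-squares refit.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS start
    for _ in range(n_iter):
        r = y - X @ beta
        # robust scale via the median absolute deviation (MAD)
        scale = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
        w = 1.0 / (1.0 + (r / (c * scale)) ** 2)  # Cauchy weight function
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta
```

    On data with a few gross outliers, the ordinary least-squares start is biased, but the reweighting iterations recover the underlying line.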

  4. Robust Principal Component Analysis Regularized by Truncated Nuclear Norm for Identifying Differentially Expressed Genes.

    PubMed

    Wang, Ya-Xuan; Gao, Ying-Lian; Liu, Jin-Xing; Kong, Xiang-Zhen; Li, Hai-Jun

    2017-09-01

    Identifying differentially expressed genes among thousands of genes is a challenging task. Robust principal component analysis (RPCA) is an efficient method for the identification of differentially expressed genes. The RPCA method uses the nuclear norm to approximate the rank function. However, theoretical studies showed that the nuclear norm minimizes all singular values, so it may not be the best solution to approximate the rank function. The truncated nuclear norm is defined as the sum of the smaller singular values, which may achieve a better approximation of the rank function than the nuclear norm. In this paper, a novel method is proposed by replacing the nuclear norm of RPCA with the truncated nuclear norm; it is named robust principal component analysis regularized by truncated nuclear norm (TRPCA). The method decomposes the observation matrix of genomic data into a low-rank matrix and a sparse matrix. Because the significant genes can be considered sparse signals, the differentially expressed genes are viewed as sparse perturbation signals. Thus, the differentially expressed genes can be identified according to the sparse matrix. The experimental results on The Cancer Genome Atlas data illustrate that the TRPCA method outperforms other state-of-the-art methods in the identification of differentially expressed genes.
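
    The truncated nuclear norm itself is straightforward to compute from a singular value decomposition: it is the sum of all but the r largest singular values. A minimal sketch (the function name is ours):

```python
import numpy as np

def truncated_nuclear_norm(M, r):
    """Sum of all but the r largest singular values of M.

    For r = 0 this is the ordinary nuclear norm; for a matrix of rank
    <= r it is zero, which is why it approximates the rank function
    more tightly than the nuclear norm does.
    """
    s = np.linalg.svd(M, compute_uv=False)  # singular values, descending
    return s[r:].sum()
```

    Minimizing this quantity therefore penalizes only the tail singular values, leaving the r dominant components of the low-rank part untouched.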

  5. Robust extrema features for time-series data analysis.

    PubMed

    Vemulapalli, Pramod K; Monga, Vishal; Brennan, Sean N

    2013-06-01

    The extraction of robust features for comparing and analyzing time series is a fundamentally important problem. Research efforts in this area encompass dimensionality reduction using popular signal analysis tools such as the discrete Fourier and wavelet transforms, various distance metrics, and the extraction of interest points from time series. Recently, extrema features for analysis of time-series data have assumed increasing significance because of their natural robustness under a variety of practical distortions, their economy of representation, and their computational benefits. Invariably, the process of encoding extrema features is preceded by filtering of the time series with an intuitively motivated filter (e.g., for smoothing), and subsequent thresholding to identify robust extrema. We define the properties of robustness, uniqueness, and cardinality as a means to identify the design choices available in each step of the feature generation process. Unlike existing methods, which utilize filters "inspired" from either domain knowledge or intuition, we explicitly optimize the filter based on training time series to optimize robustness of the extracted extrema features. We demonstrate further that the underlying filter optimization problem reduces to an eigenvalue problem and has a tractable solution. An encoding technique that enhances control over cardinality and uniqueness is also presented. Experimental results obtained for the problem of time series subsequence matching establish the merits of the proposed algorithm.
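
    A bare-bones version of the filter-then-threshold pipeline described above might look as follows. The filter kernel here is supplied by the caller rather than optimized as in the paper, and the neighbour-difference test is a simplification of the robust-extrema criterion.

```python
import numpy as np

def robust_extrema(series, kernel, threshold):
    """Filter a series, then keep local extrema that beat their
    immediate neighbours by more than `threshold`."""
    smoothed = np.convolve(series, kernel, mode="same")
    extrema = []
    for i in range(1, len(smoothed) - 1):
        left, mid, right = smoothed[i - 1], smoothed[i], smoothed[i + 1]
        if mid > max(left, right) + threshold:    # robust local maximum
            extrema.append(i)
        elif mid < min(left, right) - threshold:  # robust local minimum
            extrema.append(i)
    return extrema
```

    The paper's contribution is, in effect, to learn the `kernel` from training series so that the surviving extrema are maximally stable under distortion, instead of hand-picking a smoothing filter.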

  6. Mechanisms for Robust Cognition.

    PubMed

    Walsh, Matthew M; Gluck, Kevin A

    2015-08-01

    To function well in an unpredictable environment using unreliable components, a system must have a high degree of robustness. Robustness is fundamental to biological systems and is an objective in the design of engineered systems such as airplane engines and buildings. Cognitive systems, like biological and engineered systems, exist within variable environments. This raises the question, how do cognitive systems achieve similarly high degrees of robustness? The aim of this study was to identify a set of mechanisms that enhance robustness in cognitive systems. We identify three mechanisms that enhance robustness in biological and engineered systems: system control, redundancy, and adaptability. After surveying the psychological literature for evidence of these mechanisms, we provide simulations illustrating how each contributes to robust cognition in a different psychological domain: psychomotor vigilance, semantic memory, and strategy selection. These simulations highlight features of a mathematical approach for quantifying robustness, and they provide concrete examples of mechanisms for robust cognition. © 2014 Cognitive Science Society, Inc.

  7. Robustness of the Process of Nucleoid Exclusion of Protein Aggregates in Escherichia coli

    PubMed Central

    Neeli-Venkata, Ramakanth; Martikainen, Antti; Gupta, Abhishekh; Gonçalves, Nadia; Fonseca, Jose

    2016-01-01

    ABSTRACT Escherichia coli segregates protein aggregates to the poles by nucleoid exclusion. Combined with cell divisions, this generates heterogeneous aggregate distributions in subsequent cell generations. We studied the robustness of this process with differing medium richness and antibiotics stress, which affect nucleoid size, using multimodal, time-lapse microscopy of live cells expressing both a fluorescently tagged chaperone (IbpA), which identifies in vivo the location of aggregates, and HupA-mCherry, a fluorescent variant of a nucleoid-associated protein. We find that the relative sizes of the nucleoid's major and minor axes change widely, in a positively correlated fashion, with medium richness and antibiotic stress. The aggregate's distribution along the major cell axis also changes between conditions and in agreement with the nucleoid exclusion phenomenon. Consequently, the fraction of aggregates at the midcell region prior to cell division differs between conditions, which will affect the degree of asymmetries in the partitioning of aggregates between cells of future generations. Finally, from the location of the peak of anisotropy in the aggregate displacement distribution, the nucleoid relative size, and the spatiotemporal aggregate distribution, we find that the exclusion of detectable aggregates from midcell is most pronounced in cells with mid-sized nucleoids, which are most common under optimal conditions. We conclude that the aggregate management mechanisms of E. coli are significantly robust but are not immune to stresses due to the tangible effect that these have on nucleoid size. IMPORTANCE Escherichia coli segregates protein aggregates to the poles by nucleoid exclusion. From live single-cell microscopy studies of the robustness of this process to various stresses known to affect nucleoid size, we find that nucleoid size and aggregate preferential locations change concordantly between conditions. Also, the degree of influence of the nucleoid …

  8. Robust path planning for flexible needle insertion using Markov decision processes.

    PubMed

    Tan, Xiaoyu; Yu, Pengqian; Lim, Kah-Bin; Chui, Chee-Kong

    2018-05-11

    A flexible needle has the potential to accurately navigate to a treatment region in the least invasive manner. We propose a new planning method using Markov decision processes (MDPs) for flexible needle navigation that can perform robust path planning and steering under the circumstance of complex tissue-needle interactions. This method enhances the robustness of flexible needle steering from three different perspectives. First, the method considers the problem caused by soft tissue deformation. The method then resolves the common needle penetration failure caused by patterns of targets, while the last solution addresses the uncertainty issues in flexible needle motion due to complex and unpredictable tissue-needle interaction. Computer simulation and phantom experimental results show that the proposed method can perform robust planning and generate a secure control policy for flexible needle steering. Compared with a traditional method using MDPs, the proposed method achieves higher accuracy and probability of success in avoiding obstacles under complicated and uncertain tissue-needle interactions. Future work will involve experiments with biological tissue in vivo. The proposed robust path planning method can securely steer a flexible needle within soft phantom tissues and achieve high adaptability in computer simulation.
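
    The MDP machinery underlying such a planner can be illustrated with standard value iteration, which computes the optimal value function and a greedy control policy for a finite MDP. This is textbook value iteration, not the paper's needle-steering formulation; the tabular transition/reward layout is an assumption.

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Value iteration for a finite MDP.

    P: (A, S, S) transition probabilities P[a, s, s']
    R: (A, S) expected immediate reward for taking action a in state s
    Returns the optimal values V and a greedy policy (one action per state).
    """
    V = np.zeros(P.shape[1])
    while True:
        Q = R + gamma * (P @ V)        # action values, shape (A, S)
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new
```

    In a needle-steering setting, states would discretize needle pose, actions the insertion/rotation commands, and P would encode the uncertain tissue-needle interaction.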

  9. Biological robustness.

    PubMed

    Kitano, Hiroaki

    2004-11-01

    Robustness is a ubiquitously observed property of biological systems. It is considered to be a fundamental feature of complex evolvable systems. It is attained by several underlying principles that are universal to both biological organisms and sophisticated engineering systems. Robustness facilitates evolvability and robust traits are often selected by evolution. Such a mutually beneficial process is made possible by specific architectural features observed in robust systems. But there are trade-offs between robustness, fragility, performance and resource demands, which explain system behaviour, including the patterns of failure. Insights into inherent properties of robust systems will provide us with a better understanding of complex diseases and a guiding principle for therapy design.

  10. Conceptual information processing: A robust approach to KBS-DBMS integration

    NASA Technical Reports Server (NTRS)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  11. Noise suppression methods for robust speech processing

    NASA Astrophysics Data System (ADS)

    Boll, S. F.; Ravindra, H.; Randall, G.; Armantrout, R.; Power, R.

    1980-05-01

    Robust speech processing in practical operating environments requires effective environmental and processor noise suppression. This report describes the technical findings and accomplishments during this reporting period for the research program funded to develop real-time, compressed speech analysis-synthesis algorithms whose performance is invariant under signal contamination. Fulfillment of this requirement is necessary to ensure reliable, secure compressed speech transmission within realistic military command and control environments. Overall contributions resulting from this research program include an understanding of how environmental noise degrades narrowband coded speech, development of appropriate real-time noise suppression algorithms, and development of speech parameter identification methods that consider signal contamination as a fundamental element in the estimation process. This report describes the current research and results in the areas of noise suppression using dual-input adaptive noise cancellation and short-time Fourier transform algorithms, articulation rate change techniques, and a description of an experiment which demonstrated that the spectral subtraction noise suppression algorithm can improve the intelligibility of 2400 bps, LPC-10 coded helicopter speech by 10.6 points.
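
    Spectral subtraction, the algorithm behind the intelligibility result above, can be sketched in a few lines: subtract a noise-magnitude estimate from each frame's spectrum, floor the result to avoid negative magnitudes, and reuse the noisy phase for resynthesis. The spectral-floor parameter here is an illustrative default, not a value from the report.

```python
import numpy as np

def spectral_subtraction(frame, noise_mag, floor=0.01):
    """Magnitude spectral subtraction for one frame.

    noise_mag is a per-bin (or scalar) noise magnitude estimate,
    typically averaged over frames known to contain no speech.
    """
    spec = np.fft.rfft(frame)
    mag, phase = np.abs(spec), np.angle(spec)
    clean_mag = np.maximum(mag - noise_mag, floor * mag)  # spectral floor
    return np.fft.irfft(clean_mag * np.exp(1j * phase), n=len(frame))
```

    In a full system this runs per overlapping window with overlap-add resynthesis; with a zero noise estimate the frame passes through unchanged.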

  12. An Intercompany Perspective on Biopharmaceutical Drug Product Robustness Studies.

    PubMed

    Morar-Mitrica, Sorina; Adams, Monica L; Crotts, George; Wurth, Christine; Ihnat, Peter M; Tabish, Tanvir; Antochshuk, Valentyn; DiLuzio, Willow; Dix, Daniel B; Fernandez, Jason E; Gupta, Kapil; Fleming, Michael S; He, Bing; Kranz, James K; Liu, Dingjiang; Narasimhan, Chakravarthy; Routhier, Eric; Taylor, Katherine D; Truong, Nobel; Stokes, Elaine S E

    2018-02-01

    The Biophorum Development Group (BPDG) is an industry-wide consortium enabling networking and sharing of best practices for the development of biopharmaceuticals. To gain a better understanding of current industry approaches for establishing biopharmaceutical drug product (DP) robustness, the BPDG-Formulation Point Share group conducted an intercompany collaboration exercise, which included a bench-marking survey and extensive group discussions around the scope, design, and execution of robustness studies. The results of this industry collaboration revealed several key common themes: (1) overall DP robustness is defined by both the formulation and the manufacturing process robustness; (2) robustness integrates the principles of quality by design (QbD); (3) DP robustness is an important factor in setting critical quality attribute control strategies and commercial specifications; (4) most companies employ robustness studies, along with prior knowledge, risk assessments, and statistics, to develop the DP design space; (5) studies are tailored to commercial development needs and the practices of each company. Three case studies further illustrate how a robustness study design for a biopharmaceutical DP balances experimental complexity, statistical power, scientific understanding, and risk assessment to provide the desired product and process knowledge. The BPDG-Formulation Point Share discusses identified industry challenges with regard to biopharmaceutical DP robustness and presents some recommendations for best practices. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  13. Robust multi-site MR data processing: iterative optimization of bias correction, tissue classification, and registration.

    PubMed

    Young Kim, Eun; Johnson, Hans J

    2013-01-01

    A robust multi-modal tool for automated registration, bias correction, and tissue classification has been implemented for large-scale heterogeneous multi-site longitudinal MR data analysis. This work focused on improving an iterative optimization framework between bias correction, registration, and tissue classification inspired by previous work. The primary contributions are robustness improvements from the incorporation of four elements: (1) utilizing multi-modal and repeated scans, (2) incorporating highly deformable registration, (3) using an extended set of tissue definitions, and (4) using multi-modal-aware intensity-context priors. The benefits of these enhancements were investigated through a series of experiments with both a simulated brain data set (BrainWeb) and highly heterogeneous data from a 32-site imaging study, with quality assessments through expert visual inspection. The implementation of this tool is tailored for, but not limited to, large-scale data processing with great data variation, with a flexible interface. In this paper, we describe enhancements to joint registration, bias correction, and tissue classification that improve the generalizability and robustness of processing multi-modal longitudinal MR scans collected at multiple sites. The tool was evaluated using both simulated and human subject MRI images. With these enhancements, the results showed improved robustness for large-scale heterogeneous MRI processing.

  14. Robustness

    NASA Astrophysics Data System (ADS)

    Ryan, R.

    1993-03-01

    Robustness is a buzzword common to all newly proposed space system designs as well as many new commercial products. The image that one conjures up when the word appears is a 'Paul Bunyan' (lumberjack) design: strong and hearty, healthy with margins in all aspects of the design. In actuality, robustness is much broader in scope than margins, including such factors as simplicity, redundancy, desensitization to parameter variations, control of parameter variations (environment fluctuations), and operational approaches. These must be traded with concepts, materials, and fabrication approaches against the criteria of performance, cost, and reliability. This includes manufacturing, assembly, processing, checkout, and operations. The design engineer or project chief is faced with finding ways and means to inculcate robustness into an operational design; first, however, he must be sure he understands the definition and goals of robustness. This paper will deal with these issues as well as the need for the requirement for robustness.

  15. Robustness

    NASA Technical Reports Server (NTRS)

    Ryan, R.

    1993-01-01

    Robustness is a buzzword common to all newly proposed space system designs as well as many new commercial products. The image that one conjures up when the word appears is a 'Paul Bunyan' (lumberjack) design: strong and hearty, healthy with margins in all aspects of the design. In actuality, robustness is much broader in scope than margins, including such factors as simplicity, redundancy, desensitization to parameter variations, control of parameter variations (environment fluctuations), and operational approaches. These must be traded with concepts, materials, and fabrication approaches against the criteria of performance, cost, and reliability. This includes manufacturing, assembly, processing, checkout, and operations. The design engineer or project chief is faced with finding ways and means to inculcate robustness into an operational design; first, however, he must be sure he understands the definition and goals of robustness. This paper will deal with these issues as well as the need for the requirement for robustness.

  16. Robust Selection Algorithm (RSA) for Multi-Omic Biomarker Discovery; Integration with Functional Network Analysis to Identify miRNA Regulated Pathways in Multiple Cancers.

    PubMed

    Sehgal, Vasudha; Seviour, Elena G; Moss, Tyler J; Mills, Gordon B; Azencott, Robert; Ram, Prahlad T

    2015-01-01

    MicroRNAs (miRNAs) play a crucial role in the maintenance of cellular homeostasis by regulating the expression of their target genes. As such, the dysregulation of miRNA expression has been frequently linked to cancer. With rapidly accumulating molecular data linked to patient outcome, the need for identification of robust multi-omic molecular markers is critical in order to provide clinical impact. While previous bioinformatic tools have been developed to identify potential biomarkers in cancer, these methods do not allow for rapid classification of oncogenes versus tumor suppressors taking into account robust differential expression, cutoffs, p-values and non-normality of the data. Here, we propose a methodology, Robust Selection Algorithm (RSA) that addresses these important problems in big data omics analysis. The robustness of the survival analysis is ensured by identification of optimal cutoff values of omics expression, strengthened by p-value computed through intensive random resampling taking into account any non-normality in the data and integration into multi-omic functional networks. Here we have analyzed pan-cancer miRNA patient data to identify functional pathways involved in cancer progression that are associated with selected miRNA identified by RSA. Our approach demonstrates the way in which existing survival analysis techniques can be integrated with a functional network analysis framework to efficiently identify promising biomarkers and novel therapeutic candidates across diseases.

  17. Identifying a Robust and Practical Quasar Accretion-Rate Indicator Using the Chandra Archive

    NASA Astrophysics Data System (ADS)

    Shemmer, Ohad

    2017-09-01

    Understanding the rapid growth of supermassive black holes and the assembly of their host galaxies is severely limited by the lack of reliable estimates of black-hole mass and accretion rate in distant quasars. We propose to utilize the Chandra archive to identify the most reliable and practical Eddington-ratio indicator by investigating diagnostics of quasar accretion power in the hard-X-ray, C IV, and Hbeta spectral bands of a carefully selected sample of optically selected sources. We will perform a "stress test" on each of these diagnostics, relying critically on the hard-X-ray observable properties, and deliver a prescription for the most robust Eddington-ratio estimate that can be utilized economically at the highest accessible redshifts.

  18. Multi-objective robust design of energy-absorbing components using coupled process-performance simulations

    NASA Astrophysics Data System (ADS)

    Najafi, Ali; Acar, Erdem; Rais-Rohani, Masoud

    2014-02-01

    The stochastic uncertainties associated with the material, process and product are represented and propagated to process and performance responses. A finite element-based sequential coupled process-performance framework is used to simulate the forming and energy absorption responses of a thin-walled tube in a manner that both material properties and component geometry can evolve from one stage to the next for better prediction of the structural performance measures. Metamodelling techniques are used to develop surrogate models for manufacturing and performance responses. One set of metamodels relates the responses to the random variables whereas the other relates the mean and standard deviation of the responses to the selected design variables. A multi-objective robust design optimization problem is formulated and solved to illustrate the methodology and the influence of uncertainties on manufacturability and energy absorption of a metallic double-hat tube. The results are compared with those of deterministic and augmented robust optimization problems.

  19. Robust Kriged Kalman Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baingana, Brian; Dall'Anese, Emiliano; Mateos, Gonzalo

    2015-11-11

    Although the kriged Kalman filter (KKF) has well-documented merits for prediction of spatial-temporal processes, its performance degrades in the presence of outliers due to anomalous events or measurement equipment failures. This paper proposes a robust KKF model that explicitly accounts for the presence of measurement outliers. Exploiting outlier sparsity, a novel l1-regularized estimator is put forth that jointly predicts the spatial-temporal process at unmonitored locations while identifying measurement outliers. Numerical tests are conducted on a synthetic Internet protocol (IP) network and real transformer load data. Test results corroborate the effectiveness of the novel estimator in joint spatial prediction and outlier identification.
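
    The outlier-identification half of such an l1-regularized estimator rests on the soft-thresholding (shrinkage) operator, the proximal map of the l1 norm. A minimal sketch, divorced from the kriging component (the function names are ours):

```python
import numpy as np

def soft_threshold(z, lam):
    """Proximal operator of the l1 norm: shrinks entries toward zero."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def detect_outliers(y, y_pred, lam):
    """Sparse outlier estimate: residuals that survive shrinkage are
    flagged as outliers; small residuals are absorbed as noise."""
    o = soft_threshold(y - y_pred, lam)
    return o, np.nonzero(o)[0]
```

    In the full estimator the prediction and the sparse outlier vector are estimated jointly (e.g., by alternating or proximal iterations), with lam controlling how many measurements are declared anomalous.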

  20. Robust design optimization using the price of robustness, robust least squares and regularization methods

    NASA Astrophysics Data System (ADS)

    Bukhari, Hassan J.

    2017-12-01

    In this paper, a framework for robust optimization of mechanical design problems and process systems that have parametric uncertainty is presented using three different approaches. Robust optimization problems are formulated so that the optimal solution is robust, meaning it is minimally sensitive to any perturbations in parameters. The first method uses the price of robustness approach, which assumes the uncertain parameters to be symmetric and bounded; the robustness of the design can be controlled by limiting the number of parameters allowed to perturb. The second method uses the robust least squares method to determine the optimal parameters when the data itself, rather than the parameters, is subject to perturbations. The last method manages uncertainty by restricting the perturbation on parameters to improve sensitivity, similar to Tikhonov regularization. The methods are implemented on two sets of problems, one linear and the other non-linear. The methodology is compared with a prior method using multiple Monte Carlo simulation runs, which shows that the approach presented in this paper results in better performance.
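
    The Tikhonov-style regularization mentioned in the third method has a closed form for linear least squares: solve (XᵀX + λI)β = Xᵀy, where a larger λ shrinks the solution and desensitizes it to perturbations. A minimal sketch:

```python
import numpy as np

def tikhonov_ls(X, y, lam):
    """Tikhonov-regularized least squares: (X^T X + lam I) beta = X^T y.

    Larger lam shrinks beta toward zero, trading a little bias for
    reduced sensitivity to perturbations in X and y.
    """
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)
```

    As lam approaches zero this recovers the ordinary least-squares solution; increasing lam monotonically shrinks the solution norm.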

  1. Identifying critical success factors for designing selection processes into postgraduate specialty training: the case of UK general practice.

    PubMed

    Plint, Simon; Patterson, Fiona

    2010-06-01

    The UK national recruitment process into general practice training has been developed over several years, with the incremental introduction of stages which have been piloted and validated. Previously independent processes, which encouraged multiple applications and produced inconsistent outcomes, have been replaced by a robust national process which has high reliability and predictive validity, is perceived to be fair by candidates, and allocates applicants equitably across the country. Best selection practice involves a job analysis which identifies required competencies, then designs reliable assessment methods to measure them, and over the long term ensures that the process has predictive validity against future performance. The general practitioner recruitment process introduced machine-markable short-listing assessments for the first time in the UK postgraduate recruitment context, and also adopted selection centre workplace simulations. The key success factors have been identified as corporate commitment to the goal of a national process, with gradual convergence maintaining locus of control rather than the imposition of change without perceived legitimate authority.

  2. Re-thinking our understanding of immunity: Robustness in the tissue reconstruction system.

    PubMed

    Truchetet, Marie-Elise; Pradeu, Thomas

    2018-04-01

    Robustness, understood as the maintenance of specific functionalities of a given system against internal and external perturbations, is pervasive in today's biology. Yet precise applications of this notion to the immune system have been scarce. Here we show that the concept of robustness sheds light on tissue repair, and particularly on the crucial role the immune system plays in this process. We describe the specific mechanisms, including plasticity and redundancy, by which robustness is achieved in the tissue reconstruction system (TRS). In turn, tissue repair offers a very important test case for assessing the usefulness of the concept of robustness, and identifying different varieties of robustness. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. The effectiveness of robust RMCD control chart as outliers’ detector

    NASA Astrophysics Data System (ADS)

    Darmanto; Astutik, Suci

    2017-12-01

    A well-known control chart for monitoring a multivariate process is Hotelling's T², whose parameters are classically estimated; it is very sensitive to outliers and also marred by the masking and swamping effects of outlying data. To overcome this situation, robust estimators are strongly recommended. One such robust estimator is the re-weighted minimum covariance determinant (RMCD), which has the same robust characteristics as MCD. In this paper, effectiveness means the accuracy of the RMCD control chart in detecting outliers as real outliers; in other words, how effectively this control chart can identify and remove the masking and swamping effects of outliers. We assessed the effectiveness of the robust control chart by simulation, considering different scenarios: the sample size n, the proportion of outliers, and the number p of quality characteristics. We found that in some scenarios this RMCD robust control chart works effectively.
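As a rough illustration of the idea (not the authors' code), the sketch below uses scikit-learn's `MinCovDet`, which applies the standard re-weighting step of the MCD estimator, to flag outliers via robust squared Mahalanobis distances against an approximate chi-squared control limit. The data, planted outliers, and 0.975 cutoff are invented for the example:

```python
# Hedged sketch: robust T^2-style outlier flagging with a re-weighted
# minimum covariance determinant estimate (sklearn's MinCovDet performs
# the usual re-weighting internally). Data and control limit are illustrative.
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
p = 3                                  # number of quality characteristics
X = rng.normal(size=(100, p))          # in-control observations
X[:5] += 6.0                           # plant a few gross outliers

mcd = MinCovDet(random_state=0).fit(X)
d2 = mcd.mahalanobis(X)                # squared robust Mahalanobis distances
ucl = chi2.ppf(0.975, df=p)            # approximate upper control limit
outliers = np.flatnonzero(d2 > ucl)
print(outliers)                        # includes the planted indices 0..4
```

Because the location and scatter are estimated robustly, the planted points cannot mask one another, which is the masking/swamping benefit the abstract refers to.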

  4. Adaptive GSA-based optimal tuning of PI controlled servo systems with reduced process parametric sensitivity, robust stability and controller robustness.

    PubMed

    Precup, Radu-Emil; David, Radu-Codrut; Petriu, Emil M; Radac, Mircea-Bogdan; Preitl, Stefan

    2014-11-01

    This paper suggests a new generation of optimal PI controllers for a class of servo systems characterized by saturation and dead-zone static nonlinearities and second-order models with an integral component. The objective functions are expressed as the integral of time multiplied by absolute error plus the weighted sum of the integrals of output sensitivity functions of the state sensitivity models with respect to two process parametric variations. The PI controller tuning conditions applied to a simplified linear process model involve a single design parameter specific to the extended symmetrical optimum (ESO) method, which offers the desired tradeoff among several control system performance indices. An original back-calculation and tracking anti-windup scheme is proposed in order to prevent integrator wind-up and to compensate for the dead-zone nonlinearity of the process. The minimization of the objective functions is carried out in the framework of optimization problems with inequality constraints which guarantee robust stability with respect to the process parametric variations and controller robustness. An adaptive gravitational search algorithm (GSA) solves the optimization problems focused on the optimal tuning of the design parameter specific to the ESO method and of the anti-windup tracking gain. A tuning method for PI controllers is proposed as an efficient approach to the design of resilient control systems. The tuning method and the PI controllers are experimentally validated by the adaptive GSA-based tuning of PI controllers for the angular position control of a laboratory servo system.
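The back-calculation and tracking anti-windup idea mentioned above can be sketched in a few lines. This is a generic textbook form with invented gains and an invented first-order plant, not the authors' exact scheme:

```python
# Hedged sketch: discrete-time PI controller with back-calculation
# anti-windup. When the actuator saturates, the difference between the
# saturated and unsaturated commands is fed back into the integrator
# through the tracking gain kt. All gains are hypothetical.
def make_pi(kp, ki, kt, u_min, u_max, dt):
    state = {"i": 0.0}
    def step(error):
        u = kp * error + state["i"]
        u_sat = min(max(u, u_min), u_max)
        # back-calculation: bleed the integrator while saturated
        state["i"] += dt * (ki * error + kt * (u_sat - u))
        return u_sat
    return step

pi = make_pi(kp=2.0, ki=1.0, kt=0.5, u_min=-1.0, u_max=1.0, dt=0.01)
y, sp = 0.0, 0.5
for _ in range(2000):              # crude first-order plant: dy/dt = -y + u
    u = pi(sp - y)
    y += 0.01 * (-y + u)
print(round(y, 2))                 # settles at the set-point 0.5
```

With `kt = 0`, a long saturation episode would leave the integrator wound up and cause overshoot; the tracking term drains it instead.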

  5. Robustness Elasticity in Complex Networks

    PubMed Central

    Matisziw, Timothy C.; Grubesic, Tony H.; Guo, Junyu

    2012-01-01

    Network robustness refers to a network’s resilience to stress or damage. Given that most networks are inherently dynamic, with changing topology, loads, and operational states, their robustness is also likely subject to change. However, in most analyses of network structure, it is assumed that interaction among nodes has no effect on robustness. To investigate the hypothesis that network robustness is not sensitive or elastic to the level of interaction (or flow) among network nodes, this paper explores the impacts of network disruption, namely arc deletion, over a temporal sequence of observed nodal interactions for a large Internet backbone system. In particular, a mathematical programming approach is used to identify exact bounds on robustness to arc deletion for each epoch of nodal interaction. Elasticity of the identified bounds relative to the magnitude of arc deletion is assessed. Results indicate that system robustness can be highly elastic to spatial and temporal variations in nodal interactions within complex systems. Further, the presence of this elasticity provides evidence that a failure to account for nodal interaction can confound characterizations of complex networked systems. PMID:22808060

  6. A Non-parametric Cutout Index for Robust Evaluation of Identified Proteins*

    PubMed Central

    Serang, Oliver; Paulo, Joao; Steen, Hanno; Steen, Judith A.

    2013-01-01

    This paper proposes a novel, automated method for evaluating sets of proteins identified using mass spectrometry. The remaining peptide-spectrum match score distributions of protein sets are compared to an empirical absent peptide-spectrum match score distribution, and a Bayesian non-parametric method reminiscent of the Dirichlet process is presented to accurately perform this comparison. Thus, for a given protein set, the process computes the likelihood that the proteins identified are correctly identified. First, the method is used to evaluate protein sets chosen using different protein-level false discovery rate (FDR) thresholds, assigning each protein set a likelihood. The protein set assigned the highest likelihood is used to choose a non-arbitrary protein-level FDR threshold. Because the method can be used to evaluate any protein identification strategy (and is not limited to mere comparisons of different FDR thresholds), we subsequently use the method to compare and evaluate multiple simple methods for merging peptide evidence over replicate experiments. The general statistical approach can be applied to other types of data (e.g. RNA sequencing) and generalizes to multivariate problems. PMID:23292186

  7. Robustness surfaces of complex networks

    NASA Astrophysics Data System (ADS)

    Manzano, Marc; Sahneh, Faryad; Scoglio, Caterina; Calle, Eusebi; Marzo, Jose Luis

    2014-09-01

    Although the robustness of complex networks has been extensively studied in the last decade, a unifying framework able to embrace all the proposed metrics is still lacking. In the literature there are two open issues related to this gap: (a) how to dimension several metrics to allow their summation and (b) how to weight each of the metrics. In this work we propose a solution to the two aforementioned problems by defining the R*-value and introducing the concept of the robustness surface (Ω). The rationale of our proposal is to make use of Principal Component Analysis (PCA). We first normalize the initial robustness of a network to 1. Second, we find the most informative robustness metric under a specific failure scenario. Then, we repeat the process for several percentages of failures and different realizations of the failure process. Lastly, we join these values to form the robustness surface, which allows the visual assessment of network robustness variability. Results show that a network presents different robustness surfaces (i.e., dissimilar shapes) depending on the failure scenario and the set of metrics. In addition, the robustness surface allows the robustness of different networks to be compared.
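The "most informative metric" step can be sketched numerically. The snippet below (a stand-in, not the paper's code) collects several normalized robustness metrics over repeated failure realizations and uses PCA, via an SVD of the centered matrix, to pick the metric with the largest absolute loading on the first principal component; the metrics and data are synthetic:

```python
# Hedged PCA sketch: rows are failure realizations at one failure fraction,
# columns are normalized robustness metrics. The metric dominating the
# first principal component is taken as the most informative one.
import numpy as np

rng = np.random.default_rng(1)
realizations, metrics = 50, 4
M = rng.normal(size=(realizations, metrics))
# make metric 2 carry most of the variance (a planted "informative" metric)
M[:, 2] = 3.0 * M[:, 0] + rng.normal(scale=0.1, size=realizations)

Mc = M - M.mean(axis=0)                 # center before PCA
_, _, vt = np.linalg.svd(Mc, full_matrices=False)
loadings = vt[0]                        # first principal direction
most_informative = int(np.argmax(np.abs(loadings)))
print(most_informative)
```

Repeating this for each failure percentage and realization, then stacking the resulting values, yields the surface the abstract describes.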

  8. Robustness surfaces of complex networks

    PubMed Central

    Manzano, Marc; Sahneh, Faryad; Scoglio, Caterina; Calle, Eusebi; Marzo, Jose Luis

    2014-01-01

    Although the robustness of complex networks has been extensively studied in the last decade, a unifying framework able to embrace all the proposed metrics is still lacking. In the literature there are two open issues related to this gap: (a) how to dimension several metrics to allow their summation and (b) how to weight each of the metrics. In this work we propose a solution to the two aforementioned problems by defining the R*-value and introducing the concept of the robustness surface (Ω). The rationale of our proposal is to make use of Principal Component Analysis (PCA). We first normalize the initial robustness of a network to 1. Second, we find the most informative robustness metric under a specific failure scenario. Then, we repeat the process for several percentages of failures and different realizations of the failure process. Lastly, we join these values to form the robustness surface, which allows the visual assessment of network robustness variability. Results show that a network presents different robustness surfaces (i.e., dissimilar shapes) depending on the failure scenario and the set of metrics. In addition, the robustness surface allows the robustness of different networks to be compared. PMID:25178402

  9. Robustness surfaces of complex networks.

    PubMed

    Manzano, Marc; Sahneh, Faryad; Scoglio, Caterina; Calle, Eusebi; Marzo, Jose Luis

    2014-09-02

    Although the robustness of complex networks has been extensively studied in the last decade, a unifying framework able to embrace all the proposed metrics is still lacking. In the literature there are two open issues related to this gap: (a) how to dimension several metrics to allow their summation and (b) how to weight each of the metrics. In this work we propose a solution to the two aforementioned problems by defining the R*-value and introducing the concept of the robustness surface (Ω). The rationale of our proposal is to make use of Principal Component Analysis (PCA). We first normalize the initial robustness of a network to 1. Second, we find the most informative robustness metric under a specific failure scenario. Then, we repeat the process for several percentages of failures and different realizations of the failure process. Lastly, we join these values to form the robustness surface, which allows the visual assessment of network robustness variability. Results show that a network presents different robustness surfaces (i.e., dissimilar shapes) depending on the failure scenario and the set of metrics. In addition, the robustness surface allows the robustness of different networks to be compared.

  10. Robust interval-based regulation for anaerobic digestion processes.

    PubMed

    Alcaraz-González, V; Harmand, J; Rapaport, A; Steyer, J P; González-Alvarez, V; Pelayo-Ortiz, C

    2005-01-01

    A robust regulation law is applied to the stabilization of a class of biochemical reactors exhibiting partially known, highly nonlinear dynamic behavior. An uncertain environment with the presence of unknown inputs is considered. Based on some structural and operational conditions, this regulation law is shown to exponentially stabilize the aforementioned bioreactors around a desired set-point. This approach is experimentally applied and validated on a pilot-scale (1 m3) anaerobic digestion process for the treatment of raw industrial wine distillery wastewater, where the objective is the regulation of the chemical oxygen demand (COD) using the dilution rate as the manipulated variable. Despite large disturbances on the input COD and state and parametric uncertainties, this regulation law gave excellent performance, leading the output COD towards its set-point and keeping it inside a pre-specified interval.

  11. Processing of Perceptual Information Is More Robust than Processing of Conceptual Information in Preschool-Age Children: Evidence from Costs of Switching

    ERIC Educational Resources Information Center

    Fisher, Anna V.

    2011-01-01

    Is processing of conceptual information as robust as processing of perceptual information early in development? Existing empirical evidence is insufficient to answer this question. To examine this issue, 3- to 5-year-old children were presented with a flexible categorization task, in which target items (e.g., an open red umbrella) shared category…

  12. Magnetoencephalographic Signals Identify Stages in Real-Life Decision Processes

    PubMed Central

    Braeutigam, Sven; Stins, John F.; Rose, Steven P. R.; Swithenby, Stephen J.; Ambler, Tim

    2001-01-01

    We used magnetoencephalography (MEG) to study the dynamics of neural responses in eight subjects engaged in shopping for day-to-day items from supermarket shelves. This behavior not only has personal and economic importance but also provides an example of an experience that is both personal and shared between individuals. The shopping experience enables the exploration of neural mechanisms underlying choice based on complex memories. Choosing among different brands of closely related products activated a robust sequence of signals within the first second after the presentation of the choice images. This sequence engaged first the visual cortex (80-100 ms), then, as the images were analyzed, predominantly the left temporal regions (310-340 ms). At longer latency, characteristic neural activation was found in motor speech areas (500-520 ms) for images requiring low salience choices with respect to previous (brand) memory, and in right parietal cortex for high salience choices (850-920 ms). We argue that the neural processes associated with the particular brand-choice stimulus can be separated into identifiable stages through observation of MEG responses and knowledge of functional anatomy. PMID:12018772

  13. Comparing Four Instructional Techniques for Promoting Robust Knowledge

    ERIC Educational Resources Information Center

    Richey, J. Elizabeth; Nokes-Malach, Timothy J.

    2015-01-01

    Robust knowledge serves as a common instructional target in academic settings. Past research identifying characteristics of experts' knowledge across many domains can help clarify the features of robust knowledge as well as ways of assessing it. We review the expertise literature and identify three key features of robust knowledge (deep,…

  14. Robust input design for nonlinear dynamic modeling of AUV.

    PubMed

    Nouri, Nowrouz Mohammad; Valadi, Mehrdad

    2017-09-01

    Input design has a dominant role in developing the dynamic model of autonomous underwater vehicles (AUVs) through system identification. Optimal input design is the process of generating informative inputs that can be used to produce a good-quality dynamic model of AUVs. In optimal input design, the desired input signal depends on the unknown system which is intended to be identified. In this paper, an input design approach which is robust to uncertainties in model parameters is used. The Bayesian robust design strategy is applied to design input signals for dynamic modeling of AUVs. The employed approach can design multiple inputs and apply constraints on an AUV system's inputs and outputs. Particle swarm optimization (PSO) is employed to solve the constrained robust optimization problem. The presented algorithm is used for designing the input signals for an AUV, and the estimate obtained by robust input design is compared with that of the optimal input design. According to the results, the proposed input design can satisfy both robustness of constraints and optimality. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  15. Design of robust flow processing networks with time-programmed responses

    NASA Astrophysics Data System (ADS)

    Kaluza, P.; Mikhailov, A. S.

    2012-04-01

    Can artificially designed networks reach the levels of robustness against local damage which are comparable with those of the biochemical networks of a living cell? We consider a simple model where the flow applied to an input node propagates through the network and arrives at different times to the output nodes, thus generating a pattern of coordinated responses. By using evolutionary optimization algorithms, functional networks - with required time-programmed responses - were constructed. Then, continuing the evolution, such networks were additionally optimized for robustness against deletion of individual nodes or links. In this manner, large ensembles of functional networks with different kinds of robustness were obtained, making statistical investigations and comparison of their structural properties possible. We have found that, generally, different architectures are needed for various kinds of robustness. The differences are statistically revealed, for example, in the Laplacian spectra of the respective graphs. On the other hand, motif distributions of robust networks do not differ from those of the merely functional networks; they are found to belong to the first Alon superfamily, the same as that of the gene transcription networks of single-cell organisms.

  16. Confronting Oahu's Water Woes: Identifying Scenarios for a Robust Evaluation of Policy Alternatives

    NASA Astrophysics Data System (ADS)

    van Rees, C. B.; Garcia, M. E.; Alarcon, T.; Sixt, G.

    2013-12-01

    The Pearl Harbor aquifer is the most important freshwater resource on Oahu (Hawaii, U.S.A), providing water to nearly half a million people. Recent studies show that current water use is reaching or exceeding sustainable yield. Climate change and increasing resident and tourist populations are predicted to further stress the aquifer. The island has lost huge tracts of freshwater and estuarine wetlands since human settlement; the dependence of many endemic, endangered species on these wetlands, as well as ecosystem benefits from wetlands, link humans and wildlife through water management. After the collapse of the sugar industry on Oahu (mid-1990s), the Waiahole ditch--a massive stream diversion bringing water from the island's windward to the leeward side--became a hotly disputed resource. Commercial interests and traditional farmers have clashed over the water, which could also serve to support the Pearl Harbor aquifer. Considering competing interests, impending scarcity, and uncertain future conditions, how can groundwater be managed most effectively? Complex water networks like this are characterized by conflicts between stakeholders, coupled human-natural systems, and future uncertainty. The Water Diplomacy Framework offers a model for analyzing such complex issues by integrating multiple disciplinary perspectives, identifying intervention points, and proposing sustainable solutions. The Water Diplomacy Framework is a theory and practice of implementing adaptive water management for complex problems by shifting the discussion from 'allocation of water' to 'benefit from water resources'. This is accomplished through an interactive process that includes stakeholder input, joint fact finding, collaborative scenario development, and a negotiated approach to value creation. Presented here are the results of the initial steps in a long term project to resolve water limitations on Oahu. We developed a conceptual model of the Pearl Harbor Aquifer system and identified

  17. Towards a more efficient and robust representation of subsurface hydrological processes in Earth System Models

    NASA Astrophysics Data System (ADS)

    Rosolem, R.; Rahman, M.; Kollet, S. J.; Wagener, T.

    2017-12-01

    Understanding the impacts of land cover and climate changes on terrestrial hydrometeorology is important across a range of spatial and temporal scales. Earth System Models (ESMs) provide a robust platform for evaluating these impacts. However, current ESMs generally lack the representation of key hydrological processes (e.g., preferential water flow and direct interactions with aquifers). The typical "free drainage" conceptualization of land models can misrepresent the magnitude of those interactions, consequently affecting the exchange of energy and water at the surface as well as estimates of groundwater recharge. Recent studies show the benefits of explicitly simulating the interactions between subsurface and surface processes in similar models. However, such parameterizations are often computationally demanding, resulting in limited application for large/global-scale studies. Here, we take a different approach in developing a novel parameterization for groundwater dynamics. Instead of directly adding another complex process to an established land model, we examine a set of comprehensive experimental scenarios using a very robust and established three-dimensional hydrological model to develop a simpler parameterization that represents the aquifer to land surface interactions. The main goal of our developed parameterization is to simultaneously maximize the computational gain (i.e., "efficiency") while minimizing simulation errors in comparison to the full 3D model (i.e., "robustness") to allow for easy implementation in ESMs globally. Our study focuses primarily on understanding the dynamics of both groundwater recharge and discharge. Preliminary results show that our proposed approach significantly reduces the computational demand while deviations from the full 3D model remain small for these processes.

  18. Enabling Rapid and Robust Structural Analysis During Conceptual Design

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.; Padula, Sharon L.; Li, Wu

    2015-01-01

    This paper describes a multi-year effort to add a structural analysis subprocess to a supersonic aircraft conceptual design process. The desired capabilities include parametric geometry, automatic finite element mesh generation, static and aeroelastic analysis, and structural sizing. The paper discusses implementation details of the new subprocess, captures lessons learned, and suggests future improvements. The subprocess quickly compares concepts and robustly handles large changes in wing or fuselage geometry. The subprocess can rank concepts with regard to their structural feasibility and can identify promising regions of the design space. The automated structural analysis subprocess is deemed robust and rapid enough to be included in multidisciplinary conceptual design and optimization studies.

  19. Robustness. [in space systems

    NASA Technical Reports Server (NTRS)

    Ryan, Robert

    1993-01-01

    The concept of robustness includes design simplicity, component and path redundancy, desensitization to parameter and environment variations, control of parameter variations, and punctual operations. These characteristics must be traded, along with functional concepts, materials, and fabrication approach, against the criteria of performance, cost, and reliability. The paper describes the robustness design process, which includes the following seven major coherent steps: translation of vision into requirements, definition of the robustness characteristics desired, criteria formulation of required robustness, concept selection, detail design, manufacturing and verification, and operations.

  20. Improved process robustness by using closed loop control in deep drawing applications

    NASA Astrophysics Data System (ADS)

    Barthau, M.; Liewald, M.; Held, Christian

    2017-09-01

    The production of irregularly shaped deep-drawing parts with high quality requirements, which are common in today’s automotive production, permanently challenges production processes. High requirements on lightweight construction of passenger car bodies, following European regulations up to 2020, have been substantially increasing the use of high-strength steels for years and are also leading to bigger challenges in sheet metal part production. The increasingly complex shapes of today’s car body shells further intensify the issue due to modern and future design criteria. Metal forming technology tries to meet these challenges with a highly sophisticated layout of deep-drawing dies that considers part quality requirements, process robustness, and controlled material flow during the deep or stretch drawing process phase. A new method for controlling material flow using a closed-loop system was developed at the IFU Stuttgart. In contrast to previous approaches, this new method allows a control intervention during the deep-drawing stroke. The blank holder force around the outline of the drawn part is used as the control variable. The closed loop is designed as trajectory follow-up with feedforward control. The command variable used is the part-wall stress, which is measured with a piezoelectric measuring pin. In this paper the control loop used is described in detail. The experimental tool that was built for testing the new control approach is explained here with its features. A method for obtaining the follow-up trajectories from simulation is also presented. Furthermore, experimental results considering the robustness of the deep-drawing process and the gain in process performance with the developed control loop are shown. Finally, a new procedure for the industrial application of the new control method of deep drawing is presented by using a new kind of active element to influence the local blank holder pressure onto part
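The control structure described (a feedforward trajectory plus feedback on the measured part-wall stress, with blank holder force as the control variable) can be caricatured as follows. Everything here is invented for illustration: the static plant gain, the reference trajectory, and the proportional feedback gain stand in for the real die mechanics and the authors' follow-up controller:

```python
# Highly simplified sketch: track a reference part-wall stress trajectory
# by adjusting blank holder force with feedforward plus proportional
# feedback. Plant model, gains, and trajectory are hypothetical.
def run(kp=0.8, steps=100):
    stress = 0.0
    gain = 0.05                              # invented plant gain (stress per kN)
    errors = []
    for k in range(steps):
        ref = 10.0 + 0.1 * k                 # reference wall stress over the stroke
        ff = ref / gain                      # feedforward trajectory (from simulation)
        fb = kp * (ref - stress)             # feedback on measured wall stress
        force = ff + fb                      # commanded blank holder force (kN)
        stress = gain * force                # static, invented plant response
        errors.append(abs(ref - stress))
    return errors

errors = run()
print(max(errors[10:]))                      # tracking error is small after start-up
```

The point of the sketch is only the division of labor: the feedforward term carries the bulk of the command along the stroke, while feedback corrects residual deviation during the stroke, which is what distinguishes this method from per-stroke (trial-to-trial) adjustment.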

  1. A robust human face detection algorithm

    NASA Astrophysics Data System (ADS)

    Raviteja, Thaluru; Karanam, Srikrishna; Yeduguru, Dinesh Reddy V.

    2012-01-01

    Human face detection plays a vital role in many applications such as video surveillance, managing face image databases, and human-computer interfaces, among others. This paper proposes a robust algorithm for face detection in still color images that works well even in a crowded environment. The algorithm uses a conjunction of skin color histograms, morphological processing, and geometrical analysis for detecting human faces. To reinforce the accuracy of face detection, we further identify mouth and eye regions to establish the presence/absence of a face in a particular region of interest.

  2. Modeling stochasticity and robustness in gene regulatory networks.

    PubMed

    Garg, Abhishek; Mohanram, Kartik; Di Cara, Alessandro; De Micheli, Giovanni; Xenarios, Ioannis

    2009-06-15

    Understanding gene regulation in biological processes and modeling the robustness of underlying regulatory networks is an important problem that is currently being addressed by computational systems biologists. Lately, there has been a renewed interest in Boolean modeling techniques for gene regulatory networks (GRNs). However, due to their deterministic nature, it is often difficult to identify whether these modeling approaches are robust to the addition of stochastic noise that is widespread in gene regulatory processes. Stochasticity in Boolean models of GRNs has been addressed relatively sparingly in the past, mainly by flipping the expression of genes between different expression levels with a predefined probability. This stochasticity in nodes (SIN) model leads to over-representation of noise in GRNs and hence non-correspondence with biological observations. In this article, we introduce the stochasticity in functions (SIF) model for simulating stochasticity in Boolean models of GRNs. By providing biological motivation behind the use of the SIF model and applying it to the T-helper and T-cell activation networks, we show that the SIF model provides more biologically robust results than the existing SIN model of stochasticity in GRNs. Algorithms are made available in our Boolean modeling toolbox, GenYsis. The software binaries can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
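The SIN baseline the abstract argues against is easy to state in code: after each synchronous Boolean update, every node's state is flipped with a fixed probability, regardless of the network's logic. The sketch below implements that baseline on an invented two-gene toy network (not the paper's T-helper model, and not the SIF alternative, whose function-level weighting is more involved):

```python
# Hedged sketch of the SIN (stochasticity-in-nodes) model: a synchronous
# Boolean network whose node states are flipped with a predefined
# probability after each deterministic update. Wiring and probability
# are illustrative only.
import random

random.seed(0)
FLIP_P = 0.05
# toy wiring: g0 copies g1, g1 copies g0 -> fixed points (0,0) and (1,1)
update = [lambda s: s[1], lambda s: s[0]]

def sin_step(state):
    det = [f(state) for f in update]                     # deterministic update
    noisy = [1 - v if random.random() < FLIP_P else v    # SIN noise on every node
             for v in det]
    return det, noisy

state, noise_events = [1, 1], 0
for _ in range(1000):
    det, state = sin_step(state)
    noise_events += sum(a != b for a, b in zip(det, state))
print(noise_events)
```

Note that the noise rate is constant (about `2 * FLIP_P` flips per step here) no matter what the update functions do, which is the over-representation of noise the SIF model is designed to avoid.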

  3. Robust iterative learning control for multi-phase batch processes: an average dwell-time method with 2D convergence indexes

    NASA Astrophysics Data System (ADS)

    Wang, Limin; Shen, Yiteng; Yu, Jingxian; Li, Ping; Zhang, Ridong; Gao, Furong

    2018-01-01

    In order to cope with system disturbances in multi-phase batch processes with different dimensions, a hybrid robust control scheme of iterative learning control combined with feedback control is proposed in this paper. First, with a hybrid iterative learning control law designed by introducing the state error, the tracking error and the extended information, the multi-phase batch process is converted into a two-dimensional Fornasini-Marchesini (2D-FM) switched system with different dimensions. Second, a switching signal is designed using the average dwell-time method integrated with the related switching conditions to give sufficient conditions ensuring stable running for the system. Finally, the minimum running time of the subsystems and the control law gains are calculated by solving the linear matrix inequalities. Meanwhile, a compound 2D controller with robust performance is obtained, which includes a robust extended feedback control for ensuring the steady-state tracking error to converge rapidly. The application on an injection molding process displays the effectiveness and superiority of the proposed strategy.

  4. A Two-Step Method to Identify Positive Deviant Physician Organizations of Accountable Care Organizations with Robust Performance Management Systems.

    PubMed

    Pimperl, Alexander F; Rodriguez, Hector P; Schmittdiel, Julie A; Shortell, Stephen M

    2018-06-01

    To identify positive deviant (PD) physician organizations of Accountable Care Organizations (ACOs) with robust performance management systems (PMSYS). Third National Survey of Physician Organizations (NSPO3, n = 1,398). Organizational and external factors from NSPO3 were analyzed. Linear regression estimated the association of internal and contextual factors on PMSYS. Two cutpoints (75th/90th percentiles) identified PDs with the largest residuals and highest PMSYS scores. A total of 65 and 41 PDs were identified using 75th and 90th percentiles cutpoints, respectively. The 90th percentile more strongly differentiated PDs from non-PDs. Having a high proportion of vulnerable patients appears to constrain PMSYS development. Our PD identification method increases the likelihood that PD organizations selected for in-depth inquiry are high-performing organizations that exceed expectations. © Health Research and Educational Trust.
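A minimal numerical sketch of the two-step idea, with synthetic stand-ins for the NSPO3 variables: regress the performance-management score on organizational covariates, then flag organizations that sit above the 90th-percentile cutpoint on both the regression residual and the raw score. This is an illustration of the selection logic, not the authors' model specification:

```python
# Hedged sketch: positive-deviant identification via OLS residuals and a
# joint 90th-percentile cutpoint. Data and covariates are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n = 300
X = rng.normal(size=(n, 3))                       # organizational/context factors
beta = np.array([0.5, -0.3, 0.2])
pmsys = X @ beta + rng.normal(scale=1.0, size=n)  # PMSYS score (invented)

# step 1: OLS fit; the residual is performance unexplained by covariates
Xd = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(Xd, pmsys, rcond=None)
resid = pmsys - Xd @ coef

# step 2: joint 90th-percentile cutpoints on residual and raw score
cut_r = np.quantile(resid, 0.90)
cut_s = np.quantile(pmsys, 0.90)
pd_idx = np.flatnonzero((resid > cut_r) & (pmsys > cut_s))
print(len(pd_idx))
```

Requiring both a large residual and a high raw score mirrors the paper's point: the stricter 90th-percentile rule selects organizations that both exceed expectations and perform well in absolute terms.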

  5. A bottom-up robust optimization framework for identifying river basin development pathways under deep climate uncertainty

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Ray, P.; Brown, C.

    2016-12-01

    Hydroclimatic nonstationarity due to climate change poses challenges for long-term water infrastructure planning in river basin systems. While designing strategies that are flexible or adaptive holds intuitive appeal, development of well-performing strategies requires rigorous quantitative analysis that addresses uncertainties directly while making the best use of scientific information on the expected evolution of future climate. Multi-stage robust optimization (RO) offers a potentially effective and efficient technique for addressing the problem of staged basin-level planning under climate change; however, the necessity of assigning probabilities to future climate states or scenarios is an obstacle to implementation, given that methods to reliably assign probabilities to future climate states are not well developed. We present a method that overcomes this challenge by creating a bottom-up RO-based framework that decreases the dependency on probability distributions of future climate and instead employs them after optimization to aid selection among competing alternatives. The iterative process yields a vector of 'optimal' decision pathways, each under an associated set of probabilistic assumptions. In the final phase, the vector of optimal decision pathways is evaluated to identify the solutions that are least sensitive to the scenario probabilities and are most likely conditional on the climate information. The framework is illustrated for the planning of new dam and hydro-agricultural expansion projects in the Niger River Basin over a 45-year planning period from 2015 to 2060.

  6. Robust Fault Detection Using Robust Z1 Estimation and Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Curry, Tramone; Collins, Emmanuel G., Jr.; Selekwa, Majura; Guo, Ten-Huei (Technical Monitor)

    2001-01-01

    This research considers the application of robust Z1 estimation in conjunction with fuzzy logic to robust fault detection for an aircraft flight control system. It begins with the development of robust Z1 estimators based on multiplier theory and then develops a fixed-threshold approach to fault detection (FD). It then considers the use of fuzzy logic for robust residual evaluation and FD. Due to modeling errors and unmeasurable disturbances, it is difficult to distinguish between the effects of an actual fault and those caused by uncertainty and disturbance. Hence, it is the aim of a robust FD system to be sensitive to faults while remaining insensitive to uncertainty and disturbances. While fixed thresholds only allow a decision on whether a fault has or has not occurred, it is more valuable to have the residual evaluation lead to a conclusion related to the degree of, or probability of, a fault. Fuzzy logic is a viable means of determining the degree of a fault and allows the introduction of human observations that may not be incorporated in the rigorous threshold theory. Hence, fuzzy logic can provide a more reliable and informative fault detection process. Using an aircraft flight control system, the results of FD using robust Z1 estimation with a fixed threshold are demonstrated. FD that combines robust Z1 estimation and fuzzy logic is also demonstrated. It is seen that combining the robust estimator with fuzzy logic proves to be advantageous in increasing the sensitivity to smaller faults while remaining insensitive to uncertainty and disturbances.
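The contrast between a fixed threshold and fuzzy residual evaluation can be illustrated with a toy example. The trapezoidal membership functions and their breakpoints below are hypothetical, not taken from the paper; the sketch only shows how a residual magnitude maps to a graded fault degree rather than a binary verdict:

```python
# Illustrative sketch: fuzzy residual evaluation mapping a residual
# magnitude to a fault degree in [0, 1]. Membership breakpoints are invented.
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: rises over [a, b], flat over [b, c], falls over [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def fault_degree(residual):
    r = abs(residual)
    no_fault = trapezoid(r, -1.0, 0.0, 0.2, 0.5)
    small    = trapezoid(r, 0.2, 0.5, 1.0, 1.5)
    large    = trapezoid(r, 1.0, 1.5, 10.0, 11.0)
    # weighted-average defuzzification with degrees 0.0 / 0.5 / 1.0
    w = no_fault * 0.0 + small * 0.5 + large * 1.0
    return w / (no_fault + small + large)

print(fault_degree(0.1), fault_degree(0.7), fault_degree(3.0))  # 0.0 0.5 1.0
```

A fixed threshold at, say, 1.0 would call the 0.7 residual "no fault" outright; the fuzzy evaluation instead reports an intermediate degree, which is the added sensitivity to smaller faults the abstract describes.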

  7. Robustness of movement models: can models bridge the gap between temporal scales of data sets and behavioural processes?

    PubMed

    Schlägel, Ulrike E; Lewis, Mark A

    2016-12-01

    Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, the resolution of temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when the resolution of the data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how these different resolutions can be accounted for in statistical inference results. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
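The idea of robustness against a change of temporal resolution can be illustrated with a simple Gaussian random walk, which is exactly robust in this sense: coarsening the sampling by a factor of two simply doubles the step variance. This is a sketch under assumed parameters, not one of the authors' resource selection models.

```python
import random
import statistics

random.seed(42)
sigma = 1.0
n_steps = 50_000

# Simulate a 1-D Gaussian random walk at the fine resolution.
pos = [0.0]
for _ in range(n_steps):
    pos.append(pos[-1] + random.gauss(0.0, sigma))

# Steps at the fine resolution vs. steps observed at half the sampling rate.
fine = [pos[i + 1] - pos[i] for i in range(len(pos) - 1)]
coarse = [pos[i + 2] - pos[i] for i in range(0, len(pos) - 2, 2)]

v_fine = statistics.pvariance(fine)
v_coarse = statistics.pvariance(coarse)
print(round(v_coarse / v_fine, 2))  # close to 2.0
```

For this model the coarse-resolution data are still a Gaussian random walk with a predictably rescaled parameter, which is exactly the kind of property the paper formalizes.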

  8. OGS#PETSc approach for robust and efficient simulations of strongly coupled hydrothermal processes in EGS reservoirs

    NASA Astrophysics Data System (ADS)

    Watanabe, Norihiro; Blucher, Guido; Cacace, Mauro; Kolditz, Olaf

    2016-04-01

    A robust and computationally efficient solution is important for 3D modelling of EGS reservoirs. This is particularly the case when the reservoir model includes hydraulic conduits such as induced or natural fractures, fault zones, and wellbore open-hole sections. The existence of such hydraulic conduits results in heterogeneous flow fields and in a strengthened coupling between fluid flow and heat transport processes via temperature-dependent fluid properties (e.g. density and viscosity). A commonly employed partitioned solution (or operator-splitting solution) may not work robustly for such strongly coupled problems, its applicability being limited to small time step sizes (e.g. 5-10 days), whereas the processes have to be simulated for 10-100 years. To overcome this limitation, an alternative approach is desired which can guarantee a robust solution of the coupled problem with minor constraints on time step sizes. In this work, we present a Newton-Raphson based monolithic coupling approach implemented in the OpenGeoSys simulator (OGS) combined with the Portable, Extensible Toolkit for Scientific Computation (PETSc) library. The PETSc library is used for both linear and nonlinear solvers as well as MPI-based parallel computations. The suggested method has been tested by application to the 3D reservoir site of Groß Schönebeck, in northern Germany. Results show that the exact Newton-Raphson approach can also be limited to small time step sizes (e.g. one day) due to slight oscillations in the temperature field. The use of a line search technique and modification of the Jacobian matrix were necessary to achieve robust convergence of the nonlinear solution. For the studied example, the proposed monolithic approach worked even with a very large time step size of 3.5 years.
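The line-search-stabilized Newton-Raphson iteration mentioned above can be sketched on a scalar nonlinear equation. The residual below is a toy function, not the coupled hydrothermal system, and the backtracking rule shown is one common choice of line search.

```python
def newton_line_search(f, df, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson with backtracking line search on |f|."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        step = -fx / df(x)
        lam = 1.0
        # Backtracking: shrink the Newton step until the residual decreases.
        while abs(f(x + lam * step)) >= abs(fx) and lam > 1e-8:
            lam *= 0.5
        x += lam * step
    return x

f = lambda x: x**3 - 2 * x - 5
df = lambda x: 3 * x**2 - 2
root = newton_line_search(f, df, x0=3.0)
print(round(root, 4))  # 2.0946
```

Damping the step in this way is what lets a monolithic Newton solver tolerate much larger time steps than an undamped iteration that may oscillate or diverge.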

  9. Robust Low Cost Aerospike/RLV Combustion Chamber by Advanced Vacuum Plasma Process

    NASA Technical Reports Server (NTRS)

    Holmes, Richard; Ellis, David; McKechnie

    1999-01-01

    Next-generation, regeneratively cooled rocket engines will require materials that can withstand high temperatures while retaining high thermal conductivity. At the same time, fabrication techniques must be cost efficient so that engine components can be manufactured within the constraints of a shrinking NASA budget. In recent years, combustion chambers of equivalent size to the Aerospike chamber have been fabricated at NASA-Marshall Space Flight Center (MSFC) using innovative, relatively low-cost, vacuum-plasma-spray (VPS) techniques. Typically, such combustion chambers are made of the copper alloy NARloy-Z. However, current research and development conducted by NASA-Lewis Research Center (LeRC) has identified a Cu-8Cr-4Nb alloy which possesses excellent high-temperature strength, creep resistance, and low cycle fatigue behavior combined with exceptional thermal stability. In fact, researchers at NASA-LeRC have demonstrated that powder metallurgy (P/M) Cu-8Cr-4Nb exhibits better mechanical properties at 1,200 F than NARloy-Z does at 1,000 F. The objective of this program was to develop and demonstrate the technology to fabricate high-performance, robust, inexpensive combustion chambers for advanced propulsion systems (such as Lockheed-Martin's VentureStar and NASA's Reusable Launch Vehicle, RLV) using the low-cost, VPS process to deposit Cu-8Cr-4Nb with mechanical properties that match or exceed those of P/M Cu-8Cr-4Nb. In addition, oxidation resistant and thermal barrier coatings can be incorporated as an integral part of the hot wall of the liner during the VPS process. Tensile properties of Cu-8Cr-4Nb material produced by VPS are reviewed and compared to material produced previously by extrusion. VPS formed combustion chamber liners have also been prepared and will be reported on following scheduled hot firing tests at NASA-Lewis.

  10. A Comparative Theoretical and Computational Study on Robust Counterpart Optimization: I. Robust Linear Optimization and Robust Mixed Integer Linear Optimization

    PubMed Central

    Li, Zukui; Ding, Ran; Floudas, Christodoulos A.

    2011-01-01

    Robust counterpart optimization techniques for linear optimization and mixed integer linear optimization problems are studied in this paper. Different uncertainty sets, including those studied in literature (i.e., interval set; combined interval and ellipsoidal set; combined interval and polyhedral set) and new ones (i.e., adjustable box; pure ellipsoidal; pure polyhedral; combined interval, ellipsoidal, and polyhedral set) are studied in this work and their geometric relationship is discussed. For uncertainty in the left hand side, right hand side, and objective function of the optimization problems, robust counterpart optimization formulations induced by those different uncertainty sets are derived. Numerical studies are performed to compare the solutions of the robust counterpart optimization models and applications in refinery production planning and batch process scheduling problem are presented. PMID:21935263
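For the simplest of the uncertainty sets discussed, the interval (box) set, the robust counterpart of a linear constraint has a closed form: the worst-case left-hand side adds the deviation-weighted absolute values of the decision variables to the nominal value. A minimal sketch with made-up coefficients, cross-checked against brute-force enumeration of the box corners:

```python
from itertools import product

def robust_box_lhs(a_nom, a_dev, x):
    """Worst-case value of sum_j a_j * x_j when each coefficient a_j
    ranges over the interval [a_nom_j - a_dev_j, a_nom_j + a_dev_j]."""
    nominal = sum(an * xj for an, xj in zip(a_nom, x))
    protection = sum(ad * abs(xj) for ad, xj in zip(a_dev, x))
    return nominal + protection

a_nom, a_dev, b = [1.0, 2.0], [0.1, 0.2], 5.0
x = [1.0, 1.5]

# Brute force over the corners of the box agrees with the closed form,
# since the worst case of a linear function over a box sits at a corner.
corner_worst = max(
    sum((an + s * ad) * xj for an, ad, xj, s in zip(a_nom, a_dev, x, signs))
    for signs in product([-1, 1], repeat=len(x))
)
print(robust_box_lhs(a_nom, a_dev, x), corner_worst)
print(robust_box_lhs(a_nom, a_dev, x) <= b)  # x is robustly feasible
```

The ellipsoidal and polyhedral sets studied in the paper yield analogous deterministic counterparts, with the protection term replaced by a norm of the deviation-scaled decision vector.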

  11. More About Robustness of Coherence

    NASA Astrophysics Data System (ADS)

    Li, Pi-Yu; Liu, Feng; Xu, Yan-Qin; La, Dong-Sheng

    2018-07-01

    Quantum coherence is an important physical resource in quantum computation and quantum information processing. In this paper, the distribution of the robustness of coherence in multipartite quantum systems is considered. It is shown that the additivity of the robustness of coherence is not always valid for general quantum states, but the robustness of coherence is decreasing under partial trace for any bipartite quantum system. The ordering of states under the coherence measures RoC, the l1-norm of coherence C_{l1}, and the relative entropy of coherence C_r is also discussed.
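The l1-norm of coherence mentioned above is simply the sum of the absolute values of the off-diagonal entries of the density matrix in a fixed reference basis. A minimal sketch for single-qubit states (the matrices below are standard textbook examples, not states from the paper):

```python
def l1_coherence(rho):
    """C_l1(rho) = sum of |rho_ij| over all off-diagonal entries i != j."""
    n = len(rho)
    return sum(abs(rho[i][j]) for i in range(n) for j in range(n) if i != j)

plus = [[0.5, 0.5], [0.5, 0.5]]   # |+><+|, a maximally coherent qubit state
mixed = [[0.5, 0.0], [0.0, 0.5]]  # maximally mixed state, incoherent
print(l1_coherence(plus), l1_coherence(mixed))  # 1.0 0.0
```

The robustness of coherence (RoC) has no such closed form in general and is defined via a convex optimization, which is why comparing the state orderings induced by the different measures is nontrivial.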

  12. Optimal robust control strategy of a solid oxide fuel cell system

    NASA Astrophysics Data System (ADS)

    Wu, Xiaojuan; Gao, Danhui

    2018-01-01

    Optimal control can ensure safe system operation at high efficiency. However, only a few papers discuss optimal control strategies for solid oxide fuel cell (SOFC) systems. Moreover, existing methods ignore the impact of parameter uncertainty on instantaneous system performance. In real SOFC systems, several parameters, such as load current, may vary with operating conditions and cannot be identified exactly. Therefore, a robust optimal control strategy is proposed, which involves three parts: a SOFC model with parameter uncertainty, a robust optimizer and robust controllers. During the model building process, boundaries of the uncertain parameter are extracted based on a Monte Carlo algorithm. To achieve the maximum efficiency, a two-space particle swarm optimization approach is employed to obtain optimal operating points, which are used as the set points of the controllers. To ensure safe SOFC operation, two feed-forward controllers and a higher-order robust sliding mode controller are then presented to control the fuel utilization ratio, air excess ratio and stack temperature. The results show the proposed optimal robust control method can maintain safe operation of the SOFC system at maximum efficiency under load and uncertainty variations.
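The particle swarm step of such a scheme can be sketched with a plain (single-swarm) PSO minimizing a toy objective; the paper's two-space variant and SOFC efficiency model are not reproduced here, and the inertia and acceleration coefficients below are conventional defaults, not values from the paper.

```python
import random

def pso(f, dim, n_particles=30, iters=200, lo=-5.0, hi=5.0, seed=1):
    """Minimize f over [lo, hi]^dim with a basic particle swarm."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, social weights
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pbest_f = [f(x) for x in xs]
    g = list(pbest[pbest_f.index(min(pbest_f))])  # global best
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - x[d])
                            + c2 * rng.random() * (g[d] - x[d]))
                x[d] += vs[i][d]
            fx = f(x)
            if fx < pbest_f[i]:
                pbest_f[i], pbest[i] = fx, list(x)
                if fx < f(g):
                    g = list(x)
    return g

# Toy objective: the sphere function, minimized at the origin.
best = pso(lambda x: sum(xi * xi for xi in x), dim=2)
print(all(abs(xi) < 0.1 for xi in best))
```

In the paper's setting the objective would be (negative) system efficiency and the returned optimum would become the controllers' set point.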

  13. Robust geostatistical analysis of spatial data

    NASA Astrophysics Data System (ADS)

    Papritz, A.; Künsch, H. R.; Schwierz, C.; Stahel, W. A.

    2012-04-01

    Most geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are the rule rather than the exception, in particular in environmental data sets. Outlying observations may result from errors (e.g. in data transcription) or from local perturbations in the processes that are responsible for a given pattern of spatial variation. As an example, the spatial distribution of some trace metal in the soils of a region may be distorted by emissions from local anthropogenic sources. Outliers affect the modelling of the large-scale spatial variation, the so-called external drift or trend, the estimation of the spatial dependence of the residual variation and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) [2] proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) [1] for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of the estimating equations for Gaussian REML estimation.
Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled
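The core idea of robustified estimating equations, bounding the influence that any single observation can exert, can be illustrated with a one-dimensional Huber M-estimate of location. This is a didactic analogue only, far simpler than robust REML estimation of a variogram; the data and tuning constant are invented.

```python
def huber_location(data, c=1.345, iters=50):
    """Huber M-estimate of location via iteratively reweighted averaging.
    Observations further than c from the current estimate are down-weighted."""
    mu = sorted(data)[len(data) // 2]  # start from the median
    for _ in range(iters):
        w = [1.0 if abs(x - mu) <= c else c / abs(x - mu) for x in data]
        mu = sum(wi * xi for wi, xi in zip(w, data)) / sum(w)
    return mu

clean = [9.8, 10.1, 10.0, 9.9, 10.2]
contaminated = clean + [100.0]  # one gross outlier

print(round(sum(contaminated) / len(contaminated), 1))  # mean is dragged to 25.0
print(round(huber_location(contaminated), 1))           # M-estimate stays near 10
```

The non-robust mean, like Gaussian ML, gives the outlier full weight; the bounded weighting is the same mechanism the robustified REML estimating equations apply to variogram parameters.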

  14. Process Architecture for Managing Digital Object Identifiers

    NASA Astrophysics Data System (ADS)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first 3 years, ESDIS evolved the process, involving the data provider community in the development of procedures for creating and assigning DOIs and guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014; and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by the end of 2014. ESDIS has recently upgraded the DOI system from a manually-driven system to one that largely automates the DOI process. The new automated features include: (a) reviewing the DOI metadata, (b) assigning an opaque DOI name if the data provider chooses, and (c) reserving, registering, and updating the DOIs. The flexibility of reserving the DOI allows data providers to embed and test the DOI in the data product metadata before formally registering with EZID. The DOI update process allows changing any DOI metadata except the DOI name, unless the DOI has not yet been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 DOIs are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. 
This poster will describe in detail the unique automated process and

  15. Robust syntaxin-4 immunoreactivity in mammalian horizontal cell processes

    PubMed Central

    HIRANO, ARLENE A.; BRANDSTÄTTER, JOHANN HELMUT; VILA, ALEJANDRO; BRECHA, NICHOLAS C.

    2009-01-01

    Horizontal cells mediate inhibitory feed-forward and feedback communication in the outer retina; however, mechanisms that underlie transmitter release from mammalian horizontal cells are poorly understood. Toward determining whether the molecular machinery for exocytosis is present in horizontal cells, we investigated the localization of syntaxin-4, a SNARE protein involved in targeting vesicles to the plasma membrane, in mouse, rat, and rabbit retinae using immunocytochemistry. We report robust expression of syntaxin-4 in the outer plexiform layer of all three species. Syntaxin-4 occurred in processes and tips of horizontal cells, with regularly spaced, thicker sandwich-like structures along the processes. Double labeling with syntaxin-4 and calbindin antibodies, a horizontal cell marker, demonstrated syntaxin-4 localization to horizontal cell processes; whereas, double labeling with PKC antibodies, a rod bipolar cell (RBC) marker, showed a lack of co-localization, with syntaxin-4 immunolabeling occurring just distal to RBC dendritic tips. Syntaxin-4 immunolabeling occurred within VGLUT-1-immunoreactive photoreceptor terminals and underneath synaptic ribbons, labeled by CtBP2/RIBEYE antibodies, consistent with localization in invaginating horizontal cell tips at photoreceptor triad synapses. Vertical sections of retina immunostained for syntaxin-4 and peanut agglutinin (PNA) established that the prominent patches of syntaxin-4 immunoreactivity were adjacent to the base of cone pedicles. Horizontal sections through the OPL indicate a one-to-one co-localization of syntaxin-4 densities at likely all cone pedicles, with syntaxin-4 immunoreactivity interdigitating with PNA labeling. Pre-embedding immuno-electron microscopy confirmed the subcellular localization of syntaxin-4 labeling to lateral elements at both rod and cone triad synapses. 
Finally, co-localization with SNAP-25, a possible binding partner of syntaxin-4, indicated co-expression of these SNARE proteins in

  16. Molecular mechanisms governing differential robustness of development and environmental responses in plants

    PubMed Central

    Lachowiec, Jennifer; Queitsch, Christine; Kliebenstein, Daniel J.

    2016-01-01

    Background Robustness to genetic and environmental perturbation is a salient feature of multicellular organisms. Loss of developmental robustness can lead to severe phenotypic defects and fitness loss. However, perfect robustness, i.e. no variation at all, is evolutionarily unfit as organisms must be able to change phenotype to properly respond to changing environments and biotic challenges. Plasticity is the ability to adjust phenotypes predictably in response to specific environmental stimuli, which can be considered a transient shift allowing an organism to move from one robust phenotypic state to another. Plants, as sessile organisms that undergo continuous development, are particularly dependent on an exquisite fine-tuning of the processes that balance robustness and plasticity to maximize fitness. Scope and Conclusions This paper reviews recently identified mechanisms, both systems-level and molecular, that modulate robustness, and discusses their implications for the optimization of plant fitness. Robustness in living systems arises from the structure of genetic networks, the specific molecular functions of the underlying genes, and their interactions. This very same network responsible for the robustness of specific developmental states also has to be built such that it enables plastic yet robust shifts in response to environmental changes. In plants, the interactions and functions of signal transduction pathways activated by phytohormones and the tendency for plants to tolerate whole-genome duplications, tandem gene duplication and hybridization are emerging as major regulators of robustness in development. Despite their obvious implications for plant evolution and plant breeding, the mechanistic underpinnings by which plants modulate precise levels of robustness, plasticity and evolvability in networks controlling different phenotypes are under-studied. PMID:26473020

  17. Application of NMR Methods to Identify Detection Reagents for Use in the Development of Robust Nanosensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cosman, M; Krishnan, V V; Balhorn, R

    2004-04-29

    Nuclear Magnetic Resonance (NMR) spectroscopy is a powerful technique for studying biomolecular interactions at the atomic scale. Our NMR lab is involved in the identification of small molecules, or ligands, that bind to target protein receptors, such as tetanus (TeNT) and botulinum (BoNT) neurotoxins, anthrax proteins and HLA-DR10 receptors on non-Hodgkin's lymphoma cancer cells. Once low-affinity binders are identified, they can be linked together to produce multidentate synthetic high-affinity ligands (SHALs) that have very high specificity for their target protein receptors. An important nanotechnology application for SHALs is their use in the development of robust chemical sensors or biochips for the detection of pathogen proteins in environmental samples or body fluids. Here, we describe a recently developed NMR competition assay based on transferred nuclear Overhauser effect spectroscopy (trNOESY) that enables the identification of sets of ligands that bind to the same site, or a different site, on the surface of TeNT fragment C (TetC) than a known ''marker'' ligand, doxorubicin. Using this assay, we can identify the optimal pairs of ligands to be linked together for creating detection reagents, as well as estimate the relative binding constants for ligands competing for the same site.

  18. On the robustness of Herlihy's hierarchy

    NASA Technical Reports Server (NTRS)

    Jayanti, Prasad

    1993-01-01

    A wait-free hierarchy maps object types to levels in Z(+) U (infinity) and has the following property: if a type T is at level N, and T' is an arbitrary type, then there is a wait-free implementation of an object of type T', for N processes, using only registers and objects of type T. The infinite hierarchy defined by Herlihy is an example of a wait-free hierarchy. A wait-free hierarchy is robust if it has the following property: if T is at level N, and S is a finite set of types belonging to levels N - 1 or lower, then there is no wait-free implementation of an object of type T, for N processes, using any number and any combination of objects belonging to the types in S. Robustness implies that there are no clever ways of combining weak shared objects to obtain stronger ones. Contrary to what many researchers believe, we prove that Herlihy's hierarchy is not robust. We then define some natural variants of Herlihy's hierarchy, which are also infinite wait-free hierarchies. With the exception of one, which is still open, these are not robust either. We conclude with the open question of whether non-trivial robust wait-free hierarchies exist.

  19. Robust detection-isolation-accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Weiss, J. L.; Pattipati, K. R.; Willsky, A. S.; Eterno, J. S.; Crawford, J. T.

    1985-01-01

    The results are presented of a one-year study to: (1) develop a theory for Robust Failure Detection and Identification (FDI) in the presence of model uncertainty, (2) develop a design methodology which utilizes the robust FDI theory, (3) apply the methodology to a sensor FDI problem for the F-100 jet engine, and (4) demonstrate the application of the theory to the evaluation of alternative FDI schemes. Theoretical results in statistical discrimination are used to evaluate the robustness of residual signals (or parity relations) in terms of their usefulness for FDI. Furthermore, optimally robust parity relations are derived through the optimization of robustness metrics. The result can be viewed as a decentralization of the FDI process. A general structure for decentralized FDI is proposed and robustness metrics are used for determining various parameters of the algorithm.

  20. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Tradeoff on Phenotype Robustness in Biological Networks Part II: Ecological Networks

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    In ecological networks, network robustness should be large enough to confer intrinsic robustness for tolerating intrinsic parameter fluctuations, as well as environmental robustness for resisting environmental disturbances, so that the phenotype stability of ecological networks can be maintained, thus guaranteeing phenotype robustness. However, it is difficult to analyze the network robustness of ecological systems because they are complex nonlinear partial differential stochastic systems. This paper develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance sensitivity in ecological networks. We found that the phenotype robustness criterion for ecological networks is that if intrinsic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations and environmental disturbances. These results in robust ecological networks are similar to those in robust gene regulatory networks and evolutionary networks, even though they operate on different spatio-temporal scales. PMID:23515112

  1. Redundancy relations and robust failure detection

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.; Lou, X. C.; Verghese, G. C.; Willsky, A. S.

    1984-01-01

    All failure detection methods are based on the use of redundancy, that is, on (possibly dynamic) relations among the measured variables. Consequently, the robustness of the failure detection process depends to a great degree on the reliability of the redundancy relations, given the inevitable presence of model uncertainties. The problem is addressed of determining redundancy relations that are optimally robust in a sense which includes the major issues of importance in practical failure detection. A significant amount of intuition concerning the geometry of robust failure detection is provided.

  2. Robust information propagation through noisy neural circuits

    PubMed Central

    Pouget, Alexandre

    2017-01-01

    Sensory neurons give highly variable responses to stimulation, which can limit the amount of stimulus information available to downstream circuits. Much work has investigated the factors that affect the amount of information encoded in these population responses, leading to insights about the role of covariability among neurons, tuning curve shape, etc. However, the informativeness of neural responses is not the only relevant feature of population codes; of potentially equal importance is how robustly that information propagates to downstream structures. For instance, to quantify the retina’s performance, one must consider not only the informativeness of the optic nerve responses, but also the amount of information that survives the spike-generating nonlinearity and noise corruption in the next stage of processing, the lateral geniculate nucleus. Our study identifies the set of covariance structures for the upstream cells that optimize the ability of information to propagate through noisy, nonlinear circuits. Within this optimal family are covariances with “differential correlations”, which are known to reduce the information encoded in neural population activities. Thus, covariance structures that maximize information in neural population codes, and those that maximize the ability of this information to propagate, can be very different. Moreover, redundancy is neither necessary nor sufficient to make population codes robust against corruption by noise: redundant codes can be very fragile, and synergistic codes can—in some cases—optimize robustness against noise. PMID:28419098

  3. How robust is a robust policy? A comparative analysis of alternative robustness metrics for supporting robust decision analysis.

    NASA Astrophysics Data System (ADS)

    Kwakkel, Jan; Haasnoot, Marjolijn

    2015-04-01

    In response to climate and socio-economic change, there are increasing calls in various policy domains for robust plans or policies, that is, plans or policies that perform well over a very large range of plausible futures. In the literature, a wide range of alternative robustness metrics can be found. The relative merit of these alternative conceptualizations of robustness has, however, received less attention. Evidently, different robustness metrics can result in different plans or policies being adopted. This paper investigates the consequences of several robustness metrics on decision making, illustrated here by the design of a flood risk management plan. A fictitious case, inspired by a river reach in the Netherlands, is used. The performance of this system in terms of casualties, damages, and costs for flood and damage mitigation actions is explored using a time horizon of 100 years, and accounting for uncertainties pertaining to climate change and land use change. A set of candidate policy options is specified up front. This set of options includes dike raising, dike strengthening, creating more space for the river, and flood-proof building and evacuation options. The overarching aim is to design an effective flood risk mitigation strategy that is designed from the outset to be adapted over time in response to how the future actually unfolds. To this end, the plan will be based on the dynamic adaptive policy pathway approach (Haasnoot, Kwakkel et al. 2013) being used in the Dutch Delta Program. The policy problem is formulated as a multi-objective robust optimization problem (Kwakkel, Haasnoot et al. 2014). We solve the multi-objective robust optimization problem using several alternative robustness metrics, including both satisficing robustness metrics and regret-based robustness metrics. Satisficing robustness metrics focus on the performance of candidate plans across a large ensemble of plausible futures. Regret-based robustness metrics compare the
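The two families of metrics compared in this study can be sketched on a toy policy-by-scenario performance table. The policy names, scores, and threshold below are invented for illustration; they are not results from the flood risk case.

```python
# performance[policy] = score in each plausible future (higher is better)
performance = {
    "dike_raising":   [0.9, 0.4, 0.8, 0.3],
    "room_for_river": [0.7, 0.6, 0.7, 0.6],
}
threshold = 0.5

def satisficing(perf, thr):
    """Fraction of plausible futures in which the policy meets the target."""
    return sum(p >= thr for p in perf) / len(perf)

def max_regret(performance, policy):
    """Worst-case gap to the best-performing policy in each scenario."""
    n = len(next(iter(performance.values())))
    best = [max(performance[p][s] for p in performance) for s in range(n)]
    return max(best[s] - performance[policy][s] for s in range(n))

for p in performance:
    print(p, satisficing(performance[p], threshold), round(max_regret(performance, p), 2))
```

Here the two metrics happen to agree that the consistent policy is more robust than the one with a higher best case, but as the paper stresses, different metrics can and do rank candidate plans differently.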

  4. Completed Ensemble Empirical Mode Decomposition: a Robust Signal Processing Tool to Identify Sequence Strata

    NASA Astrophysics Data System (ADS)

    Purba, H.; Musu, J. T.; Diria, S. A.; Permono, W.; Sadjati, O.; Sopandi, I.; Ruzi, F.

    2018-03-01

    Well logging data provide a wealth of geological information, and their trends resemble nonlinear, non-stationary signals. Over long recorded intervals, external factors can interfere with the signal and degrade its resolution. A sensitive signal analysis method is therefore required to improve the accuracy of logging interpretation, which is important for determining sequence stratigraphy. Complete Ensemble Empirical Mode Decomposition (CEEMD) is a nonlinear, non-stationary signal analysis method which decomposes a complex signal into a series of intrinsic mode functions (IMFs). The Gamma Ray and Spontaneous Potential well log parameters were decomposed into IMF-1 through IMF-10, and combinations and correlations of these components support identification of their physical meaning. The method identifies stratigraphy and cycle sequences and provides an effective signal treatment for locating sequence interfaces. It was applied to BRK-30 and BRK-13 well logging data. The results show that the combination of the IMF-5, IMF-6, and IMF-7 patterns represents short-term and middle-term sedimentation, while IMF-9 and IMF-10 represent long-term sedimentation, describing distal front and delta front facies, and inter-distributary mouth bar facies, respectively. Thus, CEEMD can clearly delineate the interfaces between different sedimentary layers and gives better identification of stratigraphic base-level cycles.

  5. Robust watermark technique using masking and Hermite transform.

    PubMed

    Coronel, Sandra L Gomez; Ramírez, Boris Escalante; Mosqueda, Marco A Acevedo

    2016-01-01

    The following paper evaluates a watermark algorithm designed for digital images using a perceptual mask and a normalization process, thus preventing detection by the human eye while ensuring robustness against common processing and geometric attacks. The Hermite transform is employed because it allows perfect reconstruction of the image while incorporating human visual system properties; moreover, it is based on derivatives of Gaussian functions. The applied watermark represents information about the digital image's proprietor. The extraction process is blind, because it does not require the original image. The following techniques were utilized in the evaluation of the algorithm: peak signal-to-noise ratio, the structural similarity index average, the normalized cross-correlation, and bit error rate. Several watermark extraction tests were performed against geometric and common processing attacks. This allowed us to identify how many bits in the watermark can be modified while still permitting adequate extraction.
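Two of the evaluation measures named above, peak signal-to-noise ratio and bit error rate, are straightforward to compute. A minimal sketch on toy data (pixel lists stand in for images; the structural similarity index and normalized cross-correlation are omitted):

```python
import math

def psnr(original, distorted, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length pixel lists."""
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

def bit_error_rate(sent, recovered):
    """Fraction of watermark bits flipped during embedding/attack/extraction."""
    return sum(a != b for a, b in zip(sent, recovered)) / len(sent)

img  = [120, 130, 125, 140]
img2 = [121, 129, 125, 142]  # lightly distorted copy
print(round(psnr(img, img2), 1))
print(bit_error_rate([1, 0, 1, 1, 0], [1, 0, 0, 1, 0]))  # 0.2
```

High PSNR indicates the watermark is imperceptible; low BER after an attack indicates the watermark survived it.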

  6. Robust Learning Control Design for Quantum Unitary Transformations.

    PubMed

    Wu, Chengzhi; Qi, Bo; Chen, Chunlin; Dong, Daoyi

    2017-12-01

    Robust control design for quantum unitary transformations has been recognized as a fundamental and challenging task in the development of quantum information processing, due to unavoidable decoherence or operational errors in the experimental implementation of quantum operations. In this paper, we extend the systematic methodology of the sampling-based learning control (SLC) approach with a gradient flow algorithm for the design of robust quantum unitary transformations. The SLC approach first uses a "training" process to find an optimal control strategy robust against certain ranges of uncertainties. Then a number of randomly selected samples are tested and the performance is evaluated according to their average fidelity. The approach is applied to three typical examples of robust quantum transformation problems: robust quantum transformations in a three-level quantum system, in a superconducting quantum circuit, and in a spin chain system. Numerical results demonstrate the effectiveness of the SLC approach and show its potential applications in various implementations of quantum unitary transformations.

  7. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to

  8. Formulation of an integrated robust design and tactics optimization process for undersea weapon systems

    NASA Astrophysics Data System (ADS)

    Frits, Andrew P.

    In the current Navy environment of undersea weapons development, the engineering aspect of design is decoupled from the development of the tactics with which the weapon is employed. Tactics are developed by intelligence experts, warfighters, and wargamers, while torpedo design is handled by engineers and contractors. This dissertation examines methods by which the conceptual design process of undersea weapon systems, including both torpedo systems and mine counter-measure systems, can be improved. It is shown that by simultaneously designing the torpedo and the tactics with which undersea weapons are used, a more effective overall weapon system can be created. In addition to integrating torpedo tactics with design, the thesis also looks at design methods to account for uncertainty. The uncertainty is attributable to multiple sources, including: lack of detailed analysis tools early in the design process, incomplete knowledge of the operational environments, and uncertainty in the performance of potential technologies. A robust design process is introduced to account for this uncertainty in the analysis and optimization of torpedo systems through the combination of Monte Carlo simulation with response surface methodology and metamodeling techniques. Additionally, various other methods that are appropriate to uncertainty analysis are discussed and analyzed. The thesis also advances a new approach towards examining robustness and risk: the treatment of probability of success (POS) as an independent variable. By examining the cost and performance tradeoffs between high- and low-probability-of-success designs, the decision-maker can make better-informed decisions as to which designs are most promising and determine the optimal balance of risk, cost, and performance. Finally, the thesis examines the use of non-dimensionalization of parameters for torpedo design. The thesis shows that the use of non-dimensional torpedo parameters leads to increased knowledge about the

  9. Processing and Properties of Fiber Reinforced Polymeric Matrix Composites. Part 2; Processing Robustness of IM7/PETI Polyimide Composites

    NASA Technical Reports Server (NTRS)

    Hou, Tan-Hung

    1996-01-01

    The processability of a phenylethynyl terminated imide (PETI) resin matrix composite was investigated. Unidirectional prepregs were made by coating an N-methylpyrrolidone solution of the amide acid oligomer onto unsized IM7. Two batches of prepregs were used: one was made by NASA in-house, and the other was from an industrial source. The composite processing robustness was investigated with respect to the effect of B-staging conditions, the prepreg shelf life, and the optimal processing window. Rheological measurements indicated that PETI's processability was only slightly affected over a wide range of B-staging temperatures (from 250 °C to 300 °C). The open hole compression (OHC) strength values were statistically indistinguishable among specimens consolidated using various B-staging conditions. Prepreg rheology and OHC strengths were also found not to be affected by prolonged (i.e., up to 60 days) ambient storage. An optimal processing window was established using response surface methodology. It was found that the IM7/PETI composite is more sensitive to the consolidation temperature than to the consolidation pressure. Good consolidation was achievable at 371 °C/100 psi, which yielded an OHC strength of 62 ksi at room temperature. However, processability declined dramatically at temperatures below 350 °C.

  10. Robustness analysis of non-ordinary Petri nets for flexible assembly/disassembly processes based on structural decomposition

    NASA Astrophysics Data System (ADS)

    Hsieh, Fu-Shiung

    2011-03-01

    Design of robust supervisory controllers for manufacturing systems with unreliable resources has received significant attention recently. Robustness analysis provides an alternative way to analyse a perturbed system so as to respond quickly to resource failures. Although we have analysed the robustness properties of several subclasses of ordinary Petri nets (PNs), analysis for non-ordinary PNs has not been done. Non-ordinary PNs have weighted arcs and have the advantage of compactly modelling operations that require multiple parts or resources. In this article, we consider a class of flexible assembly/disassembly manufacturing systems and propose a non-ordinary flexible assembly/disassembly Petri net (NFADPN) model for this class of systems. As the class of flexible assembly/disassembly manufacturing systems can be regarded as the integration and interactions of a set of assembly/disassembly subprocesses, a bottom-up approach is adopted in this article to construct the NFADPN models. Due to the routing flexibility in NFADPN, there may exist different ways to accomplish the tasks. To characterise them, we propose the concept of completely connected subprocesses. As long as there exists a set of completely connected subprocesses for a certain type of product, production of that type of product can still be maintained without requiring the whole NFADPN to be live. To take advantage of the alternative routes without enforcing liveness for the whole system, we generalise the previously proposed concept of persistent production to NFADPN. We propose a condition for persistent production based on the concept of completely connected subprocesses. We extend robustness analysis to NFADPN by exploiting its structure. We identify several patterns of resource failures and characterise the conditions to maintain operation in the presence of resource failures.

  11. Characterization, optimisation and process robustness of a co-processed mannitol for the development of orally disintegrating tablets.

    PubMed

    Soh, Josephine Lay Peng; Grachet, Maud; Whitlock, Mark; Lukas, Timothy

    2013-02-01

    This study fully assesses a commercially available co-processed mannitol for its usefulness as an off-the-shelf excipient for developing orally disintegrating tablets (ODTs) by direct compression on a pilot scale (up to 4 kg). The work encompassed material characterization, formulation optimisation and process robustness. Overall, this co-processed mannitol possessed favourable physical attributes, including low hygroscopicity and good compactibility. Two designs of experiments (DoEs) were used to screen and optimise the placebo formulation. Xylitol and crospovidone concentrations were found to have the most significant impact on disintegration time (p < 0.05). Higher xylitol concentrations retarded disintegration. Avicel PH102 promoted faster disintegration than PH101 at higher levels of xylitol. Without xylitol, higher crospovidone concentrations yielded faster disintegration and reduced tablet friability. Lubrication sensitivity studies were later conducted at two fill loads and three levels of lubricant concentration and number of blend rotations. Even at 75% fill load, the design space plot showed that 1.5% lubricant and 300 blend revolutions were sufficient to manufacture ODTs with ≤ 0.1% friability that disintegrated within 15 s. The study also describes results using a modified disintegration method based on the texture analyzer as an alternative to the USP method.

  12. Design principles for robust oscillatory behavior.

    PubMed

    Castillo-Hair, Sebastian M; Villota, Elizabeth R; Coronado, Alberto M

    2015-09-01

    Oscillatory responses are ubiquitous in regulatory networks of living organisms, a fact that has led to extensive efforts to study and replicate the circuits involved. However, to date, design principles that underlie the robustness of natural oscillators are not completely known. Here we study a three-component enzymatic network model in order to determine the topological requirements for robust oscillation. First, by simulating every possible topological arrangement and varying their parameter values, we demonstrate that robust oscillators can be obtained by augmenting the number of both negative feedback loops and positive autoregulations while maintaining an appropriate balance of positive and negative interactions. We then identify network motifs, whose presence in more complex topologies is a necessary condition for obtaining oscillatory responses. Finally, we pinpoint a series of simple architectural patterns that progressively render more robust oscillators. Together, these findings can help in the design of more reliable synthetic biomolecular networks and may also have implications in the understanding of other oscillatory systems.

  13. Identifying Core Concepts of Cybersecurity: Results of Two Delphi Processes

    ERIC Educational Resources Information Center

    Parekh, Geet; DeLatte, David; Herman, Geoffrey L.; Oliva, Linda; Phatak, Dhananjay; Scheponik, Travis; Sherman, Alan T.

    2018-01-01

    This paper presents and analyzes results of two Delphi processes that polled cybersecurity experts to rate cybersecurity topics based on importance, difficulty, and timelessness. These ratings can be used to identify core concepts--cross-cutting ideas that connect knowledge in the discipline. The first Delphi process identified core concepts that…

  14. The Problem of Size in Robust Design

    NASA Technical Reports Server (NTRS)

    Koch, Patrick N.; Allen, Janet K.; Mistree, Farrokh; Mavris, Dimitri

    1997-01-01

    To facilitate the effective solution of multidisciplinary, multiobjective complex design problems, a departure from the traditional parametric design analysis and single-objective optimization approaches is necessary in the preliminary stages of design. A necessary tradeoff becomes one of efficiency vs. accuracy as approximate models are sought to allow fast analysis and effective exploration of a preliminary design space. In this paper we apply a general robust design approach for efficient and comprehensive preliminary design to a large complex system: a high speed civil transport (HSCT) aircraft. Specifically, we investigate the HSCT wing configuration design, incorporating life cycle economic uncertainties to identify economically robust solutions. The approach is built on the foundation of statistical experimentation and modeling techniques and robust design principles, and is specialized through incorporation of the compromise Decision Support Problem for multiobjective design. For large problems, however, as in the HSCT example, this robust design approach breaks down with the problem of size - a combinatorial explosion in experimentation and model building with the number of variables - and both efficiency and accuracy are sacrificed. Our focus in this paper is on identifying and discussing the implications and open issues associated with the problem of size for the preliminary design of large complex systems.

  15. Analytical redundancy and the design of robust failure detection systems

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.; Willsky, A. S.

    1984-01-01

    The Failure Detection and Identification (FDI) process is viewed as consisting of two stages: residual generation and decision making. It is argued that a robust FDI system can be achieved by designing a robust residual generation process. Analytical redundancy, the basis for residual generation, is characterized in terms of a parity space. Using the concept of parity relations, residuals can be generated in a number of ways, and the design of a robust residual generation process can be formulated as a minimax optimization problem. An example is included to illustrate this design methodology. Previously announced in STAR as N83-20653.
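
    The parity-space idea behind analytical redundancy can be illustrated with the simplest case of direct redundancy among three like sensors; this sketch is generic and not the paper's minimax design:

    ```python
    import numpy as np

    # Direct (hardware) redundancy: three sensors all measure the same scalar x,
    #   y = C x + fault,  C = [[1], [1], [1]].
    # Parity vectors V span the left null space of C (V @ C = 0), so the residual
    # r = V @ y vanishes for any healthy x and is nonzero only under a fault.
    C = np.array([[1.0], [1.0], [1.0]])
    V = np.array([[1.0, -1.0, 0.0],
                  [0.0, 1.0, -1.0]])     # rows orthogonal to C's column

    x = 5.0
    healthy = C[:, 0] * x
    faulty = healthy + np.array([0.0, 0.0, 2.0])  # bias fault on sensor 3

    print(V @ healthy)   # zero residual: no fault signature
    print(V @ faulty)    # nonzero pattern implicates sensor 3
    ```

    A decision stage would then compare the residual pattern against each sensor's fault signature (here, the columns of V).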

  16. Robust Control Design for Systems With Probabilistic Uncertainty

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a reliability- and robustness-based formulation for robust control synthesis for systems with probabilistic uncertainty. In a reliability-based formulation, the probability of violating design requirements prescribed by inequality constraints is minimized. In a robustness-based formulation, a metric which measures the tendency of a random variable/process to cluster close to a target scalar/function is minimized. A multi-objective optimization procedure, which combines stability and performance requirements in time and frequency domains, is used to search for robustly optimal compensators. Some of the fundamental differences between the proposed strategy and conventional robust control methods are: (i) unnecessary conservatism is eliminated since there is no need for convex supports, (ii) the most likely plants are favored during synthesis, allowing for probabilistic robust optimality, (iii) the tradeoff between robust stability and robust performance can be explored numerically, (iv) the uncertainty set is closely related to parameters with clear physical meaning, and (v) compensators with improved robust characteristics for a given control structure can be synthesized.
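
    The reliability-based half of the formulation reduces to estimating a probability of constraint violation; a minimal Monte Carlo sketch, in which the requirement function and the uncertainty model are hypothetical stand-ins:

    ```python
    import numpy as np

    # For a fixed candidate compensator, estimate the probability that an
    # inequality requirement g(theta) <= 0 is violated under probabilistic
    # plant uncertainty; synthesis would minimize this over compensators.
    rng = np.random.default_rng(1)

    def g(theta, gain=2.0):
        # hypothetical requirement: a closed-loop performance surrogate must stay <= 0
        return gain * theta - 1.0

    thetas = rng.normal(loc=0.4, scale=0.1, size=100_000)  # assumed uncertainty model
    p_violation = np.mean(g(thetas) > 0.0)
    print(p_violation)  # violated when theta > 0.5, so roughly P(N(0.4, 0.1) > 0.5)
    ```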

  17. Temperature-Robust Neural Function from Activity-Dependent Ion Channel Regulation.

    PubMed

    O'Leary, Timothy; Marder, Eve

    2016-11-07

    Many species of cold-blooded animals experience substantial and rapid fluctuations in body temperature. Because biological processes are differentially temperature dependent, it is difficult to understand how physiological processes in such animals can be temperature robust [1-8]. Experiments have shown that core neural circuits, such as the pyloric circuit of the crab stomatogastric ganglion (STG), exhibit robust neural activity in spite of large (20°C) temperature fluctuations [3, 5, 7, 8]. This robustness is surprising because (1) each neuron has many different kinds of ion channels with different temperature dependencies (Q10 values) that interact in a highly nonlinear way to produce firing patterns and (2) across animals there is substantial variability in conductance densities that nonetheless produce almost identical firing properties. The high variability in conductance densities in these neurons [9, 10] appears to contradict the possibility that robustness is achieved through precise tuning of key temperature-dependent processes. In this paper, we develop a theoretical explanation for how temperature robustness can emerge from a simple regulatory control mechanism that is compatible with highly variable conductance densities [11-13]. The resulting model suggests a general mechanism for how nervous systems and excitable tissues can exploit degenerate relationships among temperature-sensitive processes to achieve robust function. Copyright © 2016 Elsevier Ltd. All rights reserved.
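
    The Q10 scaling referred to here is simple to state; a sketch with hypothetical channel rates shows how a 20°C swing changes the balance between processes with different Q10 values:

    ```python
    def q10_rate(rate_ref, q10, temp, temp_ref=10.0):
        """Scale a reference rate by its Q10 temperature sensitivity."""
        return rate_ref * q10 ** ((temp - temp_ref) / 10.0)

    # two hypothetical channel processes with different Q10s: over a 20 C swing
    # their ratio, and hence the firing pattern they shape, changes fourfold
    fast = q10_rate(1.0, q10=1.5, temp=30.0)   # 1.5**2 = 2.25
    slow = q10_rate(1.0, q10=3.0, temp=30.0)   # 3.0**2 = 9.0
    print(fast, slow, slow / fast)
    ```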

  18. Robust control of accelerators

    NASA Astrophysics Data System (ADS)

    Johnson, W. Joel D.; Abdallah, Chaouki T.

    1991-07-01

    The problem of controlling the variations in the rf power system can be effectively cast as an application of modern control theory. Two components of this theory are obtaining a model and a feedback structure. The model inaccuracies influence the choice of a particular controller structure. Because of the modelling uncertainty, one has to design either a variable, adaptive controller or a fixed, robust controller to achieve the desired objective. The adaptive control scheme usually results in very complex hardware and is therefore not pursued in this research. In contrast, the robust control method leads to simpler hardware. However, robust control requires a more accurate mathematical model of the physical process than is required by adaptive control. Our research at the Los Alamos National Laboratory (LANL) and the University of New Mexico (UNM) has led to the development and implementation of a new robust rf power feedback system. In this article, we report on our research progress. In section 1, the robust control problem for the rf power system and the philosophy adopted for the beginning phase of our research are presented. In section 2, the results of our proof-of-principle experiments are presented. In section 3, we describe the actual controller configuration that is used in LANL FEL physics experiments. The novelty of our approach is that the control hardware is implemented directly in rf, without demodulating, compensating, and then remodulating.

  19. Panaceas, uncertainty, and the robust control framework in sustainability science

    PubMed Central

    Anderies, John M.; Rodriguez, Armando A.; Janssen, Marco A.; Cifdaloz, Oguzhan

    2007-01-01

    A critical challenge faced by sustainability science is to develop strategies to cope with highly uncertain social and ecological dynamics. This article explores the use of the robust control framework toward this end. After briefly outlining the robust control framework, we apply it to the traditional Gordon–Schaefer fishery model to explore fundamental performance–robustness and robustness–vulnerability trade-offs in natural resource management. We find that the classic optimal control policy can be very sensitive to parametric uncertainty. By exploring a large class of alternative strategies, we show that there are no panaceas: even mild robustness properties are difficult to achieve, and increasing robustness to some parameters (e.g., biological parameters) results in decreased robustness with respect to others (e.g., economic parameters). On the basis of this example, we extract some broader themes for better management of resources under uncertainty and for sustainability science in general. Specifically, we focus attention on the importance of a continual learning process and the use of robust control to inform this process. PMID:17881574
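
    The parametric sensitivity the authors report can be reproduced in miniature with logistic Gordon-Schaefer stock dynamics; all parameter values below are illustrative, not taken from the paper:

    ```python
    # Logistic stock under a fixed harvest policy: a policy tuned to an assumed
    # growth rate r = 0.5 is applied to stocks whose true r differs, illustrating
    # how sensitive an "optimal" policy can be to parametric uncertainty.
    def simulate(r, K=1.0, harvest=0.12, x0=0.5, steps=400, dt=0.1):
        x = x0
        for _ in range(steps):
            x = max(x + dt * (r * x * (1 - x / K) - harvest), 0.0)  # Euler step
        return x

    print(simulate(r=0.5))   # settles near the intended steady state (~0.6)
    print(simulate(r=0.3))   # same policy, slower-growing stock: collapse to 0
    ```

    A robust policy would instead trade some yield for acceptable behavior across the plausible range of r.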

  20. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain deeper knowledge of why certain KPI targets are not met.

  1. Robustness testing in pharmaceutical freeze-drying: inter-relation of process conditions and product quality attributes studied for a vaccine formulation.

    PubMed

    Schneid, Stefan C; Stärtzel, Peter M; Lettner, Patrick; Gieseler, Henning

    2011-01-01

    The recent US Food and Drug Administration (FDA) legislation has introduced the evaluation of the Design Space of critical process parameters in manufacturing processes. In freeze-drying, a "formulation" is expected to be robust when minor deviations of the product temperature do not negatively affect the final product quality attributes. The aim here was to evaluate "formulation" robustness by investigating the effect of elevated product temperature on product quality, using a bacterial vaccine solution. The vaccine solution was characterized by freeze-dry microscopy to determine the critical formulation temperature. A conservative cycle was developed using the SMART™ mode of a Lyostar II freeze dryer. Product temperature was then elevated to imitate intermediate and aggressive cycle conditions. The final product was analyzed using X-ray powder diffraction (XRPD), scanning electron microscopy (SEM), Karl Fischer titration, and modulated differential scanning calorimetry (MDSC), and the live cell count (LCC) was monitored during accelerated stability testing. The cakes processed at intermediate and aggressive conditions displayed larger pores with microcollapse of walls and a stronger loss in LCC than the conservatively processed product, especially during stability testing. For all process conditions, a loss of the majority of cells was observed during storage. For freeze-drying of live bacterial vaccine solutions, the product temperature profile during primary drying appeared to be inter-related with product quality attributes.

  2. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.

  3. Robust efficient video fingerprinting

    NASA Astrophysics Data System (ADS)

    Puri, Manika; Lubin, Jeffrey

    2009-02-01

    We have developed a video fingerprinting system with robustness and efficiency as the primary and secondary design criteria. In extensive testing, the system has shown robustness to cropping, letter-boxing, sub-titling, blur, drastic compression, frame rate changes, size changes and color changes, as well as to the geometric distortions often associated with camcorder capture in cinema settings. Efficiency is afforded by a novel two-stage detection process in which a fast matching process first computes a number of likely candidates, which are then passed to a second slower process that computes the overall best match with minimal false alarm probability. One key component of the algorithm is a maximally stable volume computation - a three-dimensional generalization of maximally stable extremal regions - that provides a content-centric coordinate system for subsequent hash function computation, independent of any affine transformation or extensive cropping. Other key features include an efficient bin-based polling strategy for initial candidate selection, and a final SIFT feature-based computation for final verification. We describe the algorithm and its performance, and then discuss additional modifications that can provide further improvement to efficiency and accuracy.
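
    The two-stage structure (fast polling, then slower verification) can be sketched independently of the paper's specific features; the sign hash below is a stand-in for the bin-based polling stage, and Euclidean distance stands in for the final SIFT-based verification:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    db = rng.normal(size=(1000, 64))      # stand-in fingerprint database

    def coarse_key(v, bits=8):
        """Cheap hash: signs of the first few dimensions select a candidate bin."""
        return tuple((v[:bits] > 0).astype(int))

    buckets = {}
    for i, v in enumerate(db):
        buckets.setdefault(coarse_key(v), []).append(i)

    # a "distorted" copy of item 123: perturb only dimensions the hash ignores
    query = db[123].copy()
    query[8:] += rng.normal(scale=0.01, size=56)

    candidates = buckets.get(coarse_key(query), range(len(db)))          # stage 1
    best = min(candidates, key=lambda i: np.linalg.norm(db[i] - query))  # stage 2
    print(best)  # recovers item 123 after checking only a few candidates
    ```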

  4. Ultra-low-power and robust digital-signal-processing hardware for implantable neural interface microsystems.

    PubMed

    Narasimhan, S; Chiel, H J; Bhunia, S

    2011-04-01

    Implantable microsystems for monitoring or manipulating brain activity typically require on-chip real-time processing of multichannel neural data using ultra low-power, miniaturized electronics. In this paper, we propose an integrated-circuit/architecture-level hardware design framework for neural signal processing that exploits the nature of the signal-processing algorithm. First, we consider different power reduction techniques and compare the energy efficiency between the ultra-low frequency subthreshold and conventional superthreshold design. We show that the superthreshold design operating at a much higher frequency can achieve comparable energy dissipation by taking advantage of extensive power gating. It also provides significantly higher robustness of operation and yield under large process variations. Next, we propose an architecture level preferential design approach for further energy reduction by isolating the critical computation blocks (with respect to the quality of the output signal) and assigning them higher delay margins compared to the noncritical ones. Possible delay failures under parameter variations are confined to the noncritical components, allowing graceful degradation in quality under voltage scaling. Simulation results using prerecorded neural data from the sea-slug (Aplysia californica) show that the application of the proposed design approach can lead to significant improvement in total energy, without compromising the output signal quality under process variations, compared to conventional design approaches.

  5. Development of a method of robust rain gauge network optimization based on intensity-duration-frequency results

    NASA Astrophysics Data System (ADS)

    Chebbi, A.; Bargaoui, Z. K.; da Conceição Cunha, M.

    2012-12-01

    Based on rainfall intensity-duration-frequency (IDF) curves, a robust optimization approach is proposed to identify the best locations to install new rain gauges. The advantage of robust optimization is that the resulting design solutions yield networks which behave acceptably under hydrological variability. Robust optimization can overcome the problem of selecting representative rainfall events when building the optimization process. This paper reports an original approach based on Montana IDF model parameters. The latter are assumed to be geostatistical variables and their spatial interdependence is taken into account through the adoption of cross-variograms in the kriging process. The problem of optimally locating a fixed number of new monitoring stations based on an existing rain gauge network is addressed. The objective function is based on the mean spatial kriging variance and rainfall variogram structure using a variance-reduction method. Hydrological variability was taken into account by considering and implementing several return periods to define the robust objective function. Variance minimization is performed using a simulated annealing algorithm. In addition, knowledge of the time horizon is needed for the computation of the robust objective function. A short-term and a long-term horizon were studied, and optimal networks are identified for each. The method developed is applied to north Tunisia (area = 21,000 km²). Data inputs for the variogram analysis were IDF curves provided by the hydrological bureau and available for 14 tipping-bucket rain gauges. The recording period was from 1962 to 2001, depending on the station. The study concerns a hypothetical network augmentation based on the network configuration in 1973, which is a very significant year in Tunisia because there was an exceptional regional flood event in March 1973. 
This network consisted of 13 stations and did not meet World Meteorological Organization (WMO) recommendations for the minimum
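
    The search loop described in this abstract can be sketched with a simpler surrogate objective: mean distance to the nearest gauge stands in for the mean kriging variance (the real objective uses the fitted IDF cross-variograms):

    ```python
    import math
    import random

    random.seed(0)
    existing = [(0.2, 0.2), (0.8, 0.8)]                        # current network
    candidates = [(x / 10, y / 10) for x in range(11) for y in range(11)]
    grid = candidates                                          # evaluation points

    def objective(new_sites):
        """Mean distance to the nearest gauge: a proxy for mean kriging variance."""
        gauges = existing + list(new_sites)
        return sum(min(math.dist(p, g) for g in gauges) for p in grid) / len(grid)

    state = random.sample(candidates, 2)                       # place two new gauges
    cost = objective(state)
    best, best_cost = state, cost
    temp = 1.0
    for _ in range(800):
        trial = list(state)
        trial[random.randrange(2)] = random.choice(candidates)  # move one site
        trial_cost = objective(trial)
        d = trial_cost - cost
        if d < 0 or random.random() < math.exp(-d / temp):      # Metropolis rule
            state, cost = trial, trial_cost
            if cost < best_cost:
                best, best_cost = state, cost
        temp *= 0.99                                            # geometric cooling
    print(best, round(best_cost, 3))
    ```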

  6. Strain-Dependent Transcriptome Signatures for Robustness in Lactococcus lactis

    PubMed Central

    Dijkstra, Annereinou R.; Alkema, Wynand; Starrenburg, Marjo J. C.; van Hijum, Sacha A. F. T.; Bron, Peter A.

    2016-01-01

    Recently, we demonstrated that fermentation conditions have a strong impact on subsequent survival of Lactococcus lactis strain MG1363 during heat and oxidative stress, two important parameters during spray drying. Moreover, employment of a transcriptome-phenotype matching approach revealed groups of genes associated with robustness towards heat and/or oxidative stress. To investigate if other strains have similar or distinct transcriptome signatures for robustness, we applied an identical transcriptome-robustness phenotype matching approach on the L. lactis strains IL1403, KF147 and SK11, which have previously been demonstrated to display highly diverse robustness phenotypes. These strains were subjected to an identical fermentation regime as was performed earlier for strain MG1363 and consisted of twelve conditions, varying in the level of salt and/or oxygen, as well as fermentation temperature and pH. In the exponential phase of growth, cells were harvested for transcriptome analysis and assessment of heat and oxidative stress survival phenotypes. The variation in fermentation conditions resulted in differences in heat and oxidative stress survival of up to five log10 units. Effects of the fermentation conditions on stress survival of the L. lactis strains were typically strain-dependent, although the fermentation conditions had mainly similar effects on the growth characteristics of the different strains. By association of the transcriptomes and robustness phenotypes, highly strain-specific transcriptome signatures for robustness towards heat and oxidative stress were identified, indicating that multiple mechanisms exist to increase robustness and, as a consequence, robustness of each strain requires individual optimization. 
However, a relatively small overlap in the transcriptome responses of the strains was also identified and this generic transcriptome signature included genes previously associated with stress (ctsR and lplL) and novel genes, including nan

  7. Individualized relapse prediction: Personality measures and striatal and insular activity during reward-processing robustly predict relapse.

    PubMed

    Gowin, Joshua L; Ball, Tali M; Wittmann, Marc; Tapert, Susan F; Paulus, Martin P

    2015-07-01

    Nearly half of individuals with substance use disorders relapse in the year after treatment. A diagnostic tool to help clinicians make decisions regarding treatment does not exist for psychiatric conditions. Identifying individuals with high risk for relapse to substance use following abstinence has profound clinical consequences. This study aimed to develop neuroimaging as a robust tool to predict relapse. 68 methamphetamine-dependent adults (15 female) were recruited from 28-day inpatient treatment. During treatment, participants completed a functional MRI scan that examined brain activation during reward processing. Patients were followed 1 year later to assess abstinence. We examined brain activation during reward processing between relapsing and abstaining individuals and employed three random forest prediction models (clinical and personality measures, neuroimaging measures, a combined model) to generate predictions for each participant regarding their relapse likelihood. 18 individuals relapsed. There were significant group by reward-size interactions for neural activation in the left insula and right striatum for rewards. Abstaining individuals showed increased activation for large, risky relative to small, safe rewards, whereas relapsing individuals failed to show differential activation between reward types. All three random forest models yielded good test characteristics such that a positive test for relapse yielded a likelihood ratio of 2.63, whereas a negative test had a likelihood ratio of 0.48. These findings suggest that neuroimaging can be developed in combination with other measures as an instrument to predict relapse, advancing tools providers can use to make decisions about individualized treatment of substance use disorders. Published by Elsevier Ireland Ltd.
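The reported test characteristics can be reproduced from a 2x2 confusion matrix. A small sketch, where the cell counts are invented for illustration (only the cohort sizes of 18 relapsers and 50 abstainers come from the abstract):

```python
def likelihood_ratios(tp, fn, fp, tn):
    """Positive and negative likelihood ratios from a 2x2 confusion matrix:
    LR+ = sensitivity / (1 - specificity), LR- = (1 - sensitivity) / specificity."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity / (1.0 - specificity), (1.0 - sensitivity) / specificity

# Hypothetical counts for a relapse classifier (18 relapsers, 50 abstainers).
lr_pos, lr_neg = likelihood_ratios(tp=12, fn=6, fp=13, tn=37)
print(round(lr_pos, 2), round(lr_neg, 2))
```

An LR+ above 1 means a positive test raises the odds of relapse; an LR- below 1 means a negative test lowers them.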

  8. Robust resolution enhancement optimization methods to process variations based on vector imaging model

    NASA Astrophysics Data System (ADS)

    Ma, Xu; Li, Yanqiu; Guo, Xuejia; Dong, Lisong

    2012-03-01

    Optical proximity correction (OPC) and phase shifting mask (PSM) are the most widely used resolution enhancement techniques (RET) in the semiconductor industry. Recently, a set of OPC and PSM optimization algorithms have been developed to solve the inverse lithography problem; however, these algorithms are designed only for the nominal imaging parameters, without giving sufficient attention to the process variations due to aberrations, defocus and dose variation. Yet the effects of process variations in practical optical lithography systems become more pronounced as the critical dimension (CD) continues to shrink. On the other hand, lithography systems with larger NA (NA>0.6) are now extensively used, rendering scalar imaging models inadequate to describe the vector nature of the electromagnetic field in current optical lithography systems. To tackle these problems, this paper focuses on developing gradient-based OPC and PSM optimization algorithms that are robust to process variations under a vector imaging model. To achieve this goal, an integrative and analytic vector imaging model is applied to formulate the optimization problem, with the effects of process variations explicitly incorporated into the optimization framework. The steepest descent algorithm is used to optimize the mask iteratively. To improve the efficiency of the proposed algorithms, a set of algorithm acceleration techniques (AAT) are exploited during the optimization procedure.
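The steepest-descent mask optimization can be illustrated on a drastically simplified surrogate. The sketch below replaces the vector imaging model with a hypothetical linear operator and incorporates a process variation by averaging the cost over a nominal and a perturbed condition; it shows the structure of the update loop, not the actual lithography model:

```python
import numpy as np

# Toy stand-in for inverse lithography: minimize a weighted cost over a
# nominal and a perturbed (e.g. defocused) imaging condition,
#   F(m) = sum_k w_k * ||A_k m - z||^2,
# where the A_k are hypothetical linear operators and z is the target pattern.
rng = np.random.default_rng(1)
A_nominal = rng.normal(size=(20, 20)) / 5.0
A_perturbed = A_nominal + rng.normal(size=(20, 20)) / 50.0
z = rng.normal(size=20)
conditions = [(0.7, A_nominal), (0.3, A_perturbed)]

def cost(m):
    return sum(w * np.sum((A @ m - z) ** 2) for w, A in conditions)

m = np.zeros(20)
step = 0.05  # small enough for monotone descent on this quadratic cost
for _ in range(2000):
    grad = sum(2.0 * w * A.T @ (A @ m - z) for w, A in conditions)
    m -= step * grad  # steepest descent update

print(cost(m) < cost(np.zeros(20)))  # the averaged cost has decreased
```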

  9. Robust Alternatives to the Standard Deviation in Processing of Physics Experimental Data

    NASA Astrophysics Data System (ADS)

    Shulenin, V. P.

    2016-10-01

    Properties of robust estimators of the scale parameter are studied. It is noted that the median of absolute deviations and the modified estimator of the average Gini difference have asymptotically normal distributions and bounded influence functions and are B-robust estimators, and hence, unlike the standard deviation, are protected from the presence of outliers in the sample. Results of a comparison of scale-parameter estimators are given for a Gaussian model with contamination. An adaptive variant of the modified estimator of the average Gini difference is considered.
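The contrast between the standard deviation and the median of absolute deviations (MAD) is easy to demonstrate numerically. A sketch under a Gaussian model with a few gross outliers (the 1.4826 factor makes the MAD consistent for Gaussian data):

```python
import numpy as np

rng = np.random.default_rng(2)
clean = rng.normal(loc=0.0, scale=1.0, size=1000)
contaminated = np.concatenate([clean, [50.0, -60.0, 80.0]])  # gross outliers

def mad_scale(x):
    """Median absolute deviation, scaled for consistency at the Gaussian."""
    med = np.median(x)
    return 1.4826 * np.median(np.abs(x - med))

print(np.std(contaminated))   # inflated well above the true scale of 1
print(mad_scale(contaminated))  # stays near the true scale of 1
```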

  10. Robust estimation for ordinary differential equation models.

    PubMed

    Cao, J; Wang, L; Xu, J

    2011-12-01

    Applied scientists often like to use ordinary differential equations (ODEs) to model complex dynamic processes that arise in biology, engineering, medicine, and many other areas. It is interesting but challenging to estimate ODE parameters from noisy data, especially when the data have some outliers. We propose a robust method to address this problem. The dynamic process is represented with a nonparametric function, which is a linear combination of basis functions. The nonparametric function is estimated by a robust penalized smoothing method. The penalty term is defined with the parametric ODE model, which controls the roughness of the nonparametric function and maintains the fidelity of the nonparametric function to the ODE model. The basis coefficients and ODE parameters are estimated in two nested levels of optimization. The coefficient estimates are treated as an implicit function of ODE parameters, which enables one to derive the analytic gradients for optimization using the implicit function theorem. Simulation studies show that the robust method gives satisfactory estimates for the ODE parameters from noisy data with outliers. The robust method is demonstrated by estimating a predator-prey ODE model from real ecological data. © 2011, The International Biometric Society.
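The robust-loss idea can be shown on a toy ODE. The sketch below recovers theta in dx/dt = -theta*x by fitting the closed-form solution with a Huber loss over a parameter grid; this replaces the paper's nested penalized-smoothing scheme with a much simpler stand-in, so only the robustness-to-outliers aspect is illustrated:

```python
import numpy as np

# Toy robust ODE parameter estimation: recover theta in dx/dt = -theta * x
# from noisy data with outliers, using the closed-form solution
# x(t) = x0 * exp(-theta * t) and a Huber loss (an assumed robust loss).
rng = np.random.default_rng(3)
theta_true, x0 = 0.7, 2.0
t = np.linspace(0.0, 5.0, 40)
y = x0 * np.exp(-theta_true * t) + rng.normal(scale=0.02, size=t.size)
y[::10] += 1.5  # inject a few gross outliers

def huber(r, delta=0.1):
    """Quadratic for small residuals, linear for large ones."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

thetas = np.linspace(0.1, 2.0, 400)
losses = [huber(y - x0 * np.exp(-th * t)).sum() for th in thetas]
theta_hat = thetas[np.argmin(losses)]
print(theta_hat)  # close to theta_true despite the outliers
```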

  11. The developmental genetics of biological robustness

    PubMed Central

    Mestek Boukhibar, Lamia; Barkoulas, Michalis

    2016-01-01

    Background Living organisms are continuously confronted with perturbations, such as environmental changes that include fluctuations in temperature and nutrient availability, or genetic changes such as mutations. While some developmental systems are affected by such challenges and display variation in phenotypic traits, others continue consistently to produce invariable phenotypes despite perturbation. This ability of a living system to maintain an invariable phenotype in the face of perturbations is termed developmental robustness. Biological robustness is a phenomenon observed across phyla, and studying its mechanisms is central to deciphering the genotype–phenotype relationship. Recent work in yeast, animals and plants has shown that robustness is genetically controlled and has started to reveal the underlying mechanisms behind it. Scope and Conclusions Studying biological robustness involves focusing on an important property of developmental traits, which is the phenotypic distribution within a population. This is often neglected because the vast majority of developmental biology studies instead focus on population aggregates, such as trait averages. By drawing on findings in animals and yeast, this Viewpoint considers how studies on plant developmental robustness may benefit from strict definitions of what is the developmental system of choice and what is the relevant perturbation, and also from clear distinctions between gene effects on the trait mean and the trait variance. Recent advances in quantitative developmental biology and high-throughput phenotyping now allow the design of targeted genetic screens to identify genes that amplify or restrict developmental trait variance and to study how variation propagates across different phenotypic levels in biological systems. The molecular characterization of more quantitative trait loci affecting trait variance will provide further insights into the evolution of genes modulating developmental robustness. The

  12. Optimization of Tape Winding Process Parameters to Enhance the Performance of Solid Rocket Nozzle Throat Back Up Liners using Taguchi's Robust Design Methodology

    NASA Astrophysics Data System (ADS)

    Nath, Nayani Kishore

    2017-08-01

    Throat back up liners are used to protect the nozzle structural members from the severe thermal environment in solid rocket nozzles. The throat back up liners are made with E-glass phenolic prepregs by the tape winding process. The objective of this work is to demonstrate the optimization of process parameters of the tape winding process to achieve better insulative resistance using Taguchi's robust design methodology. In this method, four control factors (machine speed, roller pressure, tape tension, and tape temperature) were investigated for the tape winding process. The presented work studies the cogency and acceptability of Taguchi's methodology in the manufacturing of throat back up liners. The quality characteristic identified was back wall temperature. Experiments were carried out using an L9 (3^4) orthogonal array with three levels of each of the four control factors. The test results were analyzed using the smaller-the-better criterion for the signal-to-noise ratio in order to optimize the process. The experimental results were analyzed, confirmed, and successfully used to achieve the minimum back wall temperature of the throat back up liners. The enhancement in performance of the throat back up liners was observed by carrying out oxy-acetylene tests. The influence of back wall temperature on the performance of the throat back up liners was verified by a ground firing test.
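The smaller-the-better signal-to-noise ratio used in such analyses is S/N = -10 * log10(mean(y^2)). A sketch with hypothetical back-wall temperature readings for two trial settings:

```python
import math

def sn_smaller_the_better(observations):
    """Taguchi signal-to-noise ratio for a smaller-the-better characteristic:
    S/N = -10 * log10(mean(y_i^2))."""
    mean_sq = sum(y * y for y in observations) / len(observations)
    return -10.0 * math.log10(mean_sq)

# Hypothetical back-wall temperatures (deg C) for two L9 trial settings;
# repeated readings per trial. The setting with the higher S/N is preferred.
trial_a = [110.0, 115.0, 112.0]
trial_b = [95.0, 98.0, 96.0]
print(sn_smaller_the_better(trial_a))
print(sn_smaller_the_better(trial_b))  # higher (less negative): preferred
```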

  13. Network Robustness: the whole story

    NASA Astrophysics Data System (ADS)

    Longjas, A.; Tejedor, A.; Zaliapin, I. V.; Ambroj, S.; Foufoula-Georgiou, E.

    2014-12-01

    A multitude of actual processes operating on hydrological networks may exhibit binary outcomes such as clean streams in a river network that may become contaminated. These binary outcomes can be modeled by node removal processes (attacks) acting in a network. Network robustness against attacks has been widely studied in fields as diverse as the Internet, power grids and human societies. However, the current definition of robustness only accounts for the connectivity of the nodes unaffected by the attack. Here, we put forward the idea that the connectivity of the affected nodes can play a crucial role in proper evaluation of the overall network robustness and its future recovery from the attack. Specifically, we propose a dual perspective approach wherein at any instant in the network evolution under attack, two distinct networks are defined: (i) the Active Network (AN) composed of the unaffected nodes and (ii) the Idle Network (IN) composed of the affected nodes. The proposed robustness metric considers both the efficiency of destroying the AN and the efficiency of building-up the IN. This approach is motivated by concrete applied problems, since, for example, if we study the dynamics of contamination in river systems, it is necessary to know both the connectivity of the healthy and contaminated parts of the river to assess its ecological functionality. We show that trade-offs between the efficiency of the Active and Idle network dynamics give rise to surprising crossovers and re-ranking of different attack strategies, pointing to significant implications for decision making.
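The dual Active/Idle Network perspective can be sketched with a toy attack on a path-shaped network, computing the giant-component fraction of each induced subgraph (the specific robustness metric combining the two efficiencies is not reproduced here):

```python
from collections import deque

def giant_component_fraction(nodes, edges):
    """Fraction of `nodes` lying in the largest connected component of the
    subgraph induced by `nodes`, found by breadth-first search."""
    nodes = set(nodes)
    if not nodes:
        return 0.0
    adj = {n: set() for n in nodes}
    for u, v in edges:
        if u in nodes and v in nodes:
            adj[u].add(v)
            adj[v].add(u)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            n = queue.popleft()
            size += 1
            for m in adj[n]:
                if m not in seen:
                    seen.add(m)
                    queue.append(m)
        best = max(best, size)
    return best / len(nodes)

# Toy river-like path network 0-1-2-3-4-5; the attack removes node 2.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
active = [0, 1, 3, 4, 5]  # unaffected nodes (AN)
idle = [2]                # affected nodes (IN)
print(giant_component_fraction(active, edges))  # AN fragments: 0.6
print(giant_component_fraction(idle, edges))    # IN is one node: 1.0
```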

  14. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so phenotype stability of biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental
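The stated phenotype robustness criterion is a simple inequality and can be checked directly; the numerical robustness values below are hypothetical:

```python
def phenotype_robust(intrinsic, genetic, environmental, network):
    """Phenotype robustness criterion from the abstract: the phenotype is
    maintained when intrinsic + genetic + environmental <= network robustness."""
    return intrinsic + genetic + environmental <= network

print(phenotype_robust(0.2, 0.3, 0.4, 1.0))  # True: margin remains
print(phenotype_robust(0.5, 0.4, 0.3, 1.0))  # False: network robustness exceeded
```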

  15. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology.

    PubMed

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so phenotype stability of biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental

  16. Robust climate policies under uncertainty: a comparison of robust decision making and info-gap methods.

    PubMed

    Hall, Jim W; Lempert, Robert J; Keller, Klaus; Hackbarth, Andrew; Mijere, Christophe; McInerney, David J

    2012-10-01

    This study compares two widely used approaches for robustness analysis of decision problems: the info-gap method originally developed by Ben-Haim and the robust decision making (RDM) approach originally developed by Lempert, Popper, and Bankes. The study uses each approach to evaluate alternative paths for climate-altering greenhouse gas emissions given the potential for nonlinear threshold responses in the climate system, significant uncertainty about such a threshold response and a variety of other key parameters, as well as the ability to learn about any threshold responses over time. Info-gap and RDM share many similarities. Both represent uncertainty as sets of multiple plausible futures, and both seek to identify robust strategies whose performance is insensitive to uncertainties. Yet they also exhibit important differences, as they arrange their analyses in different orders, treat losses and gains in different ways, and take different approaches to imprecise probabilistic information. The study finds that the two approaches reach similar but not identical policy recommendations and that their differing attributes raise important questions about their appropriate roles in decision support applications. The comparison not only improves understanding of these specific methods, it also suggests some broader insights into robustness approaches and a framework for comparing them. © 2012 RAND Corporation.

  17. A New Hybrid BFOA-PSO Optimization Technique for Decoupling and Robust Control of Two-Coupled Distillation Column Process.

    PubMed

    Abdelkarim, Noha; Mohamed, Amr E; El-Garhy, Ahmed M; Dorrah, Hassen T

    2016-01-01

    The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Mostly, such a process is to be decoupled into several input/output pairings (loops), so that a single controller can be assigned for each loop. In the frame of this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for decoupling and control schemes, which ensures robust control behavior. In this regard, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized for the minimization of summation of the integral time-weighted squared errors (ITSEs) for all control loops. This optimization technique constitutes a hybrid between two techniques, which are the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, this hybridized technique ensures low mathematical burdens and high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in the time domain behavior and robustness over the conventional PID controller.
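The objective minimized by the hybrid optimizer is a sum of integral time-weighted squared errors (ITSE), one per control loop. A sketch of the ITSE computation for a single hypothetical loop error:

```python
import numpy as np

def itse(t, e):
    """Integral time-weighted squared error, ITSE = integral of t * e(t)^2 dt,
    approximated with the trapezoid rule on sampled points."""
    f = t * e ** 2
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)))

# Hypothetical loop error decaying exponentially after a setpoint change;
# the analytic value over an infinite horizon is 1/4.
t = np.linspace(0.0, 10.0, 1001)
e = np.exp(-t)
print(itse(t, e))
```

Because the error is weighted by time, ITSE penalizes slowly decaying errors more heavily than the plain integral squared error.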

  18. Robust PI and PID design for first- and second-order processes with zeros, time-delay and structured uncertainties

    NASA Astrophysics Data System (ADS)

    Parada, M.; Sbarbaro, D.; Borges, R. A.; Peres, P. L. D.

    2017-01-01

    The use of robust design techniques such as those based on H2 and H∞ norms for tuning proportional integral (PI) and proportional integral derivative (PID) controllers has been limited to a small set of processes. This work addresses the problem by considering a wide set of possible plants, both first- and second-order continuous-time systems with time delays and zeros, leading to PI and PID controllers. The use of structured uncertainties to handle neglected dynamics makes it possible to expand the range of processes considered. The proposed approach takes into account the robustness of the controller with respect to these structured uncertainties by using the small-gain theorem. In addition, improved performance is sought through the minimisation of an upper bound on the closed-loop system H∞ norm. A Lyapunov-Krasovskii-type functional is used to obtain delay-dependent design conditions. The controller design is accomplished by means of a convex optimisation procedure formulated using linear matrix inequalities. In order to illustrate the flexibility of the approach, several examples considering recycle compensation, reduced-order controller design and a practical implementation are addressed. Numerical experiments are provided in each case to highlight the main characteristics of the proposed design method.
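The small-gain condition invoked above can be checked numerically on a frequency grid. A sketch for a hypothetical first-order closed loop M(s) = K/(tau*s + 1) with an uncertainty bound delta_max (robust stability requires delta_max * ||M||_inf < 1):

```python
import numpy as np

# Small-gain sketch: for a stable nominal loop M(s) and an uncertainty bounded
# by ||Delta|| <= delta_max, robust stability holds if delta_max * ||M||_inf < 1.
# ||M||_inf is approximated as the peak gain over a frequency grid.
K, tau = 0.8, 2.0                 # hypothetical closed-loop gain and time constant
w = np.logspace(-3, 3, 2000)      # rad/s
M = K / (tau * 1j * w + 1)        # frequency response of M(s)
m_inf = np.max(np.abs(M))         # ~= K for this low-pass loop

delta_max = 1.0
print(delta_max * m_inf < 1)      # True: small-gain condition satisfied
```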

  19. A New Hybrid BFOA-PSO Optimization Technique for Decoupling and Robust Control of Two-Coupled Distillation Column Process

    PubMed Central

    Mohamed, Amr E.; Dorrah, Hassen T.

    2016-01-01

    The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Mostly, such a process is to be decoupled into several input/output pairings (loops), so that a single controller can be assigned for each loop. In the frame of this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for decoupling and control schemes, which ensures robust control behavior. In this regard, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized for the minimization of summation of the integral time-weighted squared errors (ITSEs) for all control loops. This optimization technique constitutes a hybrid between two techniques, which are the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, this hybridized technique ensures low mathematical burdens and high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in the time domain behavior and robustness over the conventional PID controller. PMID:27807444

  20. Emergence of robustness in networks of networks

    NASA Astrophysics Data System (ADS)

    Roth, Kevin; Morone, Flaviano; Min, Byungjoon; Makse, Hernán A.

    2017-06-01

    A model of interdependent networks of networks (NONs) was introduced recently [Proc. Natl. Acad. Sci. (USA) 114, 3849 (2017), 10.1073/pnas.1620808114] in the context of brain activation to identify the neural collective influencers in the brain NON. Here we investigate the emergence of robustness in such a model, and we develop an approach to derive an exact expression for the random percolation transition in Erdős-Rényi NONs of this kind. Analytical calculations are in agreement with numerical simulations, and highlight the robustness of the NON against random node failures, which thus presents a new robust universality class of NONs. The key aspect of this robust NON model is that a node can be activated even if it does not belong to the giant mutually connected component, thus allowing the NON to be built from below the percolation threshold, which is not possible in previous models of interdependent networks. Interestingly, the phase diagram of the model unveils particular patterns of interconnectivity for which the NON is most vulnerable, thereby marking the boundary above which the robustness of the system improves with increasing dependency connections.

  1. A model to assess the Mars Telecommunications Network relay robustness

    NASA Technical Reports Server (NTRS)

    Girerd, Andre R.; Meshkat, Leila; Edwards, Charles D., Jr.; Lee, Charles H.

    2005-01-01

    The relatively long mission durations and compatible radio protocols of current and projected Mars orbiters have enabled the gradual development of a heterogeneous constellation providing proximity communication services for surface assets. The current and forecasted capability of this evolving network has reached the point that designers of future surface missions consider complete dependence on it. Such designers, along with those architecting network requirements, have a need to understand the robustness of projected communication service. A model has been created to identify the robustness of the Mars Network as a function of surface location and time. Due to the decade-plus time horizon considered, the network will evolve, with emerging productive nodes and nodes that cease or fail to contribute. The model is a flexible framework to holistically process node information into measures of capability robustness that can be visualized for maximum understanding. Outputs from JPL's Telecom Orbit Analysis Simulation Tool (TOAST) provide global telecom performance parameters for current and projected orbiters. Probabilistic estimates of orbiter fuel life are derived from orbit keeping burn rates, forecasted maneuver tasking, and anomaly resolution budgets. Orbiter reliability is estimated probabilistically. A flexible scheduling framework accommodates the projected mission queue as well as potential alterations.

  2. Robust geostatistical analysis of spatial data

    NASA Astrophysics Data System (ADS)

    Papritz, Andreas; Künsch, Hans Rudolf; Schwierz, Cornelia; Stahel, Werner A.

    2013-04-01

    Most of the geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are rather the rule than the exception, in particular in environmental data sets. Outliers affect the modelling of the large-scale spatial trend, the estimation of the spatial dependence of the residual variation and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of estimating equations for the Gaussian REML estimation (Welsh and Richardson, 1997). Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled and non-sampled locations, and kriging variances. Apart from presenting our modelling framework, we shall present selected simulation results by which we explored the properties of the new method. This will be complemented by an analysis of a data set on heavy metal contamination of the soil in the vicinity of a metal smelter.
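The distinction between a classical and a robust sample variogram can be illustrated on a 1-D transect with one gross outlier; the sketch below uses the Cressie-Hawkins estimator as the robust alternative (the robust REML method itself is considerably more involved):

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.arange(100, dtype=float)                  # 1-D transect coordinates
z = np.cumsum(rng.normal(scale=0.3, size=100))   # spatially correlated values
z[50] += 20.0                                    # one gross outlier

def pair_diffs(x, z, lag, tol=0.5):
    """Value differences for all pairs whose separation is within tol of lag."""
    d = np.abs(x[:, None] - x[None, :])
    mask = np.triu(np.abs(d - lag) <= tol, k=1)
    return (z[:, None] - z[None, :])[mask]

def matheron(diffs):
    """Classical variogram estimator: gamma = mean(diff^2) / 2."""
    return np.mean(diffs ** 2) / 2.0

def cressie_hawkins(diffs):
    """Robust variogram estimator of Cressie and Hawkins:
    2*gamma = (mean |diff|^(1/2))^4 / (0.457 + 0.494/N)."""
    n = diffs.size
    return (np.mean(np.sqrt(np.abs(diffs))) ** 4) / (0.457 + 0.494 / n) / 2.0

diffs = pair_diffs(x, z, lag=1.0)
print(cressie_hawkins(diffs) < matheron(diffs))  # robust estimate resists the outlier
```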

  3. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-off on Phenotype Robustness in Biological Networks Part I: Gene Regulatory Networks in Systems and Evolutionary Biology

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be enough to confer intrinsic robustness in order to tolerate intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey on biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was discovered that the phenotype robustness criterion for biological networks at different levels relies upon intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this is true, the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through their corresponding phenotype robustness criterion from the systematic point of view. PMID:23515240

  4. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-off on Phenotype Robustness in Biological Networks Part I: Gene Regulatory Networks in Systems and Evolutionary Biology.

    PubMed

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be enough to confer intrinsic robustness in order to tolerate intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey on biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was discovered that the phenotype robustness criterion for biological networks at different levels relies upon intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this is true, the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through their corresponding phenotype robustness criterion from the systematic point of view.

  5. Improved Signal Processing Technique Leads to More Robust Self Diagnostic Accelerometer System

    NASA Technical Reports Server (NTRS)

    Tokars, Roger; Lekki, John; Jaros, Dave; Riggs, Terrence; Evans, Kenneth P.

    2010-01-01

    The self diagnostic accelerometer (SDA) is a sensor system designed to actively monitor the health of an accelerometer. In this case an accelerometer is considered healthy if it can be determined that it is operating correctly and its measurements may be relied upon. The SDA system accomplishes this by actively monitoring the accelerometer for a variety of failure conditions including accelerometer structural damage, an electrical open circuit, and most importantly accelerometer detachment. In recent testing of the SDA system in emulated engine operating conditions it has been found that a more robust signal processing technique was necessary. An improved accelerometer diagnostic technique and test results of the SDA system utilizing this technique are presented here. Furthermore, the real time, autonomous capability of the SDA system to concurrently compensate for effects from real operating conditions such as temperature changes and mechanical noise, while monitoring the condition of the accelerometer health and attachment, will be demonstrated.

  6. Identifying robust regional precipitation responses to regional aerosol emissions perturbations in three coupled chemistry-climate models

    NASA Astrophysics Data System (ADS)

    Westervelt, D. M.; Fiore, A. M.; Lamarque, J. F.; Previdi, M. J.; Conley, A. J.; Shindell, D. T.; Mascioli, N. R.; Correa, G. J. P.; Faluvegi, G.; Horowitz, L. W.

    2017-12-01

    Regional emissions of anthropogenic aerosols and their precursors will likely decrease for the remainder of the 21st century, due to emission reduction policies enacted to protect human health. Although there is some evidence that regional climate effects of aerosols can be significant, we currently lack a robust understanding of the magnitude, spatio-temporal pattern, statistical significance, and physical processes responsible for these influences, especially for precipitation. Here, we aim to quantify systematically the precipitation response to regional changes in aerosols and investigate underlying mechanisms using three fully coupled chemistry-climate models: NOAA Geophysical Fluid Dynamics Laboratory Coupled Model 3 (GFDL-CM3), NCAR Community Earth System Model (CESM), and NASA Goddard Institute for Space Studies ModelE2 (GISS-E2). The central approach we use is to contrast a long control experiment (400 years, run with perpetual year 2000 emissions) with 14 individual aerosol emissions perturbation experiments (approximately 200 years each). We perturb emissions of sulfur dioxide (SO2) and carbonaceous aerosol (BC and OM) within several world regions and assess which responses are significant relative to internal variability determined by the control run and robust across the three models. Initial results show significant changes in precipitation in several vulnerable regions including the Western Sahel and the Indian subcontinent. SO2 emissions reductions from Europe and the United States have the largest impact on precipitation among most of the selected response regions. The precipitation response to emissions changes from these regions projects onto known modes of variability, such as the North Atlantic Oscillation (NAO) and the El Niño Southern Oscillation (ENSO).
Across all perturbation experiments, we find a strong linear relationship between the responses of Sahel precipitation and the interhemispheric temperature difference, suggesting a common mechanism of an

  7. A Penalized Robust Method for Identifying Gene-Environment Interactions

    PubMed Central

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Xie, Yang; Ma, Shuangge

    2015-01-01

    In high-throughput studies, an important objective is to identify gene-environment interactions associated with disease outcomes and phenotypes. Many commonly adopted methods assume specific parametric or semiparametric models, which may be subject to model mis-specification. In addition, they usually use significance level as the criterion for selecting important interactions. In this study, we adopt the rank-based estimation, which is much less sensitive to model specification than some of the existing methods and includes several commonly encountered data and models as special cases. Penalization is adopted for the identification of gene-environment interactions. It achieves simultaneous estimation and identification and does not rely on significance level. For computation feasibility, a smoothed rank estimation is further proposed. Simulation shows that under certain scenarios, for example with contaminated or heavy-tailed data, the proposed method can significantly outperform the existing alternatives with more accurate identification. We analyze a lung cancer prognosis study with gene expression measurements under the AFT (accelerated failure time) model. The proposed method identifies interactions different from those using the alternatives. Some of the identified genes have important implications. PMID:24616063

  8. Multi-wavelength approach towards on-product overlay accuracy and robustness

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Kaustuve; Noot, Marc; Chang, Hammer; Liao, Sax; Chang, Ken; Gosali, Benny; Su, Eason; Wang, Cathy; den Boef, Arie; Fouquet, Christophe; Huang, Guo-Tsai; Chen, Kai-Hsiung; Cheng, Kevin; Lin, John

    2018-03-01

    The success of the diffraction-based overlay (DBO) technique [1,4,5] in the industry rests not only on its good precision and low tool-induced shift, but also on the measurement accuracy [2] and robustness that DBO can provide. Significant effort is being put in to capitalize on the potential that DBO has to address measurement accuracy and robustness. The introduction of many measurement wavelength choices (continuous wavelength) in DBO is one of the key new capabilities in this area. Along with the continuous choice of wavelengths, the algorithms (fueled by swing-curve physics) on how to use these wavelengths are of high importance for a robust recipe setup that can avoid the impact of process stack variations (symmetric as well as asymmetric). All these are discussed. Moreover, another aspect of boosting measurement accuracy and robustness is discussed that deploys the capability to combine overlay measurement data from multiple wavelength measurements. The goal is to provide a method to make overlay measurements immune to process stack variations and also to report health KPIs for every measurement. By combining measurements from multiple wavelengths, a final overlay measurement is generated. The results show a significant benefit in accuracy and robustness against process stack variation. These results are supported by both measurement data and simulations from many product stacks.
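    The abstract does not spell out how the per-wavelength overlay readings are combined into the final measurement; one generic possibility is inverse-variance weighting, sketched here with made-up numbers (the function name and all values are assumptions, not the authors' algorithm):

    ```python
    def combine_overlay(measurements, variances):
        """Inverse-variance weighted combination of per-wavelength overlay
        readings (nm): wavelengths where the stack makes the signal noisy
        are down-weighted, improving robustness to stack variation."""
        weights = [1.0 / v for v in variances]
        return sum(w * m for w, m in zip(weights, measurements)) / sum(weights)

    # Overlay (nm) measured at three wavelengths, with per-wavelength
    # measurement variances (all numbers are made up).
    print(combine_overlay([1.2, 1.5, 0.9], [0.01, 0.04, 0.02]))  # ≈ 1.157 nm
    ```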

  9. Manufacturing Execution Systems: Examples of Performance Indicator and Operational Robustness Tools.

    PubMed

    Gendre, Yannick; Waridel, Gérard; Guyon, Myrtille; Demuth, Jean-François; Guelpa, Hervé; Humbert, Thierry

    Manufacturing Execution Systems (MES) are computerized systems used to measure production performance in terms of productivity, yield, and quality. The first part describes performance indicators such as overall equipment effectiveness (OEE), along with process robustness tools and statistical process control. The second part details tools that help operators maintain process robustness and control by preventing deviations from target control charts. The MES was developed by Syngenta together with CIMO for automation.
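    OEE, the headline MES performance indicator mentioned above, is conventionally defined as the product of availability, performance, and quality; a minimal sketch, where the example figures are illustrative rather than taken from the paper:

    ```python
    def oee(availability: float, performance: float, quality: float) -> float:
        """Overall equipment effectiveness: the product of its three factors,
        each expressed as a fraction in [0, 1]."""
        return availability * performance * quality

    # Example: 90% uptime, 95% of ideal rate, 98% right-first-time yield.
    print(round(oee(0.90, 0.95, 0.98), 3))  # 0.838
    ```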

  10. Application of the network robustness index to identifying critical road-network links in Chittenden County, Vermont.

    DOT National Transportation Integrated Search

    2010-06-01

    The purpose of this project is to conduct a pilot application of the Network Robustness Index (NRI) for the Chittenden County Regional Transportation Model. Using the results, improvements to the method to increase its effectiveness for more wi...

  11. Vehicle active steering control research based on two-DOF robust internal model control

    NASA Astrophysics Data System (ADS)

    Wu, Jian; Liu, Yahui; Wang, Fengbo; Bao, Chunjiang; Sun, Qun; Zhao, Youqun

    2016-07-01

    Because of a vehicle's external disturbances and model uncertainties, robust control algorithms have gained popularity in vehicle stability control. Robust control usually gives up performance in order to guarantee the robustness of the control algorithm; therefore, an improved robust internal model control (IMC) algorithm blending model tracking and internal model control is put forward for the active steering system, in order to achieve high yaw-rate tracking performance with guaranteed robustness. The proposed algorithm inherits the good model-tracking ability of IMC and guarantees robustness to model uncertainties. In order to separate the model-tracking design from the robustness design, an improved two-degree-of-freedom (2-DOF) robust internal model controller structure is derived from the standard Youla parameterization. Simulations of a double-lane-change maneuver and of crosswind disturbances are conducted to evaluate the robust control algorithm, on the basis of a nonlinear vehicle simulation model with a magic tyre model. Results show that the established 2-DOF robust IMC method has better model-tracking ability and a guaranteed level of robustness and robust performance, which can enhance vehicle stability and handling regardless of variations of the vehicle model parameters and external crosswind interference. The contradiction between performance and robustness of the active steering control algorithm is thus resolved, and higher control performance with guaranteed robustness to model uncertainties is obtained.

  12. An Ontology for Identifying Cyber Intrusion Induced Faults in Process Control Systems

    NASA Astrophysics Data System (ADS)

    Hieb, Jeffrey; Graham, James; Guan, Jian

    This paper presents an ontological framework that permits formal representations of process control systems, including elements of the process being controlled and the control system itself. A fault diagnosis algorithm based on the ontological model is also presented. The algorithm can identify traditional process elements as well as control system elements (e.g., IP network and SCADA protocol) as fault sources. When these elements are identified as a likely fault source, the possibility exists that the process fault is induced by a cyber intrusion. A laboratory-scale distillation column is used to illustrate the model and the algorithm. Coupled with a well-defined statistical process model, this fault diagnosis approach provides cyber security enhanced fault diagnosis information to plant operators and can help identify that a cyber attack is underway before a major process failure is experienced.

  13. Development of a method of robust rain gauge network optimization based on intensity-duration-frequency results

    NASA Astrophysics Data System (ADS)

    Chebbi, A.; Bargaoui, Z. K.; da Conceição Cunha, M.

    2013-10-01

    Based on rainfall intensity-duration-frequency (IDF) curves, fitted at several locations of a given area, a robust optimization approach is proposed to identify the best locations to install new rain gauges. The advantage of robust optimization is that the resulting design solutions yield networks which behave acceptably under hydrological variability. Robust optimization can overcome the problem of selecting representative rainfall events when building the optimization process. This paper reports an original approach based on Montana IDF model parameters. The latter are assumed to be geostatistical variables, and their spatial interdependence is taken into account through the adoption of cross-variograms in the kriging process. The problem of optimally locating a fixed number of new monitoring stations based on an existing rain gauge network is addressed. The objective function is based on the mean spatial kriging variance and rainfall variogram structure, using a variance-reduction method. Hydrological variability was taken into account by considering and implementing several return periods to define the robust objective function. Variance minimization is performed using a simulated annealing algorithm. In addition, knowledge of the time horizon is needed for the computation of the robust objective function. A short- and a long-term horizon were studied, and optimal networks are identified for each. The method developed is applied to northern Tunisia (area = 21,000 km²). Data inputs for the variogram analysis were IDF curves provided by the hydrological bureau and available for 14 tipping-bucket rain gauges. The recording period was from 1962 to 2001, depending on the station. The study concerns an imaginary network augmentation based on the network configuration in 1973, which is a very significant year in Tunisia because there was an exceptional regional flood event in March 1973. This network consisted of 13 stations and did not meet World Meteorological
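    The paper's objective function requires a full kriging model, but the simulated annealing loop itself can be sketched generically; below, a toy spacing objective stands in for the mean spatial kriging variance, and all names and numbers are illustrative assumptions:

    ```python
    import math
    import random

    def anneal(candidates, k_new, objective, t0=1.0, cooling=0.995,
               steps=2000, seed=0):
        """Simulated annealing over subsets of candidate sites: choose k_new
        locations minimizing `objective` (a stand-in here for the mean
        spatial kriging variance used in the paper)."""
        rng = random.Random(seed)
        current = rng.sample(candidates, k_new)
        best, best_cost = list(current), objective(current)
        cost, t = best_cost, t0
        for _ in range(steps):
            # Neighbour move: swap one selected site for an unselected one.
            proposal = list(current)
            proposal[rng.randrange(k_new)] = rng.choice(
                [c for c in candidates if c not in current])
            new_cost = objective(proposal)
            # Accept improvements always; accept worse moves with a
            # temperature-dependent probability to escape local minima.
            if new_cost < cost or rng.random() < math.exp((cost - new_cost) / t):
                current, cost = proposal, new_cost
                if cost < best_cost:
                    best, best_cost = list(current), cost
            t *= cooling
        return best, best_cost

    # Toy problem: spread 4 gauges along a line of 20 candidate sites
    # (a real application would evaluate the kriging variance instead).
    sites = [(x, 0.0) for x in range(20)]
    spread = lambda sel: -min(abs(a[0] - b[0]) for a in sel for b in sel if a != b)
    chosen, cost = anneal(sites, 4, spread)
    print(sorted(chosen), cost)
    ```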

  14. Robust Crossfeed Design for Hovering Rotorcraft

    NASA Technical Reports Server (NTRS)

    Catapang, David R.

    1993-01-01

    Control law design for rotorcraft fly-by-wire systems normally attempts to decouple angular responses using fixed-gain crossfeeds. This approach can lead to poor decoupling over the frequency range of pilot inputs and increase the load on the feedback loops. In order to improve the decoupling performance, dynamic crossfeeds may be adopted. Moreover, because of the large changes that occur in rotorcraft dynamics due to small changes about the nominal design condition, especially for near-hovering flight, the crossfeed design must be 'robust'. A new low-order matching method is presented here to design robust crossfeed compensators for multi-input, multi-output (MIMO) systems. The technique identifies degrees-of-freedom that can be decoupled using crossfeeds, given an anticipated set of parameter variations for the range of flight conditions of concern. Cross-coupling is then reduced for degrees-of-freedom that can use crossfeed compensation by minimizing off-axis response magnitude average and variance. Results are presented for the analysis of pitch, roll, yaw and heave coupling of the UH-60 Black Hawk helicopter in near-hovering flight. Robust crossfeeds are designed that show significant improvement in decoupling performance and robustness over nominal, single design point, compensators. The design method and results are presented in an easily used graphical format that lends significant physical insight to the design procedure. This plant pre-compensation technique is an appropriate preliminary step to the design of robust feedback control laws for rotorcraft.

  15. Kinetic Energy Transfer Process in a Double Shell Leading to Robust Burn

    NASA Astrophysics Data System (ADS)

    Montgomery, D. S.; Daughton, W. S.; Albright, B. J.; Wilson, D. C.; Loomis, E. N.; Merritt, E. C.; Dodd, E. S.; Kirkpatrick, R. C.; Watt, R. G.; Rosen, M. D.

    2017-10-01

    A goal of double shell capsule implosions is to impart sufficient internal energy to the D-T fuel at stagnation in order to obtain robust α-heating and burn with low hot spot convergence, C.R. < 10. A simple description of the kinetic energy transfer from the outer shell to the inner shell is found using shock physics and adiabatic compression, and compares well with 1D modeling. An isobaric model for the stagnation phase of the inner shell is used to determine the ideal partition of internal energy in the D-T fuel. Robust burn of the fuel requires, at minimum, that α-heating exceeds the rate of cooling by expansion of the hot spot so that the yield occurs before the hot spot disassembles, which is then used to define a minimum requirement for robust burn. One potential advantage of a double shell capsule compared to single shell capsules is the use of a heavy metal pusher, which may lead to a longer hot spot disassembly time. We present these analytic results and compare them to 1D and 2D radiation-hydrodynamic simulations. Work performed under the auspices of DOE by LANL under contract DE-AC52-06NA25396.

  16. A robust nonlinear filter for image restoration.

    PubMed

    Koivunen, V

    1995-01-01

    A class of nonlinear regression filters based on robust estimation theory is introduced. The goal of the filtering is to recover a high-quality image from degraded observations. Models for desired image structures and contaminating processes are employed, but deviations from strict assumptions are allowed since the assumptions on signal and noise are typically only approximately true. The robustness of filters is usually addressed only in a distributional sense, i.e., the actual error distribution deviates from the nominal one. In this paper, the robustness is considered in a broad sense since the outliers may also be due to inappropriate signal model, or there may be more than one statistical population present in the processing window, causing biased estimates. Two filtering algorithms minimizing a least trimmed squares criterion are provided. The design of the filters is simple since no scale parameters or context-dependent threshold values are required. Experimental results using both real and simulated data are presented. The filters effectively attenuate both impulsive and nonimpulsive noise while recovering the signal structure and preserving interesting details.
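    A minimal sketch of a least-trimmed-squares location filter in the spirit described; the trimming fraction is fixed to a majority of each window, and these details are assumptions for illustration, not the paper's exact algorithms:

    ```python
    def lts_location(window):
        """Least-trimmed-squares location estimate: the mean of the h-point
        contiguous run (h = a majority of the window, taken in sorted order)
        with the smallest sum of squared residuals about its own mean."""
        xs = sorted(window)
        h = len(xs) // 2 + 1          # trim everything beyond a majority
        best_mean, best_cost = xs[0], float("inf")
        for i in range(len(xs) - h + 1):
            run = xs[i:i + h]
            m = sum(run) / h
            cost = sum((x - m) ** 2 for x in run)
            if cost < best_cost:
                best_mean, best_cost = m, cost
        return best_mean

    def lts_filter(signal, width=5):
        """Slide a window across the signal, replacing each sample by the
        LTS location of its neighbourhood; impulsive outliers land in the
        trimmed residuals and drop out."""
        half = width // 2
        return [lts_location(signal[max(0, i - half):i + half + 1])
                for i in range(len(signal))]

    noisy = [1.0, 1.1, 9.0, 1.0, 0.9, 1.05, 1.0]   # impulse at index 2
    print(lts_filter(noisy))
    ```

    Note that, as the abstract emphasizes, no scale parameter or threshold is needed: the trimming alone rejects the impulse.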

  17. Wavelet Filtering to Reduce Conservatism in Aeroservoelastic Robust Stability Margins

    NASA Technical Reports Server (NTRS)

    Brenner, Marty; Lind, Rick

    1998-01-01

    Wavelet analysis for filtering and system identification was used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins was reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data was used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability were also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrated improved robust stability prediction by extension of the stability boundary beyond the flight regime.

  18. Efficient and mechanically robust stretchable organic light-emitting devices by a laser-programmable buckling process

    PubMed Central

    Yin, Da; Feng, Jing; Ma, Rui; Liu, Yue-Feng; Zhang, Yong-Lai; Zhang, Xu-Lin; Bi, Yan-Gang; Chen, Qi-Dai; Sun, Hong-Bo

    2016-01-01

    Stretchable organic light-emitting devices are becoming increasingly important in the fast-growing fields of wearable displays, biomedical devices and health-monitoring technology. Although highly stretchable devices have been demonstrated, their luminous efficiency and mechanical stability remain impractical for the purposes of real-life applications. This is due to significant challenges arising from the high strain-induced limitations on the structure design of the device, the materials used and the difficulty of controlling the stretch-release process. Here we have developed a laser-programmable buckling process to overcome these obstacles and realize a highly stretchable organic light-emitting diode with unprecedented efficiency and mechanical robustness. The strained device's luminous efficiency, 70 cd A−1 under 70% strain, is the largest to date, and the device can accommodate 100% strain while exhibiting only small fluctuations in performance over 15,000 stretch-release cycles. This work paves the way towards fully stretchable organic light-emitting diodes that can be used in wearable electronic devices. PMID:27187936

  19. Application of a tablet film coating model to define a process-imposed transition boundary for robust film coating.

    PubMed

    van den Ban, Sander; Pitt, Kendal G; Whiteman, Marshall

    2018-02-01

    A scientific understanding of interaction of product, film coat, film coating process, and equipment is important to enable design and operation of industrial scale pharmaceutical film coating processes that are robust and provide the level of control required to consistently deliver quality film coated product. Thermodynamic film coating conditions provided in the tablet film coating process impact film coat formation and subsequent product quality. A thermodynamic film coating model was used to evaluate film coating process performance over a wide range of film coating equipment from pilot to industrial scale (2.5-400 kg). An approximate process-imposed transition boundary, from operating in a dry to a wet environment, was derived, for relative humidity and exhaust temperature, and used to understand the impact of the film coating process on product formulation and process control requirements. This approximate transition boundary may aid in an enhanced understanding of risk to product quality, application of modern Quality by Design (QbD) based product development, technology transfer and scale-up, and support the science-based justification of critical process parameters (CPPs).

  20. A hybrid approach identifies metabolic signatures of high-producers for Chinese hamster ovary clone selection and process optimization.

    PubMed

    Popp, Oliver; Müller, Dirk; Didzus, Katharina; Paul, Wolfgang; Lipsmeier, Florian; Kirchner, Florian; Niklas, Jens; Mauch, Klaus; Beaucamp, Nicola

    2016-09-01

    In-depth characterization of high-producer cell lines and bioprocesses is vital to ensure robust and consistent production of recombinant therapeutic proteins in high quantity and quality for clinical applications. This requires applying appropriate methods during bioprocess development to enable meaningful characterization of CHO clones and processes. Here, we present a novel hybrid approach for supporting comprehensive characterization of metabolic clone performance. The approach combines metabolite profiling with multivariate data analysis and fluxomics to enable a data-driven mechanistic analysis of key metabolic traits associated with desired cell phenotypes. We applied the methodology to quantify and compare metabolic performance in a set of 10 recombinant CHO-K1 producer clones and a host cell line. The comprehensive characterization enabled us to derive an extended set of clone performance criteria that not only captured growth and product formation, but also incorporated information on intracellular clone physiology and on metabolic changes during the process. These criteria served to establish a quantitative clone ranking and allowed us to identify metabolic differences between high-producing CHO-K1 clones yielding comparably high product titers. Through multivariate data analysis of the combined metabolite and flux data we uncovered common metabolic traits characteristic of high-producer clones in the screening setup. This included high intracellular rates of glutamine synthesis, low cysteine uptake, reduced excretion of aspartate and glutamate, and low intracellular degradation rates of branched-chain amino acids and of histidine. Finally, the above approach was integrated into a workflow that enables standardized high-content selection of CHO producer clones in a high-throughput fashion. In conclusion, the combination of quantitative metabolite profiling, multivariate data analysis, and mechanistic network model simulations can identify metabolic

  1. Green-Solvent-Processable, Dopant-Free Hole-Transporting Materials for Robust and Efficient Perovskite Solar Cells.

    PubMed

    Lee, Junwoo; Malekshahi Byranvand, Mahdi; Kang, Gyeongho; Son, Sung Y; Song, Seulki; Kim, Guan-Woo; Park, Taiho

    2017-09-06

    In addition to having proper energy levels and high hole mobility (μh) without the use of dopants, hole-transporting materials (HTMs) used in n-i-p-type perovskite solar cells (PSCs) should be processed using green solvents to enable environmentally friendly device fabrication. Although many HTMs have been assessed, due to the limited solubility of HTMs in green solvents, no green-solvent-processable HTM has been reported to date. Here, we report on a green-solvent-processable HTM, an asymmetric D-A polymer (asy-PBTBDT) that exhibits superior solubility even in the green solvent 2-methylanisole, which is a known food additive. The new HTM is well matched with perovskites in terms of energy levels and attains a high μh (1.13 × 10⁻³ cm²/(V s)) even without the use of dopants. Using this HTM, we produced robust PSCs with 18.3% efficiency (91% retention after 30 days without encapsulation under 50%-75% relative humidity) without dopants; with dopants (bis(trifluoromethanesulfonyl)imide and tert-butylpyridine), a 20.0% efficiency was achieved. This is the first report of a green-solvent-processable hole-transporting polymer, and it exhibits the highest efficiencies reported so far for n-i-p devices with and without dopants.

  2. Metabolic Control in Mammalian Fed-Batch Cell Cultures for Reduced Lactic Acid Accumulation and Improved Process Robustness

    PubMed Central

    Konakovsky, Viktor; Clemens, Christoph; Müller, Markus Michael; Bechmann, Jan; Berger, Martina; Schlatter, Stefan; Herwig, Christoph

    2016-01-01

    Biomass and cell-specific metabolic rates usually change dynamically over time, making the “feed according to need” strategy difficult to realize in a commercial fed-batch process. We here demonstrate a novel feeding strategy which is designed to hold a particular metabolic state in a fed-batch process by adaptive feeding in real time. The feed rate is calculated with a transferable biomass model based on capacitance, which changes the nutrient flow stoichiometrically in real time. A limited glucose environment was used to confine the cell in a particular metabolic state. In order to cope with uncertainty, two strategies were tested to change the adaptive feed rate and prevent starvation while in limitation: (i) inline pH and online glucose concentration measurement or (ii) inline pH alone, which was shown to be sufficient for the problem statement. In this contribution, we achieved metabolic control within a defined target range. The direct benefit was two-fold: the lactic acid profile was improved and pH could be kept stable. Multivariate Data Analysis (MVDA) has shown that pH influenced lactic acid production or consumption in historical data sets. We demonstrate that a low pH (around 6.8) is not required for our strategy, as glucose availability is already limiting the flux. On the contrary, we boosted glycolytic flux in glucose limitation by setting the pH to 7.4. This new approach led to a yield of lactic acid/glucose (YL/G) around zero for the whole process time and high titers in our labs. We hypothesize that a higher carbon flux, resulting from a higher pH, may lead to more cells which produce more product. The relevance of this work lies in feeding mammalian cell cultures safely in limitation with a desired metabolic flux range. This resulted in extremely stable, low glucose levels, very robust pH profiles without acid/base interventions and a metabolic state in which lactic acid was consumed instead of being produced from day 1. With this
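    The transferable capacitance-based biomass model is not given in the abstract; a heavily simplified sketch of a "feed according to need" rate calculation is shown below, where the linear probe calibration and every numeric constant are hypothetical assumptions:

    ```python
    def feed_rate_ml_h(capacitance_pf_cm, cells_per_ml_per_pf, q_glc_pg_cell_h,
                       feed_glc_g_l, volume_l):
        """Stoichiometric 'feed according to need' rate: estimate the viable
        cell density from the capacitance signal via a linear probe
        calibration, then feed just enough glucose to cover the target
        cell-specific consumption rate, keeping the culture glucose-limited."""
        viable_cells_per_ml = capacitance_pf_cm * cells_per_ml_per_pf
        # Total glucose demand: cells/mL -> cells/reactor, pg/h -> g/h.
        demand_g_h = viable_cells_per_ml * 1e3 * volume_l * q_glc_pg_cell_h * 1e-12
        return demand_g_h / feed_glc_g_l * 1000.0   # (g/h)/(g/L) -> L/h -> mL/h

    # Hypothetical numbers: 10 pF/cm signal, 1e6 cells/mL per pF/cm,
    # 2 pg glucose/cell/h, 50 g/L glucose feed, 2 L working volume.
    print(feed_rate_ml_h(10.0, 1e6, 2.0, 50.0, 2.0))  # ≈ 0.8 mL/h
    ```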

  3. Design optimization for cost and quality: The robust design approach

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1990-01-01

    Designing reliable, low cost, and operable space systems has become the key to future space operations. Designing high quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach for design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to the variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The robust design methodology uses a mathematical tool called an orthogonal array, from design of experiments theory, to study a large number of decision variables with a significantly small number of experiments. Robust design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of space system design process.
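    The signal-to-noise ratios used in Robust Design are the standard Taguchi formulas; a minimal sketch, with illustrative replicate values showing why the less noisy parameter setting is judged more robust:

    ```python
    import math

    def sn_larger_is_better(ys):
        """Taguchi signal-to-noise ratio (dB) for a larger-the-better response."""
        return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

    def sn_smaller_is_better(ys):
        """Taguchi signal-to-noise ratio (dB) for a smaller-the-better response."""
        return -10.0 * math.log10(sum(y ** 2 for y in ys) / len(ys))

    # Two parameter settings with the same mean response; the less noisy
    # setting earns the higher S/N ratio and is the more robust choice.
    print(sn_larger_is_better([19.8, 20.1, 20.0]))
    print(sn_larger_is_better([15.0, 25.0, 20.0]))
    ```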

  4. Genome-wide screen identifies a novel prognostic signature for breast cancer survival

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mao, Xuan Y.; Lee, Matthew J.; Zhu, Jeffrey

    Large genomic datasets in combination with clinical data can be used as an unbiased tool to identify genes important in patient survival and discover potential therapeutic targets. We used a genome-wide screen to identify 587 genes significantly and robustly deregulated across four independent breast cancer (BC) datasets compared to normal breast tissue. Gene expression of 381 genes was significantly associated with relapse-free survival (RFS) in BC patients. We used a gene co-expression network approach to visualize the genetic architecture in normal breast and BCs. In normal breast tissue, co-expression cliques were identified enriched for cell cycle, gene transcription, cell adhesion, cytoskeletal organization and metabolism. In contrast, in BC, only two major co-expression cliques were identified enriched for cell cycle-related processes or blood vessel development, cell adhesion and mammary gland development processes. Interestingly, gene expression levels of 7 genes were found to be negatively correlated with many cell cycle related genes, highlighting these genes as potential tumor suppressors and novel therapeutic targets. A forward-conditional Cox regression analysis was used to identify a 12-gene signature associated with RFS. A prognostic scoring system was created based on the 12-gene signature. This scoring system robustly predicted BC patient RFS in 60 sampling test sets and was further validated in TCGA and METABRIC BC data. Our integrated study identified a 12-gene prognostic signature that could guide adjuvant therapy for BC patients and includes novel potential molecular targets for therapy.

  5. Genome-wide screen identifies a novel prognostic signature for breast cancer survival

    DOE PAGES

    Mao, Xuan Y.; Lee, Matthew J.; Zhu, Jeffrey; ...

    2017-01-21

    Large genomic datasets in combination with clinical data can be used as an unbiased tool to identify genes important in patient survival and discover potential therapeutic targets. We used a genome-wide screen to identify 587 genes significantly and robustly deregulated across four independent breast cancer (BC) datasets compared to normal breast tissue. Gene expression of 381 genes was significantly associated with relapse-free survival (RFS) in BC patients. We used a gene co-expression network approach to visualize the genetic architecture in normal breast and BCs. In normal breast tissue, co-expression cliques were identified enriched for cell cycle, gene transcription, cell adhesion, cytoskeletal organization and metabolism. In contrast, in BC, only two major co-expression cliques were identified enriched for cell cycle-related processes or blood vessel development, cell adhesion and mammary gland development processes. Interestingly, gene expression levels of 7 genes were found to be negatively correlated with many cell cycle related genes, highlighting these genes as potential tumor suppressors and novel therapeutic targets. A forward-conditional Cox regression analysis was used to identify a 12-gene signature associated with RFS. A prognostic scoring system was created based on the 12-gene signature. This scoring system robustly predicted BC patient RFS in 60 sampling test sets and was further validated in TCGA and METABRIC BC data. Our integrated study identified a 12-gene prognostic signature that could guide adjuvant therapy for BC patients and includes novel potential molecular targets for therapy.

  6. Parenchymal texture analysis in digital mammography: robust texture feature identification and equivalence across devices.

    PubMed

    Keller, Brad M; Oustimov, Andrew; Wang, Yan; Chen, Jinbo; Acciavatti, Raymond J; Zheng, Yuanjie; Ray, Shonket; Gee, James C; Maidment, Andrew D A; Kontos, Despina

    2015-04-01

    An analytical framework is presented for evaluating the equivalence of parenchymal texture features across different full-field digital mammography (FFDM) systems using a physical breast phantom. Phantom images (FOR PROCESSING) are acquired from three FFDM systems using their automated exposure control setting. A panel of texture features, including gray-level histogram, co-occurrence, run length, and structural descriptors, are extracted. To identify features that are robust across imaging systems, a series of equivalence tests are performed on the feature distributions, in which the extent of their intersystem variation is compared to their intrasystem variation via the Hodges-Lehmann test statistic. Overall, histogram and structural features tend to be most robust across all systems, and certain features, such as edge enhancement, tend to be more robust to intergenerational differences between detectors of a single vendor than to intervendor differences. Texture features extracted from larger regions of interest (i.e., [Formula: see text]) and with a larger offset length (i.e., [Formula: see text]), when applicable, also appear to be more robust across imaging systems. This framework and observations from our experiments may benefit applications utilizing mammographic texture analysis on images acquired in multivendor settings, such as in multicenter studies of computer-aided detection and breast cancer risk assessment.
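
    The equivalence tests above rely on the Hodges-Lehmann statistic, which estimates the location shift between two samples as the median of all pairwise differences. A generic sketch (the feature values are made up for illustration):

```python
from itertools import product
from statistics import median

def hodges_lehmann_shift(sample_a, sample_b):
    """Hodges-Lehmann estimator of the location shift between two samples:
    the median of all pairwise differences a_i - b_j. A robust alternative
    to the difference of means, insensitive to outliers."""
    return median(a - b for a, b in product(sample_a, sample_b))

# Hypothetical texture-feature values measured on two FFDM systems
shift = hodges_lehmann_shift([1.2, 1.4, 1.1, 1.3], [1.0, 1.1, 0.9, 1.2])
```

    Comparing such intersystem shifts against the intrasystem spread is what flags a feature as robust across devices.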

  7. Best Practices for Reliable and Robust Spacecraft Structures

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Murthy, P. L. N.; Patel, Naresh R.; Bonacuse, Peter J.; Elliott, Kenny B.; Gordon, S. A.; Gyekenyesi, J. P.; Daso, E. O.; Aggarwal, P.; Tillman, R. F.

    2007-01-01

    A study was undertaken to capture the best practices for the development of reliable and robust spacecraft structures for NASA's next-generation cargo and crewed launch vehicles. In this study, the NASA heritage programs such as Mercury, Gemini, Apollo, and the Space Shuttle program were examined. A series of lessons learned during the NASA and DoD heritage programs are captured. The processes that "make the right structural system" are examined along with the processes to "make the structural system right". The impact of technology advancements in materials and analysis and testing methods on reliability and robustness of spacecraft structures is studied. The best practices and lessons learned are extracted from these studies. Since the first human space flight, the best practices for reliable and robust spacecraft structures appear to be well established, understood, and articulated by each generation of designers and engineers. However, these best practices apparently have not always been followed. When the best practices are ignored or short cuts are taken, risks accumulate, and reliability suffers. Thus program managers need to be vigilant of circumstances and situations that tend to violate best practices. Adherence to the best practices may help develop spacecraft systems with high reliability and robustness against certain anomalies and unforeseen events.

  8. Multi-point objective-oriented sequential sampling strategy for constrained robust design

    NASA Astrophysics Data System (ADS)

    Zhu, Ping; Zhang, Siliang; Chen, Wei

    2015-03-01

    Metamodelling techniques are widely used to approximate system responses of expensive simulation models. In association with the use of metamodels, objective-oriented sequential sampling methods have been demonstrated to be effective in balancing the need for searching an optimal solution versus reducing the metamodelling uncertainty. However, existing infilling criteria are developed for deterministic problems and restricted to one sampling point in one iteration. To exploit the use of multiple samples and identify the true robust solution in fewer iterations, a multi-point objective-oriented sequential sampling strategy is proposed for constrained robust design problems. In this article, earlier development of objective-oriented sequential sampling strategy for unconstrained robust design is first extended to constrained problems. Next, a double-loop multi-point sequential sampling strategy is developed. The proposed methods are validated using two mathematical examples followed by a highly nonlinear automotive crashworthiness design example. The results show that the proposed method can mitigate the effect of both metamodelling uncertainty and design uncertainty, and identify the robust design solution more efficiently than the single-point sequential sampling approach.

  9. Robust design of microchannel cooler

    NASA Astrophysics Data System (ADS)

    He, Ye; Yang, Tao; Hu, Li; Li, Leimin

    2005-12-01

    The microchannel cooler offers a new approach to cooling high-power diode lasers, with the advantages of small volume, high heat-dissipation efficiency and low cost when mass-produced. In order to reduce the sensitivity of the design to manufacturing errors and other disturbances, the Taguchi method, a robust design technique, was chosen to optimize three parameters important to the cooling performance of the roof-like microchannel cooler. The hydromechanical and thermal model of the varying-section microchannel was solved with the finite volume method in FLUENT. A special program was written to automate the design process and improve efficiency. The optimal design presented compromises between optimal cooling performance and robustness of that performance. This design method proves to be practical.
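
    The Taguchi method evaluates candidate designs through a signal-to-noise (S/N) ratio rather than the raw response. A minimal sketch of the smaller-the-better form, which would fit a response such as a cooler's thermal resistance (the response values here are illustrative):

```python
from math import log10

def sn_smaller_the_better(responses):
    """Taguchi signal-to-noise ratio for a smaller-the-better response:
    SN = -10 * log10(mean(y^2)). Maximizing SN simultaneously lowers the
    mean response and its spread across noise conditions, which is what
    makes the selected parameter levels robust to disturbances."""
    return -10.0 * log10(sum(y * y for y in responses) / len(responses))
```

    Each parameter combination in the orthogonal array gets one S/N value; the levels that maximize it are kept.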

  10. Robustness, evolvability, and the logic of genetic regulation.

    PubMed

    Payne, Joshua L; Moore, Jason H; Wagner, Andreas

    2014-01-01

    In gene regulatory circuits, the expression of individual genes is commonly modulated by a set of regulating gene products, which bind to a gene's cis-regulatory region. This region encodes an input-output function, referred to as signal-integration logic, that maps a specific combination of regulatory signals (inputs) to a particular expression state (output) of a gene. The space of all possible signal-integration functions is vast and the mapping from input to output is many-to-one: For the same set of inputs, many functions (genotypes) yield the same expression output (phenotype). Here, we exhaustively enumerate the set of signal-integration functions that yield identical gene expression patterns within a computational model of gene regulatory circuits. Our goal is to characterize the relationship between robustness and evolvability in the signal-integration space of regulatory circuits, and to understand how these properties vary between the genotypic and phenotypic scales. Among other results, we find that the distributions of genotypic robustness are skewed, so that the majority of signal-integration functions are robust to perturbation. We show that the connected set of genotypes that make up a given phenotype are constrained to specific regions of the space of all possible signal-integration functions, but that as the distance between genotypes increases, so does their capacity for unique innovations. In addition, we find that robust phenotypes are (i) evolvable, (ii) easily identified by random mutation, and (iii) mutationally biased toward other robust phenotypes. We explore the implications of these latter observations for mutation-based evolution by conducting random walks between randomly chosen source and target phenotypes. We demonstrate that the time required to identify the target phenotype is independent of the properties of the source phenotype.
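
    The many-to-one genotype-phenotype map can be illustrated with a toy Boolean sketch (a deliberate simplification, not the paper's regulatory-circuit model): a genotype is a truth table over K regulatory inputs, and the phenotype is its output restricted to the input states the circuit is assumed to visit, so truth tables that differ only on unvisited rows share a phenotype:

```python
K = 3                 # number of regulatory inputs (toy choice)
ROWS = 2 ** K         # truth-table rows; a genotype is an integer bitmask
VISITED = (0, 3, 5)   # input states assumed to occur in the circuit (toy choice)

def phenotype(genotype):
    """Expression pattern over the visited input states only."""
    return tuple((genotype >> row) & 1 for row in VISITED)

def robustness(genotype):
    """Fraction of single-bit truth-table mutations preserving the phenotype."""
    unchanged = sum(phenotype(genotype ^ (1 << row)) == phenotype(genotype)
                    for row in range(ROWS))
    return unchanged / ROWS
```

    In this toy map every mutation on an unvisited row is neutral, so each genotype's robustness is (ROWS - len(VISITED)) / ROWS; the paper's full circuit model yields the richer, skewed robustness distribution described above.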

  11. Robustness, Evolvability, and the Logic of Genetic Regulation

    PubMed Central

    Moore, Jason H.; Wagner, Andreas

    2014-01-01

    In gene regulatory circuits, the expression of individual genes is commonly modulated by a set of regulating gene products, which bind to a gene’s cis-regulatory region. This region encodes an input-output function, referred to as signal-integration logic, that maps a specific combination of regulatory signals (inputs) to a particular expression state (output) of a gene. The space of all possible signal-integration functions is vast and the mapping from input to output is many-to-one: for the same set of inputs, many functions (genotypes) yield the same expression output (phenotype). Here, we exhaustively enumerate the set of signal-integration functions that yield identical gene expression patterns within a computational model of gene regulatory circuits. Our goal is to characterize the relationship between robustness and evolvability in the signal-integration space of regulatory circuits, and to understand how these properties vary between the genotypic and phenotypic scales. Among other results, we find that the distributions of genotypic robustness are skewed, such that the majority of signal-integration functions are robust to perturbation. We show that the connected set of genotypes that make up a given phenotype are constrained to specific regions of the space of all possible signal-integration functions, but that as the distance between genotypes increases, so does their capacity for unique innovations. In addition, we find that robust phenotypes are (i) evolvable, (ii) easily identified by random mutation, and (iii) mutationally biased toward other robust phenotypes. We explore the implications of these latter observations for mutation-based evolution by conducting random walks between randomly chosen source and target phenotypes. We demonstrate that the time required to identify the target phenotype is independent of the properties of the source phenotype. PMID:23373974

  12. Robustness of near-infrared calibration models for the prediction of milk constituents during the milking process.

    PubMed

    Melfsen, Andreas; Hartung, Eberhard; Haeussermann, Angelika

    2013-02-01

    The robustness of in-line raw milk analysis with near-infrared spectroscopy (NIRS) was tested with respect to the prediction of the raw milk contents fat, protein and lactose. Near-infrared (NIR) spectra of raw milk (n = 3119) were acquired on three different farms during the milking process of 354 milkings over a period of six months. Calibration models were calculated for: a random data set of each farm (fully random internal calibration); first two thirds of the visits per farm (internal calibration); whole datasets of two of the three farms (external calibration), and combinations of external and internal datasets. Validation was done either on the remaining data set per farm (internal validation) or on data of the remaining farms (external validation). Excellent calibration results were obtained when fully randomised internal calibration sets were used for milk analysis. In this case, RPD values of around ten, five and three for the prediction of fat, protein and lactose content, respectively, were achieved. Farm internal calibrations achieved much poorer prediction results especially for the prediction of protein and lactose with RPD values of around two and one respectively. The prediction accuracy improved when validation was done on spectra of an external farm, mainly due to the higher sample variation in external calibration sets in terms of feeding diets and individual cow effects. The results showed that further improvements were achieved when additional farm information was added to the calibration set. One of the main requirements towards a robust calibration model is the ability to predict milk constituents in unknown future milk samples. The robustness and quality of prediction increases with increasing variation of, e.g., feeding and cow individual milk composition in the calibration model.
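
    The RPD values quoted here are the ratio of the reference data's standard deviation to the standard error of prediction. A minimal sketch of that metric, assuming RMSEP as the error term (the sample values are illustrative):

```python
from math import sqrt
from statistics import stdev

def rpd(reference, predicted):
    """Ratio of performance to deviation: SD of the reference values divided
    by the root-mean-square error of prediction (RMSEP). Higher is better;
    values near 10 indicate excellent calibrations, values near 2 poor ones."""
    rmsep = sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted))
                 / len(reference))
    return stdev(reference) / rmsep
```

    Because the reference SD sits in the numerator, a calibration set with little compositional variation can only achieve a high RPD by predicting very precisely, which is why the farm-internal models above score so much lower.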

  13. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models of different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
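
    A stripped-down sketch of such an index for a single process (a first-order, variance-based measure with assumed model weights; the paper's derivation covers the general multi-process case):

```python
from statistics import mean, pvariance

def process_sensitivity(outputs, weights):
    """outputs: process model name -> model outputs over its parameter samples.
    weights: process model name -> model probability (summing to 1).
    Returns the fraction of total output variance attributable to the process:
    the variance of the conditional means divided by the mixture variance,
    so both model choice and parameter spread contribute."""
    cond_mean = {m: mean(ys) for m, ys in outputs.items()}
    grand = sum(weights[m] * cond_mean[m] for m in outputs)
    between = sum(weights[m] * (cond_mean[m] - grand) ** 2 for m in outputs)
    total = sum(weights[m] * (pvariance(ys) + (cond_mean[m] - grand) ** 2)
                for m, ys in outputs.items())
    return between / total
```

    When the competing process models disagree strongly relative to their parameter spread, the index approaches 1; parameter noise within each model pulls it back toward 0.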

  14. Identifying pathogenic processes by integrating microarray data with prior knowledge

    PubMed Central

    2014-01-01

    Background: It is of great importance to identify molecular processes and pathways that are involved in disease etiology. Although there has been an extensive use of various high-throughput methods for this task, pathogenic pathways are still not completely understood. Often the set of genes or proteins identified as altered in genome-wide screens show a poor overlap with canonical disease pathways. These findings are difficult to interpret, yet crucial in order to improve the understanding of the molecular processes underlying the disease progression. We present a novel method for identifying groups of connected molecules from a set of differentially expressed genes. These groups represent functional modules sharing common cellular function and involve signaling and regulatory events. Specifically, our method makes use of Bayesian statistics to identify groups of co-regulated genes based on the microarray data, where external information about molecular interactions and connections are used as priors in the group assignments. Markov chain Monte Carlo sampling is used to search for the most reliable grouping. Results: Simulation results showed that the method improved the ability of identifying correct groups compared to traditional clustering, especially for small sample sizes. Applied to a microarray heart failure dataset the method found one large cluster with several genes important for the structure of the extracellular matrix and a smaller group with many genes involved in carbohydrate metabolism. The method was also applied to a microarray dataset on melanoma cancer patients with or without metastasis, where the main cluster was dominated by genes related to keratinocyte differentiation. Conclusion: Our method found clusters overlapping with known pathogenic processes, but also pointed to new connections extending beyond the classical pathways. PMID:24758699

  15. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  16. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  17. Neural processing of gendered information is more robustly associated with mothers' gendered communication with children than mothers' implicit and explicit gender stereotypes.

    PubMed

    Endendijk, Joyce J; Spencer, Hannah; Bos, Peter A; Derks, Belle

    2018-04-26

    Processes like gender socialization (the ways in which parents convey information to their children about how girls and boys should behave) often happen unconsciously and might therefore be studied best with neuroscientific measures. We examined whether neural processing of gender-stereotype-congruent and incongruent information is more robustly related to mothers' gendered socialization of their child than mothers' implicit and explicit gender stereotypes. To this end, we examined event-related potentials (ERPs) of mothers (N = 35) completing an implicit gender-stereotype task and mothers' gender stereotypes in relation to observed gendered communication with their child (2-6 years old) in a naturalistic picture-book-reading setting. Increased N2 activity (previously related to attentional processes) to gender stimuli in the implicit gender-stereotype task was associated with mothers' positive evaluation of similar gendered behaviors and activities in the picture book they read with their child. Increased P300 activity (previously related to attention to unexpected events) to incongruent trials in the gender-stereotype task was associated with a more positive evaluation of congruent versus incongruent pictures. Compared to mothers' gender stereotypes, neural processing of gendered information was more robustly related to how mothers talk to their children about boys' and girls' stereotype-congruent and incongruent behavior, and masculine and feminine activities.

  18. A robust real-time abnormal region detection framework from capsule endoscopy images

    NASA Astrophysics Data System (ADS)

    Cheng, Yanfen; Liu, Xu; Li, Huiping

    2009-02-01

    In this paper we present a novel method to detect abnormal regions from capsule endoscopy images. Wireless Capsule Endoscopy (WCE) is a recent technology in which a capsule with an embedded camera is swallowed by the patient to visualize the gastrointestinal tract. One challenge is that a single diagnostic procedure produces over 50,000 images, making the physicians' reviewing process expensive. That process involves identifying images containing abnormal regions (tumor, bleeding, etc.) in this large image sequence. We therefore construct a novel framework for robust, real-time abnormal region detection from large numbers of capsule endoscopy images. The detected potential abnormal regions can be labeled automatically for further physician review, reducing the overall reviewing effort. The framework has the following advantages: 1) Trainable. Users can define and label any type of abnormal region they want to find; abnormal regions, such as tumor or bleeding, can be pre-defined and labeled using the graphical user interface tool we provide. 2) Efficient. Given the large volume of image data, detection speed is very important; our system detects very efficiently at different scales thanks to the integral image features we use. 3) Robust. After feature selection we use a cascade of classifiers to further enforce detection accuracy.
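
    The efficiency claim rests on integral-image (summed-area table) features, which make any rectangular feature sum an O(1) operation regardless of scale. A generic sketch, not the authors' code:

```python
def integral_image(img):
    """Summed-area table with a one-pixel zero border:
    ii[y+1][x+1] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def region_sum(ii, top, left, bottom, right):
    """Sum over img[top..bottom][left..right] via four table lookups."""
    return (ii[bottom + 1][right + 1] - ii[top][right + 1]
            - ii[bottom + 1][left] + ii[top][left])
```

    Haar-like features are differences of such rectangle sums, so a cascade can evaluate them at any window size without rescaling the image.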

  19. The Self in Movement: Being Identified and Identifying Oneself in the Process of Migration and Asylum Seeking.

    PubMed

    Watzlawik, Meike; Brescó de Luna, Ignacio

    2017-06-01

    How migration influences the processes of identity development has been under longstanding scrutiny in the social sciences. Usually, stage models have been suggested, and different strategies for acculturation (e.g., integration, assimilation, separation, and marginalization) have been considered as ways to make sense of the psychological transformations of migrants as a group. On an individual level, however, identity development is a more complex endeavor: Identity does not just develop by itself, but is constructed as an ongoing process. To capture these processes, we will look at different aspects of migration and asylum seeking; for example, the cultural-specific values and expectations of the hosting (European) countries (e.g., as identifier), but also of the arriving individuals/groups (e.g., identified as refugees). Since the two may contradict each other, negotiations between identity claims and identity assignments become necessary. Ways to solve these contradictions are discussed, with a special focus on the experienced (and often missing) agency in different settings upon arrival in a new country. In addition, it will be shown how sudden events (e.g., 9/11, the Charlie Hebdo attack) may challenge identity processes in different ways.

  20. Oscillatory Protein Expression Dynamics Endows Stem Cells with Robust Differentiation Potential

    PubMed Central

    Kaneko, Kunihiko

    2011-01-01

    The lack of understanding of stem cell differentiation and proliferation is a fundamental problem in developmental biology. Although gene regulatory networks (GRNs) for stem cell differentiation have been partially identified, the nature of differentiation dynamics and their regulation leading to robust development remain unclear. Herein, using a dynamical system modeling cell approach, we performed simulations of the developmental process using all possible GRNs with a few genes, and screened GRNs that could generate cell type diversity through cell-cell interactions. We found that model stem cells that both proliferated and differentiated always exhibited oscillatory expression dynamics, and the differentiation frequency of such stem cells was regulated, resulting in a robust number distribution. Moreover, we uncovered the common regulatory motifs for stem cell differentiation, in which a combination of regulatory motifs that generated oscillatory expression dynamics and stabilized distinct cellular states played an essential role. These findings may explain the recently observed heterogeneity and dynamic equilibrium in cellular states of stem cells, and can be used to predict regulatory networks responsible for differentiation in stem cell systems. PMID:22073296

  1. The Robust Beauty of Ordinary Information

    ERIC Educational Resources Information Center

    Katsikopoulos, Konstantinos V.; Schooler, Lael J.; Hertwig, Ralph

    2010-01-01

    Heuristics embodying limited information search and noncompensatory processing of information can yield robust performance relative to computationally more complex models. One criticism raised against heuristics is the argument that complexity is hidden in the calculation of the cue order used to make predictions. We discuss ways to order cues…

  2. Amino acid positions subject to multiple coevolutionary constraints can be robustly identified by their eigenvector network centrality scores.

    PubMed

    Parente, Daniel J; Ray, J Christian J; Swint-Kruse, Liskin

    2015-12-01

    As proteins evolve, amino acid positions key to protein structure or function are subject to mutational constraints. These positions can be detected by analyzing sequence families for amino acid conservation or for coevolution between pairs of positions. Coevolutionary scores are usually rank-ordered and thresholded to reveal the top pairwise scores, but they also can be treated as weighted networks. Here, we used network analyses to bypass a major complication of coevolution studies: For a given sequence alignment, alternative algorithms usually identify different, top pairwise scores. We reconciled results from five commonly-used, mathematically divergent algorithms (ELSC, McBASC, OMES, SCA, and ZNMI), using the LacI/GalR and 1,6-bisphosphate aldolase protein families as models. Calculations used unthresholded coevolution scores from which column-specific properties such as sequence entropy and random noise were subtracted; "central" positions were identified by calculating various network centrality scores. When compared among algorithms, network centrality methods, particularly eigenvector centrality, showed markedly better agreement than comparisons of the top pairwise scores. Positions with large centrality scores occurred at key structural locations and/or were functionally sensitive to mutations. Further, the top central positions often differed from those with top pairwise coevolution scores: instead of a few strong scores, central positions often had multiple, moderate scores. We conclude that eigenvector centrality calculations reveal a robust evolutionary pattern of constraints, detectable by divergent algorithms, that occur at key protein locations. Finally, we discuss the fact that multiple patterns coexist in evolutionary data that, together, give rise to emergent protein functions. © 2015 Wiley Periodicals, Inc.
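
    Eigenvector centrality on a weighted coevolution network can be sketched with plain power iteration (a generic illustration, not the authors' pipeline); the identity shift below is a standard trick that prevents oscillation on bipartite graphs and leaves the eigenvectors unchanged:

```python
def eigenvector_centrality(adj, iters=200):
    """Principal eigenvector of a symmetric, non-negative weight matrix
    (list of lists), computed by power iteration on (A + I)."""
    n = len(adj)
    v = [1.0 / n] * n
    for _ in range(iters):
        # multiply by A + I: each node keeps its own score plus a
        # weight-proportional share of its neighbors' scores
        w = [v[i] + sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

    A position's score thus reflects not one strong pairwise coupling but the weighted sum of its partners' scores, which is why positions with multiple moderate coevolution scores can rank highest.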

  3. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment process currently used by acquisition program managers and decision-makers to measure technology uncertainty during critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe program technology elements' uncertainties can provide only a qualitative and non-descript estimate of the technology uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision

  4. Many-objective robust decision making for water allocation under climate change.

    PubMed

    Yan, Dan; Ludwig, Fulco; Huang, He Qing; Werners, Saskia E

    2017-12-31

    Water allocation is facing profound challenges due to climate change uncertainties. To identify adaptive water allocation strategies that are robust to climate change uncertainties, a model framework combining many-objective robust decision making and biophysical modeling is developed for large rivers. The framework was applied to the Pearl River basin (PRB), China where sufficient flow to the delta is required to reduce saltwater intrusion in the dry season. Before identifying and assessing robust water allocation plans for the future, the performance of ten state-of-the-art MOEAs (multi-objective evolutionary algorithms) is evaluated for the water allocation problem in the PRB. The Borg multi-objective evolutionary algorithm (Borg MOEA), which is a self-adaptive optimization algorithm, has the best performance during the historical periods. Therefore it is selected to generate new water allocation plans for the future (2079-2099). This study shows that robust decision making using carefully selected MOEAs can help limit saltwater intrusion in the Pearl River Delta. However, the framework could perform poorly due to larger than expected climate change impacts on water availability. Results also show that subjective design choices from the researchers and/or water managers could potentially affect the ability of the model framework, and cause the most robust water allocation plans to fail under future climate change. Developing robust allocation plans in a river basin suffering from increasing water shortage requires the researchers and water managers to well characterize future climate change of the study regions and vulnerabilities of their tools. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Geometrically robust image watermarking by sector-shaped partitioning of geometric-invariant regions.

    PubMed

    Tian, Huawei; Zhao, Yao; Ni, Rongrong; Cao, Gang

    2009-11-23

    In a feature-based geometrically robust watermarking system, it is a challenging task to detect geometric-invariant regions (GIRs) that can survive a broad range of image processing operations. Instead of the commonly used Harris detector or Mexican hat wavelet method, a more robust corner detector named multi-scale curvature product (MSCP) is adopted in this paper to extract salient features. Based on such features, disk-like GIRs are found in three steps. First, robust edge contours are extracted. Then, MSCP is utilized to detect the centers of the GIRs. Third, characteristic scale selection is performed to calculate the radius of each GIR. A novel sector-shaped partitioning method for the GIRs is designed, which divides a GIR into several sector discs with the help of the most important corner (MIC). The watermark message is then embedded bit by bit in each sector using Quantization Index Modulation (QIM). The GIRs and the divided sector discs are invariant to geometric transforms, so the watermarking method inherently has high robustness against geometric attacks. Experimental results show that the scheme has better robustness against various image processing operations, including common processing attacks, affine transforms, cropping, and the random bending attack (RBA), than previous approaches.
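
    The per-sector embedding step named above, Quantization Index Modulation, can be sketched in scalar form; the step size and host feature values below are hypothetical, not taken from the paper:

```python
import numpy as np

def qim_embed(x, bit, delta=4.0):
    """Embed one bit in a host value by quantizing to one of two
    interleaved lattices offset by delta/2 (scalar QIM)."""
    offset = bit * delta / 2.0
    return delta * np.round((x - offset) / delta) + offset

def qim_detect(y, delta=4.0):
    """Recover the bit by finding the nearer of the two lattices."""
    d0 = np.abs(y - qim_embed(y, 0, delta))
    d1 = np.abs(y - qim_embed(y, 1, delta))
    return int(d1 < d0)

# Hypothetical feature values taken from one sector disc
host = np.array([12.3, 7.8, 101.6])
marked = np.array([qim_embed(v, b) for v, b in zip(host, [1, 0, 1])])
recovered = [qim_detect(v) for v in marked]   # -> [1, 0, 1]
```

    A perturbation smaller than delta/4 leaves the detected bit unchanged, which is the source of QIM's robustness to mild distortion.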

  6. Diffusion pseudotime robustly reconstructs lineage branching.

    PubMed

    Haghverdi, Laleh; Büttner, Maren; Wolf, F Alexander; Buettner, Florian; Theis, Fabian J

    2016-10-01

    The temporal order of differentiating cells is intrinsically encoded in their single-cell expression profiles. We describe an efficient way to robustly estimate this order according to diffusion pseudotime (DPT), which measures transitions between cells using diffusion-like random walks. Our DPT software implementations make it possible to reconstruct the developmental progression of cells and identify transient or metastable states, branching decisions and differentiation endpoints.

  7. Design of a robust fuzzy controller for the arc stability of CO(2) welding process using the Taguchi method.

    PubMed

    Kim, Dongcheol; Rhee, Sehun

    2002-01-01

    CO(2) welding is a complex process. Weld quality depends on arc stability and on minimizing the effects of disturbances or changes in the operating conditions that commonly occur during welding. In order to minimize these effects, a controller can be used. In this study, a fuzzy controller was used to stabilize the arc during CO(2) welding. The input variable of the controller was the Mita index, which quantitatively estimates the arc stability that is influenced by many welding process parameters. Because the welding process is complex, a mathematical model of the Mita index was difficult to derive, so the parameter settings of the fuzzy controller had to be determined by performing actual control experiments without a mathematical model of the controlled process. As a solution, the Taguchi method was used to determine the optimal control parameter settings of the fuzzy controller, making the control performance robust and insensitive to changes in the operating conditions.

  8. Info-Gap robustness pathway method for transitioning of urban drainage systems under deep uncertainties.

    PubMed

    Zischg, Jonatan; Goncalves, Mariana L R; Bacchin, Taneha Kuzniecow; Leonhardt, Günther; Viklander, Maria; van Timmeren, Arjan; Rauch, Wolfgang; Sitzenfrei, Robert

    2017-09-01

    In the urban water cycle, there are different ways of handling stormwater runoff. Traditional systems mainly rely on underground piped, sometimes called 'gray', infrastructure. Newer, so-called 'green/blue' ambitions aim to treat and convey the runoff at the surface; such concepts are mainly based on ground infiltration and temporary storage. In this work a methodology is presented to create and compare different planning alternatives for stormwater handling on their pathways to a desired system state. Investigations are made to assess system performance and robustness when facing the deeply uncertain spatial and temporal developments in the future urban fabric, including impacts caused by climate change, urbanization, and other disruptive events, such as shifts in the network layout and interactions of 'gray' and 'green/blue' structures. With the Info-Gap robustness pathway method, three planning alternatives are evaluated to identify critical performance levels at different stages over time. This novel methodology is applied to a real case study in which a city relocation process takes place during the upcoming decades. The case study shows that hybrid systems including green infrastructures are more robust to future uncertainties than traditional network designs.

  9. Topological properties of robust biological and computational networks

    PubMed Central

    Navlakha, Saket; He, Xin; Faloutsos, Christos; Bar-Joseph, Ziv

    2014-01-01

    Network robustness is an important principle in biology and engineering. Previous studies of global networks have identified both redundancy and sparseness as topological properties used by robust networks. By focusing on molecular subnetworks, or modules, we show that module topology is tightly linked to the level of environmental variability (noise) the module expects to encounter. Modules internal to the cell that are less exposed to environmental noise are more connected and less robust than external modules. A similar design principle is used by several other biological networks. We propose a simple change to the evolutionary gene duplication model which gives rise to the rich range of module topologies observed within real networks. We apply these observations to evaluate and design communication networks that are specifically optimized for noisy or malicious environments. Joint analysis of biological and computational networks thus leads to novel algorithms and insights benefiting both fields. PMID:24789562

  10. Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares

    NASA Technical Reports Server (NTRS)

    Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.

    2012-01-01

    A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.
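
    The range-growing assessment can be mimicked numerically. A real implementation proves each range with an SOS certificate via a semidefinite solver; the toy below replaces that proof with dense sampling of a Hurwitz condition on a hypothetical second-order closed loop:

```python
import numpy as np

def requirement_ok(delta):
    """Hypothetical closed loop: s^2 + (2 + delta)s + (1 + 0.5*delta).
    A second-order polynomial is Hurwitz iff all coefficients are positive."""
    return (2.0 + delta > 0.0) and (1.0 + 0.5 * delta > 0.0)

def max_uncertainty_range(step=0.01, limit=10.0):
    """Assess a sequence of increasingly larger symmetric ranges [-d, d];
    return the largest d for which every sampled uncertainty satisfies
    the requirement (a sampling stand-in for the SOS proof)."""
    d = 0.0
    while d + step <= limit:
        trial = d + step
        if all(requirement_ok(x) for x in np.linspace(-trial, trial, 201)):
            d = trial
        else:
            break
    return d

# the requirement first fails near delta = -2, so the certified
# range tops out just under 2.0
r = max_uncertainty_range()
```
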

  11. Performance analysis of robust road sign identification

    NASA Astrophysics Data System (ADS)

    Ali, Nursabillilah M.; Mustafah, Y. M.; Rashid, N. K. A. M.

    2013-12-01

    This study describes a performance analysis of a robust system for road sign identification that incorporates two stages of different algorithms: HSV color filtering in the detection stage and PCA in the recognition stage. The proposed algorithms are able to detect the three standard sign colors, namely red, yellow, and blue. The hypothesis of the study is that road sign images can be used to detect and identify signs even in the presence of occlusions and rotational changes. PCA is a feature extraction technique that reduces dimensionality, and sign images can be easily recognized and identified by the PCA method, as it has been used in many application areas. The experimental results show that HSV filtering is robust in road sign detection, with minimum success rates of 88% and 77% for non-occluded and partially occluded images respectively. Successful recognition rates using PCA are in the range of 94-98%, with all classes recognized successfully at occlusion levels between 5% and 10%.
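
    The HSV colour-filtering stage can be sketched per pixel with the standard library's colorsys; the hue windows and saturation/value floors below are illustrative guesses, not the thresholds used in the study:

```python
import colorsys

# Hypothetical hue windows (fractions of the colour wheel) and minimum
# saturation/value used to reject washed-out, non-sign pixels.
HUE_WINDOWS = {"red": [(0.0, 0.05), (0.95, 1.0)],
               "yellow": [(0.12, 0.20)],
               "blue": [(0.55, 0.75)]}
MIN_SAT, MIN_VAL = 0.4, 0.3

def classify_pixel(r, g, b):
    """Return the sign colour class of an RGB pixel (0-255), or None."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < MIN_SAT or v < MIN_VAL:
        return None
    for name, windows in HUE_WINDOWS.items():
        if any(lo <= h <= hi for lo, hi in windows):
            return name
    return None

classify_pixel(200, 20, 30)    # -> 'red'
classify_pixel(30, 40, 180)    # -> 'blue'
```

    Working in HSV rather than RGB is what makes the filter tolerant to illumination changes: brightness shifts move value, not hue.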

  12. Robust Kalman filter design for predictive wind shear detection

    NASA Technical Reports Server (NTRS)

    Stratton, Alexander D.; Stengel, Robert F.

    1991-01-01

    Severe, low-altitude wind shear is a threat to aviation safety. Airborne sensors under development measure the radial component of wind along a line directly in front of an aircraft. In this paper, optimal estimation theory is used to define a detection algorithm to warn of hazardous wind shear from these sensors. To achieve robustness, a wind shear detection algorithm must distinguish threatening wind shear from less hazardous gustiness, despite variations in wind shear structure. This paper presents statistical analysis methods to refine wind shear detection algorithm robustness. Computational methods predict the ability to warn of severe wind shear and avoid false warning. Comparative capability of the detection algorithm as a function of its design parameters is determined, identifying designs that provide robust detection of severe wind shear.
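
    The estimation step behind such a detector can be illustrated with a scalar random-walk Kalman filter smoothing noisy radial wind samples before a threshold test; the noise levels, shear ramp, and alert threshold are hypothetical:

```python
import numpy as np

def kalman_smooth(z, q=0.01, r=0.25):
    """Scalar random-walk Kalman filter: state x_k = x_{k-1} + w_k,
    measurement z_k = x_k + v_k, with Var(w) = q and Var(v) = r."""
    x, p = z[0], 1.0
    est = [x]
    for zk in z[1:]:
        p = p + q                    # predict
        k = p / (p + r)              # Kalman gain
        x = x + k * (zk - x)         # update with the innovation
        p = (1.0 - k) * p
        est.append(x)
    return np.array(est)

rng = np.random.default_rng(0)
truth = np.linspace(0.0, 5.0, 50)           # hypothetical shear ramp (m/s)
z = truth + rng.normal(0.0, 0.5, 50)        # noisy radial wind samples
est = kalman_smooth(z)
# warn if the estimated along-track wind change exceeds a threshold
alert = est[-1] - est[0] > 3.0
```

    Tuning q against r is the robustness trade-off the abstract describes: a small q smooths out gustiness but delays the warning for a genuine shear.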

  13. Engineering Robustness of Microbial Cell Factories.

    PubMed

    Gong, Zhiwei; Nielsen, Jens; Zhou, Yongjin J

    2017-10-01

    Metabolic engineering and synthetic biology offer great prospects for developing microbial cell factories capable of converting renewable feedstocks into fuels, chemicals, food ingredients, and pharmaceuticals. However, prohibitively low production rates and mass concentrations remain the major hurdles in industrial processes, even when the biosynthetic pathways are comprehensively optimized. These limitations are caused by a variety of factors hostile to host cell survival, such as harsh industrial conditions, fermentation inhibitors from biomass hydrolysates, and toxic compounds, including metabolic intermediates and the valuable target products themselves. Therefore, engineering microbes with robust phenotypes is essential for achieving higher yield and productivity. This review describes recent advances in engineering the robustness and tolerance of cell factories to cope with these issues and briefly introduces novel strategies with great potential to enhance robustness, including metabolic pathway balancing, transporter engineering, and adaptive laboratory evolution. The review also highlights the integration of advanced systems and synthetic biology principles toward engineering the harmony of overall cell function, beyond specific pathways or enzymes. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Efficient and Robust Signal Approximations

    DTIC Science & Technology

    2009-05-01

    otherwise. Remark. Permutation matrices are both orthogonal and doubly-stochastic [62]. We will now show how to further simplify the Robust Coding... Keywords: signal processing, image compression, independent component analysis, sparse

  15. Visualization of the Invisible, Explanation of the Unknown, Ruggedization of the Unstable: Sensitivity Analysis, Virtual Tryout and Robust Design through Systematic Stochastic Simulation

    NASA Astrophysics Data System (ADS)

    Zwickl, Titus; Carleer, Bart; Kubli, Waldemar

    2005-08-01

    In the past decade, sheet metal forming simulation became a well-established tool for predicting the formability of parts. In the automotive industry, this has enabled significant reductions in the cost and time of vehicle design and development and has helped to improve the quality and performance of vehicle parts. However, production stoppages for troubleshooting and unplanned die maintenance, as well as fluctuations in production quality, continue to inflate manufacturing cost and time. The focus has therefore shifted in recent times beyond mere feasibility to the robustness of the product and process being engineered. Ensuring robustness is the next big challenge for virtual tryout / simulation technology. We introduce new methods, based on systematic stochastic simulations, to visualize the behavior of the part during the whole forming process, in simulation as well as in production. Sensitivity analysis explains the response of the part to changes in influencing parameters. Virtual tryout allows quick exploration of changed designs and conditions. Robust design and manufacturing guarantees quality and process capability for the production process. While conventional simulations helped to reduce development time and cost by ensuring feasible processes, robustness engineering tools have the potential for far greater cost and time savings. Through examples we illustrate how expected and unexpected behavior of deep drawing parts may be tracked down, identified, and assigned to the influential parameters. With this knowledge, defects can be eliminated and springback compensated, for example; the response of the part to uncontrollable noise can be predicted and minimized. The newly introduced methods enable more reliable and predictable stamping processes in general.

  16. A robust automated system elucidates mouse home cage behavioral structure

    PubMed Central

    Goulding, Evan H.; Schenk, A. Katrin; Juneja, Punita; MacKay, Adrienne W.; Wade, Jennifer M.; Tecott, Laurence H.

    2008-01-01

    Patterns of behavior exhibited by mice in their home cages reflect the function and interaction of numerous behavioral and physiological systems. Detailed assessment of these patterns thus has the potential to provide a powerful tool for understanding basic aspects of behavioral regulation and their perturbation by disease processes. However, the capacity to identify and examine these patterns in terms of their discrete levels of organization across diverse behaviors has been difficult to achieve and automate. Here, we describe an automated approach for the quantitative characterization of fundamental behavioral elements and their patterns in the freely behaving mouse. We demonstrate the utility of this approach by identifying unique features of home cage behavioral structure and changes in distinct levels of behavioral organization in mice with single gene mutations altering energy balance. The robust, automated, reproducible quantification of mouse home cage behavioral structure detailed here should have wide applicability for the study of mammalian physiology, behavior, and disease. PMID:19106295

  17. Optimization of robustness of interdependent network controllability by redundant design

    PubMed Central

    2018-01-01

    Controllability of complex networks has been a hot topic in recent years. Real networks are often coupled together as interdependent networks. The cascading process in interdependent networks, including interdependent failure and overload failure, degrades the robustness of controllability for the whole network. Therefore, optimizing the robustness of interdependent network controllability is of great importance in complex network research. In this paper, based on a model of interdependent networks constructed first, we determine the cascading process under different proportions of node attacks. Then, the structural controllability of interdependent networks is measured by the minimum set of driver nodes. Furthermore, we propose a parameter, obtained from the structure and minimum driver set of the interdependent networks under different proportions of node attacks, and analyze the robustness of interdependent network controllability. Finally, we optimize the robustness of interdependent network controllability by redundant design, including node backup and redundancy edge backup, and improve the redundant design by proposing different strategies according to their cost. Comparative strategies of redundant design are conducted to find the best strategy. Results show that node backup and redundancy edge backup can indeed reduce the number of nodes suffering from failure and improve the robustness of controllability. Considering the cost of redundant design, we should choose BBS (betweenness-based strategy) or DBS (degree-based strategy) for node backup and HDF (high-degree-first) for redundancy edge backup. Above all, our proposed strategies are feasible and effective at improving the robustness of interdependent network controllability. PMID:29438426
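
    The minimum-driver-node computation referenced above is, in the standard structural-controllability framework (maximum matching over a bipartite splitting of each node into an out-copy and an in-copy), small enough to sketch directly:

```python
def min_driver_nodes(n, edges):
    """Minimum number of driver nodes for structural controllability:
    N_D = max(N - |maximum matching|, 1), where the matching is over
    the bipartite graph of out-copies and in-copies of the nodes."""
    adj = [[] for _ in range(n)]
    for u, v in edges:          # directed edge u -> v
        adj[u].append(v)
    match = [-1] * n            # match[v] = out-node matched to in-copy v

    def augment(u, seen):
        # Kuhn's augmenting-path search for an unmatched in-copy
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                if match[v] == -1 or augment(match[v], seen):
                    match[v] = u
                    return True
        return False

    matched = sum(augment(u, set()) for u in range(n))
    return max(n - matched, 1)

# A directed chain 0 -> 1 -> 2 -> 3 is controllable from one driver node
min_driver_nodes(4, [(0, 1), (1, 2), (2, 3)])        # -> 1
# A star 0 -> {1, 2, 3} leaves two in-copies unmatched, so 3 drivers
min_driver_nodes(4, [(0, 1), (0, 2), (0, 3)])        # -> 3
```
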

  18. Robust detection, isolation and accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Emami-Naeini, A.; Akhter, M. M.; Rock, S. M.

    1986-01-01

    The objective is to extend the recent advances in robust control system design of multivariable systems to sensor failure detection, isolation, and accommodation (DIA), and to estimator design. This effort provides analysis tools to quantify the trade-off between performance robustness and DIA sensitivity, which are used to achieve higher levels of performance robustness for given levels of DIA sensitivity. An innovations-based DIA scheme is used. Estimators, which depend upon a model of the process and the process inputs and outputs, are used to generate these innovations. Thresholds used for failure detection are computed from bounds on modeling errors, noise properties, and the class of failures. The applicability of the newly developed tools is demonstrated on a multivariable aircraft turbojet engine example. A new concept called the threshold selector was developed; it represents a significant and innovative tool for the analysis and synthesis of DIA algorithms. The estimators were made robust by the introduction of an internal model and by frequency shaping. The internal model provides asymptotically unbiased filter estimates. The incorporation of frequency shaping of the Linear Quadratic Gaussian cost functional modifies the estimator design to make it suitable for sensor failure DIA. The results are compared with previous studies, which used thresholds that were selected empirically. Comparison of the two techniques on a nonlinear dynamic engine simulation shows improved performance of the new method over previous techniques.
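
    The innovations-plus-threshold logic can be sketched as follows; the residual values and bound parameters are hypothetical:

```python
import numpy as np

def dia_alarm(innovations, model_error_bound, noise_sigma, k=3.0):
    """Innovations-based failure detection: flag samples whose innovation
    exceeds a threshold combining a deterministic modelling-error bound
    with a k-sigma allowance for measurement noise."""
    gamma = model_error_bound + k * noise_sigma
    return np.abs(innovations) > gamma

innov = np.array([0.1, -0.3, 0.2, 4.8, 0.1])   # hypothetical residuals
flags = dia_alarm(innov, model_error_bound=0.5, noise_sigma=0.3)
# alarms only on the 4.8 spike
```

    Tightening the threshold raises DIA sensitivity at the cost of false alarms under model error, which is exactly the trade-off the analysis tools above quantify.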

  19. Study of robust thin film PT-1000 temperature sensors for cryogenic process control applications

    NASA Astrophysics Data System (ADS)

    Ramalingam, R.; Boguhn, D.; Fillinger, H.; Schlachter, S. I.; Süßer, M.

    2014-01-01

    In some cryogenic process measurement applications, for example in hydrogen technology and in high temperature superconductor based generators, there is a need for robust temperature sensors. These sensors should be able to cover the large temperature range of 20-500 K with reasonable resolution and accuracy. Thin film PT-1000 sensors could be a choice for this range. Twenty-one sensors selected from the same production batch were tested for temperature sensitivity, which was then compared with sensors from different batches. Furthermore, the sensors' stability was studied by subjecting them to repeated temperature cycles of 78-525 K. Deviations in resistance were investigated using ice point and water triple point calibration methods. In addition, directionally oriented intense static magnetic fields up to 8 Oersted (Oe) were applied to study the sensors' magnetoresistance behaviour in the cryogenic temperature range from 77 K down to 15 K. This paper reports all investigation results in detail.
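
    For context, platinum thin-film sensors are normally read out through the IEC 60751 Callendar-Van Dusen polynomial; the abstract does not give the conversion, so this is a standard-coefficient sketch of the T ≥ 0 °C branch (below 0 °C the standard adds a cubic term, and the 20 K regime lies outside the standard polynomial altogether, which is one reason batch calibration matters):

```python
import math

# IEC 60751 Callendar-Van Dusen coefficients (T >= 0 °C branch)
A, B = 3.9083e-3, -5.775e-7
R0 = 1000.0   # Pt1000 nominal resistance at 0 °C

def pt1000_resistance(t_celsius):
    """R(T) = R0 * (1 + A*T + B*T^2), valid for 0 °C <= T <= 850 °C."""
    return R0 * (1.0 + A * t_celsius + B * t_celsius ** 2)

def pt1000_temperature(r_ohm):
    """Invert the quadratic for temperature (the physically valid root)."""
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r_ohm / R0))) / (2.0 * B)

pt1000_resistance(0.0)       # -> 1000.0
pt1000_resistance(100.0)     # about 1385 ohm
```
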

  20. Process development for robust removal of aggregates using cation exchange chromatography in monoclonal antibody purification with implementation of quality by design.

    PubMed

    Xu, Zhihao; Li, Jason; Zhou, Joe X

    2012-01-01

    Aggregate removal is one of the most important aspects of monoclonal antibody (mAb) purification. Cation-exchange chromatography (CEX), a widely used polishing step in mAb purification, can clear both process-related and product-related impurities. In this study, with the implementation of quality by design (QbD), a process development approach for robust removal of aggregates using CEX is described. First, resin screening studies were performed and a suitable CEX resin was chosen for its better selectivity and higher dynamic binding capacity. Second, a pH-conductivity hybrid gradient elution method for the CEX was established, and a risk assessment for the process was carried out. Third, a process characterization study was used to evaluate the impact of the potentially important process parameters on process performance with respect to aggregate removal. Accordingly, a process design space was established. Aggregate level in the load is the critical parameter: its operating range is set at 0-3% and its acceptable range at 0-5%. Equilibration buffer is the key parameter: its operating range is set at 40 ± 5 mM acetate, pH 5.0 ± 0.1, and its acceptable range at 40 ± 10 mM acetate, pH 5.0 ± 0.2. Elution buffer, load mass, and gradient elution volume are non-key parameters; their operating ranges and acceptable ranges are identically set at 250 ± 10 mM acetate, pH 6.0 ± 0.2, 45 ± 10 g/L resin, and 10 ± 20% CV, respectively. Finally, the process was scaled up 80-fold and the impurity removal profiles were characterized. Three scaled-up runs showed that the size-exclusion chromatography (SEC) purity of the CEX pool was 99.8% or above and the step yield was above 92%, proving that the process is both consistent and robust.
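
    The stated design space can be captured as a simple acceptance check; the parameter names are hypothetical, but the acceptable ranges are the ones given in the abstract:

```python
# Acceptable ranges from the study (operating ranges are tighter)
ACCEPTABLE = {
    "aggregate_level_pct":  (0.0, 5.0),        # critical parameter, 0-5%
    "equil_acetate_mM":     (30.0, 50.0),      # 40 ± 10 mM
    "equil_pH":             (4.8, 5.2),        # 5.0 ± 0.2
    "elution_acetate_mM":   (240.0, 260.0),    # 250 ± 10 mM
    "elution_pH":           (5.8, 6.2),        # 6.0 ± 0.2
    "load_mass_g_per_L":    (35.0, 55.0),      # 45 ± 10 g/L resin
}

def outside_design_space(run):
    """Return the parameters of a run that fall outside the acceptable
    ranges (an empty dict means the run stays inside the design space)."""
    return {k: v for k, v in run.items()
            if k in ACCEPTABLE and not (ACCEPTABLE[k][0] <= v <= ACCEPTABLE[k][1])}

run = {"aggregate_level_pct": 2.1, "equil_acetate_mM": 41.0, "equil_pH": 5.05,
       "elution_acetate_mM": 252.0, "elution_pH": 6.1, "load_mass_g_per_L": 46.0}
outside_design_space(run)     # -> {} (inside the design space)
```
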

  1. Supervisor Expertise, Teacher Autonomy and Environmental Robustness.

    ERIC Educational Resources Information Center

    Street, Sue; Licata, Joseph W.

    This study examines the collective perspective of teachers in schools on the relationship between the supervisory expertise of the principal, teacher work autonomy, and school environmental robustness. Supervisory expertise, and teachers' satisfaction with the supervisory process, are measured with the "Fidelity of Supervision…

  2. Robust Fuzzy Logic Stabilization with Disturbance Elimination

    PubMed Central

    Danapalasingam, Kumeresan A.

    2014-01-01

    A robust fuzzy logic controller is proposed for stabilization and disturbance rejection in nonlinear control systems of a particular type. The dynamic feedback controller is designed as a combination of a control law that compensates for nonlinear terms in a control system and a dynamic fuzzy logic controller that addresses unknown model uncertainties and an unmeasured disturbance. Since it is challenging to derive a highly accurate mathematical model, the proposed controller requires only nominal functions of a control system. In this paper, a mathematical derivation is carried out to prove that the controller is able to achieve asymptotic stability by processing state measurements. Robustness here refers to the ability of the controller to asymptotically steer the state vector towards the origin in the presence of model uncertainties and a disturbance input. Simulation results of the robust fuzzy logic controller application in a magnetic levitation system demonstrate the feasibility of the control design. PMID:25177713

  3. Optimally robust redundancy relations for failure detection in uncertain systems

    NASA Technical Reports Server (NTRS)

    Lou, X.-C.; Willsky, A. S.; Verghese, G. C.

    1986-01-01

    All failure detection methods are based, either explicitly or implicitly, on the use of redundancy, i.e. on (possibly dynamic) relations among the measured variables. The robustness of the failure detection process consequently depends to a great degree on the reliability of the redundancy relations, which in turn is affected by the inevitable presence of model uncertainties. In this paper the problem of determining redundancy relations that are optimally robust is addressed in a sense that includes several major issues of importance in practical failure detection and that provides a significant amount of intuition concerning the geometry of robust failure detection. A procedure is given involving the construction of a single matrix and its singular value decomposition for the determination of a complete sequence of redundancy relations, ordered in terms of their level of robustness. This procedure also provides the basis for comparing levels of robustness in redundancy provided by different sets of sensors.
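
    The core computation, a single matrix and its singular value decomposition yielding redundancy relations ordered by robustness, can be sketched with a toy observation matrix (the matrix below is illustrative, not from the paper):

```python
import numpy as np

# Hypothetical stacked observation matrix G whose left null space
# contains the exact redundancy (parity) relations; its columns are
# the modelled directions of measured-variable behaviour.
G = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [2.0, 1.0]])

# SVD of G: left singular vectors with the SMALLEST singular values are
# the redundancy relations least excited by the modelled dynamics, i.e.
# the most robust parity checks, ordered by their level of robustness.
U, s, Vt = np.linalg.svd(G)
parity = U[:, len(s):]          # exact null directions (sigma = 0)
residual = parity.T @ G         # ~ 0 for all modelled behaviours
```

    Under model uncertainty the small-but-nonzero singular values take over this role, which is how the procedure orders a complete sequence of relations by robustness.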

  4. Towards a Video Passive Content Fingerprinting Method for Partial-Copy Detection Robust against Non-Simulated Attacks

    PubMed Central

    2016-01-01

    Passive content fingerprinting is widely used for video content identification and monitoring. However, many challenges remain unsolved, especially for partial-copy detection. The main challenge is to find the right balance between the computational cost of fingerprint extraction and the fingerprint dimension without compromising detection performance against various attacks (robustness). Fast video detection performance is desirable in several modern applications, for instance those where video detection involves the use of large video databases or those requiring real-time detection of partial copies, a task whose difficulty increases when videos undergo severe transformations. In this context, conventional fingerprinting methods are not fully suitable to cope with the attacks and transformations mentioned before, either because their robustness is insufficient or because their execution time is very high, with the time bottleneck commonly found in the fingerprint extraction and matching operations. Motivated by these issues, in this work we propose a content fingerprinting method based on the extraction of a set of independent binary global and local fingerprints. Although these features are robust against common video transformations, their combination is more discriminant against severe video transformations such as signal processing attacks, geometric transformations, and temporal and spatial desynchronization. Additionally, we use an efficient multilevel filtering system that accelerates fingerprint extraction and matching: it rapidly identifies potentially similar video copies, and only on those is the full fingerprint process carried out, thus saving computational time. We tested on datasets of real copied videos, and the results show that our method outperforms state-of-the-art methods in detection scores. Furthermore, the granularity of our method makes it suitable for

  5. Reducing regional vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Reed, Patrick; Trindade, Bernardo; Herman, Jonathan; Zeff, Harrison; Characklis, Gregory

    2016-04-01

    Emerging water scarcity concerns in southeastern US are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify regionally coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative management strategies. Results show that the sampling of deeply uncertain factors in the computational search phase of MORDM can aid in the discovery of management actions that substantially improve the robustness of individual utilities as well as the overall region to water scarcity. Cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management must be explored jointly to decrease robustness conflicts between the utilities. The insights from this work have general merit for regions where adjacent municipalities can benefit from cooperative regional water portfolio planning.

  6. Reducing regional vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Trindade, B. C.; Reed, P. M.; Herman, J. D.; Zeff, H. B.; Characklis, G. W.

    2015-12-01

    Emerging water scarcity concerns in southeastern US are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify regionally coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative management strategies. Results show that the sampling of deeply uncertain factors in the computational search phase of MORDM can aid in the discovery of management actions that substantially improve the robustness of individual utilities as well as of the overall region to water scarcity. Cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management should be explored jointly to decrease robustness conflicts between the utilities. The insights from this work have general merit for regions where adjacent municipalities can benefit from cooperative regional water portfolio planning.

  7. Covariate selection with group lasso and doubly robust estimation of causal effects

    PubMed Central

    Koch, Brandon; Vock, David M.; Wolfson, Julian

    2017-01-01

    The efficiency of doubly robust estimators of the average causal effect (ACE) of a treatment can be improved by including in the treatment and outcome models only those covariates which are related to both treatment and outcome (i.e., confounders) or related only to the outcome. However, it is often challenging to identify such covariates among the large number that may be measured in a given study. In this paper, we propose GLiDeR (Group Lasso and Doubly Robust Estimation), a novel variable selection technique for identifying confounders and predictors of outcome using an adaptive group lasso approach that simultaneously performs coefficient selection, regularization, and estimation across the treatment and outcome models. The selected variables and corresponding coefficient estimates are used in a standard doubly robust ACE estimator. We provide asymptotic results showing that, for a broad class of data generating mechanisms, GLiDeR yields a consistent estimator of the ACE when either the outcome or treatment model is correctly specified. A comprehensive simulation study shows that GLiDeR is more efficient than doubly robust methods using standard variable selection techniques and has substantial computational advantages over a recently proposed doubly robust Bayesian model averaging method. We illustrate our method by estimating the causal treatment effect of bilateral versus single-lung transplant on forced expiratory volume in one year after transplant using an observational registry. PMID:28636276
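For concreteness, the standard augmented inverse-probability-weighted (AIPW) doubly robust estimator into which the selected variables and coefficients feed can be sketched as follows. The toy outcomes, propensity scores, and outcome-model predictions are illustrative, not data from the paper.

```python
# Augmented IPW (doubly robust) estimate of the average causal effect:
# combines a propensity model e with outcome-model predictions mu1, mu0,
# and is consistent if either model is correctly specified.

def aipw_ace(y, a, e, mu1, mu0):
    terms = []
    for yi, ai, ei, m1, m0 in zip(y, a, e, mu1, mu0):
        t1 = ai * yi / ei - (ai - ei) / ei * m1                     # E[Y(1)] piece
        t0 = (1 - ai) * yi / (1 - ei) + (ai - ei) / (1 - ei) * m0   # E[Y(0)] piece
        terms.append(t1 - t0)
    return sum(terms) / len(terms)

y = [3.0, 1.0]        # observed outcomes
a = [1, 0]            # treatment indicators
e = [0.5, 0.5]        # estimated propensity scores
mu1 = [3.0, 2.0]      # outcome-model predictions under treatment
mu0 = [1.0, 1.0]      # outcome-model predictions under control
print(aipw_ace(y, a, e, mu1, mu0))  # -> 1.5
```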

  8. A robust two-stage design identifying the optimal biological dose for phase I/II clinical trials.

    PubMed

    Zang, Yong; Lee, J Jack

    2017-01-15

    We propose a robust two-stage design to identify the optimal biological dose for phase I/II clinical trials evaluating both toxicity and efficacy outcomes. In the first stage of dose finding, we use the Bayesian model averaging continual reassessment method to monitor the toxicity outcomes and adopt an isotonic regression method based on the efficacy outcomes to guide dose escalation. When the first stage ends, we use the Dirichlet-multinomial distribution to jointly model the toxicity and efficacy outcomes and pick the candidate doses based on a three-dimensional volume ratio. The selected candidate doses are then seamlessly advanced to the second stage for dose validation. Both toxicity and efficacy outcomes are continuously monitored so that any overly toxic and/or less efficacious dose can be dropped from the study as the trial continues. When the phase I/II trial ends, we select the optimal biological dose as the dose obtaining the minimal value of the volume ratio within the candidate set. An advantage of the proposed design is that it does not impose a monotonically increasing assumption on the shape of the dose-efficacy curve. We conduct extensive simulation studies to examine the operating characteristics of the proposed design. The simulation results show that the proposed design has desirable operating characteristics across different shapes of the underlying true dose-toxicity and dose-efficacy curves. The software to implement the proposed design is available upon request. Copyright © 2016 John Wiley & Sons, Ltd.
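The isotonic-regression step on observed efficacy can be sketched with the standard pool-adjacent-violators algorithm (PAVA), which fits the closest non-decreasing sequence to the observed rates. The toy response rates below are illustrative, not the trial's method for choosing doses.

```python
# Pool-adjacent-violators algorithm (PAVA): fit a non-decreasing
# sequence to observed per-dose efficacy rates by averaging adjacent
# blocks that violate monotonicity. Toy rates are illustrative.

def pava(y, w=None):
    w = list(w) if w else [1.0] * len(y)
    vals, wts, counts = [], [], []
    for yi, wi in zip(y, w):
        vals.append(yi); wts.append(wi); counts.append(1)
        # merge blocks while monotonicity is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            wtot = wts[-2] + wts[-1]
            vals[-2] = (vals[-2] * wts[-2] + vals[-1] * wts[-1]) / wtot
            wts[-2] = wtot
            counts[-2] += counts[-1]
            vals.pop(); wts.pop(); counts.pop()
    out = []
    for v, c in zip(vals, counts):
        out.extend([v] * c)
    return out

print(pava([0.1, 0.3, 0.2, 0.6]))  # -> [0.1, 0.25, 0.25, 0.6]
```

The violating pair (0.3, 0.2) is pooled to its average 0.25, yielding a monotone efficacy estimate across doses.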

  9. Distribution path robust optimization of electric vehicle with multiple distribution centers

    PubMed Central

    Hao, Wei; He, Ruichun; Jia, Xiaoyan; Pan, Fuquan; Fan, Jing; Xiong, Ruiqi

    2018-01-01

    To identify electrical vehicle (EV) distribution paths with high robustness, insensitivity to uncertainty factors, and detailed road-by-road schemes, optimization of the distribution path problem of EV with multiple distribution centers and considering the charging facilities is necessary. With the minimum transport time as the goal, a robust optimization model of EV distribution path with adjustable robustness is established based on Bertsimas’ theory of robust discrete optimization. An enhanced three-segment genetic algorithm is also developed to solve the model, such that the optimal distribution scheme initially contains all road-by-road path data using the three-segment mixed coding and decoding method. During genetic manipulation, different interlacing and mutation operations are carried out on different chromosomes, while, during population evolution, the infeasible solution is naturally avoided. A part of the road network of Xifeng District in Qingyang City is taken as an example to test the model and the algorithm in this study, and the concrete transportation paths are utilized in the final distribution scheme. Therefore, more robust EV distribution paths with multiple distribution centers can be obtained using the robust optimization model. PMID:29518169
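Bertsimas-style robust discrete optimization, on which the model above is based, scores a route by its nominal cost plus the worst case over at most Γ arcs taking their maximal deviations; Γ tunes the "price of robustness". A minimal sketch with illustrative arc travel times:

```python
# Bertsimas-Sim robust cost of a route: nominal travel time plus the
# worst case over at most gamma arcs taking their maximal deviation.
# Arc times below are illustrative, not from the Xifeng District network.

def robust_time(nominal, deviation, gamma):
    worst = sorted(deviation, reverse=True)[:gamma]  # gamma largest deviations
    return sum(nominal) + sum(worst)

nom = [10, 7, 12, 5]   # nominal minutes per arc on one candidate route
dev = [3, 1, 4, 2]     # maximal extra minutes per arc under uncertainty
print(robust_time(nom, dev, 2))  # 34 + (4 + 3) = 41
```

With gamma = 0 the score reduces to the nominal time, and increasing gamma makes the chosen route less sensitive to uncertain arcs at the cost of a longer guaranteed time.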

  10. Robust functional regression model for marginal mean and subject-specific inferences.

    PubMed

    Cao, Chunzheng; Shi, Jian Qing; Lee, Youngjo

    2017-01-01

    We introduce flexible robust functional regression models, using various heavy-tailed processes, including a Student t-process. We propose efficient algorithms in estimating parameters for the marginal mean inferences and in predicting conditional means as well as interpolation and extrapolation for the subject-specific inferences. We develop bootstrap prediction intervals (PIs) for conditional mean curves. Numerical studies show that the proposed model provides a robust approach against data contamination or distribution misspecification, and the proposed PIs maintain the nominal confidence levels. A real data application is presented as an illustrative example.

  11. Functional Groups Based on Leaf Physiology: Are they Spatially and Temporally Robust?

    NASA Technical Reports Server (NTRS)

    Foster, Tammy E.; Brooks, J. Renee

    2004-01-01

    The functional grouping hypothesis, which suggests that complexity in ecosystem function can be simplified by grouping species with similar responses, was tested in the Florida scrub habitat. Functional groups were identified based on how species in fire-maintained Florida scrub regulate exchange of carbon and water with the atmosphere as indicated by both instantaneous gas exchange measurements and integrated measures of function (%N, delta C-13, delta N-15, C-N ratio). Using cluster analysis, five distinct physiologically-based functional groups were identified in the fire-maintained scrub. These functional groups were tested to determine if they were robust spatially, temporally, and with management regime. Analysis of Similarities (ANOSIM), a non-parametric multivariate analysis, indicated that these five physiologically-based groupings were not altered by plot differences (R = -0.115, p = 0.893) or by the three different management regimes: prescribed burn, mechanically treated and burned, and fire-suppressed (R = 0.018, p = 0.349). The physiological groupings also remained robust between the two climatically different years 1999 and 2000 (R = -0.027, p = 0.725). Easy-to-measure morphological characteristics indicating functional groups would be more practical for scaling and modeling ecosystem processes than detailed gas-exchange measurements; therefore, we tested a variety of morphological characteristics as functional indicators. A combination of non-parametric multivariate techniques (hierarchical cluster analysis, non-metric Multi-Dimensional Scaling, and ANOSIM) was used to compare the ability of life form, leaf thickness, and specific leaf area classifications to identify the physiologically-based functional groups. Life form classifications (ANOSIM; R = 0.629, p < 0.001) were able to depict the physiological groupings more adequately than either specific leaf area (ANOSIM; R = 0.426, p = 0.001) or leaf thickness (ANOSIM; R = 0.344, p < 0.001).
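The ANOSIM R statistic reported throughout this record can be sketched in a few lines: R = (r̄B − r̄W) / (M/2), where r̄B and r̄W are the mean ranks of between-group and within-group dissimilarities among the M = n(n−1)/2 pairs. The one-dimensional trait values below are illustrative; a full analysis would use multivariate dissimilarities and average tied ranks.

```python
# ANOSIM R: rank all pairwise dissimilarities, then compare mean ranks
# of between-group vs within-group pairs. R near 1 means groups are well
# separated; near 0, no group structure. Toy 1-D trait values below.
# Note: ties get arbitrary (not averaged) ranks in this simple sketch.

def anosim_r(values, groups):
    pairs = []
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            pairs.append((abs(values[i] - values[j]), groups[i] == groups[j]))
    pairs.sort(key=lambda p: p[0])
    m = len(pairs)                                    # m = n(n-1)/2 pairs
    within = [r + 1 for r, (_, same) in enumerate(pairs) if same]
    between = [r + 1 for r, (_, same) in enumerate(pairs) if not same]
    rb = sum(between) / len(between)                  # mean between-group rank
    rw = sum(within) / len(within)                    # mean within-group rank
    return (rb - rw) / (m / 2)

vals = [1.0, 1.1, 1.2, 5.0, 5.1, 5.2]  # two well-separated groups
grps = ["a", "a", "a", "b", "b", "b"]
print(anosim_r(vals, grps))  # perfectly separated balanced groups -> 1.0
```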

  12. Robust Bayesian clustering.

    PubMed

    Archambeau, Cédric; Verleysen, Michel

    2007-01-01

    A new variational Bayesian learning algorithm for Student-t mixture models is introduced. This algorithm leads to (i) robust density estimation, (ii) robust clustering and (iii) robust automatic model selection. Gaussian mixture models are learning machines which are based on a divide-and-conquer approach. They are commonly used for density estimation and clustering tasks, but are sensitive to outliers. The Student-t distribution has heavier tails than the Gaussian distribution and is therefore less sensitive to any departure of the empirical distribution from Gaussianity. As a consequence, the Student-t distribution is suitable for constructing robust mixture models. In this work, we formalize the Bayesian Student-t mixture model as a latent variable model in a different way from Svensén and Bishop [Svensén, M., & Bishop, C. M. (2005). Robust Bayesian mixture modelling. Neurocomputing, 64, 235-252]. The main difference resides in the fact that it is not necessary to assume a factorized approximation of the posterior distribution on the latent indicator variables and the latent scale variables in order to obtain a tractable solution. Not neglecting the correlations between these unobserved random variables leads to a Bayesian model having an increased robustness. Furthermore, it is expected that the lower bound on the log-evidence is tighter. Based on this bound, the model complexity, i.e. the number of components in the mixture, can be inferred with a higher confidence.
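The robustness mechanism can be made concrete: in the EM-style updates for a Student-t component, each observation receives a latent scale weight w = (ν + 1)/(ν + δ²) in one dimension, with δ its standardized residual, so outliers are automatically downweighted rather than dragging the mean. The data and parameters below are illustrative.

```python
# Student-t latent scale weights: w = (nu + 1) / (nu + d2), where d2 is
# the squared standardized distance to the component mean. Gross
# outliers get tiny weights, which is the source of the robustness.

def t_weights(xs, mu, sigma2, nu):
    return [(nu + 1.0) / (nu + (x - mu) ** 2 / sigma2) for x in xs]

def weighted_mean(xs, ws):
    return sum(w * x for w, x in zip(ws, xs)) / sum(ws)

xs = [0.1, -0.2, 0.3, 8.0]               # one gross outlier at 8.0
ws = t_weights(xs, mu=0.0, sigma2=1.0, nu=3.0)
print(round(ws[-1], 2))                  # the outlier's weight is tiny (~0.06)
print(round(weighted_mean(xs, ws), 2))   # stays near 0; the raw mean is 2.05
```

A Gaussian component corresponds to all weights equal to 1, which is why Gaussian mixtures are sensitive to the same outlier.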

  13. Importance analysis for Hudson River PCB transport and fate model parameters using robust sensitivity studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, S.; Toll, J.; Cothern, K.

    1995-12-31

    The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations, over the time period 1977-1991 in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. The authors considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed the authors to reduce the number of parameters to be modeled probabilistically from 16 to 5. This reduced the computational complexity of Monte Carlo simulations, and more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
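The rank-correlation step can be sketched with a pure-Python Spearman correlation between sampled parameter values and model outputs: rank-transform both, then take the Pearson correlation of the ranks. The toy "model" below stands in for PCHEPM and has no ties (a full implementation would average tied ranks).

```python
# Spearman rank correlation for parameter-importance scoring: a score
# near +/-1 flags a parameter that dominates the model output. The toy
# linear "model" is an illustrative stand-in for PCHEPM.

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(xs, ys):
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    var = sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)
    return cov / var ** 0.5

p1 = [0.1, 0.4, 0.2, 0.9, 0.6]   # dominant parameter samples
p2 = [0.5, 0.1, 0.9, 0.3, 0.2]   # weakly influential parameter samples
out = [x * 10 + y * 0.1 for x, y in zip(p1, p2)]
print(spearman(p1, out))              # 1.0: p1 dominates the output ranking
print(abs(spearman(p2, out)) < 1.0)   # p2 scores lower in magnitude
```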

  14. UNIX-based operating systems robustness evaluation

    NASA Technical Reports Server (NTRS)

    Chang, Yu-Ming

    1996-01-01

    Robust operating systems are required for reliable computing. Techniques for robustness evaluation of operating systems not only enhance the understanding of the reliability of computer systems, but also provide valuable feedback to system designers. This thesis presents results from robustness evaluation experiments on five UNIX-based operating systems, which include Digital Equipment's OSF/1, Hewlett Packard's HP-UX, Sun Microsystems' Solaris and SunOS, and Silicon Graphics' IRIX. Three sets of experiments were performed. The methodology for evaluation tested (1) the exception handling mechanism, (2) system resource management, and (3) system capacity under high workload stress. An exception generator was used to evaluate the exception handling mechanism of the operating systems. Results included exit status of the exception generator and the system state. Resource management techniques used by individual operating systems were tested using programs designed to usurp system resources such as physical memory and process slots. Finally, the workload stress testing evaluated the effect of the workload on system performance by running a synthetic workload and recording the response time of local and remote user requests. Moderate to severe performance degradations were observed on the systems under stress.
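In the same spirit as the exception generator described above, a minimal fault-injection harness can drive an interface with invalid inputs and record the resulting exception types. The probe target (`os.stat`) and the case labels are illustrative, not the thesis's actual test suite.

```python
# Minimal fault-injection probe: call an interface with invalid inputs
# and record how each call fails. Target and cases are illustrative.

import os

def probe(fn, cases):
    results = {}
    for label, args in cases.items():
        try:
            fn(*args)
            results[label] = "no error"
        except Exception as exc:          # record the exception type raised
            results[label] = type(exc).__name__
    return results

cases = {
    "missing path": ("/no/such/path/anywhere",),
    "empty path": ("",),
}
print(probe(os.stat, cases))  # both probes report FileNotFoundError on POSIX
```

A real OS-level harness would additionally record process exit status and post-call system state, as the abstract describes.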

  15. Robust, optimal subsonic airfoil shapes

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan (Inventor)

    2008-01-01

    Method, system, and product from application of the method, for design of a subsonic airfoil shape, beginning with an arbitrary initial airfoil shape and incorporating one or more constraints on the airfoil geometric parameters and flow characteristics. The resulting design is robust against variations in airfoil dimensions and local airfoil shape introduced in the airfoil manufacturing process. A perturbation procedure provides a class of airfoil shapes, beginning with an initial airfoil shape.

  16. Stochastic simulation and robust design optimization of integrated photonic filters

    NASA Astrophysics Data System (ADS)

    Weng, Tsui-Wei; Melati, Daniele; Melloni, Andrea; Daniel, Luca

    2017-01-01

    Manufacturing variations are becoming an unavoidable issue in modern fabrication processes; therefore, it is crucial to be able to include stochastic uncertainties in the design phase. In this paper, integrated photonic coupled ring resonator filters are considered as an example of significant interest. The sparsity structure in photonic circuits is exploited to construct a sparse combined generalized polynomial chaos model, which is then used to analyze related statistics and perform robust design optimization. Simulation results show that the optimized circuits are more robust to fabrication process variations and achieve a reduction of 11%-35% in the mean square errors of the 3 dB bandwidth compared to unoptimized nominal designs.

  17. Robust allocation of a defensive budget considering an attacker's private information.

    PubMed

    Nikoofal, Mohammad E; Zhuang, Jun

    2012-05-01

    Attackers' private information is one of the main issues in defensive resource allocation games in homeland security. The outcome of a defense resource allocation decision critically depends on the accuracy of estimations about the attacker's attributes. However, terrorists' goals may be unknown to the defender, necessitating robust decisions by the defender. This article develops a robust-optimization game-theoretical model for identifying optimal defense resource allocation strategies for a rational defender facing a strategic attacker while the attacker's valuation of targets, being the most critical attribute of the attacker, is unknown but belongs to bounded distribution-free intervals. To the best of our knowledge, no previous research has applied robust optimization in homeland security resource allocation when uncertainty is defined in bounded distribution-free intervals. The key features of our model include (1) modeling uncertainty in attackers' attributes, where uncertainty is characterized by bounded intervals; (2) finding the robust-optimization equilibrium for the defender using concepts dealing with budget of uncertainty and price of robustness; and (3) applying the proposed model to real data. © 2011 Society for Risk Analysis.

  18. A robust color signal processing with wide dynamic range WRGB CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Kawada, Shun; Kuroda, Rihito; Sugawa, Shigetoshi

    2011-01-01

    We have developed a robust color reproduction methodology by a simple calculation with a new color matrix using the formerly developed wide dynamic range WRGB lateral overflow integration capacitor (LOFIC) CMOS image sensor. The image sensor was fabricated through a 0.18 μm CMOS technology and has a 45-degree oblique pixel array, the 4.2 μm effective pixel pitch and the W pixels. A W pixel was formed by replacing one of the two G pixels in the Bayer RGB color filter. The W pixel has a high sensitivity through the visible light waveband. An emerald green and yellow (EGY) signal is generated from the difference between the W signal and the sum of RGB signals. This EGY signal mainly includes emerald green and yellow lights. These colors are difficult to reproduce accurately with the conventional simple linear matrix because their wavelengths are in the valleys of the spectral sensitivity characteristics of the RGB pixels. A new linear matrix based on the EGY-RGB signal was developed. Using this simple matrix, a highly accurate color processing with a large margin against sensitivity fluctuation and noise has been achieved.
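The signal arithmetic described above can be sketched directly: derive EGY as W minus the RGB sum, then apply a linear matrix to the four-channel (R, G, B, EGY) vector. The matrix entries below are illustrative placeholders, not the sensor's calibrated coefficients.

```python
# WRGB color path sketch: EGY = W - (R + G + B), then a 3x4 linear
# matrix maps (R, G, B, EGY) to corrected RGB. Matrix values are
# illustrative placeholders, not calibrated sensor coefficients.

def correct(r, g, b, w, m):
    egy = w - (r + g + b)                 # emerald-green/yellow signal
    vec = (r, g, b, egy)
    return tuple(sum(c * v for c, v in zip(row, vec)) for row in m)

M = [
    [1.0, 0.0, 0.0, 0.2],                 # R' row with a small EGY term
    [0.0, 1.0, 0.0, 0.3],                 # G' row
    [0.0, 0.0, 1.0, 0.1],                 # B' row
]
print(correct(0.4, 0.5, 0.2, 1.3, M))     # approximately (0.44, 0.56, 0.22)
```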

  19. Robust, directed assembly of fluorescent nanodiamonds.

    PubMed

    Kianinia, Mehran; Shimoni, Olga; Bendavid, Avi; Schell, Andreas W; Randolph, Steven J; Toth, Milos; Aharonovich, Igor; Lobo, Charlene J

    2016-10-27

    Arrays of fluorescent nanoparticles are highly sought after for applications in sensing, nanophotonics and quantum communications. Here we present a simple and robust method of assembling fluorescent nanodiamonds into macroscopic arrays. Remarkably, the yield of this directed assembly process is greater than 90% and the assembled patterns withstand ultra-sonication for more than three hours. The assembly process is based on covalent bonding of carboxyl to amine functional carbon seeds and is applicable to any material, and to non-planar surfaces. Our results pave the way to directed assembly of sensors and nanophotonics devices.

  20. Robust optimization of front members in a full frontal car impact

    NASA Astrophysics Data System (ADS)

    Aspenberg (né Lönn), David; Jergeus, Johan; Nilsson, Larsgunnar

    2013-03-01

    In the search for lightweight automobile designs, it is necessary to assure that robust crashworthiness performance is achieved. Structures that are optimized to handle a finite number of load cases may perform poorly when subjected to various dispersions. Thus, uncertainties must be accounted for in the optimization process. This article presents an approach to optimization where all design evaluations include an evaluation of the robustness. Metamodel approximations are applied both to the design space and the robustness evaluations, using artificial neural networks and polynomials, respectively. The features of the robust optimization approach are displayed in an analytical example, and further demonstrated in a large-scale design example of front side members of a car. Different optimization formulations are applied and it is shown that the proposed approach works well. It is also concluded that a robust optimization puts higher demands on the finite element model performance than normally required.
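One common way to make every design evaluation include a robustness evaluation is a dispersion-augmented objective, score = mean + k·std over sampled dispersions. The quadratic test function and dispersion samples below are illustrative stand-ins for the crash model, not the article's formulation.

```python
# Dispersion-augmented robust objective: score each candidate design by
# mean performance plus k standard deviations over sampled dispersions.
# The toy quadratic f stands in for the (expensive) crash simulation.

def robust_score(design, f, dispersions, k=3.0):
    vals = [f(design, d) for d in dispersions]
    n = len(vals)
    mean = sum(vals) / n
    std = (sum((v - mean) ** 2 for v in vals) / n) ** 0.5
    return mean + k * std

f = lambda x, d: (x - 1.0) ** 2 + d * x   # toy performance under dispersion d
disp = [-0.1, 0.0, 0.1]                   # sampled manufacturing dispersions
# here the nominal optimum x = 1.0 also wins under the robust score
print(robust_score(1.0, f, disp) < robust_score(1.2, f, disp))  # True
```

In the article's setting, metamodels replace both `f` (design space) and the inner dispersion loop, which is what makes the nested evaluation affordable.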

  1. Robust Spatial Autoregressive Modeling for Hardwood Log Inspection

    Treesearch

    Dongping Zhu; A.A. Beex

    1994-01-01

    We explore the application of a stochastic texture modeling method toward a machine vision system for log inspection in the forest products industry. This machine vision system uses computerized tomography (CT) imaging to locate and identify internal defects in hardwood logs. The application of CT to such industrial vision problems requires efficient and robust image...

  2. Discriminating sediment archives and sedimentary processes in the arid endorheic Ejina Basin, NW China using a robust geochemical approach

    NASA Astrophysics Data System (ADS)

    Yu, Kaifeng; Hartmann, Kai; Nottebaum, Veit; Stauch, Georg; Lu, Huayu; Zeeden, Christian; Yi, Shuangwen; Wünnemann, Bernd; Lehmkuhl, Frank

    2016-04-01

    Geochemical characteristics have been intensively used to assign sediment properties to paleoclimate and provenance. Nonetheless, in particular concerning the arid context, bulk geochemistry of different sediment archives and corresponding process interpretations are hitherto elusive. The Ejina Basin, with its suite of different sediment archives, is known as one of the main sources for the loess accumulation on the Chinese Loess Plateau. In order to understand mechanisms along this supra-regional sediment cascade, it is crucial to decipher the archive characteristics and formation processes. To address these issues, five profiles in different geomorphological contexts were selected. Analyses of X-ray fluorescence and diffraction, grain size, optically stimulated luminescence and radiocarbon dating were performed. Robust factor analysis was applied to reduce the attribute space to the process space of sedimentation history. Five sediment archives from three lithologic units exhibit geochemical characteristics as follows: (i) aeolian sands have high contents of Zr and Hf, whereas only Hf can be regarded as a valuable indicator to discriminate the coarse sand proportion; (ii) sandy loess has high Ca and Sr contents which both exhibit broad correlations with the medium to coarse silt proportions; (iii) lacustrine clays have high contents of felsic, ferromagnesian and mica source elements e.g., K, Fe, Ti, V, and Ni; (iv) fluvial sands have high contents of Mg, Cl and Na which may be enriched in evaporite minerals; (v) alluvial gravels have high contents of Cr which may originate from nearby Cr-rich bedrock. Temporal variations can be illustrated by four robust factors: weathering intensity, silicate-bearing mineral abundance, saline/alkaline magnitude and quasi-constant aeolian input. In summary, the bulk-composition of the late Quaternary sediments in this arid context is governed by the nature of the source terrain, weak chemical weathering, authigenic minerals

  3. 20 CFR 1010.300 - What processes are to be implemented to identify covered persons?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false What processes are to be implemented to identify covered persons? 1010.300 Section 1010.300 Employees' Benefits OFFICE OF THE ASSISTANT SECRETARY... processes to identify covered persons who physically access service delivery points or who access virtual...

  4. Covariate selection with group lasso and doubly robust estimation of causal effects.

    PubMed

    Koch, Brandon; Vock, David M; Wolfson, Julian

    2018-03-01

    The efficiency of doubly robust estimators of the average causal effect (ACE) of a treatment can be improved by including in the treatment and outcome models only those covariates which are related to both treatment and outcome (i.e., confounders) or related only to the outcome. However, it is often challenging to identify such covariates among the large number that may be measured in a given study. In this article, we propose GLiDeR (Group Lasso and Doubly Robust Estimation), a novel variable selection technique for identifying confounders and predictors of outcome using an adaptive group lasso approach that simultaneously performs coefficient selection, regularization, and estimation across the treatment and outcome models. The selected variables and corresponding coefficient estimates are used in a standard doubly robust ACE estimator. We provide asymptotic results showing that, for a broad class of data generating mechanisms, GLiDeR yields a consistent estimator of the ACE when either the outcome or treatment model is correctly specified. A comprehensive simulation study shows that GLiDeR is more efficient than doubly robust methods using standard variable selection techniques and has substantial computational advantages over a recently proposed doubly robust Bayesian model averaging method. We illustrate our method by estimating the causal treatment effect of bilateral versus single-lung transplant on forced expiratory volume in one year after transplant using an observational registry. © 2017, The International Biometric Society.

  5. Constructing Robust Cooperative Networks using a Multi-Objective Evolutionary Algorithm

    PubMed Central

    Wang, Shuai; Liu, Jing

    2017-01-01

    The design and construction of network structures oriented towards different applications has attracted much attention recently. The existing studies indicated that structural heterogeneity plays different roles in promoting cooperation and robustness. Compared with rewiring a predefined network, it is more flexible and practical to construct new networks that satisfy the desired properties. Therefore, in this paper, we study a method for constructing robust cooperative networks where the only constraint is that the number of nodes and links is predefined. We model this network construction problem as a multi-objective optimization problem and propose a multi-objective evolutionary algorithm, named MOEA-Netrc, to generate the desired networks from arbitrary initializations. The performance of MOEA-Netrc is validated on several synthetic and real-world networks. The results show that MOEA-Netrc can construct balanced candidates and is insensitive to the initializations. MOEA-Netrc can find the Pareto fronts for networks with different levels of cooperation and robustness. In addition, further investigation of the robustness of the constructed networks revealed the impact on other aspects of robustness during the construction process. PMID:28134314
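A common way to score the robustness objective in such constructions is a Schneider-style R measure: remove nodes in decreasing-degree order and average the fraction of nodes remaining in the largest connected component. The sketch below uses a static removal order and a toy ring graph; it is an illustration of the idea, not MOEA-Netrc's internal measure.

```python
# Schneider-style attack robustness R: remove nodes in decreasing-degree
# order (static order here; adaptive recomputation is also common) and
# average the surviving largest-component fraction. Toy ring graph below.

def largest_cc(nodes, edges):
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    seen, best = set(), 0
    for s in nodes:
        if s in seen:
            continue
        stack, size = [s], 0
        seen.add(s)
        while stack:                       # depth-first component traversal
            u = stack.pop(); size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w); stack.append(w)
        best = max(best, size)
    return best

def robustness_R(nodes, edges):
    n = len(nodes)
    deg = {v: 0 for v in nodes}
    for u, v in edges:
        deg[u] += 1; deg[v] += 1
    order = sorted(nodes, key=lambda v: -deg[v])
    remaining, total = list(nodes), 0.0
    for v in order[:-1]:                   # attack until one node is left
        remaining.remove(v)
        kept = [(a, b) for a, b in edges if a in remaining and b in remaining]
        total += largest_cc(remaining, kept) / n
    return total / n

ring = list(range(6))
edges = [(i, (i + 1) % 6) for i in range(6)]
print(round(robustness_R(ring, edges), 3))  # 0.417 for a 6-node ring
```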

  6. A Hybrid Interval–Robust Optimization Model for Water Quality Management

    PubMed Central

    Xu, Jieyu; Li, Yongping; Huang, Guohe

    2013-01-01

    In water quality management problems, uncertainties may exist in many system components and pollution-related processes (i.e., random nature of hydrodynamic conditions, variability in physicochemical processes, dynamic interactions between pollutant loading and receiving water bodies, and indeterminacy of available water and treated wastewater). These complexities lead to difficulties in formulating and solving the resulting nonlinear optimization problems. In this study, a hybrid interval–robust optimization (HIRO) method was developed through coupling stochastic robust optimization and interval linear programming. HIRO can effectively reflect the complex system features under uncertainty, where implications of water quality/quantity restrictions for achieving regional economic development objectives are studied. By delimiting the uncertain decision space through dimensional enlargement of the original chemical oxygen demand (COD) discharge constraints, HIRO enhances the robustness of the optimization processes and resulting solutions. This method was applied to planning of industry development in association with river-water pollution concern in New Binhai District of Tianjin, China. Results demonstrated that the proposed optimization model can effectively communicate uncertainties into the optimization process and generate a spectrum of potential inexact solutions supporting local decision makers in managing benefit-effective water quality management schemes. HIRO is helpful for analysis of policy scenarios related to different levels of economic penalties, while also providing insight into the tradeoff between system benefits and environmental requirements. PMID:23922495

  7. A Hybrid Interval-Robust Optimization Model for Water Quality Management.

    PubMed

    Xu, Jieyu; Li, Yongping; Huang, Guohe

    2013-05-01

    In water quality management problems, uncertainties may exist in many system components and pollution-related processes (i.e., random nature of hydrodynamic conditions, variability in physicochemical processes, dynamic interactions between pollutant loading and receiving water bodies, and indeterminacy of available water and treated wastewater). These complexities lead to difficulties in formulating and solving the resulting nonlinear optimization problems. In this study, a hybrid interval-robust optimization (HIRO) method was developed through coupling stochastic robust optimization and interval linear programming. HIRO can effectively reflect the complex system features under uncertainty, where implications of water quality/quantity restrictions for achieving regional economic development objectives are studied. By delimiting the uncertain decision space through dimensional enlargement of the original chemical oxygen demand (COD) discharge constraints, HIRO enhances the robustness of the optimization processes and resulting solutions. This method was applied to planning of industry development in association with river-water pollution concern in New Binhai District of Tianjin, China. Results demonstrated that the proposed optimization model can effectively communicate uncertainties into the optimization process and generate a spectrum of potential inexact solutions supporting local decision makers in managing benefit-effective water quality management schemes. HIRO is helpful for analysis of policy scenarios related to different levels of economic penalties, while also providing insight into the tradeoff between system benefits and environmental requirements.

  8. Enhanced echolocation via robust statistics and super-resolution of sonar images

    NASA Astrophysics Data System (ADS)

    Kim, Kio

    Echolocation is a process in which an animal uses acoustic signals to exchange information with environments. In a recent study, Neretti et al. have shown that the use of robust statistics can significantly improve the resiliency of echolocation against noise and enhance its accuracy by suppressing the development of sidelobes in the processing of an echo signal. In this research, the use of robust statistics is extended to problems in underwater explorations. The dissertation consists of two parts. Part I describes how robust statistics can enhance the identification of target objects, which in this case are cylindrical containers filled with four different liquids. Particularly, this work employs a variation of an existing robust estimator called an L-estimator, which was first suggested by Koenker and Bassett. As pointed out by Au et al., a 'highlight interval' is an important feature, and it is closely related with many other important features that are known to be crucial for dolphin echolocation. A varied L-estimator described in this text is used to enhance the detection of highlight intervals, which eventually leads to a successful classification of echo signals. Part II extends the problem into 2 dimensions. Thanks to the advances in material and computer technology, various sonar imaging modalities are available on the market. By registering acoustic images from such video sequences, one can extract more information on the region of interest. Computer vision and image processing allowed application of robust statistics to the acoustic images produced by forward-looking sonar systems, such as Dual-frequency Identification Sonar and ProViewer. The first use of robust statistics for sonar image enhancement in this text is in image registration. Random Sampling Consensus (RANSAC) is widely used for image registration. The registration algorithm using RANSAC is optimized for sonar image registration, and the performance is studied.
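An L-estimator is a weighted combination of order statistics; with zero weight on the extreme ranks it reduces to a trimmed mean, which is one simple way impulsive echo noise can be suppressed. The weights and echo samples below are illustrative, not the varied L-estimator developed in the dissertation.

```python
# L-estimator sketch: a linear combination of order statistics. With
# zero weight on the extremes it is a trimmed mean, insensitive to the
# impulsive spikes that create sidelobes. Echo samples are toy values.

def l_estimate(samples, weights):
    xs = sorted(samples)                  # order statistics
    assert len(xs) == len(weights)
    return sum(w * x for w, x in zip(weights, xs)) / sum(weights)

echo = [0.9, 1.1, 1.0, 9.0, 1.2, 0.8]    # one impulsive noise spike (9.0)
w = [0, 1, 1, 1, 1, 0]                   # drop the smallest and largest sample
print(l_estimate(echo, w))               # trimmed mean of the middle four (~1.05)
```

The untrimmed mean of the same samples is about 2.33, illustrating how a single spike corrupts a non-robust estimate.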

  9. Automated robust registration of grossly misregistered whole-slide images with varying stains

    NASA Astrophysics Data System (ADS)

    Litjens, G.; Safferling, K.; Grabe, N.

    2016-03-01

    Cancer diagnosis and pharmaceutical research increasingly depend on the accurate quantification of cancer biomarkers. Identification of biomarkers is usually performed through immunohistochemical staining of cancer sections on glass slides. However, combination of multiple biomarkers from a wide variety of immunohistochemically stained slides is a tedious process in traditional histopathology due to the switching of glass slides and re-identification of regions of interest by pathologists. Digital pathology now allows us to apply image registration algorithms to digitized whole-slide images to align the differing immunohistochemical stains automatically. However, registration algorithms need to be robust to changes in color due to differing stains and severe changes in tissue content between slides. In this work we developed a robust registration methodology to allow for fast coarse alignment of multiple immunohistochemical stains to the base hematoxylin and eosin stained image. We applied HSD color model conversion to obtain a less stain color dependent representation of the whole-slide images. Subsequently, optical density thresholding and connected component analysis were used to identify the relevant regions for registration. Template matching using normalized mutual information was applied to provide initial translation and rotation parameters, after which a cost function-driven affine registration was performed. The algorithm was validated using 40 slides from 10 prostate cancer patients, with landmark registration error as a metric. Median landmark registration error was around 180 microns, which indicates performance is adequate for practical application. None of the registrations failed, indicating the robustness of the algorithm.
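
    Template matching with normalized mutual information, as used above for the initial alignment, scores how well two images co-vary without assuming identical stain intensities. A minimal histogram-based sketch (the bin count and test images are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    """NMI = (H(A) + H(B)) / H(A, B), computed from a joint histogram.

    Values near 1 indicate unrelated images; higher values indicate
    better alignment, which makes NMI a usable template-matching score
    across differently stained slides.
    """
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return (entropy(px) + entropy(py)) / entropy(pxy.ravel())

rng = np.random.default_rng(1)
img = rng.random((64, 64))
noise = rng.random((64, 64))
print(normalized_mutual_information(img, img) >
      normalized_mutual_information(img, noise))  # True: self-match scores higher
```

In a template search, this score would be evaluated over candidate translations/rotations and the maximizer kept as the initial transform.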

  10. What Is Robustness?: Problem Framing Challenges for Water Systems Planning Under Change

    NASA Astrophysics Data System (ADS)

    Herman, J. D.; Reed, P. M.; Zeff, H. B.; Characklis, G. W.

    2014-12-01

    Water systems planners have long recognized the need for robust solutions capable of withstanding deviations from the conditions for which they were designed. Faced with a set of alternatives to choose from—for example, resulting from a multi-objective optimization—existing analysis frameworks offer competing definitions of robustness under change. Robustness analyses have moved from expected utility to exploratory "bottom-up" approaches in which vulnerable scenarios are identified prior to assigning likelihoods; examples include Robust Decision Making (RDM), Decision Scaling, Info-Gap, and Many-Objective Robust Decision Making (MORDM). We propose a taxonomy of robustness frameworks to compare and contrast these approaches, based on their methods of (1) alternative selection, (2) sampling of states of the world, (3) quantification of robustness measures, and (4) identification of key uncertainties using sensitivity analysis. Using model simulations from recent work in multi-objective urban water supply portfolio planning, we illustrate the decision-relevant consequences that emerge from each of these choices. Results indicate that the methodological choices in the taxonomy lead to substantially different planning alternatives, underscoring the importance of an informed definition of robustness. We conclude with a set of recommendations for problem framing: that alternatives should be searched rather than prespecified; dominant uncertainties should be discovered rather than assumed; and that a multivariate satisficing measure of robustness allows stakeholders to achieve their problem-specific performance requirements. This work highlights the importance of careful problem formulation, and provides a common vocabulary to link the robustness frameworks widely used in the field of water systems planning.

  11. Robust nonlinear system identification: Bayesian mixture of experts using the t-distribution

    NASA Astrophysics Data System (ADS)

    Baldacchino, Tara; Worden, Keith; Rowson, Jennifer

    2017-02-01

    A novel variational Bayesian mixture of experts model for robust regression of bifurcating and piecewise continuous processes is introduced. The mixture of experts model is a powerful model which probabilistically splits the input space, allowing different models to operate in the separate regions. However, current methods have no fail-safe against outliers. In this paper, a robust mixture of experts model is proposed which consists of Student-t mixture models at the gates and Student-t distributed experts, trained via Bayesian inference. The Student-t distribution has heavier tails than the Gaussian distribution, and so it is more robust to outliers, noise and non-normality in the data. Using both simulated data and real data obtained from the Z24 bridge, this robust mixture of experts performs better than its Gaussian counterpart when outliers are present. In particular, it provides robustness to outliers in two forms: unbiased estimates of the regression model parameters, and robustness to overfitting/overly complex models.
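
    The robustness of the Student-t likelihood can be seen in a toy location-estimation sketch: its influence function down-weights large residuals, whereas the Gaussian maximum-likelihood estimate (the mean) does not. This IRLS example is illustrative only (unit scale assumed) and is unrelated to the paper's full variational model:

```python
import numpy as np

def t_location(x, nu=3.0, iters=50):
    """Robust location estimate under a Student-t likelihood via IRLS.

    Each point gets weight w_i = (nu + 1) / (nu + r_i^2), so points with
    large residuals r_i barely move the estimate, unlike the Gaussian
    MLE (the sample mean). Unit scale is assumed for simplicity.
    """
    x = np.asarray(x, dtype=float)
    mu = np.median(x)                   # robust starting point
    for _ in range(iters):
        r = x - mu
        w = (nu + 1.0) / (nu + r ** 2)
        mu = np.sum(w * x) / np.sum(w)
    return mu

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.0, 1.0, 200), [25.0, 30.0, 40.0]])
print(round(float(np.mean(x)), 3))      # dragged upward by the outliers
print(round(float(t_location(x)), 3))   # stays near the bulk of the data
```

The heavier the contamination, the larger the gap between the two estimates; decreasing `nu` makes the tails heavier and the estimate more aggressive in discounting outliers.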

  12. Advancements in robust algorithm formulation for speaker identification of whispered speech

    NASA Astrophysics Data System (ADS)

    Fan, Xing

    Whispered speech is an alternative speech production mode to neutral speech, used intentionally by talkers in natural conversational scenarios to protect privacy and to avoid certain content being overheard or made public. Due to the profound differences between whispered and neutral speech in the production mechanism and the absence of whispered adaptation data, the performance of speaker identification systems trained with neutral speech degrades significantly. This dissertation therefore focuses on developing a robust closed-set speaker recognition system for whispered speech by using no or limited whispered adaptation data from non-target speakers. This dissertation proposes the concept of "High"/"Low" performance whispered data for the purpose of speaker identification. A variety of acoustic properties are identified that contribute to the quality of whispered data. An acoustic analysis is also conducted to compare the phoneme/speaker dependency of the differences between whispered and neutral data in the feature domain. The observations from these acoustic analyses are new in this area and also serve as guidance for developing robust speaker identification systems for whispered speech. This dissertation further proposes two systems for speaker identification of whispered speech. One system focuses on front-end processing. A two-dimensional feature space is proposed to search for "Low"-quality whispered utterances, and separate feature mapping functions are applied to vowels and consonants respectively in order to retain the speaker information shared between whispered and neutral speech. The other system focuses on speech-mode-independent model training. The proposed method generates pseudo-whispered features from neutral features by using the statistical information contained in a whispered Universal Background Model (UBM) trained on extra whispered data collected from non-target speakers. Four modeling methods are proposed

  13. A point process approach to identifying and tracking transitions in neural spiking dynamics in the subthalamic nucleus of Parkinson's patients

    NASA Astrophysics Data System (ADS)

    Deng, Xinyi; Eskandar, Emad N.; Eden, Uri T.

    2013-12-01

    Understanding the role of rhythmic dynamics in normal and diseased brain function is an important area of research in neural electrophysiology. Identifying and tracking changes in rhythms associated with spike trains presents an additional challenge, because standard approaches for continuous-valued neural recordings—such as local field potential, magnetoencephalography, and electroencephalography data—require assumptions that do not typically hold for point process data. Additionally, subtle changes in the history-dependent structure of a spike train have been shown to lead to robust changes in rhythmic firing patterns. Here, we propose a point process modeling framework to characterize the rhythmic spiking dynamics in spike trains, test for statistically significant changes to those dynamics, and track the temporal evolution of such changes. We first construct a two-state point process model incorporating spiking history and develop a likelihood ratio test to detect changes in the firing structure. We then apply adaptive state-space filters and smoothers to track these changes through time. We illustrate our approach with a simulation study as well as with experimental data recorded in the subthalamic nucleus of Parkinson's patients performing an arm movement task. Our analyses show that during the arm movement task, neurons underwent a complex pattern of modulation of spiking intensity characterized initially by a release of inhibitory control at 20-40 ms after a spike, followed by a decrease in excitatory influence at 40-60 ms after a spike.
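
    A minimal version of the likelihood-ratio idea, stripped of the spiking-history terms used in the paper, tests a single firing rate against separate before/after rates on binned spike counts. The data, bin structure, and split point below are invented for illustration:

```python
import numpy as np

def poisson_loglik(counts, rate):
    """Log-likelihood of bin counts under a homogeneous Poisson rate
    (the count-factorial terms are dropped; they cancel in the ratio)."""
    rate = max(rate, 1e-12)
    return float(np.sum(counts * np.log(rate) - rate))

def two_state_lrt(counts, split):
    """LR statistic for 'one rate' vs 'different rates before/after split'.

    Under the null, the statistic is approximately chi-square with 1 df.
    """
    full = poisson_loglik(counts, counts.mean())
    part = (poisson_loglik(counts[:split], counts[:split].mean()) +
            poisson_loglik(counts[split:], counts[split:].mean()))
    return 2.0 * (part - full)

rng = np.random.default_rng(3)
baseline = rng.poisson(2.0, 100)                              # steady 2 spikes/bin
modulated = np.concatenate([rng.poisson(2.0, 100),
                            rng.poisson(5.0, 100)])           # rate jump at bin 100
crit = 3.841                                                  # chi-square(1), 95%
print(round(two_state_lrt(baseline, 50), 2))
print(two_state_lrt(modulated, 100) > crit)  # True: the rate jump is detected
```

The paper's two-state model additionally conditions each bin's intensity on recent spiking history, which is what lets it distinguish rhythmic restructuring from simple rate changes.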

  14. Modular Energy-Efficient and Robust Paradigms for a Disaster-Recovery Process over Wireless Sensor Networks.

    PubMed

    Razaque, Abdul; Elleithy, Khaled

    2015-07-06

    Robust paradigms are a necessity, particularly for emerging wireless sensor network (WSN) applications. The lack of robust and efficient paradigms causes a reduction in the provision of quality of service (QoS) and additional energy consumption. In this paper, we introduce modular energy-efficient and robust paradigms that involve two archetypes: (1) the operational medium access control (O-MAC) hybrid protocol and (2) the pheromone termite (PT) model. The O-MAC protocol controls overhearing and congestion and increases the throughput, reduces the latency and extends the network lifetime. O-MAC uses an optimized data frame format that reduces the channel access time and provides faster data delivery over the medium. Furthermore, O-MAC uses a novel randomization function that avoids channel collisions. The PT model provides robust routing for single and multiple links and includes two new significant features: (1) determining the packet generation rate to avoid congestion and (2) pheromone sensitivity to determine the link capacity prior to sending the packets on each link. The state-of-the-art research in this work is based on improving both the QoS and energy efficiency. To determine the strength of O-MAC with the PT model, we have generated and simulated a disaster recovery scenario using a network simulator (ns-3.10) that monitors the activities of disaster recovery staff, hospital staff and disaster victims brought into the hospital. Moreover, the proposed paradigm can be used for general-purpose applications. Finally, the QoS metrics of the O-MAC and PT paradigms are evaluated and compared with other known hybrid protocols involving the MAC and routing features. The simulation results indicate that O-MAC with PT produced better outcomes.

  15. Designing Flood Management Systems for Joint Economic and Ecological Robustness

    NASA Astrophysics Data System (ADS)

    Spence, C. M.; Grantham, T.; Brown, C. M.; Poff, N. L.

    2015-12-01

    Freshwater ecosystems across the United States are threatened by hydrologic change caused by water management operations and non-stationary climate trends. Nonstationary hydrology also threatens flood management systems' performance. Ecosystem managers and flood risk managers need tools to design systems that achieve flood risk reduction objectives while sustaining ecosystem functions and services in an uncertain hydrologic future. Robust optimization is used in water resources engineering to guide system design under climate change uncertainty. Using principles introduced by Eco-Engineering Decision Scaling (EEDS), we extend robust optimization techniques to design flood management systems that meet both economic and ecological goals simultaneously across a broad range of future climate conditions. We use three alternative robustness indices to identify flood risk management solutions that preserve critical ecosystem functions in a case study from the Iowa River, where recent severe flooding has tested the limits of the existing flood management system. We seek design modifications to the system that both reduce expected cost of flood damage while increasing ecologically beneficial inundation of riparian floodplains across a wide range of plausible climate futures. The first robustness index measures robustness as the fraction of potential climate scenarios in which both engineering and ecological performance goals are met, implicitly weighting each climate scenario equally. The second index builds on the first by using climate projections to weight each climate scenario, prioritizing acceptable performance in climate scenarios most consistent with climate projections. The last index measures robustness as mean performance across all climate scenarios, but penalizes scenarios with worse performance than average, rewarding consistency. Results stemming from alternate robustness indices reflect implicit assumptions about attitudes toward risk and reveal the

  16. Robust non-parametric one-sample tests for the analysis of recurrent events.

    PubMed

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. Copyright © 2010 John Wiley & Sons, Ltd.
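
    The unweighted core of such a test is the standardized distance between the observed and expected number of events under the reference rate. This sketch assumes a Poisson reference process and made-up follow-up data; the paper's weighted and stratified versions would multiply each subject's contribution by a weight:

```python
import numpy as np

def one_sample_recurrent_test(n_events, followup, ref_rate):
    """Standardized distance between observed and expected event counts.

    Under a Poisson reference process, each subject's expected count is
    ref_rate * followup; the statistic is approximately standard normal,
    so |z| > 1.96 flags a departure from the reference rate at the 5% level.
    """
    observed = np.sum(n_events)
    expected = ref_rate * np.sum(followup)
    return (observed - expected) / np.sqrt(expected)

# Hypothetical safety data: severe infections per child and years of
# follow-up, compared against a reference rate of 0.5 infections/year.
events = np.array([2, 0, 1, 3, 4, 1])
years = np.array([2.0, 1.5, 2.0, 3.0, 2.5, 1.0])
z = one_sample_recurrent_test(events, years, ref_rate=0.5)
print(round(z, 2))  # 2.04: more events than the reference population predicts
```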

  17. Variable fidelity robust optimization of pulsed laser orbital debris removal under epistemic uncertainty

    NASA Astrophysics Data System (ADS)

    Hou, Liqiang; Cai, Yuanli; Liu, Jin; Hou, Chongyuan

    2016-04-01

    A variable fidelity robust optimization method for pulsed laser orbital debris removal (LODR) under uncertainty is proposed. Dempster-Shafer theory of evidence (DST), which merges interval-based and probabilistic uncertainty modeling, is used in the robust optimization. The robust optimization method optimizes the performance while at the same time maximizing its belief value. A population-based multi-objective optimization (MOO) algorithm based on a steepest-descent-like strategy with proper orthogonal decomposition (POD) is used to search for robust Pareto solutions. Analytical and numerical lifetime predictors are used to evaluate the debris lifetime after the laser pulses. Trust region based fidelity management is designed to reduce the computational cost caused by the expensive model. When the solutions fall into the trust region, the analytical model is used to reduce the computational cost. The proposed robust optimization method is first tested on a set of standard problems and then applied to the removal of Iridium 33 with pulsed lasers. It will be shown that the proposed approach can identify the most robust solutions with minimum lifetime under uncertainty.
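
    In DST, interval-valued evidence is carried by a basic probability assignment over focal intervals, and the belief value being maximized is the mass guaranteed to support a proposition. A minimal sketch with hypothetical lifetime evidence (the intervals and masses are invented):

```python
def belief_plausibility(focal, target):
    """Dempster-Shafer belief/plausibility of 'value lies in target interval'.

    focal: list of ((lo, hi), mass) pairs -- the body of evidence.
    Belief sums masses of focal intervals fully inside the target;
    plausibility sums masses of intervals that merely intersect it.
    Belief <= plausibility always holds, bracketing any probability
    consistent with the evidence.
    """
    tlo, thi = target
    bel = sum(m for (lo, hi), m in focal if lo >= tlo and hi <= thi)
    pl = sum(m for (lo, hi), m in focal if hi >= tlo and lo <= thi)
    return bel, pl

# Hypothetical expert evidence about post-pulse debris lifetime (days):
evidence = [((0, 10), 0.5), ((5, 20), 0.3), ((15, 40), 0.2)]
bel, pl = belief_plausibility(evidence, target=(0, 25))
print(round(bel, 3), round(pl, 3))  # 0.8 1.0
```

A robust optimizer in this spirit would prefer designs whose "lifetime below threshold" proposition has high belief, not just high probability under one assumed distribution.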

  18. Standard and Robust Methods in Regression Imputation

    ERIC Educational Resources Information Center

    Moraveji, Behjat; Jafarian, Koorosh

    2014-01-01

    The aim of this paper is to provide an introduction to new imputation algorithms for estimating missing values in the large data sets of official statistics during data pre-processing, including data contaminated by outliers. The goal is to propose a new algorithm called IRMI (iterative robust model-based imputation). This algorithm is able to deal with all challenges like…

  19. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    PubMed

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
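
    The MC-on-IPM idea can be sketched with a toy two-unit-operation chain: sample process-parameter variation, propagate it through stacked (interacting) unit operations, and count out-of-specification outcomes. All parameters, the interaction term, and the specification limit are invented for illustration:

```python
import numpy as np

def simulate_process(pp, rng):
    """Toy two-unit-operation chain: fermentation titer feeds a capture step.

    The interaction (capture yield degrades when titer is high, as if the
    column were overloaded) is exactly what a per-unit-operation analysis
    would miss and an integrated model captures.
    """
    titer = pp["titer_mean"] + pp["titer_sd"] * rng.standard_normal()
    yield_frac = pp["yield_base"] - 0.02 * max(titer - 5.0, 0.0)
    return titer * yield_frac   # final CQA-relevant output (g/L recovered)

rng = np.random.default_rng(4)
pp = {"titer_mean": 5.0, "titer_sd": 0.6, "yield_base": 0.9}
runs = np.array([simulate_process(pp, rng) for _ in range(10_000)])

spec_lower = 3.5                        # hypothetical lower specification limit
p_oos = float(np.mean(runs < spec_lower))
print(round(p_oos, 3))                  # estimated OOS rate: a few percent here
```

Criticality assessment follows naturally: rerun the Monte Carlo with one parameter's variance tightened and see how much the OOS rate drops.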

  20. Stochastic Noise and Synchronisation during Dictyostelium Aggregation Make cAMP Oscillations Robust

    PubMed Central

    Kim, Jongrae; Heslop-Harrison, Pat; Postlethwaite, Ian; Bates, Declan G

    2007-01-01

    Stable and robust oscillations in the concentration of adenosine 3′, 5′-cyclic monophosphate (cAMP) are observed during the aggregation phase of starvation-induced development in Dictyostelium discoideum. In this paper we use mathematical modelling together with ideas from robust control theory to identify two factors which appear to make crucial contributions to ensuring the robustness of these oscillations. Firstly, we show that stochastic fluctuations in the molecular interactions play an important role in preserving stable oscillations in the face of variations in the kinetics of the intracellular network. Secondly, we show that synchronisation of the aggregating cells through the diffusion of extracellular cAMP is a key factor in ensuring robustness of the oscillatory waves of cAMP observed in Dictyostelium cell cultures to cell-to-cell variations. A striking and quite general implication of the results is that the robustness analysis of models of oscillating biomolecular networks (circadian clocks, Ca2+ oscillations, etc.) can only be done reliably by using stochastic simulations, even in the case where molecular concentrations are very high. PMID:17997595
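
    The standard route to such stochastic simulations is Gillespie's algorithm. This minimal birth-death sketch (not the paper's cAMP network; rates are arbitrary) shows why fluctuations persist even at high molecule counts:

```python
import numpy as np

def gillespie_birth_death(k_prod, k_deg, x0, t_end, rng):
    """Gillespie stochastic simulation of production/degradation of one species.

    Even at high copy number the trajectory fluctuates around k_prod/k_deg
    (with standard deviation ~sqrt of the mean), which is why robustness
    conclusions can differ from those of deterministic ODE models.
    """
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a1, a2 = k_prod, k_deg * x       # reaction propensities
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)   # exponential waiting time to next event
        x += 1 if rng.random() < a1 / a0 else -1
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

rng = np.random.default_rng(5)
_, x = gillespie_birth_death(k_prod=100.0, k_deg=1.0, x0=100, t_end=50.0, rng=rng)
print(x.std() > 0)  # True: fluctuations persist although the mean stays near 100
```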

  1. Dendrites Enable a Robust Mechanism for Neuronal Stimulus Selectivity.

    PubMed

    Cazé, Romain D; Jarvis, Sarah; Foust, Amanda J; Schultz, Simon R

    2017-09-01

    Hearing, vision, touch: underlying all of these senses is stimulus selectivity, a robust information processing operation in which cortical neurons respond more to some stimuli than to others. Previous models assume that these neurons receive the highest weighted input from an ensemble encoding the preferred stimulus, but dendrites enable other possibilities. Nonlinear dendritic processing can produce stimulus selectivity based on the spatial distribution of synapses, even if the total preferred stimulus weight does not exceed that of nonpreferred stimuli. Using a multi-subunit nonlinear model, we demonstrate that stimulus selectivity can arise from the spatial distribution of synapses. We propose this as a general mechanism for information processing by neurons possessing dendritic trees. Moreover, we show that this implementation of stimulus selectivity increases the neuron's robustness to synaptic and dendritic failure. Importantly, our model can maintain stimulus selectivity for a larger range of loss of synapses or dendrites than an equivalent linear model. We then use a layer 2/3 biophysical neuron model to show that our implementation is consistent with two recent experimental observations: (1) one can observe a mixture of selectivities in dendrites that can differ from the somatic selectivity, and (2) hyperpolarization can broaden somatic tuning without affecting dendritic tuning. Our model predicts that an initially nonselective neuron can become selective when depolarized. In addition to motivating new experiments, the model's increased robustness to synapse and dendrite loss provides a starting point for fault-resistant neuromorphic chip development.
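
    The core mechanism can be sketched with a two-subunit model in which each dendritic subunit applies a saturating nonlinearity: identical total synaptic drive yields different somatic responses depending on how it is distributed across subunits. The tanh nonlinearity and the numbers are illustrative choices, not the paper's biophysical model:

```python
import numpy as np

def dendritic_response(synapse_inputs, subunit_assignment, n_subunits=2):
    """Somatic response as a sum of saturating dendritic subunits.

    Each subunit saturates (tanh), so drive spread across subunits is
    transmitted more effectively than the same total drive concentrated
    on one subunit -- selectivity from synapse placement, not weight.
    """
    out = 0.0
    for s in range(n_subunits):
        drive = synapse_inputs[subunit_assignment == s].sum()
        out += np.tanh(drive)            # saturating subunit nonlinearity
    return out

assign = np.array([0, 0, 1, 1])          # which subunit hosts each synapse
# Preferred stimulus: total drive of 4.0 spread across both subunits.
# Nonpreferred: the SAME total drive concentrated on one subunit.
spread = np.array([1.0, 1.0, 1.0, 1.0])
clustered = np.array([2.0, 2.0, 0.0, 0.0])
print(dendritic_response(spread, assign) >
      dendritic_response(clustered, assign))  # True: selectivity with equal weight
```

A purely linear model would return identical responses for the two patterns, which is why the linear equivalent loses selectivity faster under synapse loss.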

  2. Optimum Design of Forging Process Parameters and Preform Shape under Uncertainties

    NASA Astrophysics Data System (ADS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2004-06-01

    Forging is a highly complex non-linear process that is vulnerable to various uncertainties, such as variations in billet geometry, die temperature, material properties, workpiece and forging equipment positional errors and process parameters. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion and production risk. Identifying the sources of uncertainties, quantifying and controlling them will reduce risk in the manufacturing environment, which will minimize the overall cost of production. In this paper, various uncertainties that affect forging tool life and preform design are identified, and their cumulative effect on the forging process is evaluated. Since the forging process simulation is computationally intensive, the response surface approach is used to reduce time by establishing a relationship between the system performance and the critical process design parameters. Variability in system performance due to randomness in the parameters is computed by applying Monte Carlo Simulations (MCS) on generated Response Surface Models (RSM). Finally, a Robust Methodology is developed to optimize forging process parameters and preform shape. The developed method is demonstrated by applying it to an axisymmetric H-cross section disk forging to improve the product quality and robustness.
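
    The response-surface-plus-MCS workflow above can be sketched as: fit a cheap quadratic surrogate to a handful of expensive simulations, then Monte Carlo sample the uncertain parameters through the surrogate. The "forging simulation" here is a stand-in function, not a real process model:

```python
import numpy as np

def expensive_sim(temp, speed):
    """Stand-in for an expensive forging simulation: die wear as a
    hypothetical quadratic response to die temperature and press speed."""
    return 0.5 * (temp - 1.0) ** 2 + 0.3 * speed ** 2 + 0.1 * temp * speed

rng = np.random.default_rng(6)

# Design points and a quadratic basis [1, t, s, t^2, s^2, t*s]:
pts = rng.uniform(-2, 2, size=(30, 2))
def basis(t, s):
    return np.column_stack([np.ones_like(t), t, s, t**2, s**2, t * s])

coef, *_ = np.linalg.lstsq(basis(pts[:, 0], pts[:, 1]),
                           expensive_sim(pts[:, 0], pts[:, 1]), rcond=None)

# Monte Carlo on the surrogate: random variation in temperature and speed
# (distributions are assumed here for illustration).
t_mc = rng.normal(1.0, 0.2, 50_000)
s_mc = rng.normal(0.0, 0.2, 50_000)
wear = basis(t_mc, s_mc) @ coef
print(wear.mean() > 0)   # True: the surrogate reproduces the wear response
```

The 50,000 surrogate evaluations cost microseconds each; running the real simulation that many times is what the response-surface step avoids.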

  3. Reducing regional drought vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Trindade, B. C.; Reed, P. M.; Herman, J. D.; Zeff, H. B.; Characklis, G. W.

    2017-06-01

    Emerging water scarcity concerns in many urban regions are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative drought management strategies. Our results show that appropriately designing adaptive risk-of-failure action triggers required stressing them with a comprehensive sample of deeply uncertain factors in the computational search phase of MORDM. Search under the new ensemble of states-of-the-world is shown to fundamentally change perceived performance tradeoffs and substantially improve the robustness of individual utilities as well as the overall region to water scarcity. Search under deep uncertainty enhanced the discovery of how cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management must be employed jointly to improve regional robustness and decrease robustness conflicts between the utilities. Insights from this work have general merit for regions where

  4. A Robust Gold Deconvolution Approach for LiDAR Waveform Data Processing to Characterize Vegetation Structure

    NASA Astrophysics Data System (ADS)

    Zhou, T.; Popescu, S. C.; Krause, K.; Sheridan, R.; Ku, N. W.

    2014-12-01

    Increasing attention has been paid in the remote sensing community to the next generation Light Detection and Ranging (lidar) waveform data systems for extracting information on topography and the vertical structure of vegetation. However, processing waveform lidar data raises some challenges compared to analyzing discrete return data. The overall goal of this study was to present a robust de-convolution algorithm, the Gold algorithm, used to de-convolve waveforms in a lidar dataset acquired within a 60 x 60 m study area located in the Harvard Forest in Massachusetts. The waveform lidar data was collected by the National Ecological Observatory Network (NEON). Specific objectives were to: (1) explore advantages and limitations of various waveform processing techniques to derive topography and canopy height information; (2) develop and implement a novel de-convolution algorithm, the Gold algorithm, to extract elevation and canopy metrics; and (3) compare results and assess accuracy. We modeled lidar waveforms with a mixture of Gaussian functions using the nonlinear least squares (NLS) algorithm implemented in R and derived a Digital Terrain Model (DTM) and canopy height. We compared our waveform-derived topography and canopy height measurements using the Gold de-convolution algorithm to results using the Richardson-Lucy algorithm. Our findings show that the Gold algorithm performed better than the Richardson-Lucy algorithm in terms of recovering the hidden echoes and detecting false echoes for generating a DTM, which indicates that the Gold algorithm could potentially be applied to processing of waveform lidar data to derive information on terrain elevation and canopy characteristics.
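
    Gold's algorithm is a multiplicative ratio iteration that keeps the deconvolved signal non-negative, which suits lidar waveforms. A one-dimensional sketch with a synthetic two-echo waveform (the kernel width, grid size, and iteration count are arbitrary choices, not NEON parameters):

```python
import numpy as np

def gold_deconvolve(y, kernel, iters=200):
    """Gold's ratio iteration for non-negative deconvolution.

    Modeling y = H @ x, the update x <- x * (H^T y) / (H^T H x) keeps
    every estimate non-negative and progressively sharpens overlapping
    (hidden) echoes into separate peaks.
    """
    n = len(y)
    # Convolution matrix H for a 'same'-size convolution: column i is
    # the kernel centered on sample i.
    H = np.array([np.convolve(np.eye(n)[i], kernel, mode="same")
                  for i in range(n)]).T
    x = np.full(n, y.mean())             # flat, strictly positive start
    Hty = H.T @ y
    for _ in range(iters):
        x *= Hty / (H.T @ (H @ x) + 1e-12)
    return x

# Two overlapping Gaussian echoes blurring a pair of true returns:
n = 64
truth = np.zeros(n)
truth[[25, 32]] = [1.0, 0.7]
kernel = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
blurred = np.convolve(truth, kernel, mode="same")
est = gold_deconvolve(blurred, kernel)
print(int(np.argmax(est)))   # strongest recovered echo at (or near) sample 25
```

Richardson-Lucy uses a similar multiplicative structure but a different ratio; the comparison in the abstract is essentially between these two fixed-point schemes.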

  5. Modular Energy-Efficient and Robust Paradigms for a Disaster-Recovery Process over Wireless Sensor Networks

    PubMed Central

    Razaque, Abdul; Elleithy, Khaled

    2015-01-01

    Robust paradigms are a necessity, particularly for emerging wireless sensor network (WSN) applications. The lack of robust and efficient paradigms causes a reduction in the provision of quality of service (QoS) and additional energy consumption. In this paper, we introduce modular energy-efficient and robust paradigms that involve two archetypes: (1) the operational medium access control (O-MAC) hybrid protocol and (2) the pheromone termite (PT) model. The O-MAC protocol controls overhearing and congestion and increases the throughput, reduces the latency and extends the network lifetime. O-MAC uses an optimized data frame format that reduces the channel access time and provides faster data delivery over the medium. Furthermore, O-MAC uses a novel randomization function that avoids channel collisions. The PT model provides robust routing for single and multiple links and includes two new significant features: (1) determining the packet generation rate to avoid congestion and (2) pheromone sensitivity to determine the link capacity prior to sending the packets on each link. The state-of-the-art research in this work is based on improving both the QoS and energy efficiency. To determine the strength of O-MAC with the PT model; we have generated and simulated a disaster recovery scenario using a network simulator (ns-3.10) that monitors the activities of disaster recovery staff; hospital staff and disaster victims brought into the hospital. Moreover; the proposed paradigm can be used for general purpose applications. Finally; the QoS metrics of the O-MAC and PT paradigms are evaluated and compared with other known hybrid protocols involving the MAC and routing features. The simulation results indicate that O-MAC with PT produced better outcomes. PMID:26153768

  6. A robust post-processing workflow for datasets with motion artifacts in diffusion kurtosis imaging.

    PubMed

    Li, Xianjun; Yang, Jian; Gao, Jie; Luo, Xue; Zhou, Zhenyu; Hu, Yajie; Wu, Ed X; Wan, Mingxi

    2014-01-01

    The aim of this study was to develop a robust post-processing workflow for motion-corrupted datasets in diffusion kurtosis imaging (DKI). The proposed workflow consisted of brain extraction, rigid registration, distortion correction, artifacts rejection, spatial smoothing and tensor estimation. Rigid registration was utilized to correct misalignments. Motion artifacts were rejected by using the local Pearson correlation coefficient (LPCC). The performance of LPCC in characterizing relative differences between artifacts and artifact-free images was compared with that of the conventional correlation coefficient in 10 randomly selected DKI datasets. The influence of rejected artifacts with information of gradient directions and b values on the parameter estimation was investigated by using mean square error (MSE). The variance of noise was used as the criterion for MSEs. The clinical practicality of the proposed workflow was evaluated by the image quality and measurements in regions of interest on 36 DKI datasets, including 18 artifact-free (18 pediatric subjects) and 18 motion-corrupted datasets (15 pediatric subjects and 3 essential tremor patients). The relative difference between artifacts and artifact-free images calculated by LPCC was larger than that of the conventional correlation coefficient (p<0.05). It indicated that LPCC was more sensitive in detecting motion artifacts. MSEs of all derived parameters from the retained data after the artifacts rejection were smaller than the variance of the noise. It suggested that the influence of rejected artifacts was less than the influence of noise on the precision of derived parameters. The proposed workflow improved the image quality and reduced the measurement biases significantly on motion-corrupted datasets (p<0.05). The proposed post-processing workflow was reliable to improve the image quality and the measurement precision of the derived parameters on motion-corrupted DKI datasets. The workflow provided an effective post-processing
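
    The LPCC criterion can be sketched as a patchwise correlation averaged over the image: localized corruption drags down the patch scores it touches far more than a single global correlation coefficient would register. The patch size and test images below are illustrative, not the study's acquisition parameters:

```python
import numpy as np

def local_pearson_cc(img_a, img_b, patch=8):
    """Local Pearson correlation coefficient between two 2-D images.

    Correlation is computed per patch and averaged, so a motion artifact
    confined to a band of the image lowers the score sharply, while a
    single global correlation would dilute it across the whole frame.
    """
    scores = []
    h, w = img_a.shape
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            a = img_a[i:i + patch, j:j + patch].ravel()
            b = img_b[i:i + patch, j:j + patch].ravel()
            if a.std() > 0 and b.std() > 0:
                scores.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(scores))

rng = np.random.default_rng(7)
reference = rng.random((64, 64))
clean = reference + 0.05 * rng.standard_normal((64, 64))
corrupt = clean.copy()
corrupt[24:40, :] = rng.random((16, 64))        # simulated signal-dropout band
print(local_pearson_cc(reference, clean) >
      local_pearson_cc(reference, corrupt))      # True: the band is flagged
```

Thresholding this score against the artifact-free distribution gives a simple accept/reject rule per diffusion-weighted volume.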

  7. Robust Fault Detection for Aircraft Using Mixed Structured Singular Value Theory and Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Collins, Emmanuel G.

    2000-01-01

    The purpose of fault detection is to identify when a fault or failure has occurred in a system such as an aircraft or expendable launch vehicle. The faults may occur in sensors, actuators, structural components, etc. One of the primary approaches to model-based fault detection relies on analytical redundancy. That is, the output of a computer-based model (actually a state estimator) is compared with the sensor measurements of the actual system to determine when a fault has occurred. Unfortunately, the state estimator is based on an idealized mathematical description of the underlying plant that is never totally accurate. As a result of these modeling errors, false alarms can occur. This research uses mixed structured singular value theory, a relatively recent and powerful robustness analysis tool, to develop robust estimators and demonstrates the use of these estimators in fault detection. To allow qualitative human experience to be effectively incorporated into the detection process, fuzzy logic is used to predict the seriousness of the fault that has occurred.

  8. Robust guaranteed-cost adaptive quantum phase estimation

    NASA Astrophysics Data System (ADS)

    Roy, Shibdas; Berry, Dominic W.; Petersen, Ian R.; Huntington, Elanor H.

    2017-05-01

    Quantum parameter estimation plays a key role in many fields like quantum computation, communication, and metrology. Optimal estimation allows one to achieve the most precise parameter estimates, but requires accurate knowledge of the model. Any inevitable uncertainty in the model parameters may heavily degrade the quality of the estimate. It is therefore desired to make the estimation process robust to such uncertainties. Robust estimation was previously studied for a varying phase, where the goal was to estimate the phase at some time in the past, using the measurement results from both before and after that time within a fixed time interval up to current time. Here, we consider a robust guaranteed-cost filter yielding robust estimates of a varying phase in real time, where the current phase is estimated using only past measurements. Our filter minimizes the largest (worst-case) variance in the allowable range of the uncertain model parameter(s) and this determines its guaranteed cost. It outperforms in the worst case the optimal Kalman filter designed for the model with no uncertainty, which corresponds to the center of the possible range of the uncertain parameter(s). Moreover, unlike the Kalman filter, our filter in the worst case always performs better than the best achievable variance for heterodyne measurements, which we consider as the tolerable threshold for our system. Furthermore, we consider effective quantum efficiency and effective noise power, and show that our filter provides the best results by these measures in the worst case.
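
    The guaranteed-cost idea can be illustrated with a classical scalar analogue (not the paper's quantum filter): a fixed-gain tracker of a random-walk phase whose process-noise level q is only known to lie in a range. All numerical values below are invented for illustration.

```python
def steady_state_var(K, q, r):
    """Steady-state error variance P of a fixed-gain scalar tracker of a
    random-walk phase (process noise q, measurement noise r), from
    P = (1-K)^2 (P + q) + K^2 r solved for P.  Valid for 0 < K < 2."""
    a = (1.0 - K) ** 2
    return (a * q + K * K * r) / (1.0 - a)

r = 1.0
q_grid = [0.01 * i for i in range(1, 101)]   # uncertain q anywhere in (0, 1]
gains = [0.001 * i for i in range(1, 1000)]  # candidate gains K in (0, 1)

# Nominal design: optimal for the centre of the uncertainty range.
K_nom = min(gains, key=lambda K: steady_state_var(K, 0.5, r))
# Guaranteed-cost design: minimise the worst-case variance over the range.
K_rob = min(gains, key=lambda K: max(steady_state_var(K, q, r)
                                     for q in q_grid))
```

    The minimax gain comes out larger than the nominal one, and its worst-case variance is lower, mirroring the paper's comparison of the guaranteed-cost filter against the Kalman filter designed for the centre of the uncertain parameter range.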

  9. Robust Speaker Authentication Based on Combined Speech and Voiceprint Recognition

    NASA Astrophysics Data System (ADS)

    Malcangi, Mario

    2009-08-01

    Personal authentication is becoming increasingly important in many applications that have to protect proprietary data. Passwords and personal identification numbers (PINs) prove not to be robust enough to ensure that unauthorized people do not use them. Biometric authentication technology may offer a secure, convenient, accurate solution but sometimes fails due to its intrinsically fuzzy nature. This research aims to demonstrate that combining two basic speech processing methods, voiceprint identification and speech recognition, can provide a very high degree of robustness, especially if fuzzy decision logic is used.
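
    A minimal sketch of the fuzzy decision step, assuming each recognizer emits a confidence score roughly in [0, 1]; the triangular membership shape and the firing level are invented for illustration, not taken from the paper.

```python
def tri(x, lo, peak, hi):
    """Triangular fuzzy membership function."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (peak - lo) if x < peak else (hi - x) / (hi - peak)

def accept(voice_score, speech_score):
    """Fuzzy AND (minimum) of the two 'high confidence' memberships:
    both the voiceprint and the spoken-password match must score high
    before the speaker is accepted."""
    high_voice = tri(voice_score, 0.5, 1.0, 1.5)
    high_speech = tri(speech_score, 0.5, 1.0, 1.5)
    return min(high_voice, high_speech) >= 0.6   # illustrative firing level
```

    Taking the minimum means one confident modality cannot mask a weak one, which is how combining the two methods tightens robustness against impostors.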

  10. Fast and Robust Segmentation and Classification for Change Detection in Urban Point Clouds

    NASA Astrophysics Data System (ADS)

    Roynard, X.; Deschaud, J.-E.; Goulette, F.

    2016-06-01

    Change detection is an important issue in city monitoring to analyse street furniture, road works, car parking, etc. For example, parking surveys are needed but are currently a laborious task involving sending operators into the streets to identify the changes in car locations. In this paper, we propose a method that performs a fast and robust segmentation and classification of urban point clouds, which can be used for change detection. We apply this method to detect cars, as a particular object class, in order to perform parking surveys automatically. A recently proposed method already addresses the need for fast segmentation and classification of urban point clouds, using elevation images. The advantage of working on images is that processing is much faster, proven and robust. However, there may be a loss of information in complex 3D cases: for example, when objects are one above the other, typically a car under a tree or a pedestrian under a balcony. In this paper we propose a method that retains the three-dimensional information while preserving fast computation times and improving segmentation and classification accuracy. It is based on fast region growing using an octree for the segmentation, and on specific descriptors with a Random Forest classifier for the classification. Experiments have been performed on large urban point clouds acquired by Mobile Laser Scanning. They show that the method is as fast as the state of the art, and that it gives more robust results in the complex 3D cases.
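
    The region-growing segmentation can be sketched without the octree, which only accelerates the neighbour queries; this toy version works on a handful of voxelized points, assuming unit voxel size and 26-connectivity, and shows how a car under a tree stays separate from the canopy above it.

```python
from collections import deque
from itertools import product

def segment(voxels):
    """Group occupied voxels into connected components by region growing
    (26-connectivity).  A hash set stands in for the paper's octree when
    looking up neighbours."""
    remaining = set(voxels)
    offsets = [o for o in product((-1, 0, 1), repeat=3) if o != (0, 0, 0)]
    segments = []
    while remaining:
        seed = remaining.pop()
        queue, comp = deque([seed]), {seed}
        while queue:
            x, y, z = queue.popleft()
            for dx, dy, dz in offsets:
                n = (x + dx, y + dy, z + dz)
                if n in remaining:      # grow into adjacent occupied voxels
                    remaining.discard(n)
                    comp.add(n)
                    queue.append(n)
        segments.append(comp)
    return segments

# Two voxels near the ground and two well above them: the vertical gap
# keeps the 'car' and the 'tree crown' in separate segments.
segs = segment([(0, 0, 0), (1, 0, 0), (0, 0, 5), (0, 1, 5)])
```

    Because the growth happens in full 3D, vertically stacked objects that an elevation image would merge remain distinct components.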

  11. Robust Low Cost Liquid Rocket Combustion Chamber by Advanced Vacuum Plasma Process

    NASA Technical Reports Server (NTRS)

    Holmes, Richard; Elam, Sandra; Ellis, David L.; McKechnie, Timothy; Hickman, Robert; Rose, M. Franklin (Technical Monitor)

    2001-01-01

    Next-generation, regeneratively cooled rocket engines will require materials that can withstand high temperatures while retaining high thermal conductivity. Fabrication techniques must be cost efficient so that engine components can be manufactured within the constraints of shrinking budgets. Three technologies have been combined to produce an advanced liquid rocket engine combustion chamber at NASA-Marshall Space Flight Center (MSFC) using relatively low-cost, vacuum-plasma-spray (VPS) techniques. Copper alloy NARloy-Z was replaced with a new high performance Cu-8Cr-4Nb alloy developed by NASA-Glenn Research Center (GRC), which possesses excellent high-temperature strength, creep resistance, and low cycle fatigue behavior combined with exceptional thermal stability. Functional gradient technology, developed while building composite cartridges for space furnaces, was incorporated to add oxidation-resistant and thermal-barrier coatings as an integral part of the hot wall of the liner during the VPS process. NiCrAlY, utilized to produce a durable protective coating for the space shuttle high pressure fuel turbopump (HPFTP) turbine blades, was used as the functional gradient material (FGM) coating. The FGM not only protects against oxidation and blanching, the main causes of engine failure, but also serves as a thermal barrier because of its lower thermal conductivity, reducing the temperature of the combustion liner by 200 F, from 1000 F to 800 F, and thereby extending its life. The objective of this program was to develop and demonstrate the technology to fabricate high-performance, robust, inexpensive combustion chambers for advanced propulsion systems (such as Lockheed-Martin's VentureStar and NASA's Reusable Launch Vehicle, RLV) using the low-cost VPS process. 
VPS formed combustion chamber test articles have been formed with the FGM hot wall built in and hot fire tested, demonstrating for the first time a coating that will remain intact through the hot firing test, and with

  12. Robust image modeling techniques with an image restoration application

    NASA Astrophysics Data System (ADS)

    Kashyap, Rangasami L.; Eom, Kie-Bum

    1988-08-01

    A robust parameter-estimation algorithm for a nonsymmetric half-plane (NSHP) autoregressive model, where the driving noise is a mixture of a Gaussian and an outlier process, is presented. The convergence of the estimation algorithm is proved. An algorithm to estimate parameters and original image intensity simultaneously from the impulse-noise-corrupted image, where the model governing the image is not available, is also presented. The robustness of the parameter estimates is demonstrated by simulation. Finally, an algorithm to restore realistic images is presented. The entire image generally does not obey a simple image model, but a small portion (e.g., 8 x 8) of the image is assumed to obey an NSHP model. The original image is divided into windows and the robust estimation algorithm is applied for each window. The restoration algorithm is tested by comparing it to traditional methods on several different images.

  13. Capacity planning for waste management systems: an interval fuzzy robust dynamic programming approach.

    PubMed

    Nie, Xianghui; Huang, Guo H; Li, Yongping

    2009-11-01

    This study integrates the concepts of interval numbers and fuzzy sets into optimization analysis by dynamic programming as a means of accounting for system uncertainty. The developed interval fuzzy robust dynamic programming (IFRDP) model improves upon previous interval dynamic programming methods. It allows highly uncertain information to be effectively communicated into the optimization process through introducing the concept of fuzzy boundary interval and providing an interval-parameter fuzzy robust programming method for an embedded linear programming problem. Consequently, robustness of the optimization process and solution can be enhanced. The modeling approach is applied to a hypothetical problem for the planning of waste-flow allocation and treatment/disposal facility expansion within a municipal solid waste (MSW) management system. Interval solutions for capacity expansion of waste management facilities and relevant waste-flow allocation are generated and interpreted to provide useful decision alternatives. The results indicate that robust and useful solutions can be obtained, and the proposed IFRDP approach is applicable to practical problems that are associated with highly complex and uncertain information.

  14. ROBUSTNESS OF SIGNALING GRADIENT IN DROSOPHILA WING IMAGINAL DISC

    PubMed Central

    Lei, Jinzhi; Wan, Frederic Y. M.; Lander, Arthur D.; Nie, Qing

    2012-01-01

    Quasi-stable gradients of signaling protein molecules (known as morphogens or ligands) bound to cell receptors are known to be responsible for differential cell signaling and gene expressions. From these follow different stable cell fates and visually patterned tissues in biological development. Recent studies have shown that the relevant basic biological processes yield gradients that are sensitive to small changes in system characteristics (such as expression level of morphogens or receptors) or environmental conditions (such as temperature changes). Additional biological activities must play an important role in the high level of robustness observed, for example, in embryonic patterning. It is natural to attribute observed robustness to various types of feedback control mechanisms. However, our own simulation studies have shown that feedback control is neither necessary nor sufficient for robustness of the morphogen decapentaplegic (Dpp) gradient in the wing imaginal disc of Drosophila. Furthermore, robustness can be achieved by substantial binding of the signaling morphogen Dpp with nonsignaling cell surface bound molecules (such as heparan sulfate proteoglycans) and degrading the resulting complexes at a sufficiently rapid rate. The present work provides a theoretical basis for the results of our numerical simulation studies. PMID:24098092

  15. A robust embedded vision system feasible white balance algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Yuan; Yu, Feihong

    2018-01-01

    White balance is a very important part of the color image processing pipeline. In order to meet the need for efficiency and accuracy in an embedded machine vision processing system, an efficient and robust white balance algorithm combining several classical ones is proposed. The proposed algorithm mainly has three parts. Firstly, in order to guarantee higher efficiency, an initial parameter calculated from the statistics of the R, G and B components of the raw data is used to initialize the following iterative method. After that, the bilinear interpolation algorithm is utilized to implement the demosaicing procedure. Finally, an adaptive step adjustable scheme is introduced to ensure the controllability and robustness of the algorithm. In order to verify the proposed algorithm's performance on an embedded vision system, a smart camera based on the IMX6 DualLite, IMX291 and XC6130 is designed. Extensive experiments on a large number of images under different color temperatures and exposure conditions illustrate that the proposed white balance algorithm avoids the color deviation problem effectively, achieves a good balance between efficiency and quality, and is suitable for embedded machine vision processing systems.
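
    Of the classical methods such an algorithm combines, the gray-world step is the easiest to sketch. The damping `step` parameter below is a loose stand-in for the paper's adaptive step scheme, not its actual formula; pixel values are assumed to be floats in [0, 255].

```python
def gray_world_gains(pixels):
    """Channel gains from the gray-world assumption: the scene average
    should be achromatic, so scale R and B toward the green mean."""
    n = len(pixels)
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    return [avg[1] / avg[c] for c in range(3)]    # gains relative to G

def apply_gains(pixels, gains, step=1.0):
    """Apply gains with an adjustable step; step < 1 damps the correction,
    echoing (loosely) an adaptive-step white balance iteration."""
    return [tuple(min(255.0, p[c] * (1.0 + step * (gains[c] - 1.0)))
                  for c in range(3)) for p in pixels]

# A gray patch under a bluish cast: gray-world pulls it back to neutral.
px = [(80.0, 100.0, 120.0)] * 4
out = apply_gains(px, gray_world_gains(px))
```

    Iterating with a small `step` and re-measuring the channel statistics each pass is one way to keep the correction stable on scenes that violate the gray-world assumption.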

  16. RSRE: RNA structural robustness evaluator

    PubMed Central

    Shu, Wenjie; Zheng, Zhiqiang; Wang, Shengqi

    2007-01-01

    Biological robustness, defined as the ability to maintain stable functioning in the face of various perturbations, is an important and fundamental topic in current biology, and has become a focus of numerous studies in recent years. Although structural robustness has been explored in several types of RNA molecules, the origins of robustness are still controversial. Computational analysis results are needed to make up for the lack of evidence of robustness in natural biological systems. The RNA structural robustness evaluator (RSRE) web server presented here provides a freely available online tool to quantitatively evaluate the structural robustness of RNA based on the widely accepted definition of neutrality. Several classical structure comparison methods are employed; five randomization methods are implemented to generate control sequences; sub-optimal predicted structures can be optionally utilized to mitigate the uncertainty of secondary structure prediction. With a user-friendly interface, the web application is easy to use. Intuitive illustrations are provided along with the original computational results to facilitate analysis. The RSRE will be helpful in the wide exploration of RNA structural robustness and will catalyze our understanding of RNA evolution. The RSRE web server is freely available at http://biosrv1.bmi.ac.cn/RSRE/ or http://biotech.bmi.ac.cn/RSRE/. PMID:17567615
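
    The neutrality computation that RSRE automates can be sketched generically: robustness is the fraction of single-point mutants whose predicted structure equals the wild type's. A real run would plug an RNA secondary-structure predictor in for `fold`; the toy fold below only checks end-to-end base complementarity and is purely illustrative.

```python
def neutrality(seq, fold, alphabet="ACGU"):
    """Fraction of single-point mutants whose folded structure equals the
    wild-type structure.  `fold` is any callable mapping a sequence to a
    comparable structure object."""
    wild = fold(seq)
    same = total = 0
    for i, base in enumerate(seq):
        for b in alphabet:
            if b == base:
                continue
            total += 1
            same += fold(seq[:i] + b + seq[i + 1:]) == wild
    return same / total

# Toy stand-in for a folding routine: records which mirror-image position
# pairs (i, n-1-i) are complementary, Watson-Crick or G-U wobble.
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"),
         ("C", "G"), ("G", "U"), ("U", "G")}

def toy_fold(seq):
    n = len(seq)
    return tuple((seq[i], seq[n - 1 - i]) in PAIRS for i in range(n // 2))
```

    On the toy hairpin "GGGCCC" only the C-to-U mutations are neutral (G-U wobble preserves pairing), giving a neutrality of 3/18.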

  17. Robust excitons inhabit soft supramolecular nanotubes

    PubMed Central

    Eisele, Dörthe M.; Arias, Dylan H.; Fu, Xiaofeng; Bloemsma, Erik A.; Steiner, Colby P.; Jensen, Russell A.; Rebentrost, Patrick; Eisele, Holger; Tokmakoff, Andrei; Lloyd, Seth; Nelson, Keith A.; Nicastro, Daniela; Knoester, Jasper; Bawendi, Moungi G.

    2014-01-01

    Nature's highly efficient light-harvesting antennae, such as those found in green sulfur bacteria, consist of supramolecular building blocks that self-assemble into a hierarchy of close-packed structures. In an effort to mimic the fundamental processes that govern nature’s efficient systems, it is important to elucidate the role of each level of hierarchy: from molecule, to supramolecular building block, to close-packed building blocks. Here, we study the impact of hierarchical structure. We present a model system that mirrors nature’s complexity: cylinders self-assembled from cyanine-dye molecules. Our work reveals that even though close-packing may alter the cylinders’ soft mesoscopic structure, robust delocalized excitons are retained: Internal order and strong excitation-transfer interactions—prerequisites for efficient energy transport—are both maintained. Our results suggest that the cylindrical geometry strongly favors robust excitons; it presents a rational design that is potentially key to nature’s high efficiency, allowing construction of efficient light-harvesting devices even from soft, supramolecular materials. PMID:25092336

  18. Robust Magnetotelluric Impedance Estimation

    NASA Astrophysics Data System (ADS)

    Sutarno, D.

    2010-12-01

    Robust magnetotelluric (MT) response function estimators are now in standard use in the induction community. Properly devised and applied, these have the ability to reduce the influence of unusual data (outliers). The estimators always yield impedance estimates which are better than conventional least-squares (LS) estimates because `real' MT data almost never satisfy the statistical assumptions of Gaussianity and stationarity upon which normal spectral analysis is based. This paper discusses the development and application of robust estimation procedures, which can be classified as M-estimators, to MT data. Starting with a description of the estimators, special attention is addressed to the recent development of bounded-influence robust estimation, including utilization of the Hilbert Transform (HT) operation on causal MT impedance functions. The resulting robust performance is illustrated using synthetic as well as real MT data.

  19. Adaptive and robust statistical methods for processing near-field scanning microwave microscopy images.

    PubMed

    Coakley, K J; Imtiaz, A; Wallis, T M; Weber, J C; Berweger, S; Kabos, P

    2015-03-01

    Near-field scanning microwave microscopy offers great potential to facilitate characterization, development and modeling of materials. By acquiring microwave images at multiple frequencies and amplitudes (along with the other modalities), one can study material and device physics at different lateral and depth scales. Images are typically noisy and contaminated by artifacts that can vary from scan line to scan line, and by planar-like trends due to sample tilt errors. Here, we level images based on an estimate of a smooth 2-d trend determined with a robust implementation of a local regression method. In this robust approach, features and outliers which are not due to the trend are automatically downweighted. We denoise images with the Adaptive Weights Smoothing method. This method smooths out additive noise while preserving edge-like features in images. We demonstrate the feasibility of our methods on topography images and microwave |S11| images. For one challenging test case, we demonstrate that our method outperforms alternative methods from the scanning probe microscopy data analysis software package Gwyddion. Our methods should be useful for massive image data sets where manual selection of landmarks or image subsets by a user is impractical. Published by Elsevier B.V.

  20. On the Interplay between the Evolvability and Network Robustness in an Evolutionary Biological Network: A Systems Biology Approach

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2011-01-01

    In the evolutionary process, the random transmission and mutation of genes provide biological diversities for natural selection. In order to preserve functional phenotypes between generations, gene networks need to evolve robustly under the influence of random perturbations. Therefore, the robustness of the phenotype, in the evolutionary process, exerts a selection force on gene networks to maintain network functions. However, gene networks need to adjust, by variations in genetic content, to generate phenotypes for new challenges in the network’s evolution, i.e., the evolvability. Hence, there should be some interplay between evolvability and network robustness in evolutionary gene networks. In this study, the interplay between the evolvability and network robustness of a gene network and a biochemical network is discussed from a nonlinear stochastic system point of view. It was found that if the genetic robustness plus the environmental robustness is less than the network robustness, the phenotype of the biological network is robust in evolution. The tradeoff between genetic robustness and environmental robustness in evolution is discussed in terms of the stochastic stability robustness and sensitivity of the nonlinear stochastic biological network, which may be relevant to the statistical tradeoff between bias and variance, the so-called bias/variance dilemma. Further, the tradeoff could be considered as an antagonistic pleiotropic action of a gene network and discussed from the systems biology perspective. PMID:22084563

  1. Robust and Effective Component-based Banknote Recognition for the Blind

    PubMed Central

    Hasanuzzaman, Faiz M.; Yang, Xiaodong; Tian, YingLi

    2012-01-01

    We develop a novel camera-based computer vision technology to automatically recognize banknotes to assist visually impaired people. Our banknote recognition system is robust and effective with the following features: 1) high accuracy: a high true recognition rate and a low false recognition rate; 2) robustness: it handles a variety of currency designs and bills in various conditions; 3) high efficiency: it recognizes banknotes quickly; and 4) ease of use: it helps blind users aim the target for image capture. To make the system robust to a variety of conditions including occlusion, rotation, scaling, cluttered background, illumination change, viewpoint variation, and worn or wrinkled bills, we propose a component-based framework using Speeded Up Robust Features (SURF). Furthermore, we employ the spatial relationship of matched SURF features to detect whether there is a bill in the camera view. This process largely alleviates false recognition and can guide the user to correctly aim at the bill to be recognized. The robustness and generalizability of the proposed system are evaluated on a dataset including both positive images (with U.S. banknotes) and negative images (no U.S. banknotes) collected under a variety of conditions. The proposed algorithm achieves a 100% true recognition rate and a 0% false recognition rate. Our banknote recognition system has also been tested by blind users. PMID:22661884
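
    A spatial-relationship check on matched features can be sketched independently of SURF itself: if the matched keypoints really lie on one rigid bill, the template and the camera view are related by (approximately) a similarity transform, so all pairwise distance ratios between matched points agree. The tolerance below is illustrative, and this is only one plausible form of such a check, not the paper's exact test.

```python
from itertools import combinations
from statistics import mean, pstdev

def spatially_consistent(matches, tol=0.1):
    """matches: list of ((x, y) in template, (x, y) in camera image).
    Under a similarity transform (rotation + uniform scale) every pairwise
    distance ratio image/template is the same constant; a small relative
    spread suggests the matches lie on one bill rather than on clutter."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    ratios = [dist(ia, ib) / dist(ta, tb)
              for (ta, ia), (tb, ib) in combinations(matches, 2)
              if dist(ta, tb) > 0]
    return pstdev(ratios) / mean(ratios) < tol

# Consistent matches: the image is the template scaled by 2.
good = [((0, 0), (0, 0)), ((2, 0), (4, 0)),
        ((0, 2), (0, 4)), ((2, 2), (4, 4))]
# Two of the four 'matches' landed on background clutter.
bad = [((0, 0), (0, 0)), ((2, 0), (4, 0)),
       ((0, 2), (10, 10)), ((2, 2), (0, 1))]
```

    Rejecting geometrically inconsistent match sets is what suppresses false recognitions on negative images with no bill present.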

  2. Evaluating optimal therapy robustness by virtual expansion of a sample population, with a case study in cancer immunotherapy

    PubMed Central

    Barish, Syndi; Ochs, Michael F.; Sontag, Eduardo D.; Gevertz, Jana L.

    2017-01-01

    Cancer is a highly heterogeneous disease, exhibiting spatial and temporal variations that pose challenges for designing robust therapies. Here, we propose the VEPART (Virtual Expansion of Populations for Analyzing Robustness of Therapies) technique as a platform that integrates experimental data, mathematical modeling, and statistical analyses for identifying robust optimal treatment protocols. VEPART begins with time course experimental data for a sample population, and a mathematical model fit to aggregate data from that sample population. Using nonparametric statistics, the sample population is amplified and used to create a large number of virtual populations. At the final step of VEPART, robustness is assessed by identifying and analyzing the optimal therapy (perhaps restricted to a set of clinically realizable protocols) across each virtual population. As proof of concept, we have applied the VEPART method to study the robustness of treatment response in a mouse model of melanoma subject to treatment with immunostimulatory oncolytic viruses and dendritic cell vaccines. Our analysis (i) showed that every scheduling variant of the experimentally used treatment protocol is fragile (nonrobust) and (ii) discovered an alternative region of dosing space (lower oncolytic virus dose, higher dendritic cell dose) for which a robust optimal protocol exists. PMID:28716945

  3. The comparison of robust partial least squares regression with robust principal component regression on a real

    NASA Astrophysics Data System (ADS)

    Polat, Esra; Gunay, Suleyman

    2013-10-01

    One of the problems encountered in Multiple Linear Regression (MLR) is multicollinearity, which causes the overestimation of the regression parameters and increases the variance of these parameters. Hence, when multicollinearity is present, biased estimation procedures such as classical Principal Component Regression (CPCR) and Partial Least Squares Regression (PLSR) are performed instead. The SIMPLS algorithm is the leading PLSR algorithm because of its speed and efficiency, and because its results are easier to interpret. However, both CPCR and SIMPLS yield very unreliable results when the data set contains outlying observations. Therefore, Hubert and Vanden Branden (2003) presented a robust PCR (RPCR) method and a robust PLSR (RPLSR) method called RSIMPLS. In RPCR, a robust Principal Component Analysis (PCA) method for high-dimensional data is first applied to the independent variables; then the dependent variables are regressed on the scores using a robust regression method. RSIMPLS is constructed from a robust covariance matrix for high-dimensional data and robust linear regression. The purpose of this study is to show the usage of the RPCR and RSIMPLS methods on an econometric data set, hence comparing the two methods on an inflation model of Turkey. The considered methods have been compared in terms of predictive ability and goodness of fit by using a robust Root Mean Squared Error of Cross-validation (R-RMSECV), a robust R2 value and the Robust Component Selection (RCS) statistic.

  4. Robustness analysis of non-ordinary Petri nets for flexible assembly systems

    NASA Astrophysics Data System (ADS)

    Hsieh, Fu-Shiung

    2010-05-01

    Non-ordinary controlled Petri nets (NCPNs) have the advantage of being able to model flexible assembly systems in which multiple identical resources may be required to perform an operation. However, existing studies on NCPNs are still limited. For example, the robustness properties of NCPNs have not been studied. This motivates us to develop an analysis method for NCPNs. Robustness analysis concerns the ability of a system to maintain operation in the presence of uncertainties. It provides an alternative way to analyse a perturbed system without reanalysis. In our previous research, we have analysed the robustness properties of several subclasses of ordinary controlled Petri nets. To study the robustness properties of NCPNs, we augment NCPNs with an uncertainty model, which specifies an upper bound on the uncertainties for each reachable marking. The resulting PN models are called non-ordinary controlled Petri nets with uncertainties (NCPNU). Based on NCPNU, the problem is to characterise the maximal tolerable uncertainties for each reachable marking. The computational complexity of characterising the maximal tolerable uncertainties for each reachable marking grows exponentially with the size of the net. Instead of considering general NCPNU, we limit our scope to a subclass of PN models for assembly systems, called non-ordinary controlled flexible assembly Petri nets with uncertainties (NCFAPNU), and study its robustness. We extend the robustness analysis to NCFAPNU and identify two types of uncertainties under which the liveness of NCFAPNU can be maintained.

  5. On robust parameter estimation in brain-computer interfacing

    NASA Astrophysics Data System (ADS)

    Samek, Wojciech; Nakajima, Shinichi; Kawanabe, Motoaki; Müller, Klaus-Robert

    2017-12-01

    Objective. The reliable estimation of parameters such as mean or covariance matrix from noisy and high-dimensional observations is a prerequisite for successful application of signal processing and machine learning algorithms in brain-computer interfacing (BCI). This challenging task becomes significantly more difficult if the data set contains outliers, e.g. due to subject movements, eye blinks or loose electrodes, as they may heavily bias the estimation and the subsequent statistical analysis. Although various robust estimators have been developed to tackle the outlier problem, they ignore important structural information in the data and thus may not be optimal. Typical structural elements in BCI data are the trials consisting of a few hundred EEG samples and indicating the start and end of a task. Approach. This work discusses the parameter estimation problem in BCI and introduces a novel hierarchical view on robustness which naturally comprises different types of outlierness occurring in structured data. Furthermore, the class of minimum divergence estimators is reviewed and a robust mean and covariance estimator for structured data is derived and evaluated with simulations and on a benchmark data set. Main results. The results show that state-of-the-art BCI algorithms benefit from robustly estimated parameters. Significance. Since parameter estimation is an integral part of various machine learning algorithms, the presented techniques are applicable to many problems beyond BCI.
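
    The hierarchical, trial-level view of outlierness can be sketched with a much simpler estimator than the paper's minimum-divergence one: judge each trial by its own summary statistic and drop whole artifactual trials (eye blink, loose electrode) before averaging, instead of treating every sample independently. The cutoff k is illustrative.

```python
from statistics import mean, median

def robust_trial_mean(trials, k=3.0):
    """Mean over trials after discarding whole outlier trials: a trial
    whose own mean lies more than k robust deviations (MAD) from the
    median trial mean is treated as artifactual and dropped.  This is the
    'structured' view: outlierness is judged per trial, not per sample."""
    tm = [mean(t) for t in trials]
    med = median(tm)
    mad = median(abs(m - med) for m in tm) or 1e-12   # guard zero MAD
    kept = [m for m in tm if abs(m - med) / mad <= k]
    return mean(kept)

# Three ordinary trials and one grossly contaminated trial.
trials = [[1, 2, 3], [2, 3, 4], [1, 3, 5], [100, 120, 110]]
```

    The naive grand mean of these trials is pulled to about 29.5 by the bad trial, while the trial-level screen recovers roughly 2.67, illustrating why exploiting the trial structure matters in BCI data.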

  6. Robust Satisficing Decision Making for Unmanned Aerial Vehicle Complex Missions under Severe Uncertainty

    PubMed Central

    Ji, Xiaoting; Niu, Yifeng; Shen, Lincheng

    2016-01-01

    This paper presents a robust satisficing decision-making method for Unmanned Aerial Vehicles (UAVs) executing complex missions in an uncertain environment. Motivated by info-gap decision theory, we formulate this problem as a novel robust satisficing optimization problem, in which the objective is to maximize the robustness while satisfying some desired mission requirements. Specifically, a new info-gap based Markov Decision Process (IMDP) is constructed to abstract the uncertain UAV system and specify the complex mission requirements with Linear Temporal Logic (LTL). A robust satisficing policy is obtained to maximize the robustness to the uncertain IMDP while ensuring a desired probability of satisfying the LTL specifications. To this end, we propose a two-stage robust satisficing solution strategy which consists of the construction of a product IMDP and the generation of a robust satisficing policy. In the first stage, a product IMDP is constructed by combining the IMDP with an automaton representing the LTL specifications. In the second stage, an algorithm based on robust dynamic programming is proposed to generate a robust satisficing policy, while an associated robustness evaluation algorithm is presented to evaluate the robustness. Finally, through Monte Carlo simulation, the effectiveness of our algorithms is demonstrated on a UAV search mission under severe uncertainty, so that the resulting policy can maximize the robustness while reaching the desired performance level. Furthermore, by comparing the proposed method with other robust decision-making methods, we conclude that our policy can tolerate higher uncertainty while guaranteeing the desired performance level, which indicates that the proposed method is much more effective in real applications. PMID:27835670
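
    The info-gap robustness measure behind the method can be sketched in scalar form, as a toy stand-in for the IMDP machinery: robustness is the largest horizon of uncertainty at which the worst-case performance still meets the requirement. The policies, reward functions and numbers below are all invented for illustration, and the sketch assumes a reward monotone in the uncertain parameter so the worst case sits at an interval endpoint.

```python
def robustness(policy_reward, nominal, requirement, alphas):
    """Info-gap robustness of a decision: the largest alpha (from the
    ascending grid `alphas`) such that the worst-case reward over
    [nominal - alpha, nominal + alpha] still meets the requirement.
    Assumes the reward is monotone in the parameter, so checking the two
    endpoints suffices."""
    best = 0.0
    for a in alphas:
        worst = min(policy_reward(nominal - a), policy_reward(nominal + a))
        if worst >= requirement:
            best = a
    return best

alphas = [0.01 * i for i in range(81)]
# 'Aggressive' policy: higher nominal reward, but fully exposed to theta.
r_aggressive = robustness(lambda t: t, 0.8, 0.6, alphas)
# 'Satisficing' policy: lower nominal reward, much flatter in theta.
r_satisficing = robustness(lambda t: 0.55 + 0.25 * t, 0.8, 0.6, alphas)
```

    The flatter policy is nominally worse (0.75 versus 0.8) yet tolerates roughly three times the uncertainty before violating the requirement, which is exactly the satisficing trade the paper exploits.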

  7. Robust Satisficing Decision Making for Unmanned Aerial Vehicle Complex Missions under Severe Uncertainty.

    PubMed

    Ji, Xiaoting; Niu, Yifeng; Shen, Lincheng

    2016-01-01

    This paper presents a robust satisficing decision-making method for Unmanned Aerial Vehicles (UAVs) executing complex missions in an uncertain environment. Motivated by info-gap decision theory, we formulate this problem as a novel robust satisficing optimization problem, in which the objective is to maximize the robustness while satisfying some desired mission requirements. Specifically, a new info-gap based Markov Decision Process (IMDP) is constructed to abstract the uncertain UAV system and specify the complex mission requirements with Linear Temporal Logic (LTL). A robust satisficing policy is obtained to maximize the robustness to the uncertain IMDP while ensuring a desired probability of satisfying the LTL specifications. To this end, we propose a two-stage robust satisficing solution strategy which consists of the construction of a product IMDP and the generation of a robust satisficing policy. In the first stage, a product IMDP is constructed by combining the IMDP with an automaton representing the LTL specifications. In the second stage, an algorithm based on robust dynamic programming is proposed to generate a robust satisficing policy, while an associated robustness evaluation algorithm is presented to evaluate the robustness. Finally, through Monte Carlo simulation, the effectiveness of our algorithms is demonstrated on a UAV search mission under severe uncertainty, so that the resulting policy can maximize the robustness while reaching the desired performance level. Furthermore, by comparing the proposed method with other robust decision-making methods, we conclude that our policy can tolerate higher uncertainty while guaranteeing the desired performance level, which indicates that the proposed method is much more effective in real applications.

  8. Designing robust control laws using genetic algorithms

    NASA Technical Reports Server (NTRS)

    Marrison, Chris

    1994-01-01

    The purpose of this research is to create a method of finding practical, robust control laws. The robustness of a controller is judged by Stochastic Robustness metrics and the level of robustness is optimized by searching for design parameters that minimize a robustness cost function.
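The search loop described above can be sketched as follows. This is a minimal illustration with an invented scalar plant (x' = (a + da)x + (b + db)u, feedback u = -k*x), not Marrison's actual metrics or cost function: the stochastic robustness cost is estimated as the fraction of sampled parameter perturbations that destabilize the closed loop, and a small genetic algorithm searches for the gain that minimizes it.

```python
import random

def robustness_cost(k, trials=200, seed=0):
    # fraction of sampled perturbations for which gain k destabilizes
    # the toy plant x' = (a + da)x + (b + db)u with feedback u = -k*x
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        a = 1.0 + rng.gauss(0.0, 0.5)     # uncertain plant pole
        b = 1.0 + rng.gauss(0.0, 0.2)     # uncertain input gain
        if a - b * k >= 0:                # closed-loop pole not stable
            failures += 1
    return failures / trials

def genetic_search(pop_size=30, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=robustness_cost)     # rank candidates by cost
        elite = pop[: pop_size // 2]
        # refill by crossover (averaging two parents) plus mutation
        pop = elite + [
            0.5 * (rng.choice(elite) + rng.choice(elite)) + rng.gauss(0.0, 0.3)
            for _ in range(pop_size - len(elite))
        ]
    return min(pop, key=robustness_cost)

best_k = genetic_search()
print(best_k, robustness_cost(best_k))
```

Fixed seeds make the sampled perturbations identical across evaluations (common random numbers), so the cost ranking inside the genetic algorithm is stable.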

  9. Robustness of Controllability for Networks Based on Edge-Attack

    PubMed Central

    Nie, Sen; Wang, Xuwen; Zhang, Haifeng; Li, Qilang; Wang, Binghong

    2014-01-01

We study the controllability of networks during cascading failures under two attacking strategies: random and intentional edge attack. For the highest-load edge attack, we find that the controllability of Erdős-Rényi networks with moderate average degree is less robust, whereas scale-free networks with a moderate power-law exponent show strong robustness of controllability under the same attack strategy. The vulnerability of controllability under random and intentional attacks behaves differently as the removal fraction increases; in particular, we find that the robustness of control plays an important role in cascades at large removal fractions. The simulation results show that, for scale-free networks with various power-law exponents, a larger scale of cascades does not imply a larger increase in the number of driver nodes. Meanwhile, the number of driver nodes during cascading failures is also related to the number of edges in the strongly connected components. PMID:24586507

  10. Robustness of controllability for networks based on edge-attack.

    PubMed

    Nie, Sen; Wang, Xuwen; Zhang, Haifeng; Li, Qilang; Wang, Binghong

    2014-01-01

We study the controllability of networks during cascading failures under two attacking strategies: random and intentional edge attack. For the highest-load edge attack, we find that the controllability of Erdős-Rényi networks with moderate average degree is less robust, whereas scale-free networks with a moderate power-law exponent show strong robustness of controllability under the same attack strategy. The vulnerability of controllability under random and intentional attacks behaves differently as the removal fraction increases; in particular, we find that the robustness of control plays an important role in cascades at large removal fractions. The simulation results show that, for scale-free networks with various power-law exponents, a larger scale of cascades does not imply a larger increase in the number of driver nodes. Meanwhile, the number of driver nodes during cascading failures is also related to the number of edges in the strongly connected components.
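The driver-node count tracked in such studies comes from the maximum-matching result of structural controllability, N_D = max(N - |M*|, 1), where M* is a maximum matching of the bipartite graph linking out- and in-copies of each node. The sketch below implements that count with a simple augmenting-path matching and removes one high-load edge; the endpoint-degree product used as the load proxy is our assumption for illustration, not the paper's exact load measure.

```python
def max_matching(nodes, edges):
    # augmenting-path bipartite matching: left = tails, right = heads
    match_r = {}                        # head -> currently matched tail
    adj = {u: [] for u in nodes}
    for u, v in edges:
        adj[u].append(v)
    def augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if v not in match_r or augment(match_r[v], seen):
                match_r[v] = u
                return True
        return False
    return sum(augment(u, set()) for u in nodes)

def driver_nodes(nodes, edges):
    # structural controllability: N_D = max(N - |maximum matching|, 1)
    return max(len(nodes) - max_matching(nodes, edges), 1)

nodes = [0, 1, 2, 3, 4]
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (1, 4)]
print(driver_nodes(nodes, edges))            # → 1 (fully matchable chain)

# remove the edge with the highest endpoint-degree product (load proxy)
deg = {u: 0 for u in nodes}
for u, v in edges:
    deg[u] += 1
    deg[v] += 1
worst = max(edges, key=lambda e: deg[e[0]] * deg[e[1]])
remaining = [e for e in edges if e != worst]
print(driver_nodes(nodes, remaining))        # → 2 (attack costs a driver)
```

On this five-node toy network a single driver node suffices until the high-load edge is removed, after which two are needed.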

  11. Robust Control Systems.

    DTIC Science & Technology

    1981-12-01

time control system algorithms that will perform adequately (i.e., at least maintain closed-loop system stability) when uncertain parameters in the...system design models vary significantly. Such a control algorithm is said to have stability robustness, or more simply is said to be "robust". This...cases above, the performance is analyzed using a covariance analysis. The development of all the controllers and the performance analysis algorithms is

  12. A data mining paradigm for identifying key factors in biological processes using gene expression data.

    PubMed

    Li, Jin; Zheng, Le; Uchiyama, Akihiko; Bin, Lianghua; Mauro, Theodora M; Elias, Peter M; Pawelczyk, Tadeusz; Sakowicz-Burkiewicz, Monika; Trzeciak, Magdalena; Leung, Donald Y M; Morasso, Maria I; Yu, Peng

    2018-06-13

A large volume of biological data is being generated for studying mechanisms of various biological processes. These valuable data enable large-scale computational analyses to gain biological insights. However, it remains a challenge to mine the data efficiently for knowledge discovery. The heterogeneity of these data makes it difficult to consistently integrate them, slowing down the process of biological discovery. We introduce a data processing paradigm to identify key factors in biological processes via systematic collection of gene expression datasets, primary analysis of data, and evaluation of consistent signals. To demonstrate its effectiveness, our paradigm was applied to epidermal development and identified many genes that play a potential role in this process. Besides the known epidermal development genes, a substantial proportion of the identified genes are still not supported by gain- or loss-of-function studies, yielding many novel genes for future studies. Among them, we selected a top gene for loss-of-function experimental validation and confirmed its function in epidermal differentiation, proving the ability of this paradigm to identify new factors in biological processes. In addition, this paradigm revealed many key genes in cold-induced thermogenesis using data from cold-challenged tissues, demonstrating its generalizability. This paradigm can lead to fruitful results for studying molecular mechanisms in an era of explosive accumulation of publicly available biological data.
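The "evaluation of consistent signals" step can be illustrated with a toy scoring rule (hypothetical calls and placeholder gene names; the paper's actual pipeline and statistics are more involved): each dataset votes a gene up, down, or flat, and genes are ranked by the fraction of datasets agreeing on one direction.

```python
# Hypothetical per-dataset differential-expression calls for three genes
calls = {
    "GENE_A": ["up", "up", "up", "flat", "up"],
    "GENE_B": ["up", "up", "flat", "up", "up"],
    "GENE_C": ["up", "down", "flat", "up", "down"],
}

def consistency(gene_calls):
    # fraction of datasets agreeing on the dominant direction
    return max(gene_calls.count("up"), gene_calls.count("down")) / len(gene_calls)

ranked = sorted(calls, key=lambda g: consistency(calls[g]), reverse=True)
print(ranked)    # the inconsistently called gene ranks last
```

Genes called in the same direction in most datasets rise to the top of the candidate list, while genes with conflicting calls fall to the bottom.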

  13. Performance and Ageing Robustness of Graphite/NMC Pouch Prototypes Manufactured through Eco-Friendly Materials and Processes.

    PubMed

    Loeffler, Nicholas; Kim, Guk-T; Passerini, Stefano; Gutierrez, Cesar; Cendoya, Iosu; De Meatza, Iratxe; Alessandrini, Fabrizio; Appetecchi, Giovanni B

    2017-09-22

Graphite/lithium nickel-manganese-cobalt oxide (NMC) stacked pouch cells with a nominal capacity of 15-18 Ah were designed, developed, and manufactured for automotive applications within the framework of the European Project GREENLION. A natural, water-soluble material was used as the main electrode binder, allowing the use of H2O as the only processing solvent. The electrode formulations were developed, optimized, and upscaled for cell manufacturing. Prolonged cycling and ageing tests revealed excellent capacity retention and robustness toward degradation phenomena. For instance, more than 99% of the initial capacity is retained over 500 full charge/discharge cycles, corresponding to a fading of 0.004% per cycle, and about 80% of the initial capacity is delivered after 8 months of ageing at 45 °C. The stacked soft-packaged cells have shown very reproducible characteristics and performance, reflecting the quality of the design and manufacturing. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A Methodology for Robust Comparative Life Cycle Assessments Incorporating Uncertainty.

    PubMed

    Gregory, Jeremy R; Noshadravan, Arash; Olivetti, Elsa A; Kirchain, Randolph E

    2016-06-21

    We propose a methodology for conducting robust comparative life cycle assessments (LCA) by leveraging uncertainty. The method evaluates a broad range of the possible scenario space in a probabilistic fashion while simultaneously considering uncertainty in input data. The method is intended to ascertain which scenarios have a definitive environmentally preferable choice among the alternatives being compared and the significance of the differences given uncertainty in the parameters, which parameters have the most influence on this difference, and how we can identify the resolvable scenarios (where one alternative in the comparison has a clearly lower environmental impact). This is accomplished via an aggregated probabilistic scenario-aware analysis, followed by an assessment of which scenarios have resolvable alternatives. Decision-tree partitioning algorithms are used to isolate meaningful scenario groups. In instances where the alternatives cannot be resolved for scenarios of interest, influential parameters are identified using sensitivity analysis. If those parameters can be refined, the process can be iterated using the refined parameters. We also present definitions of uncertainty quantities that have not been applied in the field of LCA and approaches for characterizing uncertainty in those quantities. We then demonstrate the methodology through a case study of pavements.
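The aggregated probabilistic comparison can be sketched as follows (all impact numbers are invented for illustration; the paper's method also includes decision-tree partitioning and sensitivity analysis, which are omitted here): for each scenario, uncertain impacts of the two alternatives are sampled, and a scenario is declared resolved only when one alternative wins in at least 90% of draws.

```python
import random

def compare(scenarios, draws=2000, threshold=0.9, seed=0):
    # for each scenario, sample uncertain unit impacts of alternatives
    # A and B and record how often A has the lower impact
    rng = random.Random(seed)
    resolved = {}
    for name, (mu_a, mu_b, sigma) in scenarios.items():
        wins_a = sum(
            rng.gauss(mu_a, sigma) < rng.gauss(mu_b, sigma)
            for _ in range(draws)
        )
        p = wins_a / draws
        if p >= threshold:
            resolved[name] = "A"
        elif 1 - p >= threshold:
            resolved[name] = "B"
    return resolved

scenarios = {
    "low_traffic":  (100.0, 130.0, 10.0),   # A clearly better
    "high_traffic": (120.0, 118.0, 15.0),   # too close to call
}
print(compare(scenarios))   # only the resolvable scenario appears
```

Unresolved scenarios like the close "high_traffic" case are exactly those the method would hand to sensitivity analysis to find the parameters worth refining.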

  15. Thermo-compressive transfer printing for facile alignment and robust device integration of nanowires.

    PubMed

    Lee, Won Seok; Won, Sejeong; Park, Jeunghee; Lee, Jihye; Park, Inkyu

    2012-06-07

    Controlled alignment and mechanically robust bonding between nanowires (NWs) and electrodes are essential requirements for reliable operation of functional NW-based electronic devices. In this work, we developed a novel process for the alignment and bonding between NWs and metal electrodes by using thermo-compressive transfer printing. In this process, bottom-up synthesized NWs were aligned in parallel by shear loading onto the intermediate substrate and then finally transferred onto the target substrate with low melting temperature metal electrodes. In particular, multi-layer (e.g. Cr/Au/In/Au and Cr/Cu/In/Au) metal electrodes are softened at low temperatures (below 100 °C) and facilitate submergence of aligned NWs into the surface of electrodes at a moderate pressure (∼5 bar). By using this thermo-compressive transfer printing process, robust electrical and mechanical contact between NWs and metal electrodes can be realized. This method is believed to be very useful for the large-area fabrication of NW-based electrical devices with improved mechanical robustness, electrical contact resistance, and reliability.

  16. Robust, Optimal Water Infrastructure Planning Under Deep Uncertainty Using Metamodels

    NASA Astrophysics Data System (ADS)

    Maier, H. R.; Beh, E. H. Y.; Zheng, F.; Dandy, G. C.; Kapelan, Z.

    2015-12-01

    Optimal long-term planning plays an important role in many water infrastructure problems. However, this task is complicated by deep uncertainty about future conditions, such as the impact of population dynamics and climate change. One way to deal with this uncertainty is by means of robustness, which aims to ensure that water infrastructure performs adequately under a range of plausible future conditions. However, as robustness calculations require computationally expensive system models to be run for a large number of scenarios, it is generally computationally intractable to include robustness as an objective in the development of optimal long-term infrastructure plans. In order to overcome this shortcoming, an approach is developed that uses metamodels instead of computationally expensive simulation models in robustness calculations. The approach is demonstrated for the optimal sequencing of water supply augmentation options for the southern portion of the water supply for Adelaide, South Australia. A 100-year planning horizon is subdivided into ten equal decision stages for the purpose of sequencing various water supply augmentation options, including desalination, stormwater harvesting and household rainwater tanks. The objectives include the minimization of average present value of supply augmentation costs, the minimization of average present value of greenhouse gas emissions and the maximization of supply robustness. The uncertain variables are rainfall, per capita water consumption and population. Decision variables are the implementation stages of the different water supply augmentation options. Artificial neural networks are used as metamodels to enable all objectives to be calculated in a computationally efficient manner at each of the decision stages. The results illustrate the importance of identifying optimal staged solutions to ensure robustness and sustainability of water supply into an uncertain long-term future.
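The metamodelling idea is easy to sketch. Below, a piecewise-linear interpolant over a handful of "expensive" simulator runs stands in for the artificial neural network, and robustness is the fraction of sampled demand futures with a non-negative supply margin; the simulator and all numbers are invented for illustration.

```python
import bisect
import random

def simulator(demand_gl):                  # stand-in for the slow model
    return 210.0 - 1.1 * demand_gl         # supply margin (GL/year)

def build_metamodel(train_points):
    # cheap surrogate: piecewise-linear interpolation of a few runs
    xs = sorted(train_points)
    ys = [simulator(x) for x in xs]
    def metamodel(x):
        i = min(max(bisect.bisect_left(xs, x), 1), len(xs) - 1)
        t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
        return ys[i - 1] + t * (ys[i] - ys[i - 1])
    return metamodel

def robustness(model, scenarios):
    # fraction of sampled futures in which supply covers demand
    return sum(model(d) >= 0 for d in scenarios) / len(scenarios)

rng = random.Random(0)
futures = [rng.uniform(150.0, 250.0) for _ in range(5000)]   # demand draws
meta = build_metamodel([150.0, 175.0, 200.0, 225.0, 250.0])
print(robustness(meta, futures))           # cheap surrogate estimate
print(robustness(simulator, futures))      # expensive reference
```

The surrogate is evaluated thousands of times while the expensive model is run only at the five training points, which is what makes robustness tractable as an optimization objective.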

  17. The role of the PIRT process in identifying code improvements and executing code development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, G.E.; Boyack, B.E.

    1997-07-01

In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a "low" probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.

  18. A Statistical Approach Reveals Designs for the Most Robust Stochastic Gene Oscillators

    PubMed Central

    2016-01-01

The engineering of transcriptional networks presents many challenges due to the inherent uncertainty in the system structure, changing cellular context, and stochasticity in the governing dynamics. One approach to address these problems is to design and build systems that can function across a range of conditions; that is, they are robust to uncertainty in their constituent components. Here we examine the parametric robustness landscape of transcriptional oscillators, which underlie many important processes such as circadian rhythms and the cell cycle, and also serve as a model for the engineering of complex and emergent phenomena. The central questions that we address are: Can we build genetic oscillators that are more robust than those already constructed? Can we make genetic oscillators arbitrarily robust? These questions are technically challenging due to the large model and parameter spaces that must be efficiently explored. Here we use a measure of robustness that coincides with the Bayesian model evidence, combined with an efficient Monte Carlo method to traverse model space and concentrate on regions of high robustness, which enables the accurate evaluation of the relative robustness of gene network models governed by stochastic dynamics. We report the most robust two and three gene oscillator systems, and examine how the number of interactions, the presence of autoregulation, and degradation of mRNA and protein affect the frequency, amplitude, and robustness of transcriptional oscillators. We also find that there is a limit to parametric robustness, beyond which there is nothing to be gained by adding additional feedback. Importantly, we provide predictions on new oscillator systems that can be constructed to verify the theory and advance design and modeling approaches to systems and synthetic biology. PMID:26835539

  19. Development and pilot test of a process to identify research needs from a systematic review.

    PubMed

    Saldanha, Ian J; Wilson, Lisa M; Bennett, Wendy L; Nicholson, Wanda K; Robinson, Karen A

    2013-05-01

    To ensure appropriate allocation of research funds, we need methods for identifying high-priority research needs. We developed and pilot tested a process to identify needs for primary clinical research using a systematic review in gestational diabetes mellitus. We conducted eight steps: abstract research gaps from a systematic review using the Population, Intervention, Comparison, Outcomes, and Settings (PICOS) framework; solicit feedback from the review authors; translate gaps into researchable questions using the PICOS framework; solicit feedback from multidisciplinary stakeholders at our institution; establish consensus among multidisciplinary external stakeholders on the importance of the research questions using the Delphi method; prioritize outcomes; develop conceptual models to highlight research needs; and evaluate the process. We identified 19 research questions. During the Delphi method, external stakeholders established consensus for 16 of these 19 questions (15 with "high" and 1 with "medium" clinical benefit/importance). We pilot tested an eight-step process to identify clinically important research needs. Before wider application of this process, it should be tested using systematic reviews of other diseases. Further evaluation should include assessment of the usefulness of the research needs generated using this process for primary researchers and funders. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Relationship of cranial robusticity to cranial form, geography and climate in Homo sapiens.

    PubMed

    Baab, Karen L; Freidline, Sarah E; Wang, Steven L; Hanson, Timothy

    2010-01-01

    Variation in cranial robusticity among modern human populations is widely acknowledged but not well-understood. While the use of "robust" cranial traits in hominin systematics and phylogeny suggests that these characters are strongly heritable, this hypothesis has not been tested. Alternatively, cranial robusticity may be a response to differences in diet/mastication or it may be an adaptation to cold, harsh environments. This study quantifies the distribution of cranial robusticity in 14 geographically widespread human populations, and correlates this variation with climatic variables, neutral genetic distances, cranial size, and cranial shape. With the exception of the occipital torus region, all traits were positively correlated with each other, suggesting that they should not be treated as individual characters. While males are more robust than females within each of the populations, among the independent variables (cranial shape, size, climate, and neutral genetic distances), only shape is significantly correlated with inter-population differences in robusticity. Two-block partial least-squares analysis was used to explore the relationship between cranial shape (captured by three-dimensional landmark data) and robusticity across individuals. Weak support was found for the hypothesis that robusticity was related to mastication as the shape associated with greater robusticity was similar to that described for groups that ate harder-to-process diets. Specifically, crania with more prognathic faces, expanded glabellar and occipital regions, and (slightly) longer skulls were more robust than those with rounder vaults and more orthognathic faces. However, groups with more mechanically demanding diets (hunter-gatherers) were not always more robust than groups practicing some form of agriculture.

  1. Robust Fixed-Structure Controller Synthesis

    NASA Technical Reports Server (NTRS)

    Corrado, Joseph R.; Haddad, Wassim M.; Gupta, Kajal (Technical Monitor)

    2000-01-01

The ability to develop an integrated control system design methodology for robust, high-performance controllers satisfying multiple design criteria and real-world hardware constraints constitutes a challenging task. The increasingly stringent performance specifications required for controlling such systems necessitate a trade-off between controller complexity and robustness. The principal challenge of minimal-complexity robust control design is to arrive at a tractable control design formulation in spite of the extreme complexity of such systems. Hence, the design of minimal-complexity robust controllers for systems in the face of modeling errors has been a major preoccupation of system and control theorists and practitioners for the past several decades.

  2. Robustness of spatial micronetworks

    NASA Astrophysics Data System (ADS)

    McAndrew, Thomas C.; Danforth, Christopher M.; Bagrow, James P.

    2015-04-01

    Power lines, roadways, pipelines, and other physical infrastructure are critical to modern society. These structures may be viewed as spatial networks where geographic distances play a role in the functionality and construction cost of links. Traditionally, studies of network robustness have primarily considered the connectedness of large, random networks. Yet for spatial infrastructure, physical distances must also play a role in network robustness. Understanding the robustness of small spatial networks is particularly important with the increasing interest in microgrids, i.e., small-area distributed power grids that are well suited to using renewable energy resources. We study the random failures of links in small networks where functionality depends on both spatial distance and topological connectedness. By introducing a percolation model where the failure of each link is proportional to its spatial length, we find that when failures depend on spatial distances, networks are more fragile than expected. Accounting for spatial effects in both construction and robustness is important for designing efficient microgrids and other network infrastructure.
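The percolation model described above is straightforward to simulate. In the sketch below (toy four-node geometry and an invented failure constant), each link fails independently with probability proportional to its Euclidean length, and robustness is estimated as the probability that the surviving network remains connected.

```python
import random

def connected(n, edges):
    # depth-first search from node 0 over the surviving links
    adj = {i: [] for i in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen) == n

def survival_probability(pos, edges, beta=0.3, trials=5000, seed=0):
    rng = random.Random(seed)
    def length(e):
        (x1, y1), (x2, y2) = pos[e[0]], pos[e[1]]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    alive_count = 0
    for _ in range(trials):
        # each link fails with probability beta * (its spatial length)
        alive = [e for e in edges if rng.random() > beta * length(e)]
        alive_count += connected(len(pos), alive)
    return alive_count / trials

pos = [(0, 0), (1, 0), (1, 1), (0, 1)]             # unit square
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(survival_probability(pos, ring))
print(survival_probability(pos, ring + [(0, 2)]))  # diagonal adds redundancy
```

The longer diagonal shortcut fails more often than any ring edge, yet the redundancy it adds still raises the overall survival probability, illustrating the cost/robustness trade-off for spatial links.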

  3. Robust object tracking techniques for vision-based 3D motion analysis applications

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications, including industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the acquired data, as well as convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high acquisition speed and the potential for high accuracy and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capture process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes from two to four technical vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for highly accurate 3D measurements. A set of algorithms, both for detecting, identifying, and tracking similar targets and for marker-less object motion capture, has been developed and tested. The results of the algorithms' evaluation show high robustness and reliability for various motion analysis tasks in technical and biomechanical applications.

  4. Gradient descent for robust kernel-based regression

    NASA Astrophysics Data System (ADS)

    Guo, Zheng-Chu; Hu, Ting; Shi, Lei

    2018-06-01

In this paper, we study the gradient descent algorithm generated by a robust loss function over a reproducing kernel Hilbert space (RKHS). The loss function is defined by a windowing function G and a scale parameter σ, and can include a wide range of commonly used robust losses for regression. There is still a gap between the theoretical analysis and the optimization process of empirical risk minimization based on such losses: the estimator needs to be globally optimal in the theoretical analysis, while the optimization method cannot ensure the global optimality of its solutions. In this paper, we aim to fill this gap by developing a novel theoretical analysis of the performance of estimators generated by the gradient descent algorithm. We demonstrate that, with an appropriately chosen scale parameter σ, the gradient update with early stopping rules can approximate the regression function. Our error analysis leads to convergence in both the standard L2 norm and the strong RKHS norm, both of which are optimal in the minimax sense. We show that the scale parameter σ plays an important role in providing robustness as well as fast convergence. Numerical experiments on synthetic examples and a real data set also support our theoretical results.
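A minimal sketch of the algorithm analysed above, with a Gaussian window (a Welsch-type loss l(r) = (σ²/2)(1 - exp(-r²/σ²))) and a fixed iteration budget standing in for the early stopping rule; the data, kernel width, and constants are illustrative.

```python
import math
import random

def gaussian_kernel(x, y, width=0.5):
    return math.exp(-((x - y) ** 2) / (2 * width ** 2))

def fit(xs, ys, sigma=1.0, step=0.5, iters=200):
    # kernel gradient descent: the estimator is f = sum_j alpha_j k(x_j, .)
    n = len(xs)
    K = [[gaussian_kernel(xs[i], xs[j]) for j in range(n)] for i in range(n)]
    alpha = [0.0] * n
    for _ in range(iters):
        preds = [sum(K[i][j] * alpha[j] for j in range(n)) for i in range(n)]
        # derivative of the Welsch loss: l'(r) = r * exp(-r^2 / sigma^2),
        # so large residuals (outliers) contribute almost nothing
        grad = [(preds[i] - ys[i])
                * math.exp(-((preds[i] - ys[i]) ** 2) / sigma ** 2)
                for i in range(n)]
        alpha = [alpha[j] - step * grad[j] / n for j in range(n)]
    return lambda x: sum(a * gaussian_kernel(x, xj) for a, xj in zip(alpha, xs))

rng = random.Random(0)
xs = [i / 20 for i in range(21)]
ys = [math.sin(3 * x) + rng.gauss(0, 0.05) for x in xs]
ys[10] = 5.0                       # plant a gross outlier at x = 0.5
f = fit(xs, ys)
print(f(0.5))                      # stays near sin(1.5), not the outlier
```

Because the loss derivative vanishes for large residuals, the planted outlier exerts almost no pull on the fit, while the remaining points are regressed essentially as in least squares.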

  5. Robust statistical methods for impulse noise suppressing of spread spectrum induced polarization data, with application to a mine site, Gansu province, China

    NASA Astrophysics Data System (ADS)

    Liu, Weiqiang; Chen, Rujun; Cai, Hongzhu; Luo, Weibin

    2016-12-01

In this paper, we investigate the robust processing of noisy spread spectrum induced polarization (SSIP) data. SSIP is a new frequency-domain induced polarization method that transmits a pseudo-random m-sequence as the source current, where the m-sequence is a broadband signal; the potential information at multiple frequencies can be obtained through measurement. Removing noise is a crucial problem in SSIP data processing. If ordinary mean stacking and digital filtering cannot effectively reduce the impulse noise, its impact will remain in the complex resistivity spectrum and affect the interpretation of profile anomalies. We implemented a robust statistical method for SSIP data processing: robust least-squares regression is used to fit and remove the linear trend from the original data before stacking; a robust M-estimate is used to stack the data of all periods; and a robust smoothing filter is used to suppress the residual noise after stacking. For the robust statistical scheme, the most appropriate influence function and iterative algorithm are chosen by testing on simulated data to suppress the influence of outliers. We demonstrate the benefits of robust SSIP data processing using examples of SSIP data recorded at a test site beside a mine in Gansu province, China.
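The M-estimate stacking step can be sketched with a Huber-type iteratively reweighted mean (illustrative constants and synthetic data; the paper tunes the influence function and iteration scheme on simulated SSIP records): repeated measurement periods are combined sample-by-sample so that impulsive spikes in a few periods barely move the stacked value.

```python
import random
import statistics

def huber_stack(values, k=1.5, iters=20):
    # iteratively reweighted mean with Huber weights; the scale is a
    # MAD-based robust estimate of the noise standard deviation
    est = statistics.median(values)
    scale = statistics.median([abs(v - est) for v in values]) * 1.4826 or 1.0
    for _ in range(iters):
        w = [1.0 if abs(v - est) <= k * scale else k * scale / abs(v - est)
             for v in values]
        est = sum(wi * vi for wi, vi in zip(w, values)) / sum(w)
    return est

rng = random.Random(0)
true_value = 2.0                               # noise-free sample (mV)
periods = [true_value + rng.gauss(0, 0.05) for _ in range(38)]
periods += [40.0, 42.0]                        # two impulsive spikes
print(abs(statistics.mean(periods) - true_value))   # mean badly biased
print(abs(huber_stack(periods) - true_value))       # M-estimate stays close
```

The ordinary mean is dragged far from the true value by the two spikes, while the Huber weights shrink their influence to near zero.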

  6. Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration

    NASA Astrophysics Data System (ADS)

    Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.

    2017-12-01

Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or only informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex, because each additional process added to a model brings inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty that can be quantified for parameters but not for model hypotheses. Model inter-comparison projects allow some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and interpret the results mechanistically: because such models combine sub-models of many systems and processes, each of which may be conceptualised and represented mathematically in various ways, it is not simple to determine exactly why a model produces the results it does or to identify which model assumptions are key. We present a novel modelling framework, the multi-assumption architecture and testbed (MAAT), that automates the combination, generation, and execution of a model ensemble built with different representations of process. We argue that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyser, PEcAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid robust data-model integration and enhance our predictive understanding of biological systems.

  7. Problems Identifying Independent and Dependent Variables

    ERIC Educational Resources Information Center

    Leatham, Keith R.

    2012-01-01

    This paper discusses one step from the scientific method--that of identifying independent and dependent variables--from both scientific and mathematical perspectives. It begins by analyzing an episode from a middle school mathematics classroom that illustrates the need for students and teachers alike to develop a robust understanding of…

  8. Analysis and improvements of Adaptive Particle Refinement (APR) through CPU time, accuracy and robustness considerations

    NASA Astrophysics Data System (ADS)

    Chiron, L.; Oger, G.; de Leffe, M.; Le Touzé, D.

    2018-02-01

    While smoothed-particle hydrodynamics (SPH) simulations are usually performed using uniform particle distributions, local particle refinement techniques have been developed to concentrate fine spatial resolutions in identified areas of interest. Although the formalism of this method is relatively easy to implement, its robustness at coarse/fine interfaces can be problematic. Analysis performed in [16] shows that the radius of refined particles should be greater than half the radius of unrefined particles to ensure robustness. In this article, the basics of an Adaptive Particle Refinement (APR) technique, inspired by AMR in mesh-based methods, are presented. This approach ensures robustness with alleviated constraints. Simulations applying the new formalism proposed achieve accuracy comparable to fully refined spatial resolutions, together with robustness, low CPU times and maintained parallel efficiency.

  9. A Highly Stretchable and Robust Non-fluorinated Superhydrophobic Surface.

    PubMed

    Ju, Jie; Yao, Xi; Hou, Xu; Liu, Qihan; Zhang, Yu Shrike; Khademhosseini, Ali

    2017-08-21

A superhydrophobic surface simultaneously possessing exceptional stretchability, robustness, and non-fluorination is highly desirable in applications ranging from wearable devices to artificial skins. While conventional superhydrophobic surfaces typically feature stretchability, robustness, or non-fluorination individually, the co-existence of all these features remains a great challenge. Here we report a multi-performance superhydrophobic surface achieved by incorporating hydrophilic micro-sized particles into a pre-stretched silicone elastomer. The commercial silicone elastomer (Ecoflex) endowed the resulting surface with high stretchability; the densely packed micro-sized particles in multiple layers contributed to the preservation of the large surface roughness even under large strains; and the physical encapsulation of the microparticles by the silicone elastomer, due to the capillary dragging effect and the chemical interaction between the hydrophilic silica and the elastomer, gave rise to robust and non-fluorinated superhydrophobicity. We demonstrated that the as-prepared fluorine-free surface preserves its superhydrophobicity under repeated stretching-relaxing cycles. Most importantly, the superhydrophobicity is well maintained after a severe rubbing process, indicating wear resistance. Our novel superhydrophobic surface, integrating the key properties of stretchability, robustness, and non-fluorination, is expected to provide unique advantages for a wide range of applications in biomedicine, energy, and electronics.

  10. Neutrality and Robustness in Evo-Devo: Emergence of Lateral Inhibition

    PubMed Central

    Munteanu, Andreea; Solé, Ricard V.

    2008-01-01

    Embryonic development is defined by the hierarchical dynamical process that translates genetic information (genotype) into a spatial gene expression pattern (phenotype) providing the positional information for the correct unfolding of the organism. The nature and evolutionary implications of genotype–phenotype mapping still remain key topics in evolutionary developmental biology (evo-devo). We have explored here issues of neutrality, robustness, and diversity in evo-devo by means of a simple model of gene regulatory networks. The small size of the system allowed an exhaustive analysis of the entire fitness landscape and the extent of its neutrality. This analysis shows that evolution leads to a class of robust genetic networks with an expression pattern characteristic of lateral inhibition. This class is a repertoire of distinct implementations of this key developmental process, the diversity of which provides valuable clues about its underlying causal principles. PMID:19023404

  11. Multilevel robustness

    NASA Astrophysics Data System (ADS)

    Girard, Henri-Louis; Khan, Sami; Varanasi, Kripa K.

    2018-03-01

    A combination of hard, soft and nanoscale organic components results in robust superhydrophobic surfaces that can withstand mechanical abrasion and chemical oxidation, and exhibit excellent substrate adhesion.

  12. Eco-Efficient Process Improvement at the Early Development Stage: Identifying Environmental and Economic Process Hotspots for Synergetic Improvement Potential.

    PubMed

    Piccinno, Fabiano; Hischier, Roland; Seeger, Stefan; Som, Claudia

    2018-05-15

    We present here a new eco-efficiency process-improvement method to highlight the combined environmental and cost hotspots of the production process of a new material at a very early development stage. Production-specific and scaled-up results for life cycle assessment (LCA) and production costs are combined in a new analysis to identify synergetic improvement potentials and trade-offs, setting goals for the eco-design of new processes. The identified hotspots and bottlenecks will help users focus on the steps that are relevant for improvement from an eco-efficiency perspective and potentially reduce the associated environmental impacts and production costs. Our method is illustrated with a case study of nanocellulose. The results indicate that the production route should start with carrot pomace, use heat and solvent recovery, and deactivate the enzymes with bleach instead of heat. To further improve the process, the results show that focus should be placed on the carrier polymer, sodium alginate, and the production of the GripX coating. Overall, the method shows that the underlying LCA scale-up framework is valuable for purposes beyond conventional LCA studies and is applicable at a very early stage to give researchers a better understanding of their production process.

  13. Mechanisms for Robust Cognition

    ERIC Educational Resources Information Center

    Walsh, Matthew M.; Gluck, Kevin A.

    2015-01-01

    To function well in an unpredictable environment using unreliable components, a system must have a high degree of robustness. Robustness is fundamental to biological systems and is an objective in the design of engineered systems such as airplane engines and buildings. Cognitive systems, like biological and engineered systems, exist within…

  14. A Robust Post-Processing Workflow for Datasets with Motion Artifacts in Diffusion Kurtosis Imaging

    PubMed Central

    Li, Xianjun; Yang, Jian; Gao, Jie; Luo, Xue; Zhou, Zhenyu; Hu, Yajie; Wu, Ed X.; Wan, Mingxi

    2014-01-01

    Purpose The aim of this study was to develop a robust post-processing workflow for motion-corrupted datasets in diffusion kurtosis imaging (DKI). Materials and methods The proposed workflow consisted of brain extraction, rigid registration, distortion correction, artifact rejection, spatial smoothing and tensor estimation. Rigid registration was utilized to correct misalignments. Motion artifacts were rejected by using the local Pearson correlation coefficient (LPCC). The performance of LPCC in characterizing relative differences between artifact and artifact-free images was compared with that of the conventional correlation coefficient in 10 randomly selected DKI datasets. The influence on parameter estimation of rejecting artifact images, together with their gradient-direction and b-value information, was investigated using the mean square error (MSE), with the noise variance serving as the criterion for the MSEs. The clinical practicality of the proposed workflow was evaluated by the image quality and by measurements in regions of interest on 36 DKI datasets, including 18 artifact-free datasets (18 pediatric subjects) and 18 motion-corrupted datasets (15 pediatric subjects and 3 essential tremor patients). Results The relative difference between artifact and artifact-free images calculated by LPCC was larger than that of the conventional correlation coefficient (p<0.05), indicating that LPCC was more sensitive in detecting motion artifacts. The MSEs of all parameters derived from the data retained after artifact rejection were smaller than the noise variance, suggesting that the rejected artifacts influenced the precision of the derived parameters less than noise did. The proposed workflow significantly improved the image quality and reduced the measurement biases on motion-corrupted datasets (p<0.05). Conclusion The proposed post-processing workflow was reliable for improving the image quality and the measurement precision of the derived parameters on motion-corrupted DKI datasets.
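
    A local Pearson correlation coefficient of the kind used here for artifact rejection can be sketched as follows. This is an illustrative minimal version: the block size, the tiling scheme, and the synthetic images are assumptions for demonstration, not the paper's actual implementation.

```python
import numpy as np

def local_pearson_cc(img, ref, win=8):
    """Mean of Pearson correlation coefficients computed in local windows.

    Illustrative sketch: both images are tiled into win x win blocks and
    the correlation of each block pair is averaged (constant blocks are
    skipped, since their correlation is undefined).
    """
    scores = []
    for i in range(0, img.shape[0] - win + 1, win):
        for j in range(0, img.shape[1] - win + 1, win):
            a = img[i:i + win, j:j + win].ravel()
            b = ref[i:i + win, j:j + win].ravel()
            if a.std() == 0 or b.std() == 0:
                continue
            scores.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(scores))

# Toy example: a clean copy correlates highly with the reference image,
# while an unrelated (e.g. motion-corrupted) one does not.
rng = np.random.default_rng(0)
ref = rng.normal(size=(32, 32))
clean = ref + 0.05 * rng.normal(size=(32, 32))
corrupt = rng.normal(size=(32, 32))
assert local_pearson_cc(clean, ref) > local_pearson_cc(corrupt, ref)
```

    A rejection rule would then discard any volume whose LPCC against a reference falls below a chosen threshold.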

  15. Robust design of configurations and parameters of adaptable products

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Chen, Yongliang; Xue, Deyi; Gu, Peihua

    2014-03-01

    An adaptable product can satisfy different customer requirements by changing its configuration and parameter values during the operation stage. Design of adaptable products aims at reducing environmental impact by replacing multiple different products with single adaptable ones. Due to the complex architecture, multiple functional requirements, and changes of product configurations and parameter values in operation, the impact of uncertainties on the functional performance measures needs to be considered in the design of adaptable products. In this paper, a robust design approach is introduced to identify the optimal design configuration and parameters of an adaptable product whose functional performance measures are the least sensitive to uncertainties. An adaptable product in this paper is modeled by both configurations and parameters. At the configuration level, methods to model different product configuration candidates in design, and different product configuration states in operation, to satisfy design requirements are introduced. At the parameter level, four types of product/operating parameters and the relations among these parameters are discussed. A two-level optimization approach is developed to identify the optimal design configuration and its parameter values for the adaptable product. A case study is implemented to illustrate the effectiveness of the newly developed robust adaptable design method.

  16. Optimisation of multiplet identifier processing on a PLAYSTATION® 3

    NASA Astrophysics Data System (ADS)

    Hattori, Masami; Mizuno, Takashi

    2010-02-01

    To enable high-performance computing (HPC) for applications with large datasets using a Sony® PLAYSTATION® 3 (PS3™) video game console, we configured a hybrid system consisting of a Windows® PC and a PS3™. To validate this system, we implemented the real-time multiplet identifier (RTMI) application, which identifies multiplets of microearthquakes in terms of the similarity of their waveforms. The cross-correlation computation, which is a core algorithm of the RTMI application, was optimised for the PS3™ platform, while the rest of the computation, including data input and output, remained on the PC. With this configuration, the core part of the algorithm ran 69 times faster than the original program, accelerating the total computation speed more than five times. As a result, the system processed up to 2100 microseismic events in total, whereas the original implementation had a limit of 400 events. These results indicate that this system enables high-performance computing for large datasets using the PS3™, as long as data transfer time is negligible compared with computation time.
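
    The waveform-similarity test at the core of a multiplet identifier can be sketched with a normalized cross-correlation. The 0.9 threshold and the synthetic waveforms below are illustrative assumptions, not values from the RTMI application:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation (Pearson r) of two aligned waveforms."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.dot(a, b))

def is_multiplet(w1, w2, threshold=0.9):
    """Flag two events as a multiplet when their waveforms are near-identical."""
    return ncc(w1, w2) >= threshold

# Synthetic microearthquake waveforms: a repeat of the same source region
# (identical waveform plus noise) versus an unrelated event.
t = np.linspace(0, 1, 500)
event = np.sin(40 * t) * np.exp(-3 * t)
repeat = event + 0.05 * np.random.default_rng(1).normal(size=t.size)
other = np.sin(25 * t) * np.exp(-3 * t)
assert is_multiplet(event, repeat)
assert not is_multiplet(event, other)
```

    In a real implementation the correlation is evaluated over many lag offsets and event pairs, which is the dense arithmetic that was offloaded to the PS3™ in this work.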

  17. Tuning and Robustness Analysis for the Orion Absolute Navigation System

    NASA Technical Reports Server (NTRS)

    Holt, Greg N.; Zanetti, Renato; D'Souza, Christopher

    2013-01-01

    The Orion Multi-Purpose Crew Vehicle (MPCV) is currently under development as NASA's next-generation spacecraft for exploration missions beyond Low Earth Orbit. The MPCV is set to perform an orbital test flight, termed Exploration Flight Test 1 (EFT-1), some time in late 2014. The navigation system for the Orion spacecraft is being designed in a Multi-Organizational Design Environment (MODE) team including contractor and NASA personnel. The system uses an Extended Kalman Filter to process measurements and determine the state. The design of the navigation system has undergone several iterations and modifications since its inception, and continues as a work in progress. This paper seeks to show the efforts made to date in tuning the filter for the EFT-1 mission and instilling appropriate robustness into the system to meet the requirements of manned spaceflight. Filter performance is affected by many factors: data rates, sensor measurement errors, tuning, and others. This paper focuses mainly on the error characterization and tuning portion. Traditional efforts at tuning a navigation filter have centered on the observation/measurement noise and Gaussian process noise of the Extended Kalman Filter. While the Orion MODE team must certainly address those factors, the team is also looking at residual edit thresholds and measurement underweighting as tuning tools. Tuning analysis is presented with open-loop Monte Carlo simulation results showing statistical errors bounded by the 3-sigma filter uncertainty covariance. The Orion filter design uses 24 Exponentially Correlated Random Variable (ECRV) parameters to estimate the accel/gyro misalignment and nonorthogonality. By design, the time constant and noise terms of these ECRV parameters were set to manufacturer specifications and not used as tuning parameters. They are included in the filter as a more analytically correct method of modeling uncertainties than ad-hoc tuning of the process noise.
Tuning is explored for the
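
    The residual-editing idea mentioned in the abstract (rejecting measurements whose innovations are statistically too large) can be sketched as follows. This is a generic textbook-style test on a toy scalar measurement, not the Orion filter's actual implementation; the state, covariances, and 3-sigma threshold are illustrative assumptions.

```python
import numpy as np

def accept_measurement(z, H, x, P, R, n_sigma=3.0):
    """Residual-editing test for a scalar measurement.

    Accept the measurement only when the innovation (residual) lies
    within n_sigma standard deviations of its predicted covariance.
    """
    innov = (z - H @ x).item()          # measurement innovation
    S = (H @ P @ H.T + R).item()        # predicted innovation covariance
    return abs(innov) <= n_sigma * np.sqrt(S)

x = np.array([1.0, 0.0])                # state estimate
P = np.diag([0.04, 0.01])               # state covariance
H = np.array([[1.0, 0.0]])              # measurement maps to first state
R = np.array([[0.01]])                  # measurement noise variance
assert accept_measurement(np.array([1.1]), H, x, P, R)      # small residual
assert not accept_measurement(np.array([2.5]), H, x, P, R)  # edited out
```

    Measurement underweighting, the other tuning tool named above, would instead inflate S before the Kalman gain computation rather than rejecting the measurement outright.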

  18. A robust watermarking scheme using lifting wavelet transform and singular value decomposition

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Anuj; Verma, Deval; Verma, Vivek Singh

    2017-01-01

    The present paper proposes a robust image watermarking scheme using the lifting wavelet transform (LWT) and singular value decomposition (SVD). A second-level LWT is applied to the host/cover image to decompose it into different subbands. SVD is used to obtain the singular values of the watermark image, and these singular values are then used to update the singular values of the LH2 subband. The algorithm is tested on a number of benchmark images, and it is found to be robust against different geometric and image-processing operations. A comparison of the proposed scheme with other existing schemes shows that it is better not only in terms of robustness but also in terms of imperceptibility.
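
    The singular-value update at the heart of such schemes can be sketched as below. This minimal version operates on an arbitrary matrix standing in for the LH2 subband (no actual lifting wavelet transform is performed), and the embedding strength alpha is an assumed illustrative value:

```python
import numpy as np

def embed_singular_values(subband, watermark, alpha=0.05):
    """Embed watermark singular values into a host subband (illustrative).

    Replaces the host subband's singular values S with S + alpha * Sw,
    the core update used by SVD-based watermarking schemes, and rebuilds
    the subband from the modified spectrum.
    """
    U, S, Vt = np.linalg.svd(subband, full_matrices=False)
    Sw = np.linalg.svd(watermark, compute_uv=False)
    return U @ np.diag(S + alpha * Sw) @ Vt

rng = np.random.default_rng(2)
host = rng.uniform(0, 255, size=(64, 64))   # stand-in for the LH2 subband
mark = rng.uniform(0, 255, size=(64, 64))   # watermark image
marked = embed_singular_values(host, mark)

# Imperceptibility: the marked subband stays close to the host.
psnr = 10 * np.log10(255**2 / np.mean((marked - host) ** 2))
```

    A small alpha keeps the distortion low (high PSNR) at the cost of a weaker, less robust watermark; tuning that trade-off is the central design choice in these schemes.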

  19. The topological requirements for robust perfect adaptation in networks of any size.

    PubMed

    Araujo, Robyn P; Liotta, Lance A

    2018-05-01

    Robustness, the ability to function and thrive amid changing and unfavorable environments, is a fundamental requirement for living systems. Until now it has been an open question how large and complex biological networks can exhibit robust behaviors, such as perfect adaptation to a variable stimulus, since complexity is generally associated with fragility. Here we report that all networks that exhibit robust perfect adaptation (RPA) to a persistent change in stimulus are decomposable into well-defined modules, of which there exist two distinct classes. These two modular classes represent a topological basis for all RPA-capable networks, and generate the full set of topological realizations of the internal model principle for RPA in complex, self-organizing, evolvable bionetworks. This unexpected result supports the notion that evolutionary processes are empowered by simple and scalable modular design principles that promote robust performance no matter how large or complex the underlying networks become.

  20. A new robust adaptive controller for vibration control of active engine mount subjected to large uncertainties

    NASA Astrophysics Data System (ADS)

    Fakhari, Vahid; Choi, Seung-Bok; Cho, Chang-Hyun

    2015-04-01

    This work presents a new robust model reference adaptive control (MRAC) for controlling vibration caused by the vehicle engine using an electromagnetic active engine mount. The vibration isolation performance of the active mount with the robust controller is evaluated in the presence of large uncertainties. As a first step, an active mount with a linear solenoid actuator is prepared and its dynamic model is identified via experimental tests. Subsequently, a new robust MRAC based on the gradient method with σ-modification is designed by selecting a proper reference model. In designing the robust adaptive control, structured (parametric) uncertainties in the stiffness of the passive part of the mount and in the damping ratio of the active part are considered to investigate the robustness of the proposed controller. Experimental and simulation results are presented to evaluate performance, focusing on the robustness of the controller in the face of large uncertainties. The obtained results show that the proposed controller provides robust vibration control performance, achieving effective vibration isolation even in the presence of large uncertainties.
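
    Gradient-based MRAC with σ-modification can be illustrated on a toy scalar plant. Everything below (the plant, reference model, gains, and reference signal) is an assumption for demonstration, not the engine-mount model identified in the paper:

```python
import numpy as np

# Toy scalar MRAC with sigma-modification (illustrative only).
# Plant: dx/dt = a*x + u, with a unknown to the controller.
# Reference model: dxm/dt = am*xm + r. Control law: u = -theta*x + r,
# whose ideal gain is theta* = a - am.
a, am = 1.0, -2.0            # unstable plant pole, stable reference pole
gamma, sigma = 5.0, 0.01     # adaptation gain and sigma-modification leakage
dt, T = 1e-3, 30.0
x = xm = theta = 0.0
errors = []
for k in range(int(T / dt)):
    r = 1.0 if (k * dt) % 4.0 < 2.0 else -1.0   # square-wave reference
    e = x - xm                                  # model-following error
    u = -theta * x + r
    x += dt * (a * x + u)                       # Euler step of the plant
    xm += dt * (am * xm + r)                    # Euler step of the reference
    # Gradient adaptive law; the -sigma*theta leakage term keeps the
    # adapted gain bounded in the presence of disturbances.
    theta += dt * (gamma * e * x - sigma * theta)
    errors.append(abs(e))

# After adaptation the tracking error is small and theta is near a - am = 3.
assert np.mean(errors[-5000:]) < 0.1
```

    The leakage term trades a small steady-state bias in theta for robustness: without it, bounded disturbances can drive the adapted parameters to drift without bound.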

  1. Robust image watermarking using DWT and SVD for copyright protection

    NASA Astrophysics Data System (ADS)

    Harjito, Bambang; Suryani, Esti

    2017-02-01

    The objective of this paper is to propose a robust watermarking scheme combining the Discrete Wavelet Transform (DWT) and Singular Value Decomposition (SVD). The RGB image is called the cover medium, and the watermark image is converted into gray scale. Both are then transformed using the DWT so that they can be split into several subbands, namely LL2, LH2, and HL2. The watermark image is embedded into the cover medium in the LL2 subband. This scheme aims to obtain a higher robustness level than the previous method, which performs SVD matrix factorization of the image for copyright protection. The experimental results show that the proposed method is robust against several image-processing attacks such as Gaussian, Poisson, and salt-and-pepper noise, with average Normalized Correlation (NC) values of 0.574863, 0.889784, and 0.889782, respectively. The watermark image can be detected and extracted.
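
    The Normalized Correlation (NC) metric reported above can be computed as in the following sketch. Several NC variants exist in the watermarking literature; this normalized form is one common choice and is not necessarily the paper's exact formula:

```python
import numpy as np

def normalized_correlation(w, w_ext):
    """Normalized correlation between original and extracted watermarks.

    One common definition: NC = sum(w * w') / sqrt(sum(w^2) * sum(w'^2)).
    Values near 1 indicate the watermark survived the attack.
    """
    w = w.ravel().astype(float)
    w_ext = w_ext.ravel().astype(float)
    return float(np.sum(w * w_ext) /
                 np.sqrt(np.sum(w**2) * np.sum(w_ext**2)))

# Toy check: an additive-noise "attack" on a random watermark.
rng = np.random.default_rng(3)
w = rng.integers(0, 256, size=(32, 32))
noisy = np.clip(w + rng.normal(0, 20, size=w.shape), 0, 255)
assert abs(normalized_correlation(w, w) - 1.0) < 1e-9
assert normalized_correlation(w, noisy) > 0.9
```
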

  2. Machine learning for inverse lithography: using stochastic gradient descent for robust photomask synthesis

    NASA Astrophysics Data System (ADS)

    Jia, Ningning; Lam, Edmund Y.

    2010-04-01

    Inverse lithography technology (ILT) synthesizes photomasks by solving an inverse imaging problem through optimization of an appropriate functional. Much effort on ILT is dedicated to deriving superior masks at a nominal process condition. However, the lower k1 factor causes the mask to be more sensitive to process variations. Robustness to major process variations, such as focus and dose variations, is desired. In this paper, we consider the focus variation as a stochastic variable, and treat the mask design as a machine learning problem. The stochastic gradient descent approach, which is a useful tool in machine learning, is adopted to train the mask design. Compared with previous work, simulation shows that the proposed algorithm is effective in producing robust masks.
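
    Treating a process variable as stochastic and training the mask by stochastic gradient descent can be illustrated with a toy one-dimensional model, where the width of a Gaussian blur stands in for focus. Everything below (the imaging model, learning rate, and focus range) is an illustrative assumption, not the paper's lithography simulator:

```python
import numpy as np

rng = np.random.default_rng(4)
target = np.zeros(64)
target[24:40] = 1.0                     # desired printed pattern

def blur_kernel(sigma):
    x = np.arange(-8, 9)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def loss_grad(mask, sigma):
    """Loss 0.5*||blur(mask) - target||^2 and its exact gradient.

    The adjoint of same-mode convolution with a symmetric kernel is
    convolution with the reversed kernel, so the gradient is err * K^T.
    """
    k = blur_kernel(sigma)
    err = np.convolve(mask, k, mode="same") - target
    return 0.5 * np.sum(err**2), np.convolve(err, k[::-1], mode="same")

# SGD over the stochastic focus variable: each step samples one focus
# condition and descends that condition's gradient.
mask = target.copy()                    # naive mask = the target itself
for step in range(300):
    sigma = rng.uniform(1.0, 2.5)       # sampled focus condition
    _, g = loss_grad(mask, sigma)
    mask -= 0.5 * g

# The trained mask prints better than the naive mask across focus settings.
avg = lambda m: np.mean([loss_grad(m, s)[0] for s in (1.0, 1.75, 2.5)])
assert avg(mask) < avg(target)
```

    Averaging the loss over sampled focus values is what makes the resulting mask robust: it is optimized for the distribution of process conditions rather than a single nominal one.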

  3. Robust visual tracking via multiscale deep sparse networks

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Hou, Zhiqiang; Yu, Wangsheng; Xue, Yang; Jin, Zefenfen; Dai, Bo

    2017-04-01

    In visual tracking, deep learning with offline pretraining can extract more intrinsic and robust features, and it has achieved significant success in addressing tracking drift in complicated environments. However, offline pretraining requires numerous auxiliary training datasets and is considerably time-consuming for tracking tasks. To solve these problems, a multiscale sparse networks-based tracker (MSNT) under the particle filter framework is proposed. Based on stacked sparse autoencoders and rectified linear units, the tracker has a flexible and adjustable architecture without the offline pretraining process, and it effectively exploits robust and powerful features through online training on a limited amount of labeled data. Meanwhile, the tracker builds four deep sparse networks of different scales according to the target's profile type. During tracking, the tracker adaptively selects the matched tracking network in accordance with the initial target's profile type, preserving the inherent structural information more efficiently than single-scale networks. Additionally, a corresponding update strategy is proposed to improve the robustness of the tracker. Extensive experimental results on a large-scale benchmark dataset show that the proposed method performs favorably against state-of-the-art methods in challenging environments.

  4. Robust crossfeed design for hovering rotorcraft. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Catapang, David R.

    1993-01-01

    Control law design for rotorcraft fly-by-wire systems normally attempts to decouple angular responses using fixed-gain crossfeeds. This approach can lead to poor decoupling over the frequency range of pilot inputs and increase the load on the feedback loops. In order to improve the decoupling performance, dynamic crossfeeds may be adopted. Moreover, because of the large changes that occur in rotorcraft dynamics due to small changes about the nominal design condition, especially for near-hovering flight, the crossfeed design must be 'robust.' A new low-order matching method is presented here to design robust crossfeed compensators for multi-input, multi-output (MIMO) systems. The technique identifies degrees of freedom that can be decoupled using crossfeeds, given an anticipated set of parameter variations for the range of flight conditions of concern. Cross-coupling is then reduced for degrees of freedom that can use crossfeed compensation by minimizing the average and variance of the off-axis response magnitude. Results are presented for the analysis of pitch, roll, yaw, and heave coupling of the UH-60 Black Hawk helicopter in near-hovering flight. Robust crossfeeds are designed that show significant improvement in decoupling performance and robustness over nominal, single-design-point compensators. The design method and results are presented in an easily used graphical format that lends significant physical insight to the design procedure. This plant pre-compensation technique is an appropriate preliminary step to the design of robust feedback control laws for rotorcraft.

  5. Using adaptive processes and adverse outcome pathways to develop meaningful, robust, and actionable environmental monitoring programs.

    PubMed

    Arciszewski, Tim J; Munkittrick, Kelly R; Scrimgeour, Garry J; Dubé, Monique G; Wrona, Fred J; Hazewinkel, Rod R

    2017-09-01

    The primary goals of environmental monitoring are to indicate whether unexpected changes related to development are occurring in the physical, chemical, and biological attributes of ecosystems and to inform meaningful management intervention. Although achieving these objectives is conceptually simple, varying scientific and social challenges often result in their breakdown. Conceptualizing, designing, and operating programs that better delineate monitoring, management, and risk assessment processes supported by hypothesis-driven approaches, strong inference, and adverse outcome pathways can overcome many of the challenges. Generally, a robust monitoring program is characterized by hypothesis-driven questions associated with potential adverse outcomes and feedback loops informed by data. Specifically, key and basic features are predictions of future observations (triggers) and mechanisms to respond to success or failure of those predictions (tiers). The adaptive processes accelerate or decelerate the effort to highlight and overcome ignorance while preventing the potentially unnecessary escalation of unguided monitoring and management. The deployment of the mutually reinforcing components can allow for more meaningful and actionable monitoring programs that better associate activities with consequences. Integr Environ Assess Manag 2017;13:877-891. © 2017 The Authors. Integrated Environmental Assessment and Management Published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).

  6. Identifying target processes for microbial electrosynthesis by elementary mode analysis.

    PubMed

    Kracke, Frauke; Krömer, Jens O

    2014-12-30

    Microbial electrosynthesis and electro-fermentation are techniques that aim to optimize microbial production of chemicals and fuels by regulating the cellular redox balance via interaction with electrodes. While the concept has been known for decades, major knowledge gaps remain that make it hard to evaluate its biotechnological potential. Here we present an in silico approach to identify beneficial production processes for electro-fermentation by elementary mode analysis. Since the fundamentals of electron transport between electrodes and microbes have not been fully uncovered yet, we propose different options and discuss their impact on biomass and product yields. For the first time, 20 different valuable products were screened for their potential to show increased yields during anaerobic, electrically enhanced fermentation. Surprisingly, we found that an increase in product formation by electrical enhancement is not necessarily dependent on the degree of reduction of the product but rather on the metabolic pathway it is derived from. We present a variety of beneficial processes with product yield increases of up to 36% in reductive and 84% in oxidative fermentations, and final theoretical product yields of up to 100%. This includes compounds that are already produced at industrial scale, such as succinic acid, lysine and diaminopentane, as well as potential novel bio-commodities such as isoprene, para-hydroxybenzoic acid and para-aminobenzoic acid. Furthermore, it is shown that the mode of electron transport has a major impact on achievable biomass and product yields. The coupling of electron transport to energy conservation was identified as crucial for most processes. This study introduces a powerful tool to determine beneficial substrate and product combinations for electro-fermentation. It also highlights that the maximal yield achievable by bioelectrochemical techniques depends strongly on the actual electron transport mechanisms.
Therefore it is of great importance to

  7. Design of forging process variables under uncertainties

    NASA Astrophysics Data System (ADS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2005-02-01

    Forging is a complex nonlinear process that is vulnerable to various manufacturing anomalies, such as variations in billet geometry, billet/die temperatures, material properties, and workpiece and forging equipment positional errors. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion, and reduced productivity. Identifying, quantifying, and controlling the uncertainties will reduce variability risk in a manufacturing environment, which will minimize the overall production cost. In this article, various uncertainties that affect the forging process are identified, and their cumulative effect on the forging tool life is evaluated. Because the forging process simulation is time-consuming, a response surface model is used to reduce computation time by establishing a relationship between the process performance and the critical process variables. A robust design methodology is developed by incorporating reliability-based optimization techniques to obtain sound forging components. A case study of an automotive-component forging-process design is presented to demonstrate the applicability of the method.

  8. The effects of ecology and evolutionary history on robust capuchin morphological diversity.

    PubMed

    Wright, Kristin A; Wright, Barth W; Ford, Susan M; Fragaszy, Dorothy; Izar, Patricia; Norconk, Marilyn; Masterson, Thomas; Hobbs, David G; Alfaro, Michael E; Lynch Alfaro, Jessica W

    2015-01-01

    Recent molecular work has confirmed the long-standing morphological hypothesis that capuchins comprise two distinct clades, the gracile (untufted) capuchins (genus Cebus, Erxleben, 1777) and the robust (tufted) capuchins (genus Sapajus Kerr, 1792). In the past, the robust group was treated as a single, undifferentiated and cosmopolitan species, with data from all populations lumped together in morphological and ecological studies, obscuring morphological differences that might exist across this radiation. Genetic evidence suggests that the modern radiation of robust capuchins began diversifying ∼2.5 Ma, with significant subsequent geographic expansion into new habitat types. In this study we use a morphological sample of gracile and robust capuchin craniofacial and postcranial characters to examine how ecology and evolutionary history have contributed to morphological diversity within the robust capuchins. We predicted that if ecology is driving robust capuchin variation, three distinct robust morphotypes would be identified: (1) the Atlantic Forest species (Sapajus xanthosternos, S. robustus, and S. nigritus), (2) the Amazonian rainforest species (S. apella, S. cay and S. macrocephalus), and (3) the Cerrado-Caatinga species (S. libidinosus). Alternatively, if diversification time between species pairs predicts the degree of morphological difference, we predicted that the recently diverged S. apella, S. macrocephalus, S. libidinosus, and S. cay would be morphologically comparable, with greater variation among the more ancient lineages of S. nigritus, S. xanthosternos, and S. robustus. Our analyses suggest that S. libidinosus has the most derived craniofacial and postcranial features, indicative of inhabiting a more terrestrial niche that includes a dependence on tool use for the extraction of embedded foods. We also suggest that the cranial robusticity of S. macrocephalus and S. apella is indicative of recent competition with sympatric gracile capuchin

  9. Robust Regression for Slope Estimation in Curriculum-Based Measurement Progress Monitoring

    ERIC Educational Resources Information Center

    Mercer, Sterett H.; Lyons, Alina F.; Johnston, Lauren E.; Millhoff, Courtney L.

    2015-01-01

    Although ordinary least-squares (OLS) regression has been identified as a preferred method to calculate rates of improvement for individual students during curriculum-based measurement (CBM) progress monitoring, OLS slope estimates are sensitive to the presence of extreme values. Robust estimators have been developed that are less biased by…
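
    The contrast between OLS and a robust alternative can be illustrated with the Theil-Sen estimator (the median of all pairwise slopes), one of the robust estimators developed for this purpose; the CBM-style progress-monitoring data below are synthetic:

```python
import numpy as np
from itertools import combinations

def ols_slope(x, y):
    """Ordinary least-squares slope."""
    return float(np.polyfit(x, y, 1)[0])

def theil_sen_slope(x, y):
    """Robust slope: the median of all pairwise slopes (Theil-Sen)."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)]
    return float(np.median(slopes))

# Ten weekly CBM probes growing by 2 words/week, with one extreme score.
weeks = np.arange(10, dtype=float)
wcpm = 2.0 * weeks + 20.0
wcpm[9] = 100.0                  # extreme value (e.g. an invalid probe)

# OLS is pulled upward by the outlier; Theil-Sen recovers the true trend.
assert abs(theil_sen_slope(weeks, wcpm) - 2.0) < 1e-9
assert ols_slope(weeks, wcpm) > 3.0
```

    A single aberrant probe more than doubles the OLS growth estimate here, while the median-of-pairwise-slopes estimate is unaffected, which is exactly the sensitivity issue the abstract raises.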

  10. A sensitivity analysis of process design parameters, commodity prices and robustness on the economics of odour abatement technologies.

    PubMed

    Estrada, José M; Kraakman, N J R Bart; Lebrero, Raquel; Muñoz, Raúl

    2012-01-01

    The sensitivity of the economics of the five most commonly applied odour abatement technologies (biofiltration, biotrickling filtration, activated carbon adsorption, chemical scrubbing and a hybrid technology consisting of a biotrickling filter coupled with carbon adsorption) towards design parameters and commodity prices was evaluated. In addition, the influence of geographical location on the Net Present Value calculated for a 20-year lifespan (NPV20) of each technology, and its robustness towards typical process fluctuations and operational upsets, were assessed. This comparative analysis showed that biological techniques present lower operating costs (up to 6 times lower) and lower sensitivity than their physical/chemical counterparts, with the packing material being the key parameter affecting their operating costs (40-50% of the total operating costs). The use of recycled or partially treated water (e.g. secondary effluent in wastewater treatment plants) offers an opportunity to significantly reduce costs in biological techniques. Physical/chemical technologies present a high sensitivity towards H2S concentration, which is an important drawback given the fluctuating nature of malodorous emissions. The geographical analysis evidenced large NPV20 variations around the world for all the technologies evaluated, but despite the differences in wage and price levels, biofiltration and biotrickling filtration are always the most cost-efficient alternatives (NPV20). When robustness is considered as relevant as the overall costs (NPV20) in an economic evaluation, the hybrid technology moves up next to biotrickling filtration as the most preferred technology. Copyright © 2012 Elsevier Inc. All rights reserved.
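
    The NPV20 figure used throughout this analysis is a standard discounted cash flow sum. The sketch below uses assumed illustrative costs and discount rate, not values from the study:

```python
def npv(cash_flows, rate):
    """Net present value: cash flows discounted back to year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical odour-abatement unit: an investment in year 0 followed by
# 20 years of operating costs (all figures and the 5% discount rate are
# illustrative assumptions).
investment = -100_000.0
annual_operating_cost = -12_000.0
flows = [investment] + [annual_operating_cost] * 20
npv20 = npv(flows, rate=0.05)

# In a costs-only analysis NPV20 is negative; a less negative value
# indicates a cheaper technology over its lifespan.
assert npv20 < investment
```

    Sensitivity analysis in this setting amounts to recomputing NPV20 while perturbing one input at a time (packing material price, labour cost, H2S load) and observing how strongly the result shifts.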

  11. Robust tissue classification for reproducible wound assessment in telemedicine environments

    NASA Astrophysics Data System (ADS)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple free-hand digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labelings, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under changes in lighting conditions, viewpoint, and camera, achieving accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping onto the medical reference developed from image labeling by a college of experts.

  12. Thermotaxis is a Robust Mechanism for Thermoregulation in C. elegans Nematodes

    PubMed Central

    Ramot, Daniel; MacInnis, Bronwyn L.; Lee, Hau-Chen; Goodman, Miriam B.

    2013-01-01

    Many biochemical networks are robust to variations in network or stimulus parameters. Although robustness is considered an important design principle of such networks, it is not known whether this principle also applies to higher-level biological processes such as animal behavior. In thermal gradients, C. elegans uses thermotaxis to bias its movement along the direction of the gradient. Here we develop a detailed, quantitative map of C. elegans thermotaxis and use these data to derive a computational model of thermotaxis in the soil, a natural environment of C. elegans. This computational analysis indicates that thermotaxis enables animals to avoid temperatures at which they cannot reproduce, to limit excursions from their adapted temperature, and to remain relatively close to the surface of the soil, where oxygen is abundant. Furthermore, our analysis reveals that this mechanism is robust to large variations in the parameters governing both worm locomotion and temperature fluctuations in the soil. We suggest that, similar to biochemical networks, animals evolve behavioral strategies that are robust, rather than strategies that rely on fine-tuning of specific behavioral parameters. PMID:19020047

  13. TU-H-CAMPUS-JeP3-01: Towards Robust Adaptive Radiation Therapy Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boeck, M; KTH Royal Institute of Technology, Stockholm; Eriksson, K

    Purpose: To set up a framework combining robust treatment planning with adaptive reoptimization in order to maintain high treatment quality, to respond to interfractional variations and to identify those patients who will benefit the most from an adaptive fractionation schedule. Methods: We propose adaptive strategies based on stochastic minimax optimization for a series of simulated treatments on a one-dimensional patient phantom. The plan should be able to handle anticipated systematic and random errors and is applied during the first fractions. Information on the individual geometric variations is gathered at each fraction. At scheduled fractions, the impact of the measured errors on the delivered dose distribution is evaluated. For a patient that receives a dose that does not satisfy specified plan quality criteria, the plan is reoptimized based on these individual measurements using one of three different adaptive strategies. The reoptimized plan is then applied during future fractions until a new scheduled adaptation becomes necessary. In the first adaptive strategy the measured systematic and random error scenarios and their assigned probabilities are updated to guide the robust reoptimization. The focus of the second strategy lies on variation of the fraction of the worst scenarios taken into account during robust reoptimization. In the third strategy the uncertainty margins around the target are recalculated with the measured errors. Results: By studying the effect of the three adaptive strategies combined with various adaptation schedules on the same patient population, the group which benefits from adaptation is identified together with the most suitable strategy and schedule. Preliminary computational results indicate when and how best to adapt for the three different strategies.
Conclusion: A workflow is presented that provides robust adaptation of the treatment plan throughout the course of treatment and useful measures to identify patients in

  14. Face Processing: Models For Recognition

    NASA Astrophysics Data System (ADS)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expression to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.
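    The representation behind this line of work can be illustrated with a principal-component ("eigenface") sketch: flattened images are centered and projected onto the top singular vectors of the data matrix. The random data below are stand-ins for real face images, and this is a generic PCA sketch, not necessarily the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in data: 40 "images" of 64 pixels each (real use: flattened face images).
X = rng.normal(size=(40, 64))

mean = X.mean(axis=0)
Xc = X - mean
# Principal components ("eigenfaces") via SVD of the centered data matrix.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

def reconstruct(k):
    """Project onto the top-k components and map back to pixel space."""
    W = Xc @ Vt[:k].T          # coefficients of each image in the face subspace
    return mean + W @ Vt[:k]   # reconstruction from k components

err5 = np.linalg.norm(X - reconstruct(5))
err30 = np.linalg.norm(X - reconstruct(30))
assert err30 < err5  # more components -> lower reconstruction error
```

Recognition then reduces to comparing the low-dimensional coefficient vectors `W` rather than raw pixels.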

  15. A Comparative Theoretical and Computational Study on Robust Counterpart Optimization: II. Probabilistic Guarantees on Constraint Satisfaction

    PubMed Central

    Li, Zukui; Floudas, Christodoulos A.

    2012-01-01

    Probabilistic guarantees on constraint satisfaction for robust counterpart optimization are studied in this paper. The robust counterpart optimization formulations studied are derived from box, ellipsoidal, polyhedral, “interval+ellipsoidal” and “interval+polyhedral” uncertainty sets (Li, Z., Ding, R., and Floudas, C.A., A Comparative Theoretical and Computational Study on Robust Counterpart Optimization: I. Robust Linear and Robust Mixed Integer Linear Optimization, Ind. Eng. Chem. Res, 2011, 50, 10567). For those robust counterpart optimization formulations, their corresponding probability bounds on constraint satisfaction are derived for different types of uncertainty characteristic (i.e., bounded or unbounded uncertainty, with or without detailed probability distribution information). The findings of this work extend the results in the literature and provide greater flexibility for robust optimization practitioners in choosing tighter probability bounds so as to find less conservative robust solutions. Extensive numerical studies are performed to compare the tightness of the different probability bounds and the conservatism of different robust counterpart optimization formulations. Guiding rules for the selection of robust counterpart optimization models and for the determination of the size of the uncertainty set are discussed. Applications in production planning and process scheduling problems are presented. PMID:23329868
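    As a minimal illustration of the robust counterpart idea, the box ("interval") uncertainty set admits a closed form: requiring a'x <= b for every a in [a_nom - delta, a_nom + delta] reduces to a_nom'x + sum_j delta_j*|x_j| <= b. The numbers below are illustrative only.

```python
import numpy as np

def box_robust_feasible(x, a_nom, delta, b):
    """Check a'x <= b for every a in the box [a_nom - delta, a_nom + delta].
    The worst case is attained at a_nom'x + sum_j delta_j * |x_j| <= b."""
    return a_nom @ x + delta @ np.abs(x) <= b

a_nom = np.array([1.0, 2.0])
delta = np.array([0.5, 0.5])
x = np.array([1.0, 1.0])

# Worst-case left-hand side = 3 + 1 = 4.
assert box_robust_feasible(x, a_nom, delta, b=4.0)
assert not box_robust_feasible(x, a_nom, delta, b=3.5)

# Sanity check: sampled a's from the box never violate a certified constraint.
rng = np.random.default_rng(1)
A = a_nom + rng.uniform(-1, 1, size=(1000, 2)) * delta
assert (A @ x <= 4.0 + 1e-9).all()
```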

  16. Robust and Accurate Anomaly Detection in ECG Artifacts Using Time Series Motif Discovery

    PubMed Central

    Sivaraks, Haemwaan

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats, which helps identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because these methods are not able to differentiate ECG artifacts from the real ECG signal, especially ECG artifacts that are similar to ECG signals in terms of shape and/or frequency. The problem leads to high vigilance for physicians and misinterpretation risk for nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts, which can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design. In addition, every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experiment results on real ECG datasets are interpreted and evaluated by cardiologists. Our proposed algorithm can mostly achieve 100% accuracy on detection (AoD), sensitivity, specificity, and positive predictive value with a 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts, compared with competitive anomaly detection methods. PMID:25688284
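    The motif/discord machinery underlying such methods can be sketched with a brute-force "discord" search: the subsequence whose nearest non-overlapping neighbor is farthest away is flagged as the anomaly. The synthetic sinusoid below stands in for an ECG trace; this shows only the generic discovery step, not the paper's cardiologist-informed pipeline.

```python
import numpy as np

def top_discord(x, m):
    """Return the start index of the length-m subsequence whose nearest
    non-overlapping neighbor is farthest away (the top 'discord')."""
    n = len(x) - m + 1
    subs = np.stack([x[i:i + m] for i in range(n)])
    best_i, best_d = -1, -1.0
    for i in range(n):
        d = np.array([np.linalg.norm(subs[i] - subs[j])
                      for j in range(n) if abs(i - j) >= m])  # skip trivial matches
        nn = d.min()
        if nn > best_d:
            best_i, best_d = i, nn
    return best_i

t = np.arange(400)
x = np.sin(2 * np.pi * t / 40)   # regular "heartbeats" of period 40
x[200:210] += 2.0                # injected anomaly
i = top_discord(x, m=40)
assert 160 <= i <= 210           # the discord window overlaps the anomaly
```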

  17. Beyond optimality: Multistakeholder robustness tradeoffs for regional water portfolio planning under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Herman, Jonathan D.; Zeff, Harrison B.; Reed, Patrick M.; Characklis, Gregory W.

    2014-10-01

    While optimality is a foundational mathematical concept in water resources planning and management, "optimal" solutions may be vulnerable to failure if deeply uncertain future conditions deviate from those assumed during optimization. These vulnerabilities may produce severely asymmetric impacts across a region, making it vital to evaluate the robustness of management strategies as well as their impacts for regional stakeholders. In this study, we contribute a multistakeholder many-objective robust decision making (MORDM) framework that blends many-objective search and uncertainty analysis tools to discover key tradeoffs between water supply alternatives and their robustness to deep uncertainties (e.g., population pressures, climate change, and financial risks). The proposed framework is demonstrated for four interconnected water utilities representing major stakeholders in the "Research Triangle" region of North Carolina, U.S. The utilities supply well over one million customers and have the ability to collectively manage drought via transfer agreements and shared infrastructure. We show that water portfolios for this region that compose optimal tradeoffs (i.e., Pareto-approximate solutions) under expected future conditions may suffer significantly degraded performance with only modest changes in deeply uncertain hydrologic and economic factors. We then use the Patient Rule Induction Method (PRIM) to identify which uncertain factors drive the individual and collective vulnerabilities for the four cooperating utilities. Our framework identifies key stakeholder dependencies and robustness tradeoffs associated with cooperative regional planning, which are critical to understanding the tensions between individual versus regional water supply goals. Cooperative demand management was found to be the key factor controlling the robustness of regional water supply planning, dominating other hydroclimatic and economic uncertainties through the 2025 planning horizon. 

  18. Robust algebraic image enhancement for intelligent control systems

    NASA Technical Reports Server (NTRS)

    Lerner, Bao-Ting; Morrelli, Michael

    1993-01-01

    Robust vision capability for intelligent control systems has been an elusive goal in image processing. The computationally intensive techniques necessary for conventional image processing make real-time applications, such as object tracking and collision avoidance, difficult. In order to endow an intelligent control system with the needed vision robustness, an adequate image enhancement subsystem, capable of compensating for the wide variety of real-world degradations, must exist between the image capturing and the object recognition subsystems. This enhancement stage must be adaptive and must operate with consistency in the presence of both statistical and shape-based noise. To deal with this problem, we have developed an innovative algebraic approach which provides a sound mathematical framework for image representation and manipulation. Our image model provides a natural platform from which to pursue dynamic scene analysis, and its incorporation into a vision system would serve as the front-end to an intelligent control system. We have developed a unique polynomial representation of gray-level imagery and applied this representation to develop polynomial operators on complex gray-level scenes. This approach is highly advantageous since polynomials can be manipulated very easily and are readily understood, thus providing a very convenient environment for image processing. Our model presents a highly structured and compact algebraic representation of gray-level images which can be viewed as fuzzy sets.

  19. Robust quantum network architectures and topologies for entanglement distribution

    NASA Astrophysics Data System (ADS)

    Das, Siddhartha; Khatri, Sumeet; Dowling, Jonathan P.

    2018-01-01

    Entanglement distribution is a prerequisite for several important quantum information processing and computing tasks, such as quantum teleportation, quantum key distribution, and distributed quantum computing. In this work, we focus on two-dimensional quantum networks based on optical quantum technologies using dual-rail photonic qubits for the building of a fail-safe quantum internet. We lay out a quantum network architecture for entanglement distribution between distant parties using a Bravais lattice topology, with the technological constraint that quantum repeaters equipped with quantum memories are not easily accessible. We provide a robust protocol for simultaneous entanglement distribution between two distant groups of parties on this network. We also discuss a memory-based quantum network architecture that can be implemented on networks with an arbitrary topology. We examine networks with bow-tie lattice and Archimedean lattice topologies and use percolation theory to quantify the robustness of the networks. In particular, we provide figures of merit on the loss parameter of the optical medium that depend only on the topology of the network and quantify the robustness of the network against intermittent photon loss and intermittent failure of nodes. These figures of merit can be used to compare the robustness of different network topologies in order to determine the best topology in a given real-world scenario, which is critical in the realization of the quantum internet.
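    Percolation-based robustness arguments of this kind can be sketched with a Monte-Carlo estimate of site percolation on a square lattice (a simple stand-in for the photonic lattice topologies discussed): spanning clusters appear abruptly once the site "survival" probability crosses the lattice's threshold (about 0.593 for the square lattice).

```python
import numpy as np
from collections import deque

def percolates(open_sites):
    """BFS from open top-row sites; True if an open path reaches the bottom row."""
    n = open_sites.shape[0]
    seen = np.zeros_like(open_sites, dtype=bool)
    q = deque((0, j) for j in range(n) if open_sites[0, j])
    for cell in q:
        seen[cell] = True
    while q:
        i, j = q.popleft()
        if i == n - 1:
            return True
        for a, b in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= a < n and 0 <= b < n and open_sites[a, b] and not seen[a, b]:
                seen[a, b] = True
                q.append((a, b))
    return False

def spanning_fraction(p, n=20, trials=50, seed=0):
    """Fraction of random n x n lattices (site open with prob. p) that span."""
    rng = np.random.default_rng(seed)
    return sum(percolates(rng.random((n, n)) < p) for _ in range(trials)) / trials

# Far below the threshold spanning clusters are rare; far above, near-certain.
assert spanning_fraction(0.3) < 0.1 < 0.9 < spanning_fraction(0.8)
```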

  20. High-throughput electrical characterization for robust overlay lithography control

    NASA Astrophysics Data System (ADS)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high-throughput and robust overlay measurement is a challenge in the current 14nm and advanced upcoming nodes with the transition to 300mm and upcoming 450mm semiconductor manufacturing, where slight deviation in overlay has significant impact on reliability and yield [1]. An exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specification [2]. Here, we discuss limitations of current image- and diffraction-based overlay measurement techniques to meet these stringent processing requirements due to sensitivity, throughput and low contrast [3]. We demonstrate a new electrical measurement based technique where resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by a parabolic fitting model to resistance where minima and inflection points are extracted to characterize overlay control and process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and overlay obtained from fitting. Additionally, excellent correlation of overlay from electrical measurements to existing image- and diffraction-based techniques is found. We also discuss challenges of integrating the electrical measurement based approach in semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for assessing simultaneous overlay as well as process window and margins from a robust, high throughput and electrical measurement approach.
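    The parabolic fitting step can be sketched directly: resistance measured over a sweep of programmed misalignments is fit with a quadratic, and the fitted minimum estimates the overlay error. The quadratic resistance model and all numbers below are illustrative assumptions, not measured data.

```python
import numpy as np

# Programmed misalignment (nm) of the test macro and measured resistance (ohm).
# Synthetic: a true overlay error of +3 nm shifts the parabola's minimum.
offsets = np.linspace(-20, 20, 9)
true_overlay = 3.0
resistance = 100.0 + 0.05 * (offsets - true_overlay) ** 2

# Parabolic fit R = a*x^2 + b*x + c; the minimum at -b/(2a) estimates overlay.
a, b, c = np.polyfit(offsets, resistance, 2)
overlay_est = -b / (2 * a)
assert abs(overlay_est - true_overlay) < 1e-6
```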

  1. Robust control for uncertain structures

    NASA Technical Reports Server (NTRS)

    Douglas, Joel; Athans, Michael

    1991-01-01

    Viewgraphs on robust control for uncertain structures are presented. Topics covered include: robust linear quadratic regulator (RLQR) formulas; mismatched LQR design; RLQR design; interpretations of RLQR design; disturbance rejection; and performance comparisons: RLQR vs. mismatched LQR.

  2. Identification of robust adaptation gene regulatory network parameters using an improved particle swarm optimization algorithm.

    PubMed

    Huang, X N; Ren, H P

    2016-05-13

    Robust adaptation is a critical ability of a gene regulatory network (GRN) to survive in a fluctuating environment, in which the system responds to an input stimulus rapidly and then returns to its pre-stimulus steady state in a timely manner. In this paper, the GRN is modeled using the Michaelis-Menten rate equations, which are highly nonlinear differential equations containing 12 undetermined parameters. The robust adaptation is quantitatively described by two conflicting indices. Identifying the parameter sets that confer robust adaptation on the GRN is a multi-variable, multi-objective, and multi-peak optimization problem, for which it is difficult to acquire satisfactory solutions, especially high-quality ones. A new best-neighbor particle swarm optimization algorithm is proposed to implement this task. The proposed algorithm employs a Latin hypercube sampling method to generate the initial population. The particle crossover operation and elitist preservation strategy are also used in the proposed algorithm. The simulation results revealed that the proposed algorithm could identify multiple solutions in a single run. Moreover, it demonstrated superior performance compared with previous methods, detecting more high-quality solutions within an acceptable time. The proposed methodology, owing to its universality and simplicity, is useful for providing guidance for designing GRNs with superior robust adaptation.
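    A plain global-best PSO with Latin hypercube initialization can be sketched as below; the paper's "best-neighbor" refinement, crossover, and elitism are omitted, and a 4-D sphere function stands in for the 12-parameter GRN objective.

```python
import numpy as np

def latin_hypercube(n, d, lo, hi, rng):
    """n stratified samples in [lo, hi]^d: one stratum per sample per axis."""
    u = np.empty((n, d))
    for j in range(d):
        u[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return lo + u * (hi - lo)

def pso(f, lo, hi, d, n=30, iters=200, seed=0):
    """Minimal global-best PSO (inertia 0.7, cognitive/social weights 1.5)."""
    rng = np.random.default_rng(seed)
    x = latin_hypercube(n, d, lo, hi, rng)
    v = np.zeros((n, d))
    pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n, d)), rng.random((n, d))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        improved = val < pval
        pbest[improved], pval[improved] = x[improved], val[improved]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

sphere = lambda z: float(np.sum((z - 0.5) ** 2))  # toy objective, optimum at 0.5
g, best = pso(sphere, lo=-2.0, hi=2.0, d=4)
assert best < 1e-3  # converges near the optimum
```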

  3. Robust biological parametric mapping: an improved technique for multimodal brain image analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xue; Beason-Held, Lori; Resnick, Susan M.; Landman, Bennett A.

    2011-03-01

    Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and robust inference in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provides a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities.
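    The robust-regression ingredient can be sketched with iteratively reweighted least squares under a Huber-type weight, which down-weights outlying observations; this is a generic M-estimation sketch, not the authors' exact estimator or software.

```python
import numpy as np

def huber_irls(X, y, delta=1.345, iters=50):
    """Huber-weighted IRLS: residuals beyond delta (in robust-scale units)
    get weight delta/|r| instead of 1, so outliers barely influence the fit."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]        # start from OLS
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12      # robust scale via MAD
        w = np.minimum(1.0, delta / (np.abs(r / s) + 1e-12))
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 100)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x + 0.05 * rng.normal(size=100)        # true slope = 2
y[-10:] += 5.0                                         # clustered gross outliers

ols = np.linalg.lstsq(X, y, rcond=None)[0]
rob = huber_irls(X, y)
assert abs(rob[1] - 2.0) < abs(ols[1] - 2.0)           # robust slope closer to truth
```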

  4. Robust estimation approach for blind denoising.

    PubMed

    Rabie, Tamer

    2005-11-01

    This work develops a new robust statistical framework for blind image denoising. Robust statistics addresses the problem of estimation when the idealized assumptions about a system are occasionally violated. The contaminating noise in an image is considered as a violation of the assumption of spatial coherence of the image intensities and is treated as an outlier random variable. A denoised image is estimated by fitting a spatially coherent stationary image model to the available noisy data using a robust estimator-based regression method within an optimal-size adaptive window. The robust formulation aims at eliminating the noise outliers while preserving the edge structures in the restored image. Several examples demonstrating the effectiveness of this robust denoising technique are reported and a comparison with other standard denoising filters is presented.
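    The flavor of robust window-based estimation can be sketched in 1-D: the median is the robust (L1) fit of a locally constant model, so impulse outliers are rejected while edges survive, unlike an averaging filter. The paper's method is richer (adaptive window size, a spatially coherent regression model); this sketch only shows the robust-fit principle.

```python
import numpy as np

def robust_window_filter(x, half=3):
    """Estimate each sample by the median of its window: the median is the
    robust L1 fit of a locally constant model, so impulse outliers are ignored."""
    pad = np.pad(x, half, mode="edge")
    return np.array([np.median(pad[i:i + 2 * half + 1]) for i in range(len(x))])

# Piecewise-constant signal (an "edge") with sparse impulse noise.
clean = np.concatenate([np.zeros(50), np.ones(50)])
noisy = clean.copy()
noisy[3::7] += 3.0                    # outliers every 7th sample

den = robust_window_filter(noisy)
mean_filt = np.convolve(noisy, np.ones(7) / 7, mode="same")
assert np.abs(den - clean).max() < np.abs(mean_filt - clean).max()
assert abs(den[49]) < 0.5 and abs(den[50] - 1.0) < 0.5   # edge preserved
```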

  5. De-identification of health records using Anonym: effectiveness and robustness across datasets.

    PubMed

    Zuccon, Guido; Kotzur, Daniel; Nguyen, Anthony; Bergheim, Anton

    2014-07-01

    Evaluate the effectiveness and robustness of Anonym, a tool for de-identifying free-text health records based on conditional random fields classifiers informed by linguistic and lexical features, as well as features extracted by pattern matching techniques. De-identification of personal health information in electronic health records is essential for the sharing and secondary usage of clinical data. De-identification tools that adapt to different sources of clinical data are attractive as they would require minimal intervention to guarantee high effectiveness. The effectiveness and robustness of Anonym are evaluated across multiple datasets, including the widely adopted Integrating Biology and the Bedside (i2b2) dataset, used for evaluation in a de-identification challenge. The datasets used here vary in type of health records, source of data, and their quality, with one of the datasets containing optical character recognition errors. Anonym identifies and removes up to 96.6% of personal health identifiers (recall) with a precision of up to 98.2% on the i2b2 dataset, outperforming the best system proposed in the i2b2 challenge. The effectiveness of Anonym across datasets is found to depend on the amount of information available for training. Findings show that Anonym compares to the best approach from the 2006 i2b2 shared task. It is easy to retrain Anonym with new datasets; if retrained, the system is robust to variations of training size, data type and quality in presence of sufficient training data. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
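    The pattern-matching side of such de-identification systems can be sketched with a few regular expressions. The expressions below are illustrative toys, not Anonym's actual rules or features; a real system feeds such matches, together with linguistic and lexical features, into a CRF classifier.

```python
import re

# Illustrative PHI patterns (not Anonym's actual rules): dates, phone numbers,
# medical record numbers, and a toy title+surname shape.
PATTERNS = {
    "DATE": r"\b\d{1,2}/\d{1,2}/\d{2,4}\b",
    "PHONE": r"\b\d{3}[-.]\d{3}[-.]\d{4}\b",
    "MRN": r"\bMRN[:\s]*\d{6,10}\b",
    "NAME": r"\b(?:Dr|Mr|Ms|Mrs)\.\s+[A-Z][a-z]+\b",
}

def deidentify(text):
    """Replace each pattern match with its category placeholder."""
    for label, pat in PATTERNS.items():
        text = re.sub(pat, f"[{label}]", text)
    return text

note = "Seen by Dr. Smith on 03/14/2013, MRN: 12345678, call 555-123-4567."
out = deidentify(note)
assert out == "Seen by [NAME] on [DATE], [MRN], call [PHONE]."
```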

  6. Model of areas for identifying risks influencing the compliance of technological processes and products

    NASA Astrophysics Data System (ADS)

    Misztal, A.; Belu, N.

    2016-08-01

    Operation of every company is associated with the risk of interfering with the proper performance of its fundamental processes. This risk is associated with various internal areas of the company, as well as the environment in which it operates. From the point of view of ensuring compliance of the course of specific technological processes and, consequently, product conformity with requirements, it is important to identify these threats and eliminate or reduce the risk of their occurrence. The purpose of this article is to present a model of areas for identifying risks affecting the compliance of processes and products, which is based on multiregional targeted monitoring of typical places of interference and risk management methods. The model is based on the verification of risk analyses carried out in small and medium-sized manufacturing companies in various industries.

  7. Sulfite pretreatment (SPORL) for robust enzymatic saccharification of spruce and red pine

    Treesearch

    J.Y. Zhu; X.J. Pan; G.S. Wang; R. Gleisner

    2009-01-01

    This study established a novel process using sulfite pretreatment to overcome recalcitrance of lignocellulose (SPORL) for robust and efficient bioconversion of softwoods. The process consists of sulfite treatment of wood chips under acidic conditions followed by mechanical size reduction using disk refining. The results indicated that after the SPORL pretreatment of...

  8. CORROSION PROCESS IN REINFORCED CONCRETE IDENTIFIED BY ACOUSTIC EMISSION

    NASA Astrophysics Data System (ADS)

    Kawasaki, Yuma; Kitaura, Misuzu; Tomoda, Yuichi; Ohtsu, Masayasu

    Deterioration of Reinforced Concrete (RC) due to salt attack is known as one of the most serious problems. Thus, development of non-destructive evaluation (NDE) techniques is important to assess the corrosion process. Reinforcement in concrete normally does not corrode because of a passive film on the surface of the reinforcement. When the chloride concentration at the reinforcement exceeds the threshold level, the passive film is destroyed. Thus maintenance is desirable at an early stage. In this study, to identify the onset of corrosion and the nucleation of corrosion-induced cracking in concrete due to expansion of corrosion products, continuous acoustic emission (AE) monitoring is applied. Accelerated corrosion and cyclic wet and dry tests are performed in a laboratory. The SiGMA (Simplified Green's functions for Moment tensor Analysis) procedure is applied to AE waveforms to clarify the source kinematics of micro-cracks: their locations, types, and orientations. Results show that the onset of corrosion and the nucleation of corrosion-induced cracking in concrete are successfully identified. Additionally, cross-sections inside the reinforcement are observed by a scanning electron microscope (SEM). From these results, great promise for AE techniques to monitor salt damage at an early stage in RC structures is demonstrated.

  9. A robust color image watermarking algorithm against rotation attacks

    NASA Astrophysics Data System (ADS)

    Han, Shao-cheng; Yang, Jin-feng; Wang, Rui; Jia, Gui-min

    2018-01-01

    A robust digital watermarking algorithm is proposed based on quaternion wavelet transform (QWT) and discrete cosine transform (DCT) for copyright protection of color images. The luminance component Y of a host color image in YIQ space is decomposed by QWT, and then the coefficients of four low-frequency subbands are transformed by DCT. An original binary watermark scrambled by Arnold map and iterated sine chaotic system is embedded into the mid-frequency DCT coefficients of the subbands. In order to improve the performance of the proposed algorithm against rotation attacks, a rotation detection scheme is implemented before watermark extracting. The experimental results demonstrate that the proposed watermarking scheme shows strong robustness not only against common image processing attacks but also against arbitrary rotation attacks.
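    The Arnold-map scrambling stage can be sketched on its own: the cat map is an area-preserving permutation of pixel coordinates with an exact inverse, so the scrambled watermark is perfectly recoverable at extraction time. The chaotic-sine stage and the QWT/DCT embedding are omitted here.

```python
import numpy as np

def arnold(img, iterations=1):
    """Arnold cat map on an N x N image: (x, y) -> (x + y, x + 2y) mod N."""
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        nxt = np.empty_like(out)
        nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = nxt
    return out

def arnold_inverse(img, iterations=1):
    """Inverse map, from the inverse matrix [[2,-1],[-1,1]]: (x, y) -> (2x - y, y - x) mod N."""
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        nxt = np.empty_like(out)
        nxt[(2 * x - y) % n, (y - x) % n] = out[x, y]
        out = nxt
    return out

rng = np.random.default_rng(3)
wm = (rng.random((32, 32)) > 0.5).astype(np.uint8)       # binary watermark
scrambled = arnold(wm, iterations=7)
assert not np.array_equal(scrambled, wm)                 # pixels are permuted
assert np.array_equal(arnold_inverse(scrambled, 7), wm)  # exactly recoverable
```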

  10. Non-Hermitian bidirectional robust transport

    NASA Astrophysics Data System (ADS)

    Longhi, Stefano

    2017-01-01

    Transport of quantum or classical waves in open systems is known to be strongly affected by non-Hermitian terms that arise from an effective description of system-environment interaction. A simple and paradigmatic example of non-Hermitian transport, originally introduced by Hatano and Nelson two decades ago [N. Hatano and D. R. Nelson, Phys. Rev. Lett. 77, 570 (1996), 10.1103/PhysRevLett.77.570], is the hopping dynamics of a quantum particle on a one-dimensional tight-binding lattice in the presence of an imaginary vector potential. The imaginary gauge field can prevent Anderson localization via non-Hermitian delocalization, opening up a mobility region and realizing robust transport immune to disorder and backscattering. Like for robust transport of topologically protected edge states in quantum Hall and topological insulator systems, non-Hermitian robust transport in the Hatano-Nelson model is unidirectional. However, there is not any physical impediment to observe robust bidirectional non-Hermitian transport. Here it is shown that in a quasi-one-dimensional zigzag lattice, with non-Hermitian (imaginary) hopping amplitudes and a synthetic gauge field, robust transport immune to backscattering can occur bidirectionally along the lattice.
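    The non-Hermitian delocalization can be checked numerically: on a clean ring with asymmetric hoppings t*e^(+g) and t*e^(-g), the Hatano-Nelson spectrum traces an ellipse in the complex plane, the hallmark of the imaginary gauge field. The parameters below are illustrative.

```python
import numpy as np

def hatano_nelson(n, t=1.0, g=0.5):
    """Periodic Hatano-Nelson chain: hopping t*e^{g} one way, t*e^{-g} the other."""
    H = np.zeros((n, n), dtype=complex)
    for i in range(n):
        H[(i + 1) % n, i] = t * np.exp(g)    # amplified hop
        H[i, (i + 1) % n] = t * np.exp(-g)   # attenuated hop
    return H

E = np.linalg.eigvals(hatano_nelson(64))
# Clean-ring spectrum: E(k) = 2t*(cosh(g)*cos k - i*sinh(g)*sin k), an ellipse
# in the complex plane; complex eigenvalues signal non-Hermitian delocalization.
assert np.isclose(np.abs(E.imag).max(), 2 * np.sinh(0.5))
assert np.isclose(np.abs(E.real).max(), 2 * np.cosh(0.5))
```

Adding on-site disorder shrinks the complex (delocalized) part of the spectrum until, for strong enough disorder, all eigenvalues become real and localized.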

  11. Robust control of combustion instabilities

    NASA Astrophysics Data System (ADS)

    Hong, Boe-Shong

    Several interactive dynamical subsystems, each of which has its own time scale and physical significance, are decomposed to build a feedback-controlled combustion-fluid robust dynamics. On the fast time scale, the phenomenon of combustion instability corresponds to the internal feedback of two subsystems: acoustic dynamics and flame dynamics, which are parametrically dependent on the slow-time-scale mean-flow dynamics controlled for global performance by a mean-flow controller. This dissertation constructs such a control system, through modeling, analysis and synthesis, to deal with model uncertainties, environmental noises and time-varying mean-flow operation. The conservation law is decomposed into fast-time acoustic dynamics and slow-time mean-flow dynamics, which serve for synthesizing an LPV (linear parameter varying) L2-gain robust control law, in which a robust observer is embedded for estimating and controlling the internal state, while achieving trade-offs among robustness, performance and operation. The robust controller is formulated as two LPV-type Linear Matrix Inequalities (LMIs), whose numerical solver is developed by the finite-element method. Some important issues related to physical understanding and engineering application are discussed in simulated results of the control system.

  12. SU-E-J-212: Identifying Bones From MRI: A Dictionary Learning and Sparse Regression Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruan, D; Yang, Y; Cao, M

    2014-06-01

    Purpose: To develop an efficient and robust scheme to identify bony anatomy based on MRI-only simulation images. Methods: MRI offers important soft tissue contrast and functional information, yet its lack of correlation to electron density has placed it as an auxiliary modality to CT in radiotherapy simulation and adaptation. An effective scheme to identify bony anatomy is an important first step towards an MR-only simulation/treatment paradigm and would satisfy most practical purposes. We utilize a UTE acquisition sequence to achieve visibility of the bone. In contrast to manual + bulk or registration-based approaches to identifying bones, we propose a novel learning-based approach for improved robustness to MR artefacts and environmental changes. Specifically, local information is encoded with an MR image patch, and the corresponding label is extracted (during training) from simulation CT aligned to the UTE. Within each class (bone vs. nonbone), an overcomplete dictionary is learned so that typical patches within the proper class can be represented as a sparse combination of the dictionary entries. For testing, an acquired UTE-MRI is divided into patches using a sliding scheme, where each patch is sparsely regressed against both bone and nonbone dictionaries, and subsequently claimed to be associated with the class with the smaller residual. Results: The proposed method has been applied to the pilot site of brain imaging and has shown generally good performance, with a dice similarity coefficient of greater than 0.9 in a cross-validation study using 4 datasets. Importantly, it is robust towards consistent foreign objects (e.g., headset) and artefacts related to Gibbs and field heterogeneity. Conclusion: A learning perspective has been developed for inferring bone structures based on UTE MRI. The imaging setting is subject to minimal motion effects and the post-processing is efficient. The improved efficiency and robustness enable a first translation to an MR-only routine.
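    The classify-by-residual step can be sketched as follows, with random matrices standing in for the learned dictionaries and a least-squares fit standing in for true sparse coding (e.g., OMP on an overcomplete dictionary): a patch is assigned to the class whose dictionary reconstructs it with the smaller residual.

```python
import numpy as np

rng = np.random.default_rng(4)
d, k = 16, 8
D_bone = rng.normal(size=(d, k))   # stand-ins for the learned dictionaries
D_soft = rng.normal(size=(d, k))

def residual(D, p):
    """Least-squares fit of patch p in span(D); a real system would use
    sparse coding against an overcomplete dictionary instead."""
    code, *_ = np.linalg.lstsq(D, p, rcond=None)
    return np.linalg.norm(p - D @ code)

def classify(p):
    return "bone" if residual(D_bone, p) < residual(D_soft, p) else "nonbone"

# A patch synthesized from the bone dictionary is claimed by the bone class.
p = D_bone @ rng.normal(size=k) + 0.01 * rng.normal(size=d)
assert classify(p) == "bone"
```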

  13. A fast, robust and tunable synthetic gene oscillator.

    PubMed

    Stricker, Jesse; Cookson, Scott; Bennett, Matthew R; Mather, William H; Tsimring, Lev S; Hasty, Jeff

    2008-11-27

    One defining goal of synthetic biology is the development of engineering-based approaches that enable the construction of gene-regulatory networks according to 'design specifications' generated from computational modelling. This approach provides a systematic framework for exploring how a given regulatory network generates a particular phenotypic behaviour. Several fundamental gene circuits have been developed using this approach, including toggle switches and oscillators, and these have been applied in new contexts such as triggered biofilm development and cellular population control. Here we describe an engineered genetic oscillator in Escherichia coli that is fast, robust and persistent, with tunable oscillatory periods as fast as 13 min. The oscillator was designed using a previously modelled network architecture comprising linked positive and negative feedback loops. Using a microfluidic platform tailored for single-cell microscopy, we precisely control environmental conditions and monitor oscillations in individual cells through multiple cycles. Experiments reveal remarkable robustness and persistence of oscillations in the designed circuit; almost every cell exhibited large-amplitude fluorescence oscillations throughout observation runs. The oscillatory period can be tuned by altering inducer levels, temperature and the media source. Computational modelling demonstrates that the key design principle for constructing a robust oscillator is a time delay in the negative feedback loop, which can mechanistically arise from the cascade of cellular processes involved in forming a functional transcription factor. The positive feedback loop increases the robustness of the oscillations and allows for greater tunability. Examination of our refined model suggested the existence of a simplified oscillator design without positive feedback, and we construct an oscillator strain confirming this computational prediction.
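    The stated design principle, a time delay in the negative feedback loop, can be seen in a one-variable delay model x'(t) = beta/(1 + (x(t-tau)/K)^n) - gamma*x(t): with a long enough delay and steep repression the loop oscillates, while a short delay settles to steady state. The parameters below are illustrative, not the paper's fitted circuit model.

```python
import numpy as np

def simulate(tau=3.0, n_hill=10, beta=1.0, K=0.5, gamma=1.0,
             dt=0.01, t_end=80.0, x0=0.1):
    """Euler integration of x'(t) = beta/(1+(x(t-tau)/K)^n) - gamma*x(t)."""
    steps, lag = int(t_end / dt), int(tau / dt)
    x = np.empty(steps)
    x[0] = x0
    for i in range(steps - 1):
        x_del = x[i - lag] if i >= lag else x0   # constant history before t = 0
        x[i + 1] = x[i] + dt * (beta / (1 + (x_del / K) ** n_hill) - gamma * x[i])
    return x

x = simulate()
late = x[-2000:]                       # last 20 time units: past the transient
assert late.max() - late.min() > 0.3   # sustained large-amplitude oscillation

# Short delay: the same loop settles to steady state instead of oscillating.
late_s = simulate(tau=0.1)[-2000:]
assert late_s.max() - late_s.min() < 0.05
```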

  14. Robustness of multidimensional Brownian ratchets as directed transport mechanisms.

    PubMed

    González-Candela, Ernesto; Romero-Rochín, Víctor; Del Río, Fernando

    2011-08-07

Brownian ratchets have recently been considered as models to describe the ability of certain systems to locate very specific states in multidimensional configuration spaces. This directional process has particularly been proposed as an alternative explanation for the protein folding problem, in which the polypeptide is driven toward the native state by a multidimensional Brownian ratchet. Recognizing the relevance of robustness in biological systems, in this work we analyze such a property of Brownian ratchets by pushing to the limits all the properties considered essential to produce directed transport. Based on the results presented here, we can state that Brownian ratchets are able to deliver current and locate funnel structures under a wide range of conditions. As a result, they represent a simple model that solves Levinthal's paradox with great robustness and flexibility and without requiring any ad hoc biased transition probability. The behavior of Brownian ratchets shown in this article considerably enhances the plausibility of the model for at least part of the structural mechanism behind the protein folding process.
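A one-dimensional flashing ratchet is the simplest member of this model family and shows how directed current arises without any biased transition probability. The sketch below is a deliberately reduced toy (not the paper's multidimensional model), with all parameters made up:

```python
import math
import random

random.seed(1)

# Flashing-ratchet toy: an asymmetric sawtooth potential with period 1 has
# minima at x = k + 0.2, so the barrier is near on the left (0.2 away) and
# far on the right (0.8 away). The potential is switched OFF (free diffusion)
# and ON (particles relax to the minimum of their current well) in cycles.
SIGMA = 0.15           # diffusion spread per OFF phase
N, CYCLES = 500, 60

def nearest_minimum(x):
    # Wells span [k, k + 1); the minimum of that well sits at k + 0.2.
    return math.floor(x) + 0.2

positions = [0.2] * N
for _ in range(CYCLES):
    positions = [x + random.gauss(0.0, SIGMA) for x in positions]  # OFF: diffuse
    positions = [nearest_minimum(x) for x in positions]            # ON: relax

mean_drift = sum(positions) / N - 0.2
```

Because diffusing over the near barrier is much more likely than over the far one, unbiased noise plus the asymmetric potential yields a net drift (here toward the steep side), illustrating how a ratchet extracts directed transport from symmetric fluctuations.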

  15. Reliability, robustness, and reproducibility in mouse behavioral phenotyping: a cross-laboratory study

    PubMed Central

    Mandillo, Silvia; Tucci, Valter; Hölter, Sabine M.; Meziane, Hamid; Banchaabouchi, Mumna Al; Kallnik, Magdalena; Lad, Heena V.; Nolan, Patrick M.; Ouagazzal, Abdel-Mouttalib; Coghill, Emma L.; Gale, Karin; Golini, Elisabetta; Jacquot, Sylvie; Krezel, Wojtek; Parker, Andy; Riet, Fabrice; Schneider, Ilka; Marazziti, Daniela; Auwerx, Johan; Brown, Steve D. M.; Chambon, Pierre; Rosenthal, Nadia; Tocchini-Valentini, Glauco; Wurst, Wolfgang

    2008-01-01

Establishing standard operating procedures (SOPs) as tools for the analysis of behavioral phenotypes is fundamental to mouse functional genomics. It is essential that the tests designed provide reliable measures of the process under investigation but most importantly that these are reproducible across both time and laboratories. For this reason, we devised and tested a set of SOPs to investigate mouse behavior. Five research centers were involved across France, Germany, Italy, and the UK in this study, as part of the EUMORPHIA program. All the procedures underwent a cross-validation experimental study to investigate the robustness of the designed protocols. Four inbred reference strains (C57BL/6J, C3HeB/FeJ, BALB/cByJ, 129S2/SvPas), reflecting their use as common background strains in mutagenesis programs, were analyzed to validate these tests. We demonstrate that the operating procedures employed, which include the open field, SHIRPA, grip-strength, rotarod, Y-maze, prepulse inhibition of acoustic startle response, and tail flick tests, generated reproducible results between laboratories for a number of the test output parameters. However, we also identified several uncontrolled variables that constitute confounding factors in behavioral phenotyping. The EUMORPHIA SOPs described here are an important starting point for the ongoing development of increasingly robust phenotyping platforms and their application in large-scale, multicentre mouse phenotyping programs. PMID:18505770

  16. Robust, open-source removal of systematics in Kepler data

    NASA Astrophysics Data System (ADS)

    Aigrain, S.; Parviainen, H.; Roberts, S.; Reece, S.; Evans, T.

    2017-10-01

We present ARC2 (Astrophysically Robust Correction 2), an open-source Python-based systematics-correction pipeline for Kepler prime mission long-cadence light curves. The ARC2 pipeline identifies and corrects any isolated discontinuities in the light curves and then removes trends common to many light curves. These trends are modelled using the publicly available co-trending basis vectors, within an (approximate) Bayesian framework with 'shrinkage' priors to minimize the risk of overfitting and the injection of any additional noise into the corrected light curves, while keeping any astrophysical signals intact. We show that the ARC2 pipeline's performance matches that of the standard Kepler PDC-MAP data products using standard noise metrics, and demonstrate its ability to preserve astrophysical signals using injection tests with simulated stellar rotation and planetary transit signals. Although it is not identical, the ARC2 pipeline can thus be used as an open-source alternative to PDC-MAP, whenever the ability to model the impact of the systematics removal process on other kinds of signal is important.
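The core idea, regressing each light curve on cotrending basis vectors under a Gaussian "shrinkage" prior so that common-mode systematics are removed without absorbing the astrophysical signal, can be sketched with plain NumPy. The basis vectors, light curve, and transit below are synthetic stand-ins, not Kepler data or the actual ARC2 code:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
t = np.linspace(0, 1, n)

# Synthetic "cotrending basis vectors" (stand-ins for the real Kepler CBVs).
cbvs = np.column_stack([t, t**2, np.sin(2 * np.pi * 3 * t)])

# Light curve = systematics (linear combination of CBVs) + box "transit" + noise.
true_w = np.array([2.0, -1.5, 0.8])
transit = np.where((t > 0.45) & (t < 0.50), -0.01, 0.0)
flux = cbvs @ true_w + transit + rng.normal(0, 0.001, n)

# MAP estimate under a Gaussian shrinkage prior on the weights is ridge
# regression: w = argmin ||flux - CBV w||^2 + lam ||w||^2.
lam = 1e-3
A = cbvs.T @ cbvs + lam * np.eye(cbvs.shape[1])
w = np.linalg.solve(A, cbvs.T @ flux)

corrected = flux - cbvs @ w   # systematics removed, transit preserved
```

The smooth basis vectors soak up the trends but project only weakly onto the short transit, so the transit depth survives the correction; this is the behaviour the injection tests in the abstract quantify.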

  17. Robustness of radiomic breast features of benign lesions and luminal A cancers across MR magnet strengths

    NASA Astrophysics Data System (ADS)

    Whitney, Heather M.; Drukker, Karen; Edwards, Alexandra; Papaioannou, John; Giger, Maryellen L.

    2018-02-01

Radiomics features extracted from breast lesion images have shown potential in diagnosis and prognosis of breast cancer. As clinical institutions transition from 1.5 T to 3.0 T magnetic resonance imaging (MRI), it is helpful to identify features that are robust across these field strengths. In this study, dynamic contrast-enhanced MR images were acquired retrospectively under IRB/HIPAA compliance, yielding 738 cases: 241 and 124 benign lesions imaged at 1.5 T and 3.0 T and 231 and 142 luminal A cancers imaged at 1.5 T and 3.0 T, respectively. Lesions were segmented using a fuzzy C-means method. Extracted radiomic values for each group of lesions by cancer status and field strength of acquisition were compared using a Kolmogorov-Smirnov test for the null hypothesis that the two groups being compared came from the same distribution, with p-values being corrected for multiple comparisons by the Holm-Bonferroni method. Two shape features, one texture feature, and three enhancement variance kinetics features were found to be potentially robust. All potentially robust features had areas under the receiver operating characteristic curve (AUC) statistically greater than 0.5 in the task of distinguishing between lesion types (range of means 0.57-0.78). The significant difference in voxel size between field strengths of acquisition limits the ability to affirm more features as robust or not robust according to field strength alone, and inhomogeneities in the static and radiofrequency fields could also have affected the assessment of kinetic curve features as robust or not. Vendor-specific image scaling could have also been a factor. These findings will contribute to the development of radiomic signatures that use features identified as robust across field strength.
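The statistical screen described, a two-sample Kolmogorov-Smirnov test per feature with Holm-Bonferroni correction, can be sketched as follows. The feature values are synthetic stand-ins (not the study's data), the KS test comes from SciPy, and the Holm step-down is written out by hand:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

# Synthetic radiomic feature values for one lesion group imaged at 1.5 T vs
# 3.0 T. Feature 0 is made field-strength-dependent; features 1-4 are not.
n_feat, n1, n2 = 5, 200, 150
group_15T = rng.normal(0.0, 1.0, (n1, n_feat))
group_30T = rng.normal(0.0, 1.0, (n2, n_feat))
group_30T[:, 0] += 1.0

pvals = np.array([ks_2samp(group_15T[:, j], group_30T[:, j]).pvalue
                  for j in range(n_feat)])

def holm_reject(p, alpha=0.05):
    # Holm-Bonferroni: step down through sorted p-values, comparing the
    # k-th smallest against alpha / (m - k).
    m = len(p)
    reject = np.zeros(m, dtype=bool)
    for rank, idx in enumerate(np.argsort(p)):
        if p[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break   # once one test fails, all larger p-values fail too
    return reject

rejected = holm_reject(pvals)
# A feature is "potentially robust" across field strengths when we fail to
# reject the null that its two distributions are the same.
robust = ~rejected
```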

  18. Cloud-Scale Genomic Signals Processing for Robust Large-Scale Cancer Genomic Microarray Data Analysis.

    PubMed

    Harvey, Benjamin Simeon; Ji, Soo-Yeon

    2017-01-01

As microarray data available to scientists continues to increase in size and complexity, it has become overwhelmingly important to find multiple ways to bring oncological inference to the bioinformatics community through the analysis of large-scale cancer genomic (LSCG) DNA and mRNA microarray data. Though there have been many attempts to elucidate the issue of bringing forth biological interpretation by means of wavelet preprocessing and classification, there has not been a research effort that focuses on a cloud-scale distributed parallel (CSDP) separable 1-D wavelet decomposition technique for denoising through differential expression thresholding and classification of LSCG microarray data. This research presents a novel methodology that utilizes a CSDP separable 1-D method for wavelet-based transformation in order to initialize a threshold which will retain significantly expressed genes through the denoising process for robust classification of cancer patients. Additionally, the overall study was implemented within a CSDP environment. Cloud computing and wavelet-based thresholding for denoising were used for the classification of samples within the Global Cancer Map, Cancer Cell Line Encyclopedia, and The Cancer Genome Atlas. The results proved that separable 1-D parallel distributed wavelet denoising in the cloud and differential expression thresholding increased the computational performance and enabled the generation of higher quality LSCG microarray datasets, which led to more accurate classification results.
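Stripped of the cloud-scale distribution, the denoising step is wavelet decomposition plus coefficient thresholding. The single-level, single-machine Haar sketch below illustrates only that core operation, on a synthetic piecewise-constant "expression profile" rather than real microarray data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Piecewise-constant synthetic signal plus noise (stand-in for expression data).
n = 512
clean = np.repeat(rng.normal(0, 3, n // 16), 16)
noisy = clean + rng.normal(0, 1.0, n)

# Single-level Haar DWT: pairwise averages (approximation) and differences
# (detail).
s = 1 / np.sqrt(2)
approx = (noisy[0::2] + noisy[1::2]) * s
detail = (noisy[0::2] - noisy[1::2]) * s

# Soft-threshold the detail coefficients: small coefficients are treated as
# noise and shrunk to zero, large ones (true jumps) survive.
thr = np.sqrt(2 * np.log(n))   # universal threshold for unit noise sigma
detail = np.sign(detail) * np.maximum(np.abs(detail) - thr, 0.0)

# Inverse single-level Haar transform.
denoised = np.empty(n)
denoised[0::2] = (approx + detail) * s
denoised[1::2] = (approx - detail) * s

mse_noisy = np.mean((noisy - clean) ** 2)
mse_denoised = np.mean((denoised - clean) ** 2)
```

In the paper's pipeline the analogous thresholding is tied to differential expression and the transform runs distributed across cloud workers; here the point is simply that thresholding wavelet details reduces reconstruction error.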

  19. Robust hashing for 3D models

    NASA Astrophysics Data System (ADS)

    Berchtold, Waldemar; Schäfer, Marcel; Rettig, Michael; Steinebach, Martin

    2014-02-01

3D models and applications are of utmost interest in both science and industry. As their usage grows, so does their number, and with it the challenge of correctly identifying them. Content identification is commonly done by cryptographic hashes. However, they fail as a solution in application scenarios such as computer aided design (CAD), scientific visualization or video games, because even the smallest alteration of the 3D model, e.g. conversion or compression operations, massively changes the cryptographic hash as well. Therefore, this work presents a robust hashing algorithm for 3D mesh data. The algorithm applies several different bit extraction methods. They are built to resist desired alterations of the model as well as malicious attacks intending to prevent correct allocation. The different bit extraction methods are tested against each other and, as far as possible, the hashing algorithm is compared to the state of the art. The parameters tested are robustness, security and runtime performance as well as False Acceptance Rate (FAR) and False Rejection Rate (FRR); a calculation of the hash-collision probability is also included. The introduced hashing algorithm is kept adaptive, e.g. in hash length, to serve as a proper tool for all applications in practice.
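The contrast with cryptographic hashing can be made concrete with a toy robust hash. The function below is hypothetical and far simpler than the paper's multi-method algorithm; it only illustrates the principle of deriving bits from coarse, alteration-tolerant geometry statistics rather than from raw bytes:

```python
import math

def robust_mesh_hash(vertices, bins=4):
    """Toy perceptual hash for a 3D point set (hypothetical sketch): one bit
    per radial-distance band, so small conversion/compression jitter leaves
    the hash unchanged while a cryptographic hash of the bytes would flip."""
    n = len(vertices)
    cx = sum(v[0] for v in vertices) / n
    cy = sum(v[1] for v in vertices) / n
    cz = sum(v[2] for v in vertices) / n
    dists = sorted(math.dist(v, (cx, cy, cz)) for v in vertices)
    scale = dists[-1] or 1.0                 # normalise for scale invariance
    mean_d = sum(dists) / n / scale
    step = n / bins
    bits = 0
    for b in range(bins):
        band = dists[int(b * step):int((b + 1) * step)]
        band_mean = sum(band) / len(band) / scale
        bits = (bits << 1) | int(band_mean > mean_d)
    return bits

# Two concentric rings of vertices; a slightly distorted copy hashes
# identically because the band statistics barely move.
def ring(r):
    return [(r * math.cos(2 * math.pi * k / 16),
             r * math.sin(2 * math.pi * k / 16), 0.0) for k in range(16)]

model = ring(1.0) + ring(3.0)
distorted = [(x * 1.01, y * 0.99, z) for (x, y, z) in model]
h_model = robust_mesh_hash(model)
h_distorted = robust_mesh_hash(distorted)
```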

  20. Less can be more: How to make operations more flexible and robust with fewer resources

    NASA Astrophysics Data System (ADS)

Haksöz, Çağrı; Katsikopoulos, Konstantinos; Gigerenzer, Gerd

    2018-06-01

    We review empirical evidence from practice and general theoretical conditions, under which simple rules of thumb can help to make operations flexible and robust. An operation is flexible when it responds adaptively to adverse events such as natural disasters; an operation is robust when it is less affected by adverse events in the first place. We illustrate the relationship between flexibility and robustness in the context of supply chain risk. In addition to increasing flexibility and robustness, simple rules simultaneously reduce the need for resources such as time, money, information, and computation. We illustrate the simple-rules approach with an easy-to-use graphical aid for diagnosing and managing supply chain risk. More generally, we recommend a four-step process for determining the amount of resources that decision makers should invest in so as to increase flexibility and robustness.

  1. Robustness and structure of complex networks

    NASA Astrophysics Data System (ADS)

    Shao, Shuai

This dissertation covers the two major parts of my PhD research on statistical physics and complex networks: i) modeling a new type of attack -- localized attack, and investigating robustness of complex networks under this type of attack; ii) discovering the clustering structure in complex networks and its influence on the robustness of coupled networks. Complex networks appear in every aspect of our daily life and are widely studied in Physics, Mathematics, Biology, and Computer Science. One important property of complex networks is their robustness under attacks, which depends crucially on the nature of attacks and the structure of the networks themselves. Previous studies have focused on two types of attack: random attack and targeted attack, which, however, are insufficient to describe many real-world damages. Here we propose a new type of attack -- localized attack, and study the robustness of complex networks under this type of attack, both analytically and via simulation. On the other hand, we also study the clustering structure in the network, and its influence on the robustness of a complex network system. In the first part, we propose a theoretical framework to study the robustness of complex networks under localized attack based on percolation theory and the generating function method. We investigate the percolation properties, including the critical threshold of the phase transition p_c and the size of the giant component P_∞. We compare localized attack with random attack and find that while random regular (RR) networks are more robust against localized attack, Erdős-Rényi (ER) networks are equally robust under both types of attack. As for scale-free (SF) networks, their robustness depends crucially on the degree exponent λ. The simulation results show perfect agreement with theoretical predictions. We also test our model on two real-world networks: a peer-to-peer computer network and an airline network, and find that the real-world networks
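The quantities studied here can be measured directly by brute-force simulation. The sketch below (standard library only, toy scale; the dissertation's analysis uses generating functions instead) removes a fraction of nodes either at random or as a localized "ball" grown outward from a seed, then measures the giant component:

```python
import random
from collections import deque

random.seed(3)

# Erdos-Renyi graph G(n, p) with mean degree ~4.
n, k_mean = 2000, 4.0
p = k_mean / (n - 1)
adj = [set() for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < p:
            adj[i].add(j); adj[j].add(i)

def giant_fraction(removed):
    seen, best = set(removed), 0
    for s in range(n):
        if s in seen: continue
        comp, q = 0, deque([s]); seen.add(s)
        while q:
            u = q.popleft(); comp += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w); q.append(w)
        best = max(best, comp)
    return best / n

def random_attack(frac):
    return set(random.sample(range(n), int(frac * n)))

def localized_attack(frac):
    # Remove a connected ball: the seed, then neighbours layer by layer.
    target, removed, q, nxt = int(frac * n), set(), deque([0]), 1
    while len(removed) < target:
        if not q:
            while nxt in removed: nxt += 1
            q.append(nxt); nxt += 1
        u = q.popleft()
        if u in removed: continue
        removed.add(u)
        q.extend(w for w in adj[u] if w not in removed)
    return removed

g_intact = giant_fraction(set())
g_random = giant_fraction(random_attack(0.4))
g_local = giant_fraction(localized_attack(0.4))
```

For ER networks the two attacks leave statistically identical remnants, consistent with the abstract's claim that ER networks are equally robust under both attack types.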

  2. Robust Non-Wetting PTFE Surfaces by Femtosecond Laser Machining

    PubMed Central

    Liang, Fang; Lehr, Jorge; Danielczak, Lisa; Leask, Richard; Kietzig, Anne-Marie

    2014-01-01

    Nature shows many examples of surfaces with extraordinary wettability, which can often be associated with particular air-trapping surface patterns. Here, robust non-wetting surfaces have been created by femtosecond laser ablation of polytetrafluoroethylene (PTFE). The laser-created surface structure resembles a forest of entangled fibers, which support structural superhydrophobicity even when the surface chemistry is changed by gold coating. SEM analysis showed that the degree of entanglement of hairs and the depth of the forest pattern correlates positively with accumulated laser fluence and can thus be influenced by altering various laser process parameters. The resulting fibrous surfaces exhibit a tremendous decrease in wettability compared to smooth PTFE surfaces; droplets impacting the virgin or gold coated PTFE forest do not wet the surface but bounce off. Exploratory bioadhesion experiments showed that the surfaces are truly air-trapping and do not support cell adhesion. Therewith, the created surfaces successfully mimic biological surfaces such as insect wings with robust anti-wetting behavior and potential for antiadhesive applications. In addition, the fabrication can be carried out in one process step, and our results clearly show the insensitivity of the resulting non-wetting behavior to variations in the process parameters, both of which make it a strong candidate for industrial applications. PMID:25110862

  3. Robust Library Building for Autonomous Classification of Downhole Geophysical Logs Using Gaussian Processes

    NASA Astrophysics Data System (ADS)

    Silversides, Katherine L.; Melkumyan, Arman

    2017-03-01

    Machine learning techniques such as Gaussian Processes can be used to identify stratigraphically important features in geophysical logs. The marker shales in the banded iron formation hosted iron ore deposits of the Hamersley Ranges, Western Australia, form distinctive signatures in the natural gamma logs. The identification of these marker shales is important for stratigraphic identification of unit boundaries for the geological modelling of the deposit. Machine learning techniques each have different unique properties that will impact the results. For Gaussian Processes (GPs), the output values are inclined towards the mean value, particularly when there is not sufficient information in the library. The impact that these inclinations have on the classification can vary depending on the parameter values selected by the user. Therefore, when applying machine learning techniques, care must be taken to fit the technique to the problem correctly. This study focuses on optimising the settings and choices for training a GPs system to identify a specific marker shale. We show that the final results converge even when different, but equally valid starting libraries are used for the training. To analyse the impact on feature identification, GP models were trained so that the output was inclined towards a positive, neutral or negative output. For this type of classification, the best results were when the pull was towards a negative output. We also show that the GP output can be adjusted by using a standard deviation coefficient that changes the balance between certainty and accuracy in the results.
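The stated inclination of GP outputs toward the mean when the library carries little information can be shown with a minimal NumPy implementation of GP regression. The squared-exponential kernel and all depth/gamma values below are illustrative, not the production system or real log data:

```python
import numpy as np

def sq_exp(a, b, ell=2.0, var=1.0):
    # Squared-exponential (RBF) covariance between two sets of 1-D inputs.
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

depth_train = np.array([10.0, 11.0, 12.0, 13.0, 14.0])
gamma_train = np.array([0.8, 1.6, 2.1, 1.7, 0.9])   # a shale-like peak
prior_mean = 0.0
noise = 1e-4

K = sq_exp(depth_train, depth_train) + noise * np.eye(len(depth_train))
alpha = np.linalg.solve(K, gamma_train - prior_mean)

def predict(depth):
    return prior_mean + sq_exp(np.atleast_1d(depth), depth_train) @ alpha

near = float(predict(12.0))   # inside the data: tracks the observations
far = float(predict(60.0))    # far from any data: reverts to the prior mean
```

Far from the training depths the kernel weights vanish and the posterior mean collapses to the prior mean, which is exactly why the abstract warns that the user's choice of prior mean and library coverage shifts the classification output.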

  4. Robustness Recipes for Minimax Robust Optimization in Intensity Modulated Proton Therapy for Oropharyngeal Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Voort, Sebastian van der; Section of Nuclear Energy and Radiation Applications, Department of Radiation Science and Technology, Delft University of Technology, Delft; Water, Steven van de

Purpose: We aimed to derive a “robustness recipe” giving the range robustness (RR) and setup robustness (SR) settings (ie, the error values) that ensure adequate clinical target volume (CTV) coverage in oropharyngeal cancer patients for given Gaussian distributions of systematic setup, random setup, and range errors (characterized by standard deviations of Σ, σ, and ρ, respectively) when used in minimax worst-case robust intensity modulated proton therapy (IMPT) optimization. Methods and Materials: For the analysis, contoured computed tomography (CT) scans of 9 unilateral and 9 bilateral patients were used. An IMPT plan was considered robust if, for at least 98% of the simulated fractionated treatments, 98% of the CTV received 95% or more of the prescribed dose. For fast assessment of the CTV coverage for given error distributions (ie, different values of Σ, σ, and ρ), polynomial chaos methods were used. Separate recipes were derived for the unilateral and bilateral cases using one patient from each group, and all 18 patients were included in the validation of the recipes. Results: Treatment plans for bilateral cases are intrinsically more robust than those for unilateral cases. The required RR only depends on ρ, and SR can be fitted by second-order polynomials in Σ and σ. The formulas for the derived robustness recipes are as follows: Unilateral patients need SR = −0.15Σ² + 0.27σ² + 1.85Σ − 0.06σ + 1.22 and RR = 3% for ρ = 1% and ρ = 2%; bilateral patients need SR = −0.07Σ² + 0.19σ² + 1.34Σ − 0.07σ + 1.17 and RR = 3% and 4% for ρ = 1% and 2%, respectively. For the recipe validation, 2 plans were generated for each of the 18 patients corresponding to Σ = σ = 1.5 mm and ρ = 0% and 2%. Thirty-four plans had adequate CTV coverage in 98% or more of the simulated fractionated treatments; the remaining 2 had adequate coverage in 97.8% and 97.9%. Conclusions: Robustness recipes were derived
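The closed-form recipes quoted in the abstract transcribe directly into code. The functions below only reproduce those fitted polynomials (Σ and σ in mm, ρ in %, SR in mm, RR in %); they are not part of the published software:

```python
def setup_robustness(sigma_sys, sigma_rand, bilateral):
    """Setup-robustness setting (mm) from the abstract's fitted second-order
    polynomials; sigma_sys is the systematic (Sigma) and sigma_rand the
    random (sigma) setup-error standard deviation, both in mm."""
    if bilateral:
        return (-0.07 * sigma_sys**2 + 0.19 * sigma_rand**2
                + 1.34 * sigma_sys - 0.07 * sigma_rand + 1.17)
    return (-0.15 * sigma_sys**2 + 0.27 * sigma_rand**2
            + 1.85 * sigma_sys - 0.06 * sigma_rand + 1.22)

def range_robustness(range_error_pct, bilateral):
    """Range-robustness setting (%) per the abstract's recipe."""
    if bilateral:
        return 3.0 if range_error_pct <= 1.0 else 4.0
    return 3.0   # unilateral: RR = 3% for rho = 1% and 2%
```

For the validation scenario quoted in the abstract (Σ = σ = 1.5 mm), the recipe gives SR = 4.175 mm for unilateral and SR = 3.345 mm for bilateral patients.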

  5. Robust image matching via ORB feature and VFC for mismatch removal

    NASA Astrophysics Data System (ADS)

    Ma, Tao; Fu, Wenxing; Fang, Bin; Hu, Fangyu; Quan, Siwen; Ma, Jie

    2018-03-01

Image matching underlies many image processing and computer vision problems, such as object recognition or structure from motion. Current methods rely on good feature descriptors and mismatch removal strategies for detection and matching. In this paper, we propose a robust image matching approach based on the ORB feature and VFC for mismatch removal. ORB (Oriented FAST and Rotated BRIEF) is an outstanding feature descriptor, offering performance comparable to SIFT at lower computational cost. VFC (Vector Field Consensus) is a state-of-the-art mismatch removal method. The experimental results demonstrate that our method is efficient and robust.
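The full VFC algorithm fits a smooth vector field to the putative matches with EM in a reproducing-kernel Hilbert space (and the ORB stage would use an image library such as OpenCV). The sketch below is a deliberately simplified consistency filter in the same spirit, not VFC itself: matches whose displacement deviates strongly from the median displacement are discarded. All match data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic putative matches: 80 inliers follow a smooth motion field
# (small rotation + translation), 20 outliers point anywhere.
n_in, n_out = 80, 20
src = rng.uniform(0, 100, (n_in + n_out, 2))
theta = 0.02
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
dst = src @ R.T + np.array([5.0, -3.0])
dst[n_in:] = rng.uniform(0, 100, (n_out, 2))          # mismatches

# Consistency filter: keep matches whose displacement vector stays close to
# the robust (median) displacement of the whole set.
disp = dst - src
median_disp = np.median(disp, axis=0)
residual = np.linalg.norm(disp - median_disp, axis=1)
keep = residual < 3.0 * np.median(residual)

inlier_recall = keep[:n_in].mean()
outlier_rate = keep[n_in:].mean()
```

Because the inlier displacements vary smoothly while mismatches scatter arbitrarily, even this crude consensus rule separates the two populations; VFC achieves the same effect far more robustly for non-rigid motion fields.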

  6. 20 CFR 1010.300 - What processes are to be implemented to identify covered persons?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false What processes are to be implemented to identify covered persons? 1010.300 Section 1010.300 Employees' Benefits OFFICE OF THE ASSISTANT SECRETARY... FOR COVERED PERSONS Applying Priority of Service § 1010.300 What processes are to be implemented to...

  7. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    NASA Astrophysics Data System (ADS)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in digital processing of two-dimensional computed tomography images is to identify the contour of component elements. This paper deals with the collective work of specialists in medicine and applied mathematics in computer science on elaborating new algorithms and methods in medical 2D and 3D imagery.

  8. Autonomous Modelling of X-ray Spectra Using Robust Global Optimization Methods

    NASA Astrophysics Data System (ADS)

    Rogers, Adam; Safi-Harb, Samar; Fiege, Jason

    2015-08-01

    The standard approach to model fitting in X-ray astronomy is by means of local optimization methods. However, these local optimizers suffer from a number of problems, such as a tendency for the fit parameters to become trapped in local minima, and can require an involved process of detailed user intervention to guide them through the optimization process. In this work we introduce a general GUI-driven global optimization method for fitting models to X-ray data, written in MATLAB, which searches for optimal models with minimal user interaction. We directly interface with the commonly used XSPEC libraries to access the full complement of pre-existing spectral models that describe a wide range of physics appropriate for modelling astrophysical sources, including supernova remnants and compact objects. Our algorithm is powered by the Ferret genetic algorithm and Locust particle swarm optimizer from the Qubist Global Optimization Toolbox, which are robust at finding families of solutions and identifying degeneracies. This technique will be particularly instrumental for multi-parameter models and high-fidelity data. In this presentation, we provide details of the code and use our techniques to analyze X-ray data obtained from a variety of astrophysical sources.
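A genetic algorithm's ability to fit a spectral model without user guidance can be shown at toy scale. The sketch below fits a two-parameter power law to a synthetic "spectrum"; it is a generic GA, not the Ferret/Qubist optimizer, and no XSPEC models are involved:

```python
import random

random.seed(11)

# Synthetic spectrum: f(E) = norm * E**(-index) plus noise.
energies = [0.5 + 0.1 * i for i in range(80)]
observed = [4.0 * e ** (-1.7) + random.gauss(0.0, 0.02) for e in energies]

def cost(params):
    norm, index = params
    return sum((norm * e ** (-index) - y) ** 2
               for e, y in zip(energies, observed))

BOUNDS = [(0.1, 10.0), (0.0, 3.0)]

def clip(v, lo, hi):
    return max(lo, min(hi, v))

def mutate(p):
    return tuple(clip(x + random.gauss(0.0, 0.1), lo, hi)
                 for x, (lo, hi) in zip(p, BOUNDS))

def crossover(a, b):
    # Blend crossover: child gene lies between the parents' genes.
    return tuple(x + random.random() * (y - x) for x, y in zip(a, b))

pop = [tuple(random.uniform(lo, hi) for lo, hi in BOUNDS) for _ in range(40)]
initial_cost = min(cost(p) for p in pop)
for _ in range(60):
    pop.sort(key=cost)
    elite = pop[:10]                       # elitism: keep the best candidates
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(30)]
best = min(pop, key=cost)
best_cost = cost(best)
```

Population-based search like this explores the parameter space globally, which is what makes it resistant to the local minima that trap the standard local optimizers discussed in the abstract.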

  9. Robust model predictive control for multi-step short range spacecraft rendezvous

    NASA Astrophysics Data System (ADS)

    Zhu, Shuyi; Sun, Ran; Wang, Jiaolong; Wang, Jihe; Shao, Xiaowei

    2018-07-01

This work presents a robust model predictive control (MPC) approach for the multi-step short range spacecraft rendezvous problem. During the specific short range phase concerned, the chaser is supposed to be initially outside the line-of-sight (LOS) cone. Therefore, the rendezvous process naturally includes two steps: the first step is to transfer the chaser into the LOS cone and the second step is to transfer the chaser into the aimed region with its motion confined within the LOS cone. A novel MPC framework named Mixed MPC (M-MPC) is proposed, which is the combination of the Variable-Horizon MPC (VH-MPC) framework and the Fixed-Instant MPC (FI-MPC) framework. The M-MPC framework enables the optimization for the two steps to be implemented jointly rather than to be separated artificially, and its computation workload is acceptable for the usually low-power processors onboard spacecraft. Then considering that disturbances including modeling error, sensor noise and thrust uncertainty may induce undesired constraint violations, a robust technique is developed and it is attached to the above M-MPC framework to form a robust M-MPC approach. The robust technique is based on the chance-constrained idea, which ensures that constraints can be satisfied with a prescribed probability. It improves the robust technique proposed by Gavilan et al., because it eliminates unnecessary conservativeness by explicitly incorporating known statistical properties of the navigation uncertainty. The efficacy of the robust M-MPC approach is shown in a simulation study.

  10. Identifying differences in biased affective information processing in major depression.

    PubMed

    Gollan, Jackie K; Pane, Heather T; McCloskey, Michael S; Coccaro, Emil F

    2008-05-30

    This study investigates the extent to which participants with major depression differ from healthy comparison participants in the irregularities in affective information processing, characterized by deficits in facial expression recognition, intensity categorization, and reaction time to identifying emotionally salient and neutral information. Data on diagnoses, symptom severity, and affective information processing using a facial recognition task were collected from 66 participants, male and female between ages 18 and 54 years, grouped by major depressive disorder (N=37) or healthy non-psychiatric (N=29) status. Findings from MANCOVAs revealed that major depression was associated with a significantly longer reaction time to sad facial expressions compared with healthy status. Also, depressed participants demonstrated a negative bias towards interpreting neutral facial expressions as sad significantly more often than healthy participants. In turn, healthy participants interpreted neutral faces as happy significantly more often than depressed participants. No group differences were observed for facial expression recognition and intensity categorization. The observed effects suggest that depression has significant effects on the perception of the intensity of negative affective stimuli, delayed speed of processing sad affective information, and biases towards interpreting neutral faces as sad.

  11. Integrated hot-melt extrusion - injection molding continuous tablet manufacturing platform: Effects of critical process parameters and formulation attributes on product robustness and dimensional stability.

    PubMed

    Desai, Parind M; Hogan, Rachael C; Brancazio, David; Puri, Vibha; Jensen, Keith D; Chun, Jung-Hoon; Myerson, Allan S; Trout, Bernhardt L

    2017-10-05

    This study provides a framework for robust tablet development using an integrated hot-melt extrusion-injection molding (IM) continuous manufacturing platform. Griseofulvin, maltodextrin, xylitol and lactose were employed as drug, carrier, plasticizer and reinforcing agent respectively. A pre-blended drug-excipient mixture was fed from a loss-in-weight feeder to a twin-screw extruder. The extrudate was subsequently injected directly into the integrated IM unit and molded into tablets. Tablets were stored in different storage conditions up to 20 weeks to monitor physical stability and were evaluated by polarized light microscopy, DSC, SEM, XRD and dissolution analysis. Optimized injection pressure provided robust tablet formulations. Tablets manufactured at low and high injection pressures exhibited the flaws of sink marks and flashing respectively. Higher solidification temperature during IM process reduced the thermal induced residual stress and prevented chipping and cracking issues. Polarized light microscopy revealed a homogeneous dispersion of crystalline griseofulvin in an amorphous matrix. DSC underpinned the effect of high tablet residual moisture on maltodextrin-xylitol phase separation that resulted in dimensional instability. Tablets with low residual moisture demonstrated long term dimensional stability. This study serves as a model for IM tablet formulations for mechanistic understanding of critical process parameters and formulation attributes required for optimal product performance. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Resonant Interneurons Can Increase Robustness of Gamma Oscillations.

    PubMed

    Tikidji-Hamburyan, Ruben A; Martínez, Joan José; White, John A; Canavier, Carmen C

    2015-11-25

Gamma oscillations are believed to play a critical role in information processing, encoding, and retrieval. Inhibitory interneuronal network gamma (ING) oscillations may arise from a coupled oscillator mechanism in which individual neurons oscillate or from a population oscillator in which individual neurons fire sparsely and stochastically. All ING mechanisms, including the one proposed herein, rely on alternating waves of inhibition and windows of opportunity for spiking. The coupled oscillator model implemented with Wang-Buzsáki model neurons is not sufficiently robust to heterogeneity in excitatory drive, and therefore intrinsic frequency, to account for in vitro models of ING. Similarly, in a tightly synchronized regime, the stochastic population oscillator model is often characterized by sparse firing, whereas interneurons both in vivo and in vitro do not fire sparsely during gamma, but rather on average every other cycle. We substituted so-called resonator neural models, which exhibit class 2 excitability and postinhibitory rebound (PIR), for the integrators that are typically used. This results in much greater robustness to heterogeneity that actually increases as the average participation in spikes per cycle approximates physiological levels. Moreover, dynamic clamp experiments that show autapse-induced firing in entorhinal cortical interneurons support the idea that PIR can serve as a network gamma mechanism. Furthermore, parvalbumin-positive (PV(+)) cells were much more likely to display both PIR and autapse-induced firing than GAD2(+) cells, supporting the view that PV(+) fast-firing basket cells are more likely to exhibit class 2 excitability than other types of inhibitory interneurons. Gamma oscillations are believed to play a critical role in information processing, encoding, and retrieval. Networks of inhibitory interneurons are thought to be essential for these oscillations. We show that one class of interneurons with an abrupt onset of firing
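Postinhibitory rebound, the resonator property the abstract leans on, can be demonstrated with Izhikevich's resonate-and-fire neuron, a generic resonator model (not the paper's specific cell models; parameters are illustrative). The subthreshold state is a damped complex oscillation, so releasing the cell from inhibition lets the state rotate up past rest:

```python
# Resonate-and-fire subthreshold dynamics: z' = (b + i*w) z + I, with b < 0
# (damping) and w the resonant angular frequency. A hyperpolarizing pulse
# pushes Re(z) down; after release the state spirals back and overshoots
# rest -- a rebound an integrator (non-oscillatory) model would not produce.
b, w = -0.5, 6.2832          # damping rate, resonance ~1 Hz in these units
dt = 1e-3
z = 0.0 + 0.0j

trace = []
for step in range(4000):
    t = step * dt
    inhibition = -5.0 if t < 1.0 else 0.0   # inhibitory pulse, then release
    z += dt * ((b + 1j * w) * z + inhibition)
    trace.append(z.real)

during = min(trace[:1000])    # hyperpolarized while the pulse is on
rebound = max(trace[1000:])   # overshoot above rest after release
```

In a full spiking model, crossing a threshold during this rebound produces the PIR spike that, per the abstract, lets inhibition alone pace robust network gamma.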

  13. Awareness of technology-induced errors and processes for identifying and preventing such errors.

    PubMed

    Bellwood, Paule; Borycki, Elizabeth M; Kushniruk, Andre W

    2015-01-01

    There is a need to determine if organizations working with health information technology are aware of technology-induced errors and how they are addressing and preventing them. The purpose of this study was to: a) determine the degree of technology-induced error awareness in various Canadian healthcare organizations, and b) identify those processes and procedures that are currently in place to help address, manage, and prevent technology-induced errors. We identified a lack of technology-induced error awareness among participants. Participants identified there was a lack of well-defined procedures in place for reporting technology-induced errors, addressing them when they arise, and preventing them.

  14. An elementary quantum network using robust nuclear spin qubits in diamond

    NASA Astrophysics Data System (ADS)

    Kalb, Norbert; Reiserer, Andreas; Humphreys, Peter; Blok, Machiel; van Bemmelen, Koen; Twitchen, Daniel; Markham, Matthew; Taminiau, Tim; Hanson, Ronald

Quantum registers containing multiple robust qubits can form the nodes of future quantum networks for computation and communication. Information storage within such nodes must be resilient to any type of local operation. Here we demonstrate multiple robust memories by employing five nuclear spins adjacent to a nitrogen-vacancy defect centre in diamond. We characterize the storage of quantum superpositions and their resilience to entangling attempts with the electron spin of the defect centre. The storage fidelity is found to be limited by the probabilistic electron spin reset after failed entangling attempts. Control over multiple memories is then utilized to encode states in decoherence protected subspaces with increased robustness. Furthermore, we demonstrate memory control in two optically linked network nodes and characterize the storage capabilities of both memories in terms of the process fidelity with the identity. These results pave the way towards multi-qubit quantum algorithms in a remote network setting.
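The figure of merit used here, the process fidelity of a memory with the identity channel, can be computed from the channel's Choi matrix. The sketch below does this for a textbook single-qubit depolarizing memory (an illustrative noise model, not the experiment's tomography pipeline), where the closed form is F = 1 − 3p/4:

```python
import numpy as np

# |Phi+> = (|00> + |11>)/sqrt(2), the maximally entangled reference state.
PHI = np.zeros(4)
PHI[0] = PHI[3] = 1 / np.sqrt(2)

def choi_depolarizing(p):
    # Choi matrix of E(rho) = (1 - p) rho + p I/2: a weighted sum of the
    # identity channel's Choi state and the maximally mixed state.
    return (1 - p) * np.outer(PHI, PHI) + p * np.eye(4) / 4

def process_fidelity_with_identity(choi):
    # For the identity target, F_pro = <Phi+| Choi |Phi+>.
    return float(PHI @ choi @ PHI)

fid = process_fidelity_with_identity(choi_depolarizing(0.2))  # 1 - 3*0.2/4 = 0.85
```

A perfect memory (p = 0) gives unit fidelity; the measured decay of this quantity under repeated entangling attempts is exactly what the abstract reports as the storage limit.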

  15. Robust Distribution Network Reconfiguration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Changhyeok; Liu, Cong; Mehrotra, Sanjay

    2015-03-01

    We propose a two-stage robust optimization model for the distribution network reconfiguration problem with load uncertainty. The first-stage decision is to configure the radial distribution network, and the second-stage decision is to find the optimal AC power flow of the reconfigured network for a given demand realization. We solve the two-stage robust model by using a column-and-constraint generation algorithm, where the master problem and subproblem are formulated as mixed-integer second-order cone programs. Computational results for 16-, 33-, 70-, and 94-bus test cases are reported. We find that the configuration from the robust model sacrifices little in power loss under the nominal load scenario compared to the configuration from the deterministic model, yet it provides reliability of the distribution system for all scenarios in the uncertainty set.

  16. Design principles for robust vesiculation in clathrin-mediated endocytosis

    PubMed Central

    Hassinger, Julian E.; Oster, George; Drubin, David G.; Rangamani, Padmini

    2017-01-01

    A critical step in cellular-trafficking pathways is the budding of membranes by protein coats, which recent experiments have demonstrated can be inhibited by elevated membrane tension. The robustness of processes like clathrin-mediated endocytosis (CME) across a diverse range of organisms and mechanical environments suggests that the protein machinery in this process has evolved to take advantage of some set of physical design principles to ensure robust vesiculation against opposing forces like membrane tension. Using a theoretical model for membrane mechanics and membrane protein interaction, we have systematically investigated the influence of membrane rigidity, curvature induced by the protein coat, area covered by the protein coat, membrane tension, and force from actin polymerization on bud formation. Under low tension, the membrane smoothly evolves from a flat to budded morphology as the coat area or spontaneous curvature increases, whereas the membrane remains essentially flat at high tensions. At intermediate, physiologically relevant, tensions, the membrane undergoes a “snap-through instability” in which small changes in the coat area, spontaneous curvature or membrane tension cause the membrane to “snap” from an open, U-shape to a closed bud. This instability can be smoothed out by increasing the bending rigidity of the coat, allowing for successful budding at higher membrane tensions. Additionally, applied force from actin polymerization can bypass the instability by inducing a smooth transition from an open to a closed bud. Finally, a combination of increased coat rigidity and force from actin polymerization enables robust vesiculation even at high membrane tensions. PMID:28126722

  17. Robust detrending, rereferencing, outlier detection, and inpainting for multichannel data.

    PubMed

    de Cheveigné, Alain; Arzounian, Dorothée

    2018-05-15

    Electroencephalography (EEG), magnetoencephalography (MEG) and related techniques are prone to glitches, slow drift, steps, etc., that contaminate the data and interfere with the analysis and interpretation. These artifacts are usually addressed in a preprocessing phase that attempts to remove them or minimize their impact. This paper offers a set of useful techniques for this purpose: robust detrending, robust rereferencing, outlier detection, data interpolation (inpainting), step removal, and filter ringing artifact removal. These techniques provide a less wasteful alternative to discarding corrupted trials or channels, and they are relatively immune to artifacts that disrupt alternative approaches such as filtering. Robust detrending allows slow drifts and common mode signals to be factored out while avoiding the deleterious effects of glitches. Robust rereferencing reduces the impact of artifacts on the reference. Inpainting allows corrupt data to be interpolated from intact parts based on the correlation structure estimated over the intact parts. Outlier detection allows the corrupt parts to be identified. Step removal fixes the high-amplitude flux jump artifacts that are common with some MEG systems. Ringing removal allows the ringing response of the antialiasing filter to glitches (steps, pulses) to be suppressed. The performance of the methods is illustrated and evaluated using synthetic data and data from real EEG and MEG systems. These methods, which are mainly automatic and require little tuning, can greatly improve the quality of the data. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
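    The robust detrending described above can be sketched in a few lines: fit a slow polynomial trend by weighted least squares, flag samples whose residuals are outlying, zero their weights, and refit so that glitches no longer bias the trend estimate. This is a minimal illustration of the idea only; the polynomial order, iteration count, and threshold are assumptions, not the authors' implementation.

```python
import numpy as np

def robust_detrend(x, order=3, n_iter=5, thresh=3.0):
    """Remove a slow polynomial trend while ignoring glitch samples.

    Weighted least-squares polynomial fit; samples whose residual exceeds
    `thresh` standard deviations get zero weight on the next refit.
    """
    t = np.linspace(-1.0, 1.0, len(x))
    w = np.ones(len(x))
    for _ in range(n_iter):
        coef = np.polyfit(t, x, order, w=w)          # weighted LS fit
        resid = x - np.polyval(coef, t)
        sigma = resid[w > 0].std()                   # scale from kept samples
        w = (np.abs(resid) < thresh * sigma + 1e-9).astype(float)
    return x - np.polyval(coef, t)
```

With an ordinary unweighted fit, a single large glitch drags the trend estimate and corrupts the whole record; the reweighting keeps the trend anchored to the intact samples, which is the "less wasteful alternative to discarding trials" the abstract refers to.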

  18. Robust Portfolio Optimization Using Pseudodistances.

    PubMed

    Toma, Aida; Leoni-Aubin, Samuela

    2015-01-01

    The presence of outliers in financial asset returns is a frequently occurring phenomenon which may lead to unreliable mean-variance optimized portfolios. This fact is due to the unbounded influence that outliers can have on the mean returns and covariance estimators that are inputs in the optimization procedure. In this paper we present robust estimators of the mean and covariance matrix obtained by minimizing an empirical version of a pseudodistance between the assumed model and the true model underlying the data. We prove and discuss theoretical properties of these estimators, such as affine equivariance, B-robustness, asymptotic normality and asymptotic relative efficiency. These estimators can be easily used in place of the classical estimators, thereby providing robust optimized portfolios. A Monte Carlo simulation study and applications to real data show the advantages of the proposed approach. We study both the in-sample and out-of-sample performance of the proposed robust portfolios, comparing them with some other portfolios known in the literature.

  19. Robust Portfolio Optimization Using Pseudodistances

    PubMed Central

    2015-01-01

    The presence of outliers in financial asset returns is a frequently occurring phenomenon which may lead to unreliable mean-variance optimized portfolios. This fact is due to the unbounded influence that outliers can have on the mean returns and covariance estimators that are inputs in the optimization procedure. In this paper we present robust estimators of the mean and covariance matrix obtained by minimizing an empirical version of a pseudodistance between the assumed model and the true model underlying the data. We prove and discuss theoretical properties of these estimators, such as affine equivariance, B-robustness, asymptotic normality and asymptotic relative efficiency. These estimators can be easily used in place of the classical estimators, thereby providing robust optimized portfolios. A Monte Carlo simulation study and applications to real data show the advantages of the proposed approach. We study both the in-sample and out-of-sample performance of the proposed robust portfolios, comparing them with some other portfolios known in the literature. PMID:26468948

  20. Centimeter-Level Robust Gnss-Aided Inertial Post-Processing for Mobile Mapping Without Local Reference Stations

    NASA Astrophysics Data System (ADS)

    Hutton, J. J.; Gopaul, N.; Zhang, X.; Wang, J.; Menon, V.; Rieck, D.; Kipka, A.; Pastor, F.

    2016-06-01

    For almost two decades mobile mapping systems have done their georeferencing using Global Navigation Satellite Systems (GNSS) to measure position and inertial sensors to measure orientation. In order to achieve cm level position accuracy, a technique referred to as post-processed carrier phase differential GNSS (DGNSS) is used. For this technique to be effective the maximum distance to a single Reference Station should be no more than 20 km, and when using a network of Reference Stations the distance to the nearest station should be no more than about 70 km. This need to set up local Reference Stations limits productivity and increases costs, especially when mapping large areas or long linear features such as roads or pipelines. An alternative technique to DGNSS for high-accuracy positioning from GNSS is the so-called Precise Point Positioning or PPP method. In this case, instead of differencing the rover observables with the Reference Station observables to cancel out common errors, an advanced model for every aspect of the GNSS error chain is developed and parameterized to within an accuracy of a few cm. The Trimble Centerpoint RTX positioning solution combines the methodology of PPP with advanced ambiguity resolution technology to produce cm level accuracies without the need for local reference stations. It achieves this through a global deployment of highly redundant monitoring stations that are connected through the internet and are used to determine the precise satellite data with maximum accuracy, robustness, continuity and reliability, along with advanced algorithms and receiver and antenna calibrations. This paper presents a new post-processed realization of the Trimble Centerpoint RTX technology integrated into the Applanix POSPac MMS GNSS-Aided Inertial software for mobile mapping.
Real-world results from over 100 airborne flights evaluated against a DGNSS network reference are presented which show that the post-processed Centerpoint RTX solution agrees with

  1. Robustness trade-offs and host–microbial symbiosis in the immune system

    PubMed Central

    Kitano, Hiroaki; Oda, Kanae

    2006-01-01

    The immune system provides organisms with robustness against pathogen threats, yet it also often adversely affects the organism as in autoimmune diseases. Recently, the molecular interactions involved in the immune system have been uncovered. At the same time, the role of the bacterial flora and its interactions with the host immune system have been identified. In this article, we try to reconcile these findings to draw a consistent picture of the host defense system. Specifically, we first argue that the network of molecular interactions involved in immune functions has a bow-tie architecture that entails inherent trade-offs among robustness, fragility, resource limitation, and performance. Second, we discuss the possibility that commensal bacteria and the host immune system constitute an integrated defense system. This symbiotic association has evolved to optimize its robustness against pathogen attacks and nutrient perturbations by harboring a broad range of microorganisms. Owing to the inherent propensity of a host immune system toward hyperactivity, maintenance of bacterial flora homeostasis might be particularly important in the development of preventive strategies against immune disorders such as autoimmune diseases. PMID:16738567

  2. Robustness analysis of a green chemistry-based model for the classification of silver nanoparticles synthesis processes

    EPA Science Inventory

    This paper proposes a robustness analysis based on Multiple Criteria Decision Aiding (MCDA). The ensuing model was used to assess the implementation of green chemistry principles in the synthesis of silver nanoparticles. Its recommendations were also compared to an earlier develo...

  3. Tail mean and related robust solution concepts

    NASA Astrophysics Data System (ADS)

    Ogryczak, Włodzimierz

    2014-01-01

    Robust optimisation might be viewed as a multicriteria optimisation problem where objectives correspond to the scenarios although their probabilities are unknown or imprecise. The simplest robust solution concept represents a conservative approach focused on optimisation of the worst-case scenario results. A softer concept allows one to optimise the tail mean, thus combining performances under multiple worst scenarios. We show that when considering robust models that allow the probabilities to vary only within given intervals, the tail mean represents the robust solution only for upper-bounded probabilities. For arbitrary intervals of probabilities the corresponding robust solution may be expressed by the optimisation of appropriately combined mean and tail mean criteria, thus remaining easily implementable with auxiliary linear inequalities. Moreover, we use the tail mean concept to develop linear-programming-implementable robust solution concepts related to risk-averse optimisation criteria.
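    As a concrete sketch of the tail mean (the average over the worst β-fraction of scenarios, closely related to conditional value-at-risk) and of the combined mean/tail-mean criterion the abstract refers to, assuming equally likely scenarios and outcomes where larger is better:

```python
import numpy as np

def tail_mean(outcomes, beta):
    """Mean of the worst (smallest) ceil(beta * n) scenario outcomes."""
    y = np.sort(np.asarray(outcomes, dtype=float))
    k = max(1, int(np.ceil(beta * len(y))))
    return y[:k].mean()

def combined_criterion(outcomes, beta, lam):
    """Convex combination of the overall mean and the tail mean,
    corresponding to robust solutions when scenario probabilities are
    only known to lie within intervals."""
    y = np.asarray(outcomes, dtype=float)
    return lam * y.mean() + (1.0 - lam) * tail_mean(y, beta)
```

Setting lam = 0 recovers the conservative tail-mean criterion and lam = 1 the plain expected value; intermediate values trade the two off, and the criterion remains expressible with auxiliary linear inequalities in a linear program.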

  4. Structural and robustness properties of smart-city transportation networks

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen-Gang; Ding, Zhuo; Fan, Jing-Fang; Meng, Jun; Ding, Yi-Min; Ye, Fang-Fu; Chen, Xiao-Song

    2015-09-01

    The smart-city concept offers an excellent approach to constructing and developing modern cities, and it also places demands on infrastructure construction. How to build a safe, stable, and highly efficient public transportation system becomes an important topic in the process of city construction. In this work, we study the structural and robustness properties of transportation networks and their sub-networks. We introduce a complementary network model to study the relevance and complementarity between the bus network and the subway network. Our numerical results show that the mutual supplementation of the networks can improve network robustness. This conclusion provides a theoretical basis for the construction of public traffic networks, and it also supports the reasonable operation and management of smart cities. Project supported by the Major Projects of the China National Social Science Fund (Grant No. 11 & ZD154).

  5. Robustness and Vulnerability of Networks with Dynamical Dependency Groups.

    PubMed

    Bai, Ya-Nan; Huang, Ning; Wang, Lei; Wu, Zhi-Xi

    2016-11-28

    The dependency property and self-recovery of failure nodes both have great effects on the robustness of networks during the cascading process. Existing investigations focused mainly on the failure mechanism of static dependency groups without considering the time-dependency of interdependent nodes and the recovery mechanism in reality. In this study, we present an evolving network model consisting of failure mechanisms and a recovery mechanism to explore network robustness, where the dependency relations among nodes vary over time. Based on generating function techniques, we provide an analytical framework for random networks with arbitrary degree distribution. In particular, we theoretically find that an abrupt percolation transition exists corresponding to the dynamical dependency groups for a wide range of topologies after initial random removal. Moreover, when the abrupt transition point is above the failure threshold of dependency groups, the evolving network with the larger dependency groups is more vulnerable; when below it, the larger dependency groups make the network more robust. Numerical simulations employing the Erdős-Rényi network and Barabási-Albert scale free network are performed to validate our theoretical results.

  6. Design for robustness of unique, multi-component engineering systems

    NASA Astrophysics Data System (ADS)

    Shelton, Kenneth A.

    2007-12-01

    design concept. These allele values are unitless themselves, but map to both configuration descriptions and attribute values. The Value Distance and Component Distance are metrics that measure the relative differences between two design concepts using the allele values, and all differences in a population of design concepts are calculated relative to a reference design, called the "base design". The base design is the top-ranked member of the population in weighted terms of robustness and performance. Robustness is determined based on the change in multi-objective performance as Value Distance and Component Distance (and thus differences in design) increases. It is assessed as acceptable if differences in design configurations up to specified tolerances result in performance changes that remain within a specified performance range. The design configuration difference tolerances and performance range together define the designer's risk management preferences for the final design concepts. Additionally, a complementary visualization capability was developed, called the "Design Solution Topography". This concept allows the visualization of a population of design concepts, and is a 3-axis plot where each point represents an entire design concept. The axes are the Value Distance, Component Distance and Performance Objective. The key benefit of the Design Solution Topography is that it allows the designer to visually identify and interpret the overall robustness of the current population of design concepts for a particular performance objective. In a multi-objective problem, each performance objective has its own Design Solution Topography view. These new concepts are implemented in an evolutionary computation-based conceptual designing method called the "Design for Robustness Method" that produces robust design concepts. 
The design procedures associated with this method enable designers to evaluate and ensure robustness in selected designs that also perform within a desired

  7. How Robust is Your System Resilience?

    NASA Astrophysics Data System (ADS)

    Homayounfar, M.; Muneepeerakul, R.

    2017-12-01

    Robustness and resilience are concepts in system thinking that have grown in importance and popularity. For many complex social-ecological systems, however, robustness and resilience are difficult to quantify and the connections and trade-offs between them difficult to study. Most studies have either focused on qualitative approaches to discuss their connections or considered only one of them under particular classes of disturbances. In this study, we present an analytical framework to address the linkage between robustness and resilience more systematically. Our analysis is based on a stylized dynamical model that operationalizes a widely used conceptual framework for social-ecological systems. The model enables us to rigorously define robustness and resilience and consequently investigate their connections. The results reveal the tradeoffs among performance, robustness, and resilience. They also show how the nature of such tradeoffs varies with the choices of certain policies (e.g., taxation and investment in public infrastructure), internal stresses and external disturbances.

  8. Investigation on changes of modularity and robustness by edge-removal mutations in signaling networks.

    PubMed

    Truong, Cong-Doan; Kwon, Yung-Keun

    2017-12-21

    Biological networks consisting of molecular components and interactions are represented by a graph model. There have been some studies based on that model to analyze the relationship between structural characteristics and dynamical behaviors in signaling networks. However, little attention has been paid to changes of modularity and robustness in mutant networks. In this paper, we investigated the changes of modularity and robustness by edge-removal mutations in three signaling networks. We first observed that both the modularity and robustness increased on average in the mutant network by the edge-removal mutations. However, the modularity change was negatively correlated with the robustness change. This implies that it is unlikely that both the modularity and the robustness values simultaneously increase by the edge-removal mutations. Another interesting finding is that the modularity change was positively correlated with the degree, the number of feedback loops, and the edge betweenness of the removed edges whereas the robustness change was negatively correlated with them. We note that these results were consistently observed in randomly structured networks. Additionally, we identified two groups of genes which are incident to the highly-modularity-increasing and the highly-robustness-decreasing edges with respect to the edge-removal mutations, respectively, and observed that they are likely to be central by forming a connected component of considerable size. The gene-ontology enrichment of each of these gene groups was significantly different from the rest of the genes. Finally, we showed that the highly-robustness-decreasing edges can be promising edgetic drug-targets, which validates the usefulness of our analysis. Taken together, the analysis of changes of robustness and modularity against edge-removal mutations can be useful to unravel novel dynamical characteristics underlying signaling networks.

  9. Cell communities and robustness in development.

    PubMed

    Monk, N A

    1997-11-01

    The robustness of patterning events in development is a key feature that must be accounted for in proposed models of these events. When considering explicitly cellular systems, robustness can be exhibited at different levels of organization. Consideration of two widespread patterning mechanisms suggests that robustness at the level of cell communities can result from variable development at the level of individual cells; models of these mechanisms show how interactions between participating cells guarantee community-level robustness. Cooperative interactions enhance homogeneity within communities of like cells and the sharpness of boundaries between communities of distinct cells, while competitive interactions amplify small inhomogeneities within communities of initially equivalent cells, resulting in fine-grained patterns of cell specialization.

  10. Addressing Climate Change in Long-Term Water Planning Using Robust Decisionmaking

    NASA Astrophysics Data System (ADS)

    Groves, D. G.; Lempert, R.

    2008-12-01

    Addressing climate change in long-term natural resource planning is difficult because future management conditions are deeply uncertain and the range of possible adaptation options is so extensive. These conditions pose challenges to standard optimization decision-support techniques. This talk will describe a methodology called Robust Decisionmaking (RDM) that can complement more traditional analytic approaches by utilizing screening-level water management models to evaluate large numbers of strategies against a wide range of plausible future scenarios. The presentation will describe a recent application of the methodology to evaluate climate adaptation strategies for the Inland Empire Utilities Agency in Southern California. This project found that RDM can provide a useful way of addressing climate change uncertainty and identifying robust adaptation strategies.

  11. Self-paced model learning for robust visual tracking

    NASA Astrophysics Data System (ADS)

    Huang, Wenhui; Gu, Jason; Ma, Xin; Li, Yibin

    2017-01-01

    In visual tracking, learning a robust and efficient appearance model is a challenging task. Model learning determines both the strategy and the frequency of model updating, which involves many details that could affect the tracking results. Self-paced learning (SPL) has recently been attracting considerable interest in the fields of machine learning and computer vision. SPL is inspired by the learning principle underlying the cognitive process of humans, which generally proceeds from easier samples to more complex aspects of a task. We propose a tracking method that integrates the learning paradigm of SPL into visual tracking, so reliable samples can be automatically selected for model learning. In contrast to many existing model learning strategies in visual tracking, we discover the missing link between sample selection and model learning, which are combined into a single objective function in our approach. Sample weights and model parameters can be learned by minimizing this single objective function. Additionally, to obtain real-valued learning weights for the samples, an error-tolerant self-paced function that considers the characteristics of visual tracking is proposed. We demonstrate the robustness and efficiency of our tracker on a recent tracking benchmark data set with 50 video sequences.
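    The easy-to-hard principle behind SPL can be illustrated with a toy self-paced scheme (a hypothetical example, not the paper's tracker): alternate between selecting samples whose loss falls below an age parameter and refitting the model, here simply a center estimate, on the selected samples.

```python
import numpy as np

def self_paced_mean(x, lam, n_iter=10):
    """Estimate a center by self-paced learning: hard weights v_i = 1 for
    samples whose loss is below the age parameter `lam`, then refit on them."""
    mu = x.mean()  # start from all samples
    for _ in range(n_iter):
        loss = (x - mu) ** 2          # per-sample loss
        v = loss < lam                # self-paced sample selection
        if not v.any():
            break
        mu = x[v].mean()              # model update on the easy samples
    return mu
```

Hard 0/1 weights correspond to the original SPL formulation; the error-tolerant, real-valued weighting proposed in the paper replaces this indicator with a smoother self-paced function suited to tracking.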

  12. On the asymptotic standard error of a class of robust estimators of ability in dichotomous item response models.

    PubMed

    Magis, David

    2014-11-01

    In item response theory, the classical estimators of ability are highly sensitive to response disturbances and can return strongly biased estimates of the true underlying ability level. Robust methods were introduced to lessen the impact of such aberrant responses on the estimation process. The computation of asymptotic (i.e., large-sample) standard errors (ASE) for these robust estimators, however, has not yet been fully considered. This paper focuses on a broad class of robust ability estimators, defined by an appropriate selection of the weight function and the residual measure, for which the ASE is derived from the theory of estimating equations. The maximum likelihood (ML) and the robust estimators, together with their estimated ASEs, are then compared in a simulation study by generating random guessing disturbances. It is concluded that both the estimators and their ASE perform similarly in the absence of random guessing, while the robust estimator and its estimated ASE are less biased and outperform their ML counterparts in the presence of random guessing with large impact on the item response process. © 2013 The British Psychological Society.

  13. Using multiobjective tradeoff sets and Multivariate Regression Trees to identify critical and robust decisions for long term water utility planning

    NASA Astrophysics Data System (ADS)

    Smith, R.; Kasprzyk, J. R.; Balaji, R.

    2017-12-01

    In light of deeply uncertain factors like future climate change and population shifts, responsible resource management will require new types of information and strategies. For water utilities, this entails potential expansion and efficient management of water supply infrastructure systems for changes in overall supply; changes in frequency and severity of climate extremes such as droughts and floods; and variable demands, all while accounting for conflicting long and short term performance objectives. Multiobjective Evolutionary Algorithms (MOEAs) are emerging decision support tools that have been used by researchers and, more recently, water utilities to efficiently generate and evaluate thousands of planning portfolios. The tradeoffs between conflicting objectives are explored in an automated way to produce (often large) suites of portfolios that strike different balances of performance. Once generated, the sets of optimized portfolios are used to support relatively subjective assertions of priorities and human reasoning, leading to adoption of a plan. These large tradeoff sets contain information about complex relationships between decisions and between groups of decisions and performance that, until now, has not been quantitatively described. We present a novel use of Multivariate Regression Trees (MRTs) to analyze tradeoff sets to reveal these relationships and critical decisions. Additionally, when MRTs are applied to tradeoff sets developed for different realizations of an uncertain future, they can identify decisions that are robust across a wide range of conditions and produce fundamental insights about the system being optimized.
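    At the core of a Multivariate Regression Tree is a split search: choose the single decision variable and threshold that partition the portfolios into two groups whose multi-objective performance vectors are most homogeneous. A minimal single-split sketch (illustrative only; the study's analysis builds full recursive trees):

```python
import numpy as np

def best_split(X, Y):
    """Find the decision variable (column of X) and threshold minimizing
    the summed squared deviation of the multi-output responses Y from
    each side's mean vector (one level of a Multivariate Regression Tree)."""
    best_j, best_thr, best_sse = None, None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j])[:-1]:   # candidate thresholds
            left = X[:, j] <= thr
            sse = ((Y[left] - Y[left].mean(axis=0)) ** 2).sum() \
                + ((Y[~left] - Y[~left].mean(axis=0)) ** 2).sum()
            if sse < best_sse:
                best_j, best_thr, best_sse = j, thr, sse
    return best_j, best_thr, best_sse
```

Recursing on each side yields the full tree; applied to a tradeoff set, variables chosen near the root are the critical decisions, and decisions chosen consistently across trees built for different future realizations are candidates for robust decisions.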

  14. A multiplex PCR mini-barcode assay to identify processed shark products in the global trade.

    PubMed

    Cardeñosa, Diego; Fields, Andrew; Abercrombie, Debra; Feldheim, Kevin; Shea, Stanley K H; Chapman, Demian D

    2017-01-01

    Protecting sharks from overexploitation has become a global priority after widespread population declines have occurred. Tracking catches and trade on a species-specific basis has proven challenging, in part due to difficulties in identifying processed shark products such as fins, meat, and liver oil. This has hindered efforts to implement regulations aimed at promoting sustainable use of commercially important species and protection of imperiled species. Genetic approaches to identify shark products exist but are typically based on sequencing or amplifying large DNA regions and may fail to work on heavily processed products in which DNA is degraded. Here, we describe a novel multiplex PCR mini-barcode assay based on two short fragments of the cytochrome oxidase I (COI) gene. This assay can identify to species all sharks currently listed on the Convention of International Trade of Endangered Species (CITES) and most shark species present in the international trade. It achieves species diagnosis based on a single PCR and one to two downstream DNA sequencing reactions. The assay is capable of identifying highly processed shark products including fins, cooked shark fin soup, and skin-care products containing liver oil. This is a straightforward and reliable identification method for data collection and enforcement of regulations implemented for certain species at all governance levels.

  15. A multiplex PCR mini-barcode assay to identify processed shark products in the global trade

    PubMed Central

    Fields, Andrew; Abercrombie, Debra; Feldheim, Kevin; Shea, Stanley K. H.; Chapman, Demian D.

    2017-01-01

    Protecting sharks from overexploitation has become a global priority after widespread population declines have occurred. Tracking catches and trade on a species-specific basis has proven challenging, in part due to difficulties in identifying processed shark products such as fins, meat, and liver oil. This has hindered efforts to implement regulations aimed at promoting sustainable use of commercially important species and protection of imperiled species. Genetic approaches to identify shark products exist but are typically based on sequencing or amplifying large DNA regions and may fail to work on heavily processed products in which DNA is degraded. Here, we describe a novel multiplex PCR mini-barcode assay based on two short fragments of the cytochrome oxidase I (COI) gene. This assay can identify to species all sharks currently listed on the Convention of International Trade of Endangered Species (CITES) and most shark species present in the international trade. It achieves species diagnosis based on a single PCR and one to two downstream DNA sequencing reactions. The assay is capable of identifying highly processed shark products including fins, cooked shark fin soup, and skin-care products containing liver oil. This is a straightforward and reliable identification method for data collection and enforcement of regulations implemented for certain species at all governance levels. PMID:29020095

  16. Robustness of airline route networks

    NASA Astrophysics Data System (ADS)

    Lordan, Oriol; Sallan, Jose M.; Escorihuela, Nuria; Gonzalez-Prieto, David

    2016-03-01

    Airlines shape their route networks by defining their routes through supply and demand considerations, paying little attention to network performance indicators such as network robustness. However, the collapse of an airline network can produce high financial costs for the airline and its entire geographical area of influence. The aim of this study is to analyze the topology and robustness of the route networks of airlines following the Low Cost Carrier (LCC) and Full Service Carrier (FSC) business models. Results show that FSC hubs are more central than LCC bases in their route networks. As a result, LCC route networks are more robust than FSC networks.
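    The fragility of hub-and-spoke topologies can be made concrete by measuring the size of the largest connected component after removing airports: removing a single FSC-style hub disconnects the spokes, while a point-to-point network degrades gracefully. A minimal sketch on toy networks (not the study's data):

```python
from collections import defaultdict, deque

def largest_component(nodes, edges, removed=frozenset()):
    """Size of the largest connected component once the `removed` airports
    (and all their routes) are deleted from the network."""
    alive = set(nodes) - set(removed)
    adj = defaultdict(set)
    for u, v in edges:
        if u in alive and v in alive:
            adj[u].add(v)
            adj[v].add(u)
    seen, best = set(), 0
    for s in alive:
        if s in seen:
            continue
        size, queue = 0, deque([s])
        seen.add(s)
        while queue:                      # breadth-first search
            u = queue.popleft()
            size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best
```

A 5-airport star (one hub) collapses to isolated nodes when the hub is removed, whereas a 5-airport ring keeps 4 of its 5 airports connected after losing any single airport.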

  17. Robust sensor fault detection and isolation of gas turbine engines subjected to time-varying parameter uncertainties

    NASA Astrophysics Data System (ADS)

    Pourbabaee, Bahareh; Meskin, Nader; Khorasani, Khashayar

    2016-08-01

    In this paper, a novel robust sensor fault detection and isolation (FDI) strategy using the multiple model-based (MM) approach is proposed that remains robust with respect to both time-varying parameter uncertainties and process and measurement noise in all the channels. The scheme is composed of robust Kalman filters (RKF) that are constructed for multiple piecewise linear (PWL) models that are constructed at various operating points of an uncertain nonlinear system. The parameter uncertainty is modeled by using a time-varying norm bounded admissible structure that affects all the PWL state space matrices. The robust Kalman filter gain matrices are designed by solving two algebraic Riccati equations (AREs) that are expressed as two linear matrix inequality (LMI) feasibility conditions. The proposed multiple RKF-based FDI scheme is simulated for a single spool gas turbine engine to diagnose various sensor faults despite the presence of parameter uncertainties, process and measurement noise. Our comparative studies confirm the superiority of our proposed FDI method when compared to the methods that are available in the literature.

  18. A network property necessary for concentration robustness

    NASA Astrophysics Data System (ADS)

    Eloundou-Mbebi, Jeanne M. O.; Küken, Anika; Omranian, Nooshin; Kleessen, Sabrina; Neigenfind, Jost; Basler, Georg; Nikoloski, Zoran

    2016-10-01

    Maintenance of functionality of complex cellular networks and entire organisms exposed to environmental perturbations often depends on concentration robustness of the underlying components. Yet, the reasons and consequences of concentration robustness in large-scale cellular networks remain largely unknown. Here, we derive a necessary condition for concentration robustness based only on the structure of networks endowed with mass action kinetics. The structural condition can be used to design targeted experiments to study concentration robustness. We show that metabolites satisfying the necessary condition are present in metabolic networks from diverse species, suggesting prevalence of this property across kingdoms of life. We also demonstrate that our predictions about concentration robustness of energy-related metabolites are in line with experimental evidence from Escherichia coli. The necessary condition is applicable to mass action biological systems of arbitrary size, and will enable understanding the implications of concentration robustness in genetic engineering strategies and medical applications.

  19. A network property necessary for concentration robustness.

    PubMed

    Eloundou-Mbebi, Jeanne M O; Küken, Anika; Omranian, Nooshin; Kleessen, Sabrina; Neigenfind, Jost; Basler, Georg; Nikoloski, Zoran

    2016-10-19

    Maintenance of functionality of complex cellular networks and entire organisms exposed to environmental perturbations often depends on concentration robustness of the underlying components. Yet, the reasons and consequences of concentration robustness in large-scale cellular networks remain largely unknown. Here, we derive a necessary condition for concentration robustness based only on the structure of networks endowed with mass action kinetics. The structural condition can be used to design targeted experiments to study concentration robustness. We show that metabolites satisfying the necessary condition are present in metabolic networks from diverse species, suggesting prevalence of this property across kingdoms of life. We also demonstrate that our predictions about concentration robustness of energy-related metabolites are in line with experimental evidence from Escherichia coli. The necessary condition is applicable to mass action biological systems of arbitrary size, and will enable understanding the implications of concentration robustness in genetic engineering strategies and medical applications.

  20. Improving the 'tool box' for robust industrial enzymes.

    PubMed

    Littlechild, J A

    2017-05-01

    The speed of sequencing of microbial genomes and metagenomes is providing an ever-increasing resource for the identification of new robust biocatalysts with industrial applications in many different aspects of industrial biotechnology. Using 'nature's catalysts' provides a sustainable approach to the synthesis of fine chemicals, general chemicals such as surfactants, and new consumer-based materials such as biodegradable plastics. This provides a sustainable, 'green chemistry' route to chemical synthesis which generates no toxic waste and is environmentally friendly. In addition, enzymes can play important roles in other applications such as carbon dioxide capture and the breakdown of food and other waste streams, providing a route to the concept of a 'circular economy' where nothing is wasted. The use of improved bioinformatic approaches and the development of new rapid enzyme activity screening methodology can provide an endless resource for new robust industrial biocatalysts. This mini-review will discuss several recent case studies where industrial enzymes of 'high priority' have been identified and characterised. It will highlight specific hydrolase enzymes and recent case studies carried out within our group in Exeter.

  1. Morphological change in machines accelerates the evolution of robust behavior

    PubMed Central

    Bongard, Josh

    2011-01-01

    Most animals exhibit significant neurological and morphological change throughout their lifetime. No robots to date, however, grow new morphological structure while behaving. This is due to technological limitations but also because it is unclear that morphological change provides a benefit to the acquisition of robust behavior in machines. Here I show that in evolving populations of simulated robots, if robots grow from anguilliform into legged robots during their lifetime in the early stages of evolution, and the anguilliform body plan is gradually lost during later stages of evolution, gaits are evolved for the final, legged form of the robot more rapidly—and the evolved gaits are more robust—compared to evolving populations of legged robots that do not transition through the anguilliform body plan. This suggests that morphological change, as well as the evolution of development, are two important processes that improve the automatic generation of robust behaviors for machines. It also provides an experimental platform for investigating the relationship between the evolution of development and robust behavior in biological organisms. PMID:21220304

  2. A Robust Image Watermarking in the Joint Time-Frequency Domain

    NASA Astrophysics Data System (ADS)

    Öztürk, Mahmut; Akan, Aydın; Çekiç, Yalçın

    2010-12-01

    With the rapid development of computers and internet applications, copyright protection of multimedia data has become an important problem. Watermarking techniques are proposed as a solution to copyright protection of digital media files. In this paper, a new, robust, and high-capacity watermarking method that is based on spatiofrequency (SF) representation is presented. We use the discrete evolutionary transform (DET) calculated by the Gabor expansion to represent an image in the joint SF domain. The watermark is embedded onto selected coefficients in the joint SF domain. Hence, by combining the advantages of spatial and spectral domain watermarking methods, a robust, invisible, secure, and high-capacity watermarking method is presented. A correlation-based detector is also proposed to detect and extract any possible watermarks on an image. The proposed watermarking method was tested on some commonly used test images under different signal processing attacks like additive noise, Wiener and Median filtering, JPEG compression, rotation, and cropping. Simulation results show that our method is robust against all of the attacks.

  3. Functional Groups Based on Leaf Physiology: Are they Spatially and Temporally Robust?

    NASA Technical Reports Server (NTRS)

    Foster, Tammy E.; Brooks, J. Renee; Quincy, Charles (Technical Monitor)

    2002-01-01

    The functional grouping hypothesis, which suggests that complexity in function can be simplified by grouping species with similar responses, was tested in the Florida scrub habitat. Functional groups were identified based on how species in fire-maintained Florida scrub function in terms of carbon, water, and nitrogen dynamics. The suite of physiological parameters measured to determine function included both instantaneous gas exchange measurements obtained from photosynthetic light response curves and integrated measures of function. Using cluster analysis, five distinct physiologically based functional groups were identified. Using non-parametric multivariate analyses, it was determined that these five groupings were not altered by plot differences or by the three different management regimes: prescribed burn, mechanically treated and burned, and fire-suppressed. The physiological groupings also remained robust between the two years 1999 and 2000. For these groupings to be of use for scaling ecosystem processes, there needs to be an easy-to-measure morphological indicator of function. Life form classifications were able to depict the physiological groupings more adequately than either specific leaf area or leaf thickness. The ability of life forms to depict the groupings was improved by separating the parasitic Ximenia americana from the shrub category.

  4. Modern CACSD using the Robust-Control Toolbox

    NASA Technical Reports Server (NTRS)

    Chiang, Richard Y.; Safonov, Michael G.

    1989-01-01

    The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools such as singular values and structured singular values, robust synthesis tools such as continuous/discrete H2/H-infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation, and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H-infinity loop-shaping, and large space structure model reduction.
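    The singular-value ("sigma plot") analysis such tools provide can be sketched outside any toolbox: at each frequency, the largest and smallest singular values of the frequency-response matrix G(jw) bound the multivariable gain. A minimal sketch with NumPy only (the 2x2 system below is an illustrative choice, not an example from the toolbox):

```python
import numpy as np

def sigma_plot(G, omegas):
    """Largest/smallest singular values of G(jw) at each frequency in `omegas`.
    G maps a complex frequency s to a complex transfer-function matrix."""
    sig_max, sig_min = [], []
    for w in omegas:
        s = np.linalg.svd(G(1j * w), compute_uv=False)  # sorted descending
        sig_max.append(s[0])
        sig_min.append(s[-1])
    return np.array(sig_max), np.array(sig_min)

# Illustrative 2x2 transfer matrix: two decoupled first-order lags.
G = lambda s: np.array([[1.0 / (s + 1.0), 0.0],
                        [0.0, 2.0 / (s + 2.0)]])

omegas = np.logspace(-2, 2, 200)
smax, smin = sigma_plot(G, omegas)
```

    Plotting smax and smin against omegas on log axes gives the usual sigma plot; for this diagonal example both curves start near unity gain at low frequency and roll off at high frequency.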

  5. Accurate and Robust Unitary Transformations of a High-Dimensional Quantum System

    NASA Astrophysics Data System (ADS)

    Anderson, B. E.; Sosa-Martinez, H.; Riofrío, C. A.; Deutsch, Ivan H.; Jessen, Poul S.

    2015-06-01

    Unitary transformations are the most general input-output maps available in closed quantum systems. Good control protocols have been developed for qubits, but questions remain about the use of optimal control theory to design unitary maps in high-dimensional Hilbert spaces, and about the feasibility of their robust implementation in the laboratory. Here we design and implement unitary maps in a 16-dimensional Hilbert space associated with the 6S1/2 ground state of 133Cs, achieving fidelities >0.98 with built-in robustness to static and dynamic perturbations. Our work has relevance for quantum information processing and provides a template for similar advances on other physical platforms.

  6. Robustness of Oscillatory Behavior in Correlated Networks

    PubMed Central

    Sasai, Takeyuki; Morino, Kai; Tanaka, Gouhei; Almendral, Juan A.; Aihara, Kazuyuki

    2015-01-01

    Understanding network robustness against failures of network units is useful for preventing large-scale breakdowns and damages in real-world networked systems. The tolerance of networked systems whose functions are maintained by collective dynamical behavior of the network units has recently been analyzed in the framework called dynamical robustness of complex networks. The effect of network structure on the dynamical robustness has been examined with various types of network topology, but the role of network assortativity, or degree–degree correlations, is still unclear. Here we study the dynamical robustness of correlated (assortative and disassortative) networks consisting of diffusively coupled oscillators. Numerical analyses for the correlated networks with Poisson and power-law degree distributions show that network assortativity enhances the dynamical robustness of the oscillator networks but the impact of network disassortativity depends on the detailed network connectivity. Furthermore, we theoretically analyze the dynamical robustness of correlated bimodal networks with two-peak degree distributions and show the positive impact of the network assortativity. PMID:25894574

  7. A network property necessary for concentration robustness

    PubMed Central

    Eloundou-Mbebi, Jeanne M. O.; Küken, Anika; Omranian, Nooshin; Kleessen, Sabrina; Neigenfind, Jost; Basler, Georg; Nikoloski, Zoran

    2016-01-01

    Maintenance of functionality of complex cellular networks and entire organisms exposed to environmental perturbations often depends on concentration robustness of the underlying components. Yet, the reasons and consequences of concentration robustness in large-scale cellular networks remain largely unknown. Here, we derive a necessary condition for concentration robustness based only on the structure of networks endowed with mass action kinetics. The structural condition can be used to design targeted experiments to study concentration robustness. We show that metabolites satisfying the necessary condition are present in metabolic networks from diverse species, suggesting prevalence of this property across kingdoms of life. We also demonstrate that our predictions about concentration robustness of energy-related metabolites are in line with experimental evidence from Escherichia coli. The necessary condition is applicable to mass action biological systems of arbitrary size, and will enable understanding the implications of concentration robustness in genetic engineering strategies and medical applications. PMID:27759015

  8. Bio-Inspired Microsystem for Robust Genetic Assay Recognition

    PubMed Central

    Lue, Jaw-Chyng; Fang, Wai-Chi

    2008-01-01

    A compact integrated system-on-chip (SoC) architecture solution for robust, real-time, and on-site genetic analysis has been proposed. This microsystem solution is noise-tolerant and suitable for analyzing the weak fluorescence patterns from a PCR-prepared dual-labeled DNA microchip assay. In the architecture, a preceding VLSI differential logarithm microchip is designed for effectively computing the logarithm of the normalized input fluorescence signals. A subsequent VLSI artificial neural network (ANN) processor chip is used for analyzing the processed signals from the differential logarithm stage. A single-channel logarithmic circuit was fabricated and characterized. A prototype ANN chip with unsupervised winner-take-all (WTA) function was designed, fabricated, and tested. An ANN learning algorithm using a novel sigmoid-logarithmic transfer function based on the supervised backpropagation (BP) algorithm is proposed for robustly recognizing low-intensity patterns. Our results show that the newly trained ANN can recognize low-fluorescence patterns better than an ANN using the conventional sigmoid function. PMID:18566679

  9. Environmental change makes robust ecological networks fragile

    USGS Publications Warehouse

    Strona, Giovanni; Lafferty, Kevin D.

    2016-01-01

    Complex ecological networks appear robust to primary extinctions, possibly due to consumers’ tendency to specialize on dependable (available and persistent) resources. However, modifications to the conditions under which the network has evolved might alter resource dependability. Here, we ask whether adaptation to historical conditions can increase community robustness, and whether such robustness can protect communities from collapse when conditions change. Using artificial life simulations, we first evolved digital consumer-resource networks that we subsequently subjected to rapid environmental change. We then investigated how empirical host–parasite networks would respond to historical, random and expected extinction sequences. In both cases, networks were far more robust to historical conditions than new ones, suggesting that new environmental challenges, as expected under global change, might collapse otherwise robust natural ecosystems.
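    The robustness-to-extinction-sequences experiment can be sketched as a secondary-extinction simulation on a bipartite consumer-resource matrix. The toy network and the "historical" removal order below (least-used resources first, as a stand-in for low dependability) are assumptions for illustration, not the authors' data:

```python
import numpy as np

def robustness(use, order):
    """Mean consumer-survival fraction as resources are removed in `order`.
    use[i, j] is True when consumer i can use resource j; a consumer goes
    secondarily extinct once all of its resources are gone."""
    alive = np.ones(use.shape[1], dtype=bool)
    curve = [use[:, alive].any(axis=1).mean()]
    for j in order:
        alive[j] = False
        curve.append(use[:, alive].any(axis=1).mean() if alive.any() else 0.0)
    return float(np.mean(curve))  # area under the survival curve

rng = np.random.default_rng(0)
use = rng.random((40, 20)) < 0.25        # toy consumer-resource matrix
use[use.sum(axis=1) == 0, 0] = True      # every consumer keeps >= 1 resource

rand_order = rng.permutation(20)
hist_order = np.argsort(use.sum(axis=0)) # least-used ("least dependable") first
R_rand = robustness(use, rand_order)
R_hist = robustness(use, hist_order)
```

    Comparing R values across removal orders is exactly the kind of contrast the study draws between historical, random and expected extinction sequences.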

  10. Are All Letters Really Processed Equally and in Parallel? Further Evidence of a Robust First Letter Advantage

    PubMed Central

    Scaltritti, Michele; Balota, David A.

    2013-01-01

    The present study examined accuracy and response latency of letter processing as a function of position within a horizontal array. In a series of 4 experiments, target strings were briefly displayed (33 ms for Experiments 1 to 3, 83 ms for Experiment 4) and both forward and backward masked. Participants then made a two-alternative forced choice. The two alternative responses differed in just one element of the string, and the position of the mismatch was systematically manipulated. In Experiment 1, words of different lengths (from 3 to 6 letters) were presented in separate blocks. Across different lengths, there was a robust advantage in performance when the alternative response differed in the letter occurring at the first position, compared to when the difference occurred at any other position. Experiment 2 replicated this finding with the same materials used in Experiment 1, but with words of different lengths randomly intermixed within blocks. Experiment 3 provided evidence of the first position advantage with legal nonwords and strings of consonants, but did not provide any first position advantage for non-alphabetic symbols. The lack of a first position advantage for symbols was replicated in Experiment 4, where target strings were displayed for a longer duration (83 ms). Taken together, these results suggest that the first position advantage is a phenomenon that occurs specifically and selectively for letters, independent of lexical constraints. We argue that the results are consistent with models that assume a processing advantage for coding letters in the first position, and are inconsistent with the commonly held assumption in visual word recognition models that letters are processed equally and in parallel independent of letter position. PMID:24012723

  11. Robust High-Capacity Audio Watermarking Based on FFT Amplitude Modification

    NASA Astrophysics Data System (ADS)

    Fallahpour, Mehdi; Megías, David

    This paper proposes a novel robust audio watermarking algorithm to embed data and extract it in a bit-exact manner based on changing the magnitudes of the FFT spectrum. The key point is selecting a frequency band for embedding based on the comparison between the original and the MP3 compressed/decompressed signal, and on a suitable scaling factor. The experimental results show that the method has a very high capacity (about 5 kbps) without significant perceptual distortion (ODG about -0.25), and provides robustness against common audio signal processing such as added noise, filtering, and MPEG compression (MP3). Furthermore, the proposed method has a larger capacity (ratio of embedded bits to host bits) than recent image data hiding methods.
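    The band-selection rule and scaling factor are specific to the paper, but the idea of bit-exact embedding in FFT magnitudes can be sketched with a simple parity-quantization stand-in (my own simplification, not the authors' algorithm; the embedding band is an assumed choice):

```python
import numpy as np

def embed(signal, bits, band, delta=0.5):
    """Force each selected FFT bin's magnitude to an even (bit 0) or odd
    (bit 1) multiple of `delta` -- a parity-quantization stand-in for the
    paper's magnitude-scaling rule."""
    spec = np.fft.rfft(signal)
    for k, bit in zip(band, bits):
        mag, phase = abs(spec[k]), np.angle(spec[k])
        q = int(round(mag / delta))
        if q % 2 != bit:
            q += 1                     # shift to the required parity
        spec[k] = q * delta * np.exp(1j * phase)
    return np.fft.irfft(spec, n=len(signal))

def extract(signal, band, delta=0.5):
    spec = np.fft.rfft(signal)
    return [int(round(abs(spec[k]) / delta)) % 2 for k in band]

rng = np.random.default_rng(1)
host = rng.standard_normal(1024)
band = range(40, 72)                   # assumed embedding band (32 bins)
bits = [int(b) for b in rng.integers(0, 2, size=32)]
marked = embed(host, bits, band)
recovered = extract(marked, band)
```

    Because the inverse/forward FFT round-trip preserves the quantized magnitudes to within floating-point error, extraction is bit-exact on the unattacked signal; robustness to attacks then depends on the band and the step size delta.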

  12. Robust, Optimal Subsonic Airfoil Shapes

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan

    2014-01-01

    A method has been developed to create an airfoil robust enough to operate satisfactorily in different environments. This method determines a robust, optimal, subsonic airfoil shape, beginning with an arbitrary initial airfoil shape, and imposes the necessary constraints on the design. Also, this method is flexible and extendible to a larger class of requirements and changes in constraints imposed.

  13. Generalized shortcuts to adiabaticity and enhanced robustness against decoherence

    NASA Astrophysics Data System (ADS)

    Santos, Alan C.; Sarandy, Marcelo S.

    2018-01-01

    Shortcuts to adiabaticity provide a general approach to mimic adiabatic quantum processes via arbitrarily fast evolutions in Hilbert space. For these counter-diabatic evolutions, higher speed comes at higher energy cost. Here, the counter-diabatic theory is employed as a minimal energy demanding scheme for speeding up adiabatic tasks. As a by-product, we show that this approach can be used to obtain infinite classes of transitionless models, including time-independent Hamiltonians under certain conditions over the eigenstates of the original Hamiltonian. We apply these results to investigate shortcuts to adiabaticity in decohering environments by introducing the requirement of a fixed energy resource. In this scenario, we show that generalized transitionless evolutions can be more robust against decoherence than their adiabatic counterparts. We illustrate this enhanced robustness both for the Landau-Zener model and for quantum gate Hamiltonians.

  14. A Robust False Matching Points Detection Method for Remote Sensing Image Registration

    NASA Astrophysics Data System (ADS)

    Shan, X. J.; Tang, P.

    2015-04-01

    Given the influences of illumination, imaging angle, and geometric distortion, among others, false matching points still occur in all image registration algorithms. Therefore, false matching point detection is an important step in remote sensing image registration. Random Sample Consensus (RANSAC) is typically used to detect false matching points, but it cannot detect all false matching points in some remote sensing images. Therefore, a robust false matching point detection method based on the K-nearest-neighbour (K-NN) graph (KGD) is proposed in this paper to obtain robust and highly accurate results. The KGD method starts with the construction of the K-NN graph in one image: a K-NN graph is generated connecting each matching point to its K nearest matching points. A local transformation model for each matching point is then estimated from its K nearest matching points, and the error of each matching point is computed using its transformation model. Finally, the L matching points with the largest errors are identified as false matches and removed. This process iterates until all errors are smaller than a given threshold. In addition, the KGD method can be used in combination with other methods, such as RANSAC. Several remote sensing images with different resolutions and terrains are used in the experiments. We evaluate the performance of the KGD method, the RANSAC + KGD method, RANSAC, and Graph Transformation Matching (GTM). The experimental results demonstrate the superior performance of the KGD and RANSAC + KGD methods.
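    The iterative procedure described above can be sketched as follows. This is a toy implementation of the K-NN-graph idea with a local affine model, not the authors' code; the synthetic data and parameter values are illustrative assumptions:

```python
import numpy as np

def kgd_filter(src, dst, k=6, drop=1, tol=1.0):
    """Iteratively remove false matches: score each match by the prediction
    error of a local affine model fitted to its k nearest matches, drop the
    `drop` worst, and repeat until the maximum error is below `tol`."""
    idx = np.arange(len(src))
    while True:
        s, d = src[idx], dst[idx]
        errs = np.empty(len(idx))
        for i in range(len(idx)):
            dist = np.linalg.norm(s - s[i], axis=1)
            nb = np.argsort(dist)[1:k + 1]               # neighbours, excluding i
            A = np.hstack([s[nb], np.ones((len(nb), 1))])
            M, *_ = np.linalg.lstsq(A, d[nb], rcond=None)  # local affine fit
            errs[i] = np.linalg.norm(np.append(s[i], 1.0) @ M - d[i])
        if errs.max() <= tol:
            return idx
        idx = np.delete(idx, np.argsort(errs)[-drop:])

# Synthetic check: matches related by an affine map, plus three gross outliers.
rng = np.random.default_rng(2)
src = rng.uniform(0, 100, (40, 2))
T = np.array([[0.9, -0.2], [0.3, 1.1]])
dst = src @ T.T + np.array([5.0, -3.0])                  # true affine motion
dst[[3, 17, 29]] += 40.0                                 # three false matches
kept = kgd_filter(src, dst)
```

    A false match sits far from where its (mostly correct) neighbours' local model predicts it, so its error dominates and it is removed first; once all outliers are gone, the residuals collapse and the loop terminates.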

  15. Investigation of progressive failure robustness and alternate load paths for damage tolerant structures

    NASA Astrophysics Data System (ADS)

    Marhadi, Kun Saptohartyadi

    Structural optimization for damage tolerance under various unforeseen damage scenarios is computationally challenging. It couples non-linear progressive failure analysis with sampling-based stochastic analysis of random damage. The goal of this research was to understand the relationship between the alternate load paths available in a structure and its damage tolerance, and to use this information to develop computationally efficient methods for designing damage-tolerant structures. Progressive failure of a redundant truss structure subjected to small random variability was investigated to identify features that correlate with robustness and predictability of the structure's progressive failure. The identified features were used to develop numerical surrogate measures that permit computationally efficient deterministic optimization to achieve robustness and predictability of progressive failure. Analysis of damage tolerance for designs with robust progressive failure indicated that robustness and predictability of progressive failure do not guarantee damage tolerance. Damage tolerance requires a structure to redistribute its load to alternate load paths. In order to investigate the load distribution characteristics that lead to damage tolerance in structures, designs with varying degrees of damage tolerance were generated using brute-force stochastic optimization. A method based on principal component analysis was used to describe load distributions (alternate load paths) in the structures. Results indicate that a structure that can develop alternate paths is not necessarily damage tolerant; the alternate load paths must have a required minimum load capability. Robustness analysis of damage-tolerant optimum designs indicates that designs are tailored to the specified damage: a design optimized under one damage specification can be sensitive to other damages not considered. 
Effectiveness of existing load path definitions and characterizations were investigated for continuum

  16. On the robustness of SAC silencing in closed mitosis

    NASA Astrophysics Data System (ADS)

    Ruth, Donovan; Liu, Jian

    Mitosis equally partitions sister chromatids to two daughter cells. This is achieved by properly attaching these chromatids via their kinetochores to microtubules that emanate from the spindle poles. Once the last kinetochore is properly attached, the spindle microtubules pull the sister chromatids apart. Due to the dynamic nature of microtubules, however, kinetochore-microtubule attachment often goes wrong. When this erroneous attachment occurs, it locally activates an ensemble of proteins, called the spindle assembly checkpoint proteins (SAC), which halts the mitotic progression until all the kinetochores are properly attached by spindle microtubules. The timing of SAC silencing thus determines the fidelity of chromosome segregation. We previously established a spatiotemporal model that addresses the robustness of SAC silencing in open mitosis for the first time. Here, we focus on closed mitosis by examining yeast mitosis as a model system. Though much experimental work has been done to study the SAC in cells undergoing closed mitosis, the processes responsible are not well understood. We leverage and extend our previous model to study SAC silencing mechanism in closed mitosis. We show that a robust signal of the SAC protein accumulation at the spindle pole body can be achieved. This signal is a nonlinear increasing function of number of kinetochore-microtubule attachments, and can thus serve as a robust trigger to time the SAC silencing. Together, our mechanism provides a unified framework across species that ensures robust SAC silencing and fidelity of chromosome segregation in mitosis. Intramural research program in NHLBI at NIH.

  17. Measure of robustness for complex networks

    NASA Astrophysics Data System (ADS)

    Youssef, Mina Nabil

    Critical infrastructures are repeatedly attacked by external triggers causing tremendous amounts of damage. Any infrastructure can be studied using the powerful theory of complex networks. A complex network is composed of an extremely large number of different elements that exchange commodities providing significant services. The main functions of complex networks can be damaged by different types of attacks and failures that degrade the network performance. These attacks and failures are considered as disturbing dynamics, such as the spread of viruses in computer networks, the spread of epidemics in social networks, and the cascading failures in power grids. Depending on the network structure and the attack strength, every network suffers damage and performance degradation differently. Hence, quantifying the robustness of complex networks becomes an essential task. In this dissertation, new metrics are introduced to measure the robustness of technological and social networks with respect to the spread of epidemics, and the robustness of power grids with respect to cascading failures. First, we introduce a new metric called the Viral Conductance (VCSIS) to assess the robustness of networks with respect to the spread of epidemics that are modeled through the susceptible/infected/susceptible (SIS) epidemic approach. In contrast to assessing the robustness of networks based on a classical metric, the epidemic threshold, the new metric integrates the fraction of infected nodes at steady state over all possible effective infection strengths. Through examples, VCSIS provides more insights into the robustness of networks than the epidemic threshold. In addition, both the paradoxical robustness of Barabási-Albert preferential attachment networks and the effect of the topology on the steady-state infection are studied, to show the importance of quantifying the robustness of networks. 
Second, a new metric VCSIR is introduced to assess the robustness of networks with respect
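    One plausible numerical reading of the viral-conductance idea (using the standard N-intertwined mean-field SIS approximation; the dissertation's exact normalization may differ) integrates the steady-state infected fraction over the inverse effective infection strength s = 1/tau up to the mean-field threshold:

```python
import numpy as np

def sis_fraction(A, tau, n_iter=400):
    """Mean-field (NIMFA) SIS: fixed-point iteration for the steady-state
    infection probabilities at effective infection rate tau, averaged."""
    p = np.full(len(A), 0.5)
    for _ in range(n_iter):
        f = tau * (A @ p)
        p = f / (1.0 + f)
    return p.mean()

def viral_conductance(A, n_pts=60):
    """Trapezoidal integral of the steady-state infected fraction over
    s = 1/tau in (0, lambda_max]; the fraction vanishes beyond threshold."""
    lam = np.linalg.eigvalsh(A).max()
    s = np.linspace(1e-3, lam, n_pts)
    y = np.array([sis_fraction(A, 1.0 / si) for si in s])
    ds = s[1] - s[0]
    return float(ds * (y.sum() - 0.5 * (y[0] + y[-1])))

# Compare a homogeneous ring with a star of the same size.
n = 20
ring = np.roll(np.eye(n), 1, axis=1) + np.roll(np.eye(n), -1, axis=1)
star = np.zeros((n, n))
star[0, 1:] = star[1:, 0] = 1.0
vc_ring = viral_conductance(ring)
vc_star = viral_conductance(star)
```

    Unlike a single threshold value, the integral reflects how infected the network stays across all infection strengths, which is the point of the metric.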

  18. Robust Statistical Fusion of Image Labels

    PubMed Central

    Landman, Bennett A.; Asman, Andrew J.; Scoggins, Andrew G.; Bogovic, John A.; Xing, Fangxu; Prince, Jerry L.

    2011-01-01

    Image labeling and parcellation (i.e., assigning structure to a collection of voxels) are critical tasks for the assessment of volumetric and morphometric features in medical imaging data. The process of image labeling is inherently error prone as images are corrupted by noise and artifacts. Even expert interpretations are subject to rater subjectivity and limited precision. Hence, all labels must be considered imperfect, with some degree of inherent variability. One may seek multiple independent assessments to both reduce this variability and quantify the degree of uncertainty. Existing techniques have exploited maximum a posteriori statistics to combine data from multiple raters and simultaneously estimate rater reliabilities. Although quite successful, wide-scale application has been hampered by unstable estimation with practical datasets, for example, with label sets with small or thin objects to be labeled or with partial or limited datasets. Moreover, these approaches have required each rater to generate a complete dataset, which is often impossible given both human foibles and the typical turnover rate of raters in a research or clinical environment. Herein, we propose a robust approach to improve estimation performance with small anatomical structures, allow for missing data, account for repeated label sets, and utilize training/catch-trial data. With this approach, numerous raters can label small, overlapping portions of a large dataset, and rater heterogeneity can be robustly controlled while simultaneously estimating a single, reliable label set and characterizing uncertainty. The proposed approach enables many individuals to collaborate in the construction of large datasets for labeling tasks (e.g., human parallel processing) and reduces the otherwise detrimental impact of rater unavailability. PMID:22010145
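    The idea of jointly estimating a consensus label set and per-rater reliabilities while tolerating missing data can be sketched with a simplified binary EM in the spirit of STAPLE-style fusion (an illustrative reimplementation, not the authors' algorithm; the synthetic raters and accuracies are assumptions):

```python
import numpy as np

def em_fusion(votes, n_iter=30):
    """Jointly estimate consensus label probabilities w and per-rater
    sensitivity p / specificity q from binary votes with missing data.
    votes[r, v] in {0, 1, -1}; -1 means rater r skipped voxel v."""
    obs = votes >= 0
    v1 = (votes == 1) & obs
    w = v1.sum(0) / obs.sum(0)                  # init: observed vote average
    for _ in range(n_iter):
        # M-step: rater reliabilities given the current consensus w
        p = (v1 * w).sum(1) / np.maximum((obs * w).sum(1), 1e-9)
        q = ((obs & ~v1) * (1 - w)).sum(1) / np.maximum((obs * (1 - w)).sum(1), 1e-9)
        p = np.clip(p, 1e-6, 1 - 1e-6)
        q = np.clip(q, 1e-6, 1 - 1e-6)
        # E-step: posterior probability that the true label is 1
        a = np.where(obs, np.where(v1, p[:, None], 1 - p[:, None]), 1.0).prod(0)
        b = np.where(obs, np.where(v1, 1 - q[:, None], q[:, None]), 1.0).prod(0)
        w = a / (a + b)
    return w, p, q

rng = np.random.default_rng(3)
truth = (rng.random(200) < 0.4).astype(int)
votes = np.empty((5, 200), dtype=int)
for r in range(5):                               # five imperfect raters
    correct = rng.random(200) < 0.9
    votes[r] = np.where(correct, truth, 1 - truth)
miss = rng.random((5, 200)) < 0.3                # each rater skips ~30% of voxels
miss[0] = False                                  # one rater labels everything
votes[miss] = -1
w, sens, spec = em_fusion(votes)
fused = (w > 0.5).astype(int)
```

    Skipped voxels simply contribute a factor of 1 to each posterior product, which is what lets raters label only overlapping portions of the dataset.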

  19. Environmental change makes robust ecological networks fragile

    PubMed Central

    Strona, Giovanni; Lafferty, Kevin D.

    2016-01-01

    Complex ecological networks appear robust to primary extinctions, possibly due to consumers' tendency to specialize on dependable (available and persistent) resources. However, modifications to the conditions under which the network has evolved might alter resource dependability. Here, we ask whether adaptation to historical conditions can increase community robustness, and whether such robustness can protect communities from collapse when conditions change. Using artificial life simulations, we first evolved digital consumer-resource networks that we subsequently subjected to rapid environmental change. We then investigated how empirical host–parasite networks would respond to historical, random and expected extinction sequences. In both cases, networks were far more robust to historical conditions than new ones, suggesting that new environmental challenges, as expected under global change, might collapse otherwise robust natural ecosystems. PMID:27511722

  20. The control of the controller: molecular mechanisms for robust perfect adaptation and temperature compensation.

    PubMed

    Ni, Xiao Yu; Drengstig, Tormod; Ruoff, Peter

    2009-09-02

    Organisms have the ability to adapt to a changing environment and to keep certain components within a cell regulated at the same level (homeostasis). "Perfect adaptation" describes an organism's response to an external stepwise perturbation by regulating some of its variables/components precisely to their original preperturbation values. Numerous examples of perfect adaptation/homeostasis have been found, for example in bacterial chemotaxis, photoreceptor responses, MAP kinase activities, and metal-ion homeostasis. Two concepts have evolved to explain how perfect adaptation may be understood: in one approach (robust perfect adaptation), the adaptation is a network property, which is mostly, but not entirely, independent of rate constant values; in the other approach (nonrobust perfect adaptation), a fine-tuning of rate constant values is needed. Here we identify two classes of robust molecular homeostatic mechanisms, which compensate for environmental variations in a controlled variable's inflow or outflow fluxes, and allow for the presence of robust temperature compensation. These two classes of homeostatic mechanisms arise from the fact that concentrations must have positive values. We show that the concept of integral control (or integral feedback), which leads to robust homeostasis, is associated with a control species that has to work under zero-order flux conditions and does not necessarily require the presence of a physico-chemical feedback structure. There are interesting links between the two identified classes of homeostatic mechanisms and molecular mechanisms found in mammalian iron and calcium homeostasis, indicating that homeostatic mechanisms may underlie similar molecular control structures.
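    The zero-order integral-control idea can be illustrated with a minimal two-species model (the equations and parameter values are an illustrative construction, not the paper's specific mechanisms): a controller E is produced in proportion to A and degraded by a saturated, near zero-order flux, so its dynamics integrate the deviation of A from the set point Vmax/ks, and A returns there after a stepwise perturbation of its inflow.

```python
import numpy as np

# Minimal homeostatic controller:
#   dA/dt = j_in(t) - k * E * A              (E catalyses removal of A)
#   dE/dt = ks * A - Vmax * E / (Km + E)     (near zero-order E degradation)
# With Km << E the E equation integrates (A - Vmax/ks), driving A back to
# the set point A_set = Vmax / ks for any constant inflow j_in.
k, ks, Vmax, Km = 2.0, 1.0, 1.0, 0.01
A_set = Vmax / ks

def simulate(t_end=300.0, dt=0.001, t_step=150.0):
    A, E = A_set, 0.5
    traj = []
    for n in range(int(t_end / dt)):         # simple explicit Euler
        t = n * dt
        j_in = 1.0 if t < t_step else 2.0    # stepwise inflow perturbation
        dA = j_in - k * E * A
        dE = ks * A - Vmax * E / (Km + E)
        A, E = A + dA * dt, E + dE * dt
        traj.append((t, A, E))
    return np.array(traj)

traj = simulate()
i_pre = int(149.0 / 0.001)
A_pre, E_pre = traj[i_pre, 1], traj[i_pre, 2]   # just before the step
A_post, E_post = traj[-1, 1], traj[-1, 2]       # long after the step
```

    After the inflow doubles, A transiently rises, E climbs until the flux balance is restored, and A re-adapts to (nearly) the same set point; E, not A, carries the memory of the perturbation.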

  1. A methodology for the synthesis of robust feedback systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Milich, David Albert

    1988-01-01

    A new methodology is developed for the synthesis of linear, time-invariant (LTI) controllers for multivariable LTI systems. The resulting closed-loop system is nominally stable and exhibits a known level of performance. In addition, robustness of the feedback system is guaranteed, i.e., stability and performance are retained in the presence of multiple unstructured uncertainty blocks located at various points in the feedback loop. The design technique is referred to as the Causality Recovery Methodology (CRM). The CRM relies on the Youla parameterization of all stabilizing compensators to ensure nominal stability of the feedback system. A frequency-domain inequality in terms of the structured singular value mu defines the robustness specification. The optimal compensator, with respect to the mu condition, is shown to be noncausal in general. The aim of the CRM is to find a stable, causal transfer function matrix that approximates the robustness characteristics of the optimal solution. The CRM, via a series of infinite-dimensional convex programs, produces a closed-loop system whose performance robustness is at least as good as that of any initial design. The algorithm is approximated by a finite-dimensional process for the purposes of implementation. Two numerical examples confirm the potential viability of the CRM concept; however, the robustness improvement comes at the expense of increased computational burden and compensator complexity.

  2. Application of multi-factorial design of experiments to successfully optimize immunoassays for robust measurements of therapeutic proteins.

    PubMed

    Ray, Chad A; Patel, Vimal; Shih, Judy; Macaraeg, Chris; Wu, Yuling; Thway, Theingi; Ma, Mark; Lee, Jean W; Desilva, Binodh

    2009-02-20

    Developing a process that generates robust immunoassays that can be used to support studies with tight timelines is a common challenge for bioanalytical laboratories. Design of experiments (DOE) is a tool that has been used by many industries to optimize processes. The approach is capable of identifying critical factors and their interactions with a minimal number of experiments. The challenge for implementing this tool in the bioanalytical laboratory is to develop a user-friendly approach that scientists can understand and apply. We have successfully addressed these challenges by eliminating the screening design, introducing automation, and applying a simple mathematical approach for the output parameter. A modified central composite design (CCD) was applied to three ligand binding assays. The intra-plate factors selected were the coating, detection antibody, and streptavidin-HRP concentrations. The inter-plate factors included incubation times for each step. The objective was to maximize the log signal-to-blank ratio (logS/B) of the low standard to the blank. The maximum desirable conditions were determined using JMP 7.0. To verify the validity of the predictions, the predicted logS/B was compared against the observed logS/B during pre-study validation experiments. The three assays were optimized using the multi-factorial DOE. The total error for all three methods was less than 20%, which indicated method robustness. DOE identified interactions in one of the methods. The model predictions for logS/B were within 25% of the observed pre-study validation values for all methods tested. The comparison between the CCD and hybrid screening design yielded comparable parameter estimates. The user-friendly design enables effective application of multi-factorial DOE to optimize ligand binding assays for therapeutic proteins. The approach allows for identification of interactions between factors, consistency in optimal parameter determination, and reduced method
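
    The CCD-plus-response-surface workflow can be sketched for two coded factors: evaluate logS/B at the design points, fit a full quadratic surface by least squares, and read off the stationary point. The response function below is a hypothetical stand-in for a real assay (the study used JMP 7.0 on measured responses), so the specific optimum is illustrative only.

```python
import math

def ccd_points(alpha=math.sqrt(2)):
    """Two-factor central composite design in coded units."""
    pts = [(sx, sy) for sx in (-1.0, 1.0) for sy in (-1.0, 1.0)]       # factorial
    pts += [(-alpha, 0.0), (alpha, 0.0), (0.0, -alpha), (0.0, alpha)]  # axial
    pts.append((0.0, 0.0))                                             # center
    return pts

def fit_quadratic(pts, ys):
    """Least squares for y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    X = [[1.0, x, y, x * x, y * y, x * y] for x, y in pts]
    n = 6
    # normal equations (X^T X) b = X^T y, solved by Gaussian elimination
    A = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    b = [sum(row[i] * yv for row, yv in zip(X, ys)) for i in range(n)]
    for i in range(n):                       # forward elimination with pivoting
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * n
    for i in reversed(range(n)):             # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef

def log_sb(x, y):
    """Hypothetical logS/B response over coded factor levels (stand-in)."""
    return 2.0 - (x - 0.3) ** 2 - 0.5 * (y + 0.2) ** 2

pts = ccd_points()
coef = fit_quadratic(pts, [log_sb(x, y) for x, y in pts])
x_opt = -coef[1] / (2 * coef[3])   # stationary point of the fitted surface
y_opt = -coef[2] / (2 * coef[4])
```

    For a truly quadratic response the nine-run CCD recovers the surface exactly; with assay noise, replicate center points would be added to estimate pure error.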

  3. Robust matching for voice recognition

    NASA Astrophysics Data System (ADS)

    Higgins, Alan; Bahler, L.; Porter, J.; Blais, P.

    1994-10-01

    This paper describes an automated method of comparing a voice sample of an unknown individual with samples from known speakers in order to establish or verify the individual's identity. The method is based on a statistical pattern matching approach that employs a simple training procedure, requires no human intervention (transcription, word or phonetic marking, etc.), and makes no assumptions regarding the expected form of the statistical distributions of the observations. The content of the speech material (vocabulary, grammar, etc.) is not assumed to be constrained in any way. An algorithm is described which incorporates frame pruning and channel equalization processes designed to achieve robust performance with reasonable computational resources. An experimental implementation demonstrating the feasibility of the concept is described.

  4. Robust analysis of semiparametric renewal process models

    PubMed Central

    Lin, Feng-Chang; Truong, Young K.; Fine, Jason P.

    2013-01-01

    A rate model is proposed for a modulated renewal process comprising a single long sequence, where the covariate process may not capture the dependencies in the sequence as standard intensity models do. We consider partial likelihood-based inference under a semiparametric multiplicative rate model, which has been widely studied in the context of independent and identically distributed data. Under an intensity model, gap times in a single long sequence may be used naively in the partial likelihood, with variance estimation utilizing the observed information matrix. Under a rate model, the gap times cannot be treated as independent and studying the partial likelihood is much more challenging. We employ a mixing condition in the application of limit theory for stationary sequences to obtain consistency and asymptotic normality. The estimator's variance is quite complicated owing to the unknown dependence structure of the gap times. We adapt block bootstrapping and cluster variance estimators to the partial likelihood. Simulation studies and an analysis of a semiparametric extension of a popular model for neural spike train data demonstrate the practical utility of the rate approach in comparison with the intensity approach. PMID:24550568
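
    The block-bootstrap idea used for variance estimation above can be sketched as follows: blocks of consecutive observations are resampled, so short-range dependence is preserved inside each block. This is a generic moving-block bootstrap for the mean of a dependent sequence, not the authors' partial-likelihood version.

```python
import random

def block_bootstrap_var(x, block_len, n_boot=500, seed=0):
    """Moving-block bootstrap estimate of Var(mean(x)) for a dependent sequence."""
    rng = random.Random(seed)
    n = len(x)
    blocks = [x[i:i + block_len] for i in range(n - block_len + 1)]
    means = []
    for _ in range(n_boot):
        resample = []
        while len(resample) < n:
            resample.extend(rng.choice(blocks))   # draw overlapping blocks
        means.append(sum(resample[:n]) / n)
    m = sum(means) / n_boot
    return sum((v - m) ** 2 for v in means) / (n_boot - 1)
```

    For a perfectly anticorrelated series, blocks of length 2 capture the dependence and drive the estimated variance of the mean to zero, whereas single-observation resampling misses it entirely.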

  5. Robustness of mission plans for unmanned aircraft

    NASA Astrophysics Data System (ADS)

    Niendorf, Moritz

    , and criticalities are derived. This analysis is extended to Euclidean minimum spanning trees. This thesis aims at enabling increased mission performance by providing means of assessing the robustness and optimality of a mission and methods for identifying critical elements. Examples of the application to mission planning in contested environments, cargo aircraft mission planning, multi-objective mission planning, and planning optimal communication topologies for teams of unmanned aircraft are given.

  6. How simple autonomous decisions evolve into robust behaviours? A review from neurorobotics, cognitive, self-organized and artificial immune systems fields.

    PubMed

    Fernandez-Leon, Jose A; Acosta, Gerardo G; Rozenfeld, Alejandro

    2014-10-01

    Researchers in diverse fields, such as neuroscience, systems biology and autonomous robotics, have been intrigued by the origin of and mechanisms for biological robustness. Darwinian evolution, in general, suggests that adaptive mechanisms, as a way of reaching robustness, could evolve by natural selection acting successively on numerous heritable variations. However, is this understanding enough to explain how biological systems remain robust during their interactions with the surroundings? Here, we describe selected studies of bio-inspired systems that show behavioral robustness. From the perspectives of neurorobotics, cognitive, self-organizing and artificial immune systems, our discussion focuses mainly on how robust behaviors evolve or emerge in systems that have the capacity to interact with their surroundings. These descriptions are twofold. First, we introduce examples from autonomous robotics to illustrate how the process of designing robust control can be idealized in complex environments for autonomous navigation by terrain and underwater vehicles. We also include descriptions of bio-inspired self-organizing systems. Then, we introduce other studies that contextualize experimental evolution with simulated organisms and physical robots to exemplify how the process of natural selection can lead to the evolution of robustness by means of adaptive behaviors. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. A robust background regression based score estimation algorithm for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei

    2016-12-01

    Anomaly detection has become a hot topic in the hyperspectral image analysis and processing fields in recent years. The most important issue for hyperspectral anomaly detection is background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may cover up the intrinsic data structure in the anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. The Robust Background Regression (RBR) is a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In the RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to form the foundation of the regression. Furthermore, a manifold regularization term, which exploits the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by the potential anomalies, are jointly appended to the RBR procedure. Finally, a paired-dataset based k-nn score estimation method is applied to the robust background and potential anomaly datasets to produce the detection output. The experimental results show that RBRSE achieves better ROC curves, AUC values, and background-anomaly separation than other state-of-the-art anomaly detection methods, and is easy to implement.

  8. A robust H∞ control-based hierarchical mode transition control system for plug-in hybrid electric vehicle

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Jiao, Xiaohong; Li, Liang; Zhang, Yuanbo; Chen, Zheng

    2018-01-01

    To realize a fast and smooth operating mode transition from electric driving mode to engine-on driving mode, this paper presents a novel robust hierarchical mode transition control method for a plug-in hybrid electric bus (PHEB) with a pre-transmission parallel hybrid powertrain. Firstly, the mode transition process is divided into five stages to clearly describe the powertrain dynamics. Based on the dynamics models of the powertrain and the clutch actuating mechanism, a hierarchical control structure including two robust H∞ controllers in the upper and lower layers is proposed. In the upper layer, the demanded clutch torque is calculated by a robust H∞ controller considering the clutch engaging time and the vehicle jerk, while in the lower layer a robust tracking controller with L2-gain is designed to perform accurate position tracking control, especially when parameter uncertainties and external disturbances occur in the clutch actuating mechanism. Simulation and hardware-in-the-loop (HIL) tests are carried out in a typical driving condition of the PHEB. Results show that the proposed hierarchical control approach achieves good control performance: the mode transition time is greatly reduced with acceptable jerk. Meanwhile, the designed control system shows clear robustness to the uncertain parameters and disturbances. Therefore, the proposed approach may offer a theoretical reference for an actual vehicle controller.

  9. A robust sound perception model suitable for neuromorphic implementation.

    PubMed

    Coath, Martin; Sheik, Sadique; Chicca, Elisabetta; Indiveri, Giacomo; Denham, Susan L; Wennekers, Thomas

    2013-01-01

    We have recently demonstrated the emergence of dynamic feature sensitivity through exposure to formative stimuli in a real-time neuromorphic system implementing a hybrid analog/digital network of spiking neurons. This network, inspired by models of auditory processing in mammals, includes several mutually connected layers with distance-dependent transmission delays and learning in the form of spike timing dependent plasticity, which effects stimulus-driven changes in the network connectivity. Here we present results that demonstrate that the network is robust to a range of variations in the stimulus pattern, such as are found in naturalistic stimuli and neural responses. This robustness is a property critical to the development of realistic, electronic neuromorphic systems. We analyze the variability of the response of the network to "noisy" stimuli which allows us to characterize the acuity in information-theoretic terms. This provides an objective basis for the quantitative comparison of networks, their connectivity patterns, and learning strategies, which can inform future design decisions. We also show, using stimuli derived from speech samples, that the principles are robust to other challenges, such as variable presentation rate, that would have to be met by systems deployed in the real world. Finally we demonstrate the potential applicability of the approach to real sounds.
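
    The spike-timing-dependent plasticity used for learning in such networks is, in its simplest pair-based form, an exponential window over the pre/post spike-time difference. The amplitudes and time constant below are typical textbook values, not the parameters of this particular neuromorphic system.

```python
import math

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for a spike pair with dt = t_post - t_pre (in ms)."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau)    # pre before post: potentiation
    return -a_minus * math.exp(dt / tau)       # post before pre: depression
```

    Causal pairings (pre shortly before post) strengthen a synapse most, and the reverse order weakens it, which is what lets the network become selective to temporal structure in the stimulus.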

  10. Robust Design of Biological Circuits: Evolutionary Systems Biology Approach

    PubMed Central

    Chen, Bor-Sen; Hsu, Chih-Yuan; Liou, Jing-Jia

    2011-01-01

    Artificial gene circuits have been proposed to be embedded into microbial cells that function as switches, timers, oscillators, and Boolean logic gates. Building more complex systems from these basic gene circuit components is a key advance for biological circuit design and synthetic biology. However, the behavior of bioengineered gene circuits remains unstable and uncertain. In this study, a nonlinear stochastic system is proposed to model biological systems with intrinsic parameter fluctuations and environmental molecular noise from the cellular context in the host cell. Based on an evolutionary systems biology algorithm, the design parameters of target gene circuits can evolve to specific values in order to robustly track a desired biological function in spite of intrinsic and environmental noise. The fitness function is selected to be inversely proportional to the tracking error, so that the evolutionary biological circuit can achieve optimal tracking, mimicking the evolutionary process of a gene circuit. Finally, several design examples are given in silico with Monte Carlo simulation to illustrate the design procedure and to confirm the robust performance of the proposed design method. The results show that the designed gene circuits can robustly track desired behaviors with minimal errors even with nontrivial intrinsic and external noise. PMID:22187523

  11. Robust design of biological circuits: evolutionary systems biology approach.

    PubMed

    Chen, Bor-Sen; Hsu, Chih-Yuan; Liou, Jing-Jia

    2011-01-01

    Artificial gene circuits have been proposed to be embedded into microbial cells that function as switches, timers, oscillators, and Boolean logic gates. Building more complex systems from these basic gene circuit components is a key advance for biological circuit design and synthetic biology. However, the behavior of bioengineered gene circuits remains unstable and uncertain. In this study, a nonlinear stochastic system is proposed to model biological systems with intrinsic parameter fluctuations and environmental molecular noise from the cellular context in the host cell. Based on an evolutionary systems biology algorithm, the design parameters of target gene circuits can evolve to specific values in order to robustly track a desired biological function in spite of intrinsic and environmental noise. The fitness function is selected to be inversely proportional to the tracking error, so that the evolutionary biological circuit can achieve optimal tracking, mimicking the evolutionary process of a gene circuit. Finally, several design examples are given in silico with Monte Carlo simulation to illustrate the design procedure and to confirm the robust performance of the proposed design method. The results show that the designed gene circuits can robustly track desired behaviors with minimal errors even with nontrivial intrinsic and external noise.
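
    The evolutionary tuning loop (fitness inversely proportional to tracking error) can be sketched on the simplest possible "circuit": a single gene with production rate k_prod and first-order degradation, evolved to hold a target expression level under noise. The model, noise level and selection scheme below are illustrative assumptions, far simpler than the paper's stochastic circuit models.

```python
import random

def tracking_error(k_prod, target=2.0, k_deg=1.0, noise=0.1, dt=0.01, T=5.0):
    """Mean squared deviation of a noisy birth-death gene from the target level.
    The same noise sequence (seed 0) is replayed for every candidate."""
    rng = random.Random(0)
    x, err, steps = 0.0, 0.0, int(T / dt)
    for _ in range(steps):
        x += dt * (k_prod - k_deg * x) + noise * rng.gauss(0.0, dt ** 0.5)
        err += (x - target) ** 2
    return err / steps

def evolve(pop_size=20, gens=30, sigma=0.2, seed=1):
    """Truncation-selection evolution of the production rate k_prod; ranking
    by error is equivalent to a fitness proportional to 1/(1 + error)."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 5.0) for _ in range(pop_size)]
    for _ in range(gens):
        parents = sorted(pop, key=tracking_error)[: pop_size // 2]
        pop = [max(0.0, rng.choice(parents) + rng.gauss(0.0, sigma))
               for _ in range(pop_size)]
    return min(pop, key=tracking_error)
```

    With a degradation rate of 1 and a target level of 2, the evolved production rate settles near 2, the value whose steady state tracks the target despite the noise.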

  12. Interrogating the topological robustness of gene regulatory circuits by randomization

    PubMed Central

    Levine, Herbert; Onuchic, Jose N.

    2017-01-01

    One of the most important roles of cells is performing their cellular tasks properly for survival. Cells usually achieve robust functionality, for example, cell-fate decision-making and signal transduction, through multiple layers of regulation involving many genes. Despite the combinatorial complexity of gene regulation, its quantitative behavior has been typically studied on the basis of experimentally verified core gene regulatory circuitry, composed of a small set of important elements. It is still unclear how such a core circuit operates in the presence of many other regulatory molecules and in a crowded and noisy cellular environment. Here we report a new computational method, named random circuit perturbation (RACIPE), for interrogating the robust dynamical behavior of a gene regulatory circuit even without accurate measurements of circuit kinetic parameters. RACIPE generates an ensemble of random kinetic models corresponding to a fixed circuit topology, and utilizes statistical tools to identify generic properties of the circuit. By applying RACIPE to simple toggle-switch-like motifs, we observed that the stable states of all models converge to experimentally observed gene state clusters even when the parameters are strongly perturbed. RACIPE was further applied to a proposed 22-gene network of the Epithelial-to-Mesenchymal Transition (EMT), from which we identified four experimentally observed gene states, including the states that are associated with two different types of hybrid Epithelial/Mesenchymal phenotypes. Our results suggest that the dynamics of a gene circuit are mainly determined by its topology, not by detailed circuit parameters. Our work provides a theoretical foundation for circuit-based systems biology modeling. We anticipate RACIPE to be a powerful tool to predict and decode circuit design principles in an unbiased manner, and to quantitatively evaluate the robustness and heterogeneity of gene expression. PMID:28362798
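
    A RACIPE-style interrogation can be sketched on the two-gene toggle switch: fix the mutual-repression topology, draw random kinetic parameters, settle each random model from several initial conditions, and ask which dynamical behaviors survive the parameter scrambling. The parameter ranges and the simple Euler settling loop below are assumptions, not the published tool.

```python
import random

def stable_states(a, n):
    """Settle the toggle switch dx/dt = a/(1 + y**n) - x (and the mirror
    equation for y) from a few fixed starts; round to merge attractors."""
    found = set()
    for x, y in [(0.0, a), (a, 0.0), (a / 2.0, a / 3.0)]:
        for _ in range(4000):            # forward Euler, dt = 0.05
            x += 0.05 * (a / (1.0 + y ** n) - x)
            y += 0.05 * (a / (1.0 + x ** n) - y)
        found.add((round(x, 2), round(y, 2)))
    return sorted(found)

# RACIPE-style ensemble: random kinetics on a fixed mutual-repression topology
rng = random.Random(42)
ensemble = [stable_states(rng.uniform(2.5, 6.0), rng.choice([2, 3]))
            for _ in range(20)]
bistable_fraction = sum(len(s) > 1 for s in ensemble) / len(ensemble)
```

    Across the random ensemble, the mirrored high/low expression states persist, illustrating the paper's point that behavior is pinned down by topology rather than by detailed kinetic parameters.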

  13. Phenotypic Robustness and the Assortativity Signature of Human Transcription Factor Networks

    PubMed Central

    Pechenick, Dov A.; Payne, Joshua L.; Moore, Jason H.

    2014-01-01

    Many developmental, physiological, and behavioral processes depend on the precise expression of genes in space and time. Such spatiotemporal gene expression phenotypes arise from the binding of sequence-specific transcription factors (TFs) to DNA, and from the regulation of nearby genes that such binding causes. These nearby genes may themselves encode TFs, giving rise to a transcription factor network (TFN), wherein nodes represent TFs and directed edges denote regulatory interactions between TFs. Computational studies have linked several topological properties of TFNs — such as their degree distribution — with the robustness of a TFN's gene expression phenotype to genetic and environmental perturbation. Another important topological property is assortativity, which measures the tendency of nodes with similar numbers of edges to connect. In directed networks, assortativity comprises four distinct components that collectively form an assortativity signature. We know very little about how a TFN's assortativity signature affects the robustness of its gene expression phenotype to perturbation. While recent theoretical results suggest that increasing one specific component of a TFN's assortativity signature leads to increased phenotypic robustness, the biological context of this finding is currently limited because the assortativity signatures of real-world TFNs have not been characterized. It is therefore unclear whether these earlier theoretical findings are biologically relevant. Moreover, it is not known how the other three components of the assortativity signature contribute to the phenotypic robustness of TFNs. Here, we use publicly available DNaseI-seq data to measure the assortativity signatures of genome-wide TFNs in 41 distinct human cell and tissue types. We find that all TFNs share a common assortativity signature and that this signature confers phenotypic robustness to model TFNs. Lastly, we determine the extent to which each of the four components of
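
    The four-component assortativity signature of a directed network is simply the Pearson correlation, computed over edges, between each combination of source degree (in or out) and target degree (in or out). The sketch below uses raw degrees; published definitions often use excess degrees, so treat it as a simplified illustration.

```python
def assortativity_signature(edges):
    """Pearson correlation over directed edges between source degree and
    target degree, for the four (out/in) x (in/out) combinations."""
    out_deg, in_deg = {}, {}
    for s, t in edges:
        out_deg[s] = out_deg.get(s, 0) + 1
        in_deg[t] = in_deg.get(t, 0) + 1

    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        sx = sum((a - mx) ** 2 for a in xs) ** 0.5
        sy = sum((b - my) ** 2 for b in ys) ** 0.5
        return cov / (sx * sy) if sx and sy else 0.0

    sig = {}
    for src_kind, src_deg in (("out", out_deg), ("in", in_deg)):
        for tgt_kind, tgt_deg in (("in", in_deg), ("out", out_deg)):
            xs = [src_deg.get(s, 0) for s, t in edges]
            ys = [tgt_deg.get(t, 0) for s, t in edges]
            sig[src_kind + "-" + tgt_kind] = pearson(xs, ys)
    return sig
```

    For the three-edge graph 1→2, 1→3, 2→3, the out-in component is -0.5: sources with high out-degree tend to point at targets with low in-degree.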

  14. Proteomics of Aspergillus fumigatus Conidia-containing Phagolysosomes Identifies Processes Governing Immune Evasion.

    PubMed

    Schmidt, Hella; Vlaic, Sebastian; Krüger, Thomas; Schmidt, Franziska; Balkenhol, Johannes; Dandekar, Thomas; Guthke, Reinhard; Kniemeyer, Olaf; Heinekamp, Thorsten; Brakhage, Axel A

    2018-06-01

    Invasive infections by the human-pathogenic fungus Aspergillus fumigatus start with the outgrowth of asexual, airborne spores (conidia) into the lung tissue of immunocompromised patients. The resident alveolar macrophages phagocytose conidia, which end up in phagolysosomes. However, A. fumigatus conidia resist phagocytic degradation to a certain degree. This is mainly attributable to the pigment 1,8-dihydroxynaphthalene (DHN) melanin located in the cell wall of conidia, which manipulates phagolysosomal maturation and prevents intracellular killing. To gain insight into the underlying molecular mechanisms, we comparatively analyzed proteins of mouse macrophage phagolysosomes containing melanized wild-type (wt) or nonmelanized pksP mutant conidia. For this purpose, a protocol to isolate conidia-containing phagolysosomes was established and a reference protein map of phagolysosomes was generated. We identified 637 host and 22 A. fumigatus proteins that were differentially abundant in the phagolysosome. 472 of the host proteins were overrepresented in the pksP mutant and 165 in the wt conidia-containing phagolysosome. Eight of the fungal proteins were produced only in pksP mutant and 14 proteins only in wt conidia-containing phagolysosomes. Bioinformatic analysis compiled a regulatory module, which indicates host processes affected by the fungus. These processes include vATPase-driven phagolysosomal acidification, Rab5- and Vamp8-dependent endocytic trafficking, signaling pathways, as well as recruitment of the Lamp1 phagolysosomal maturation marker and the lysosomal cysteine protease cathepsin Z. Western blotting and immunofluorescence analyses confirmed the proteome data and moreover showed differential abundance of the major metabolic regulator mTOR. Taken together, with the help of a protocol optimized to isolate A. fumigatus conidia-containing phagolysosomes and a potent bioinformatics algorithm, we were able to confirm A. fumigatus conidia

  15. Robust Optimization Design for Turbine Blade-Tip Radial Running Clearance using Hierarchically Response Surface Method

    NASA Astrophysics Data System (ADS)

    Zhiying, Chen; Ping, Zhou

    2017-11-01

    To balance computational precision and efficiency in the robust optimization of complex mechanical assembly relationships such as turbine blade-tip radial running clearance, a hierarchical response surface robust optimization algorithm is proposed. The distributed collaborative response surface method is used to generate an assembly-system-level approximation model of the overall parameters and the blade-tip clearance, and a set of samples of design parameters and objective response mean and/or standard deviation is then generated using the system approximation model and design-of-experiment methods. Finally, a new response surface approximation model is constructed from those samples and used in the robust optimization process. The analysis results demonstrate that the proposed method dramatically reduces the computational cost while preserving computational precision. The presented research offers an effective way for the robust optimization design of turbine blade-tip radial running clearance.
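
    The final step, robust optimization over a response surface, typically minimizes a "mean plus k standard deviations" criterion of the response under scatter in the inputs. The clearance model below is a hypothetical one-variable stand-in for the real blade-tip assembly relationship, so only the shape of the procedure carries over.

```python
import random

def clearance(d, delta):
    """Hypothetical blade-tip clearance response at design value d under an
    assembly deviation delta (a stand-in for the real assembly model)."""
    return (d + delta - 1.0) ** 2 + 0.5

def robust_objective(d, sigma=0.1, n=400, k=3.0):
    """Mean + k*std of the response, evaluated on a fixed Monte Carlo sample
    of input scatter so that the grid search below is deterministic."""
    rng = random.Random(0)          # replay the same deviations for every d
    ys = [clearance(d, rng.gauss(0.0, sigma)) for _ in range(n)]
    mean = sum(ys) / n
    std = (sum((y - mean) ** 2 for y in ys) / n) ** 0.5
    return mean + k * std

# grid search for the robust optimum of the single design variable
best = min((i * 0.05 for i in range(41)), key=robust_objective)
```

    The grid search lands near d = 1, the design at which the hypothetical clearance is least sensitive to assembly deviations.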

  16. A Robust Design Methodology for Optimal Microscale Secondary Flow Control in Compact Inlet Diffusers

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Keller, Dennis J.

    2001-01-01

    economical way of exploring the concept of Robust inlet design, where the mission variables are brought directly into the inlet design process and insensitivity or robustness to the mission variables becomes a design objective.

  17. Achieving Robustness to Uncertainty for Financial Decision-making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.

    2014-01-10

    This report investigates the concept of robustness analysis to support financial decision-making. Financial models, which forecast future stock returns or market conditions, depend on assumptions that might be unwarranted and on variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the "distance," or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance against "risk," which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, which lets the user control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility, better portability, allow for a more professional appearance, and render it independent of a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models
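
    The info-gap calculation sketched in the report has a simple skeleton: posit a nominal forecast, let each input wander within a horizon h of its nominal value, and find the largest h at which the worst case still meets the performance requirement. The interval uncertainty model and the toy return numbers below are illustrative assumptions, not the report's CAPM analysis.

```python
def worst_case_return(nominal, h):
    """Worst case of the mean return when each forecast may be off by up to h."""
    return sum(r - h for r in nominal) / len(nominal)

def robustness_horizon(nominal, required, h_step=1e-4, h_max=1.0):
    """Largest uncertainty horizon h at which the worst case still meets
    the required return (the info-gap robustness of the decision)."""
    h = 0.0
    while h + h_step <= h_max and worst_case_return(nominal, h + h_step) >= required:
        h += h_step
    return h
```

    With nominal returns of 5% and 7% and a 2% requirement, the robustness is the 4% cushion between the nominal mean and the requirement; demanding 6% leaves no cushion at all.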

  18. Robust Adaptive Modified Newton Algorithm for Generalized Eigendecomposition and Its Application

    NASA Astrophysics Data System (ADS)

    Yang, Jian; Yang, Feng; Xi, Hong-Sheng; Guo, Wei; Sheng, Yanmin

    2007-12-01

    We propose a robust adaptive algorithm for generalized eigendecomposition problems that arise in modern signal processing applications. To that end, the generalized eigendecomposition problem is reinterpreted as an unconstrained nonlinear optimization problem. Starting from the proposed cost function and making use of an approximation of the Hessian matrix, a robust modified Newton algorithm is derived. A rigorous analysis of its convergence properties is presented using stochastic approximation theory. We also apply the algorithm to the signal reception problem of multicarrier DS-CDMA to illustrate its practical application. The simulation results show that the proposed algorithm has fast convergence and excellent tracking capability, which are important in a practical time-varying communication environment.
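
    For reference, the problem being solved is A x = lambda B x for the dominant eigenpair. The sketch below is a plain power iteration on B^-1 A for the 2x2 case, a baseline for the same problem rather than the paper's robust modified Newton algorithm.

```python
def solve2(M, v):
    """Solve the 2x2 system M x = v by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - M[0][1] * v[1]) / det,
            (M[0][0] * v[1] - v[0] * M[1][0]) / det]

def generalized_eig(A, B, iters=200):
    """Power iteration on B^-1 A for the dominant pair of A x = lam B x."""
    x = [1.0, 1.0]
    for _ in range(iters):
        y = solve2(B, [A[0][0] * x[0] + A[0][1] * x[1],
                       A[1][0] * x[0] + A[1][1] * x[1]])
        norm = (y[0] ** 2 + y[1] ** 2) ** 0.5
        x = [y[0] / norm, y[1] / norm]
    # generalized Rayleigh quotient lam = (x^T A x) / (x^T B x)
    Ax = [A[0][0] * x[0] + A[0][1] * x[1], A[1][0] * x[0] + A[1][1] * x[1]]
    Bx = [B[0][0] * x[0] + B[0][1] * x[1], B[1][0] * x[0] + B[1][1] * x[1]]
    lam = (x[0] * Ax[0] + x[1] * Ax[1]) / (x[0] * Bx[0] + x[1] * Bx[1])
    return lam, x
```

    Adaptive algorithms such as the paper's replace the batch iteration with per-sample updates so the dominant pair can be tracked as the matrices drift over time.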

  19. Simulated discharge trends indicate robustness of hydrological models in a changing climate

    NASA Astrophysics Data System (ADS)

    Addor, Nans; Nikolova, Silviya; Seibert, Jan

    2016-04-01

    Assessing the robustness of hydrological models under contrasting climatic conditions should be part of any hydrological model evaluation. Robust models are particularly important for climate impact studies, as models performing well under current conditions are not necessarily capable of correctly simulating hydrological perturbations caused by climate change. A pressing issue is the usually assumed stationarity of parameter values over time. Modeling experiments using conceptual hydrological models revealed that assuming transposability of parameter values under changing climatic conditions can lead to significant biases in discharge simulations. This raises the question whether parameter values should be modified over time to reflect changes in hydrological processes induced by climate change. Such a question denotes a focus on the contribution of internal processes (i.e., catchment processes) to discharge generation. Here we adopt a different perspective and explore the contribution of external forcing (i.e., changes in precipitation and temperature) to changes in discharge. We argue that in a robust hydrological model, discharge variability should be induced by changes in the boundary conditions, and not by changes in parameter values. In this study, we explore how well the conceptual hydrological model HBV captures transient changes in hydrological signatures over the period 1970-2009. Our analysis focuses on research catchments in Switzerland undisturbed by human activities. The precipitation and temperature forcing are extracted from recently released 2 km gridded data sets. We use a genetic algorithm to calibrate HBV for the whole 40-year period and for the eight successive 5-year periods to assess potential trends in parameter values. Model calibration is run multiple times to account for parameter uncertainty. We find that in alpine catchments showing a significant increase of winter discharge, this trend can be captured reasonably well with constant

  20. Data-Adaptive Bias-Reduced Doubly Robust Estimation.

    PubMed

    Vermeulen, Karel; Vansteelandt, Stijn

    2016-05-01

    Doubly robust estimators have now been proposed for a variety of target parameters in the causal inference and missing data literature. These consistently estimate the parameter of interest under a semiparametric model when one of two nuisance working models is correctly specified, regardless of which. The recently proposed bias-reduced doubly robust estimation procedure aims to partially retain this robustness in more realistic settings where both working models are misspecified. These so-called bias-reduced doubly robust estimators make use of special (finite-dimensional) nuisance parameter estimators that are designed to locally minimize the squared asymptotic bias of the doubly robust estimator in certain directions of these finite-dimensional nuisance parameters under misspecification of both parametric working models. In this article, we extend this idea to incorporate the use of data-adaptive estimators (infinite-dimensional nuisance parameters), by exploiting the bias reduction estimation principle in the direction of only one nuisance parameter. We additionally provide an asymptotic linearity theorem which gives the influence function of the proposed doubly robust estimator under correct specification of a parametric nuisance working model for the missingness mechanism/propensity score but a possibly misspecified (finite- or infinite-dimensional) outcome working model. Simulation studies confirm the desirable finite-sample performance of the proposed estimators relative to a variety of other doubly robust estimators.
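
    The doubly robust construction is easiest to see in the augmented inverse-probability-weighted (AIPW) estimator of a mean with missing outcomes, which combines an outcome working model with a propensity working model. The toy data and models below are assumptions; they illustrate that a correct outcome model rescues a badly misspecified propensity model, and vice versa.

```python
def aipw_mean(data, outcome_model, propensity_model):
    """Augmented IPW (doubly robust) estimate of E[Y] under data missing at
    random. data: iterable of (x, r, y) with r = 1 if y was observed
    (y may be None when r = 0)."""
    total = 0.0
    n = 0
    for x, r, y in data:
        m = outcome_model(x)        # working model for E[Y | X = x]
        p = propensity_model(x)     # working model for P(R = 1 | X = x)
        total += m + (r / p) * ((y if r else 0.0) - m)
        n += 1
    return total / n
```

    If Y = 2X exactly and the outcome model is correct, the estimate equals the true mean even with a constant, badly misspecified propensity; the bias-reduced estimators discussed above target the realistic case where both working models are somewhat wrong.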

  1. Optimization-Based Robust Nonlinear Control

    DTIC Science & Technology

    2006-08-01

    ABSTRACT New control algorithms were developed for robust stabilization of nonlinear dynamical systems. Novel, linear matrix inequality-based synthesis...was to further advance optimization-based robust nonlinear control design, for general nonlinear systems (especially in discrete time), for linear...Teel, IEEE Transactions on Control Systems Technology, vol. 14, no. 3, p. 398-407, May 2006. 3. "A unified framework for input-to-state stability in

  2. Methodological systematic review identifies major limitations in prioritization processes for updating.

    PubMed

    Martínez García, Laura; Pardo-Hernandez, Hector; Superchi, Cecilia; Niño de Guzman, Ena; Ballesteros, Monica; Ibargoyen Roteta, Nora; McFarlane, Emma; Posso, Margarita; Roqué I Figuls, Marta; Rotaeche Del Campo, Rafael; Sanabria, Andrea Juliana; Selva, Anna; Solà, Ivan; Vernooij, Robin W M; Alonso-Coello, Pablo

    2017-06-01

    The aim of the study was to identify and describe strategies to prioritize the updating of systematic reviews (SRs), health technology assessments (HTAs), or clinical guidelines (CGs). We conducted an SR of studies describing one or more methods to prioritize SRs, HTAs, or CGs for updating. We searched MEDLINE (PubMed, from 1966 to August 2016) and The Cochrane Methodology Register (The Cochrane Library, Issue 8, 2016). We hand searched abstract books, reviewed reference lists, and contacted experts. Two reviewers independently screened the references and extracted data. We included 14 studies. Six studies were classified as descriptive (6 of 14, 42.9%) and eight as implementation studies (8 of 14, 57.1%). Six studies reported an updating strategy (6 of 14, 42.9%), six a prioritization process (6 of 14, 42.9%), and two a prioritization criterion (2 of 14, 14.3%). Eight studies focused on SRs (8 of 14, 57.1%), six studies focused on CGs (6 of 14, 42.9%), and none were about HTAs. We identified 76 prioritization criteria that can be applied when prioritizing documents for updating. The most frequently cited criteria were as follows: available evidence (19 of 76, 25.0%), clinical relevance (10 of 76, 13.2%), and users' interest (10 of 76, 13.2%). There is wide variability and suboptimal reporting of the methods used to develop and implement processes to prioritize updating of SRs, HTAs, and CGs. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Practical, Real-Time, and Robust Watermarking on the Spatial Domain for High-Definition Video Contents

    NASA Astrophysics Data System (ADS)

    Kim, Kyung-Su; Lee, Hae-Yeoun; Im, Dong-Hyuck; Lee, Heung-Kyu

    Commercial markets employ digital rights management (DRM) systems to protect valuable high-definition (HD) quality videos. DRM systems use watermarking to provide copyright protection and ownership authentication of multimedia contents. We propose a real-time video watermarking scheme for HD video in the uncompressed domain. In particular, our approach takes a practical perspective to satisfy perceptual quality, real-time processing, and robustness requirements. We simplify and optimize a human visual system mask for real-time performance and also apply a dithering technique for invisibility. Extensive experiments are performed to show that the proposed scheme satisfies the invisibility, real-time processing, and robustness requirements against video processing attacks. We concentrate on video processing attacks that commonly occur when HD quality videos are displayed on portable devices. These attacks include not only scaling and low bit-rate encoding, but also malicious attacks such as format conversion and frame rate change.

  4. Robust graphene membranes in a silicon carbide frame.

    PubMed

    Waldmann, Daniel; Butz, Benjamin; Bauer, Sebastian; Englert, Jan M; Jobst, Johannes; Ullmann, Konrad; Fromm, Felix; Ammon, Maximilian; Enzelberger, Michael; Hirsch, Andreas; Maier, Sabine; Schmuki, Patrik; Seyller, Thomas; Spiecker, Erdmann; Weber, Heiko B

    2013-05-28

    We present a fabrication process for freely suspended membranes consisting of bi- and trilayer graphene grown on silicon carbide. The procedure, involving photoelectrochemical etching, enables the simultaneous fabrication of hundreds of arbitrarily shaped membranes with an area up to 500 μm² and a yield of around 90%. Micro-Raman and atomic force microscopy measurements confirm that the graphene layer withstands the electrochemical etching and show that the membranes are virtually unstrained. The process delivers membranes with a cleanliness suited for high-resolution transmission electron microscopy (HRTEM) at the atomic scale. The membrane and its frame are very robust with respect to thermal cycling above 1000 °C as well as harsh acidic or alkaline treatment.

  5. Process description language: an experiment in robust programming for manufacturing systems

    NASA Astrophysics Data System (ADS)

    Spooner, Natalie R.; Creak, G. Alan

    1998-10-01

    Maintaining stable, robust, and consistent software is difficult in the face of the increasing rate of change of customers' preferences, materials, manufacturing techniques, computer equipment, and other characteristic features of manufacturing systems. It is argued that software is commonly difficult to keep up to date because many of the implications of these changing features for software details are obscure. A possible solution is to use a software generation system in which the transformation of system properties into system software is made explicit. The proposed generation system stores the system properties, such as machine properties, product properties, and information on manufacturing techniques, in databases. As a result, this information, on which system control is based, can also be made available to other programs. In particular, artificial intelligence programs, such as fault diagnosis programs, can benefit from using the same information as the control system, rather than a separate database which must be developed and maintained separately to ensure consistency. Experience in developing a simplified model of such a system is presented.

  6. 2-DE analysis indicates that Acinetobacter baumannii displays a robust and versatile metabolism

    PubMed Central

    Soares, Nelson C; Cabral, Maria P; Parreira, José R; Gayoso, Carmen; Barba, Maria J; Bou, Germán

    2009-01-01

    Background Acinetobacter baumannii is a nosocomial pathogen that has been associated with outbreak infections in hospitals. Despite increasing awareness about this bacterium, its proteome remains poorly characterised; recently, however, the complete genome of the A. baumannii reference strain ATCC 17978 has been sequenced. Here, we have used a 2-DE and MALDI-TOF/TOF approach to characterise the proteome of this strain. Results The membrane and cytoplasmic protein extracts were analysed separately; these analyses revealed the reproducible presence of 239 membrane and 511 cytoplasmic protein spots. MALDI-TOF/TOF characterisation identified a total of 192 protein spots (37 membrane and 155 cytoplasmic) and revealed that the identified membrane proteins were mainly transport-related proteins, whereas the cytoplasmic proteins were of diverse nature, although mainly related to metabolic processes. Conclusion This work indicates that A. baumannii has a versatile and robust metabolism and also reveals a number of proteins that may play a key role in the mechanisms of drug resistance and virulence. The data obtained complement earlier reports of the A. baumannii proteome and provide new tools to increase our knowledge of the protein expression profile of this pathogen. PMID:19785748

  7. Multi-criteria robustness analysis of metro networks

    NASA Astrophysics Data System (ADS)

    Wang, Xiangrong; Koç, Yakup; Derrible, Sybil; Ahmad, Sk Nasir; Pino, Willem J. A.; Kooij, Robert E.

    2017-05-01

    Metros (heavy rail transit systems) are integral parts of urban transportation systems. Failures in their operations can have serious impacts on urban mobility, and measuring their robustness is therefore critical. Moreover, as physical networks, metros can be viewed as topological entities, and as such they possess measurable network properties. In this article, by using network science and graph theory, we investigate ten theoretical and four numerical robustness metrics and their performance in quantifying the robustness of 33 metro networks under random failures or targeted attacks. We find that the ten theoretical metrics capture two distinct aspects of robustness of metro networks. First, several metrics place an emphasis on alternative paths. Second, other metrics place an emphasis on the length of the paths. To account for all aspects, we standardize all ten indicators and plot them on radar diagrams to assess the overall robustness for metro networks. Overall, we find that Tokyo and Rome are the most robust networks. Rome benefits from short transferring and Tokyo has a significant number of transfer stations, both in the city center and in the peripheral area of the city, promoting both a higher number of alternative paths and overall relatively short path-lengths.
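Two of the topological ideas above, path length and degradation under a targeted attack, can be sketched on a toy network. The five-station graph and the single metric below are illustrative assumptions, not the article's 33 metro networks or its ten indicators.

```python
from collections import deque

def shortest_paths(adj, src):
    """BFS distances from src over an undirected adjacency dict."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def avg_path_length(adj):
    """Mean shortest-path length over all reachable ordered pairs."""
    nodes = list(adj)
    total, pairs = 0, 0
    for s in nodes:
        d = shortest_paths(adj, s)
        for t in nodes:
            if t != s and t in d:
                total += d[t]
                pairs += 1
    return total / pairs

def remove_highest_degree(adj):
    """Targeted attack: delete the best-connected station."""
    hub = max(adj, key=lambda n: len(adj[n]))
    return {u: [v for v in vs if v != hub]
            for u, vs in adj.items() if u != hub}

# A small ring-with-hub "metro": hub H connects to all ring stations.
ring = ["A", "B", "C", "D"]
adj = {n: [ring[(i - 1) % 4], ring[(i + 1) % 4], "H"]
       for i, n in enumerate(ring)}
adj["H"] = ring[:]

print(avg_path_length(adj))                         # short paths via the hub
print(avg_path_length(remove_highest_degree(adj)))  # longer after the attack
```

Removing the hub lengthens the average path, which is the intuition behind metrics that reward alternative paths: the ring still connects every pair, so the network degrades gracefully rather than fragmenting.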

  8. Addressing Antibiotic Resistance Requires Robust International Accountability Mechanisms.

    PubMed

    Hoffman, Steven J; Ottersen, Trygve

    2015-01-01

    A proposed international agreement on antibiotic resistance will depend on robust accountability mechanisms for real-world impact. This article examines the central aspects of accountability relationships in international agreements and lays out ways to strengthen them. We provide a menu of accountability mechanisms that facilitate transparency, oversight, complaint, and enforcement, describe how these mechanisms can promote compliance, and identify key considerations for a proposed international agreement on antibiotic resistance. These insights can be useful for bringing about the revolutionary changes that new international agreements aspire to achieve. © 2015 American Society of Law, Medicine & Ethics, Inc.

  9. TARCMO: Theory and Algorithms for Robust, Combinatorial, Multicriteria Optimization

    DTIC Science & Technology

    2016-11-28

    objective 9 4.6 On The Recoverable Robust Traveling Salesman Problem . . . . . 11 4.7 A Bicriteria Approach to Robust Optimization...be found. 4.6 On The Recoverable Robust Traveling Salesman Problem The traveling salesman problem (TSP) is a well-known combinatorial optimization...procedure for the robust traveling salesman problem. While this iterative algorithm results in an optimal solution to the robust TSP, computation

  10. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    NASA Astrophysics Data System (ADS)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention to networked control, system decomposition and distributed models show significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by an affinity propagation clustering algorithm. Each cluster can be regarded as a subsystem. Then the inputs of each subsystem are selected by offline canonical correlation analysis between all process variables and its controlled variables. Process decomposition is then realised after the screening of input and output variables. When the system decomposition is finished, the online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
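The input-screening step can be caricatured with a simple pairwise Pearson correlation screen. The paper uses canonical correlation analysis, so the screen below is a deliberate simplification; the threshold, synthetic signals, and variable names are assumptions for illustration only.

```python
import math
import random

random.seed(2)

def corr(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Candidate process inputs: u1 and u2 actually drive the controlled
# variable y, while u3 is unrelated (all signals are synthetic).
n = 500
u1 = [random.gauss(0, 1) for _ in range(n)]
u2 = [random.gauss(0, 1) for _ in range(n)]
u3 = [random.gauss(0, 1) for _ in range(n)]
y = [2 * a - b + random.gauss(0, 0.1) for a, b in zip(u1, u2)]

# Keep only inputs whose correlation with y clears a threshold.
inputs = {"u1": u1, "u2": u2, "u3": u3}
selected = [name for name, u in inputs.items() if abs(corr(u, y)) > 0.3]
print(selected)
```

The screen keeps u1 and u2 and drops u3; canonical correlation analysis generalizes this idea to whole blocks of inputs and outputs per subsystem rather than one pair at a time.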

  11. Robust lane detection and tracking using multiple visual cues under stochastic lane shape conditions

    NASA Astrophysics Data System (ADS)

    Huang, Zhi; Fan, Baozheng; Song, Xiaolin

    2018-03-01

    As one of the essential components of environment perception techniques for an intelligent vehicle, lane detection is confronted with challenges including robustness against complicated disturbances and illumination, as well as adaptability to stochastic lane shapes. To overcome these issues, we propose a robust lane detection method that applies a classification-generation-growth-based (CGG) operator to the detected lines, whereby the linear lane markings are identified by synergizing multiple visual cues with a priori knowledge and spatial-temporal information. According to the quality of the linear lane fitting, the linear and linear-parabolic models are dynamically switched to describe the actual lane. A Kalman filter with adaptive noise covariance and region of interest (ROI) tracking are applied to improve robustness and efficiency. Experiments were conducted with images covering various challenging scenarios. The experimental results demonstrate the effectiveness of the presented method under complicated disturbances, illumination changes, and stochastic lane shapes.

  12. Using Multi-Objective Optimization to Explore Robust Policies in the Colorado River Basin

    NASA Astrophysics Data System (ADS)

    Alexander, E.; Kasprzyk, J. R.; Zagona, E. A.; Prairie, J. R.; Jerla, C.; Butler, A.

    2017-12-01

    The long term reliability of water deliveries in the Colorado River Basin has degraded due to the imbalance of growing demand and dwindling supply. The Colorado River meanders 1,450 miles across a watershed that covers seven US states and Mexico and is an important cultural, economic, and natural resource for nearly 40 million people. Its complex operating policy is based on the "Law of the River," which has evolved since the Colorado River Compact in 1922. Recent (2007) refinements to address shortage reductions and coordinated operations of Lakes Powell and Mead were negotiated with stakeholders in which thousands of scenarios were explored to identify operating guidelines that could ultimately be agreed on. This study explores a different approach to searching for robust operating policies to inform the policy making process. The Colorado River Simulation System (CRSS), a long-term water management simulation model implemented in RiverWare, is combined with the Borg multi-objective evolutionary algorithm (MOEA) to solve an eight objective problem formulation. Basin-wide performance metrics are closely tied to system health through incorporating critical reservoir pool elevations, duration, frequency and quantity of shortage reductions in the objective set. For example, an objective to minimize the frequency that Lake Powell falls below the minimum power pool elevation of 3,490 feet for Glen Canyon Dam protects a vital economic and renewable energy source for the southwestern US. The decision variables correspond to operating tiers in Lakes Powell and Mead that drive the implementation of various shortage and release policies, thus affecting system performance. The result will be a set of non-dominated solutions that can be compared with respect to their trade-offs based on the various objectives. These could inform policy making processes by eliminating dominated solutions and revealing robust solutions that could remain hidden under conventional analysis.

  13. Histomorphometry and cortical robusticity of the adult human femur.

    PubMed

    Miszkiewicz, Justyna Jolanta; Mahoney, Patrick

    2018-01-13

    Recent quantitative analyses of human bone microanatomy, as well as theoretical models that propose bone microstructure and gross anatomical associations, have started to reveal insights into biological links that may facilitate remodeling processes. However, relationships between bone size and the underlying cortical bone histology remain largely unexplored. The goal of this study is to determine the extent to which static indicators of bone remodeling and vascularity, measured using histomorphometric techniques, relate to femoral midshaft cortical width and robusticity. Using previously published and new quantitative data from 450 adult human male (n = 233) and female (n = 217) femora, we determine if these aspects of femoral size relate to bone microanatomy. Scaling relationships are explored and interpreted within the context of tissue form and function. Analyses revealed that the area and diameter of Haversian canals and secondary osteons, and densities of secondary osteons and osteocyte lacunae from the sub-periosteal region of the posterior midshaft femur cortex were significantly, but not consistently, associated with femoral size. Cortical width and bone robusticity were correlated with osteocyte lacunae density and scaled with positive allometry. Diameter and area of osteons and Haversian canals decreased as the width of cortex and bone robusticity increased, revealing a negative allometric relationship. These results indicate that microscopic products of cortical bone remodeling and vascularity are linked to femur size. Allometric relationships between more robust human femora with thicker cortical bone and histological products of bone remodeling correspond with principles of bone functional adaptation. Future studies may benefit from exploring scaling relationships between bone histomorphometric data and measurements of bone macrostructure.

  14. Replication and robustness in developmental research.

    PubMed

    Duncan, Greg J; Engel, Mimi; Claessens, Amy; Dowsett, Chantelle J

    2014-11-01

    Replications and robustness checks are key elements of the scientific method and a staple in many disciplines. However, leading journals in developmental psychology rarely include explicit replications of prior research conducted by different investigators, and few require authors to establish in their articles or online appendices that their key results are robust across estimation methods, data sets, and demographic subgroups. This article makes the case for prioritizing both explicit replications and, especially, within-study robustness checks in developmental psychology. It provides evidence on variation in effect sizes in developmental studies and documents strikingly different replication and robustness-checking practices in a sample of journals in developmental psychology and a sister behavioral science, applied economics. Our goal is not to show that any one behavioral science has a monopoly on best practices, but rather to show how journals from a related discipline address vital concerns of replication and generalizability shared by all social and behavioral sciences. We provide recommendations for promoting graduate training in replication and robustness-checking methods and for editorial policies that encourage these practices. Although some of our recommendations may shift the form and substance of developmental research articles, we argue that they would generate considerable scientific benefits for the field. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  15. An index-based robust decision making framework for watershed management in a changing climate.

    PubMed

    Kim, Yeonjoo; Chung, Eun-Sung

    2014-03-01

    This study developed an index-based robust decision making framework for watershed management dealing with water quantity and quality issues in a changing climate. It consists of two parts: management alternative development and alternative analysis. The first part, alternative development, consists of six steps: 1) to understand the watershed components and processes using the HSPF model, 2) to identify the spatial vulnerability ranking using two indices: potential streamflow depletion (PSD) and potential water quality deterioration (PWQD), 3) to quantify the residents' preferences on water management demands and calculate the watershed evaluation index, which is the weighted combination of PSD and PWQD, 4) to set the quantitative targets for water quantity and quality, 5) to develop a list of feasible alternatives and 6) to eliminate the unacceptable alternatives. The second part, alternative analysis, has three steps: 7) to analyze all selected alternatives with a hydrologic simulation model considering various climate change scenarios, 8) to quantify the alternative evaluation index, including social and hydrologic criteria, utilizing multi-criteria decision analysis methods and 9) to prioritize all options based on a minimax regret strategy for robust decision making. This framework addresses the uncertainty inherent in climate models and climate change scenarios by utilizing the minimax regret strategy, a decision making strategy under deep uncertainty; the procedure thus derives a robust prioritization based on the multiple utilities of alternatives from various scenarios. In this study, the proposed procedure was applied to a Korean urban watershed, which has suffered from streamflow depletion and water quality deterioration. Our application shows that the framework provides a useful watershed management tool for incorporating quantitative and qualitative information into the evaluation of various policies with regard to water resource planning and management.
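The minimax regret step (step 9) can be sketched in a few lines: for each climate scenario, an alternative's regret is its shortfall from the best alternative in that scenario, and alternatives are ranked by their worst-case regret. The utility numbers and alternative names below are invented for illustration, not taken from the study.

```python
# Utility of each management alternative under each climate scenario
# (illustrative numbers, not from the study).
utilities = {
    "alt_A": [0.70, 0.40, 0.60],
    "alt_B": [0.65, 0.55, 0.50],
    "alt_C": [0.90, 0.30, 0.45],
}

n_scen = 3
# Best achievable utility in each scenario, across all alternatives.
best = [max(u[s] for u in utilities.values()) for s in range(n_scen)]

# Regret of an alternative in a scenario = shortfall from that
# scenario's best alternative; rank by the worst-case (max) regret.
max_regret = {alt: max(best[s] - u[s] for s in range(n_scen))
              for alt, u in utilities.items()}

ranking = sorted(max_regret, key=max_regret.get)
print(ranking)  # most robust alternative first
```

Note that alt_C is best in scenario 1 but ranks behind alt_A overall: minimax regret deliberately penalizes alternatives that perform very poorly in any single scenario, which is what makes the resulting prioritization robust under deep uncertainty.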

  16. Robust digital image watermarking using distortion-compensated dither modulation

    NASA Astrophysics Data System (ADS)

    Li, Mianjie; Yuan, Xiaochen

    2018-04-01

    In this paper, we propose a robust feature extraction based digital image watermarking method using Distortion-Compensated Dither Modulation (DC-DM). Our proposed local watermarking method provides stronger robustness and better flexibility than traditional global watermarking methods. We improve robustness by introducing feature extraction and the DC-DM method. To extract the robust feature points, we propose a DAISY-based Robust Feature Extraction (DRFE) method that employs the DAISY descriptor and applies entropy-calculation-based filtering. The experimental results show that the proposed method achieves satisfactory robustness while ensuring watermark imperceptibility, compared to other existing methods.
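The DC-DM embedding named in the abstract can be sketched on a single sample: each bit selects a dithered quantization lattice, and the host value is moved only a fraction alpha of the way toward its quantized point (the distortion compensation). The step size and alpha below are illustrative parameters, not the paper's settings.

```python
def qim_quantize(x, delta, dither):
    """Quantize x to the shifted lattice delta*Z + dither."""
    return delta * round((x - dither) / delta) + dither

def dcdm_embed(x, bit, delta=8.0, alpha=0.75):
    """DC-DM: move x a fraction alpha toward the bit's quantizer,
    leaving a compensating residual (1 - alpha) of the error."""
    dither = 0.0 if bit == 0 else delta / 2.0
    q = qim_quantize(x, delta, dither)
    return x + alpha * (q - x)

def dm_decode(y, delta=8.0):
    """Minimum-distance decoding against the two dither lattices."""
    d0 = abs(y - qim_quantize(y, delta, 0.0))
    d1 = abs(y - qim_quantize(y, delta, delta / 2.0))
    return 0 if d0 <= d1 else 1

# Embed a bit into a pixel-like value and decode it back.
y = dcdm_embed(101.3, 1)
print(dm_decode(y))  # -> 1
```

With alpha < 1 the watermarked value never lands exactly on the lattice, trading a small decoding margin for lower embedding distortion; the decoder still recovers the bit as long as channel noise stays below roughly half the inter-lattice distance.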

  17. Representation, Classification and Information Fusion for Robust and Efficient Multimodal Human States Recognition

    ERIC Educational Resources Information Center

    Li, Ming

    2013-01-01

    The goal of this work is to enhance the robustness and efficiency of the multimodal human states recognition task. Human states recognition can be considered as a joint term for identifying/verifying various kinds of human related states, such as biometric identity, language spoken, age, gender, emotion, intoxication level, physical activity, vocal…

  18. Robust control algorithms for Mars aerobraking

    NASA Technical Reports Server (NTRS)

    Shipley, Buford W., Jr.; Ward, Donald T.

    1992-01-01

    Four atmospheric guidance concepts have been adapted to control an interplanetary vehicle aerobraking in the Martian atmosphere. The first two offer improvements to the Analytic Predictor Corrector (APC) to increase its robustness to density variations. The second two are variations of a new Liapunov tracking exit phase algorithm, developed to guide the vehicle along a reference trajectory. These four new controllers are tested using a six-degree-of-freedom computer simulation to evaluate their robustness. MARSGRAM is used to develop realistic atmospheres for the study. When square wave density pulses perturb the atmosphere, all four controllers are successful. The algorithms are tested against atmospheres where the inbound and outbound density functions are different. Square wave density pulses are again used, but only for the outbound leg of the trajectory. Additionally, sine waves are used to perturb the density function. The new algorithms are found to be more robust than any previously tested, and a Liapunov controller is selected as the most robust of all the control algorithms examined.

  19. The Influence of Assortativity on the Robustness of Signal-Integration Logic in Gene Regulatory Networks

    PubMed Central

    Pechenick, Dov A.; Payne, Joshua L.; Moore, Jason H.

    2011-01-01

    Gene regulatory networks (GRNs) drive the cellular processes that sustain life. To do so reliably, GRNs must be robust to perturbations, such as gene deletion and the addition or removal of regulatory interactions. GRNs must also be robust to genetic changes in regulatory regions that define the logic of signal-integration, as these changes can affect how specific combinations of regulatory signals are mapped to particular gene expression states. Previous theoretical analyses have demonstrated that the robustness of a GRN is influenced by its underlying topological properties, such as degree distribution and modularity. Another important topological property is assortativity, which measures the propensity with which nodes of similar connectivity are connected to one another. How assortativity influences the robustness of the signal-integration logic of GRNs remains an open question. Here, we use computational models of GRNs to investigate this relationship. We separately consider each of the three dynamical regimes of this model for a variety of degree distributions. We find that in the chaotic regime, robustness exhibits a pronounced increase as assortativity becomes more positive, while in the critical and ordered regimes, robustness is generally less sensitive to changes in assortativity. We attribute the increased robustness to a decrease in the duration of the gene expression pattern, which is caused by a reduction in the average size of a GRN’s in-components. This study provides the first direct evidence that assortativity influences the robustness of the signal-integration logic of computational models of GRNs, illuminates a mechanistic explanation for this influence, and furthers our understanding of the relationship between topology and robustness in complex biological systems. PMID:22155134
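Degree assortativity, the network property at the center of this study, is simply the Pearson correlation of the degrees at the two endpoints of each edge. Below is a minimal plain-Python computation on an illustrative graph (the star graph is a standard maximally disassortative example, not a GRN from the article).

```python
from math import sqrt

def degree_assortativity(edges):
    """Pearson correlation of endpoint degrees over undirected edges."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    # Each undirected edge contributes both (du, dv) and (dv, du),
    # so the correlation is symmetric in the two endpoints.
    pairs = [(deg[u], deg[v]) for u, v in edges]
    pairs += [(b, a) for a, b in pairs]
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    cov = sum((a - mx) * (b - my) for a, b in pairs) / n
    sx = sqrt(sum((a - mx) ** 2 for a, _ in pairs) / n)
    sy = sqrt(sum((b - my) ** 2 for _, b in pairs) / n)
    return cov / (sx * sy)

# A star graph is maximally disassortative: the hub (degree 4)
# attaches only to leaves (degree 1).
star = [("hub", leaf) for leaf in ["a", "b", "c", "d"]]
print(degree_assortativity(star))  # -> -1.0
```

Positive values mean hubs tend to link to other hubs; the article's finding is that pushing this coefficient upward increases robustness in the chaotic dynamical regime of GRN models.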

  20. Robust reinforcement learning.

    PubMed

    Morimoto, Jun; Doya, Kenji

    2005-02-01

    This letter proposes a new reinforcement learning (RL) paradigm that explicitly takes into account input disturbance as well as modeling errors. The use of environmental models in RL is quite popular for both offline learning using simulations and for online action planning. However, the difference between the model and the real environment can lead to unpredictable, and often unwanted, results. Based on the theory of H∞ control, we consider a differential game in which a "disturbing" agent tries to make the worst possible disturbance while a "control" agent tries to make the best control input. The problem is formulated as finding a min-max solution of a value function that takes into account the amount of the reward and the norm of the disturbance. We derive online learning algorithms for estimating the value function and for calculating the worst disturbance and the best control in reference to the value function. We tested the paradigm, which we call robust reinforcement learning (RRL), on the control task of an inverted pendulum. In the linear domain, the policy and the value function learned by online algorithms coincided with those derived analytically by linear H∞ control theory. For a fully nonlinear swing-up task, RRL achieved robust performance with changes in the pendulum weight and friction, while a standard reinforcement learning algorithm could not deal with these changes. We also applied RRL to the cart-pole swing-up task, and a robust swing-up policy was acquired.
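The min-max principle behind RRL can be reduced to a one-shot caricature: instead of maximizing reward under the nominal (zero) disturbance, the robust controller maximizes its worst-case reward over all disturbances. The payoff table below is invented for illustration and omits the value-function learning and differential-game structure of the letter.

```python
# Reward table R[a][d]: control action a (rows) vs disturbance d (cols),
# with d = 0 (column 0) the nominal, disturbance-free case.
# Illustrative numbers: action 0 is high-gain (great nominally, fragile),
# action 1 is conservative (slightly worse nominally, degrades gracefully).
R = [
    [10.0, 2.0, -8.0],   # aggressive policy
    [6.0, 5.0, 4.0],     # conservative policy
]

# Nominal design: optimize assuming no disturbance (column 0 only).
nominal = max(range(2), key=lambda a: R[a][0])

# Robust (min-max) design: optimize the worst case over disturbances.
robust = max(range(2), key=lambda a: min(R[a]))

print(nominal, robust)  # -> 0 1
```

The two criteria pick different actions: the nominal design chooses the fragile policy that RRL is built to avoid, while the max-min choice guarantees a reward of at least 4 whatever the disturbing agent does.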

  1. A robust signal processing method for quantitative high-cycle fatigue crack monitoring using soft elastomeric capacitor sensors

    NASA Astrophysics Data System (ADS)

    Kong, Xiangxiong; Li, Jian; Collins, William; Bennett, Caroline; Laflamme, Simon; Jo, Hongki

    2017-04-01

    A large-area electronics (LAE) strain sensor, termed soft elastomeric capacitor (SEC), has shown great promise in fatigue crack monitoring. The SEC is able to monitor strain changes over a mesoscale structural surface and endure large deformations without being damaged under cracking. Previous tests verified that the SEC is able to detect, localize, and monitor fatigue crack activities under low-cycle fatigue loading. In this paper, to examine the SEC's capability of monitoring high-cycle fatigue cracks, a compact specimen is tested under cyclic tension, designed to ensure realistic crack opening sizes representative of those in real steel bridges. To overcome the difficulty of low signal amplitude and relatively high noise level under high-cycle fatigue loading, a robust signal processing method is proposed to convert the measured capacitance time history from the SEC sensor to power spectral densities (PSD) in the frequency domain, such that the signal's peak-to-peak amplitude can be extracted at the dominant loading frequency. A crack damage indicator is proposed as the ratio between the square root of the amplitude of PSD and the load range. Results show that the crack damage indicator offers a consistent indication of crack growth.
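The processing chain described above can be sketched with stdlib Python: a small sinusoid at the loading frequency buried in noise, a periodogram estimate of the PSD at that frequency, and the proposed ratio-style indicator. All amplitudes, frequencies, and the load range are illustrative assumptions, not the paper's measurements.

```python
import math
import random

random.seed(1)

# Synthetic capacitance signal: a small sinusoid at the fatigue loading
# frequency buried in measurement noise (amplitudes are illustrative).
fs, f_load, n = 100.0, 2.0, 1000
signal = [0.05 * math.sin(2 * math.pi * f_load * t / fs)
          + random.gauss(0, 0.05) for t in range(n)]

def psd_at(signal, fs, f):
    """Periodogram estimate of the power spectral density at frequency f:
    |DFT coefficient|^2 normalized by n * fs."""
    re = sum(x * math.cos(2 * math.pi * f * t / fs)
             for t, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * f * t / fs)
             for t, x in enumerate(signal))
    m = len(signal)
    return (re * re + im * im) / (m * fs)

# Damage indicator: square root of the PSD amplitude at the loading
# frequency, divided by the load range (the ratio from the abstract).
load_range = 10.0  # kN, illustrative
indicator = math.sqrt(psd_at(signal, fs, f_load)) / load_range

print(round(indicator, 4))
```

Reading the amplitude at the known loading frequency is what makes the scheme robust: broadband noise spreads across all frequency bins, while the crack-driven response concentrates in the single bin being measured.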

  2. Adaptive Critic Nonlinear Robust Control: A Survey.

    PubMed

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    Adaptive dynamic programming (ADP) and reinforcement learning are quite relevant to each other when performing intelligent optimization. They are both regarded as promising methods involving important components of evaluation and improvement, against the background of information technology, such as artificial intelligence, big data, and deep learning. Although great progress has been achieved and surveyed in addressing nonlinear optimal control problems, research on the robustness of ADP-based control strategies under uncertain environments has not been fully summarized. Hence, this survey reviews the recent main results of adaptive-critic-based robust control design of continuous-time nonlinear systems. The ADP-based nonlinear optimal regulation is reviewed, followed by robust stabilization of nonlinear systems with matched uncertainties, guaranteed cost control design of unmatched plants, and decentralized stabilization of interconnected systems. Additionally, further comprehensive discussions are presented, including event-based robust control design, improvement of the critic learning rule, nonlinear H∞ control design, and several notes on future perspectives. By applying the ADP-based optimal and robust control methods to a practical power system and an overhead crane plant, two typical examples are provided to verify the effectiveness of the theoretical results. Overall, this survey is beneficial for promoting the development of adaptive critic control methods with robustness guarantees and the construction of higher-level intelligent systems.

  3. Fabrication and Characterization of Hybrid Organic-Inorganic Electron Extraction Layers for Polymer Solar Cells toward Improved Processing Robustness and Air Stability.

    PubMed

    Fredj, Donia; Pourcin, Florent; Alkarsifi, Riva; Kilinc, Volkan; Liu, Xianjie; Ben Dkhil, Sadok; Boudjada, Nassira Chniba; Fahlman, Mats; Videlot-Ackermann, Christine; Margeat, Olivier; Ackermann, Jörg; Boujelbene, Mohamed

    2018-05-23

    Organic-inorganic hybrid materials composed of bismuth and diaminopyridine are studied as novel materials for electron extraction layers in polymer solar cells using regular device structures. The hybrid materials are solution processed on top of two different low band gap polymers (PTB7 or PTB7-Th) as donor materials mixed with the fullerene PC70BM as the acceptor. The intercalation of the hybrid layer between the photoactive layer and the aluminum cathode leads to solar cells with a power conversion efficiency of 7.8% because of significant improvements in all photovoltaic parameters, that is, short-circuit current density, fill factor, and open-circuit voltage, similar to the reference devices using ZnO as the interfacial layer. However, when thick layers of such hybrid materials are used for electron extraction, only small losses in photocurrent density are observed, in contrast to the pronounced losses of the ZnO reference caused by optical spacer effects. Importantly, these hybrid electron extraction layers also strongly improve the device stability in air compared with solar cells processed with ZnO interlayers. Both results underline the high potential of this new class of hybrid materials as electron extraction materials toward robust processing of air-stable organic solar cells.

  4. Robust Planning for Effects-Based Operations

    DTIC Science & Technology

    2006-06-01

    Table-of-contents fragments recovered from the report: 2.6 Robust Optimization Literature (2.6.1 Protecting Against …); 3.1.4 Model Formulation; 3.1.5 Deterministic EBO Model Example and Performance; 3.1.6 Greedy Algorithm; 4.1.9 Conclusions on Robust EBO Model Performance; 4.2 Greedy Algorithm versus EBO Models.

  5. Low-Power Photoplethysmogram Acquisition Integrated Circuit with Robust Light Interference Compensation.

    PubMed

    Kim, Jongpal; Kim, Jihoon; Ko, Hyoungho

    2015-12-31

    To overcome light interference, including a large DC offset and ambient light variation, a robust photoplethysmogram (PPG) readout chip is fabricated using a 0.13-μm complementary metal-oxide-semiconductor (CMOS) process. Against the large DC offset, a saturation detection and current feedback circuit is proposed to compensate for an offset current of up to 30 μA. For robustness against optical path variation, an automatic emitted light compensation method is adopted. To prevent ambient light interference, an alternating sampling and charge redistribution technique is also proposed. In the proposed technique, no additional power is consumed, and only three differential switches and one capacitor are required. The PPG readout channel consumes 26.4 μW and has an input referred current noise of 260 pArms.

  6. How MAP kinase modules function as robust, yet adaptable, circuits.

    PubMed

    Tian, Tianhai; Harding, Angus

    2014-01-01

    Genetic and biochemical studies have revealed that the diversity of cell types and developmental patterns evident within the animal kingdom is generated by a handful of conserved, core modules. Core biological modules must be robust, able to maintain functionality despite perturbations, and yet sufficiently adaptable for random mutations to generate phenotypic variation during evolution. Understanding how robust, adaptable modules have influenced the evolution of eukaryotes will inform both evolutionary and synthetic biology. One such system is the MAP kinase module, which consists of a 3-tiered kinase circuit configuration that has been evolutionarily conserved from yeast to man. MAP kinase signal transduction pathways are used across eukaryotic phyla to drive biological functions that are crucial for life. Here we ask the fundamental question: why do MAPK modules follow a conserved 3-tiered topology rather than some other number of tiers? Using computational simulations, we identify a fundamental 2-tiered circuit topology that can be readily reconfigured by feedback loops and scaffolds to generate diverse signal outputs. When this 2-kinase circuit is connected to proximal input kinases, a 3-tiered modular configuration is created that is both robust and adaptable, providing a biological circuit that can regulate multiple phenotypes and maintain functionality in an uncertain world. We propose that the 3-tiered signal transduction module has been conserved through positive selection, because it facilitated the generation of phenotypic variation during eukaryotic evolution.
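
    The tiered-cascade behavior described above can be illustrated with a toy simulation. The sketch below (plain Python with illustrative rate constants, not the authors' model) integrates an n-tier cascade in which each tier's active fraction is driven by the tier above; the cascade propagates and amplifies a weak input while every tier's activity stays bounded between 0 and 1.

```python
def simulate_cascade(stimulus, tiers=3, k_act=1.0, k_deact=0.3,
                     dt=0.01, steps=20000):
    """Toy n-tier kinase cascade: each tier's active fraction is driven
    by the active fraction of the tier above (tier 0 by the stimulus).
    Rate constants are illustrative, not fitted to any pathway."""
    active = [0.0] * tiers
    for _ in range(steps):
        drive = stimulus
        new = []
        for a in active:
            da = k_act * drive * (1.0 - a) - k_deact * a
            new.append(a + dt * da)
            drive = new[-1]          # the next tier sees this tier's output
        active = new
    return active[-1]                # output of the bottom tier
```

At steady state each tier sits at k_act*drive / (k_act*drive + k_deact), so a weak stimulus is successively amplified tier by tier while a strong one saturates, which is one reason multi-tier topologies are useful signal processors.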

  7. Emergence of robust growth laws from optimal regulation of ribosome synthesis.

    PubMed

    Scott, Matthew; Klumpp, Stefan; Mateescu, Eduard M; Hwa, Terence

    2014-08-22

    Bacteria must constantly adapt their growth to changes in nutrient availability; yet despite large-scale changes in protein expression associated with sensing, adaptation, and processing different environmental nutrients, simple growth laws connect the ribosome abundance and the growth rate. Here, we investigate the origin of these growth laws by analyzing the features of ribosomal regulation that coordinate proteome-wide expression changes with cell growth in a variety of nutrient conditions in the model organism Escherichia coli. We identify supply-driven feedforward activation of ribosomal protein synthesis as the key regulatory motif maximizing amino acid flux, and autonomously guiding a cell to achieve optimal growth in different environments. The growth laws emerge naturally from the robust regulatory strategy underlying growth rate control, irrespective of the details of the molecular implementation. The study highlights the interplay between phenomenological modeling and molecular mechanisms in uncovering fundamental operating constraints, with implications for endogenous and synthetic design of microorganisms. © 2014 The Authors. Published under the terms of the CC BY 4.0 license.

  8. Extremely Robust and Patternable Electrodes for Copy-Paper-Based Electronics.

    PubMed

    Ahn, Jaeho; Seo, Ji-Won; Lee, Tae-Ik; Kwon, Donguk; Park, Inkyu; Kim, Taek-Soo; Lee, Jung-Yong

    2016-07-27

    We propose a fabrication process for extremely robust and easily patternable silver nanowire (AgNW) electrodes on paper. Using an auxiliary donor layer and a simple laminating process, AgNWs can be easily transferred to copy paper as well as various other substrates using a dry process. Intercalating a polymeric binder between the AgNWs and the substrate through a simple printing technique enhances adhesion, not only guaranteeing high foldability of the electrodes, but also facilitating selective patterning of the AgNWs. Using the proposed process, extremely crease-tolerant electronics based on copy paper can be fabricated, such as a printed circuit board for a 7-segment display, portable heater, and capacitive touch sensor, demonstrating the applicability of the AgNWs-based electrodes to paper electronics.

  9. Sampling Technique for Robust Odorant Detection Based on MIT RealNose Data

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A.

    2012-01-01

    This technique enhances the detection capability of the autonomous RealNose system from MIT to detect odorants and their concentrations in noisy and transient environments. The low-cost, portable system with low power consumption will operate at high speed and is suited for unmanned and remotely operated long-life applications. A deterministic mathematical model was developed to detect odorants and calculate their concentration in noisy environments. Real data from MIT's NanoNose was examined, from which a signal conditioning technique was proposed to enable robust odorant detection for the RealNose system. Its sensitivity can reach the sub-part-per-billion (sub-ppb) range. A Space Invariant Independent Component Analysis (SPICA) algorithm was developed to deal with non-linear mixing in the over-complete case, and it is used as a preprocessing step to recover the original odorant sources for detection. This approach, combined with the Cascade Error Projection (CEP) Neural Network algorithm, was used to perform odorant identification. Signal conditioning is used to identify potential processing windows to enable robust detection for autonomous systems. So far, the software has been developed and evaluated with current data sets provided by the MIT team. However, continuous data streams are made available where even the occurrence of a new odorant is unannounced and needs to be noticed by the system autonomously before its unambiguous detection. The challenge for the software is to be able to separate the potentially valid signal of the odorant from the noisy transition region when the odorant is first introduced.

  10. Robust synergetic control design under inputs and states constraints

    NASA Astrophysics Data System (ADS)

    Rastegar, Saeid; Araújo, Rui; Sadati, Jalil

    2018-03-01

    In this paper, a novel robust-constrained control methodology for discrete-time linear parameter-varying (DT-LPV) systems is proposed based on a synergetic control theory (SCT) approach. It is shown that in DT-LPV systems without uncertainty, and for any unmeasured bounded additive disturbance, the proposed controller accomplishes the goal of stabilising the system by asymptotically driving the error of the controlled variable to a bounded set containing the origin and then maintaining it there. Moreover, given an uncertain DT-LPV system jointly subject to unmeasured and constrained additive disturbances, and constraints in states, input commands and reference signals (set points), invariant set theory is used to find an appropriate polyhedral robust invariant region in which the proposed control framework is guaranteed to robustly stabilise the closed-loop system. Furthermore, this is achieved even for the case of varying non-zero control set points in such uncertain DT-LPV systems. The controller is characterised by a simple structure leading to an easy implementation and a non-complex design process. The effectiveness of the proposed method and the implications of the controller design on feasibility and closed-loop performance are demonstrated through application examples on the temperature control of a continuous stirred-tank reactor plant, the control of a real coupled DC motor plant, and an open-loop unstable system example.
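
    The core synergetic idea referenced above can be sketched on a scalar discrete-time plant (the paper's DT-LPV machinery, constraints, and invariant-set analysis are omitted; the plant parameters and contraction rate rho below are illustrative): SCT defines a macro-variable psi equal to the tracking error and enforces contracting manifold dynamics psi[k+1] = rho*psi[k], which can be solved directly for the control input.

```python
def synergetic_step(x, x_ref, a, b, rho=0.6):
    """One step of a discrete-time synergetic control law for the toy
    scalar plant x[k+1] = a*x[k] + b*u[k]. Enforcing
    psi[k+1] = rho*psi[k] with psi = x - x_ref yields u in closed form."""
    psi = x - x_ref
    # require a*x + b*u - x_ref = rho*psi  =>  solve for u
    return (x_ref + rho * psi - a * x) / b

def simulate(a=1.2, b=0.5, x0=10.0, x_ref=2.0, steps=25):
    """Run the closed loop; the open-loop plant (a > 1) is unstable,
    but the error contracts geometrically by rho each step."""
    x = x0
    for _ in range(steps):
        x = a * x + b * synergetic_step(x, x_ref, a, b)
    return x
```

Because the error is forced to shrink by the factor rho every step, the state converges to the set point at a designer-chosen rate even though the uncontrolled plant diverges.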

  11. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). Conventional method for building design rules has been using a simulation tool and a simple pattern spider mask. At the early stage of the device, the estimation of simulation tool is poor. And the evaluation of the simple pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. In order to overcome the difficulties of inspecting many types of patterns, we introduced Design Based Metrology (DBM) of Nano Geometry Research, Inc. And those mass patterns could be inspected at a fast speed with DBM. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of test patterns were inspected within a few hours. Mass silicon data were handled with not personal decision but statistical processing. From the results, robust design rules are successfully verified and extracted. Finally we found out that our methodology is appropriate for building robust design rules.

  12. A Secure and Robust Object-Based Video Authentication System

    NASA Astrophysics Data System (ADS)

    He, Dajun; Sun, Qibin; Tian, Qi

    2004-12-01

    An object-based video authentication system, which combines watermarking, error correction coding (ECC), and digital signature techniques, is presented for protecting the authenticity between video objects and their associated backgrounds. In this system, a set of angular radial transformation (ART) coefficients is selected as the feature to represent the video object and the background, respectively. ECC and cryptographic hashing are applied to those selected coefficients to generate the robust authentication watermark. This content-based, semifragile watermark is then embedded into the objects frame by frame before MPEG4 coding. In watermark embedding and extraction, groups of discrete Fourier transform (DFT) coefficients are randomly selected, and their energy relationships are employed to hide and extract the watermark. The experimental results demonstrate that our system is robust to MPEG4 compression, object segmentation errors, and some common object-based video processing such as object translation, rotation, and scaling while securely preventing malicious object modifications. The proposed solution can be further incorporated into public key infrastructure (PKI).

  13. Identifying Hydrologic Processes in Agricultural Watersheds Using Precipitation-Runoff Models

    USGS Publications Warehouse

    Linard, Joshua I.; Wolock, David M.; Webb, Richard M.T.; Wieczorek, Michael

    2009-01-01

    Understanding the fate and transport of agricultural chemicals applied to agricultural fields will assist in designing the most effective strategies to prevent water-quality impairments. At a watershed scale, the processes controlling the fate and transport of agricultural chemicals are generally understood only conceptually. To examine the applicability of conceptual models to the processes actually occurring, two precipitation-runoff models - the Soil and Water Assessment Tool (SWAT) and the Water, Energy, and Biogeochemical Model (WEBMOD) - were applied in different agricultural settings of the contiguous United States. Each model, through different physical processes, simulated the transport of water to a stream from the surface, the unsaturated zone, and the saturated zone. Models were calibrated for watersheds in Maryland, Indiana, and Nebraska. The calibrated sets of input parameters for each model at each watershed are discussed, and the criteria used to validate the models are explained. The SWAT and WEBMOD model results at each watershed conformed to each other and to the processes identified in each watershed's conceptual hydrology. In Maryland the conceptual understanding of the hydrology indicated groundwater flow was the largest annual source of streamflow; the simulation results for the validation period confirm this. The dominant source of water to the Indiana watershed was thought to be tile drains. Although tile drains were not explicitly simulated in the SWAT model, a large component of streamflow was received from lateral flow, which could be attributed to tile drains. Being able to explicitly account for tile drains, WEBMOD indicated water from tile drains constituted most of the annual streamflow in the Indiana watershed. The Nebraska models indicated annual streamflow was composed primarily of perennial groundwater flow and infiltration-excess runoff, which conformed to the conceptual hydrology developed for that watershed. The hydrologic

  14. Robust numerical electromagnetic eigenfunction expansion algorithms

    NASA Astrophysics Data System (ADS)

    Sainath, Kamalesh

    This thesis summarizes developments in rigorous, full-wave, numerical spectral-domain (integral plane wave eigenfunction expansion [PWE]) evaluation algorithms concerning time-harmonic electromagnetic (EM) fields radiated by generally-oriented and positioned sources within planar and tilted-planar layered media exhibiting general anisotropy, thickness, layer number, and loss characteristics. The work is motivated by the need to accurately and rapidly model EM fields radiated by subsurface geophysical exploration sensors probing layered, conductive media, where complex geophysical and man-made processes can lead to micro-laminate and micro-fractured geophysical formations exhibiting, at the lower (sub-2MHz) frequencies typically employed for deep EM wave penetration through conductive geophysical media, bulk-scale anisotropic (i.e., directional) electrical conductivity characteristics. When the planar-layered approximation (layers of piecewise-constant material variation and transversely-infinite spatial extent) is locally, near the sensor region, considered valid, numerical spectral-domain algorithms are suitable due to their strong low-frequency stability characteristic, and ability to numerically predict time-harmonic EM field propagation in media with response characterized by arbitrarily lossy and (diagonalizable) dense, anisotropic tensors. If certain practical limitations are addressed, PWE can robustly model sensors with general position and orientation that probe generally numerous, anisotropic, lossy, and thick layers. The main thesis contributions, leading to a sensor and geophysical environment-robust numerical modeling algorithm, are as follows: (1) Simple, rapid estimator of the region (within the complex plane) containing poles, branch points, and branch cuts (critical points) (Chapter 2), (2) Sensor and material-adaptive azimuthal coordinate rotation, integration contour deformation, integration domain sub-region partition and sub

  15. Incentive-Compatible Robust Line Planning

    NASA Astrophysics Data System (ADS)

    Bessas, Apostolos; Kontogiannis, Spyros; Zaroliagis, Christos

    The problem of robust line planning asks for a set of origin-destination paths (lines), along with their frequencies, in an underlying railway network infrastructure, such that the solution is robust to fluctuations of its real-time parameters. In this work, we investigate a variant of robust line planning stemming from recent regulations in the railway sector that introduce competition and free railway markets, and set up a new application scenario: there is a (potentially large) number of line operators that have their lines fixed and operate as competing entities issuing frequency requests, while the management of the infrastructure itself remains the responsibility of a single entity, the network operator. The line operators are typically unwilling to reveal their true incentives, while the network operator strives to ensure a fair (or socially optimal) usage of the infrastructure, e.g., by maximizing the (unknown to him) aggregate incentives of the line operators.

  16. Robustness enhancement of neurocontroller and state estimator

    NASA Technical Reports Server (NTRS)

    Troudet, Terry

    1993-01-01

    The feasibility of enhancing neurocontrol robustness, through training of the neurocontroller and state estimator in the presence of system uncertainties, is investigated on the example of a multivariable aircraft control problem. The performance and robustness of the newly trained neurocontroller are compared to those for an existing neurocontrol design scheme. The newly designed dynamic neurocontroller exhibits a better trade-off between phase and gain stability margins, and it is significantly more robust to degradations of the plant dynamics.

  17. Robust radio interferometric calibration using the t-distribution

    NASA Astrophysics Data System (ADS)

    Kazemi, S.; Yatawatta, S.

    2013-10-01

    A major stage of radio interferometric data processing is calibration, or the estimation of systematic errors in the data and the correction for such errors. A stochastic error (noise) model is assumed, and in most cases this underlying model is assumed to be Gaussian. However, outliers in the data due to interference or due to errors in the sky model would have adverse effects on processing based on a Gaussian noise model. Most of the shortcomings of calibration, such as the loss in flux or coherence and the appearance of spurious sources, can be attributed to deviations from the assumed noise model. In this paper, we propose to improve the robustness of calibration by using a noise model based on Student's t-distribution. Student's t-noise generalizes Gaussian noise to the case where the variance is unknown. Unlike Gaussian-noise-model-based calibration, traditional least-squares minimization does not directly extend to the case of a Student's t-noise model. Therefore, we use a variant of the expectation-maximization algorithm, called the expectation-conditional maximization either algorithm, under the Student's t-noise model, and use the Levenberg-Marquardt algorithm in the maximization step. We give simulation results to show the robustness of the proposed calibration method as opposed to traditional Gaussian-noise-model-based calibration, especially in preserving the flux of weaker sources that are not included in the calibration model.
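
    The reweighting at the heart of t-distribution-based estimation can be sketched in one dimension. The snippet below (an illustrative analogue, not the authors' ECM/Levenberg-Marquardt calibration code) estimates a location parameter under a Student's t model: the E-step assigns each point the weight (nu+1)/(nu + r^2/s^2), so outlying points barely influence the M-step's weighted mean.

```python
def t_robust_mean(data, nu=3.0, iters=50):
    """EM-style location estimate under a Student's t noise model.
    Outliers receive small weights and are effectively ignored,
    unlike the ordinary (Gaussian maximum-likelihood) mean."""
    mu = sum(data) / len(data)                       # initialize at the mean
    s2 = sum((x - mu) ** 2 for x in data) / len(data)
    for _ in range(iters):
        # E-step: per-point weights shrink with squared residual
        w = [(nu + 1.0) / (nu + (x - mu) ** 2 / s2) for x in data]
        # M-step: weighted location and scale updates
        mu = sum(wi * xi for wi, xi in zip(w, data)) / sum(w)
        s2 = sum(wi * (xi - mu) ** 2 for wi, xi in zip(w, data)) / len(data)
    return mu
```

On data clustered near zero with one gross outlier, the ordinary mean is dragged far from zero while the t-based estimate stays close to it, which mirrors how t-based calibration preserves the flux of weak sources in the presence of unmodeled interference.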

  18. Robustness of coevolution in resolving prisoner's dilemma games on interdependent networks subject to attack

    NASA Astrophysics Data System (ADS)

    Liu, Penghui; Liu, Jing

    2017-08-01

    Recently, coevolution between strategy and network structure has been established as a rule to resolve social dilemmas and reach optimal situations for cooperation. Many follow-up studies have focused on how coevolution helps networks reorganize to deter defectors, and many coevolution methods have been proposed. However, the robustness of these coevolution rules against attacks has not been studied much. Since attacks may directly influence the original evolutionary process of cooperation, robustness should be an important index when evaluating the quality of a coevolution method. In this paper, we investigate the robustness of an elementary coevolution method in resolving the prisoner's dilemma game on interdependent networks. Three different types of time-independent attacks, namely edge attacks, instigation attacks, and node attacks, are employed to test its robustness. Analyzing the simulation results, we find that this coevolution method is relatively robust against the edge attack and the node attack, as it successfully maintains cooperation in the population over the entire attack range. However, when the instigation probability of the attacked individuals is large, or the attack range of the instigation attack is wide enough, the coevolutionary rule eventually fails to maintain cooperation in the population.

  19. Persistent Identifiers, Discoverability and Open Science (Communication)

    NASA Astrophysics Data System (ADS)

    Murphy, Fiona; Lehnert, Kerstin; Hanson, Brooks

    2016-04-01

    Early in 2016, the American Geophysical Union announced it was incorporating ORCIDs into its submission workflows. This was accompanied by a strong statement supporting the use of other persistent identifiers - such as IGSNs, and the CrossRef open registry 'funding data'. This was partly in response to funders' desire to track and manage their outputs. However the more compelling argument, and the reason why the AGU has also signed up to the Center for Open Science's Transparency and Openness Promotion (TOP) Guidelines (http://cos.io/top), is that ultimately science and scientists will be the richer for these initiatives due to increased opportunities for interoperability, reproducibility and accreditation. The AGU has appealed to the wider community to engage with these initiatives, recognising that - unlike the introduction of Digital Object Identifiers (DOIs) for articles by CrossRef - full, enriched use of persistent identifiers throughout the scientific process requires buy-in from a range of scholarly communications stakeholders. At the same time, across the general research landscape, initiatives such as Project CRediT (contributor roles taxonomy), Publons (reviewer acknowledgements) and the forthcoming CrossRef DOI Event Tracker are contributing to our understanding and accreditation of contributions and impact. More specifically for earth science and scientists, the cross-functional Coalition for Publishing Data in the Earth and Space Sciences (COPDESS) was formed in October 2014 and is working to 'provide an organizational framework for Earth and space science publishers and data facilities to jointly implement and promote common policies and procedures for the publication and citation of data across Earth Science journals'. Clearly, the judicious integration of standards, registries and persistent identifiers such as ORCIDs and International Geo Sample Numbers (IGSNs) into the research and research output processes is key to the success of this venture.

  20. An ant colony optimization based algorithm for identifying gene regulatory elements.

    PubMed

    Liu, Wei; Chen, Hanwu; Chen, Ling

    2013-08-01

    It is one of the most important tasks in bioinformatics to identify the regulatory elements in gene sequences. Most existing algorithms for identifying regulatory elements tend to converge to a local optimum and have high time complexity. Ant Colony Optimization (ACO) is a meta-heuristic method based on swarm intelligence, derived from a model inspired by the collective foraging behavior of real ants. Taking advantage of the ACO traits of self-organization and robustness, this paper designs and implements an ACO-based algorithm named ACRI (ant-colony-regulatory-identification) for identifying all possible transcription factor binding sites in the upstream regions of co-expressed genes. To accelerate the ants' searching process, a local optimization strategy is presented to adjust the ants' start positions on the searched sequences. By exploiting the powerful optimization ability of ACO, the algorithm ACRI can not only improve the precision of the results but also achieve very high speed. Experimental results on real-world datasets show that ACRI can outperform other traditional algorithms in both speed and quality of solutions. Copyright © 2013 Elsevier Ltd. All rights reserved.
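
    The pheromone/heuristic mechanics that ACO methods such as ACRI build on can be sketched on a toy shortest-path problem (the actual ACRI algorithm walks sequence positions to score candidate binding sites; the graph, parameters, and reinforcement rule below are generic illustrations): ants choose edges with probability proportional to pheromone^alpha * heuristic^beta, pheromone evaporates, and good tours are reinforced.

```python
import random

def aco_shortest_path(graph, src, dst, n_ants=20, n_iter=30,
                      alpha=1.0, beta=2.0, rho=0.5, seed=1):
    """Minimal ACO on a weighted digraph: edge attractiveness is
    tau^alpha * (1/weight)^beta; pheromone evaporates by rho and is
    reinforced by 1/path_cost on each completed tour."""
    rng = random.Random(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}
    best_path, best_cost = None, float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            node, path, cost, visited = src, [src], 0.0, {src}
            while node != dst:
                choices = [(v, w) for v, w in graph[node].items()
                           if v not in visited]
                if not choices:
                    break                      # dead end: abandon this ant
                weights = [tau[(node, v)] ** alpha * (1.0 / w) ** beta
                           for v, w in choices]
                v, w = rng.choices(choices, weights=weights)[0]
                path.append(v); cost += w; visited.add(v); node = v
            if node == dst:
                tours.append((path, cost))
                if cost < best_cost:
                    best_path, best_cost = path, cost
        for e in tau:
            tau[e] *= (1.0 - rho)              # evaporation
        for path, cost in tours:
            for u, v in zip(path, path[1:]):
                tau[(u, v)] += 1.0 / cost      # reinforcement
    return best_path, best_cost
```

The evaporation step is what gives ACO its resistance to premature convergence: stale trails fade unless repeatedly reinforced by good solutions.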

  1. Robust-yet-fragile nature of interdependent networks

    NASA Astrophysics Data System (ADS)

    Tan, Fei; Xia, Yongxiang; Wei, Zhi

    2015-05-01

    Interdependent networks have been shown to be extremely vulnerable based on the percolation model. Parshani et al. [Europhys. Lett. 92, 68002 (2010), 10.1209/0295-5075/92/68002] further indicated that the more intersimilar networks are, the more robust they are to random failures. When traffic load is considered, how do the coupling patterns impact cascading failures in interdependent networks? This question has been largely unexplored until now. In this paper, we address this question by investigating the robustness of interdependent Erdös-Rényi random graphs and Barabási-Albert scale-free networks under either random failures or intentional attacks. It is found that interdependent Erdös-Rényi random graphs are robust yet fragile under either random failures or intentional attacks. Interdependent Barabási-Albert scale-free networks, however, are only robust yet fragile under random failures but fragile under intentional attacks. We further analyze the interdependent communication network and power grid and achieve similar results. These results advance our understanding of how interdependency shapes network robustness.
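
    The robust-yet-fragile property can be made concrete with a toy hub-dominated graph standing in for a scale-free network (illustrative only; the paper's analysis uses coupled Erdös-Rényi and Barabási-Albert networks with traffic load): random failures rarely hit the hub and barely shrink the giant component, while a targeted attack on the hub shatters it.

```python
def largest_component(adj, removed):
    """Size of the largest connected component of `adj` after
    deleting the nodes in `removed` (depth-first search)."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, size)
    return best

def star(n):
    """Extreme hub-dominated graph: node 0 is connected to all others."""
    adj = {i: set() for i in range(n)}
    for i in range(1, n):
        adj[0].add(i)
        adj[i].add(0)
    return adj
```

Removing one random leaf leaves a giant component of n-1 nodes, whereas removing the single hub reduces the largest component to one isolated node, the same qualitative contrast between random failures and intentional attacks reported for scale-free topologies.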

  2. A robust interrupted time series model for analyzing complex health care intervention data.

    PubMed

    Cruz, Maricela; Bender, Miriam; Ombao, Hernando

    2017-12-20

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular when modeling a time series of outcomes data that might be "interrupted" by a change in a particular method of health care delivery. Interrupted time series (ITS) is a robust quasi-experimental design that accounts for data dependency and supports inference about the effectiveness of an intervention. Current standardized methods for analyzing ITS data do not model changes in variation and correlation following the intervention. This is a key limitation, since it is plausible for data variability and dependency to change because of the intervention. Moreover, present methodology either assumes a prespecified interruption time point with an instantaneous effect or removes data for which the effect of intervention is not fully realized. In this paper, we describe and develop a novel robust interrupted time series (robust-ITS) model that overcomes these omissions and limitations. The robust-ITS model formally performs inference on (1) identifying the change point; (2) differences in preintervention and postintervention correlation; (3) differences in the outcome variance preintervention and postintervention; and (4) differences in the mean preintervention and postintervention. We illustrate the proposed method by analyzing patient satisfaction data from a hospital that implemented and evaluated a new nursing care delivery model as the intervention of interest. The robust-ITS model is implemented in an R Shiny toolbox, which is freely available to the community. Copyright © 2017 John Wiley & Sons, Ltd.
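
    A crude stand-in for the first inference task above (identifying the change point) is a scan over candidate interruption times that fits separate pre- and post-intervention means and keeps the time minimizing total squared error. The sketch below is illustrative only and omits the robust-ITS model's treatment of changes in variance and correlation.

```python
def find_change_point(y):
    """Return the index t at which splitting the series into a
    pre-segment y[:t] and post-segment y[t:], each fit by its own
    mean, minimizes the total sum of squared errors."""
    n = len(y)
    best_t, best_sse = None, float("inf")
    for t in range(2, n - 1):            # keep at least 2 points per segment
        pre, post = y[:t], y[t:]
        m1 = sum(pre) / len(pre)
        m2 = sum(post) / len(post)
        sse = (sum((v - m1) ** 2 for v in pre)
               + sum((v - m2) ** 2 for v in post))
        if sse < best_sse:
            best_t, best_sse = t, sse
    return best_t
```

On a series whose mean jumps at a known index, the scan recovers that index; the full model instead performs formal inference, giving uncertainty about the change point rather than a single best split.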

  3. A robust classic.

    PubMed

    Kutzner, Florian; Vogel, Tobias; Freytag, Peter; Fiedler, Klaus

    2011-01-01

    In the present research, we argue for the robustness of illusory correlations (ICs, Hamilton & Gifford, 1976) regarding two boundary conditions suggested in previous research. First, we argue that ICs are maintained under extended experience. Using simulations, we derive conflicting predictions. Whereas noise-based accounts predict ICs to be maintained (Fiedler, 2000; Smith, 1991), a prominent account based on discrepancy-reducing feedback learning predicts ICs to disappear (Van Rooy et al., 2003). An experiment involving 320 observations with majority and minority members supports the claim that ICs are maintained. Second, we show that actively using the stereotype to make predictions that are met with reward and punishment does not eliminate the bias. In addition, participants' operant reactions afford a novel online measure of ICs. In sum, our findings highlight the robustness of ICs that can be explained as a result of unbiased but noisy learning.

  4. Improvement of mechanical robustness of the superhydrophobic wood surface by coating PVA/SiO2 composite polymer

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Wang, Shuliang; Zhang, Ming; Ma, Miaolian; Wang, Chengyu; Li, Jian

    2013-09-01

    Improvement of the robustness of superhydrophobic surfaces is crucial for achieving commercial applications of these surfaces in such various areas as self-cleaning, water repellency and corrosion resistance. We have investigated the fabrication of a polyvinyl alcohol (PVA)/silica (SiO2) composite polymer coating on wooden substrates with super repellency toward water, low sliding angles, low contact angle hysteresis, and relatively good mechanical robustness. The composite polymer slurry, consisting of well-mixed SiO2 particles and PVA, is prepared simply and subsequently coated over wooden substrates with good adhesion. In this study, the mechanical robustness of the superhydrophobic wood surfaces was evaluated. The effect of the petaloid structures of the composite polymer on robustness was investigated using an abrasion test, and the results were compared with those of superhydrophobic wood surfaces fabricated by other processes. The produced wood surfaces exhibited promising superhydrophobic properties, with a contact angle of 159° and a sliding angle of 4°, and relatively good mechanical robustness.

  5. Robust regression for large-scale neuroimaging studies.

    PubMed

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies. Copyright © 2015 Elsevier Inc. All rights reserved.
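
    Robust regression of the kind advocated above is commonly implemented as iteratively reweighted least squares with a Huber-type loss. The sketch below fits a line while down-weighting large residuals (a generic illustration, not the exact estimator or software used in the study; the tuning constant 1.345 is the standard Huber choice for near-normal efficiency).

```python
def huber_fit(x, y, delta=1.345, iters=30):
    """Iteratively reweighted least squares for y = a + b*x with Huber
    weights: residuals beyond delta (in robust scale units) get weight
    delta*s/|r|, so outliers pull the fit far less than under OLS."""
    n = len(x)
    a, b = 0.0, 0.0
    for _ in range(iters):
        r = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        # robust scale: median absolute deviation (MAD / 0.6745)
        s = sorted(abs(ri) for ri in r)[n // 2] / 0.6745 or 1.0
        w = [1.0 if abs(ri) <= delta * s else delta * s / abs(ri) for ri in r]
        # closed-form weighted least squares for intercept and slope
        sw = sum(w)
        sx = sum(wi * xi for wi, xi in zip(w, x))
        sy = sum(wi * yi for wi, yi in zip(w, y))
        sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
        sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
        det = sw * sxx - sx * sx
        a = (sxx * sy - sx * sxy) / det
        b = (sw * sxy - sx * sy) / det
    return a, b
```

With one gross outlier in otherwise linear data, ordinary least squares tilts toward the outlier while the Huber fit recovers slope and intercept close to the generating line, the same false-positive protection the study reports at the scale of whole-brain analyses.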

  6. Review: Deciphering animal robustness. A synthesis to facilitate its use in livestock breeding and management.

    PubMed

    Friggens, N C; Blanc, F; Berry, D P; Puillet, L

    2017-12-01

    As the environments in which livestock are reared become more variable, animal robustness becomes an increasingly valuable attribute. Consequently, there is increasing focus on managing and breeding for it. However, robustness is a difficult phenotype to properly characterise because it is a complex trait composed of multiple components, including dynamic elements such as the rates of response to, and recovery from, environmental perturbations. In this review, the following definition of robustness is used: the ability, in the face of environmental constraints, to carry on doing the various things that the animal needs to do to favour its future ability to reproduce. The different elements of this definition are discussed to provide a clearer understanding of the components of robustness. The implications for quantifying robustness are that there is no single measure of robustness but rather that it is the combination of multiple and interacting component mechanisms whose relative value is context dependent. This context encompasses both the prevailing environment and the prevailing selection pressure. One key issue for measuring robustness is to be clear on the use to which the robustness measurements will be put. If the purpose is to identify biomarkers that may be useful for molecular phenotyping or genotyping, the measurements should focus on the physiological mechanisms underlying robustness. However, if the purpose of measuring robustness is to quantify the extent to which animals can adapt to limiting conditions, then the measurements should focus on the life functions, the trade-offs between them and the animal's capacity to increase resource acquisition. The time-related aspect of robustness also has important implications. Single time-point measurements are of limited value because they do not permit measurement of responses to (and recovery from) environmental perturbations. The exception is single measurements of the accumulated consequence of a

  7. Robust Eye Center Localization through Face Alignment and Invariant Isocentric Patterns

    PubMed Central

    Teng, Dongdong; Chen, Dihu; Tan, Hongzhou

    2015-01-01

    The localization of eye centers is a very useful cue for numerous applications like face recognition, facial expression recognition, and the early screening of neurological pathologies. Several methods relying on available light for accurate eye-center localization have been exploited. However, despite the considerable improvements that eye-center localization systems have undergone in recent years, only a few of these developments deal with the challenges posed by profile (non-frontal) faces. In this paper, we first use the explicit shape regression method to obtain the rough location of the eye centers. Because this method extracts global information from the human face, it is robust against any changes in the eye region. We exploit this robustness and utilize it as a constraint. To locate the eye centers accurately, we employ isophote curvature features, the accuracy of which has been demonstrated in a previous study. By applying these features, we obtain a series of eye-center locations which are candidates for the actual position of the eye center. Among these locations, the estimated locations which minimize the reconstruction error between the two methods mentioned above are taken as the closest approximation to the eye-center locations. Therefore, we combine explicit shape regression and isophote curvature feature analysis to achieve robustness and accuracy, respectively. In practical experiments, we use the BioID and FERET datasets to test our approach to obtaining an accurate eye-center location while retaining robustness against changes in scale and pose. In addition, we apply our method to non-frontal faces to test its robustness and accuracy, which are essential in gaze estimation but have seldom been mentioned in previous works. Through extensive experimentation, we show that the proposed method can achieve a significant improvement in accuracy and robustness over state-of-the-art techniques, with our method ranking second in terms of accuracy

  8. Petroleum refinery operational planning using robust optimization

    NASA Astrophysics Data System (ADS)

    Leiras, A.; Hamacher, S.; Elkamel, A.

    2010-12-01

    In this article, the robust optimization methodology is applied to deal with uncertainties in the prices of saleable products, operating costs, product demand, and product yield in the context of refinery operational planning. A numerical study demonstrates the effectiveness of the proposed robust approach. The benefits of incorporating uncertainty in the different model parameters were evaluated in terms of the cost of ignoring uncertainty in the problem. The calculations suggest that this benefit is equivalent to 7.47% of the deterministic solution value, which indicates that the robust model may offer advantages to those involved with refinery operational planning. In addition, the probability bounds of constraint violation are calculated to help the decision-maker adopt a more appropriate parameter to control robustness and judge the tradeoff between conservatism and total profit.
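    For interval (box) uncertainty in the objective, the robust counterpart of a linear plan simply optimises against worst-case margins. The sketch below is a hypothetical two-product toy, not the authors' refinery model; the margins, deviations and capacity numbers are invented to show how a deterministic plan can look better on paper yet fare worse when prices move against it.

```python
import numpy as np
from scipy.optimize import linprog

# Two products share 100 units of crude capacity; unit margins are uncertain.
nominal = np.array([10.0, 9.0])   # expected margin per unit
dev     = np.array([3.0, 0.5])    # interval half-width (box uncertainty)
A_ub, b_ub = [[1.0, 1.0]], [100.0]
bounds = [(0, 80), (0, 80)]

# Deterministic plan: optimise against nominal margins (linprog minimises).
det = linprog(-nominal, A_ub=A_ub, b_ub=b_ub, bounds=bounds).x
# Robust plan: optimise against worst-case margins.
rob = linprog(-(nominal - dev), A_ub=A_ub, b_ub=b_ub, bounds=bounds).x

worst = nominal - dev
print("deterministic plan:", det, "worst-case profit:", worst @ det)
print("robust plan:      ", rob, "worst-case profit:", worst @ rob)
```

    The gap between the two worst-case profits is the benefit of incorporating uncertainty; the gap between nominal and robust profit under nominal prices is the price of that conservatism.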

  9. Mitochondrial Protein Interaction Mapping Identifies Regulators of Respiratory Chain Function.

    PubMed

    Floyd, Brendan J; Wilkerson, Emily M; Veling, Mike T; Minogue, Catie E; Xia, Chuanwu; Beebe, Emily T; Wrobel, Russell L; Cho, Holly; Kremer, Laura S; Alston, Charlotte L; Gromek, Katarzyna A; Dolan, Brendan K; Ulbrich, Arne; Stefely, Jonathan A; Bohl, Sarah L; Werner, Kelly M; Jochem, Adam; Westphall, Michael S; Rensvold, Jarred W; Taylor, Robert W; Prokisch, Holger; Kim, Jung-Ja P; Coon, Joshua J; Pagliarini, David J

    2016-08-18

    Mitochondria are essential for numerous cellular processes, yet hundreds of their proteins lack robust functional annotation. To reveal functions for these proteins (termed MXPs), we assessed condition-specific protein-protein interactions for 50 select MXPs using affinity enrichment mass spectrometry. Our data connect MXPs to diverse mitochondrial processes, including multiple aspects of respiratory chain function. Building upon these observations, we validated C17orf89 as a complex I (CI) assembly factor. Disruption of C17orf89 markedly reduced CI activity, and its depletion is found in an unresolved case of CI deficiency. We likewise discovered that LYRM5 interacts with and deflavinates the electron-transferring flavoprotein that shuttles electrons to coenzyme Q (CoQ). Finally, we identified a dynamic human CoQ biosynthetic complex involving multiple MXPs whose topology we map using purified components. Collectively, our data lend mechanistic insight into respiratory chain-related activities and prioritize hundreds of additional interactions for further exploration of mitochondrial protein function. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Robust Tomography using Randomized Benchmarking

    NASA Astrophysics Data System (ADS)

    Silva, Marcus; Kimmel, Shelby; Johnson, Blake; Ryan, Colm; Ohki, Thomas

    2013-03-01

    Conventional randomized benchmarking (RB) can be used to estimate the fidelity of Clifford operations in a manner that is robust against preparation and measurement errors -- thus allowing for a more accurate and relevant characterization of the average error in Clifford gates compared to standard tomography protocols. Interleaved RB (IRB) extends this result to the extraction of error rates for individual Clifford gates. In this talk we will show how to combine multiple IRB experiments to extract all information about the unital part of any trace preserving quantum process. Consequently, one can compute the average fidelity to any unitary, not just the Clifford group, with tighter bounds than IRB. Moreover, the additional information can be used to design improvements in control. MS, BJ, CR and TO acknowledge support from IARPA under contract W911NF-10-1-0324.
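    The SPAM-robustness of randomized benchmarking comes from fitting the survival probability to the standard decay model A·p^m + B, where preparation and measurement errors are absorbed into A and B and only p carries the gate error. This is a minimal simulated sketch, not the tomography protocol of the talk; the decay parameters and noise level are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def rb_model(m, A, B, p):
    """Zeroth-order RB decay: survival probability after m Cliffords."""
    return A * p**m + B

rng = np.random.default_rng(1)
p_true = 0.99                       # depolarizing parameter per Clifford
m = np.arange(1, 200, 5)
surv = rb_model(m, 0.45, 0.5, p_true) + rng.normal(scale=0.002, size=m.size)

(A, B, p), _ = curve_fit(rb_model, m, surv, p0=[0.5, 0.5, 0.95])
d = 2                               # single-qubit Hilbert-space dimension
r_avg = (1 - p) * (d - 1) / d       # average error rate, independent of A and B
print("estimated p = %.4f, average error rate = %.5f" % (p, r_avg))
```

    Interleaved RB repeats this fit with a target gate inserted between random Cliffords and extracts that gate's error from the ratio of the two decay constants.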

  11. Risk, Robustness and Water Resources Planning Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Borgomeo, Edoardo; Mortazavi-Naeini, Mohammad; Hall, Jim W.; Guillod, Benoit P.

    2018-03-01

    Risk-based water resources planning is based on the premise that water managers should invest up to the point where the marginal benefit of risk reduction equals the marginal cost of achieving that benefit. However, this cost-benefit approach may not guarantee robustness under uncertain future conditions, for instance under climatic changes. In this paper, we expand risk-based decision analysis to explore possible ways of enhancing robustness in engineered water resources systems under different risk attitudes. Risk is measured as the expected annual cost of water use restrictions, while robustness is interpreted in the decision-theoretic sense as the ability of a water resource system to maintain performance—expressed as a tolerable risk of water use restrictions—under a wide range of possible future conditions. Linking risk attitudes with robustness allows stakeholders to explicitly trade-off incremental increases in robustness with investment costs for a given level of risk. We illustrate the framework through a case study of London's water supply system using state-of-the-art regional climate simulations to inform the estimation of risk and robustness.

  12. Adding flexibility to the search for robust portfolios in non-linear water resource planning

    NASA Astrophysics Data System (ADS)

    Tomlinson, James; Harou, Julien

    2017-04-01

    To date, robust optimisation of water supply systems has sought to find portfolios or strategies that are robust to a range of uncertainties or scenarios. The search for a single portfolio that is robust in all scenarios is necessarily suboptimal compared to portfolios optimised for a single-scenario deterministic future. By contrast, establishing a separate portfolio for each future scenario is unhelpful to the planner who must make a single decision today under deep uncertainty. In this work we show that a middle ground is possible by allowing a small number of different portfolios to be found that are each robust to a different subset of the global scenarios. We use evolutionary algorithms and a simple water resource system model to demonstrate this approach. The primary contribution is to demonstrate that flexibility can be added to the search for portfolios, in complex non-linear systems, at the expense of complete robustness across all future scenarios. In this context we define flexibility as the ability to design a portfolio in which some decisions are delayed, but those decisions that are not delayed are themselves shown to be robust to the future. We recognise that some decisions in our portfolio are more important than others. An adaptive portfolio is found by allowing no flexibility for these near-term "important" decisions, but maintaining flexibility in the remaining longer term decisions. In this sense we create an effective 2-stage decision process for a non-linear water resource supply system. We show how this reduces a measure of regret versus the inflexible robust solution for the same system.

  13. SU-F-R-31: Identification of Robust Normal Lung CT Texture Features for the Prediction of Radiation-Induced Lung Disease

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, W; Riyahi, S; Lu, W

    Purpose: Normal lung CT texture features have been used for the prediction of radiation-induced lung disease (radiation pneumonitis and radiation fibrosis). For these features to be clinically useful, they need to be relatively invariant (robust) to tumor size and not correlated with normal lung volume. Methods: The free-breathing CTs of 14 lung SBRT patients were studied. Different sizes of GTVs were simulated with spheres placed at the upper lobe and lower lobe respectively in the normal lung (contralateral to tumor). 27 texture features (9 from intensity histogram, 8 from grey-level co-occurrence matrix [GLCM] and 10 from grey-level run-length matrix [GLRM]) were extracted from [normal lung-GTV]. To measure the variability of a feature F, the relative difference D=|Fref-Fsim|/Fref*100% was calculated, where Fref was for the entire normal lung and Fsim was for [normal lung-GTV]. A feature was considered robust if the largest non-outlier (Q3+1.5*IQR) D was less than 5%, and considered not correlated with normal lung volume when their Pearson correlation was lower than 0.50. Results: Only 11 features were robust. All first-order intensity-histogram features (mean, max, etc.) were robust, while most higher-order features (skewness, kurtosis, etc.) were unrobust. Only two of the GLCM and four of the GLRM features were robust. A larger GTV resulted in greater feature variation; this was particularly true for unrobust features. All robust features were not correlated with normal lung volume while three unrobust features showed high correlation. Excessive variations were observed in two low grey-level run features and were later identified to be from one patient with local lung diseases (atelectasis) in the normal lung. There was no dependence on GTV location. Conclusion: We identified 11 robust normal lung CT texture features that can be further examined for the prediction of radiation-induced lung disease. Interestingly, low grey-level run features
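    The robustness screen described in the Methods can be written down directly: a feature passes if its largest non-outlier relative difference D stays below 5% and its Pearson correlation with normal-lung volume stays below 0.50. The `is_robust` helper and the toy per-patient vectors below are illustrative inventions; only the two criteria come from the abstract.

```python
import numpy as np

def is_robust(F_ref, F_sim, lung_vol, d_max=5.0, r_max=0.50):
    """Screen one texture feature across patients.
    F_ref: feature on the entire normal lung; F_sim: on [normal lung - GTV]."""
    D = np.abs(F_ref - F_sim) / np.abs(F_ref) * 100.0   # relative difference, %
    q1, q3 = np.percentile(D, [25, 75])
    fence = q3 + 1.5 * (q3 - q1)                        # Q3 + 1.5*IQR outlier bound
    largest_nonoutlier = D[D <= fence].max()
    r = np.corrcoef(F_ref, lung_vol)[0, 1]              # Pearson correlation
    return bool(largest_nonoutlier < d_max and abs(r) < r_max)

F_ref = np.array([100.0, 110.0, 120.0, 130.0])   # hypothetical feature values
vol   = np.array([3000.0, 2000.0, 3500.0, 1500.0])  # normal-lung volumes (cm^3)

print(is_robust(F_ref, F_ref * 1.02, vol))  # 2% shift under simulated GTV
print(is_robust(F_ref, F_ref * 1.20, vol))  # 20% shift: fails the 5% criterion
```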

  14. Preliminary results of miniaturized and robust ultrasound guided diffuse optical tomography system for breast cancer detection

    NASA Astrophysics Data System (ADS)

    Vavadi, Hamed; Mostafa, Atahar; Li, Jinglong; Zhou, Feifei; Uddin, Shihab; Xu, Chen; Zhu, Quing

    2017-02-01

    According to the World Health Organization, breast cancer is the most common cancer among women worldwide, claiming the lives of hundreds of thousands of women each year. Near infrared diffuse optical tomography (DOT) has demonstrated great potential as an adjunct modality for differentiating malignant and benign breast lesions and for monitoring treatment response in patients with locally advanced breast cancer. The path toward commercialization of DOT techniques depends upon improving the robustness and user-friendliness of this technique in hardware and software. In the past, our group has developed three frequency-domain prototype systems which were used in several clinical studies. In this study, we introduce our new US-guided DOT system, currently under development, which improves on size, robustness and user-friendliness through several custom electronic and mechanical designs. A new, robust probe was designed to reduce preparation time in the clinical process. The processing procedure, data selection and user interface software were also updated. With all these improvements, our new system is more robust and accurate, bringing it one step closer to commercialization and wide use of this technology in clinical settings. This system is intended for use by minimally trained users in clinical settings with robust performance. The system performance has been tested in phantom experiments and initial results are demonstrated in this study. We are currently working on finalizing this system and conducting further testing to validate its performance. We aim to use this system in clinical settings for patients with breast cancer.

  15. Large-scale production of diesel-like biofuels - process design as an inherent part of microorganism development.

    PubMed

    Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M

    2013-06-01

    Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other critical targets for process technology and process scale implementation, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large scale operation. The process conditions identified are finally translated to microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptations of production reactors toward the state of the art were more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, through feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivation of genetically modified Escherichia coli bacteria.

  17. Low-Power Photoplethysmogram Acquisition Integrated Circuit with Robust Light Interference Compensation

    PubMed Central

    Kim, Jongpal; Kim, Jihoon; Ko, Hyoungho

    2015-01-01

    To overcome light interference, including a large DC offset and ambient light variation, a robust photoplethysmogram (PPG) readout chip is fabricated using a 0.13-μm complementary metal–oxide–semiconductor (CMOS) process. Against the large DC offset, a saturation detection and current feedback circuit is proposed to compensate for an offset current of up to 30 μA. For robustness against optical path variation, an automatic emitted light compensation method is adopted. To prevent ambient light interference, an alternating sampling and charge redistribution technique is also proposed. In the proposed technique, no additional power is consumed, and only three differential switches and one capacitor are required. The PPG readout channel consumes 26.4 μW and has an input referred current noise of 260 pArms. PMID:26729122

  18. Observations on the Use of SCAN To Identify Children at Risk for Central Auditory Processing Disorder.

    ERIC Educational Resources Information Center

    Emerson, Maria F.; And Others

    1997-01-01

    The SCAN: A Screening Test for Auditory Processing Disorders was administered to 14 elementary children with a history of otitis media and 14 typical children, to evaluate the validity of the test in identifying children with central auditory processing disorder. Another experiment found that test results differed based on the testing environment…

  19. 25 CFR 170.501 - What happens when the review process identifies areas for improvement?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false What happens when the review process identifies areas for improvement? 170.501 Section 170.501 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER INDIAN RESERVATION ROADS PROGRAM Planning, Design, and Construction of Indian Reservation Roads...

  20. Robust and Heterogeneous Hydrological Changes under Global Warming

    NASA Astrophysics Data System (ADS)

    Kumar, S.; Zwiers, F. W.; Dirmeyer, P.; Lawrence, D. M.; Shrestha, R. R.; Werner, A. T.

    2015-12-01

    The Intergovernmental Panel on Climate Change (IPCC) has continued to find it difficult to make clear assessments of streamflow changes [Assessment Report 5, Working Group II, Chapter 3], in large part because of the heterogeneity of observed and projected hydrological changes. While prior studies have found some evidence of human influence on precipitation changes, the detection of streamflow changes is not robust. Here, we show that the terrestrial branch of the hydrological cycle, namely the partitioning of precipitation into evapotranspiration and runoff, is an important piece of the puzzle that may explain the apparent disconnect between the detectability of precipitation and streamflow changes. We apply the Budyko framework to quantify the sensitivity of hydrological changes to climate-driven changes in water balance regionally. We demonstrate that the hydrological sensitivity is 3 times greater in regions where the hydrological cycle is energy limited (wet regions) than where it is water limited (dry regions), and therefore the detectability of streamflow changes is also greater by 30-40% in wet regions. Evidence from observations in western North America and an analysis of Coupled Model Intercomparison Project Phase 5 climate models at global scales indicate that use of the Budyko framework can help identify robust and spatially heterogeneous hydrological responses to external forcing on the climate system.
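    The contrast between energy-limited and water-limited regimes can be sketched with the classical Budyko (1974) curve, which gives mean-annual evapotranspiration E as a function of precipitation P and potential evapotranspiration PET; runoff is then Q = P - E. This is a generic illustration of the framework, not the paper's analysis, and the two example climates are invented.

```python
import numpy as np

def budyko_E(P, PET):
    """Budyko (1974) curve: mean-annual evapotranspiration from P and PET."""
    phi = PET / P                      # aridity index
    return P * np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))

def runoff_sensitivity(P, PET, dP=1.0):
    """Numerical dQ/dP with Q = P - E(P, PET), holding PET fixed."""
    Q0 = P - budyko_E(P, PET)
    Q1 = (P + dP) - budyko_E(P + dP, PET)
    return (Q1 - Q0) / dP

wet = runoff_sensitivity(P=1500.0, PET=750.0)    # energy-limited: PET/P = 0.5
dry = runoff_sensitivity(P=500.0, PET=1500.0)    # water-limited: PET/P = 3.0
print("dQ/dP wet: %.2f   dry: %.2f" % (wet, dry))
```

    In the energy-limited climate most of a precipitation increment becomes runoff, while in the water-limited climate most of it evaporates, so the same forced precipitation change is far easier to detect in streamflow from wet regions.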

  1. (Im)Perfect robustness and adaptation of metabolic networks subject to metabolic and gene-expression regulation: marrying control engineering with metabolic control analysis.

    PubMed

    He, Fei; Fromion, Vincent; Westerhoff, Hans V

    2013-11-21

    Metabolic control analysis (MCA) and supply-demand theory have led to appreciable understanding of the systems properties of metabolic networks that are subject exclusively to metabolic regulation. Supply-demand theory has not yet considered gene-expression regulation explicitly whilst a variant of MCA, i.e. Hierarchical Control Analysis (HCA), has done so. Existing analyses based on control engineering approaches have not been very explicit about whether metabolic or gene-expression regulation would be involved, but designed different ways in which regulation could be organized, with the potential of causing adaptation to be perfect. This study integrates control engineering and classical MCA augmented with supply-demand theory and HCA. Because gene-expression regulation involves time integration, it is identified as a natural instantiation of the 'integral control' (or near integral control) known in control engineering. This study then focuses on robustness against and adaptation to perturbations of process activities in the network, which could result from environmental perturbations, mutations or slow noise. It is shown however that this type of 'integral control' should rarely be expected to lead to the 'perfect adaptation': although the gene-expression regulation increases the robustness of important metabolite concentrations, it rarely makes them infinitely robust. For perfect adaptation to occur, the protein degradation reactions should be zero order in the concentration of the protein, which may be rare biologically for cells growing steadily. A proposed new framework integrating the methodologies of control engineering and metabolic and hierarchical control analysis, improves the understanding of biological systems that are regulated both metabolically and by gene expression. In particular, the new approach enables one to address the issue whether the intracellular biochemical networks that have been and are being identified by genomics and systems
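    The abstract's central claim can be reproduced in a deliberately minimal caricature, not the paper's model: a metabolite s is consumed by an enzyme e (ds/dt = v - k·e·s) whose gene-expression regulation integrates s over time. With zero-order protein degradation, de/dt = a·s - d, the steady state forces s back to d/a for any supply v (perfect adaptation); with first-order degradation, de/dt = a·s - d·e, the steady-state s shifts with v (imperfect adaptation). All parameter values are invented.

```python
def simulate(v, zero_order, T=2000.0, dt=0.01):
    """Euler-integrate ds/dt = v - k*e*s with gene-expression regulation of e."""
    k, a, d = 1.0, 1.0, 1.0
    s, e = 1.0, 1.0
    for _ in range(int(T / dt)):
        ds = v - k * e * s
        if zero_order:
            de = a * s - d          # zero-order protein degradation
        else:
            de = a * s - d * e      # first-order protein degradation
        s += dt * ds
        e += dt * de
        e = max(e, 0.0)             # enzyme concentration cannot go negative
    return s

# Both variants settle at s = 1 when v = 1. After perturbing supply to v = 2:
s_zero  = simulate(2.0, zero_order=True)    # s returns to 1: perfect adaptation
s_first = simulate(2.0, zero_order=False)   # s settles at sqrt(2): imperfect
print("zero-order: %.3f   first-order: %.3f" % (s_zero, s_first))
```

    The zero-order case is a true integral controller (de/dt vanishes only at the setpoint s = d/a), which is exactly why the paper argues perfect adaptation requires protein degradation to be zero order in the protein concentration.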

  2. (Im)Perfect robustness and adaptation of metabolic networks subject to metabolic and gene-expression regulation: marrying control engineering with metabolic control analysis

    PubMed Central

    2013-01-01

    Background Metabolic control analysis (MCA) and supply–demand theory have led to appreciable understanding of the systems properties of metabolic networks that are subject exclusively to metabolic regulation. Supply–demand theory has not yet considered gene-expression regulation explicitly whilst a variant of MCA, i.e. Hierarchical Control Analysis (HCA), has done so. Existing analyses based on control engineering approaches have not been very explicit about whether metabolic or gene-expression regulation would be involved, but designed different ways in which regulation could be organized, with the potential of causing adaptation to be perfect. Results This study integrates control engineering and classical MCA augmented with supply–demand theory and HCA. Because gene-expression regulation involves time integration, it is identified as a natural instantiation of the ‘integral control’ (or near integral control) known in control engineering. This study then focuses on robustness against and adaptation to perturbations of process activities in the network, which could result from environmental perturbations, mutations or slow noise. It is shown however that this type of ‘integral control’ should rarely be expected to lead to the ‘perfect adaptation’: although the gene-expression regulation increases the robustness of important metabolite concentrations, it rarely makes them infinitely robust. For perfect adaptation to occur, the protein degradation reactions should be zero order in the concentration of the protein, which may be rare biologically for cells growing steadily. Conclusions A proposed new framework integrating the methodologies of control engineering and metabolic and hierarchical control analysis, improves the understanding of biological systems that are regulated both metabolically and by gene expression. In particular, the new approach enables one to address the issue whether the intracellular biochemical networks that have been and

  3. Massive-scale gene co-expression network construction and robustness testing using random matrix theory.

    PubMed

    Gibson, Scott M; Ficklin, Stephen P; Isaacson, Sven; Luo, Feng; Feltus, Frank A; Smith, Melissa C

    2013-01-01

    The study of gene relationships and their effect on biological function and phenotype is a focal point in systems biology. Gene co-expression networks built using microarray expression profiles are one technique for discovering and interpreting gene relationships. A knowledge-independent thresholding technique, such as Random Matrix Theory (RMT), is useful for identifying meaningful relationships. Highly connected genes in the thresholded network are then grouped into modules that provide insight into their collective functionality. While it has been shown that co-expression networks are biologically relevant, it has not been determined to what extent any given network is functionally robust given perturbations in the input sample set. For such a test, hundreds of networks are needed and hence a tool to rapidly construct these networks. To examine functional robustness of networks with varying input, we enhanced an existing RMT implementation for improved scalability and tested functional robustness of human (Homo sapiens), rice (Oryza sativa) and budding yeast (Saccharomyces cerevisiae). We demonstrate a dramatic decrease in network construction time and computational requirements and show that despite some variation in global properties between networks, functional similarity remains high. Moreover, the biological function captured by co-expression networks thresholded by RMT is highly robust.

  4. Massive-Scale Gene Co-Expression Network Construction and Robustness Testing Using Random Matrix Theory

    PubMed Central

    Isaacson, Sven; Luo, Feng; Feltus, Frank A.; Smith, Melissa C.

    2013-01-01

    The study of gene relationships and their effect on biological function and phenotype is a focal point in systems biology. Gene co-expression networks built using microarray expression profiles are one technique for discovering and interpreting gene relationships. A knowledge-independent thresholding technique, such as Random Matrix Theory (RMT), is useful for identifying meaningful relationships. Highly connected genes in the thresholded network are then grouped into modules that provide insight into their collective functionality. While it has been shown that co-expression networks are biologically relevant, it has not been determined to what extent any given network is functionally robust given perturbations in the input sample set. For such a test, hundreds of networks are needed and hence a tool to rapidly construct these networks. To examine functional robustness of networks with varying input, we enhanced an existing RMT implementation for improved scalability and tested functional robustness of human (Homo sapiens), rice (Oryza sativa) and budding yeast (Saccharomyces cerevisiae). We demonstrate a dramatic decrease in network construction time and computational requirements and show that despite some variation in global properties between networks, functional similarity remains high. Moreover, the biological function captured by co-expression networks thresholded by RMT is highly robust. PMID:23409071

  5. An Integrated Environmental Assessment of Green and Gray Infrastructure Strategies for Robust Decision Making.

    PubMed

    Casal-Campos, Arturo; Fu, Guangtao; Butler, David; Moore, Andrew

    2015-07-21

    The robustness of a range of watershed-scale "green" and "gray" drainage strategies in the future is explored through comprehensive modeling of a fully integrated urban wastewater system case. Four socio-economic future scenarios, defined by parameters affecting the environmental performance of the system, are proposed to account for the uncertain variability of conditions in the year 2050. A regret-based approach is applied to assess the relative performance of strategies in multiple impact categories (environmental, economic, and social) as well as to evaluate their robustness across future scenarios. The concept of regret proves useful in identifying performance trade-offs and recognizing states of the world most critical to decisions. The study highlights the robustness of green strategies (particularly rain gardens, resulting in half the regret of most options) over end-of-pipe gray alternatives (surface water separation or sewer and storage rehabilitation), which may be costly (on average, 25% of the total regret of these options) and tend to focus on sewer flooding and CSO alleviation while compromising on downstream system performance (this accounts for around 50% of their total regret). Trade-offs and scenario regrets observed in the analysis suggest that the combination of green and gray strategies may still offer further potential for robustness.
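    The regret-based comparison of strategies across scenarios reduces to a small matrix computation: in each scenario, a strategy's regret is its shortfall from the best strategy for that scenario, and a minimax-regret decision picks the strategy whose worst-scenario regret is smallest. The performance scores below are invented for illustration and are not the study's results; only the strategy names echo the abstract.

```python
import numpy as np

# Hypothetical performance scores (higher = better) for four drainage
# strategies under four 2050 socio-economic scenarios.
strategies = ["rain gardens", "surface-water separation",
              "sewer rehabilitation", "storage rehabilitation"]
perf = np.array([[8.0, 7.5, 7.0, 8.5],
                 [9.0, 4.0, 5.0, 6.0],
                 [6.0, 8.0, 4.5, 5.0],
                 [7.0, 6.5, 6.0, 4.0]])   # rows: strategies, cols: scenarios

# Regret = shortfall from the best achievable score in each scenario.
regret = perf.max(axis=0) - perf
worst_regret = regret.max(axis=1)          # minimax-regret criterion
best = strategies[int(worst_regret.argmin())]
print(dict(zip(strategies, worst_regret)))
print("most robust strategy:", best)
```

    The regret matrix itself is what exposes the trade-offs: a strategy can dominate in one scenario yet accumulate large regret in the scenarios most critical to the decision.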

  6. Towards robust specularity detection and inpainting in cardiac images

    NASA Astrophysics Data System (ADS)

    Alsaleh, Samar M.; Aviles, Angelica I.; Sobrevilla, Pilar; Casals, Alicia; Hahn, James

    2016-03-01

    Computer-assisted cardiac surgery has advanced considerably over the years and is gaining popularity over conventional cardiac procedures, as it offers many benefits to both patients and surgeons. One obvious advantage is that it enables surgeons to perform delicate tasks on the heart while it is still beating, avoiding the risks associated with cardiac arrest. Consequently, the surgical system needs to accurately compensate for the physiological motion of the heart, which is a very challenging task in medical robotics since there are several sources of disturbance. One of these is bright light reflections, known as specular highlights, which appear on the glossy surface of the heart and partially occlude the field of view. This work is focused on developing a robust approach that accurately detects and removes those highlights to reduce their disturbance to the surgeon and the motion compensation algorithm. As a first step, we exploit both color attributes and a fuzzy edge detector to identify specular regions in each acquired image frame. These two techniques together act as a restricted thresholding and are able to accurately identify specular regions. Then, in order to eliminate the specularity artifact and give the surgeon a better perception of the heart, the second part of our solution corrects the detected regions using inpainting to propagate and smooth the results. Our experimental results, carried out on realistic datasets, show how efficient and precise the proposed solution is, and demonstrate its robustness and real-time performance.
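A toy version of the two-stage pipeline can be sketched as follows, with a plain brightness threshold standing in for the color-plus-fuzzy-edge detector and iterative diffusion averaging standing in for the inpainting step; the image, threshold, and iteration count are all assumptions.

```python
import numpy as np

def detect_specular(gray, thresh=0.9):
    """Flag saturated pixels as specular (toy threshold on a [0, 1] image)."""
    return gray >= thresh

def inpaint(gray, mask, iters=50):
    """Fill masked pixels by repeatedly averaging their 4-neighborhood:
    a crude diffusion-based stand-in for the inpainting step."""
    out = gray.copy()
    out[mask] = out[~mask].mean()              # coarse initialization
    for _ in range(iters):
        p = np.pad(out, 1, mode='edge')        # edge-padded shifts
        avg = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0
        out[mask] = avg[mask]                  # only masked pixels are updated
    return out

# Synthetic frame: smooth horizontal gradient with a saturated highlight patch.
y, x = np.mgrid[0:32, 0:32]
frame = 0.3 + 0.2 * (x / 31.0)
frame[10:14, 10:14] = 1.0                      # specular blob
mask = detect_specular(frame)
restored = inpaint(frame, mask)
```

The diffusion fill converges toward a harmonic interpolation of the surrounding intensities, so the filled patch blends with the local gradient instead of remaining a bright spot.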

  7. Natural Language Processing: Toward Large-Scale, Robust Systems.

    ERIC Educational Resources Information Center

    Haas, Stephanie W.

    1996-01-01

    Natural language processing (NLP) is concerned with getting computers to do useful things with natural language. Major applications include machine translation, text generation, information retrieval, and natural language interfaces. Reviews important developments since 1987 that have led to advances in NLP; current NLP applications; and problems…

  8. Robust tracking control of a magnetically suspended rigid body

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.; Cox, David E.

    1994-01-01

    This study applies H-infinity and μ-synthesis to the design of robust tracking controllers for the Large Angle Magnetic Suspension Test Facility. The modeling, design, analysis, simulation, and testing of a control law that guarantees tracking performance under external disturbances and model uncertainties are investigated. The types of uncertainties considered and the tracking performance metric used are discussed. This study demonstrates the tradeoff between tracking performance at low frequencies and robustness at high frequencies. Two sets of controllers were designed and tested. The first set emphasized performance over robustness, while the second set traded off performance for robustness. Comparisons of simulation and test results are also included. Current simulation and experimental results indicate that reasonably good robust tracking performance can be attained for this system using a multivariable robust control approach.

  9. Robust Variable Selection with Exponential Squared Loss.

    PubMed

    Wang, Xueqin; Jiang, Yunlu; Huang, Mian; Zhang, Heping

    2013-04-01

    Robust variable selection procedures through penalized regression have been gaining increased attention in the literature. They can be used to perform variable selection and are expected to yield robust estimates. However, to the best of our knowledge, the robustness of those penalized regression procedures has not been well characterized. In this paper, we propose a class of penalized robust regression estimators based on exponential squared loss. The motivation for this new procedure is that it enables us to characterize its robustness, which has not been done for the existing procedures, while its performance is near optimal and superior to some recently developed methods. Specifically, under defined regularity conditions, our estimators are n-consistent and possess the oracle property. Importantly, we show that our estimators can achieve the highest asymptotic breakdown point of 1/2 and that their influence functions are bounded with respect to the outliers in either the response or the covariate domain. We performed simulation studies to compare our proposed method with some recent methods, using the oracle method as the benchmark. We consider common sources of influential points. Our simulation studies reveal that our proposed method performs similarly to the oracle method in terms of the model error and the positive selection rate even in the presence of influential points. In contrast, other existing procedures have a much lower non-causal selection rate. Furthermore, we re-analyze the Boston Housing Price Dataset and the Plasma Beta-Carotene Level Dataset that are commonly used examples for regression diagnostics of influential points. Our analysis unravels the discrepancies of using our robust method versus the other penalized regression method, underscoring the importance of developing and applying robust penalized regression methods.
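The behavior of the exponential squared loss can be sketched on synthetic data. The sketch below omits the paper's selection penalty and fits the loss by iteratively reweighted least squares, whose fixed point satisfies the loss's stationarity condition; the data, gamma, and iteration count are assumptions.

```python
import numpy as np

def exp_squared_fit(X, y, gamma=1.0, iters=50):
    """Minimize sum_i (1 - exp(-r_i^2 / gamma)), r = y - X @ beta, by
    iteratively reweighted least squares: weights exp(-r_i^2 / gamma) make
    the weighted normal equations match the loss's stationarity condition,
    so observations with huge residuals get essentially zero weight."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]     # ordinary LS start
    for _ in range(iters):
        r = y - X @ beta
        w = np.exp(-r**2 / gamma)
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=n)
y[:20] += 10.0                       # gross outliers in the response
beta_hat = exp_squared_fit(X, y)
ols_hat = np.linalg.lstsq(X, y, rcond=None)[0]
```

Because the loss is bounded, the 10% of contaminated responses contribute almost nothing to the fit, while ordinary least squares drags the intercept toward the outliers.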

  10. Robust Variable Selection with Exponential Squared Loss

    PubMed Central

    Wang, Xueqin; Jiang, Yunlu; Huang, Mian; Zhang, Heping

    2013-01-01

    Robust variable selection procedures through penalized regression have been gaining increased attention in the literature. They can be used to perform variable selection and are expected to yield robust estimates. However, to the best of our knowledge, the robustness of those penalized regression procedures has not been well characterized. In this paper, we propose a class of penalized robust regression estimators based on exponential squared loss. The motivation for this new procedure is that it enables us to characterize its robustness, which has not been done for the existing procedures, while its performance is near optimal and superior to some recently developed methods. Specifically, under defined regularity conditions, our estimators are n-consistent and possess the oracle property. Importantly, we show that our estimators can achieve the highest asymptotic breakdown point of 1/2 and that their influence functions are bounded with respect to the outliers in either the response or the covariate domain. We performed simulation studies to compare our proposed method with some recent methods, using the oracle method as the benchmark. We consider common sources of influential points. Our simulation studies reveal that our proposed method performs similarly to the oracle method in terms of the model error and the positive selection rate even in the presence of influential points. In contrast, other existing procedures have a much lower non-causal selection rate. Furthermore, we re-analyze the Boston Housing Price Dataset and the Plasma Beta-Carotene Level Dataset that are commonly used examples for regression diagnostics of influential points. Our analysis unravels the discrepancies of using our robust method versus the other penalized regression method, underscoring the importance of developing and applying robust penalized regression methods. PMID:23913996

  11. Commonsense Conceptions of Emergent Processes: Why Some Misconceptions Are Robust

    ERIC Educational Resources Information Center

    Chi, Michelene T. H.

    2005-01-01

    This article offers a plausible domain-general explanation for why some concepts of processes are resistant to instructional remediation although other, apparently similar concepts are more easily understood. The explanation assumes that processes may differ in ontological ways: that some processes (such as the apparent flow in diffusion of dye in…

  12. Robust flight design for an advanced launch system vehicle

    NASA Astrophysics Data System (ADS)

    Dhand, Sanjeev K.; Wong, Kelvin K.

    Current launch vehicle trajectory design philosophies are generally based on maximizing payload capability. This approach results in an expensive trajectory design process for each mission. Two concepts of robust flight design have been developed to significantly reduce this cost: Standardized Trajectories and Command Multiplier Steering (CMS). These concepts were analyzed for an Advanced Launch System (ALS) vehicle, although their applicability is not restricted to any particular vehicle. Preliminary analysis has demonstrated the feasibility of these concepts at minimal loss in payload capability.

  13. Robust large-scale parallel nonlinear solvers for simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any existing linear solver, which makes it simple
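The key idea of Broyden's method, replacing Jacobian evaluations with cheap rank-one secant updates, can be shown in a bare-bones dense sketch (the report's limited-memory variant and tensor method are beyond a short example; the test system is arbitrary).

```python
import numpy as np

def broyden(F, x0, tol=1e-10, max_iter=100):
    """Broyden's 'good' method: one forward-difference Jacobian estimate,
    then rank-one secant updates instead of further Jacobian evaluations."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    Fx = F(x)
    h = 1e-6
    B = np.empty((n, n))                       # initial Jacobian approximation
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        B[:, j] = (F(x + e) - Fx) / h
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)            # quasi-Newton step
        x_new = x + s
        Fx_new = F(x_new)
        y = Fx_new - Fx
        B += np.outer(y - B @ s, s) / (s @ s)  # secant (rank-one) update
        x, Fx = x_new, Fx_new
    return x

# Toy system: circle x^2 + y^2 = 4 intersected with hyperbola x*y = 1.
def F(v):
    x, y = v
    return np.array([x**2 + y**2 - 4.0, x * y - 1.0])

root = broyden(F, np.array([2.0, 0.5]))
```

In a real large-scale setting, `F` would be a residual the code cannot differentiate, which is exactly the situation where the report finds Broyden's method converging when Newton's method cannot.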

  14. Computational methods using genome-wide association studies to predict radiotherapy complications and to identify correlative molecular processes

    NASA Astrophysics Data System (ADS)

    Oh, Jung Hun; Kerns, Sarah; Ostrer, Harry; Powell, Simon N.; Rosenstein, Barry; Deasy, Joseph O.

    2017-02-01

    The biological cause of clinically observed variability of normal tissue damage following radiotherapy is poorly understood. We hypothesized that machine/statistical learning methods using single nucleotide polymorphism (SNP)-based genome-wide association studies (GWAS) would identify groups of patients of differing complication risk, and furthermore could be used to identify key biological sources of variability. We developed a novel learning algorithm, called pre-conditioned random forest regression (PRFR), to construct polygenic risk models using hundreds of SNPs, thereby capturing genomic features that confer small differential risk. Predictive models were trained and validated on a cohort of 368 prostate cancer patients for two post-radiotherapy clinical endpoints: late rectal bleeding and erectile dysfunction. The proposed method results in better predictive performance compared with existing computational methods. Gene ontology enrichment analysis and protein-protein interaction network analysis are used to identify key biological processes and proteins that were plausible based on other published studies. In conclusion, we confirm that novel machine learning methods can produce large predictive models (hundreds of SNPs), yielding clinically useful risk stratification models, as well as identifying important underlying biological processes in the radiation damage and tissue repair process. The methods are generally applicable to GWAS data and are not specific to radiotherapy endpoints.

  15. Robust symmetry-protected metrology with the Haldane phase

    NASA Astrophysics Data System (ADS)

    Bartlett, Stephen D.; Brennen, Gavin K.; Miyake, Akimasa

    2018-01-01

    We propose a metrology scheme that is made robust to a wide range of noise processes by using the passive, error-preventing properties of symmetry-protected topological phases. The so-called fractionalized edge mode of an antiferromagnetic Heisenberg spin-1 chain in a rotationally-symmetric Haldane phase can be used to measure the direction of an unknown electric field, by exploiting the way in which the field direction reduces the symmetry of the chain. Specifically, the direction (and when supplementing with a known background field, also the strength) of the field is registered in the holonomy under an adiabatic sensing protocol, and the degenerate fractionalized edge mode is protected through this process by the remaining reduced symmetry. We illustrate the scheme with respect to a potential realization by Rydberg dressed atoms.

  16. How MAP kinase modules function as robust, yet adaptable, circuits

    PubMed Central

    Tian, Tianhai; Harding, Angus

    2014-01-01

    Genetic and biochemical studies have revealed that the diversity of cell types and developmental patterns evident within the animal kingdom is generated by a handful of conserved, core modules. Core biological modules must be robust, able to maintain functionality despite perturbations, and yet sufficiently adaptable for random mutations to generate phenotypic variation during evolution. Understanding how robust, adaptable modules have influenced the evolution of eukaryotes will inform both evolutionary and synthetic biology. One such system is the MAP kinase module, which consists of a 3-tiered kinase circuit configuration that has been evolutionarily conserved from yeast to man. MAP kinase signal transduction pathways are used across eukaryotic phyla to drive biological functions that are crucial for life. Here we ask the fundamental question, why do MAPK modules follow a conserved 3-tiered topology rather than some other number of tiers? Using computational simulations, we identify a fundamental 2-tiered circuit topology that can be readily reconfigured by feedback loops and scaffolds to generate diverse signal outputs. When this 2-kinase circuit is connected to proximal input kinases, a 3-tiered modular configuration is created that is both robust and adaptable, providing a biological circuit that can regulate multiple phenotypes and maintain functionality in an uncertain world. We propose that the 3-tiered signal transduction module has been conserved through positive selection, because it facilitated the generation of phenotypic variation during eukaryotic evolution. PMID:25483189
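One well-known property of tiered kinase cascades, that composing tiers steepens the dose-response curve into a more switch-like output, can be illustrated with a toy steady-state model; the Hill parameters and the reduction of each tier to a single activation function are assumptions, not the study's simulations.

```python
import numpy as np

def hill(u, K=0.5, n=2.0):
    """Active fraction of one kinase tier (toy Hill kinetics; K, n assumed)."""
    return u**n / (K**n + u**n)

def cascade(signal, tiers):
    """Steady-state output of a linear kinase cascade: each tier's active
    fraction serves as the input of the next tier."""
    out = signal
    for _ in range(tiers):
        out = hill(out)
    return out

s = np.linspace(0.0, 1.0, 101)
one_tier = cascade(s, tiers=1)
three_tier = cascade(s, tiers=3)

def max_slope(y, ds=0.01):
    """Steepest point of the dose-response curve: a crude switch-likeness measure."""
    return np.max(np.diff(y)) / ds
```

By the chain rule, the composed response multiplies the slopes of the individual tiers, so the 3-tier curve switches more sharply than a single tier at the same parameter values.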

  17. Using Bayesian Inference Framework towards Identifying Gas Species and Concentration from High Temperature Resistive Sensor Array Data

    DOE PAGES

    Liu, Yixin; Zhou, Kai; Lei, Yu

    2015-01-01

    High temperature gas sensors have been highly demanded for combustion process optimization and toxic emissions control, but they usually suffer from poor selectivity. In order to solve this selectivity issue and identify unknown reducing gas species (CO, CH4, and C3H8) and concentrations, a high temperature resistive sensor array data set was built in this study based on 5 reported sensors. Each sensor showed specific responses towards different types of reducing gas at certain concentrations, from which calibration curves were fitted, providing a benchmark sensor array response database. A Bayesian inference framework was then utilized to process the sensor array data and build a sample selection program to simultaneously identify gas species and concentration, by formulating a proper likelihood between the measured sensor array response pattern of an unknown gas and each sampled sensor array response pattern in the benchmark database. The algorithm shows good robustness and can accurately identify gas species and predict gas concentration with an error of less than 10% based on a limited amount of experimental data. These features indicate that the Bayesian probabilistic approach is a simple and efficient way to process sensor array data, one that can significantly reduce the required computational overhead and training data.
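The maximum a posteriori step can be sketched for the species-identification part alone (fixing concentration); the calibration values, noise level, and flat prior below are hypothetical, not the study's benchmark database.

```python
import numpy as np

# Hypothetical benchmark: mean response of 5 sensors (columns) to 3 gases
# (rows) at one fixed concentration.
benchmark = np.array([
    [1.8, 0.4, 2.5, 0.9, 1.2],   # CO
    [0.6, 2.1, 0.8, 1.7, 0.3],   # CH4
    [2.4, 1.5, 0.5, 0.2, 1.9],   # C3H8
])
gases = ["CO", "CH4", "C3H8"]
sigma = 0.2                       # assumed measurement noise (std)

def identify(pattern):
    """Gaussian likelihood of the measured pattern under each gas, combined
    with a flat prior; returns the MAP gas and the full posterior."""
    sq = ((benchmark - pattern)**2).sum(axis=1)
    log_like = -sq / (2 * sigma**2)
    post = np.exp(log_like - log_like.max())   # stabilized before normalizing
    post /= post.sum()
    return gases[int(np.argmax(post))], post

measured = np.array([0.7, 2.0, 0.9, 1.6, 0.4])  # noisy CH4-like pattern
gas, posterior = identify(measured)
```

Extending the sketch to concentration amounts to adding one benchmark row per (gas, concentration) sample drawn from the calibration curves and taking the MAP over that larger grid.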

  18. Pharmaceutical identifier confirmation via DART-TOF.

    PubMed

    Easter, Jacob L; Steiner, Robert R

    2014-07-01

    Pharmaceutical analysis comprises a large amount of the casework in forensic controlled substances laboratories. In order to reduce the time of analysis for pharmaceuticals, a Direct Analysis in Real Time ion source coupled with an accurate mass time-of-flight (DART-TOF) mass spectrometer was used to confirm identity. DART-TOF spectral data for pharmaceutical samples were analyzed and evaluated by comparison to standard spectra. Identical mass pharmaceuticals were differentiated using collision induced dissociation fragmentation, present/absent ions, and abundance comparison box plots; principal component analysis (PCA) and linear discriminant analysis (LDA) were used for differentiation of identical mass mixed drug spectra. Mass assignment reproducibility and robustness tests were performed on the DART-TOF spectra. Impacts on the forensic science community include a decrease in analysis time over traditional gas chromatography/mass spectrometry (GC/MS) confirmation, better laboratory efficiency, and simpler sample preparation. Using physical identifiers and the DART-TOF to confirm pharmaceutical identity will eliminate the use of GC/MS and effectively reduce analysis time while still complying with accepted analysis protocols. This will prove helpful in laboratories with large backlogs and will simplify the confirmation process. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. Characterization of stem cells and cancer cells on the basis of gene expression profile stability, plasticity, and robustness: dynamical systems theory of gene expressions under cell-cell interaction explains mutational robustness of differentiated cells and suggests how cancer cells emerge.

    PubMed

    Kaneko, Kunihiko

    2011-06-01

    Here I present and discuss a model that, among other things, appears able to describe the dynamics of cancer cell origin from the perspective of stable and unstable gene expression profiles. In identifying such aberrant gene expression profiles as lying outside the normal stable states attracted through development and normal cell differentiation, the hypothesis explains why cancer cells accumulate mutations, to which they are not robust, and why these mutations create a new stable state far from the normal gene expression profile space. Such cells are in strong contrast with normal cell types that appeared as an attractor state in the gene expression dynamical system under cell-cell interaction and achieved robustness to noise through evolution, which in turn also conferred robustness to mutation. In complex gene regulation networks, other aberrant cellular states lacking such high robustness are expected to remain, which would correspond to cancer cells. Copyright © 2011 WILEY Periodicals, Inc.

  20. Robust optimization of a tandem grating solar thermal absorber

    NASA Astrophysics Data System (ADS)

    Choi, Jongin; Kim, Mingeon; Kang, Kyeonghwan; Lee, Ikjin; Lee, Bong Jae

    2018-04-01

    Ideal solar thermal absorbers need a high spectral absorptance across the broad solar spectrum to utilize solar radiation effectively. The majority of recent studies of solar thermal absorbers focus on achieving nearly perfect absorption using nanostructures, whose characteristic dimension is smaller than the wavelength of sunlight. However, precise fabrication of such nanostructures is not easy in reality; unavoidable errors always occur to some extent in the dimensions of fabricated nanostructures, causing an undesirable deviation in absorption performance between the designed structure and the actually fabricated one. In order to minimize the variation in the solar absorptance due to the fabrication error, robust optimization can be performed during the design process. However, the optimization of a solar thermal absorber considering all design variables often requires tremendous computational costs to find an optimum combination of design variables with robustness as well as high performance. To achieve this goal, we apply robust optimization using the Kriging method and the genetic algorithm for designing a tandem grating solar absorber. By constructing a surrogate model through the Kriging method, computational cost can be substantially reduced because exact calculation of the performance for every combination of variables is not necessary. Using the surrogate model and the genetic algorithm, we successfully design an effective solar thermal absorber exhibiting a low level of performance degradation due to the fabrication uncertainty of design variables.
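The surrogate-plus-robust-objective pattern can be sketched in one dimension. Here a hand-rolled Gaussian RBF interpolant stands in for Kriging and a grid search stands in for the genetic algorithm; the performance landscape and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def absorptance(w):
    """Toy 'expensive' performance model: a tall narrow peak (w = 0.3)
    and a slightly lower but much broader peak (w = 0.7)."""
    return 0.95 * np.exp(-(w - 0.3)**2 / 0.001) + 0.85 * np.exp(-(w - 0.7)**2 / 0.02)

# Surrogate (stand-in for Kriging): Gaussian RBF interpolation of 30
# "expensive" evaluations, so candidate designs can be scored cheaply.
W = np.linspace(0.0, 1.0, 30)
y = absorptance(W)
ell = 0.05
K = np.exp(-(W[:, None] - W[None, :])**2 / (2 * ell**2))
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(W)), y)

def surrogate(w):
    k = np.exp(-(np.atleast_1d(w)[:, None] - W[None, :])**2 / (2 * ell**2))
    return k @ alpha

def robust_score(w, sigma=0.03, n=400):
    """Expected surrogate performance under Gaussian fabrication error."""
    return surrogate(w + sigma * rng.normal(size=n)).mean()

designs = np.linspace(0.05, 0.95, 91)
nominal_best = designs[np.argmax(absorptance(designs))]
robust_best = designs[np.argmax([robust_score(w) for w in designs])]
```

On this toy landscape the nominal optimum sits on the narrow peak, while the robust objective prefers the broad one; a genetic algorithm would replace the grid search once several coupled grating dimensions are optimized at once.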

  1. Torque coordinating robust control of shifting process for dry dual clutch transmission equipped in a hybrid car

    NASA Astrophysics Data System (ADS)

    Zhao, Z.-G.; Chen, H.-J.; Yang, Y.-Y.; He, L.

    2015-09-01

    For a hybrid car equipped with dual clutch transmission (DCT), the coordination control problems of clutches and power sources are investigated while taking full advantage of the integrated starter generator motor's fast response speed and high accuracy (speed and torque). First, a dynamic model of the shifting process is established, the vehicle acceleration is quantified according to the intentions of the driver, and the torque transmitted by clutches is calculated based on the designed disengaging principle during the torque phase. Next, a robust H∞ controller is designed to ensure speed synchronisation despite the existence of model uncertainties, measurement noise, and engine torque lag. The engine torque lag and measurement noise are used as external disturbances to initially modify the output torque of the power source. Additionally, during the torque switch phase, the torque of the power sources is smoothly transitioned to the driver's demanded torque. Finally, the torque of the power sources is further distributed based on the optimisation of system efficiency, and the throttle opening of the engine is constrained to avoid sharp torque variations. The simulation results verify that the proposed control strategies effectively address the problem of coordinating control of clutches and power sources, establishing a foundation for the application of DCT in hybrid cars.

  2. Designing Phononic Crystals with Wide and Robust Band Gaps

    NASA Astrophysics Data System (ADS)

    Jia, Zian; Chen, Yanyu; Yang, Haoxiang; Wang, Lifeng

    2018-04-01

    Phononic crystals (PnCs) engineered to manipulate and control the propagation of mechanical waves have enabled the design of a range of novel devices, such as waveguides, frequency modulators, and acoustic cloaks, for which wide and robust phononic band gaps are highly preferable. While numerous PnCs have been designed in recent decades, to the best of our knowledge, PnCs that possess simultaneously wide and robust band gaps (to randomness and deformations) have not yet been reported. Here, we demonstrate that by combining the band-gap formation mechanisms of Bragg scattering and local resonances (the latter one is dominating), PnCs with wide and robust phononic band gaps can be established. The robustness of the phononic band gaps is then discussed from two aspects: robustness to geometric randomness (manufacturing defects) and robustness to deformations (mechanical stimuli). Analytical formulations further predict the optimal design parameters, and an uncertainty analysis quantifies the randomness effect of each design parameter. Moreover, we show that the deformation robustness originates from a local resonance-dominant mechanism together with the suppression of structural instability. Importantly, the proposed PnCs require only a small number of layers of elements (three unit cells) to obtain broad, robust, and strong attenuation bands, which offer great potential in designing flexible and deformable phononic devices.

  3. A novel mini-DNA barcoding assay to identify processed fins from internationally protected shark species.

    PubMed

    Fields, Andrew T; Abercrombie, Debra L; Eng, Rowena; Feldheim, Kevin; Chapman, Demian D

    2015-01-01

    There is a growing need to identify shark products in trade, in part due to the recent listing of five commercially important species on the Appendices of the Convention on International Trade in Endangered Species (CITES; porbeagle, Lamna nasus, oceanic whitetip, Carcharhinus longimanus, scalloped hammerhead, Sphyrna lewini, smooth hammerhead, S. zygaena, and great hammerhead, S. mokarran) in addition to three species listed in the early part of this century (whale, Rhincodon typus, basking, Cetorhinus maximus, and white, Carcharodon carcharias). Shark fins are traded internationally to supply the Asian dried seafood market, in which they are used to make the luxury dish shark fin soup. Shark fins usually enter international trade with their skin still intact and can be identified using morphological characters or standard DNA-barcoding approaches. Once they reach Asia and are traded in this region the skin is removed and they are treated with chemicals that eliminate many key diagnostic characters and degrade their DNA ("processed fins"). Here, we present a validated mini-barcode assay based on partial sequences of the cytochrome oxidase I gene that can reliably identify the processed fins of seven of the eight CITES listed shark species. We also demonstrate that the assay can even frequently identify the species or genus of origin of shark fin soup (31 out of 50 samples).

  4. A Novel Mini-DNA Barcoding Assay to Identify Processed Fins from Internationally Protected Shark Species

    PubMed Central

    Fields, Andrew T.; Abercrombie, Debra L.; Eng, Rowena; Feldheim, Kevin; Chapman, Demian D.

    2015-01-01

    There is a growing need to identify shark products in trade, in part due to the recent listing of five commercially important species on the Appendices of the Convention on International Trade in Endangered Species (CITES; porbeagle, Lamna nasus, oceanic whitetip, Carcharhinus longimanus, scalloped hammerhead, Sphyrna lewini, smooth hammerhead, S. zygaena, and great hammerhead, S. mokarran) in addition to three species listed in the early part of this century (whale, Rhincodon typus, basking, Cetorhinus maximus, and white, Carcharodon carcharias). Shark fins are traded internationally to supply the Asian dried seafood market, in which they are used to make the luxury dish shark fin soup. Shark fins usually enter international trade with their skin still intact and can be identified using morphological characters or standard DNA-barcoding approaches. Once they reach Asia and are traded in this region the skin is removed and they are treated with chemicals that eliminate many key diagnostic characters and degrade their DNA (“processed fins”). Here, we present a validated mini-barcode assay based on partial sequences of the cytochrome oxidase I gene that can reliably identify the processed fins of seven of the eight CITES listed shark species. We also demonstrate that the assay can even frequently identify the species or genus of origin of shark fin soup (31 out of 50 samples). PMID:25646789

  5. Robust Fault Detection and Isolation for Stochastic Systems

    NASA Technical Reports Server (NTRS)

    George, Jemin; Gregory, Irene M.

    2010-01-01

    This paper outlines the formulation of a robust fault detection and isolation scheme that can precisely detect and isolate simultaneous actuator and sensor faults for uncertain linear stochastic systems. The given robust fault detection scheme based on the discontinuous robust observer approach would be able to distinguish between model uncertainties and actuator failures and therefore eliminate the problem of false alarms. Since the proposed approach involves precise reconstruction of sensor faults, it can also be used for sensor fault identification and the reconstruction of true outputs from faulty sensor outputs. Simulation results presented here validate the effectiveness of the robust fault detection and isolation system.

  6. Robust, automatic GPS station velocities and velocity time series

    NASA Astrophysics Data System (ADS)

    Blewitt, G.; Kreemer, C.; Hammond, W. C.

    2014-12-01

    Automation in GPS coordinate time series analysis makes results more objective and reproducible, but not necessarily as robust as the human eye to detect problems. Moreover, it is not a realistic option to manually scan our current load of >20,000 time series per day. This motivates us to find an automatic way to estimate station velocities that is robust to outliers, discontinuities, seasonality, and noise characteristics (e.g., heteroscedasticity). Here we present a non-parametric method based on the Theil-Sen estimator, defined as the median of velocities vij=(xj-xi)/(tj-ti) computed between all pairs (i, j). Theil-Sen estimators produce statistically identical solutions to ordinary least squares for normally distributed data, but they can tolerate up to 29% of data being problematic. To mitigate seasonality, our proposed estimator only uses pairs approximately separated by an integer number of years (N-δt)<(tj-ti)<(N+δt), where δt is chosen to be small enough to capture seasonality, yet large enough to reduce random error. We fix N=1 to maximally protect against discontinuities. In addition to estimating an overall velocity, we also use these pairs to estimate velocity time series. To test our methods, we process real data sets that have already been used with velocities published in the NA12 reference frame. Accuracy can be tested by the scatter of horizontal velocities in the North American plate interior, which is known to be stable to ~0.3 mm/yr. This presents new opportunities for time series interpretation. For example, the pattern of velocity variations at the interannual scale can help separate tectonic from hydrological processes. Without any step detection, velocity estimates prove to be robust for stations affected by the Mw7.2 2010 El Mayor-Cucapah earthquake, and velocity time series show a clear change after the earthquake, without any of the usual parametric constraints, such as relaxation of postseismic velocities to their preseismic values.
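The estimator described above can be sketched directly on a synthetic coordinate series; the trend, seasonal amplitude, noise level, and δt window below are illustrative assumptions, not the NA12 processing.

```python
import numpy as np

def annual_theil_sen(t, x, dt=0.1):
    """Median of pairwise velocities over pairs separated by roughly one
    year (|tj - ti - 1| < dt), suppressing outliers (via the median) and
    seasonal signals (via near-integer-year separation) at the same time."""
    t = np.asarray(t)
    x = np.asarray(x)
    i, j = np.triu_indices(len(t), k=1)
    sep = t[j] - t[i]
    keep = np.abs(sep - 1.0) < dt          # N = 1, as in the record
    return np.median((x[j] - x[i])[keep] / sep[keep])

# Synthetic daily series (4 years): 3 mm/yr trend, 2 mm annual cycle,
# white noise, and an uncorrected 10 mm offset segment.
rng = np.random.default_rng(3)
t = np.arange(0.0, 4.0, 1.0 / 365.25)      # time in years
x = 3.0 * t + 2.0 * np.sin(2 * np.pi * t) + 0.5 * rng.normal(size=len(t))
x[600:640] += 10.0                         # undetected offset/outliers
v = annual_theil_sen(t, x)
```

Pairs a year apart see nearly the same phase of the seasonal cycle, so its contribution to each pairwise slope largely cancels, while the median discards the pairs contaminated by the offset segment.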

  7. SU-E-T-625: Robustness Evaluation and Robust Optimization of IMPT Plans Based on Per-Voxel Standard Deviation of Dose Distributions.

    PubMed

    Liu, W; Mohan, R

    2012-06-01

    Proton dose distributions, IMPT in particular, are highly sensitive to setup and range uncertainties. We report a novel method, based on per-voxel standard deviation (SD) of dose distributions, to evaluate the robustness of proton plans and to robustly optimize IMPT plans to render them less sensitive to uncertainties. For each optimization iteration, nine dose distributions are computed - the nominal one, and one each for ± setup uncertainties along the x, y and z axes and for ± range uncertainty. The SD of dose in each voxel is used to create an SD-volume histogram (SVH) for each structure. The SVH may be considered a quantitative representation of the robustness of the dose distribution. For optimization, the desired robustness may be specified in terms of an SD-volume (SV) constraint on the CTV and incorporated as a term in the objective function. Results of optimization with and without this constraint were compared in terms of plan optimality and robustness using the so-called 'worst case' dose distributions, which are obtained by assigning the lowest among the nine doses to each voxel in the clinical target volume (CTV) and the highest to normal tissue voxels outside the CTV. The SVH curve and the area under it for each structure were used as quantitative measures of robustness. The penalty parameter of the SV constraint may be varied to control the tradeoff between robustness and plan optimality. We applied these methods to one case each of H&N and lung. In both cases, we found that imposing the SV constraint improved plan robustness but at the cost of normal tissue sparing. SVH-based optimization and evaluation is an effective tool for robustness evaluation and robust optimization of IMPT plans. Studies need to be conducted to test the methods for larger cohorts of patients and for other sites. This research is supported by National Cancer Institute (NCI) grant P01CA021239, the University Cancer Foundation via the Institutional Research Grant program at the University of Texas MD
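The SVH bookkeeping described above can be sketched with toy numbers; the voxel doses and perturbation magnitudes below are random stand-ins for the nine recomputed dose distributions.

```python
import numpy as np

rng = np.random.default_rng(4)
nominal = rng.uniform(50.0, 60.0, size=1000)     # toy CTV voxel doses (Gy)
# Eight perturbed dose distributions stand in for +/- setup shifts along
# x, y, z and +/- range uncertainty (real ones come from dose recomputation).
perturbed = nominal + rng.normal(0.0, 1.5, size=(8, nominal.size))
scenarios = np.vstack([nominal, perturbed])      # the nine distributions

sd = scenarios.std(axis=0)                       # per-voxel SD of dose

def svh(sd, grid):
    """SD-volume histogram: fraction of voxels whose SD exceeds each level."""
    return np.array([(sd >= g).mean() for g in grid])

grid = np.linspace(0.0, 4.0, 41)
curve = svh(sd, grid)
# Area under the SVH curve: a scalar robustness measure (smaller = more robust).
area = np.sum(0.5 * (curve[1:] + curve[:-1]) * np.diff(grid))

# 'Worst case' CTV dose: the lowest of the nine doses in each voxel.
worst_ctv = scenarios.min(axis=0)
```

Like a DVH, the SVH is monotonically non-increasing; pushing its curve (and hence the area under it) down for the CTV is what the SV constraint in the objective function does.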

  8. Mining manufacturing data for discovery of high productivity process characteristics.

    PubMed

    Charaniya, Salim; Le, Huong; Rangwala, Huzefa; Mills, Keri; Johnson, Kevin; Karypis, George; Hu, Wei-Shou

    2010-06-01

    Modern manufacturing facilities for bioproducts are highly automated, with advanced process monitoring and data archiving systems. The time dynamics of hundreds of process parameters and outcome variables over a large number of production runs are archived in the data warehouse. This vast amount of data is a vital resource for comprehending the complex characteristics of bioprocesses and enhancing production robustness. Cell culture process data from 108 'trains', comprising production as well as inoculum bioreactors from Genentech's manufacturing facility, were investigated. Each run comprises over one hundred on-line and off-line temporal parameters. A kernel-based approach combined with a maximum-margin support vector regression algorithm was used to integrate all the process parameters and develop predictive models for a key cell culture performance parameter. The model was also used to identify and rank process parameters according to their relevance in predicting process outcome. Evaluation of cell culture stage-specific models indicates that production performance can be reliably predicted days prior to harvest. Strong associations between several temporal parameters at various manufacturing stages and the final process outcome were uncovered. This model-based data mining represents an important step forward in establishing process data-driven knowledge discovery in bioprocesses. Implementation of this methodology on the manufacturing floor can facilitate real-time decision making and thereby improve the robustness of large-scale bioprocesses. © 2010 Elsevier B.V. All rights reserved.
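
    The modeling idea can be sketched as follows (kernel ridge regression with an RBF kernel stands in for the paper's maximum-margin SVR; the per-run summary features and data here are entirely synthetic):

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    # Gaussian kernel between the row vectors of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_ridge(X, y, lam=1e-2, gamma=0.1):
    # Solve (K + lam*I) alpha = y for the dual coefficients
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

rng = np.random.default_rng(1)
X = rng.normal(size=(108, 20))   # 108 runs, 20 summary features per run
y = 2.0 * X[:, 0] - X[:, 3] + rng.normal(scale=0.05, size=108)

alpha = fit_kernel_ridge(X, y)
pred = rbf_kernel(X, X) @ alpha
print(round(float(np.corrcoef(pred, y)[0, 1]), 2))  # close to 1 in-sample
```

    Ranking parameters by relevance, as the abstract describes, would additionally require an importance analysis (e.g., retraining with features withheld), which is omitted here.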

  9. Evaluation of Recoverable-Robust Timetables on Tree Networks

    NASA Astrophysics Data System (ADS)

    D'Angelo, Gianlorenzo; di Stefano, Gabriele; Navarra, Alfredo

    In the context of scheduling and timetabling, we study a challenging combinatorial problem which is interesting from both a practical and a theoretical point of view. The motivation behind it is to cope with scheduled activities which might be subject to unavoidable disturbances, such as delays, occurring during the operational phase. The idea is to preventively plan some extra time for the scheduled activities in order to be "prepared" if a delay occurs, and to absorb it without the necessity of re-scheduling the activities from scratch. This realizes the concept of designing so-called robust timetables. During the planning phase, one has to consider recovery features that might be applied at runtime if delays occur. Such recovery capabilities are given as input along with the possible delays that must be considered. The objective is the minimization of the overall needed time. The quality of a robust timetable is measured by the price of robustness, i.e., the ratio between the cost of the robust timetable and that of a non-robust optimal timetable. The considered problem is known to be NP-hard. We propose a pseudo-polynomial time algorithm and apply it on random networks and real case scenarios provided by Italian railways. We evaluate the effect of robustness on the scheduling of the activities and provide the price of robustness with respect to different scenarios. We experimentally show the practical effectiveness and efficiency of the proposed algorithm.
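
    The price-of-robustness metric itself is simple to state; a toy illustration on a chain of activities with uniform slack (this shows only the metric, not the paper's pseudo-polynomial algorithm or tree-network model):

```python
def price_of_robustness(durations, buffer):
    """Ratio of the padded (robust) timetable's total time to the
    non-robust optimum, for a chain of activities where each activity
    gets a slack buffer to absorb a delay without rescheduling."""
    nominal = sum(durations)                     # non-robust optimal cost
    robust = sum(d + buffer for d in durations)  # buffered timetable cost
    return robust / nominal

print(price_of_robustness([10, 10, 10], buffer=2))  # -> 1.2
```

    A ratio of 1.2 means the robust timetable pays a 20% overhead for its ability to absorb delays.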

  10. Broken Robustness Analysis: How to make proper climate change conclusions in contradictory multimodal measurement contexts.

    NASA Astrophysics Data System (ADS)

    Keyser, V.

    2015-12-01

    Philosophers of science discuss how multiple modes of measurement can generate evidence for the existence and character of a phenomenon (Horwich 1982; Hacking 1983; Franklin and Howson 1984; Collins 1985; Sober 1989; Trout 1993; Culp 1995; Keeley 2002; Staley 2004; Weber 2005; Keyser 2012). But how can this work systematically in climate change measurement? Additionally, what conclusions can scientists and policy-makers draw when different modes of measurement fail to be robust by producing contradictory results? First, I present a new technical account of robust measurement (RAMP) that focuses on the physical independence of measurement processes. I detail how physically independent measurement processes "check each other's results." (This account is in contrast to philosophical accounts of robustness analysis that focus on independent model assumptions or independent measurement products or results.) Second, I present a puzzle about contradictory and divergent climate change measures, which has consistently re-emerged in climate measurement. This discussion will focus on land, drilling, troposphere, and computer simulation measures. Third, to systematically solve this climate measurement puzzle, I use RAMP in the context of drought measurement in order to generate a classification of measurement processes. Here, I discuss how multimodal precipitation measures (e.g., measures of precipitation deficit like the Standardized Precipitation Index vs. air humidity measures like the Standardized Relative Humidity Index) can help with the classification scheme of climate change measurement processes. Finally, I discuss how this classification of measures can help scientists and policy-makers draw effective conclusions in contradictory multimodal climate change measurement contexts.

  11. Clocking the social mind by identifying mental processes in the IAT with electrical neuroimaging

    PubMed Central

    Schiller, Bastian; Gianotti, Lorena R. R.; Baumgartner, Thomas; Nash, Kyle; Koenig, Thomas; Knoch, Daria

    2016-01-01

    Why do people take longer to associate the word “love” with outgroup words (incongruent condition) than with ingroup words (congruent condition)? Despite the widespread use of the implicit association test (IAT), it has remained unclear whether this IAT effect is due to additional mental processes in the incongruent condition, or due to longer duration of the same processes. Here, we addressed this previously insoluble issue by assessing the spatiotemporal evolution of brain electrical activity in 83 participants. From stimulus presentation until response production, we identified seven processes. Crucially, all seven processes occurred in the same temporal sequence in both conditions, but participants needed more time to perform one early occurring process (perceptual processing) and one late occurring process (implementing cognitive control to select the motor response) in the incongruent compared with the congruent condition. We also found that the latter process contributed to individual differences in implicit bias. These results advance understanding of the neural mechanics of response time differences in the IAT: They speak against theories that explain the IAT effect as due to additional processes in the incongruent condition and speak in favor of theories that assume a longer duration of specific processes in the incongruent condition. More broadly, our data analysis approach illustrates the potential of electrical neuroimaging to illuminate the temporal organization of mental processes involved in social cognition. PMID:26903643

  12. Clocking the social mind by identifying mental processes in the IAT with electrical neuroimaging.

    PubMed

    Schiller, Bastian; Gianotti, Lorena R R; Baumgartner, Thomas; Nash, Kyle; Koenig, Thomas; Knoch, Daria

    2016-03-08

    Why do people take longer to associate the word "love" with outgroup words (incongruent condition) than with ingroup words (congruent condition)? Despite the widespread use of the implicit association test (IAT), it has remained unclear whether this IAT effect is due to additional mental processes in the incongruent condition, or due to longer duration of the same processes. Here, we addressed this previously insoluble issue by assessing the spatiotemporal evolution of brain electrical activity in 83 participants. From stimulus presentation until response production, we identified seven processes. Crucially, all seven processes occurred in the same temporal sequence in both conditions, but participants needed more time to perform one early occurring process (perceptual processing) and one late occurring process (implementing cognitive control to select the motor response) in the incongruent compared with the congruent condition. We also found that the latter process contributed to individual differences in implicit bias. These results advance understanding of the neural mechanics of response time differences in the IAT: They speak against theories that explain the IAT effect as due to additional processes in the incongruent condition and speak in favor of theories that assume a longer duration of specific processes in the incongruent condition. More broadly, our data analysis approach illustrates the potential of electrical neuroimaging to illuminate the temporal organization of mental processes involved in social cognition.

  13. Robust vortex lines, vortex rings, and hopfions in three-dimensional Bose-Einstein condensates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bisset, R. N.; Wang, Wenlong; Ticknor, Christopher

    Performing a systematic Bogoliubov–de Gennes spectral analysis, we illustrate that stationary vortex lines, vortex rings, and more exotic states, such as hopfions, are robust in three-dimensional atomic Bose-Einstein condensates for large parameter intervals. Importantly, we find that the hopfion can be stabilized in a simple parabolic trap, without the need for trap rotation or inhomogeneous interactions. We supplement our spectral analysis by studying the dynamics of such stationary states; we find them to be robust against significant perturbations of the initial state. In the unstable regimes, we not only identify the unstable mode, such as a quadrupolar or hexapolar mode, but we also observe the corresponding instability dynamics. Moreover, deep in the Thomas-Fermi regime, we investigate the particlelike behavior of vortex rings and hopfions.

  14. Robust vortex lines, vortex rings, and hopfions in three-dimensional Bose-Einstein condensates

    DOE PAGES

    Bisset, R. N.; Wang, Wenlong; Ticknor, Christopher; ...

    2015-12-07

    Performing a systematic Bogoliubov–de Gennes spectral analysis, we illustrate that stationary vortex lines, vortex rings, and more exotic states, such as hopfions, are robust in three-dimensional atomic Bose-Einstein condensates for large parameter intervals. Importantly, we find that the hopfion can be stabilized in a simple parabolic trap, without the need for trap rotation or inhomogeneous interactions. We supplement our spectral analysis by studying the dynamics of such stationary states; we find them to be robust against significant perturbations of the initial state. In the unstable regimes, we not only identify the unstable mode, such as a quadrupolar or hexapolar mode, but we also observe the corresponding instability dynamics. Moreover, deep in the Thomas-Fermi regime, we investigate the particlelike behavior of vortex rings and hopfions.

  15. Identifying different learning styles to enhance the learning experience.

    PubMed

    Anderson, Irene

    2016-10-12

    Identifying your preferred learning style can be a useful way to optimise learning opportunities, and can help learners to recognise their strengths and areas for development in the way that learning takes place. It can also help teachers (educators) to recognise where additional activities are required to ensure the learning experience is robust and effective. There are several models available that may be used to identify learning styles. This article discusses these models and considers their usefulness in healthcare education. Models of teaching styles are also considered.

  16. Combining Digital Watermarking and Fingerprinting Techniques to Identify Copyrights for Color Images

    PubMed Central

    Hsieh, Shang-Lin; Chen, Chun-Che; Shen, Wen-Shan

    2014-01-01

    This paper presents a copyright identification scheme for color images that takes advantage of the complementary nature of watermarking and fingerprinting. It utilizes an authentication logo and the extracted features of the host image to generate a fingerprint, which is then stored in a database and also embedded in the host image to produce a watermarked image. When a dispute over the copyright of a suspect image occurs, the image is first processed by watermarking. If the watermark can be retrieved from the suspect image, the copyright can then be confirmed; otherwise, the watermark then serves as the fingerprint and is processed by fingerprinting. If a match in the fingerprint database is found, then the suspect image will be considered a duplicated one. Because the proposed scheme utilizes both watermarking and fingerprinting, it is more robust than those that only adopt watermarking, and it can also obtain the preliminary result more quickly than those that only utilize fingerprinting. The experimental results show that when the watermarked image suffers slight attacks, watermarking alone is enough to identify the copyright. The results also show that when the watermarked image suffers heavy attacks that render watermarking incompetent, fingerprinting can successfully identify the copyright, hence demonstrating the effectiveness of the proposed scheme. PMID:25114966
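
    The two-stage decision flow described above can be sketched as follows (the extraction and matching routines are stand-ins; the scheme's actual watermark embedding and fingerprint features are not reproduced here):

```python
def identify_copyright(suspect_image, fingerprint_db,
                       extract_watermark, match_fingerprint):
    """Two-stage check: the fast watermark retrieval runs first, with
    a fingerprint database lookup as the fallback for images attacked
    heavily enough that the watermark cannot be retrieved."""
    if extract_watermark(suspect_image) is not None:
        return "copyright confirmed by watermark"
    if match_fingerprint(suspect_image, fingerprint_db) is not None:
        return "duplicate identified by fingerprint"
    return "no match"

# Toy stand-ins: the watermark survives, so stage 1 suffices.
verdict = identify_copyright("suspect.png", {"fp-001"},
                             lambda img: "logo",
                             lambda img, db: None)
print(verdict)  # -> copyright confirmed by watermark
```

    The ordering reflects the tradeoff in the abstract: watermark retrieval is quick but fragile, while fingerprint matching is slower but survives heavy attacks.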

  17. Robust quantum entanglement generation and generation-plus-storage protocols with spin chains

    NASA Astrophysics Data System (ADS)

    Estarellas, Marta P.; D'Amico, Irene; Spiller, Timothy P.

    2017-04-01

    Reliable quantum communication and/or processing links between modules are a necessary building block for various quantum processing architectures. Here we consider a spin-chain system with alternating-strength couplings and containing three defects, which impose three domain walls between topologically distinct regions of the chain. We show that, in addition to its useful, high-fidelity, quantum state transfer properties, an entangling protocol can be implemented in this system, with optional localization and storage of the entangled states. We demonstrate both numerically and analytically that, given a suitable initial product-state injection, the natural dynamics of the system produces a maximally entangled state at a given time. We present detailed investigations of the effects of fabrication errors, analyzing random static disorder in both the diagonal and off-diagonal terms of the system Hamiltonian. Our results show that the entangled-state formation is very robust against perturbations of up to ~10% of the weaker chain coupling, and also robust against timing injection errors. We propose a further protocol, which manipulates the chain in order to localize and store each of the entangled qubits. The engineering of a system with such characteristics would thus provide a useful device for quantum information processing tasks involving the creation and storage of entangled resources.

  18. A powerful and robust test in genetic association studies.

    PubMed

    Cheng, Kuang-Fu; Lee, Jen-Yu

    2014-01-01

    There are several well-known single-SNP tests presented in the literature for detecting gene-disease association signals. Having an efficient and robust testing process in place across all genetic models would allow a more comprehensive approach to analysis. Although some studies have shown that it is possible to construct such a test when the variants are common and the genetic model satisfies certain conditions, the model conditions are too restrictive and in general difficult to verify. In this paper, we propose a powerful and robust test without assuming any model restrictions. Our test is based on selected 2 × 2 tables derived from the usual 2 × 3 table. Based on the signals from these tables, we show through simulations across a wide range of allele frequencies and genetic models that this approach may produce a test that is almost uniformly most powerful in the analysis of low- and high-frequency variants. Two cancer studies are used to demonstrate applications of the proposed test. © 2014 S. Karger AG, Basel.
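
    The underlying idea can be sketched as follows (collapsing the 2 × 3 case/control-by-genotype table into model-specific 2 × 2 tables and taking the strongest signal; this is a generic MAX-type statistic for illustration, not the paper's exact test):

```python
import numpy as np

def chi2_2x2(t):
    # Pearson chi-square statistic for a 2x2 contingency table
    row, col, n = t.sum(1), t.sum(0), t.sum()
    e = np.outer(row, col) / n
    return float(((t - e) ** 2 / e).sum())

def max_model_chi2(table_2x3):
    """Collapse the 2x3 genotype table (columns AA, Aa, aa) under the
    dominant and recessive models, and keep the larger chi-square."""
    dom = np.c_[table_2x3[:, 0], table_2x3[:, 1:].sum(1)]  # AA vs Aa+aa
    rec = np.c_[table_2x3[:, :2].sum(1), table_2x3[:, 2]]  # AA+Aa vs aa
    return max(chi2_2x2(dom), chi2_2x2(rec))

# Toy case/control genotype counts
table = np.array([[30, 50, 20],
                  [50, 40, 10]])
print(round(max_model_chi2(table), 2))  # -> 8.33
```

    Taking the maximum over model-specific collapsings is what buys robustness to the unknown genetic model, at the cost of needing simulation or adjustment for the maximum when calibrating significance.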

  19. Directed International Technological Change and Climate Policy: New Methods for Identifying Robust Policies Under Conditions of Deep Uncertainty

    NASA Astrophysics Data System (ADS)

    Molina-Perez, Edmundo

    It is widely recognized that international environmental technological change is key to reducing the rapidly rising greenhouse gas emissions of emerging nations. In 2010, the United Nations Framework Convention on Climate Change (UNFCCC) Conference of the Parties (COP) agreed to the creation of the Green Climate Fund (GCF). This new multilateral organization has been created with the collective contributions of COP members, and has been tasked with directing over USD 100 billion per year towards investments that can enhance the development and diffusion of clean energy technologies in both advanced and emerging nations (Helm and Pichler, 2015). The landmark agreement reached at COP 21 has reaffirmed the key role that the GCF plays in enabling climate mitigation, as it is now necessary to align large-scale climate financing efforts with the long-term goals agreed at Paris 2015. This study argues that because of the incomplete understanding of the mechanics of international technological change, the multiplicity of policy options, and ultimately the presence of deep uncertainty in climate and technological change, climate financing institutions such as the GCF require new analytical methods for designing long-term robust investment plans. Motivated by these challenges, this dissertation shows that the application of new analytical methods, such as Robust Decision Making (RDM) and Exploratory Modeling (Lempert, Popper and Bankes, 2003), to the study of international technological change and climate policy provides useful insights that can be used for designing a robust architecture of international technological cooperation for climate change mitigation. For this study I developed an exploratory dynamic integrated assessment model (EDIAM) which is used as the scenario generator in a large computational experiment. The scope of the experimental design considers an ample set of climate and technological scenarios. These scenarios combine five sources of uncertainty

  20. Designing Phononic Crystals with Wide and Robust Band Gaps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, Zian; Chen, Yanyu; Yang, Haoxiang

    Here, phononic crystals (PnCs) engineered to manipulate and control the propagation of mechanical waves have enabled the design of a range of novel devices, such as waveguides, frequency modulators, and acoustic cloaks, for which wide and robust phononic band gaps are highly preferable. While numerous PnCs have been designed in recent decades, to the best of our knowledge, PnCs that possess simultaneously wide and robust band gaps (to randomness and deformations) have not yet been reported. Here, we demonstrate that by combining the band-gap formation mechanisms of Bragg scattering and local resonances (the latter being dominant), PnCs with wide and robust phononic band gaps can be established. The robustness of the phononic band gaps is then discussed from two aspects: robustness to geometric randomness (manufacturing defects) and robustness to deformations (mechanical stimuli). Analytical formulations further predict the optimal design parameters, and an uncertainty analysis quantifies the randomness effect of each design parameter. Moreover, we show that the deformation robustness originates from a local-resonance-dominant mechanism together with the suppression of structural instability. Importantly, the proposed PnCs require only a small number of layers of elements (three unit cells) to obtain broad, robust, and strong attenuation bands, which offer great potential in designing flexible and deformable phononic devices.