Sample records for robust tool development

  1. Manufacturing Execution Systems: Examples of Performance Indicator and Operational Robustness Tools.

    PubMed

    Gendre, Yannick; Waridel, Gérard; Guyon, Myrtille; Demuth, Jean-François; Guelpa, Hervé; Humbert, Thierry

    Manufacturing Execution Systems (MES) are computerized systems used to measure production performance in terms of productivity, yield, and quality. The first part describes performance indicators, including overall equipment effectiveness (OEE), process robustness tools, and statistical process control. The second part details tools that help operators maintain process robustness and control by preventing deviations from target using control charts. The MES was developed by Syngenta together with CIMO for automation.
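
    The OEE figure such a system reports is the product of three standard factors. A minimal sketch (the formula is standard; the numbers are illustrative, not from the paper):

      # Overall equipment effectiveness as availability x performance x quality.
      def oee(availability: float, performance: float, quality: float) -> float:
          """Each factor is a fraction in [0, 1]."""
          return availability * performance * quality

      run_time, planned_time = 420.0, 480.0           # minutes
      ideal_cycle, units, good_units = 0.5, 700, 665  # min/unit, counts
      availability = run_time / planned_time          # 0.875
      performance = (ideal_cycle * units) / run_time  # 0.833
      quality = good_units / units                    # 0.95
      print(f"OEE = {oee(availability, performance, quality):.1%}")  # ~69.3%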

  2. Development of a Comprehensive Digital Avionics Curriculum for the Aeronautical Engineer

    DTIC Science & Technology

    2006-03-01

    able to analyze and design aircraft and missile guidance and control systems, including feedback stabilization schemes and stochastic processes, using ... uncertainty modeling for robust control; robust closed-loop stability and performance; robust H-infinity control; robustness checks using mu-analysis ... controlled feedback (reduces noise); statistical group response (reduces pressure toward conformity); when used as a tool to study a complex problem ...

  3. Modern CACSD using the Robust-Control Toolbox

    NASA Technical Reports Server (NTRS)

    Chiang, Richard Y.; Safonov, Michael G.

    1989-01-01

    The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools like singular values and structured singular values, robust synthesis tools like continuous/discrete H2/H-infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation, and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H-infinity loop-shaping, and large space structure model reduction.
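
    The singular-value analysis such a toolbox automates can be sketched with plain NumPy: sweep the frequency response of a state-space model and take its largest singular value; the matrices below are made-up examples, not from the toolbox documentation.

      import numpy as np

      # Sigma plot of G(s) = C (sI - A)^-1 B + D over a frequency grid;
      # the peak approximates the H-infinity norm of G.
      A = np.array([[-1.0, 2.0], [0.0, -3.0]])
      B = np.eye(2)
      C = np.array([[1.0, 0.0], [1.0, 1.0]])
      D = np.zeros((2, 2))

      freqs = np.logspace(-2, 2, 400)
      sigma_max = [
          np.linalg.svd(C @ np.linalg.inv(1j * w * np.eye(2) - A) @ B + D,
                        compute_uv=False)[0]
          for w in freqs
      ]
      print(f"||G||_inf ~= {max(sigma_max):.3f}")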

  4. Robust detection, isolation and accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Emami-Naeini, A.; Akhter, M. M.; Rock, S. M.

    1986-01-01

    The objective is to extend recent advances in robust control system design of multivariable systems to sensor failure detection, isolation, and accommodation (DIA), and to estimator design. This effort provides analysis tools to quantify the trade-off between performance robustness and DIA sensitivity, which are to be used to achieve higher levels of performance robustness for given levels of DIA sensitivity. An innovations-based DIA scheme is used. Estimators, which depend upon a model of the process and process inputs and outputs, are used to generate these innovations. Thresholds used for failure detection are computed from bounds on modeling errors, noise properties, and the class of failures. The applicability of the newly developed tools is demonstrated on a multivariable aircraft turbojet engine example. A new concept called the threshold selector was developed; it represents a significant and innovative tool for the analysis and synthesis of DIA algorithms. The estimators were made robust by introduction of an internal model and by frequency shaping. The internal model provides asymptotically unbiased filter estimates. The incorporation of frequency shaping of the Linear Quadratic Gaussian cost functional modifies the estimator design to make it suitable for sensor failure DIA. The results are compared with previous studies, which used thresholds that were selected empirically. Comparison of the two techniques on a nonlinear dynamic engine simulation shows improved performance of the new method over previous techniques.
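
    The innovations-based detection logic reduces to a threshold test on estimator residuals. A minimal sketch (the threshold composition and all numbers are illustrative assumptions, not the paper's design):

      import numpy as np

      # Flag a sensor when its innovation (measurement minus model
      # prediction) exceeds a threshold that budgets both noise and a
      # modeling-error bound.
      def detect_failures(innovations, noise_sigma, model_error_bound, k=3.0):
          """innovations: (n_samples, n_sensors) array; returns boolean flags."""
          threshold = k * noise_sigma + model_error_bound
          return np.abs(innovations) > threshold

      rng = np.random.default_rng(0)
      innov = rng.normal(0.0, 0.1, size=(100, 3))
      innov[50:, 2] += 1.5                # simulated bias failure on sensor 3
      flags = detect_failures(innov, noise_sigma=0.1, model_error_bound=0.2)
      print("sensor 3 flagged samples:", int(flags[:, 2].sum()))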

  5. Recent Advances in Algal Genetic Tool Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahlin, Lukas R.; Guarnieri, Michael T.

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  6. Recent Advances in Algal Genetic Tool Development

    DOE PAGES

    Dahlin, Lukas R.; Guarnieri, Michael T.

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  7. Assessing the Robustness of Complete Bacterial Genome Segmentations

    NASA Astrophysics Data System (ADS)

    Devillers, Hugo; Chiapello, Hélène; Schbath, Sophie; El Karoui, Meriem

    Comparison of closely related bacterial genomes has revealed the presence of highly conserved sequences forming a "backbone" that is interrupted by numerous, less conserved, DNA fragments. Segmentation of bacterial genomes into backbone and variable regions is particularly useful to investigate bacterial genome evolution. Several software tools have been designed to compare complete bacterial chromosomes and a few online databases store pre-computed genome comparisons. However, very few statistical methods are available to evaluate the reliability of these software tools and to compare the results obtained with them. To fill this gap, we have developed two local scores to measure the robustness of bacterial genome segmentations. Our method uses a simulation procedure based on random perturbations of the compared genomes. The scores presented in this paper are simple to implement, and our results show that they make it easy to discriminate between robust and non-robust bacterial genome segmentations when using aligners such as MAUVE and MGA.
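
    The scores follow a simulate-and-compare pattern: perturb the genomes, re-run the segmentation, and measure per-position agreement with the unperturbed result. A schematic sketch (the segment and perturb callables are placeholders for an aligner run and the paper's perturbation procedure):

      import numpy as np

      # Simulation-style robustness score: perturb the compared genomes,
      # re-run the segmentation, and score each position by how often its
      # backbone/variable label is preserved across simulations.
      def robustness_scores(genome, segment, perturb, n_sim=100):
          reference = np.asarray(segment(genome))
          agree = np.zeros(len(reference))
          for _ in range(n_sim):
              agree += np.asarray(segment(perturb(genome))) == reference
          return agree / n_sim     # per-position score in [0, 1]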

  8. Averaging business cycles vs. myopia: Do we need a long term vision when developing IRP?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, C.; Gupta, P.C.

    1995-05-01

    Utility demand forecasting is inherently imprecise due to the number of uncertainties resulting from business cycles, policy making, technology breakthroughs, national and international political upheavals, and the limitations of the forecasting tools. This implies that revisions based primarily on recent experience could lead to unstable forecasts. Moreover, new planning tools are required that provide an explicit consideration of uncertainty and lead to flexible and robust planning decisions.

  9. Hybrid approach for robust diagnostics of cutting tools

    NASA Astrophysics Data System (ADS)

    Ramamurthi, K.; Hough, C. L., Jr.

    1994-03-01

    A new multisensor-based hybrid technique has been developed for robust diagnosis of cutting tools. The technique combines the concepts of pattern classification and real-time knowledge-based systems (RTKBS) and draws upon their strengths: the learning facility of pattern classification and the higher level of reasoning of RTKBS. It eliminates some of their major drawbacks: false alarms or delayed/lack of diagnosis in the case of pattern classification, and tedious knowledge base generation in the case of RTKBS. It utilizes a dynamic distance classifier, built upon a new separability criterion and a new definition of robust diagnosis, to achieve these benefits. The promise of this technique has been proven concretely through on-line diagnosis of drill wear. Its suitability for practical implementation is substantiated by the use of practical, inexpensive, machine-mounted sensors and low-cost delivery systems.

  10. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, H; Liang, X; Kalbasi, A

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward calculated initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.
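
    The comparison boils down to extracting DVH indicators per structure and testing plans head-to-head. A minimal sketch (the D95 definition is standard; the per-patient voxel doses are simulated stand-ins, not study data):

      import numpy as np
      from scipy import stats

      # Dp: minimum dose received by the hottest p% of the structure volume.
      def d_percent(dose_voxels, p):
          return np.percentile(dose_voxels, 100 - p)

      rng = np.random.default_rng(1)
      d95_v, d95_cp = [], []
      for _ in range(10):                           # 10 patients (toy data)
          voxels = rng.normal(70.0, 2.0, size=4000)
          d95_v.append(d_percent(voxels, 95))
          d95_cp.append(d_percent(voxels + rng.normal(0, 0.2, 4000), 95))
      t, p_value = stats.ttest_rel(d95_v, d95_cp)   # paired, per patient
      print(f"paired t-test: p = {p_value:.2f}")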

  11. Robust tissue classification for reproducible wound assessment in telemedicine environments

    NASA Astrophysics Data System (ADS)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple free-handled digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under lighting, viewpoint, and camera changes, to achieve accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% versus 69.1% for a single expert, after mapping onto the medical reference developed from image labeling by a college of experts.
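
    The segmentation-driven classification stage can be pictured as a standard SVM pipeline over per-region color features; a sketch with scikit-learn, where the three-value feature vectors and tissue labels are invented placeholders for the paper's descriptors:

      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # Each segmented wound region is reduced to a feature vector and
      # labeled with a tissue class; features/labels below are toy values.
      X_train = [[0.62, 0.31, 0.11], [0.25, 0.55, 0.20], [0.70, 0.20, 0.10]]
      y_train = ["granulation", "slough", "granulation"]

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
      clf.fit(X_train, y_train)
      print(clf.predict([[0.60, 0.30, 0.10]]))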

  12. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    PubMed

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) supports different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its four associated principles. These principles include reusing existing terms and semantic relations from reliable ontologies, developing and applying well-established ontology design patterns (ODPs), and involving community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data being Findable, Accessible, Interoperable and Reusable (i.e., FAIR).

  13. TU-EF-304-03: 4D Monte Carlo Robustness Test for Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souris, K; Sterpin, E; Lee, J

    Purpose: Breathing motion and approximate dose calculation engines may increase proton range uncertainties. We address these two issues using a comprehensive 4D robustness evaluation tool based on an efficient Monte Carlo (MC) engine, which can simulate breathing with no significant increase in computation time. Methods: To assess the robustness of the treatment plan, multiple scenarios of uncertainties are simulated, taking into account the systematic and random setup errors, range uncertainties, and organ motion. Our fast MC dose engine, called MCsquare, implements optimized models on a massively-parallel computation architecture and allows us to accurately simulate a scenario in less than one minute. The deviations of the uncertainty scenarios are then reported on a DVH-band and compared to the nominal plan. The robustness evaluation tool is illustrated in a lung case by comparing three 60 Gy treatment plans. First, a plan is optimized on a PTV obtained by extending the CTV with an 8 mm margin, in order to take into account systematic geometrical uncertainties, as in our current practice in radiotherapy. No specific strategy is employed to correct for tumor and organ motions. The second plan involves a PTV generated from the ITV, which encompasses the tumor volume in all breathing phases. The last plan results from robust optimization performed on the ITV, with robustness parameters of 3% for tissue density and 8 mm for positioning errors. Results: The robustness test revealed that the first two plans could not properly cover the target in the presence of uncertainties. CTV-coverage (D95) in the three plans ranged respectively between 39.4-55.5 Gy, 50.2-57.5 Gy, and 55.1-58.6 Gy. Conclusion: A realistic robustness verification tool based on a fast MC dose engine has been developed. This test is essential to assess the quality of a proton therapy plan and very useful to study various planning strategies for mobile tumors. This work is partly funded by IBA (Louvain-la-Neuve, Belgium).
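
    The scenario machinery can be sketched independently of the dose engine: each scenario combines a systematic setup shift, random per-fraction shifts, and a range error, and the real tool recomputes dose per scenario with MC. Magnitudes below are illustrative assumptions:

      import numpy as np

      # Sample uncertainty scenarios for a robustness evaluation; the dose
      # recalculation per scenario (the expensive MC step) is omitted here.
      rng = np.random.default_rng(7)
      scenarios = [
          {
              "setup_mm": rng.normal(0.0, 2.0, size=3),   # systematic x/y/z
              "random_mm": rng.normal(0.0, 1.5, size=3),  # per fraction
              "range_pct": float(rng.choice([-3.0, 0.0, 3.0])),
          }
          for _ in range(50)
      ]
      print(scenarios[0])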

  14. Robust bidirectional links for photonic quantum networks

    PubMed Central

    Xu, Jin-Shi; Yung, Man-Hong; Xu, Xiao-Ye; Tang, Jian-Shun; Li, Chuan-Feng; Guo, Guang-Can

    2016-01-01

    Optical fibers are widely used as one of the main tools for transmitting not only classical but also quantum information. We propose and report an experimental realization of a promising method for creating robust bidirectional quantum communication links through paired optical polarization-maintaining fibers. Many limitations of existing protocols can be avoided with the proposed method. In particular, the path and polarization degrees of freedom are combined to deterministically create a photonic decoherence-free subspace without the need for any ancillary photon. This method is input state–independent, robust against dephasing noise, postselection-free, and applicable bidirectionally. To rigorously quantify the amount of quantum information transferred, the optical fibers are analyzed with the tools developed in quantum communication theory. These results not only suggest a practical means for protecting quantum information sent through optical quantum networks but also potentially provide a new physical platform for enriching the structure of the quantum communication theory. PMID:26824069

  15. SU-E-T-618: Plan Robustness Study of Volumetric-Modulated Arc Therapy Vs. Intensity-Modulated Radiation Therapy for Head and Neck Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, W; Patel, S; Shen, J

    Purpose: Lack of plan robustness may contribute to local failure in volumetric-modulated arc therapy (VMAT) to treat head and neck (H&N) cancer. Thus we compared plan robustness of VMAT with intensity-modulated radiation therapy (IMRT). Methods: VMAT and IMRT plans were created for 9 H&N cancer patients. For each plan, six new perturbed dose distributions were computed, one each for ±3 mm setup deviations along the S-I, A-P and L-R directions. We used three robustness quantification tools: (1) worst-case analysis (WCA); (2) dose-volume histogram (DVH) band (DVHB); and (3) root-mean-square-dose deviation (RMSD) volume histogram (DDVH). DDVH represents the relative volume (y) on the vertical axis and the RMSD (x) on the horizontal axis. Similar to a DVH, this means that y% of the volume of the indicated structure has an RMSD of at least x Gy[RBE]. The width from the first two methods at different target DVH indices (such as D95 and D5) and the area under the DDVH curves (AUC) for the target were used to indicate plan robustness. In these robustness quantification tools, the smaller the value, the more robust the plan is. Plan robustness evaluation metrics were compared using the Wilcoxon test. Results: DVHB showed the width at D95 from IMRT to be larger than from VMAT (unit Gy) [1.59 vs 1.18 (p=0.49)], while the width at D5 from IMRT was found to be slightly larger than from VMAT [0.59 vs 0.54 (p=0.84)]. WCA showed similar results [D95: 3.28 vs 3.00 (p=0.56); D5: 1.68 vs 1.95 (p=0.23)]. DDVH showed the AUC from IMRT to be slightly smaller than from VMAT [1.13 vs 1.15 (p=0.43)]. Conclusion: VMAT plan robustness is comparable to IMRT plan robustness. The plan robustness conclusions from WCA and DVHB are DVH parameter dependent. On the other hand, DDVH captures the overall effect of uncertainties on the dose to a volume of interest. NIH/NCI K25CA168984; Eagles Cancer Research Career Development; The Lawrence W. and Marilyn W. Matteson Fund for Cancer Research; Mayo ASU Seed Grant; The Kemper Marley Foundation.
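
    The DDVH itself is straightforward to compute once the perturbed dose grids exist: take the per-voxel RMSD over the scenarios, then the volume fraction at or above each RMSD level. A sketch with random stand-in dose grids (not study data):

      import numpy as np

      rng = np.random.default_rng(2)
      nominal = rng.uniform(60, 70, size=5000)                  # voxel doses, Gy
      perturbed = nominal + rng.normal(0, 1.0, size=(6, 5000))  # 6 setup shifts

      # Per-voxel RMSD over the six perturbed distributions.
      rmsd = np.sqrt(np.mean((perturbed - nominal) ** 2, axis=0))
      x = np.linspace(0, rmsd.max(), 200)
      ddvh = np.array([(rmsd >= xi).mean() for xi in x])        # volume fraction
      auc = np.sum(0.5 * (ddvh[1:] + ddvh[:-1]) * np.diff(x))   # robustness index
      print(f"area under the DDVH = {auc:.2f} Gy")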

  16. Autonomous Task Management and Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Burian, Barbara

    2017-01-01

    For some time, aircraft manufacturers and researchers have been pursuing mechanisms for reducing crew workload and providing better decision support to the pilots, especially during non-normal situations. Some previous attempts to develop task managers or pilot decision support tools have not resulted in robust and fully functional systems. However, the increasing sophistication of sensors and automated reasoners, and the exponential surge in the amount of digital data that is now available, create a ripe environment for the development of a robust, dynamic task manager and decision support tool that is context sensitive and integrates information from a wide array of on-board and off-aircraft sources: a tool that monitors systems and the overall flight situation, anticipates information needs, prioritizes tasks appropriately, keeps pilots well informed, and is nimble and able to adapt to changing circumstances. This presentation will discuss the many significant challenges and issues associated with the development and functionality of such a system for use on the aircraft flight deck.

  17. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  18. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
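
    The LFQbench idea, in miniature: each species in the hybrid sample has a known expected ratio between conditions, so accuracy is the offset of observed log2 ratios from that value and precision their spread. A Python sketch with simulated ratios (LFQbench itself is an R package; the expected ratios below are invented):

      import numpy as np

      # Per-species accuracy/precision of observed log2(A/B) protein ratios
      # against the known spike-in design of a hybrid proteome sample.
      expected_log2 = {"human": 0.0, "yeast": 1.0, "ecoli": -2.0}
      rng = np.random.default_rng(3)
      for species, mu in expected_log2.items():
          observed = rng.normal(mu, 0.3, size=500)    # stand-in ratios
          accuracy = np.median(observed) - mu
          precision = np.std(observed)
          print(f"{species}: accuracy {accuracy:+.2f}, precision {precision:.2f}")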

  19. Marine and Hydrokinetic Research | Water Power | NREL

    Science.gov Websites

    Resource Characterization and Maps: NREL develops measurement systems, simulation tools, and web-based models and tools to evaluate the economic potential of power-generating devices for all technology ... Acceleration: NREL analysts study the potential impacts that developing a robust MHK market could have on ...

  20. Lessons Learned from LIBS Calibration Development

    NASA Astrophysics Data System (ADS)

    Dyar, M. D.; Breves, E. A.; Lepore, K. H.; Boucher, T. F.; Giguere, S.

    2016-10-01

    More than two decades of work have been dedicated to development of robust standards, data processing, and calibration tools for LIBS. Here we summarize major considerations for improving accuracy of LIBS chemical analyses.

  21. A simulation-optimization-based decision support tool for mitigating traffic congestion.

    DOT National Transportation Integrated Search

    2009-12-01

    "Traffic congestion has grown considerably in the United States over the past twenty years. In this paper, we develop : a robust decision support tool based on simulation optimization to evaluate and recommend congestion-mitigation : strategies to tr...

  22. Design optimization for cost and quality: The robust design approach

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1990-01-01

    Designing reliable, low-cost, and operable space systems has become the key to future space operations. Designing high quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach for design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to the variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The robust design methodology uses a mathematical tool called an orthogonal array, from design of experiments theory, to study a large number of decision variables with a significantly smaller number of experiments. Robust design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of the space system design process.
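
    The signal-to-noise ratio at the heart of the method is easy to compute per orthogonal-array run; a sketch for the Taguchi "larger-is-better" case (the response values are illustrative, not from the report):

      import numpy as np

      # Taguchi S/N (dB) for a larger-is-better response, evaluated from
      # the replicated measurements of one orthogonal-array run.
      def sn_larger_is_better(y):
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(1.0 / y**2))

      runs = {"run1": [9.8, 10.2, 9.9], "run2": [12.1, 8.0, 10.5]}
      for name, y in runs.items():
          print(name, f"S/N = {sn_larger_is_better(y):.2f} dB")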

  23. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool allows structural analysts to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. The tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem chosen for the evaluation is a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly identifying the design input variables whose variability has the most influence on response output parameters.
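
    The Monte Carlo cause-and-effect study can be sketched without finite elements: sample design inputs from tolerance distributions, evaluate a response, and rank inputs by correlation with the output. A toy stand-in for the tool's workflow (the closed-form "response" replaces a finite element run):

      import numpy as np

      rng = np.random.default_rng(4)
      thickness = rng.normal(2.0, 0.05, size=2000)   # mm, tolerance scatter
      load = rng.normal(1.0, 0.10, size=2000)        # normalized load scatter
      stress = load / thickness**2                   # stand-in response model

      # Rank the inputs by how strongly their scatter drives the output.
      for name, x in [("thickness", thickness), ("load", load)]:
          r = np.corrcoef(x, stress)[0, 1]
          print(f"{name}: correlation with stress = {r:+.2f}")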

  24. Strict Constraint Feasibility in Analysis and Design of Uncertain Systems

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2006-01-01

    This paper proposes a methodology for the analysis and design optimization of models subject to parametric uncertainty, where hard inequality constraints are present. Hard constraints are those that must be satisfied for all parameter realizations prescribed by the uncertainty model. Emphasis is given to uncertainty models prescribed by norm-bounded perturbations from a nominal parameter value, i.e., hyper-spheres, and by sets of independently bounded uncertain variables, i.e., hyper-rectangles. These models make it possible to consider sets of parameters having comparable as well as dissimilar levels of uncertainty. Two alternative formulations for hyper-rectangular sets are proposed, one based on a transformation of variables and another based on an infinity-norm approach. The suite of tools developed enables us to determine whether the satisfaction of hard constraints is feasible by identifying critical combinations of uncertain parameters. Since this practice is performed without sampling or partitioning the parameter space, the resulting assessments of robustness are analytically verifiable. Strategies that enable the comparison of the robustness of competing design alternatives, the approximation of the robust design space, and the systematic search for designs with improved robustness characteristics are also proposed. Since the problem formulation is generic and the solution methods only require standard optimization algorithms for their implementation, the tools developed are applicable to a broad range of problems in several disciplines.
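
    For intuition, one concrete (and much simpler) feasibility check over a hyper-rectangle: when the constraint is monotone in each parameter, the worst case sits at a vertex of the box, so enumerating vertices is exact. This vertex enumeration is a swapped-in illustration, not the paper's sampling-free method; the constraint below is made up:

      import itertools
      import numpy as np

      # Worst case of g(p) <= 0 over the box [lower, upper], exact when g
      # is monotone (e.g., linear) in each parameter.
      def worst_case_over_box(g, lower, upper):
          vertices = itertools.product(*zip(lower, upper))
          return max(g(np.array(v)) for v in vertices)

      g = lambda p: p[0] + 2.0 * p[1] - 3.0          # example hard constraint
      wc = worst_case_over_box(g, lower=[0.9, 0.4], upper=[1.1, 0.6])
      print("hard constraint satisfied:", wc <= 0.0)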

  25. Many-objective robust decision making for water allocation under climate change.

    PubMed

    Yan, Dan; Ludwig, Fulco; Huang, He Qing; Werners, Saskia E

    2017-12-31

    Water allocation is facing profound challenges due to climate change uncertainties. To identify adaptive water allocation strategies that are robust to climate change uncertainties, a model framework combining many-objective robust decision making and biophysical modeling is developed for large rivers. The framework was applied to the Pearl River basin (PRB), China, where sufficient flow to the delta is required to reduce saltwater intrusion in the dry season. Before identifying and assessing robust water allocation plans for the future, the performance of ten state-of-the-art MOEAs (multi-objective evolutionary algorithms) is evaluated for the water allocation problem in the PRB. The Borg multi-objective evolutionary algorithm (Borg MOEA), which is a self-adaptive optimization algorithm, has the best performance during the historical periods. Therefore it is selected to generate new water allocation plans for the future (2079-2099). This study shows that robust decision making using carefully selected MOEAs can help limit saltwater intrusion in the Pearl River Delta. However, the framework could perform poorly if climate change impacts on water availability are larger than expected. Results also show that subjective design choices by the researchers and/or water managers could limit the ability of the model framework and cause even the most robust water allocation plans to fail under future climate change. Developing robust allocation plans in a river basin suffering from increasing water shortage requires researchers and water managers to characterize well both the future climate change of the study region and the vulnerabilities of their tools. Copyright © 2017 Elsevier B.V. All rights reserved.

  26. Effective Visual Tracking Using Multi-Block and Scale Space Based on Kernelized Correlation Filters

    PubMed Central

    Jeong, Soowoong; Kim, Guisik; Lee, Sangkeun

    2017-01-01

    Accurate scale estimation and occlusion handling are challenging problems in visual tracking. Recently, correlation filter-based trackers have shown impressive results in terms of accuracy, robustness, and speed. However, the model is not robust to scale variation and occlusion. In this paper, we address the problems associated with scale variation and occlusion by employing a scale space filter and multi-block scheme based on a kernelized correlation filter (KCF) tracker. Furthermore, we develop a more robust algorithm using an appearance update model that approximates the change of state of occlusion and deformation. In particular, an adaptive update scheme is presented to make each process robust. The experimental results demonstrate that the proposed method outperformed 29 state-of-the-art trackers on 100 challenging sequences. Specifically, the results obtained with the proposed scheme were improved by 8% and 18% compared to those of the KCF tracker for 49 occlusion and 64 scale variation sequences, respectively. Therefore, the proposed tracker can be a robust and useful tool for object tracking when occlusion and scale variation are involved. PMID:28241475

  27. Effective Visual Tracking Using Multi-Block and Scale Space Based on Kernelized Correlation Filters.

    PubMed

    Jeong, Soowoong; Kim, Guisik; Lee, Sangkeun

    2017-02-23

    Accurate scale estimation and occlusion handling are challenging problems in visual tracking. Recently, correlation filter-based trackers have shown impressive results in terms of accuracy, robustness, and speed. However, the model is not robust to scale variation and occlusion. In this paper, we address the problems associated with scale variation and occlusion by employing a scale space filter and multi-block scheme based on a kernelized correlation filter (KCF) tracker. Furthermore, we develop a more robust algorithm using an appearance update model that approximates the change of state of occlusion and deformation. In particular, an adaptive update scheme is presented to make each process robust. The experimental results demonstrate that the proposed method outperformed 29 state-of-the-art trackers on 100 challenging sequences. Specifically, the results obtained with the proposed scheme were improved by 8% and 18% compared to those of the KCF tracker for 49 occlusion and 64 scale variation sequences, respectively. Therefore, the proposed tracker can be a robust and useful tool for object tracking when occlusion and scale variation are involved.
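
    The correlation-filter core of such trackers fits in a few lines. A one-dimensional MOSSE-style sketch, a simpler linear relative of KCF with synthetic signals, not the paper's multi-block algorithm:

      import numpy as np

      # Learn a filter whose response to the template is a sharp Gaussian
      # peak at the origin, then read target motion off the response peak.
      rng = np.random.default_rng(8)
      n = 128
      template = rng.normal(size=n)
      goal = np.roll(np.exp(-0.5 * ((np.arange(n) - n // 2) / 2.0) ** 2), -n // 2)

      F, G, lam = np.fft.fft(template), np.fft.fft(goal), 1e-2
      H_star = np.conj(F) * G / (F * np.conj(F) + lam)   # regularized filter

      shifted = np.roll(template, 10)                    # target moved 10 samples
      response = np.real(np.fft.ifft(np.fft.fft(shifted) * H_star))
      print("estimated shift:", int(np.argmax(response)))   # -> 10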

  28. Fully automated analysis of multi-resolution four-channel micro-array genotyping data

    NASA Astrophysics Data System (ADS)

    Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.

    2006-03-01

    We present a fully automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron resolution, with sizes up to 80 MB per channel). The system is developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining genotypes of multiple genetic markers in individuals. It plays an important role in the trend toward replacing traditional medical treatments with personalized genetic medicine, i.e., individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, which requires a turn-around timeline compatible with clinical decision-making. In this paper we have developed a fully automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.

  29. Visualization of the Invisible, Explanation of the Unknown, Ruggedization of the Unstable: Sensitivity Analysis, Virtual Tryout and Robust Design through Systematic Stochastic Simulation

    NASA Astrophysics Data System (ADS)

    Zwickl, Titus; Carleer, Bart; Kubli, Waldemar

    2005-08-01

    In the past decade, sheet metal forming simulation became a well-established tool to predict the formability of parts. In the automotive industry, this has enabled significant reduction in the cost and time for vehicle design and development, and has helped to improve the quality and performance of vehicle parts. However, production stoppages for troubleshooting and unplanned die maintenance, as well as production quality fluctuations, continue to plague manufacturing cost and time. The focus therefore has shifted in recent times beyond mere feasibility to robustness of the product and process being engineered. Ensuring robustness is the next big challenge for the virtual tryout / simulation technology. We introduce new methods, based on systematic stochastic simulations, to visualize the behavior of the part during the whole forming process, in simulation as well as in production. Sensitivity analysis explains the response of the part to changes in influencing parameters. Virtual tryout allows quick exploration of changed designs and conditions. Robust design and manufacturing guarantees quality and process capability for the production process. While conventional simulations helped to reduce development time and cost by ensuring feasible processes, robustness engineering tools have the potential for far greater cost and time savings. Through examples we illustrate how expected and unexpected behavior of deep drawing parts may be tracked down, identified, and assigned to the influential parameters. With this knowledge, defects can be eliminated or springback can be compensated, for example; the response of the part to uncontrollable noise can be predicted and minimized. The newly introduced methods enable more reliable and predictable stamping processes in general.

  30. Macromedia Flash as a Tool for Mathematics Teaching and Learning

    ERIC Educational Resources Information Center

    Garofalo, Joe; Summers, Tim

    2004-01-01

    Macromedia Flash is a powerful and robust development tool. Because of its graphical, sound, and animation capabilities (and ubiquitous browser plug-in), major companies employ it in their website development (see www.nike.com or www.espn.com). These same features also make Flash a valuable environment for building multi-representational "movies"…

  31. Design of Robust Adaptive Unbalance Response Controllers for Rotors with Magnetic Bearings

    NASA Technical Reports Server (NTRS)

    Knospe, Carl R.; Tamer, Samir M.; Fedigan, Stephen J.

    1996-01-01

    Experimental results have recently demonstrated that an adaptive open-loop control strategy can be highly effective in the suppression of unbalance-induced vibration on rotors supported in active magnetic bearings. This algorithm, however, relies upon a predetermined gain matrix. Typically, this matrix is determined by an optimal control formulation resulting in the choice of the pseudo-inverse of the nominal influence coefficient matrix as the gain matrix. This solution may result in problems with stability and performance robustness, since the estimated influence coefficient matrix is not equal to the actual influence coefficient matrix. Recently, analysis tools have been developed to examine the robustness of this control algorithm with respect to structured uncertainty. Herein, these tools are extended to produce a design procedure for determining the adaptive law's gain matrix. The resulting control algorithm has a guaranteed convergence rate and steady-state performance in spite of the uncertainty in the rotor system. Several examples are presented which demonstrate the effectiveness of this approach and its advantages over the standard optimal control formulation.
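
    The baseline adaptive law the paper builds on can be sketched directly: update the correction signal with a gain matrix, classically the pseudo-inverse of the estimated influence coefficient matrix. Matrices, disturbance, and step size below are illustrative, not from the paper:

      import numpy as np

      # Influence coefficient matrix T maps correction inputs to measured
      # synchronous vibration phasors (complex amplitudes).
      T_est = np.array([[1.0 + 0.2j, 0.1j], [0.05, 0.8 + 0.1j]])
      gain = np.linalg.pinv(T_est)          # classical gain choice

      u = np.zeros(2, dtype=complex)        # current correction signal
      for _ in range(10):
          vib = np.array([0.5 + 0.1j, -0.2j]) + T_est @ u   # measured response
          u = u - 0.5 * gain @ vib          # step size < 1 aids robustness
      print("residual vibration magnitude:", np.abs(vib).round(4))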

  32. Balancing entrepreneurship and business practices for e-collaboration: responsible information sharing in academic research.

    PubMed

    Porter, Mark William; Milley, David; Oliveti, Kristyn; Ladd, Allen; O'Hara, Ryan J; Desai, Bimal R; White, Peter S

    2008-11-06

    Flexible, highly accessible collaboration tools can inherently conflict with controls placed on information sharing by offices charged with privacy protection, compliance, and maintenance of the general business environment. Our implementation of a commercial enterprise wiki within the academic research environment addresses concerns of all involved through the development of a robust user training program, a suite of software customizations that enhance security elements, a robust auditing program, allowance for inter-institutional wiki collaboration, and wiki-specific governance.

  33. Computational methods of robust controller design for aerodynamic flutter suppression

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1981-01-01

    The development of Riccati iteration, a tool for the design and analysis of linear control systems, is examined. First, Riccati iteration is applied to the problem of pole placement and order reduction in two-time-scale control systems. Order reduction, yielding a good approximation to the original system, is demonstrated using a 16th-order linear model of a turbofan engine. Next, a numerical method for solving the Riccati equation is presented and demonstrated for a set of eighth-order random examples. A literature review of robust controller design methods follows, which includes a number of methods for reducing the trajectory and performance index sensitivity in linear regulators. Lastly, robust controller design for large parameter variations is discussed.
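
    For reference, the algebraic Riccati equation at the core of such tools can nowadays be solved directly; a sketch using SciPy's solver rather than the paper's iteration (the system matrices are toy values):

      import numpy as np
      from scipy.linalg import solve_continuous_are

      # Solve A'P + PA - PBR^-1B'P + Q = 0, then form the LQR gain.
      A = np.array([[0.0, 1.0], [-2.0, -0.5]])
      B = np.array([[0.0], [1.0]])
      Q = np.eye(2)
      R = np.array([[1.0]])

      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.solve(R, B.T @ P)       # optimal feedback u = -K x
      print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))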

  34. A framework for developing safe and effective large-fire response in a new fire management paradigm

    Treesearch

    Christopher J. Dunn; Matthew P. Thompson; David E. Calkin

    2017-01-01

    The impacts of wildfires have increased in recent decades because of historical forest and fire management, a rapidly changing climate, and an increasingly populated wildland urban interface. This increasingly complex fire environment highlights the importance of developing robust tools to support risk-informed decision making. While tools have been developed to aid...

  35. Overview of computational control research at UT Austin

    NASA Technical Reports Server (NTRS)

    Wie, Bong

    1989-01-01

    An overview of current research activities at UT Austin is presented to discuss certain technical issues in the following areas: (1) Computer-Aided Nonlinear Control Design: In this project, the describing function method is employed for the nonlinear control analysis and design of a flexible spacecraft equipped with pulse-modulated reaction jets. The INCA program has been enhanced to allow the numerical calculation of describing functions as well as nonlinear limit cycle analysis in the frequency domain; (2) Robust Linear Quadratic Gaussian (LQG) Compensator Synthesis: Robust control design techniques and software tools are developed for flexible space structures with parameter uncertainty. In particular, an interactive, robust multivariable control design capability is being developed for the INCA program; and (3) LQR-Based Autonomous Control System for the Space Station: In this project, real-time implementation of an LQR-based autonomous control system is investigated for the space station with time-varying inertias and significant multibody dynamic interactions.

  36. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics: (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales, the VARS approach), (2) variance-based total-order effects (the Sobol approach), and (3) derivative-based elementary effects (the Morris approach). VARS-TOOL is also equipped with two novel features: the first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties; the second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
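
    The directional variogram underlying the IVARS metrics is simple to estimate by shifting one factor at a time; a toy sketch (the model and sampling design are stand-ins, not VARS-TOOL's implementation):

      import numpy as np

      # gamma(h) = 0.5 * E[(y(x + h e1) - y(x))^2] along factor x1.
      def model(x):                        # toy response on [0, 1]^2
          return np.sin(6 * x[:, 0]) + 0.3 * x[:, 1]

      rng = np.random.default_rng(5)
      for h in (0.05, 0.1, 0.3):
          x = rng.uniform(0, 1, size=(1000, 2))
          x[:, 0] *= 1 - h                 # keep x1 + h inside the unit interval
          shifted = x.copy()
          shifted[:, 0] += h
          gamma = 0.5 * np.mean((model(shifted) - model(x)) ** 2)
          print(f"gamma_x1(h={h}) = {gamma:.3f}")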

  37. Computational fluid dynamics applications to improve crop production systems

    USDA-ARS?s Scientific Manuscript database

    Computational fluid dynamics (CFD), the numerical analysis and simulation of fluid flow processes, has emerged from the development stage and is nowadays a robust design tool. It is widely used to study various transport phenomena involving fluid flow, heat, and mass transfer, providing det...

  38. Turning science on robust cattle into improved genetic selection decisions.

    PubMed

    Amer, P R

    2012-04-01

    More robust cattle have the potential to increase farm profitability, improve animal welfare, reduce the contribution of ruminant livestock to greenhouse gas emissions and decrease the risk of food shortages in the face of increased variability in the farm environment. Breeding is a powerful tool for changing the robustness of cattle; however, insufficient recording of breeding goal traits and selection of animals at younger ages tend to favour genetic change in productivity traits relative to robustness traits. This paper has extended a previously proposed theory of artificial evolution to demonstrate, using deterministic simulation, how choice of breeding scheme design can be used as a tool to manipulate the direction of genetic progress, whereas the breeding goal remains focussed on the factors motivating individual farm decision makers. Particular focus was placed on the transition from progeny testing or mass selection to genomic selection breeding strategies. Transition to genomic selection from a breeding strategy where candidates are selected before records from progeny being available was shown to be highly likely to favour genetic progress in robustness traits relative to productivity traits. This was shown even with modest numbers of animals available for training and when heritability for robustness traits was only slightly lower than that for productivity traits. When transitioning from progeny testing to a genomic selection strategy without progeny testing, it was shown that there is a significant risk that robustness traits could become less influential in selection relative to productivity traits. Augmentations of training populations using genotyped cows and support for industry-wide improvements in phenotypic recording of robustness traits were put forward as investment opportunities for stakeholders wishing to facilitate the application of science on robust cattle into improved genetic selection schemes.

  39. Linear, multivariable robust control with a mu perspective

    NASA Technical Reports Server (NTRS)

    Packard, Andy; Doyle, John; Balas, Gary

    1993-01-01

    The structured singular value is a linear algebra tool developed to study a particular class of matrix perturbation problems arising in robust feedback control of multivariable systems. These perturbations are called linear fractional, and are a natural way to model many types of uncertainty in linear systems, including state-space parameter uncertainty, multiplicative and additive unmodeled dynamics uncertainty, and coprime factor and gap metric uncertainty. The structured singular value theory provides a natural extension of classical SISO robustness measures and concepts to MIMO systems. The structured singular value analysis, coupled with approximate synthesis methods, make it possible to study the tradeoff between performance and uncertainty that occurs in all feedback systems. In MIMO systems, the complexity of the spatial interactions in the loop gains make it difficult to heuristically quantify the tradeoffs that must occur. This paper examines the role played by the structured singular value (and its computable bounds) in answering these questions, as well as its role in the general robust, multivariable control analysis and design problem.

  40. Robust multi-site MR data processing: iterative optimization of bias correction, tissue classification, and registration.

    PubMed

    Kim, Eun Young; Johnson, Hans J

    2013-01-01

    A robust multi-modal tool for automated registration, bias correction, and tissue classification has been implemented for large-scale heterogeneous multi-site longitudinal MR data analysis. This work focused on improving an iterative optimization framework between bias correction, registration, and tissue classification, inspired by previous work. The primary contributions are robustness improvements from the incorporation of the following four elements: (1) use of multi-modal and repeated scans; (2) incorporation of highly deformable registration; (3) use of an extended set of tissue definitions; and (4) use of multi-modal-aware intensity-context priors. The benefits of these enhancements were investigated in a series of experiments with both a simulated brain data set (BrainWeb) and highly heterogeneous data from a 32-site imaging study, with quality assessment through expert visual inspection. The implementation of this tool is tailored for, but not limited to, large-scale data processing with great data variation, with a flexible interface. In this paper, we describe enhancements to a joint registration, bias correction, and tissue classification framework that improve the generalizability and robustness of processing multi-modal longitudinal MR scans collected at multiple sites. The tool was evaluated using both simulated and human-subject MRI images. With these enhancements, the results showed improved robustness for large-scale heterogeneous MRI processing.
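
    The bias-correction / classification alternation can be skeletonized with SimpleITK's N4 filter. A plausible sketch only: the file names are placeholders, registration is omitted, and Otsu thresholding stands in for the paper's tissue classification:

      import SimpleITK as sitk

      image = sitk.ReadImage("t1.nii.gz", sitk.sitkFloat32)
      mask = sitk.OtsuThreshold(image, 0, 1)       # crude initial head mask

      for _ in range(3):                           # a few outer iterations
          corrected = sitk.N4BiasFieldCorrection(image, mask)
          # Re-derive the foreground/tissue estimate from the corrected
          # image, then feed it back in as the next round's mask.
          mask = sitk.OtsuThreshold(corrected, 0, 1)
          image = corrected
      sitk.WriteImage(image, "t1_corrected.nii.gz")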

  41. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  42. A novel modification of the Turing test for artificial intelligence and robotics in healthcare.

    PubMed

    Ashrafian, Hutan; Darzi, Ara; Athanasiou, Thanos

    2015-03-01

    The increasing demands of delivering higher quality global healthcare have resulted in a corresponding expansion in the development of computer-based and robotic healthcare tools that rely on artificially intelligent technologies. The Turing test was designed to assess artificial intelligence (AI) in computer technology. It remains an important qualitative tool for testing the next generation of medical diagnostics and medical robotics. Methods: Development of quantifiable diagnostic-accuracy meta-analytical evaluative techniques for the Turing test paradigm. Results: Modification of the Turing test to offer quantifiable diagnostic precision and statistical effect-size robustness in the assessment of AI for computer-based and robotic healthcare technologies. Conclusions: Modification of the Turing test to offer robust diagnostic scores for AI can contribute to enhancing and refining the next generation of digital diagnostic technologies and healthcare robotics. Copyright © 2014 John Wiley & Sons, Ltd.

  43. Microscopy image segmentation tool: Robust image data analysis

    NASA Astrophysics Data System (ADS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  4. Modeling PPP Economic Benefits for Lunar ISRU

    NASA Astrophysics Data System (ADS)

    Blair, B.

    2017-10-01

    A new tool is needed for selecting the public-private partnership (PPP) strategy that could maximize the rate of lunar commercialization by attracting private capital into the development of critical infrastructure and robust capability. A PPP model under development for NASA-ESO will be described.

  5. Robustness Analysis of Integrated LPV-FDI Filters and LTI-FTC System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Khong, Thuan H.; Shin, Jong-Yeob

    2007-01-01

    This paper proposes a framework for robustness analysis of a nonlinear dynamic system that can be represented by a polynomial linear parameter-varying (PLPV) system with constant bounded uncertainty. The proposed analysis framework contains three key tools: 1) a function substitution method, which can convert a nonlinear system in polynomial form into a PLPV system; 2) a matrix-based linear fractional transformation (LFT) modeling approach, which can convert a PLPV system into an LFT system whose delta block includes the key uncertainty and scheduling parameters; and 3) mu-analysis, a well-known robustness analysis tool for linear systems. The proposed analysis framework is applied to evaluating the performance of the LPV fault detection and isolation (FDI) filters of the closed-loop system of a transport aircraft in the presence of unmodeled actuator dynamics and sensor gain uncertainty. The robustness analysis results are compared with nonlinear time simulations.
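
    For readers unfamiliar with the LFT machinery that such frameworks rely on, the sketch below numerically closes an uncertainty block around a partitioned system matrix; the matrix sizes and the sampled uncertainty are illustrative assumptions, not the paper's aircraft model:

```python
import numpy as np

def upper_lft(M, n_delta, Delta):
    """Upper LFT: F_u(M, Delta) = M22 + M21 Delta (I - M11 Delta)^-1 M12,
    where M11 is the n_delta x n_delta block that Delta 'sees'."""
    M11, M12 = M[:n_delta, :n_delta], M[:n_delta, n_delta:]
    M21, M22 = M[n_delta:, :n_delta], M[n_delta:, n_delta:]
    I = np.eye(n_delta)
    return M22 + M21 @ Delta @ np.linalg.solve(I - M11 @ Delta, M12)

# sample a structured, norm-bounded uncertainty and check the closed-loop gain
rng = np.random.default_rng(1)
M = 0.3 * rng.standard_normal((4, 4))
Delta = np.diag(rng.uniform(-1, 1, 2))          # diagonal block, |delta_i| <= 1
print(np.linalg.norm(upper_lft(M, 2, Delta), 2))
```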

  6. Tool compounds robustly increase turnover of an artificial substrate by glucocerebrosidase in human brain lysates.

    PubMed

    Berger, Zdenek; Perkins, Sarah; Ambroise, Claude; Oborski, Christine; Calabrese, Matthew; Noell, Stephen; Riddell, David; Hirst, Warren D

    2015-01-01

    Mutations in glucocerebrosidase (GBA1) cause Gaucher disease and also represent a common risk factor for Parkinson's disease and Dementia with Lewy bodies. Recently, new tool molecules were described which can increase turnover of an artificial substrate 4MUG when incubated with mutant N370S GBA1 from human spleen. Here we show that these compounds exert a similar effect on the wild-type enzyme in a cell-free system. In addition, these tool compounds robustly increase turnover of 4MUG by GBA1 derived from human cortex, despite substantially lower glycosylation of GBA1 in human brain, suggesting that the degree of glycosylation is not important for compound binding. Surprisingly, these tool compounds failed to robustly alter GBA1 turnover of 4MUG in the mouse brain homogenate. Our data raise the possibility that in vivo models with humanized glucocerebrosidase may be needed for efficacy assessments of such small molecules.

  7. Exploring critical pathways for urban water management to identify robust strategies under deep uncertainties.

    PubMed

    Urich, Christian; Rauch, Wolfgang

    2014-12-01

    Long-term projections for key drivers needed in urban water infrastructure planning, such as climate change, population growth, and socio-economic changes, are deeply uncertain. Traditional planning approaches rely heavily on these projections; if a projection stays unfulfilled, this can lead to problematic infrastructure decisions causing high operational costs and/or lock-in effects. New approaches based on exploratory modelling take a fundamentally different view. Their aim is to identify an adaptation strategy that performs well under many future scenarios, instead of optimising a strategy for a handful. However, a modelling tool to support strategic planning by testing the implications of adaptation strategies under deeply uncertain conditions for urban water management does not yet exist. This paper presents a first step towards a new generation of such strategic planning tools, by combining innovative modelling tools, which coevolve the urban environment and urban water infrastructure under many different future scenarios, with robust decision making. The developed approach is applied to the city of Innsbruck, Austria, which is evolved spatially explicitly 20 years into the future under 1000 scenarios to test the robustness of different adaptation strategies. Key findings of this paper show that: (1) such an approach can successfully identify parameter ranges of key drivers in which a desired performance criterion is not fulfilled, an important indicator of the robustness of an adaptation strategy; and (2) analysis of the rich dataset gives new insights into the adaptive responses of agents to key drivers in the urban system by modifying a strategy. Copyright © 2014 Elsevier Ltd. All rights reserved.
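
    The core exploratory-modelling loop is easy to sketch; the drivers, the toy demand model, and the candidate capacities below are hypothetical placeholders, not values from the Innsbruck case study:

```python
import numpy as np

rng = np.random.default_rng(42)
n_scenarios = 1000
# deeply uncertain drivers: annual population growth rate, rainfall factor
futures = rng.uniform([0.0, 0.7], [0.03, 1.3], size=(n_scenarios, 2))

def criterion_met(capacity, future):
    growth, rain = future
    demand = 100 * (1 + growth) ** 20 * rain    # toy 20-year demand model
    return capacity >= demand

for capacity in (120, 150, 180):                # candidate strategies
    robustness = np.mean([criterion_met(capacity, f) for f in futures])
    print(f"capacity {capacity}: target met in {robustness:.0%} of futures")
```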

  8. Clinical Outcome Metrics for Optimization of Robust Training

    NASA Technical Reports Server (NTRS)

    Ebert, Doug; Byrne, Vicky; Cole, Richard; Dulchavsky, Scott; Foy, Millennia; Garcia, Kathleen; Gibson, Robert; Ham, David; Hurst, Victor; Kerstman, Eric; hide

    2015-01-01

    The objective of this research is to develop and use clinical outcome metrics and training tools to quantify the differences in performance of a physician vs non-physician crew medical officer (CMO) analogues during simulations.

  9. Robust Implementation of MDC: Teacher Perceptions of Tool Use and Outcomes. Brief Three

    ERIC Educational Resources Information Center

    Lawrence, Nancy; Sanders, Felicia

    2012-01-01

    The Bill and Melinda Gates Foundation has invested in the development and dissemination of high-quality instructional and formative assessment tools to support teachers' incorporation of the Common Core State Standards (CCSS) into their classroom instruction. Lessons from the first generation of standards-based reforms suggest that intense…

  10. Robust Implementation of LDC: Teacher Perceptions of Tool Use and Outcomes. Brief Two

    ERIC Educational Resources Information Center

    Reumann-Moore, Rebecca; Sanders, Felicia

    2012-01-01

    The Bill and Melinda Gates Foundation has invested in the development and dissemination of high-quality instructional and formative assessment tools to support teachers' incorporation of the Common Core State Standards (CCSS) into their classroom instruction. Lessons from the first generation of standards-based reforms suggest that intense…

  11. Beyond singular values and loop shapes

    NASA Technical Reports Server (NTRS)

    Stein, G.

    1985-01-01

    The status of singular value loop-shaping as a design paradigm for multivariable feedback systems is reviewed. It is shown that this paradigm is an effective design tool whenever the problem specifications are spatially round. The tool can be arbitrarily conservative, however, when they are not. This happens because singular value conditions for robust performance are not tight (necessary and sufficient) and can severely overstate actual requirements. An alternative paradigm that overcomes these limitations is discussed. The alternative includes a more general problem formulation, a new matrix function mu, and tight conditions for both robust stability and robust performance. The state of the art currently permits analysis of feedback systems within this new paradigm. Synthesis remains a subject of research.
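
    A small numerical illustration of the singular-value view (the 2x2 loop below is a toy, not from the paper): the spread between the largest and smallest singular values across frequency shows how far from "spatially round" a loop is.

```python
import numpy as np

def loop_svd(G, omegas):
    """Print max/min singular values of the loop transfer matrix G(j*omega)."""
    for w in omegas:
        sv = np.linalg.svd(G(1j * w), compute_uv=False)
        print(f"omega={w:6.2f}  sigma_max={sv[0]:.3f}  sigma_min={sv[-1]:.3f}")

# toy 2x2 loop: an integrator channel weakly coupled to a first-order lag
G = lambda s: np.array([[1 / s, 0.2], [0.0, 1 / (s + 1)]])
loop_svd(G, np.logspace(-1, 1, 5))
```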

  12. A protocatechuate biosensor for Pseudomonas putida KT2440 via promoter and protein evolution.

    PubMed

    Jha, Ramesh K; Bingen, Jeremy M; Johnson, Christopher W; Kern, Theresa L; Khanna, Payal; Trettel, Daniel S; Strauss, Charlie E M; Beckham, Gregg T; Dale, Taraka

    2018-06-01

    Robust fluorescence-based biosensors are emerging as critical tools for high-throughput strain improvement in synthetic biology. Many biosensors are developed in model organisms where sophisticated synthetic biology tools are also well established. However, industrial biochemical production often employs microbes with phenotypes that are advantageous for a target process, and biosensors may fail to directly transition outside the host in which they are developed. In particular, losses in sensitivity and dynamic range of sensing often occur, limiting the application of a biosensor across hosts. Here we demonstrate the optimization of an Escherichia coli-based biosensor in a robust microbial strain for the catabolism of aromatic compounds, Pseudomonas putida KT2440, through a generalizable approach of modulating interactions at the protein-DNA interface in the promoter and the protein-protein dimer interface. The high-throughput biosensor optimization approach demonstrated here is readily applicable towards other allosteric regulators.

  13. A protocatechuate biosensor for Pseudomonas putida KT2440 via promoter and protein evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jha, Ramesh K.; Bingen, Jeremy M.; Johnson, Christopher W.

    Robust fluorescence-based biosensors are emerging as critical tools for high-throughput strain improvement in synthetic biology. Many biosensors are developed in model organisms where sophisticated synthetic biology tools are also well established. However, industrial biochemical production often employs microbes with phenotypes that are advantageous for a target process, and biosensors may fail to directly transition outside the host in which they are developed. In particular, losses in sensitivity and dynamic range of sensing often occur, limiting the application of a biosensor across hosts. In this study, we demonstrate the optimization of an Escherichia coli-based biosensor in a robust microbial strain for the catabolism of aromatic compounds, Pseudomonas putida KT2440, through a generalizable approach of modulating interactions at the protein-DNA interface in the promoter and the protein-protein dimer interface. The high-throughput biosensor optimization approach demonstrated here is readily applicable towards other allosteric regulators.

  14. A protocatechuate biosensor for Pseudomonas putida KT2440 via promoter and protein evolution

    DOE PAGES

    Jha, Ramesh K.; Bingen, Jeremy M.; Johnson, Christopher W.; ...

    2018-06-01

    Robust fluorescence-based biosensors are emerging as critical tools for high-throughput strain improvement in synthetic biology. Many biosensors are developed in model organisms where sophisticated synthetic biology tools are also well established. However, industrial biochemical production often employs microbes with phenotypes that are advantageous for a target process, and biosensors may fail to directly transition outside the host in which they are developed. In particular, losses in sensitivity and dynamic range of sensing often occur, limiting the application of a biosensor across hosts. In this study, we demonstrate the optimization of an Escherichia coli-based biosensor in a robust microbial strain for the catabolism of aromatic compounds, Pseudomonas putida KT2440, through a generalizable approach of modulating interactions at the protein-DNA interface in the promoter and the protein-protein dimer interface. The high-throughput biosensor optimization approach demonstrated here is readily applicable towards other allosteric regulators.

  15. A constrained robust least squares approach for contaminant release history identification

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Painter, Scott L.; Wittmeyer, Gordon W.

    2006-04-01

    Contaminant source identification is an important type of inverse problem in groundwater modeling and is subject to both data and model uncertainty. Model uncertainty was rarely considered in previous studies. In this work, a robust framework for solving contaminant source recovery problems is introduced. The contaminant source identification problem is first cast into one of solving uncertain linear equations, where the response matrix is constructed using a superposition technique. The formulation presented here is general and is applicable to any porous media flow and transport solver. The robust least squares (RLS) estimator, which originated in the field of robust identification, directly accounts for errors arising from model uncertainty and has been shown to significantly reduce the sensitivity of the optimal solution to perturbations in model and data. In this work, a new variant of RLS, the constrained robust least squares (CRLS), is formulated for solving uncertain linear equations. CRLS allows additional constraints, such as nonnegativity, to be imposed. The performance of CRLS is demonstrated through one- and two-dimensional test problems. When the system is ill-conditioned and uncertain, CRLS gives much better performance than its classical counterpart, nonnegative least squares. The source identification framework developed in this work thus constitutes a reliable tool for recovering source release histories in real applications.
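
    The flavor of the approach can be sketched with a constrained least-squares solve; note that the sketch below uses ordinary Tikhonov regularization plus a nonnegativity bound as a simplified stand-in for the authors' CRLS formulation:

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(3)
A = rng.random((40, 10))                          # response matrix (superposition)
s_true = np.maximum(rng.standard_normal(10), 0)   # true release history
b = A @ s_true + 0.05 * rng.standard_normal(40)   # noisy observations

lam = 0.1                                # regularization weight (a tuning choice)
A_aug = np.vstack([A, np.sqrt(lam) * np.eye(10)])
b_aug = np.concatenate([b, np.zeros(10)])
res = lsq_linear(A_aug, b_aug, bounds=(0, np.inf))   # nonnegativity constraint
print(res.x.round(3))
```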

  16. Robustness for slope stability modelling under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

    Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessments to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work, a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.

  17. Matlab as a robust control design tool

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.

    1994-01-01

    This presentation introduces Matlab as a tool used in flight control research. The example used to illustrate some of the capabilities of this software is a robust controller designed for a single-stage-to-orbit air-breathing vehicle's ascent to orbit. The global requirements of the controller are to stabilize the vehicle and follow a trajectory in the presence of atmospheric disturbances and strong dynamic coupling between airframe and propulsion.

  18. Fuzzy support vector machine: an efficient rule-based classification technique for microarrays.

    PubMed

    Hajiloo, Mohsen; Rabiee, Hamid R; Anooshahpour, Mahdi

    2013-01-01

    The abundance of gene expression microarray data has led to the development of machine learning algorithms applicable to disease diagnosis, disease prognosis, and treatment selection problems. However, these algorithms often produce classifiers with weaknesses in terms of accuracy, robustness, and interpretability. This paper introduces the fuzzy support vector machine, a learning algorithm based on a combination of fuzzy classifiers and kernel machines for microarray classification. Experimental results on public leukemia, prostate, and colon cancer datasets show that the fuzzy support vector machine, applied in combination with filter or wrapper feature selection methods, develops a robust model with higher accuracy than conventional microarray classification models such as the support vector machine, artificial neural network, decision trees, k nearest neighbors, and diagonal linear discriminant analysis. Furthermore, the interpretable rule base inferred from the fuzzy support vector machine helps extract biological knowledge from microarray data. The fuzzy support vector machine, as a classification model with high generalization power, robustness, and good interpretability, is a promising tool for gene expression microarray classification.
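
    One common way to realize the fuzzy-SVM idea (a sketch of the general technique, not necessarily the authors' exact formulation) is to weight each training sample's penalty by a fuzzy membership value:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0, 1, (50, 20)), rng.normal(1, 1, (50, 20))])
y = np.repeat([0, 1], 50)                # two synthetic "expression" classes

# fuzzy membership: samples closer to their class centroid count more
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
dist = np.linalg.norm(X - centroids[y], axis=1)
membership = 1.0 - dist / (dist.max() + 1e-9)

clf = SVC(kernel="rbf", C=10.0)
clf.fit(X, y, sample_weight=membership)  # membership scales each sample's loss
print(clf.score(X, y))
```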

  19. RSRE: RNA structural robustness evaluator

    PubMed Central

    Shu, Wenjie; Zheng, Zhiqiang; Wang, Shengqi

    2007-01-01

    Biological robustness, defined as the ability to maintain stable functioning in the face of various perturbations, is an important and fundamental topic in current biology, and has become a focus of numerous studies in recent years. Although structural robustness has been explored in several types of RNA molecules, the origins of robustness are still controversial. Computational analysis results are needed to make up for the lack of evidence of robustness in natural biological systems. The RNA structural robustness evaluator (RSRE) web server presented here provides a freely available online tool to quantitatively evaluate the structural robustness of RNA based on the widely accepted definition of neutrality. Several classical structure comparison methods are employed; five randomization methods are implemented to generate control sequences; sub-optimal predicted structures can be optionally utilized to mitigate the uncertainty of secondary structure prediction. With a user-friendly interface, the web application is easy to use. Intuitive illustrations are provided along with the original computational results to facilitate analysis. The RSRE will be helpful in the wide exploration of RNA structural robustness and will catalyze our understanding of RNA evolution. The RSRE web server is freely available at http://biosrv1.bmi.ac.cn/RSRE/ or http://biotech.bmi.ac.cn/RSRE/. PMID:17567615
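
    The neutrality definition underlying the server can be sketched as follows; fold and structure_distance are placeholders for a real secondary-structure predictor and a structure comparison method (RSRE offers several of each), and the toy stand-ins exist only to make the sketch runnable:

```python
BASES = "ACGU"

def neutrality(seq, fold, structure_distance):
    """Structural robustness as 1 - normalized mean distance between the
    wild-type structure and the structures of all 3L single-point mutants."""
    wild = fold(seq)
    dists = []
    for i, base in enumerate(seq):
        for b in BASES:
            if b != base:
                mutant = seq[:i] + b + seq[i + 1:]
                dists.append(structure_distance(wild, fold(mutant)))
    return 1.0 - sum(dists) / (len(dists) * len(seq))

# toy placeholders (replace with e.g. an RNAfold wrapper and base-pair distance)
toy_fold = lambda s: "".join("(" if c == "G" else ")" if c == "C" else "." for c in s)
hamming = lambda a, b: sum(x != y for x, y in zip(a, b))
print(neutrality("GCAUGCAU", toy_fold, hamming))
```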

  20. Design of Launch Vehicle Flight Control Systems Using Ascent Vehicle Stability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Jang, Jiann-Woei; Alaniz, Abran; Hall, Robert; Bedossian, Nazareth; Hall, Charles; Jackson, Mark

    2011-01-01

    A launch vehicle represents a complicated flex-body structural environment for flight control system design. The Ascent-vehicle Stability Analysis Tool (ASAT) is developed to address the complicity in design and analysis of a launch vehicle. The design objective for the flight control system of a launch vehicle is to best follow guidance commands while robustly maintaining system stability. A constrained optimization approach takes the advantage of modern computational control techniques to simultaneously design multiple control systems in compliance with required design specs. "Tower Clearance" and "Load Relief" designs have been achieved for liftoff and max dynamic pressure flight regions, respectively, in the presence of large wind disturbances. The robustness of the flight control system designs has been verified in the frequency domain Monte Carlo analysis using ASAT.

  1. Non-negative Tensor Factorization for Robust Exploratory Big-Data Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexandrov, Boian; Vesselinov, Velimir Valentinov; Djidjev, Hristo Nikolov

    Currently, large multidimensional datasets are being accumulated in almost every field. Data are: (1) collected by distributed sensor networks in real-time all over the globe, (2) produced by large-scale experimental measurements or engineering activities, (3) generated by high-performance simulations, and (4) gathered by electronic communications and social-network activities, etc. Simultaneous analysis of these ultra-large heterogeneous multidimensional datasets is often critical for scientific discoveries, decision-making, emergency response, and national and global security. The importance of such analyses mandates the development of the next generation of robust machine learning (ML) methods and tools for big-data exploratory analysis.
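
    A minimal sketch of the non-negative tensor factorization at the heart of such exploratory analytics, using the open-source tensorly package (the data tensor and rank here are arbitrary illustrations):

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_parafac

rng = np.random.default_rng(0)
X = tl.tensor(rng.random((20, 30, 10)))   # e.g. sensors x time x features

# decompose into 3 interpretable non-negative components (CP/PARAFAC form)
weights, factors = non_negative_parafac(X, rank=3, n_iter_max=200)
for mode, F in enumerate(factors):
    print(f"mode {mode} factor shape: {F.shape}")   # (20,3), (30,3), (10,3)
```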

  2. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Tradeoff on Phenotype Robustness in Biological Networks Part II: Ecological Networks

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    In ecological networks, network robustness should be large enough to confer intrinsic robustness for tolerating intrinsic parameter fluctuations, as well as environmental robustness for resisting environmental disturbances, so that the phenotype stability of ecological networks can be maintained, thus guaranteeing phenotype robustness. However, it is difficult to analyze the network robustness of ecological systems because they are complex nonlinear partial differential stochastic systems. This paper develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance sensitivity in ecological networks. We found that the phenotype robustness criterion for ecological networks is that if intrinsic robustness + environmental robustness ≦ network robustness, then phenotype robustness can be maintained in spite of intrinsic parameter fluctuations and environmental disturbances. These results for robust ecological networks are similar to those for robust gene regulatory networks and evolutionary networks, even though they operate on different spatial and time scales. PMID:23515112

  3. Mechanical design in embryos: mechanical signalling, robustness and developmental defects.

    PubMed

    Davidson, Lance A

    2017-05-19

    Embryos are shaped by the precise application of force against the resistant structures of multicellular tissues. Forces may be generated, guided and resisted by cells, extracellular matrix and interstitial fluids, and by how these are organized and bound within the tissue's architecture. In this review, we summarize our current thoughts on the multiple roles of mechanics in direct shaping, mechanical signalling and robustness of development. Genetic programmes of development interact with environmental cues to direct the composition of the early embryo and endow cells with active force production. Biophysical advances now provide experimental tools to measure mechanical resistance and collective forces during morphogenesis and are allowing integration of this field with studies of signalling and patterning during development. We focus this review on concepts that highlight this integration, and how the unique contributions of mechanical cues and gradients might be tested side by side with conventional signalling systems. We conclude with speculation on the integration of large-scale programmes of development, and how mechanical responses may ensure robust development and serve as constraints on programmes of tissue self-assembly. This article is part of the themed issue 'Systems morphodynamics: understanding the development of tissue hardware'. © 2017 The Author(s).

  4. The Employee Survey: An Important Tool for Changing the Culture of an Organization

    ERIC Educational Resources Information Center

    Drapeau, Suzanne

    2004-01-01

    A regularly administered employee opinion survey is an important institutional outcomes measurement tool. It can provide robust benchmarks and standards for a whole range of dimensions of a healthy workplace. This kind of survey should also be a critically important component of the process of engaging employees in the development of the…

  5. Introducing a new open source GIS user interface for the SWAT model

    USDA-ARS?s Scientific Manuscript database

    The Soil and Water Assessment Tool (SWAT) model is a robust watershed modelling tool. It typically uses the ArcSWAT interface to create its inputs. ArcSWAT is public domain software which works in the licensed ArcGIS environment. The aim of this paper was to develop an open source user interface ...

  6. The Engineering of Engineering Education: Curriculum Development from a Designer's Point of View

    ERIC Educational Resources Information Center

    Rompelman, Otto; De Graaff, Erik

    2006-01-01

    Engineers have a set of powerful tools at their disposal for designing robust and reliable technical systems. In educational design these tools are seldom applied. This paper explores the application of concepts from the systems approach in an educational context. The paradigms of design methodology and systems engineering appear to be suitable…

  7. Off-target model based OPC

    NASA Astrophysics Data System (ADS)

    Lu, Mark; Liang, Curtis; King, Dion; Melvin, Lawrence S., III

    2005-11-01

    Model-based optical proximity correction (OPC) has become an indispensable tool for achieving wafer pattern to design fidelity at current manufacturing process nodes. Most model-based OPC is performed considering the nominal process condition, with limited consideration of through-process manufacturing robustness. This study examines the use of off-target process models - models that represent non-nominal process states such as would occur with a dose or focus variation - to understand and manipulate the final pattern correction into a more process-robust configuration. The study first examines and validates the process of generating an off-target model, then examines the quality of the off-target model. Once the off-target model is proven, it is used to demonstrate methods of generating process-robust corrections. The concepts are demonstrated using a 0.13 μm logic gate process. Preliminary indications show success in both off-target model production and process-robust corrections. With these off-target models as tools, mask production cycle times can be reduced.

  8. Closed-Loop Evaluation of an Integrated Failure Identification and Fault Tolerant Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine; Khong, Thuan

    2006-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems developed for failure detection, identification, and reconfiguration, as well as upset recovery, need to be evaluated over broad regions of the flight envelope or under extreme flight conditions, and should include various sources of uncertainty. To apply formal robustness analysis, formulation of linear fractional transformation (LFT) models of complex parameter-dependent systems is required, which represent system uncertainty due to parameter uncertainty and actuator faults. This paper describes a detailed LFT model formulation procedure from the nonlinear model of a transport aircraft by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The closed-loop system is evaluated over the entire flight envelope based on the generated LFT model which can cover nonlinear dynamics. The robustness analysis results of the closed-loop fault tolerant control system of a transport aircraft are presented. A reliable flight envelope (safe flight regime) is also calculated from the robust performance analysis results, over which the closed-loop system can achieve the desired performance of command tracking and failure detection.

  9. A Design Tool for Robust Composite Structures

    DTIC Science & Technology

    2010-06-01

    UNIVERSITY OF CAMBRIDGE FINAL REPORT: A Design Tool for Robust Composite Structures. Frank Zok, Materials Department, University of ... organic fibers, especially Dyneema®. The principal objectives of the present study were to ascertain the fundamental mechanical properties of Dyneema... composites increases by a factor of 2 and the ductility by almost a factor of 3 over the strain-rate range 10^-3 s^-1 to 10^4 s^-1. One consequence is...

  10. Wisdom of crowds for robust gene network inference

    PubMed Central

    Marbach, Daniel; Costello, James C.; Küffner, Robert; Vega, Nicci; Prill, Robert J.; Camacho, Diogo M.; Allison, Kyle R.; Kellis, Manolis; Collins, James J.; Stolovitzky, Gustavo

    2012-01-01

    Reconstructing gene regulatory networks from high-throughput data is a long-standing problem. Through the DREAM project (Dialogue on Reverse Engineering Assessment and Methods), we performed a comprehensive blind assessment of over thirty network inference methods on Escherichia coli, Staphylococcus aureus, Saccharomyces cerevisiae, and in silico microarray data. We characterize performance, data requirements, and inherent biases of different inference approaches offering guidelines for both algorithm application and development. We observe that no single inference method performs optimally across all datasets. In contrast, integration of predictions from multiple inference methods shows robust and high performance across diverse datasets. Thereby, we construct high-confidence networks for E. coli and S. aureus, each comprising ~1700 transcriptional interactions at an estimated precision of 50%. We experimentally test 53 novel interactions in E. coli, of which 23 were supported (43%). Our results establish community-based methods as a powerful and robust tool for the inference of transcriptional gene regulatory networks. PMID:22796662
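
    The integration step can be sketched as simple rank averaging across methods, one common way to realize such community aggregation (the DREAM pipeline adds further refinements):

```python
import numpy as np

def integrate_rankings(scores):
    """scores: methods x edges matrix, higher = more confident.
    Returns the mean per-method rank of each edge (community confidence)."""
    ranks = scores.argsort(axis=1).argsort(axis=1)  # rank within each method
    return ranks.mean(axis=0)

scores = np.array([[0.9, 0.1, 0.5],
                   [0.7, 0.3, 0.8],
                   [0.6, 0.2, 0.4]])
print(integrate_rankings(scores))   # edge 0 ranks highest overall
```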

  11. Mapping of unknown industrial plant using ROS-based navigation mobile robot

    NASA Astrophysics Data System (ADS)

    Priyandoko, G.; Ming, T. Y.; Achmad, M. S. H.

    2017-10-01

    This research examines how humans work with a teleoperated unmanned mobile robot to inspect an industrial plant area, producing a 2D/3D map for further critical evaluation. The experiment focuses on two parts: the way the human and robot perform remote interactions using a robust method, and the way the robot perceives the surrounding environment as a 2D/3D perspective map. ROS (Robot Operating System) was utilized as a tool in the development and implementation during the research, providing a robust data communication method in the form of messages and topics. RGBD SLAM performs the visual mapping function to construct the 2D/3D map using a Kinect sensor. The results showed that the teleoperated mobile robot system successfully extends human perspective for remote surveillance in a large industrial plant area. It was concluded that the proposed work is a robust solution for large-scale mapping within an unknown construction building.
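
    The ROS messaging pattern the system builds on looks roughly like this in rospy (a sketch assuming ROS 1; the topic name and command values are hypothetical):

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

def teleop():
    rospy.init_node('teleop_commander')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)                 # 10 Hz command stream
    while not rospy.is_shutdown():
        cmd = Twist()
        cmd.linear.x = 0.2                # creep forward while scanning
        pub.publish(cmd)                  # robust topic-based messaging
        rate.sleep()

if __name__ == '__main__':
    teleop()
```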

  12. Structure Computation of Quiet Spike[Trademark] Flight-Test Data During Envelope Expansion

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2008-01-01

    System identification, or mathematical modeling, is used in the aerospace community for the development of simulation models for robust control law design. These models are often described as linear, time-invariant processes. Nevertheless, it is well known that the underlying process is often nonlinear. The reason for using a linear approach has been the lack of a proper set of tools for the identification of nonlinear systems. Over the past several decades, the controls and biomedical communities have made great advances in developing tools for the identification of nonlinear systems. These approaches are robust and readily applicable to aerospace systems. In this paper, we show the application of one such nonlinear system identification technique, structure detection, to the analysis of F-15B Quiet Spike(TradeMark) aeroservoelastic flight-test data. Structure detection is concerned with the selection of a subset of candidate terms that best describe the observed output. This is a necessary procedure to compute an efficient system description that may afford greater insight into the functionality of the system or a simpler controller design. Structure computation as a tool for black-box modeling may be of critical importance for the development of robust, parsimonious models for the flight-test community. Moreover, this approach may lead to efficient strategies for rapid envelope expansion, which may save significant development time and costs. The objectives of this study are to demonstrate, via analysis of F-15B Quiet Spike aeroservoelastic flight-test data for several flight conditions, that 1) linear models are inefficient for modeling aeroservoelastic data, 2) nonlinear identification provides a parsimonious model description while providing a high percent fit for cross-validated data, and 3) the model structure and parameters vary as the flight condition is altered.
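
    Structure detection can be sketched as greedy forward selection over a dictionary of candidate terms (a simplified stand-in for the structure-computation techniques the paper applies):

```python
import numpy as np

def forward_select(candidates, y, n_terms):
    """candidates: samples x terms matrix; returns indices of chosen terms."""
    chosen = []
    for _ in range(n_terms):
        best, best_sse = None, np.inf
        for j in range(candidates.shape[1]):
            if j in chosen:
                continue
            X = candidates[:, chosen + [j]]
            theta, *_ = np.linalg.lstsq(X, y, rcond=None)
            sse = np.sum((y - X @ theta) ** 2)
            if sse < best_sse:
                best, best_sse = j, sse
        chosen.append(best)     # keep the term that most reduces the error
    return chosen

# toy system: output depends on terms 0 and 2 only (x and x**3)
x = np.linspace(-1, 1, 200)
candidates = np.column_stack([x, x**2, x**3, np.sin(3 * x)])
y = 2 * x - 0.5 * x**3 + 0.01 * np.random.default_rng(5).standard_normal(200)
print(forward_select(candidates, y, n_terms=2))   # expect [0, 2]
```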

  13. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-off on Phenotype Robustness in Biological Networks Part I: Gene Regulatory Networks in Systems and Evolutionary Biology

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer intrinsic robustness in order to tolerate intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of the biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey on biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was discovered that the phenotype robustness criterion for biological networks at different levels relies upon intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this is true, the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through their corresponding phenotype robustness criterion from the systematic point of view. PMID:23515240

  14. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-off on Phenotype Robustness in Biological Networks Part I: Gene Regulatory Networks in Systems and Evolutionary Biology.

    PubMed

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer intrinsic robustness in order to tolerate intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of the biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey on biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was discovered that the phenotype robustness criterion for biological networks at different levels relies upon intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this is true, the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through their corresponding phenotype robustness criterion from the systematic point of view.

  15. Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools

    NASA Technical Reports Server (NTRS)

    Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.

    2006-01-01

    A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis. These are: 1) ease of use, 2) ability to more robustly converge to solutions, and 3) higher fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferred over having one integrated tool. This tool-suite, their characteristics, and their applicability will be described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.

  16. COME: a robust coding potential calculation tool for lncRNA identification and characterization based on multiple features.

    PubMed

    Hu, Long; Xu, Zhiyu; Hu, Boqin; Lu, Zhi John

    2017-01-09

    Recent genomic studies suggest that novel long non-coding RNAs (lncRNAs) are specifically expressed and far outnumber annotated lncRNA sequences. To identify and characterize novel lncRNAs in RNA sequencing data from new samples, we have developed COME, a coding potential calculation tool based on multiple features. It integrates multiple sequence-derived and experiment-based features using a decompose-compose method, which makes it more accurate and robust than other well-known tools. We also showed that COME was able to substantially improve the consistency of prediction results from other coding potential calculators. Moreover, COME annotates and characterizes each predicted lncRNA transcript with multiple lines of supporting evidence, which are not provided by other tools. Remarkably, we found that one subgroup of lncRNAs classified by such supporting features (i.e. conserved local RNA secondary structure) was highly enriched in a well-validated database (lncRNAdb). We further found that the conserved structural domains on lncRNAs had a better chance than other RNA regions to interact with RNA-binding proteins, based on recent eCLIP-seq data in human, indicating their potential regulatory roles. Overall, we present COME as an accurate, robust and multiple-feature-supported method for the identification and characterization of novel lncRNAs. The software implementation is available at https://github.com/lulab/COME. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, S.; Katz, J.; Wurtenberger, L.

    Low emission development strategies (LEDS) articulate economy-wide policies and implementation plans designed to enable a country to meet its long-term development objectives while reducing greenhouse gas emissions. A development impact assessment tool was developed to inform an analytically robust and transparent prioritization of LEDS actions based on their economic, social, and environmental impacts. The graphical tool helps policymakers communicate the development impacts of LEDS options and identify actions that help meet both emissions reduction and development goals. This paper summarizes the adaptation and piloting of the tool in Kenya and Montenegro. The paper highlights strengths of the tool and discusses key needs for improving it.

  18. An Intercompany Perspective on Biopharmaceutical Drug Product Robustness Studies.

    PubMed

    Morar-Mitrica, Sorina; Adams, Monica L; Crotts, George; Wurth, Christine; Ihnat, Peter M; Tabish, Tanvir; Antochshuk, Valentyn; DiLuzio, Willow; Dix, Daniel B; Fernandez, Jason E; Gupta, Kapil; Fleming, Michael S; He, Bing; Kranz, James K; Liu, Dingjiang; Narasimhan, Chakravarthy; Routhier, Eric; Taylor, Katherine D; Truong, Nobel; Stokes, Elaine S E

    2018-02-01

    The Biophorum Development Group (BPDG) is an industry-wide consortium enabling networking and sharing of best practices for the development of biopharmaceuticals. To gain a better understanding of current industry approaches for establishing biopharmaceutical drug product (DP) robustness, the BPDG-Formulation Point Share group conducted an intercompany collaboration exercise, which included a benchmarking survey and extensive group discussions around the scope, design, and execution of robustness studies. The results of this industry collaboration revealed several key common themes: (1) overall DP robustness is defined by both the formulation and the manufacturing process robustness; (2) robustness integrates the principles of quality by design (QbD); (3) DP robustness is an important factor in setting critical quality attribute control strategies and commercial specifications; (4) most companies employ robustness studies, along with prior knowledge, risk assessments, and statistics, to develop the DP design space; (5) studies are tailored to commercial development needs and the practices of each company. Three case studies further illustrate how a robustness study design for a biopharmaceutical DP balances experimental complexity, statistical power, scientific understanding, and risk assessment to provide the desired product and process knowledge. The BPDG-Formulation Point Share discusses identified industry challenges with regard to biopharmaceutical DP robustness and presents some recommendations for best practices. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  19. An "in silico" Bioinformatics Laboratory Manual for Bioscience Departments: "Prediction of Glycosylation Sites in Phosphoethanolamine Transferases"

    ERIC Educational Resources Information Center

    Alyuruk, Hakan; Cavas, Levent

    2014-01-01

    Genomics and proteomics projects have produced a huge amount of raw biological data including DNA and protein sequences. Although these data have been stored in data banks, their evaluation is strictly dependent on bioinformatics tools. These tools have been developed by multidisciplinary experts for fast and robust analysis of biological data.…

  20. The development of a standardised diet history tool to support the diagnosis of food allergy.

    PubMed

    Skypala, Isabel J; Venter, Carina; Meyer, Rosan; deJong, Nicolette W; Fox, Adam T; Groetch, Marion; Oude Elberink, J N; Sprikkelman, Aline; Diamandi, Louiza; Vlieg-Boerstra, Berber J

    2015-01-01

    The disparity between reported and diagnosed food allergy makes robust diagnosis imperative. The allergy-focussed history is an important starting point, but published literature on its efficacy is sparse. Using a structured approach to connect symptoms, suspected foods and dietary intake, a multi-disciplinary task force of the European Academy of Allergy and Clinical Immunology developed paediatric and adult diet history tools. Both tools are divided into stages using traffic light labelling (red, amber and green). The red stage requires the practitioner to gather relevant information on symptoms, atopic history, food triggers, foods eaten and nutritional issues. The amber stage facilitates interpretation of the responses to the red-stage questions, thus enabling the practitioner to prepare to move forward. The final green stage provides a summary template and test algorithm to support continuation down the diagnostic pathway. These tools will provide a standardised, practical approach to support food allergy diagnosis, ensuring that all relevant information is captured and interpreted in a robust manner. Future work is required to validate their use in diverse age groups, disease entities and in different countries, in order to account for differences in health care systems, food availability and dietary norms.

  1. Robust Selection Algorithm (RSA) for Multi-Omic Biomarker Discovery; Integration with Functional Network Analysis to Identify miRNA Regulated Pathways in Multiple Cancers.

    PubMed

    Sehgal, Vasudha; Seviour, Elena G; Moss, Tyler J; Mills, Gordon B; Azencott, Robert; Ram, Prahlad T

    2015-01-01

    MicroRNAs (miRNAs) play a crucial role in the maintenance of cellular homeostasis by regulating the expression of their target genes. As such, the dysregulation of miRNA expression has been frequently linked to cancer. With rapidly accumulating molecular data linked to patient outcome, the identification of robust multi-omic molecular markers is critical for clinical impact. While previous bioinformatic tools have been developed to identify potential biomarkers in cancer, these methods do not allow for rapid classification of oncogenes versus tumor suppressors taking into account robust differential expression, cutoffs, p-values and non-normality of the data. Here, we propose a methodology, the Robust Selection Algorithm (RSA), that addresses these important problems in big-data omics analysis. The robustness of the survival analysis is ensured by identification of optimal cutoff values of omics expression, strengthened by a p-value computed through intensive random resampling that takes into account any non-normality in the data, and by integration into multi-omic functional networks. Here we have analyzed pan-cancer miRNA patient data to identify functional pathways involved in cancer progression that are associated with the miRNAs selected by RSA. Our approach demonstrates how existing survival analysis techniques can be integrated with a functional network analysis framework to efficiently identify promising biomarkers and novel therapeutic candidates across diseases.
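
    The intensive random-resampling step can be sketched as a permutation test; the mean-difference statistic below is a simplified placeholder for the survival statistics RSA actually uses:

```python
import numpy as np

def resampling_pvalue(expr, outcome, cutoff, n_resamples=10_000, seed=None):
    """Empirical p-value for a group difference at a given expression cutoff,
    making no normality assumption (both groups must be non-empty)."""
    rng = np.random.default_rng(seed)
    high = expr > cutoff
    observed = abs(outcome[high].mean() - outcome[~high].mean())
    hits = 0
    for _ in range(n_resamples):
        perm = rng.permutation(outcome)          # break expression-outcome link
        hits += abs(perm[high].mean() - perm[~high].mean()) >= observed
    return (hits + 1) / (n_resamples + 1)

rng = np.random.default_rng(11)
expr = rng.random(80)
outcome = 12 + 5 * (expr > 0.6) + rng.standard_normal(80)  # synthetic effect
print(resampling_pvalue(expr, outcome, cutoff=0.6, seed=0))
```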

  2. Proteomics tools reveal startlingly high amounts of oxytocin in plasma and serum

    NASA Astrophysics Data System (ADS)

    Brandtzaeg, Ole Kristian; Johnsen, Elin; Roberg-Larsen, Hanne; Seip, Knut Fredrik; Maclean, Evan L.; Gesquiere, Laurence R.; Leknes, Siri; Lundanes, Elsa; Wilson, Steven Ray

    2016-08-01

    The neuropeptide oxytocin (OT) is associated with a plethora of social behaviors, and is a key topic at the intersection of psychology and biology. However, tools for measuring OT are still not fully developed. We describe a robust nano liquid chromatography-mass spectrometry (nanoLC-MS) platform for measuring the total amount of OT in human plasma/serum. OT binds strongly to plasma proteins, but a reduction/alkylation (R/A) procedure breaks this bond, enabling ample detection of total OT. The method (R/A + robust nanoLC-MS) was used to determine total OT plasma/serum levels, which proved startlingly high (high pg/mL-ng/mL). Similar results were obtained when combining R/A and ELISA. Compared to measuring free OT, measuring total OT can have advantages in e.g. biomarker studies.

  3. A model to assess the Mars Telecommunications Network relay robustness

    NASA Technical Reports Server (NTRS)

    Girerd, Andre R.; Meshkat, Leila; Edwards, Charles D., Jr.; Lee, Charles H.

    2005-01-01

    The relatively long mission durations and compatible radio protocols of current and projected Mars orbiters have enabled the gradual development of a heterogeneous constellation providing proximity communication services for surface assets. The current and forecasted capability of this evolving network has reached the point that designers of future surface missions consider complete dependence on it. Such designers, along with those architecting network requirements, have a need to understand the robustness of projected communication service. A model has been created to identify the robustness of the Mars Network as a function of surface location and time. Due to the decade-plus time horizon considered, the network will evolve, with emerging productive nodes and nodes that cease or fail to contribute. The model is a flexible framework to holistically process node information into measures of capability robustness that can be visualized for maximum understanding. Outputs from JPL's Telecom Orbit Analysis Simulation Tool (TOAST) provide global telecom performance parameters for current and projected orbiters. Probabilistic estimates of orbiter fuel life are derived from orbit keeping burn rates, forecasted maneuver tasking, and anomaly resolution budgets. Orbiter reliability is estimated probabilistically. A flexible scheduling framework accommodates the projected mission queue as well as potential alterations.
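
    The probabilistic roll-up such a model performs can be sketched by combining per-orbiter survival curves into a network availability estimate (the orbiter names, start years, and expected lifetimes below are all hypothetical):

```python
import numpy as np

years = np.arange(2005, 2016)
lifetimes = {"orbiter_A": 6.0, "orbiter_B": 9.0, "orbiter_C": 4.0}  # years
starts = {"orbiter_A": 2005, "orbiter_B": 2007, "orbiter_C": 2005}

p_any = np.zeros(len(years))
for name, tau in lifetimes.items():
    age = np.clip(years - starts[name], 0, None)
    # exponential survival once on orbit; zero before arrival
    alive = np.where(years >= starts[name], np.exp(-age / tau), 0.0)
    p_any = 1 - (1 - p_any) * (1 - alive)   # P(at least one relay available)

for y, p in zip(years, p_any):
    print(y, round(p, 3))
```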

  4. Robust stability of fractional order polynomials with complicated uncertainty structure

    PubMed Central

    Şenol, Bilal; Pekař, Libor

    2017-01-01

    The main aim of this article is to present a graphical approach to robust stability analysis for families of fractional order (quasi-)polynomials with complicated uncertainty structure. More specifically, the work emphasizes the multilinear, polynomial and general structures of uncertainty and, moreover, the retarded quasi-polynomials with parametric uncertainty are studied. Since the families with these complex uncertainty structures suffer from the lack of analytical tools, their robust stability is investigated by numerical calculation and depiction of the value sets and subsequent application of the zero exclusion condition. PMID:28662173
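
    The value-set/zero-exclusion idea can be sketched numerically for a sampled uncertain fractional-order polynomial (the polynomial and uncertainty intervals are invented for illustration; the article's treatment is graphical and more rigorous):

```python
import numpy as np

rng = np.random.default_rng(0)
q0 = rng.uniform(1.0, 2.0, 500)   # uncertain coefficients, sampled
q1 = rng.uniform(0.5, 1.5, 500)

def value_set(omega):
    """Samples of p(j*omega; q) for p(s; q) = s**1.5 + q1*s**0.5 + q0."""
    s = 1j * omega
    return s**1.5 + q1 * s**0.5 + q0

excluded = True
for omega in np.linspace(0.01, 10.0, 200):
    if np.abs(value_set(omega)).min() < 1e-3:   # origin (nearly) inside set
        excluded = False
        print(f"possible zero inclusion near omega = {omega:.2f}")
print("zero excluded on the sampled grid:", excluded)
```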

  5. A High-Availability, Distributed Hardware Control System Using Java

    NASA Technical Reports Server (NTRS)

    Niessner, Albert F.

    2011-01-01

    Two independent coronagraph experiments that require 24/7 availability, with different optical layouts and different motion control requirements, are commanded and controlled with the same Java software system executing on many geographically scattered computer systems interconnected via TCP/IP. High availability of a distributed system requires that the computers have a robust communication messaging system, making the mix of TCP/IP (a robust transport) and XML (a robust message format) a natural choice. XML also adds configuration flexibility. Java then adds object-oriented paradigms, exception handling, heavily tested libraries, and many third-party tools for implementation robustness. The result is a software system that provides users 24/7 access to two diverse experiments, with XML files defining the differences.

  6. Robust Design Optimization via Failure Domain Bounding

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2007-01-01

    This paper extends and applies the strategies recently developed by the authors for handling constraints under uncertainty to robust design optimization. For the scope of this paper, robust optimization is a methodology aimed at problems for which some parameters are uncertain and are only known to belong to some uncertainty set. This set can be described by either a deterministic or a probabilistic model. In the methodology developed herein, optimization-based strategies are used to bound the constraint violation region using hyper-spheres and hyper-rectangles. By comparing the resulting bounding sets with any given uncertainty model, it can be determined whether the constraints are satisfied for all members of the uncertainty model (i.e., constraints are feasible) or not (i.e., constraints are infeasible). If constraints are infeasible and a probabilistic uncertainty model is available, upper bounds to the probability of constraint violation can be efficiently calculated. The tools developed enable approximating not only the set of designs that make the constraints feasible but also, when required, the set of designs for which the probability of constraint violation is below a prescribed admissible value. When constraint feasibility is possible, several design criteria can be used to shape the uncertainty model of performance metrics of interest. Worst-case, least-second-moment, and reliability-based design criteria are considered herein. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, these strategies are easily applicable to a broad range of engineering problems.
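
    The hyper-sphere bounding strategy can be sketched with a standard constrained optimizer: find the point on the failure boundary closest to the nominal parameter, whose distance gives the radius of a sphere of feasible designs (up to local-optimality caveats; the constraint function here is hypothetical):

```python
import numpy as np
from scipy.optimize import minimize

g = lambda p: p[0]**2 + p[1] - 1.0      # failure region: g(p) > 0 (toy example)
p0 = np.array([0.0, 0.0])               # nominal parameter point, g(p0) < 0

# closest point on the failure boundary g(p) = 0 to the nominal point
res = minimize(lambda p: np.sum((p - p0)**2), x0=p0 + 0.1,
               constraints=[{"type": "eq", "fun": g}])
radius = np.linalg.norm(res.x - p0)     # points within this sphere are feasible
print(f"feasible hyper-sphere radius around p0: {radius:.3f}")   # ~0.866
```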

  7. NADIR: A Flexible Archiving System Current Development

    NASA Astrophysics Data System (ADS)

    Knapic, C.; De Marco, M.; Smareglia, R.; Molinaro, M.

    2014-05-01

    The New Archiving Distributed InfrastructuRe (NADIR) is under development at the Italian center for Astronomical Archives (IA2) to increase the performance of the current archival software tools at the data center. Traditional software usually offers simple and robust solutions for data archiving and distribution but is awkward to adapt and reuse in projects that have different purposes. Evolution of the data in terms of data model, format, publication policy, version, and metadata content are the main threats to reuse. NADIR, using stable and mature framework features, answers these very challenging issues. Its main characteristics are a configuration database, a multi-threading and multi-language environment (C++, Java, Python), special features to guarantee high scalability, modularity, robustness, error tracking, and tools to monitor with confidence the status of each project at each archiving site. In this contribution, the development of the core components is presented, commenting also on some performance and innovative features (multi-cast and publisher-subscriber paradigms). NADIR is planned to be developed as simply as possible, with default configurations for every project, first of all for LBT and other IA2 projects.

  8. Statistics based sampling for controller and estimator design

    NASA Astrophysics Data System (ADS)

    Tenne, Dirk

    The purpose of this research is the development of statistical design tools for robust feed-forward/feedback controllers and nonlinear estimators. This dissertation is threefold and addresses the aforementioned topics: nonlinear estimation, target tracking, and robust control. To develop statistically robust controllers and nonlinear estimation algorithms, research has been performed to extend existing techniques, which propagate the statistics of the state, to achieve higher-order accuracy. The so-called unscented transformation has been extended to capture higher-order moments. Furthermore, higher-order moment update algorithms based on a truncated power series have been developed. The proposed techniques are tested on various benchmark examples. Furthermore, the unscented transformation has been utilized to develop a three-dimensional geometrically constrained target tracker. The proposed planar circular prediction algorithm has been developed in a local coordinate framework, which is amenable to extension of the tracking algorithm to three-dimensional space. This tracker combines the predictions of a circular prediction algorithm and a constant velocity filter by utilizing the Covariance Intersection. This combined prediction can be updated with the subsequent measurement using a linear estimator. The proposed technique is illustrated on a 3D benchmark trajectory, which includes coordinated turns and straight-line maneuvers. The third part of this dissertation addresses the design of controllers that include knowledge of parametric uncertainties and their distributions. The parameter distributions are approximated by a finite set of points which are calculated by the unscented transformation. This set of points is used to design robust controllers which minimize a statistical performance measure of the plant over the domain of uncertainty, consisting of a combination of the mean and variance. The proposed technique is illustrated on three benchmark problems. The first relates to the design of prefilters for a linear and a nonlinear spring-mass-dashpot system, and the second applies a feedback controller to a hovering helicopter. Lastly, the statistical robust controller design is applied to a concurrent feed-forward/feedback controller structure for a high-speed, low-tension tape drive.
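
    The basic unscented transformation that the dissertation extends can be stated compactly (a standard formulation with scaling parameter kappa; the polar-to-Cartesian example is illustrative):

```python
import numpy as np

def unscented_transform(f, m, P, kappa=1.0):
    """Propagate mean m and covariance P of x through a nonlinear map f
    using 2n+1 deterministically chosen sigma points."""
    n = len(m)
    S = np.linalg.cholesky((n + kappa) * P)      # matrix square root
    sigma = np.vstack([m, m + S.T, m - S.T])     # 2n + 1 sigma points
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    Y = np.array([f(p) for p in sigma])          # push points through f
    mean = w @ Y
    cov = (w[:, None] * (Y - mean)).T @ (Y - mean)
    return mean, cov

# statistics of a polar-to-Cartesian conversion
f = lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])
m, P = np.array([1.0, 0.0]), np.diag([0.01, 0.09])
print(unscented_transform(f, m, P))
```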

  9. Design and Analysis of Morpheus Lander Flight Control System

    NASA Technical Reports Server (NTRS)

    Jang, Jiann-Woei; Yang, Lee; Fritz, Mathew; Nguyen, Louis H.; Johnson, Wyatt R.; Hart, Jeremy J.

    2014-01-01

    The Morpheus Lander is a vertical takeoff and landing test-bed vehicle developed to demonstrate the Guidance, Navigation and Control (GN&C) capability of the integrated autonomous landing and hazard avoidance system hardware and software. The Morpheus flight control system design must be robust across various mission profiles. This paper presents a design methodology that employs numerical optimization to develop the Morpheus flight control system. The design objectives include attitude tracking accuracy and robust stability with respect to rigid-body dynamics and propellant slosh. Under the assumption that the Morpheus time-varying dynamics and control system can be frozen over a short period of time, the flight controllers are designed to stabilize all selected frozen-time control systems in the presence of parametric uncertainty. Both the control gains in the inner attitude control loop and the guidance gains in the outer position control loop are designed to maximize vehicle performance while ensuring robustness. The flight control system designs provided herein have been demonstrated to provide stable control in both the Draper Ares Stability Analysis Tool (ASAT) and the NASA/JSC Trick-based Morpheus time-domain simulation.

  10. Tools of Robustness for Item Response Theory.

    ERIC Educational Resources Information Center

    Jones, Douglas H.

    This paper briefly demonstrates a few of the possibilities of a systematic application of robustness theory, concentrating on the estimation of ability when the true item response model does and does not fit the data. The definition of the maximum likelihood estimator (MLE) of ability is briefly reviewed. After introducing the notion of…
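
    For illustration, the ability MLE can be sketched under a two-parameter logistic (2PL) model with Newton-Raphson iteration; the model choice and the item parameters here are assumptions made for the example, since the abstract does not specify them:

        # Hedged sketch: MLE of latent ability under an assumed 2PL IRT model.
        # Note the MLE diverges for all-correct or all-wrong response patterns.
        import numpy as np

        def mle_ability(responses, a, b, theta=0.0, tol=1e-8, max_iter=50):
            """responses: 0/1 item scores; a, b: item discriminations/difficulties."""
            for _ in range(max_iter):
                p = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # P(correct | theta)
                score = np.sum(a * (responses - p))          # d logL / d theta
                info = np.sum(a**2 * p * (1.0 - p))          # Fisher information
                step = score / info
                theta += step
                if abs(step) < tol:
                    break
            return theta

        u = np.array([1, 1, 0, 1, 0])
        a = np.array([1.2, 0.8, 1.5, 1.0, 0.9])
        b = np.array([-0.5, 0.0, 0.5, 1.0, -1.0])
        print(mle_ability(u, a, b))   # MLE of the examinee's ability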

  11. Molecular mechanisms governing differential robustness of development and environmental responses in plants

    PubMed Central

    Lachowiec, Jennifer; Queitsch, Christine; Kliebenstein, Daniel J.

    2016-01-01

    Background: Robustness to genetic and environmental perturbation is a salient feature of multicellular organisms. Loss of developmental robustness can lead to severe phenotypic defects and fitness loss. However, perfect robustness, i.e. no variation at all, is evolutionarily unfit as organisms must be able to change phenotype to properly respond to changing environments and biotic challenges. Plasticity is the ability to adjust phenotypes predictably in response to specific environmental stimuli, which can be considered a transient shift allowing an organism to move from one robust phenotypic state to another. Plants, as sessile organisms that undergo continuous development, are particularly dependent on an exquisite fine-tuning of the processes that balance robustness and plasticity to maximize fitness. Scope and Conclusions: This paper reviews recently identified mechanisms, both systems-level and molecular, that modulate robustness, and discusses their implications for the optimization of plant fitness. Robustness in living systems arises from the structure of genetic networks, the specific molecular functions of the underlying genes, and their interactions. This very same network responsible for the robustness of specific developmental states also has to be built such that it enables plastic yet robust shifts in response to environmental changes. In plants, the interactions and functions of signal transduction pathways activated by phytohormones and the tendency for plants to tolerate whole-genome duplications, tandem gene duplication and hybridization are emerging as major regulators of robustness in development. Despite their obvious implications for plant evolution and plant breeding, the mechanistic underpinnings by which plants modulate precise levels of robustness, plasticity and evolvability in networks controlling different phenotypes are under-studied. PMID:26473020

  12. Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.

    2005-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
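
    For reference, the upper linear fractional transformation at the heart of such tools has the standard textbook form (not specific to the software described above):

        F_u(M, \Delta) \;=\; M_{22} + M_{21}\,\Delta\,(I - M_{11}\,\Delta)^{-1} M_{12},
        \qquad
        M = \begin{pmatrix} M_{11} & M_{12} \\ M_{21} & M_{22} \end{pmatrix},

    where the structured block \Delta collects the uncertain and varying parameters and M holds the known dynamics; the LFT modeling task is precisely to factor the plant's parametric dependencies into this form.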

  13. The Role of Crop Systems Simulation in Agriculture and Environment

    USDA-ARS?s Scientific Manuscript database

    Over the past 30 to 40 years, simulation of crop systems has advanced from a neophyte science with inadequate computing power into a robust and increasingly accepted science supported by improved software, languages, development tools, and computer capabilities. Crop system simulators contain mathe...

  14. Robust detection-isolation-accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Weiss, J. L.; Pattipati, K. R.; Willsky, A. S.; Eterno, J. S.; Crawford, J. T.

    1985-01-01

    This report presents the results of a one-year study to: (1) develop a theory for Robust Failure Detection and Identification (FDI) in the presence of model uncertainty; (2) develop a design methodology which utilizes the robust FDI theory; (3) apply the methodology to a sensor FDI problem for the F-100 jet engine; and (4) demonstrate the application of the theory to the evaluation of alternative FDI schemes. Theoretical results in statistical discrimination are used to evaluate the robustness of residual signals (or parity relations) in terms of their usefulness for FDI, and optimally robust parity relations are derived through the optimization of robustness metrics. The result can be viewed as a decentralization of the FDI process. A general structure for decentralized FDI is proposed, and robustness metrics are used to determine various parameters of the algorithm.
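
    The parity-relation idea can be made concrete with a minimal static-redundancy sketch; the sensor geometry and fault below are illustrative and unrelated to the F-100 application:

        # Static parity relation for sensor FDI: with more sensors than states
        # (y = C x + fault), any v with v^T C = 0 yields a residual r = v^T y
        # that is insensitive to the state and nonzero only under sensor faults.
        import numpy as np
        from scipy.linalg import null_space

        C = np.array([[1.0, 0.0],
                      [0.0, 1.0],
                      [1.0, 1.0]])        # 3 sensors measuring 2 states
        V = null_space(C.T)               # parity vectors: V^T C = 0

        x = np.array([2.0, -1.0])
        y_ok = C @ x                                 # healthy measurements
        y_fault = y_ok + np.array([0.0, 0.5, 0.0])   # bias fault on sensor 2

        print(V.T @ y_ok)     # ~0: residual decoupled from the state
        print(V.T @ y_fault)  # nonzero: fault signature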

  15. The role of 3D visualisation as an analytical tool preparatory to numerical modelling

    NASA Astrophysics Data System (ADS)

    Robins, N. S.; Rutter, H. K.; Dumpleton, S.; Peach, D. W.

    2005-01-01

    Groundwater investigation has long depended on developing a conceptual flow model as a precursor to a mathematical model, which, for complex aquifers, may in turn lead to a numerical approximation model. The assumptions made in developing the conceptual model depend heavily on the geological framework defining the aquifer, and if the conceptual model is inappropriate then subsequent modelling will also be incorrect. Paradoxically, developing a robust conceptual model remains difficult, not least because this 3D paradigm is usually reduced to 2D plans and sections. 3D visualisation software is now available to facilitate the development of the conceptual model, to make the model more robust and defensible, and to assist in demonstrating the hydraulics of the aquifer system. Case studies are presented to demonstrate the role and cost-effectiveness of the visualisation process.

  16. FANSe2: a robust and cost-efficient alignment tool for quantitative next-generation sequencing applications.

    PubMed

    Xiao, Chuan-Le; Mai, Zhi-Biao; Lian, Xin-Lei; Zhong, Jia-Yong; Jin, Jing-Jie; He, Qing-Yu; Zhang, Gong

    2014-01-01

    Correct and bias-free interpretation of deep sequencing data depends on the complete mapping of all mappable reads to the reference sequence, especially for quantitative RNA-seq applications. Seed-based algorithms are generally slow but robust, while Burrows-Wheeler Transform (BWT) based algorithms are fast but less robust. To combine both advantages, we developed the algorithm FANSe2, whose iterative mapping strategy is based on the statistics of real-world sequencing error distributions, substantially accelerating the mapping without compromising accuracy. Its sensitivity and accuracy are higher than those of the BWT-based algorithms in tests using both prokaryotic and eukaryotic sequencing datasets. The gene identification results of FANSe2 are experimentally validated, whereas the previous algorithms yielded false positives and false negatives. FANSe2 showed remarkably better consistency with microarray results than most other algorithms in terms of gene expression quantification. We implemented a scalable and almost maintenance-free parallelization method that can utilize the computational power of multiple office computers, a novel feature not present in any other mainstream algorithm. With three normal office computers, we demonstrated that FANSe2 mapped an RNA-seq dataset generated from an entire Illumina HiSeq 2000 flowcell (8 lanes, 608 M reads) to the masked human genome within 4.1 hours, with higher sensitivity than Bowtie/Bowtie2. FANSe2 thus provides robust accuracy, full indel sensitivity, fast speed, versatile compatibility, and economical computational utilization, making it a useful and practical tool for deep sequencing applications. FANSe2 is freely available at http://bioinformatics.jnu.edu.cn/software/fanse2/.

  17. Inducer analysis/pump model development

    NASA Astrophysics Data System (ADS)

    Cheng, Gary C.

    1994-03-01

    Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked by the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by curved blade passage and the flow through tip leakage, has a strong effect on the inducer flow. Hence, the prediction of the inducer performance critically depends on whether the numerical scheme of the pump model can simulate the secondary flow pattern accurately or not. The impeller and diffuser, however, are dominated by pressure-driven flows such that the effects of turbulence model and grid spacing (except near leading and trailing edges of blades) are less sensitive. The present CFD pump model has been proved to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring small memory), fast turnaround computing time, and versatility for different geometries.

  18. Kendall-Theil Robust Line (KTRLine--version 1.0)-A Visual Basic Program for Calculating and Graphing Robust Nonparametric Estimates of Linear-Regression Coefficients Between Two Continuous Variables

    USGS Publications Warehouse

    Granato, Gregory E.

    2006-01-01

    The Kendall-Theil Robust Line software (KTRLine-version 1.0) is a Visual Basic program that may be used with the Microsoft Windows operating system to calculate parameters for robust, nonparametric estimates of linear-regression coefficients between two continuous variables. The KTRLine software was developed by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, for use in stochastic data modeling with local, regional, and national hydrologic data sets to develop planning-level estimates of potential effects of highway runoff on the quality of receiving waters. The Kendall-Theil robust line was selected because this robust nonparametric method is resistant to the effects of outliers and nonnormality in residuals that commonly characterize hydrologic data sets. The slope of the line is calculated as the median of all possible pairwise slopes between points. The intercept is calculated so that the line will run through the median of input data. A single-line model or a multisegment model may be specified. The program was developed to provide regression equations with an error component for stochastic data generation because nonparametric multisegment regression tools are not available with the software that is commonly used to develop regression models. The Kendall-Theil robust line is a median line and, therefore, may underestimate total mass, volume, or loads unless the error component or a bias correction factor is incorporated into the estimate. Regression statistics such as the median error, the median absolute deviation, the prediction error sum of squares, the root mean square error, the confidence interval for the slope, and the bias correction factor for median estimates are calculated by use of nonparametric methods. These statistics, however, may be used to formulate estimates of mass, volume, or total loads. The program is used to read a two- or three-column tab-delimited input file with variable names in the first row and data in subsequent rows. The user may choose the columns that contain the independent (X) and dependent (Y) variable. A third column, if present, may contain metadata such as the sample-collection location and date. The program screens the input files and plots the data. The KTRLine software is a graphical tool that facilitates development of regression models by use of graphs of the regression line with data, the regression residuals (with X or Y), and percentile plots of the cumulative frequency of the X variable, Y variable, and the regression residuals. The user may individually transform the independent and dependent variables to reduce heteroscedasticity and to linearize data. The program plots the data and the regression line. The program also prints model specifications and regression statistics to the screen. The user may save and print the regression results. The program can accept data sets that contain up to about 15,000 XY data points, but because the program must sort the array of all pairwise slopes, the program may be perceptibly slow with data sets that contain more than about 1,000 points.
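
    Because the estimator is fully specified by the description above, it can be sketched in a few lines; this is an illustrative implementation of the Kendall-Theil (Theil-Sen) line, not the KTRLine program itself:

        # Kendall-Theil robust line: slope = median of all pairwise slopes,
        # intercept chosen so the line passes through (median x, median y).
        import numpy as np

        def kendall_theil_line(x, y):
            n = len(x)
            slopes = [(y[j] - y[i]) / (x[j] - x[i])
                      for i in range(n) for j in range(i + 1, n)
                      if x[j] != x[i]]
            m = np.median(slopes)
            b = np.median(y) - m * np.median(x)   # line through the data medians
            return m, b

        rng = np.random.default_rng(1)
        x = rng.uniform(0, 10, 50)
        y = 2.0 * x + 1.0 + rng.normal(0, 0.5, 50)
        y[:3] += 20                                # outliers barely move the fit
        print(kendall_theil_line(x, y))            # slope stays near 2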

  19. A Leader's Guide to Mathematics Curriculum Topic Study

    ERIC Educational Resources Information Center

    Keeley, Page; Mundry, Susan; Tobey, Cheryl Rose; Carroll, Catherine E.

    2012-01-01

    The Curriculum Topic Study (CTS) process, funded by the National Science Foundation, supports teachers in improving practice by connecting standards and research to curriculum, instruction, and assessment. Designed for facilitators, this guide provides a robust set of professional development tools, templates, and designs to strengthen mathematics…

  1. The art of attrition: development of robust oat microsatellites

    USDA-ARS?s Scientific Manuscript database

    Microsatellite or simple sequence repeat (SSR) markers are important tools for genetic analyses, especially those targeting diversity, based on the fact that multiple alleles can occur at a given locus. Currently, only 160 genomic-based SSR markers are publicly available for oat, most of which have...

  2. RoBuST: an integrated genomics resource for the root and bulb crop families Apiaceae and Alliaceae

    PubMed Central

    2010-01-01

    Background: Root and bulb vegetables (RBV) include carrots, celeriac (root celery), parsnips (Apiaceae), onions, garlic, and leek (Alliaceae)—food crops grown globally and consumed worldwide. Few data analysis platforms are currently available where data collection, annotation and integration initiatives are focused on RBV plant groups. Scientists working on RBV include breeders, geneticists, taxonomists, plant pathologists, and plant physiologists who use genomic data for a wide range of activities including the development of molecular genetic maps, delineation of taxonomic relationships, and investigation of molecular aspects of gene expression in biochemical pathways and disease responses. With genomic data coming from such diverse areas of plant science, availability of a community resource focused on these RBV data types would be of great interest to this scientific community. Description: The RoBuST database has been developed to initiate a platform for collecting and organizing genomic information useful for RBV researchers. The current release of RoBuST contains genomics data for 294 Alliaceae and 816 Apiaceae plant species and has the following features: (1) comprehensive sequence annotations of 3663 genes, 5959 RNAs, 22,723 ESTs and 11,438 regulatory sequence elements from Apiaceae and Alliaceae plant families; (2) graphical tools for visualization and analysis of sequence data; (3) access to traits, biosynthetic pathways, genetic linkage maps and molecular taxonomy data associated with Alliaceae and Apiaceae plants; and (4) a comprehensive plant splice signal repository of 659,369 splice signals collected from 6015 plant species for comparative analysis of plant splicing patterns. Conclusions: RoBuST, available at http://robust.genome.com, provides an integrated platform for researchers to effortlessly explore and analyze genomic data associated with root and bulb vegetables. PMID:20691054

  3. Robust Methods for Moderation Analysis with a Two-Level Regression Model.

    PubMed

    Yang, Miao; Yuan, Ke-Hai

    2016-01-01

    Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
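
    A generic sketch of M-estimation with Huber-type weights, via iteratively reweighted least squares (IRLS), is given below; it illustrates the estimator family only and is not the authors' two-level moderation model or their R program:

        # Huber M-estimation via IRLS: downweight large residuals so that
        # heavy-tailed errors do not dominate the fit.
        import numpy as np

        def huber_irls(X, y, c=1.345, n_iter=50, tol=1e-8):
            beta = np.linalg.lstsq(X, y, rcond=None)[0]             # OLS start
            for _ in range(n_iter):
                r = y - X @ beta
                s = np.median(np.abs(r - np.median(r))) / 0.6745    # robust scale (MAD)
                u = r / (s + 1e-12)
                w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))    # Huber weights
                sw = np.sqrt(w)
                beta_new = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
                if np.max(np.abs(beta_new - beta)) < tol:
                    return beta_new
                beta = beta_new
            return beta

        rng = np.random.default_rng(0)
        X = np.column_stack([np.ones(100), rng.normal(size=100)])
        y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=3, size=100)  # heavy tails
        print(huber_irls(X, y))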

  4. Robust Fault Detection for Aircraft Using Mixed Structured Singular Value Theory and Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Collins, Emmanuel G.

    2000-01-01

    The purpose of fault detection is to identify when a fault or failure has occurred in a system such as an aircraft or expendable launch vehicle. The faults may occur in sensors, actuators, structural components, etc. One of the primary approaches to model-based fault detection relies on analytical redundancy: the output of a computer-based model (actually a state estimator) is compared with the sensor measurements of the actual system to determine when a fault has occurred. Unfortunately, the state estimator is based on an idealized mathematical description of the underlying plant that is never totally accurate. As a result of these modeling errors, false alarms can occur. This research uses mixed structured singular value theory, a relatively recent and powerful robustness analysis tool, to develop robust estimators and demonstrates the use of these estimators in fault detection. To allow qualitative human experience to be effectively incorporated into the detection process, fuzzy logic is used to predict the seriousness of the fault that has occurred.

  5. Towards an Automated Acoustic Detection System for Free Ranging Elephants.

    PubMed

    Zeppelzauer, Matthias; Hensman, Sean; Stoeger, Angela S

    The human-elephant conflict is one of the most serious conservation problems in Asia and Africa today. The involuntary confrontation of humans and elephants claims the lives of many animals and humans every year. A promising approach to alleviate this conflict is the development of an acoustic early warning system. Such a system requires the robust automated detection of elephant vocalizations under unconstrained field conditions. Today, no system exists that fulfills these requirements. In this paper, we present a method for the automated detection of elephant vocalizations that is robust to the diverse noise sources present in the field. We evaluate the method on a dataset recorded under natural field conditions to simulate a real-world scenario. The proposed method outperformed existing approaches and robustly and accurately detected elephants. It thus can form the basis for a future automated early warning system for elephants. Furthermore, the method may be a useful tool for scientists in bioacoustics for the study of wildlife recordings.

  6. A non-disruptive technology for robust 3D tool tracking for ultrasound-guided interventions.

    PubMed

    Mung, Jay; Vignon, Francois; Jain, Ameet

    2011-01-01

    In the past decade ultrasound (US) has become the preferred modality for a number of interventional procedures, offering excellent soft tissue visualization. The main limitation, however, is poor visualization of surgical tools. A new method is proposed for robust 3D tracking and US image enhancement of surgical tools under US guidance. Small US sensors are mounted on existing surgical tools. As the imager emits acoustic energy, the electrical signal from the sensor is analyzed to reconstruct its 3D coordinates. These coordinates can then be used for 3D surgical navigation, similar to current-day tracking systems. A system with real-time 3D tool tracking and image enhancement was implemented on a commercial ultrasound scanner and 3D probe. Extensive water tank experiments with a tracked 0.2 mm sensor show robust performance in a wide range of imaging conditions and tool positions/orientations. The 3D tracking accuracy was 0.36 +/- 0.16 mm throughout the imaging volume of 55 degrees x 27 degrees x 150 mm. Additionally, the tool was successfully tracked inside a beating heart phantom. This paper proposes an image enhancement and tool tracking technology with sub-mm accuracy for US-guided interventions. The technology is non-disruptive, both in terms of existing clinical workflow and commercial considerations, showing promise for large scale clinical impact.

  7. Dynamic robustness of knowledge collaboration network of open source product development community

    NASA Astrophysics Data System (ADS)

    Zhou, Hong-Li; Zhang, Xiao-Dong

    2018-01-01

    As an emergent innovative design style, open source product development communities are characterized by a self-organizing, mass collaborative, networked structure. The robustness of the community is critical to its performance. Using the complex network modeling method, the knowledge collaboration network of the community is formulated, and the robustness of the network is systematically and dynamically studied. The characteristics of the network along the development period determine that its robustness should be studied from three time stages: the start-up, development and mature stages of the network. Five kinds of user-loss pattern are designed, to assess the network's robustness under different situations in each of these three time stages. Two indexes - the largest connected component and the network efficiency - are used to evaluate the robustness of the community. The proposed approach is applied in an existing open source car design community. The results indicate that the knowledge collaboration networks show different levels of robustness in different stages and different user loss patterns. Such analysis can be applied to provide protection strategies for the key users involved in knowledge dissemination and knowledge contribution at different stages of the network, thereby promoting the sustainable and stable development of the open source community.
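
    The two robustness indices can be computed with standard network tools, as in the following sketch; a random graph and random node removal stand in for the community network and the paper's five user-loss patterns:

        # Robustness under node loss: largest connected component (LCC) fraction
        # and global network efficiency, evaluated as users are removed.
        import networkx as nx
        import random

        def robustness_indices(G):
            lcc = max(len(c) for c in nx.connected_components(G)) / G.number_of_nodes()
            eff = nx.global_efficiency(G)
            return lcc, eff

        G = nx.barabasi_albert_graph(200, 2, seed=42)   # stand-in collaboration net
        random.seed(42)
        for frac in (0.0, 0.1, 0.3):
            H = G.copy()
            H.remove_nodes_from(random.sample(list(H.nodes), int(frac * 200)))
            print(f"loss {frac:.0%}: LCC, efficiency = {robustness_indices(H)}")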

  8. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
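
    The Markov reliability calculation can be sketched in the spirit of the classical architecture-based (Cheung-style) model; the transition probabilities and per-component reliabilities below are illustrative, not taken from the case study:

        # Components are transient states; each visit to component i succeeds
        # with reliability R_i, and tool reliability is the probability of
        # absorption in the "success" state, computed via the fundamental matrix.
        import numpy as np

        P = np.array([[0.0, 0.7, 0.3],    # control flow among 3 components
                      [0.0, 0.0, 1.0],
                      [0.0, 0.0, 0.0]])
        R = np.array([0.99, 0.97, 0.98])            # per-component reliabilities
        exit_to_success = np.array([0.0, 0.0, 1.0])  # component 3 ends the run

        Q = R[:, None] * P                # survive component i, then move on
        b = R * exit_to_success           # survive and terminate successfully
        N = np.linalg.inv(np.eye(3) - Q)  # fundamental matrix (I - Q)^-1
        print((N @ b)[0])                 # reliability starting from component 1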

  9. Characterizing challenged Minnesota ballots

    NASA Astrophysics Data System (ADS)

    Nagy, George; Lopresti, Daniel; Barney Smith, Elisa H.; Wu, Ziyan

    2011-01-01

    Photocopies of the ballots challenged in the 2008 Minnesota elections, which constitute a public record, were scanned on a high-speed scanner and made available on a public radio website. The PDF files were downloaded, converted to TIF images, and posted on the PERFECT website. Based on a review of relevant image-processing aspects of paper-based election machinery and on additional statistics and observations on the posted sample data, robust tools were developed for determining the underlying grid of the targets on these ballots regardless of skew, clipping, and other degradations caused by high-speed copying and digitization. The accuracy and robustness of a method based on both index-marks and oval targets are demonstrated on 13,435 challenged ballot page images.

  10. Cell communities and robustness in development.

    PubMed

    Monk, N A

    1997-11-01

    The robustness of patterning events in development is a key feature that must be accounted for in proposed models of these events. When considering explicitly cellular systems, robustness can be exhibited at different levels of organization. Consideration of two widespread patterning mechanisms suggests that robustness at the level of cell communities can result from variable development at the level of individual cells; models of these mechanisms show how interactions between participating cells guarantee community-level robustness. Cooperative interactions enhance homogeneity within communities of like cells and the sharpness of boundaries between communities of distinct cells, while competitive interactions amplify small inhomogeneities within communities of initially equivalent cells, resulting in fine-grained patterns of cell specialization.

  11. OPERA: A free and open source QSAR tool for predicting physicochemical properties and environmental fate endpoints

    EPA Science Inventory

    Collecting the chemical structures and data for necessary QSAR modeling is facilitated by available public databases and open data. However, QSAR model performance is dependent on the quality of data and modeling methodology used. This study developed robust QSAR models for physi...

  12. Yokoi's Theory of Lateral Innovation: Applications for Learning Game Design

    ERIC Educational Resources Information Center

    Warren, Scott J.; Jones, Greg

    2008-01-01

    There are several major challenges for instructional designers seeking to design learning games. These include the lack of access, the cost of rapidly advancing/expensive technology tools that make developing games uneconomical, the institutional time constraints limiting game use, and the concerns that schools lack sufficiently robust computer…

  13. 76 FR 14592 - Safety Management System; Withdrawal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-17

    ...-06A] RIN 2120-AJ15 Safety Management System; Withdrawal AGENCY: Federal Aviation Administration (FAA... (``product/ service providers'') to develop a Safety Management System (SMS). The FAA is withdrawing the... management with a set of robust decision-making tools to use to improve safety. The FAA received 89 comments...

  14. Systematic review of methods for quantifying teamwork in the operating theatre

    PubMed Central

    Marshall, D.; Sykes, M.; McCulloch, P.; Shalhoub, J.; Maruthappu, M.

    2018-01-01

    Background: Teamwork in the operating theatre is becoming increasingly recognized as a major factor in clinical outcomes. Many tools have been developed to measure teamwork. Most fall into two categories: self-assessment by theatre staff and assessment by observers. A critical and comparative analysis of the validity and reliability of these tools is lacking. Methods: MEDLINE and Embase databases were searched following PRISMA guidelines. Content validity was assessed using measurements of inter-rater agreement, predictive validity and multisite reliability, and interobserver reliability using statistical measures of inter-rater agreement and reliability. Quantitative meta-analysis was deemed unsuitable. Results: Forty-eight articles were selected for final inclusion; self-assessment tools were used in 18 and observational tools in 28, and there were two qualitative studies. Self-assessment of teamwork by profession varied with the profession of the assessor. The most robust self-assessment tool was the Safety Attitudes Questionnaire (SAQ), although this failed to demonstrate multisite reliability. The most robust observational tool was the Non-Technical Skills (NOTECHS) system, which demonstrated both test-retest reliability (P > 0.09) and interobserver reliability (Rwg = 0.96). Conclusion: Self-assessment of teamwork by the theatre team was influenced by professional differences. Observational tools, when used by trained observers, circumvented this.

  15. Robustness Analysis and Reliable Flight Regime Estimation of an Integrated Resilient Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2008-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. As a part of the validation process, this paper describes an analysis method for determining a reliable flight regime in the flight envelope within which an integrated resilient control system can achieve the desired performance of tracking command signals and detecting additive faults in the presence of parameter uncertainty and unmodeled dynamics. To calculate a reliable flight regime, a structured singular value analysis method is applied to analyze the closed-loop system over the entire flight envelope. To use the structured singular value analysis method, a linear fractional transformation (LFT) model of the transport aircraft longitudinal dynamics is developed over the flight envelope by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The developed LFT model can capture the original nonlinear dynamics over the flight envelope with the Δ block, which contains the key varying parameters (angle of attack and velocity) and the real parameter uncertainties (aerodynamic coefficient uncertainty and moment of inertia uncertainty). Using the developed LFT model and a formal robustness analysis method, a reliable flight regime is calculated for a transport aircraft closed-loop system.

  16. Benchmarking of a treatment planning system for spot scanning proton therapy: Comparison and analysis of robustness to setup errors of photon IMRT and proton SFUD treatment plans of base of skull meningioma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harding, R., E-mail: ruth.harding2@wales.nhs.uk; Trnková, P.; Lomax, A. J.

    Purpose: Base of skull meningioma can be treated with both intensity modulated radiation therapy (IMRT) and spot scanned proton therapy (PT). One of the main benefits of PT is better sparing of organs at risk, but due to the physical and dosimetric characteristics of protons, spot scanned PT can be more sensitive to the uncertainties encountered in the treatment process compared with photon treatment. Therefore, robustness analysis should be part of a comprehensive comparison between these two treatment methods in order to quantify and understand the sensitivity of the treatment techniques to uncertainties. The aim of this work was to benchmark a spot scanning treatment planning system for planning of base of skull meningioma and to compare the created plans and analyze their robustness to setup errors against the IMRT technique. Methods: Plans were produced for three base of skull meningioma cases: IMRT planned with a commercial TPS [Monaco (Elekta AB, Sweden)]; single field uniform dose (SFUD) spot scanning PT produced with an in-house TPS (PSI-plan); and SFUD spot scanning PT plan created with a commercial TPS [XiO (Elekta AB, Sweden)]. A tool for evaluating robustness to random setup errors was created and, for each plan, both a dosimetric evaluation and a robustness analysis to setup errors were performed. Results: It was possible to create clinically acceptable treatment plans for spot scanning proton therapy of meningioma with a commercially available TPS. However, since each treatment planning system uses different methods, this comparison showed different dosimetric results as well as different sensitivities to setup uncertainties. The results confirmed the necessity of an analysis tool for assessing plan robustness to provide a fair comparison of photon and proton plans. Conclusions: Robustness analysis is a critical part of plan evaluation when comparing IMRT plans with spot scanned proton therapy plans.

  17. Development and Implementation of a Design Metric for Systems Containing Long-Term Fluid Loops

    NASA Technical Reports Server (NTRS)

    Steele, John W.

    2016-01-01

    John Steele, a chemist and technical fellow from United Technologies Corporation, provided a water quality module to assist engineers and scientists with a metric tool to evaluate risks associated with the design of space systems with fluid loops. This design metric is a methodical, quantitative, lessons-learned based means to evaluate the robustness of a long-term fluid loop system design. The tool was developed by a cross-section of engineering disciplines who had decades of experience and problem resolution.

  18. AutoCellSeg: robust automatic colony forming unit (CFU)/cell analysis using adaptive image segmentation and easy-to-use post-editing techniques.

    PubMed

    Khan, Arif Ul Maula; Torelli, Angelo; Wolf, Ivo; Gretz, Norbert

    2018-05-08

    In biological assays, automated cell/colony segmentation and counting is imperative owing to huge image sets. Problems occurring due to drifting image acquisition conditions, background noise and high variation in colony features in experiments demand a user-friendly, adaptive and robust image processing/analysis method. We present AutoCellSeg (based on MATLAB), which implements a supervised, automatic and robust image segmentation method. AutoCellSeg utilizes multi-thresholding aided by a feedback-based watershed algorithm that takes segmentation plausibility criteria into account. It is usable in different operation modes and intuitively enables the user to select object features interactively for supervised image segmentation. It allows the user to correct results with a graphical interface. This publicly available tool outperforms tools like OpenCFU and CellProfiler in terms of accuracy and provides many additional useful features for end-users.
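
    The threshold-plus-watershed idea can be illustrated as follows; AutoCellSeg itself is MATLAB-based, so this Python sketch on a synthetic image only mirrors the general approach:

        # Threshold, distance transform, and marker-based watershed to split
        # touching colonies; the two-blob image is synthetic.
        import numpy as np
        from scipy import ndimage as ndi
        from skimage.filters import threshold_otsu
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        yy, xx = np.mgrid[:80, :80]
        img = np.exp(-((yy - 35)**2 + (xx - 30)**2) / 120)   # colony 1
        img += np.exp(-((yy - 45)**2 + (xx - 50)**2) / 120)  # overlapping colony 2

        mask = img > threshold_otsu(img)
        dist = ndi.distance_transform_edt(mask)
        peaks = peak_local_max(dist, labels=mask, min_distance=5)
        markers = np.zeros_like(mask, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        labels = watershed(-dist, markers, mask=mask)   # split touching colonies
        print("colonies found:", labels.max())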

  19. Robust diagnosis of non-Hodgkin lymphoma phenotypes validated on gene expression data from different laboratories.

    PubMed

    Bhanot, Gyan; Alexe, Gabriela; Levine, Arnold J; Stolovitzky, Gustavo

    2005-01-01

    A major challenge in cancer diagnosis from microarray data is the need for robust, accurate classification models which are independent of the analysis techniques used and can combine data from different laboratories. We propose such a classification scheme, originally developed for phenotype identification from mass spectrometry data. The method uses a robust multivariate gene selection procedure and combines the results of several machine learning tools trained on raw and pattern data to produce an accurate meta-classifier. We illustrate and validate our method by applying it to gene expression datasets: the oligonucleotide HuGeneFL microarray dataset of Shipp et al. (www.genome.wi.mit.du/MPR/lymphoma) and the Hu95Av2 Affymetrix dataset (Dalla Favera's laboratory, Columbia University). Our pattern-based meta-classification technique achieves higher predictive accuracies than each of the individual classifiers, is robust against data perturbations, and provides subsets of related predictive genes. Our techniques predict that combinations of some genes in the p53 pathway are highly predictive of phenotype. In particular, we find that in 80% of DLBCL cases the mRNA level of at least one of the three genes p53, PLK1 and CDK2 is elevated, while in 80% of FL cases, the mRNA level of at most one of them is elevated.

  20. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has used a simulation tool and a simple-pattern spider mask. At the early stage of a device, the estimates of the simulation tool are poor, and evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. To overcome the difficulty of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours. Mass silicon data were handled by statistical processing rather than personal judgment. From the results, robust design rules are successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.

  1. Accumulating Evidence and Research Organization (AERO) model: a new tool for representing, analyzing, and planning a translational research program.

    PubMed

    Hey, Spencer Phillips; Heilig, Charles M; Weijer, Charles

    2013-05-30

    Maximizing efficiency in drug development is important for drug developers, policymakers, and human subjects. Limited funds and the ethical imperative of risk minimization demand that researchers maximize the knowledge gained per patient-subject enrolled. Yet, despite a common perception that the current system of drug development is beset by inefficiencies, there remain few approaches for systematically representing, analyzing, and communicating the efficiency and coordination of the research enterprise. In this paper, we present the first steps toward developing such an approach: a graph-theoretic tool for representing the Accumulating Evidence and Research Organization (AERO) across a translational trajectory. This initial version of the AERO model focuses on elucidating two dimensions of robustness: (1) the consistency of results among studies with an identical or similar outcome metric; and (2) the concordance of results among studies with qualitatively different outcome metrics. The visual structure of the model is a directed acyclic graph, designed to capture these two dimensions of robustness and their relationship to three basic questions that underlie the planning of a translational research program: What is the accumulating state of total evidence? What has been the translational trajectory? What studies should be done next? We demonstrate the utility of the AERO model with an application to a case study involving the antibacterial agent, moxifloxacin, for the treatment of drug-susceptible tuberculosis. We then consider some possible elaborations for the AERO model and propose a number of ways in which the tool could be used to enhance the planning, reporting, and analysis of clinical trials. The AERO model provides an immediate visual representation of the number of studies done at any stage of research, depicting both the robustness of evidence and the relationship of each study to the larger translational trajectory. In so doing, it makes some of the invisible or inchoate properties of the research system explicit - helping to elucidate judgments about the accumulating state of evidence and supporting decision-making for future research.

  2. F-15B QuietSpike(TradeMark) Aeroservoelastic Flight Test Data Analysis

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2007-01-01

    System identification or mathematical modelling is utilised in the aerospace community for the development of simulation models for robust control law design. These models are often described as linear, time-invariant processes and assumed to be uniform throughout the flight envelope. Nevertheless, it is well known that the underlying process is inherently nonlinear. The reason for utilising a linear approach has been due to the lack of a proper set of tools for the identification of nonlinear systems. Over the past several decades the controls and biomedical communities have made great advances in developing tools for the identification of nonlinear systems. These approaches are robust and readily applicable to aerospace systems. In this paper, we show the application of one such nonlinear system identification technique, structure detection, for the analysis of F-15B QuietSpike(TradeMark) aeroservoelastic flight test data. Structure detection is concerned with the selection of a subset of candidate terms that best describe the observed output. This is a necessary procedure to compute an efficient system description which may afford greater insight into the functionality of the system or a simpler controller design. Structure computation as a tool for black-box modelling may be of critical importance for the development of robust, parsimonious models for the flight-test community. Moreover, this approach may lead to efficient strategies for rapid envelope expansion which may save significant development time and costs. The objectives of this study are to demonstrate via analysis of F-15B QuietSpike(TradeMark) aeroservoelastic flight test data for several flight conditions (Mach number) that (i) linear models are inefficient for modelling aeroservoelastic data, (ii) nonlinear identification provides a parsimonious model description whilst providing a high percent fit for cross-validated data and (iii) the model structure and parameters vary as the flight condition is altered.

  3. Implementation of a Low-Thrust Trajectory Optimization Algorithm for Preliminary Design

    NASA Technical Reports Server (NTRS)

    Sims, Jon A.; Finlayson, Paul A.; Rinderle, Edward A.; Vavrina, Matthew A.; Kowalkowski, Theresa D.

    2006-01-01

    A tool developed for the preliminary design of low-thrust trajectories is described. The trajectory is discretized into segments and a nonlinear programming method is used for optimization. The tool is easy to use, has robust convergence, and can handle many intermediate encounters. In addition, the tool has a wide variety of features, including several options for objective function and different low-thrust propulsion models (e.g., solar electric propulsion, nuclear electric propulsion, and solar sail). High-thrust, impulsive trajectories can also be optimized.

  4. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of a structural analysis methodology for predicting the residual strength of fuselage shell-type structures, and development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  5. Cognitive Issues in Learning Advanced Physics: An Example from Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Singh, Chandralekha; Zhu, Guangtian

    2009-11-01

    We are investigating cognitive issues in learning quantum mechanics in order to develop effective teaching and learning tools. The analysis of cognitive issues is particularly important for bridging the gap between the quantitative and conceptual aspects of quantum mechanics and for ensuring that the learning tools help students build a robust knowledge structure. We discuss the cognitive aspects of quantum mechanics that are similar or different from those of introductory physics and their implications for developing strategies to help students develop a good grasp of quantum mechanics.

  6. The effects of ecology and evolutionary history on robust capuchin morphological diversity.

    PubMed

    Wright, Kristin A; Wright, Barth W; Ford, Susan M; Fragaszy, Dorothy; Izar, Patricia; Norconk, Marilyn; Masterson, Thomas; Hobbs, David G; Alfaro, Michael E; Lynch Alfaro, Jessica W

    2015-01-01

    Recent molecular work has confirmed the long-standing morphological hypothesis that capuchins are comprised of two distinct clades, the gracile (untufted) capuchins (genus Cebus, Erxleben, 1777) and the robust (tufted) capuchins (genus Sapajus Kerr, 1792). In the past, the robust group was treated as a single, undifferentiated and cosmopolitan species, with data from all populations lumped together in morphological and ecological studies, obscuring morphological differences that might exist across this radiation. Genetic evidence suggests that the modern radiation of robust capuchins began diversifying ∼2.5 Ma, with significant subsequent geographic expansion into new habitat types. In this study we use a morphological sample of gracile and robust capuchin craniofacial and postcranial characters to examine how ecology and evolutionary history have contributed to morphological diversity within the robust capuchins. We predicted that if ecology is driving robust capuchin variation, three distinct robust morphotypes would be identified: (1) the Atlantic Forest species (Sapajus xanthosternos, S. robustus, and S. nigritus), (2) the Amazonian rainforest species (S. apella, S. cay and S. macrocephalus), and (3) the Cerrado-Caatinga species (S. libidinosus). Alternatively, if diversification time between species pairs predicts degree of morphological difference, we predicted that the recently diverged S. apella, S. macrocephalus, S. libidinosus, and S. cay would be morphologically comparable, with greater variation among the more ancient lineages of S. nigritus, S. xanthosternos, and S. robustus. Our analyses suggest that S. libidinosus has the most derived craniofacial and postcranial features, indicative of inhabiting a more terrestrial niche that includes a dependence on tool use for the extraction of imbedded foods. We also suggest that the cranial robusticity of S. macrocephalus and S. apella are indicative of recent competition with sympatric gracile capuchin species, resulting in character displacement. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Designing Flood Management Systems for Joint Economic and Ecological Robustness

    NASA Astrophysics Data System (ADS)

    Spence, C. M.; Grantham, T.; Brown, C. M.; Poff, N. L.

    2015-12-01

    Freshwater ecosystems across the United States are threatened by hydrologic change caused by water management operations and non-stationary climate trends. Nonstationary hydrology also threatens flood management systems' performance. Ecosystem managers and flood risk managers need tools to design systems that achieve flood risk reduction objectives while sustaining ecosystem functions and services in an uncertain hydrologic future. Robust optimization is used in water resources engineering to guide system design under climate change uncertainty. Using principles introduced by Eco-Engineering Decision Scaling (EEDS), we extend robust optimization techniques to design flood management systems that meet both economic and ecological goals simultaneously across a broad range of future climate conditions. We use three alternative robustness indices to identify flood risk management solutions that preserve critical ecosystem functions in a case study from the Iowa River, where recent severe flooding has tested the limits of the existing flood management system. We seek design modifications to the system that both reduce expected cost of flood damage while increasing ecologically beneficial inundation of riparian floodplains across a wide range of plausible climate futures. The first robustness index measures robustness as the fraction of potential climate scenarios in which both engineering and ecological performance goals are met, implicitly weighting each climate scenario equally. The second index builds on the first by using climate projections to weight each climate scenario, prioritizing acceptable performance in climate scenarios most consistent with climate projections. The last index measures robustness as mean performance across all climate scenarios, but penalizes scenarios with worse performance than average, rewarding consistency. Results stemming from alternate robustness indices reflect implicit assumptions about attitudes toward risk and reveal the tradeoffs between using structural and non-structural flood management strategies to ensure economic and ecological robustness.
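
    The three robustness indices can be made concrete with a toy performance table; the threshold, scenario weights, and below-average penalty below are one plausible reading of the indices, not the authors' exact formulations:

        # Rows = design options, columns = climate scenarios; all values invented.
        import numpy as np

        perf = np.array([[0.9, 0.8, 0.4, 0.7],    # option A across 4 climates
                         [0.7, 0.7, 0.6, 0.7]])   # option B
        weights = np.array([0.1, 0.2, 0.3, 0.4])  # projection-informed weights
        goal = 0.65                               # acceptable-performance threshold

        idx1 = (perf >= goal).mean(axis=1)             # fraction of scenarios OK
        idx2 = ((perf >= goal) * weights).sum(axis=1)  # climate-weighted version
        shortfall = np.clip(perf.mean(axis=1, keepdims=True) - perf, 0, None)
        idx3 = perf.mean(axis=1) - shortfall.mean(axis=1)  # mean, penalizing
        print(idx1, idx2, idx3)                            # below-average scenarios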

  8. Robust regression for large-scale neuroimaging studies.

    PubMed

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. SU-E-T-625: Robustness Evaluation and Robust Optimization of IMPT Plans Based on Per-Voxel Standard Deviation of Dose Distributions.

    PubMed

    Liu, W; Mohan, R

    2012-06-01

    Proton dose distributions, IMPT in particular, are highly sensitive to setup and range uncertainties. We report a novel method, based on the per-voxel standard deviation (SD) of dose distributions, to evaluate the robustness of proton plans and to robustly optimize IMPT plans to render them less sensitive to uncertainties. For each optimization iteration, nine dose distributions are computed - the nominal one, and one each for ± setup uncertainties along the x, y and z axes and for ± range uncertainty. The SD of dose in each voxel is used to create an SD-volume histogram (SVH) for each structure. The SVH may be considered a quantitative representation of the robustness of the dose distribution. For optimization, the desired robustness may be specified in terms of an SD-volume (SV) constraint on the CTV and incorporated as a term in the objective function. Results of optimization with and without this constraint were compared in terms of plan optimality and robustness using the so-called 'worst-case' dose distributions, which are obtained by assigning the lowest among the nine doses to each voxel in the clinical target volume (CTV) and the highest to normal tissue voxels outside the CTV. The SVH curve and the area under it for each structure were used as quantitative measures of robustness. The penalty parameter of the SV constraint may be varied to control the tradeoff between robustness and plan optimality. We applied these methods to one case each of H&N and lung. In both cases, we found that imposing the SV constraint improved plan robustness, but at the cost of normal tissue sparing. SVH-based optimization and evaluation is an effective tool for robustness evaluation and robust optimization of IMPT plans. Studies need to be conducted to test the methods for larger cohorts of patients and for other sites. This research is supported by National Cancer Institute (NCI) grant P01CA021239, the University Cancer Foundation via the Institutional Research Grant program at the University of Texas MD Anderson Cancer Center, and MD Anderson’s cancer center support grant CA016672. © 2012 American Association of Physicists in Medicine.
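
    A schematic Python sketch of the per-voxel SD and SVH computation, with random arrays standing in for the nine scenario dose distributions:

        import numpy as np

        rng = np.random.default_rng(1)
        n_voxels = 5000
        # Nine scenario doses (Gy): nominal plus +/- setup shifts and +/- range.
        doses = rng.normal(60.0, 2.0, size=(9, n_voxels))

        sd = doses.std(axis=0)  # per-voxel SD across the nine scenarios

        # SVH: fraction of the structure's volume with SD above each threshold,
        # analogous to a dose-volume histogram but over SD.
        thresholds = np.linspace(0.0, sd.max(), 50)
        svh = np.array([(sd >= t).mean() for t in thresholds])

        # Area under the SVH curve as a scalar measure (smaller = more robust).
        auc = np.sum(0.5 * (svh[:-1] + svh[1:]) * np.diff(thresholds))
        print(f"SVH area: {auc:.3f}")

        # Worst-case dose for a target structure: lowest of the nine per voxel.
        worst_case_target = doses.min(axis=0)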

  10. An index-based robust decision making framework for watershed management in a changing climate.

    PubMed

    Kim, Yeonjoo; Chung, Eun-Sung

    2014-03-01

    This study developed an index-based robust decision making framework for watershed management dealing with water quantity and quality issues in a changing climate. It consists of two parts: management alternative development and alternative analysis. The first part, alternative development, consists of six steps: 1) to understand the watershed components and processes using the HSPF model, 2) to identify the spatial vulnerability ranking using two indices: potential streamflow depletion (PSD) and potential water quality deterioration (PWQD), 3) to quantify the residents' preferences on water management demands and calculate the watershed evaluation index, which is the weighted combination of PSD and PWQD, 4) to set the quantitative targets for water quantity and quality, 5) to develop a list of feasible alternatives and 6) to eliminate the unacceptable alternatives. The second part, alternative analysis, has three steps: 7) to analyze all selected alternatives with a hydrologic simulation model considering various climate change scenarios, 8) to quantify the alternative evaluation index, including social and hydrologic criteria, utilizing multi-criteria decision analysis methods and 9) to prioritize all options based on a minimax regret strategy for robust decision making. This framework addresses the uncertainty inherent in climate models and climate change scenarios by utilizing the minimax regret strategy, a decision-making strategy for deep uncertainty, and the procedure thus derives a robust prioritization based on the multiple utilities of alternatives from various scenarios. In this study, the proposed procedure was applied to a Korean urban watershed, which has suffered from streamflow depletion and water quality deterioration. Our application shows that the framework provides a useful watershed management tool for incorporating quantitative and qualitative information into the evaluation of various policies with regard to water resource planning and management. Copyright © 2013 Elsevier B.V. All rights reserved.
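
    A worked sketch of step 9, the minimax regret prioritization, with illustrative utility numbers (not from the study):

        import numpy as np

        # rows: management alternatives, columns: climate change scenarios
        utility = np.array([
            [0.70, 0.55, 0.60],
            [0.65, 0.68, 0.52],
            [0.60, 0.62, 0.64],
        ])

        # Regret: shortfall from the best achievable utility in each scenario.
        regret = utility.max(axis=0) - utility

        # Minimax regret: rank alternatives by their worst-case regret.
        worst_regret = regret.max(axis=1)
        ranking = np.argsort(worst_regret)
        print("Priority order of alternatives:", ranking)
        print("Worst-case regrets:", np.round(worst_regret, 3))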

  11. Selective robust optimization: A new intensity-modulated proton therapy optimization strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yupeng; Niemela, Perttu; Siljamaki, Sami

    2015-08-15

    Purpose: To develop a new robust optimization strategy for intensity-modulated proton therapy as an important step in translating robust proton treatment planning from research to clinical applications. Methods: In selective robust optimization, a worst-case-based robust optimization algorithm is extended, and terms of the objective function are selectively computed from either the worst-case dose or the nominal dose. Two lung cancer cases and one head and neck cancer case were used to demonstrate the practical significance of the proposed robust planning strategy. The lung cancer cases had minimal tumor motion of less than 5 mm and, for the demonstration of the methodology, are assumed to be static. Results: Selective robust optimization achieved robust clinical target volume (CTV) coverage and at the same time increased nominal planning target volume coverage to 95.8%, compared to the 84.6% coverage achieved with CTV-based robust optimization in one of the lung cases. In the other lung case, the maximum dose in selective robust optimization was lowered from a dose of 131.3% in the CTV-based robust optimization to 113.6%. Selective robust optimization provided robust CTV coverage in the head and neck case, and at the same time improved control over the isodose distribution so that clinical requirements may be readily met. Conclusions: Selective robust optimization may provide the flexibility and capability necessary for meeting various clinical requirements in addition to achieving the required plan robustness in practical proton treatment planning settings.
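
    A schematic sketch of the selective idea, where the CTV objective term is computed from the worst-case dose and the normal-tissue term from the nominal dose; voxel layout, weights, and dose levels are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(2)
        doses = rng.normal(60.0, 2.5, size=(9, 2000))   # 9 uncertainty scenarios
        nominal = doses[0]
        ctv = slice(0, 800)        # hypothetical CTV voxels
        oar = slice(800, 2000)     # hypothetical normal-tissue voxels

        # Worst case for the target: lowest dose across scenarios per voxel.
        worst_ctv = doses[:, ctv].min(axis=0)
        nominal_oar = nominal[oar]

        prescription, oar_limit = 60.0, 20.0
        # CTV term from worst-case dose (robust coverage); OAR term from the
        # nominal dose (selective, keeps normal-tissue control unblurred).
        f_ctv = np.mean(np.clip(prescription - worst_ctv, 0, None) ** 2)
        f_oar = np.mean(np.clip(nominal_oar - oar_limit, 0, None) ** 2)
        objective = 1.0 * f_ctv + 0.5 * f_oar
        print(f"objective = {objective:.2f}")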

  12. The road map towards providing a robust Raman spectroscopy-based cancer diagnostic platform and integration into clinic

    NASA Astrophysics Data System (ADS)

    Lau, Katherine; Isabelle, Martin; Lloyd, Gavin R.; Old, Oliver; Shepherd, Neil; Bell, Ian M.; Dorney, Jennifer; Lewis, Aaran; Gaifulina, Riana; Rodriguez-Justo, Manuel; Kendall, Catherine; Stone, Nicolas; Thomas, Geraint; Reece, David

    2016-03-01

    Despite its demonstrated potential as an accurate cancer diagnostic tool, Raman spectroscopy (RS) is yet to be adopted by the clinic for histopathology reviews. The Stratified Medicine through Advanced Raman Technologies (SMART) consortium has begun to address some of the hurdles in its adoption for cancer diagnosis. These hurdles include awareness and acceptance of the technology, practicality of integration into the histopathology workflow, data reproducibility and availability of transferable models. Within the consortium, we are jointly developing optimised protocols for tissue sample preparation, data collection and analysis. These protocols will be supported by provision of suitable hardware and software tools to allow statistically sound classification models to be built and transferred for use on different systems. In addition, we are building a validated gastrointestinal (GI) cancers model, which can be trialled as part of the histopathology workflow at hospitals, and a classification tool. At the end of the project, we aim to deliver a robust Raman-based diagnostic platform to enable clinical researchers to stage cancer, define tumour margins, build cancer diagnostic models and discover novel disease biomarkers.

  13. An intercomparison study of TSM, SEBS, and SEBAL using high-resolution imagery and lysimetric data

    USDA-ARS?s Scientific Manuscript database

    Over the past three decades, numerous remote-sensing-based ET mapping algorithms have been developed, providing robust, economical, and efficient tools for ET estimation at field and regional scales. The Two Source Model (TSM), Surface Energy Balance System (SEBS), and Surface Energy Ba...

  14. Assessing Disease Class-Specific Diagnostic Ability: A Practical Adaptive Test Approach.

    ERIC Educational Resources Information Center

    Papa, Frank J.; Schumacker, Randall E.

    Measures of the robustness of disease class-specific diagnostic concepts could play a central role in training programs designed to assure the development of diagnostic competence. In the pilot study, the authors used disease/sign-symptom conditional probability estimates, Monte Carlo procedures, and artificial intelligence (AI) tools to create…

  15. Predictability and Robustness in the Manipulation of Dynamically Complex Objects

    PubMed Central

    Hasson, Christopher J.

    2017-01-01

    Manipulation of complex objects and tools is a hallmark of many activities of daily living, but how the human neuromotor control system interacts with such objects is not well understood. Even the seemingly simple task of transporting a cup of coffee without spilling creates complex interaction forces that humans need to compensate for. Predicting the behavior of an underactuated object with nonlinear fluid dynamics based on an internal model appears daunting. Hence, this research tests the hypothesis that humans learn strategies that make interactions predictable and robust to inaccuracies in neural representations of object dynamics. The task of moving a cup of coffee is modeled with a cart-and-pendulum system that is rendered in a virtual environment, where subjects interact with a virtual cup with a rolling ball inside using a robotic manipulandum. To gain insight into human control strategies, we operationalize predictability and robustness to permit quantitative theory-based assessment. Predictability is quantified by the mutual information between the applied force and the object dynamics; robustness is quantified by the energy margin away from failure. Three studies are reviewed that show how with practice subjects develop movement strategies that are predictable and robust. Alternative criteria, common for free movement, such as maximization of smoothness and minimization of force, do not account for the observed data. As manual dexterity is compromised in many individuals with neurological disorders, the experimental paradigm and its analyses are a promising platform to gain insights into neurological diseases, such as dystonia and multiple sclerosis, as well as healthy aging. PMID:28035560
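
    A minimal sketch of the predictability measure, estimating mutual information between a force signal and a lagged object-state signal with a simple 2-D histogram; the signals are synthetic stand-ins for the cup-and-ball data:

        import numpy as np

        rng = np.random.default_rng(3)
        t = np.linspace(0, 10, 2000)
        force = np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.normal(size=t.size)
        state = np.roll(force, 50) + 0.2 * rng.normal(size=t.size)  # lagged response

        def mutual_information(x, y, bins=32):
            """Histogram-based mutual information estimate in nats."""
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy / pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

        print(f"MI(force; state) = {mutual_information(force, state):.3f} nats")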

  16. Robustness of assembly supply chain networks by considering risk propagation and cascading failure

    NASA Astrophysics Data System (ADS)

    Tang, Liang; Jing, Ke; He, Jie; Stanley, H. Eugene

    2016-10-01

    An assembly supply chain network (ASCN) is composed of manufacturers located in different geographical regions. To analyze the robustness of this ASCN when it suffers from catastrophic disruption events, we construct a cascading failure model of risk propagation. In our model, different disruption scenarios are considered, and a probability equation covering all disruption scenarios is developed. Using production capability loss as the robustness index (RI) of an ASCN, we conduct a numerical simulation to assess its robustness. Through simulation, we compare the network robustness at different values of linking intensity and node threshold and find that weak linking intensity or a high node threshold increases the robustness of the ASCN. We also compare network robustness levels under different disruption scenarios.
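
    A toy sketch of threshold-driven cascading failure on a random directed supply network, with surviving capacity as the robustness index; the network model and parameters are illustrative, not the paper's exact formulation:

        import random
        import networkx as nx

        random.seed(0)
        G = nx.gnp_random_graph(100, 0.05, seed=0, directed=True)
        threshold = 0.5                      # node threshold, per the abstract
        failed = {0, 1, 2}                   # initial disruption scenario

        changed = True
        while changed:
            changed = False
            for node in G.nodes:
                if node in failed:
                    continue
                suppliers = list(G.predecessors(node))
                if suppliers:
                    frac = sum(s in failed for s in suppliers) / len(suppliers)
                    # a node fails when too many of its suppliers have failed
                    if frac > threshold:
                        failed.add(node)
                        changed = True

        # Robustness index: surviving production capability (live-node fraction).
        ri = 1 - len(failed) / G.number_of_nodes()
        print(f"robustness index = {ri:.2f}")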

  17. Developing robust recurrence plot analysis techniques for investigating infant respiratory patterns.

    PubMed

    Terrill, Philip I; Wilson, Stephen; Suresh, Sadasivam; Cooper, David M

    2007-01-01

    Recurrence plot analysis is a useful non-linear analysis tool. However, there are still no well-formalised procedures for carrying out this analysis on measured physiological data, and systematising the analysis is often difficult. In this paper, recurrence-based embedding is compared to radius-based embedding by studying a logistic attractor and measured breathing data collected from sleeping human infants. Recurrence-based embedding appears to be a more robust method of carrying out a recurrence analysis when attractor size is likely to differ between datasets. In the infant breathing data, the radius measure calculated at a fixed recurrence, scaled by the average respiratory period, allows accurate discrimination of active sleep from quiet sleep states (AUC=0.975, Sn=0.98, Sp=0.94).
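
    A minimal sketch of recurrence-based embedding: rather than fixing a radius, choose the radius that yields a target recurrence rate, illustrated on a logistic-map series (one reading of the abstract, not the authors' code):

        import numpy as np

        def radius_at_recurrence(x, m=3, tau=1, target_rate=0.05):
            """Embed series x and return the radius giving the target recurrence rate."""
            n = len(x) - (m - 1) * tau
            emb = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])
            d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            dists = d[np.triu_indices(n, k=1)]      # pairwise distances
            return np.quantile(dists, target_rate)  # radius at fixed recurrence

        # Logistic map as a toy attractor
        x = np.empty(500); x[0] = 0.4
        for i in range(1, 500):
            x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])
        print(f"radius at 5% recurrence: {radius_at_recurrence(x):.4f}")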

  18. Simultaneous Independent Control of Tool Axial Force and Temperature in Friction Stir Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Kenneth A.; Grant, Glenn J.; Darsell, Jens T.

    Maintaining consistent tool depth relative to the part surface is a critical requirement for many friction stir processing (FSP) applications. Force control is often used with the goal of obtaining a constant weld depth. When force control is used, if weld temperature decreases, flow stress increases and the tool is pushed up; if weld temperature increases, flow stress decreases and the tool dives. These variations in tool depth and weld temperature cause various types of weld defects. Robust temperature control for FSP maintains a commanded temperature through control of the spindle axis only. Robust temperature control and force control are completely decoupled in control logic and machine motion. This results in stable temperature, force and tool depth despite the presence of geometric and thermal disturbances. Performance of this control method is presented for various weld paths and alloy systems.

  19. Robust extrema features for time-series data analysis.

    PubMed

    Vemulapalli, Pramod K; Monga, Vishal; Brennan, Sean N

    2013-06-01

    The extraction of robust features for comparing and analyzing time series is a fundamentally important problem. Research efforts in this area encompass dimensionality reduction using popular signal analysis tools such as the discrete Fourier and wavelet transforms, various distance metrics, and the extraction of interest points from time series. Recently, extrema features for the analysis of time-series data have assumed increasing significance because of their natural robustness under a variety of practical distortions, their economy of representation, and their computational benefits. Invariably, the process of encoding extrema features is preceded by filtering of the time series with an intuitively motivated filter (e.g., for smoothing), and subsequent thresholding to identify robust extrema. We define the properties of robustness, uniqueness, and cardinality as a means to identify the design choices available in each step of the feature generation process. Unlike existing methods, which utilize filters "inspired" by either domain knowledge or intuition, we explicitly optimize the filter on training time series to maximize the robustness of the extracted extrema features. We further demonstrate that the underlying filter optimization problem reduces to an eigenvalue problem and has a tractable solution. An encoding technique that enhances control over cardinality and uniqueness is also presented. Experimental results obtained for the problem of time-series subsequence matching establish the merits of the proposed algorithm.
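
    A generic sketch of the extrema-feature pipeline (filter, then threshold); a fixed moving-average filter stands in for the learned, eigenvalue-optimized filter described in the abstract:

        import numpy as np

        def robust_extrema(x, width=5, min_prominence=0.5):
            kernel = np.ones(width) / width
            s = np.convolve(x, kernel, mode="same")          # smoothing filter
            # interior local maxima/minima of the filtered series
            is_max = (s[1:-1] > s[:-2]) & (s[1:-1] > s[2:])
            is_min = (s[1:-1] < s[:-2]) & (s[1:-1] < s[2:])
            idx = np.where(is_max | is_min)[0] + 1
            # threshold: keep extrema that deviate enough from the series mean
            keep = np.abs(s[idx] - s.mean()) >= min_prominence * s.std()
            return idx[keep]

        rng = np.random.default_rng(4)
        t = np.linspace(0, 4 * np.pi, 400)
        x = np.sin(t) + 0.3 * rng.normal(size=t.size)
        print("robust extrema at indices:", robust_extrema(x)[:10])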

  20. Robustness of movement models: can models bridge the gap between temporal scales of data sets and behavioural processes?

    PubMed

    Schlägel, Ulrike E; Lewis, Mark A

    2016-12-01

    Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, the resolution of temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when the resolution of the data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how we can account for these different resolutions in statistical inference results. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
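
    A toy numerical check in the spirit of the framework: estimate mean step length from a simulated walk at full resolution and from the same walk thinned fourfold, then compare after Brownian-style rescaling (illustrative only):

        import numpy as np

        rng = np.random.default_rng(5)
        steps = rng.normal(scale=1.0, size=(5000, 2))
        path = np.cumsum(steps, axis=0)            # fine-resolution positions

        def mean_step(positions):
            return np.linalg.norm(np.diff(positions, axis=0), axis=1).mean()

        fine = mean_step(path)
        coarse = mean_step(path[::4])              # every 4th fix, as if coarser data
        # Brownian-like scaling: step length grows with the square root of the
        # time step, so the coarse estimate divided by sqrt(4) should match.
        print(f"fine={fine:.3f}, coarse/sqrt(4)={coarse / 2:.3f}")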

  1. RiceMetaSys for salt and drought stress responsive genes in rice: a web interface for crop improvement.

    PubMed

    Sandhu, Maninder; Sureshkumar, V; Prakash, Chandra; Dixit, Rekha; Solanke, Amolkumar U; Sharma, Tilak Raj; Mohapatra, Trilochan; S V, Amitha Mithra

    2017-09-30

    Genome-wide microarray has enabled the development of robust databases for functional genomics studies in rice. However, such databases do not directly cater to the needs of breeders. Here, we have attempted to develop a web interface which combines the information from functional genomic studies across different genetic backgrounds with DNA markers so that they can be readily deployed in crop improvement. In the current version of the database, we have included drought and salinity stress studies, since these two are the major abiotic stresses in rice. RiceMetaSys, a user-friendly and freely available web interface, provides comprehensive information on salt responsive genes (SRGs) and drought responsive genes (DRGs) across genotypes, crop development stages and tissues, identified from multiple microarray datasets. 'Physical position search' is an attractive tool for those using a QTL-based approach for dissecting tolerance to salt and drought stress, since it can provide the list of SRGs and DRGs in any physical interval. To identify robust candidate genes for use in crop improvement, the 'common genes across varieties' search tool is useful. Graphical visualization of expression profiles across genes and rice genotypes has been enabled to facilitate the user and to make the comparisons more impactful. Simple Sequence Repeat (SSR) search in the SRGs and DRGs is a valuable tool for fine mapping and marker-assisted selection, since it provides primers for surveys of polymorphism. An external link to intron-specific markers is also provided for this purpose. Bulk retrieval of data without any limit has been enabled in the case of locus and SSR searches. The aim of this database is to provide users with simple and straightforward search options for the identification of robust candidate genes from among thousands of SRGs and DRGs, so as to facilitate linking variation in expression profiles to variation in phenotype. Database URL: http://14.139.229.201.

  2. Transportation Infrastructure Robustness : Joint Engineering and Economic Analysis

    DOT National Transportation Integrated Search

    2017-11-01

    The objectives of this study are to develop a methodology for assessing the robustness of transportation infrastructure facilities and assess the effect of damage to such facilities on travel demand and the facilities users welfare. The robustness...

  3. Robust geostatistical analysis of spatial data

    NASA Astrophysics Data System (ADS)

    Papritz, Andreas; Künsch, Hans Rudolf; Schwierz, Cornelia; Stahel, Werner A.

    2013-04-01

    Most geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are the rule rather than the exception, in particular in environmental data sets. Outliers affect the modelling of the large-scale spatial trend, the estimation of the spatial dependence of the residual variation and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise, because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of the estimating equations for Gaussian REML estimation (Welsh and Richardson, 1997). Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled and non-sampled locations, and kriging variances. Apart from presenting our modelling framework, we shall present selected simulation results by which we explored the properties of the new method. This will be complemented by an analysis of a data set on heavy-metal contamination of the soil in the vicinity of a metal smelter. Marchant, B.P. and Lark, R.M. 2007. Robust estimation of the variogram by residual maximum likelihood. Geoderma 140: 62-72. Richardson, A.M. and Welsh, A.H. 1995. Robust restricted maximum likelihood in mixed linear models. Biometrics 51: 1429-1439. Welsh, A.H. and Richardson, A.M. 1997. Approaches to the robust estimation of mixed models. In: Handbook of Statistics Vol. 15, Elsevier, pp. 343-384.

  4. Intelligent robust control for uncertain nonlinear time-varying systems and its application to robotic systems.

    PubMed

    Chang, Yeong-Chan

    2005-12-01

    This paper addresses the problem of designing adaptive fuzzy-based (or neural network-based) robust controls for a large class of uncertain nonlinear time-varying systems. This class of systems can be perturbed by plant uncertainties, unmodeled perturbations, and external disturbances. A nonlinear H(infinity) control technique, incorporating adaptive control and variable structure control (VSC) techniques, is employed to construct the intelligent robust stabilization controller such that H(infinity) control performance is achieved. The problem of robust tracking control design for uncertain robotic systems is employed to demonstrate the effectiveness of the developed robust stabilization control scheme. Therefore, an intelligent robust tracking controller for uncertain robotic systems in the presence of high-degree uncertainties can easily be implemented. Its solution requires only solving a linear algebraic matrix inequality, and satisfactory transient and asymptotic tracking performance is guaranteed. A simulation example is presented to confirm the performance of the developed control algorithms.

  5. A subagging regression method for estimating the qualitative and quantitative state of groundwater

    NASA Astrophysics Data System (ADS)

    Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young

    2017-08-01

    A subsample aggregating (subagging) regression (SBR) method is proposed for the analysis of groundwater data, with a focus on trend estimation and its associated uncertainty. The SBR method is validated against synthetic data competitively with other conventional robust and non-robust methods. From the results, it is verified that the estimation accuracies of the SBR method are consistent and superior to those of other methods, and the uncertainties are reasonably estimated, whereas the other methods have no uncertainty analysis option. For further validation, actual groundwater data are employed and analyzed comparatively with Gaussian process regression (GPR). For all cases, the trend and the associated uncertainties are reasonably estimated by both SBR and GPR, regardless of Gaussian or non-Gaussian skewed data. However, it is expected that GPR has a limitation in applications to data severely corrupted by outliers, owing to its non-robustness. From the implementations, it is determined that the SBR method has the potential to be further developed as an effective tool for anomaly detection or outlier identification in groundwater state data such as groundwater level and contaminant concentration.
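
    A minimal subagging sketch: fit a simple trend estimator on many random subsamples and aggregate, with the spread of the sub-estimates as the uncertainty measure (synthetic data with injected outliers; not the authors' exact SBR implementation):

        import numpy as np

        rng = np.random.default_rng(6)
        t = np.linspace(0, 10, 200)
        y = 0.3 * t + rng.normal(scale=0.5, size=t.size)
        y[::25] += 5.0                               # inject outliers

        slopes = []
        for _ in range(200):
            idx = rng.choice(t.size, size=t.size // 2, replace=False)
            slope, _ = np.polyfit(t[idx], y[idx], deg=1)
            slopes.append(slope)

        # median aggregation adds robustness; spread quantifies uncertainty
        print(f"trend = {np.median(slopes):.3f} "
              f"(uncertainty SD = {np.std(slopes):.3f})")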

  6. Improving the 'tool box' for robust industrial enzymes.

    PubMed

    Littlechild, J A

    2017-05-01

    The speed of sequencing of microbial genomes and metagenomes is providing an ever-increasing resource for the identification of new robust biocatalysts with industrial applications for many different aspects of industrial biotechnology. Using 'nature's catalysts' provides a sustainable approach to the chemical synthesis of fine chemicals, general chemicals such as surfactants, and new consumer-based materials such as biodegradable plastics. This provides a sustainable and 'green chemistry' route to chemical synthesis which generates no toxic waste and is environmentally friendly. In addition, enzymes can play important roles in other applications such as carbon dioxide capture and the breakdown of food and other waste streams, providing a route to the concept of a 'circular economy' where nothing is wasted. The use of improved bioinformatic approaches and the development of new rapid enzyme activity screening methodology can provide an endless resource for new robust industrial biocatalysts. This mini-review will discuss several recent case studies where industrial enzymes of 'high priority' have been identified and characterised. It will highlight specific hydrolase enzymes and recent case studies which have been carried out within our group in Exeter.

  7. 47 CFR 73.9007 - Robustness requirements for covered demodulator products.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RADIO SERVICES RADIO BROADCAST SERVICES Digital Broadcast Television Redistribution Control § 73.9007...-available tools or equipment also means specialized electronic tools or software tools that are widely... requirements set forth in this subpart. Such specialized electronic tools or software tools includes, but is...

  8. In a quest for engineering acidophiles for biomining applications: challenges and opportunities.

    PubMed

    Gumulya, Yosephine; Boxall, Naomi J; Khaleque, Himel N; Santala, Ville; Carlson, Ross P; Kaksonen, Anna H

    2018-02-21

    Biomining with acidophilic microorganisms has been used at commercial scale for the extraction of metals from various sulfide ores. With metal demand and energy prices on the rise and the concurrent decline in quality and availability of mineral resources, there is an increasing interest in applying biomining technology, in particular for leaching metals from low grade minerals and wastes. However, bioprocessing is often hampered by the presence of inhibitory compounds that originate from complex ores. Synthetic biology could provide tools to improve the tolerance of biomining microbes to various stress factors that are present in biomining environments, which would ultimately increase bioleaching efficiency. This paper reviews the state-of-the-art tools to genetically modify acidophilic biomining microorganisms and the limitations of these tools. The first part of this review discusses resilience pathways that can be engineered in acidophiles to enhance their robustness and tolerance in harsh environments that prevail in bioleaching. The second part of the paper reviews the efforts that have been carried out towards engineering robust microorganisms and developing metabolic modelling tools. Novel synthetic biology tools have the potential to transform the biomining industry and facilitate the extraction of value from ores and wastes that cannot be processed with existing biomining microorganisms.

  9. In a Quest for Engineering Acidophiles for Biomining Applications: Challenges and Opportunities

    PubMed Central

    Gumulya, Yosephine; Boxall, Naomi J; Khaleque, Himel N; Santala, Ville; Carlson, Ross P; Kaksonen, Anna H

    2018-01-01

    Biomining with acidophilic microorganisms has been used at commercial scale for the extraction of metals from various sulfide ores. With metal demand and energy prices on the rise and the concurrent decline in quality and availability of mineral resources, there is an increasing interest in applying biomining technology, in particular for leaching metals from low grade minerals and wastes. However, bioprocessing is often hampered by the presence of inhibitory compounds that originate from complex ores. Synthetic biology could provide tools to improve the tolerance of biomining microbes to various stress factors that are present in biomining environments, which would ultimately increase bioleaching efficiency. This paper reviews the state-of-the-art tools to genetically modify acidophilic biomining microorganisms and the limitations of these tools. The first part of this review discusses resilience pathways that can be engineered in acidophiles to enhance their robustness and tolerance in harsh environments that prevail in bioleaching. The second part of the paper reviews the efforts that have been carried out towards engineering robust microorganisms and developing metabolic modelling tools. Novel synthetic biology tools have the potential to transform the biomining industry and facilitate the extraction of value from ores and wastes that cannot be processed with existing biomining microorganisms. PMID:29466321

  10. Reducing regional vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Reed, Patrick; Trindade, Bernardo; Jonathan, Herman; Harrison, Zeff; Gregory, Characklis

    2016-04-01

    Emerging water scarcity concerns in southeastern US are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify regionally coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative management strategies. Results show that the sampling of deeply uncertain factors in the computational search phase of MORDM can aid in the discovery of management actions that substantially improve the robustness of individual utilities as well as the overall region to water scarcity. Cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management must be explored jointly to decrease robustness conflicts between the utilities. The insights from this work have general merit for regions where adjacent municipalities can benefit from cooperative regional water portfolio planning.

  11. Reducing regional vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Trindade, B. C.; Reed, P. M.; Herman, J. D.; Zeff, H. B.; Characklis, G. W.

    2015-12-01

    Emerging water scarcity concerns in southeastern US are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify regionally coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative management strategies. Results show that the sampling of deeply uncertain factors in the computational search phase of MORDM can aid in the discovery of management actions that substantially improve the robustness of individual utilities as well as of the overall region to water scarcity. Cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management should be explored jointly to decrease robustness conflicts between the utilities. The insights from this work have general merit for regions where adjacent municipalities can benefit from cooperative regional water portfolio planning.

  12. Framework for SEM contour analysis

    NASA Astrophysics Data System (ADS)

    Schneider, L.; Farys, V.; Serret, E.; Fenouillet-Beranger, C.

    2017-03-01

    SEM images provide valuable information about patterning capability. Geometrical properties such as Critical Dimension (CD) can be extracted from them and are used to calibrate OPC models, thus making OPC more robust and reliable. However, there is currently a shortage of appropriate metrology tools to inspect complex two-dimensional patterns in the same way as one would work with simple one-dimensional patterns. In this article we present a full framework for the analysis of SEM images. It has been proven to be fast, reliable and robust for every type of structure, and particularly for two-dimensional structures. To achieve this result, several innovative solutions have been developed and will be presented in the following pages. Firstly, we will present a new noise filter which is used to reduce noise on SEM images, followed by an efficient topography identifier, and finally we will describe the use of a topological skeleton as a measurement tool that can extend CD measurements on all kinds of patterns.

  13. Robust video copy detection approach based on local tangent space alignment

    NASA Astrophysics Data System (ADS)

    Nie, Xiushan; Qiao, Qianping

    2012-04-01

    We propose a robust content-based video copy detection approach based on local tangent space alignment (LTSA), which is an efficient dimensionality reduction algorithm. The approach is motivated by the fact that video content is becoming richer and higher-dimensional, and this high dimensionality does not lend itself to natural tools for video analysis and understanding. The proposed approach reduces the dimensionality of video content using LTSA and then generates video fingerprints in the low-dimensional space for video copy detection. Furthermore, a dynamic sliding window is applied to fingerprint matching. Experimental results show that the video copy detection approach has good robustness and discrimination.
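
    A sketch of the dimensionality-reduction step using scikit-learn's LTSA implementation; the per-frame features and sizes are placeholder assumptions:

        import numpy as np
        from sklearn.manifold import LocallyLinearEmbedding

        rng = np.random.default_rng(7)
        frames = rng.random((300, 64))              # 300 frames, 64-D content features

        ltsa = LocallyLinearEmbedding(n_neighbors=12, n_components=3, method="ltsa")
        fingerprints = ltsa.fit_transform(frames)   # low-dimensional fingerprints
        print(fingerprints.shape)                   # (300, 3)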

  14. A Simple and Robust Method for Culturing Human-Induced Pluripotent Stem Cells in an Undifferentiated State Using Botulinum Hemagglutinin.

    PubMed

    Kim, Mee-Hae; Matsubara, Yoshifumi; Fujinaga, Yukako; Kino-Oka, Masahiro

    2018-02-01

    Clinical and industrial applications of human-induced pluripotent stem cells (hiPSCs) are hindered by the lack of robust culture strategies capable of sustaining a culture in an undifferentiated state. Here, a simple and robust hiPSC-culture-propagation strategy incorporating botulinum hemagglutinin (HA)-mediated selective removal of cells deviating from an undifferentiated state is developed. After HA treatment, cell-cell adhesion is disrupted, and deviated cells detach from the central region of the colony, which subsequently forms tight monolayer colonies following prolonged incubation. The authors find that the temporal and dose-dependent activity of HA regulates deviated-cell removal and recoverability after disruption of cell-cell adhesion in hiPSC colonies. The effects of HA are confirmed under all culture conditions examined, regardless of hiPSC line and feeder-dependent or -free culture conditions. After routine application of our HA-treatment paradigm for serial passages, hiPSCs maintain expression of pluripotent markers and readily form embryoid bodies expressing markers for all three germ-cell layers. This method enables highly efficient culturing of hiPSCs and use of entire undifferentiated portions without having to pick deviated cells manually. This simple and readily reproducible culture strategy is a potentially useful tool for improving the robust and scalable maintenance of undifferentiated hiPSC cultures. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Evaluation of Micronutrient Sensors for Food Matrices in Resource-Limited Settings: A Systematic Narrative Review.

    PubMed

    Waller, Anna W; Lotton, Jennifer L; Gaur, Shashank; Andrade, Jeanette M; Andrade, Juan E

    2018-06-21

    In resource-limited settings, mass food fortification is a common strategy to ensure the population consumes appropriate quantities of essential micronutrients. Food and government organizations in these settings, however, lack tools to monitor the quality and compliance of fortified products and their efficacy in enhancing nutrient status. The World Health Organization has developed general guidelines known as ASSURED (Affordable, Sensitive, Specific, User-friendly, Rapid and Robust, Equipment-free, and Deliverable to end-users) to aid the development of useful diagnostic tools for these settings. These guidelines assume performance aspects such as sufficient accuracy, reliability, and validity. The purpose of this systematic narrative review is to examine the micronutrient sensor literature on its adherence to the ASSURED criteria, along with accuracy, reliability, and validation, when developing micronutrient sensors for resource-limited settings. Keyword searches were conducted in three databases: Web of Science, PubMed, and Scopus, and were based on 6-point inclusion criteria. A 16-question quality assessment tool was developed to determine adherence to quality and performance criteria. Of the 2,365 retrieved studies, 42 sensors were included based on the inclusion/exclusion criteria. Results showed that improvements to current sensor designs are necessary, especially in affordability, user-friendliness, robustness, equipment independence, and deliverability within the ASSURED criteria, and in accuracy and validity among the additional criteria, for the sensors to be useful in resource-limited settings. Although it requires further validation, the 16-question quality assessment tool can be used as a guide in the development of sensors for resource-limited settings. © 2018 Institute of Food Technologists®.

  16. Probing the early development of visual working memory capacity with functional near-infrared spectroscopy

    PubMed Central

    Buss, Aaron T.; Fox, Nicholas; Boas, David A.; Spencer, John P.

    2013-01-01

    Visual working memory (VWM) is a core cognitive system with a highly limited capacity. The present study is the first to examine VWM capacity limits in early development using functional neuroimaging. We recorded optical neuroimaging data while 3- and 4-year-olds completed a change detection task where they detected changes in the shapes of objects after a brief delay. Near-infrared sources and detectors were placed over the following 10–20 positions: F3 and F5 in left frontal cortex, F4 and F6 in right frontal cortex, P3 and P5 in left parietal cortex, and P4 and P6 in right parietal cortex. The first question was whether we would see robust task-specific activation of the frontal-parietal network identified in the adult fMRI literature. This was indeed the case: three left frontal channels and 11 of 12 parietal channels showed a statistically robust difference between the concentration of oxygenated and deoxygenated hemoglobin following the presentation of the sample array. Moreover, four channels in the left hemisphere near P3, P5, and F5 showed a robust increase as the working memory load increased from 1–3 items. Notably, the hemodynamic response did not asymptote at 1–2 items as expected from previous fMRI studies with adults. Finally, 4-year-olds showed a more robust parietal response relative to 3-year-olds, and an increasing sensitivity to the memory load manipulation. These results demonstrate that fNIRS is an effective tool to study the neural processes that underlie the early development of VWM capacity. PMID:23707803

  17. Probing the early development of visual working memory capacity with functional near-infrared spectroscopy.

    PubMed

    Buss, Aaron T; Fox, Nicholas; Boas, David A; Spencer, John P

    2014-01-15

    Visual working memory (VWM) is a core cognitive system with a highly limited capacity. The present study is the first to examine VWM capacity limits in early development using functional neuroimaging. We recorded optical neuroimaging data while 3- and 4-year-olds completed a change detection task where they detected changes in the shapes of objects after a brief delay. Near-infrared sources and detectors were placed over the following 10-20 positions: F3 and F5 in left frontal cortex, F4 and F6 in right frontal cortex, P3 and P5 in left parietal cortex, and P4 and P6 in right parietal cortex. The first question was whether we would see robust task-specific activation of the frontal-parietal network identified in the adult fMRI literature. This was indeed the case: three left frontal channels and 11 of 12 parietal channels showed a statistically robust difference between the concentration of oxygenated and deoxygenated hemoglobin following the presentation of the sample array. Moreover, four channels in the left hemisphere near P3, P5, and F5 showed a robust increase as the working memory load increased from 1 to 3 items. Notably, the hemodynamic response did not asymptote at 1-2 items as expected from previous fMRI studies with adults. Finally, 4-year-olds showed a more robust parietal response relative to 3-year-olds, and an increasing sensitivity to the memory load manipulation. These results demonstrate that fNIRS is an effective tool to study the neural processes that underlie the early development of VWM capacity. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Tools for Understanding Space Weather Impacts to Satellites

    NASA Astrophysics Data System (ADS)

    Green, J. C.; Shprits, Y.; Likar, J. J.; Kellerman, A. C.; Quinn, R. A.; Whelan, P.; Reker, N.; Huston, S. L.

    2017-12-01

    Space weather causes dramatic changes in the near-Earth radiation environment. Intense particle fluxes can damage electronic components on satellites, causing temporary malfunctions, degraded performance, or a complete system/mission loss. Understanding whether space weather is the cause of such problems expedites investigations and guides successful design improvements resulting in a more robust satellite architecture. Here we discuss our progress in developing tools for satellite designers, manufacturers, and decision makers - tools that summarize space weather impacts to specific satellite assets and enable confident identification of the cause and right solution.

  19. A defect-driven diagnostic method for machine tool spindles

    PubMed Central

    Vogl, Gregory W.; Donmez, M. Alkan

    2016-01-01

    Simple vibration-based metrics are, in many cases, insufficient to diagnose machine tool spindle condition. These metrics couple defect-based motion with spindle dynamics; diagnostics should be defect-driven. A new method and spindle condition estimation device (SCED) were developed to acquire data and to separate system dynamics from defect geometry. Based on this method, a spindle condition metric relying only on defect geometry is proposed. Application of the SCED on various milling and turning spindles shows that the new approach is robust for diagnosing the machine tool spindle condition. PMID:28065985

  20. Robust spike classification based on frequency domain neural waveform features.

    PubMed

    Yang, Chenhui; Yuan, Yuan; Si, Jennie

    2013-12-01

    We introduce a new spike classification algorithm based on frequency domain features of the spike snippets. The goal for the algorithm is to provide high classification accuracy, low false misclassification, ease of implementation, robustness to signal degradation, and objectivity in classification outcomes. In this paper, we propose a spike classification algorithm based on frequency domain features (CFDF). It makes use of the frequency domain contents of the recorded neural waveforms for spike classification. The self-organizing map (SOM) is used as a tool to determine the cluster number intuitively and directly by viewing the SOM output map. After that, spike classification can be easily performed using clustering algorithms such as k-means. In conjunction with our previously developed multiscale correlation of wavelet coefficient (MCWC) spike detection algorithm, we show that the MCWC and CFDF detection and classification system is robust when tested on several sets of artificial and real neural waveforms. The CFDF is comparable to or outperforms some popular automatic spike classification algorithms with artificial and real neural data. The detection and classification of neural action potentials or neural spikes is an important step in single-unit-based neuroscientific studies and applications. After the detection of neural snippets potentially containing neural spikes, a robust classification algorithm is applied for the analysis of the snippets to (1) extract similar waveforms into one class for them to be considered as coming from one unit, and to (2) remove noise snippets if they do not contain any features of an action potential. Usually, a snippet is a small 2 or 3 ms segment of the recorded waveform, and differences in neural action potentials can be subtle from one unit to another. Therefore, a robust, high-performance classification system like the CFDF is necessary. In addition, the proposed algorithm does not require any assumptions on the statistical properties of the noise and proves to be robust under noise contamination.
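
    A schematic sketch of frequency-domain spike classification on synthetic snippets, with k-means standing in for the SOM-guided clustering step described in the abstract:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(8)
        t = np.arange(48)                           # ~2 ms snippet (sampling rate assumed)
        unit_a = -np.exp(-(t - 12) ** 2 / 18.0)     # two synthetic spike shapes
        unit_b = np.exp(-(t - 20) ** 2 / 40.0)
        snippets = np.vstack([unit_a + 0.1 * rng.normal(size=(100, 48)),
                              unit_b + 0.1 * rng.normal(size=(100, 48))])

        # frequency-domain features: magnitude spectrum of each snippet
        features = np.abs(np.fft.rfft(snippets, axis=1))
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
        print("cluster sizes:", np.bincount(labels))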

  1. Reducing regional drought vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Trindade, B. C.; Reed, P. M.; Herman, J. D.; Zeff, H. B.; Characklis, G. W.

    2017-06-01

    Emerging water scarcity concerns in many urban regions are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative drought management strategies. Our results show that appropriately designing adaptive risk-of-failure action triggers required stressing them with a comprehensive sample of deeply uncertain factors in the computational search phase of MORDM. Search under the new ensemble of states-of-the-world is shown to fundamentally change perceived performance tradeoffs and substantially improve the robustness of individual utilities as well as the overall region to water scarcity. Search under deep uncertainty enhanced the discovery of how cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management must be employed jointly to improve regional robustness and decrease robustness conflicts between the utilities. Insights from this work have general merit for regions where adjacent municipalities can benefit from cooperative regional water portfolio planning.

  2. Wavelet Applications for Flight Flutter Testing

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty; Freudinger, Lawrence C.

    1999-01-01

    Wavelets present a method for signal processing that may be useful for analyzing responses of dynamical systems. This paper describes several wavelet-based tools that have been developed to improve the efficiency of flight flutter testing. One of the tools uses correlation filtering to identify properties of several modes throughout a flight test for envelope expansion. Another tool uses features in time-frequency representations of responses to characterize nonlinearities in the system dynamics. A third tool uses modulus and phase information from a wavelet transform to estimate modal parameters that can be used to update a linear model and reduce conservatism in robust stability margins.
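
    A minimal sketch of the third tool's idea, tracking a mode's frequency from the modulus of a continuous wavelet transform; it assumes the PyWavelets package and a synthetic damped response rather than flight data:

        import numpy as np
        import pywt

        fs = 200.0
        t = np.arange(0, 5, 1 / fs)
        rng = np.random.default_rng(9)
        # damped 6 Hz mode plus noise, a stand-in for a flutter-test response
        x = np.exp(-0.3 * t) * np.sin(2 * np.pi * 6.0 * t) + 0.05 * rng.normal(size=t.size)

        scales = np.arange(2, 64)
        coef, freqs = pywt.cwt(x, scales, "morl", sampling_period=1 / fs)

        # ridge: scale of maximum modulus at each instant -> modal frequency
        ridge = np.abs(coef).argmax(axis=0)
        print(f"estimated modal frequency ~ {np.median(freqs[ridge]):.1f} Hz")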

  3. Robust Deep Semantics for Language Understanding

    DTIC Science & Technology

    focus on five areas: deep learning, textual inferential relations, relation and event extraction by distant supervision, semantic parsing and...ontology expansion, and coreference resolution. As time went by, the program focus converged towards emphasizing technologies for knowledge base...natural logic methods for text understanding, improved mention coreference algorithms, and the further development of multilingual tools in CoreNLP.

  4. Curriculum Design for Inquiry: Preservice Elementary Teachers' Mobilization and Adaptation of Science Curriculum Materials

    ERIC Educational Resources Information Center

    Forbes, Cory T.; Davis, Elizabeth A.

    2010-01-01

    Curriculum materials are crucial tools with which teachers engage students in science as inquiry. In order to use curriculum materials effectively, however, teachers must develop a robust capacity for pedagogical design, or the ability to mobilize a variety of personal and curricular resources to promote student learning. The purpose of this study…

  5. Effect of interaction strength on robustness of controlling edge dynamics in complex networks

    NASA Astrophysics Data System (ADS)

    Pang, Shao-Peng; Hao, Fei

    2018-05-01

    Robustness plays a critical role in the controllability of complex networks to withstand failures and perturbations. Recent advances in edge controllability show that the interaction strength among edges plays a more important role than network structure. Therefore, we focus on the effect of interaction strength on the robustness of edge controllability. Classifying all edges into three categories to quantify robustness, we develop a universal framework to evaluate and analyze the robustness of edge controllability in complex networks with arbitrary structures and interaction strengths. Applying our framework to a large number of model and real-world networks, we find that the interaction strength is a dominant factor for robustness in undirected networks. Meanwhile, the strongest robustness and the optimal edge controllability in undirected networks can be achieved simultaneously. Different from the case of undirected networks, the robustness in directed networks is determined jointly by the interaction strength and the network's degree distribution. Moreover, a stronger robustness is usually associated with a larger number of driver nodes required to maintain full control in directed networks. This prompts us to provide an optimization method that adjusts the interaction strength to optimize the robustness of edge controllability.

  6. Tail mean and related robust solution concepts

    NASA Astrophysics Data System (ADS)

    Ogryczak, Włodzimierz

    2014-01-01

    Robust optimisation might be viewed as a multicriteria optimisation problem where the objectives correspond to the scenarios, although their probabilities are unknown or imprecise. The simplest robust solution concept represents a conservative approach focused on optimising worst-case scenario results. A softer concept allows one to optimise the tail mean, thus combining performances under multiple worst scenarios. We show that when considering robust models that allow the probabilities to vary only within given intervals, the tail mean represents the robust solution only for upper-bounded probabilities. For arbitrary intervals of probabilities, the corresponding robust solution may be expressed by the optimisation of appropriately combined mean and tail mean criteria, thus remaining easily implementable with auxiliary linear inequalities. Moreover, we use the tail mean concept to develop linear-programming-implementable robust solution concepts related to risk-averse optimisation criteria.
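
    A worked sketch of the tail mean criterion: average each decision's outcomes over its worst k scenarios, optionally blended with the overall mean as the abstract suggests (illustrative numbers):

        import numpy as np

        outcomes = np.array([           # rows: decisions, columns: scenarios
            [8.0, 6.0, 9.0, 3.0],
            [7.0, 7.0, 7.0, 6.5],
            [10.0, 2.0, 9.0, 5.0],
        ])
        k = 2                            # size of the worst-case tail

        # tail mean: average of the k smallest (worst) outcomes per decision
        tail_mean = np.sort(outcomes, axis=1)[:, :k].mean(axis=1)
        # combined mean / tail-mean criterion
        blended = 0.5 * outcomes.mean(axis=1) + 0.5 * tail_mean

        print("tail means:", tail_mean)         # worst-case-averse criterion
        print("robust choice:", np.argmax(blended))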

  7. Robust Fault Detection Using Robust Z1 Estimation and Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Curry, Tramone; Collins, Emmanuel G., Jr.; Selekwa, Majura; Guo, Ten-Huei (Technical Monitor)

    2001-01-01

    This research considers the application of robust Z(sub 1) estimation in conjunction with fuzzy logic to robust fault detection for an aircraft flight control system. It begins with the development of robust Z(sub 1) estimators based on multiplier theory and then develops a fixed-threshold approach to fault detection (FD). It then considers the use of fuzzy logic for robust residual evaluation and FD. Due to modeling errors and unmeasurable disturbances, it is difficult to distinguish between the effects of an actual fault and those caused by uncertainty and disturbance. Hence, it is the aim of a robust FD system to be sensitive to faults while remaining insensitive to uncertainty and disturbances. While fixed thresholds only allow a decision on whether a fault has or has not occurred, it is more valuable to have the residual evaluation lead to a conclusion related to the degree of, or probability of, a fault. Fuzzy logic is a viable means of determining the degree of a fault and allows the introduction of human observations that may not be incorporated in the rigorous threshold theory. Hence, fuzzy logic can provide a more reliable and informative fault detection process. Using an aircraft flight control system, the results of FD using robust Z(sub 1) estimation with a fixed threshold are demonstrated. FD that combines robust Z(sub 1) estimation and fuzzy logic is also demonstrated. It is seen that combining the robust estimator with fuzzy logic proves to be advantageous in increasing the sensitivity to smaller faults while remaining insensitive to uncertainty and disturbances.

  8. FRAIL Questionnaire Screening Tool and Short-Term Outcomes in Geriatric Fracture Patients.

    PubMed

    Gleason, Lauren Jan; Benton, Emily A; Alvarez-Nebreda, M Loreto; Weaver, Michael J; Harris, Mitchel B; Javedan, Houman

    2017-12-01

    There are limited screening tools to predict adverse postoperative outcomes for the geriatric surgical fracture population. Frailty is increasingly recognized as a risk assessment to capture complexity. The goal of this study was to use a short screening tool, the FRAIL scale, to categorize the level of frailty of older adults admitted with a fracture and to determine the association of each frailty category with postoperative and 30-day outcomes. Retrospective cohort study. Level 1 trauma center. A total of 175 consecutive patients over age 70 years admitted to co-managed orthopedic trauma and geriatrics services. The FRAIL scale (short 5-question assessment of fatigue, resistance, aerobic capacity, illnesses, and loss of weight) classified the patients into 3 categories: robust (score = 0), prefrail (score = 1-2), and frail (score = 3-5). Postoperative outcome variables collected were postoperative complications, unplanned intensive care unit admission, length of stay (LOS), discharge disposition, and orthopedic follow-up after surgery. Thirty-day outcomes measured were 30-day readmission and 30-day mortality. One-way analysis of variance and Kruskal-Wallis tests were used to compare continuous variables across the 3 FRAIL categories. Fisher exact tests were used to compare categorical variables. Multiple regression analysis, adjusted by age, sex, and Charlson index, was conducted to study the association between frailty category and outcomes. The FRAIL scale categorized the patients into 3 groups: robust (n = 29), prefrail (n = 73), and frail (n = 73). There were statistically significant differences between groups in terms of age, comorbidity, dementia, functional dependency, polypharmacy, and rate of institutionalization, all higher in the frailest patients. Hip fracture was the most frequent fracture, and it became more frequent as frailty increased (48%, 61%, and 75% in robust, prefrail, and frail groups, respectively). The American Society of Anesthesiologists preoperative risk significantly correlated with the frailty of the patient (American Society of Anesthesiologists score 3-4: 41%, 82%, and 86% in robust, prefrail, and frail groups, P < .001). After adjustment by age, sex, and comorbidity, there was a statistically significant association between frailty and both LOS and the development of any complication after surgery (LOS: 4.2, 5.0, and 7.1 days, P = .002; any complication: 3.4%, 26%, and 39.7%, P = .03; in robust, prefrail, and frail groups). There were also significant differences in discharge disposition (31% of robust vs 4.1% of frail, P = .008) and follow-up completion (97% of robust vs 69% of frail patients). Differences in time to surgery, unplanned intensive care unit admission, and 30-day readmission and mortality, although showing a trend, did not reach statistical significance. Frailty, measured by the FRAIL scale, was associated with increased LOS, complications after surgery, and discharge to a rehabilitation facility in geriatric fracture patients. The FRAIL scale is a promising short screening tool to stratify and help operationalize the perioperative care of older surgical patients. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
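    The scoring rule is simple enough to state in a few lines; the sketch below encodes the categorization exactly as described in the abstract (one point per positive item).

    ```python
    # Sketch: FRAIL scale categorization (0 = robust, 1-2 = prefrail, 3-5 = frail).
    def frail_category(fatigue, resistance, aerobic, illnesses, weight_loss):
        score = sum(map(bool, (fatigue, resistance, aerobic, illnesses, weight_loss)))
        if score == 0:
            return score, "robust"
        return score, "prefrail" if score <= 2 else "frail"

    print(frail_category(True, False, True, True, False))   # (3, 'frail')
    ```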

  9. Construction of a robust, large-scale, collaborative database for raw data in computational chemistry: the Collaborative Chemistry Database Tool (CCDBT).

    PubMed

    Chen, Mingyang; Stott, Amanda C; Li, Shenggang; Dixon, David A

    2012-04-01

    A robust metadata database called the Collaborative Chemistry Database Tool (CCDBT) for massive amounts of computational chemistry raw data has been designed and implemented. It performs data synchronization and simultaneously extracts the metadata. Computational chemistry data in various formats from different computing sources, software packages, and users can be parsed into uniform metadata for storage in a MySQL database. Parsing is performed by a parsing pyramid, including parsers written for different levels of data types and sets created by the parser loader after loading parser engines and configurations. Copyright © 2011 Elsevier Inc. All rights reserved.
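    The "parsing pyramid" is described only at a high level, so the following registry-style sketch is an assumption about its shape, not CCDBT's actual code; the parser names and schema fields are hypothetical.

    ```python
    # Sketch: format-specific parsers normalize heterogeneous raw outputs into one
    # uniform metadata dict, ready for insertion into a MySQL table.
    PARSERS = {}

    def parser(fmt):
        def register(fn):
            PARSERS[fmt] = fn
            return fn
        return register

    @parser("gaussian")
    def parse_gaussian(text):
        # A real parser would extract method, basis set, and energies here.
        return {"code": "Gaussian", "energy_hartree": None, "raw_bytes": len(text)}

    @parser("nwchem")
    def parse_nwchem(text):
        return {"code": "NWChem", "energy_hartree": None, "raw_bytes": len(text)}

    def to_metadata(fmt, text):
        return PARSERS[fmt](text)   # uniform schema regardless of source format

    print(to_metadata("gaussian", "dummy log contents"))
    ```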

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doris, E.; Krasko, V.A.

    State and local policymakers show increasing interest in spurring the development of customer-sited distributed generation (DG), in particular solar photovoltaic (PV) markets. Prompted by that interest, this analysis examines the use of state policy as a tool to support the development of a robust private investment market. This analysis builds on previous studies that focus on government subsidies to reduce installation costs of individual projects and provides an evaluation of the impacts of policies on stimulating private market development.

  11. Panaceas, uncertainty, and the robust control framework in sustainability science

    PubMed Central

    Anderies, John M.; Rodriguez, Armando A.; Janssen, Marco A.; Cifdaloz, Oguzhan

    2007-01-01

    A critical challenge faced by sustainability science is to develop strategies to cope with highly uncertain social and ecological dynamics. This article explores the use of the robust control framework toward this end. After briefly outlining the robust control framework, we apply it to the traditional Gordon–Schaefer fishery model to explore fundamental performance–robustness and robustness–vulnerability trade-offs in natural resource management. We find that the classic optimal control policy can be very sensitive to parametric uncertainty. By exploring a large class of alternative strategies, we show that there are no panaceas: even mild robustness properties are difficult to achieve, and increasing robustness to some parameters (e.g., biological parameters) results in decreased robustness with respect to others (e.g., economic parameters). On the basis of this example, we extract some broader themes for better management of resources under uncertainty and for sustainability science in general. Specifically, we focus attention on the importance of a continual learning process and the use of robust control to inform this process. PMID:17881574
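    For readers unfamiliar with the model named above, the Gordon-Schaefer fishery couples logistic stock growth with a harvest proportional to effort; a standard statement (our notation) follows. The robustness trade-offs concern the sensitivity of the optimal effort to the biological parameters (r, K, q) versus the economic ones (p, c).

    ```latex
    % Gordon-Schaefer bioeconomic model: stock x, effort E, catchability q,
    % intrinsic growth rate r, carrying capacity K, price p, unit effort cost c.
    \begin{align}
      \dot{x} &= r x \left(1 - \frac{x}{K}\right) - q E x, \\
      \pi(E)  &= p\, q E x - c E \qquad \text{(flow of resource rent)}.
    \end{align}
    ```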

  12. SaRAD: a Simple and Robust Abbreviation Dictionary.

    PubMed

    Adar, Eytan

    2004-03-01

    Due to recent interest in the use of textual material to augment traditional experiments it has become necessary to automatically cluster, classify and filter natural language information. The Simple and Robust Abbreviation Dictionary (SaRAD) provides an easy to implement, high performance tool for the construction of a biomedical symbol dictionary. The algorithms, applied to the MEDLINE document set, result in a high quality dictionary and toolset to disambiguate abbreviation symbols automatically.
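    As a flavor of what such a dictionary builder does (and explicitly not SaRAD's published algorithm), the sketch below harvests "long form (SF)" pairs whose short-form letters occur in order inside the long form.

    ```python
    # Sketch: minimal parenthetical abbreviation harvester. Real methods also trim
    # the long form to the shortest span that explains the short form.
    import re

    def candidate_pairs(text):
        for m in re.finditer(r"([\w\- ]{3,80})\s\(([A-Za-z][\w\-]{1,9})\)", text):
            long_form, short = m.group(1).strip(), m.group(2)
            it = iter(long_form.lower())
            if all(ch in it for ch in short.lower()):   # letters appear in order
                yield short, long_form

    text = "We apply the Simple and Robust Abbreviation Dictionary (SaRAD) to MEDLINE."
    print(dict(candidate_pairs(text)))
    ```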

  13. Robust Magnetotelluric Impedance Estimation

    NASA Astrophysics Data System (ADS)

    Sutarno, D.

    2010-12-01

    Robust magnetotelluric (MT) response function estimators are now in standard use in the induction community. Properly devised and applied, they have the ability to reduce the influence of unusual data (outliers). These estimators always yield impedance estimates that are better than conventional least squares (LS) estimation because 'real' MT data almost never satisfy the statistical assumptions of Gaussianity and stationarity upon which normal spectral analysis is based. This paper discusses the development and application to MT data of robust estimation procedures that can be classified as M-estimators. Starting with a description of the estimators, special attention is given to the recent development of bounded-influence robust estimation, including utilization of the Hilbert Transform (HT) operation on causal MT impedance functions. The resulting robust performance is illustrated using synthetic as well as real MT data.
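    The core M-estimation step is easy to sketch. Below is a hedged, numpy-only illustration of Huber-weighted iteratively reweighted least squares for one impedance row, E ≈ Z_xx H_x + Z_xy H_y, at a fixed frequency; the bounded-influence and HT refinements discussed above are not included.

    ```python
    # Sketch: robust (Huber IRLS) impedance fit from many spectral realizations.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    H = rng.standard_normal((n, 2)) + 1j * rng.standard_normal((n, 2))
    Z_true = np.array([0.5 + 0.1j, -2.0 + 0.3j])
    E = H @ Z_true + 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    E[:25] += 5 * rng.standard_normal(25)        # outliers that LS would chase

    def huber_irls(H, E, k=1.5, iters=20):
        Z = np.linalg.lstsq(H, E, rcond=None)[0]          # LS starting point
        for _ in range(iters):
            r = E - H @ Z
            s = np.median(np.abs(r)) / 0.6745 + 1e-12     # robust scale via MAD
            w = np.minimum(1.0, k * s / (np.abs(r) + 1e-12))   # Huber weights
            sw = np.sqrt(w)
            Z = np.linalg.lstsq(sw[:, None] * H, sw * E, rcond=None)[0]
        return Z

    print(huber_irls(H, E))   # close to Z_true despite the contamination
    ```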

  14. Programmable genetic circuits for pathway engineering.

    PubMed

    Hoynes-O'Connor, Allison; Moon, Tae Seok

    2015-12-01

    Synthetic biology has the potential to provide decisive advances in genetic control of metabolic pathways. However, there are several challenges that synthetic biologists must overcome before this vision becomes a reality. First, a library of diverse and well-characterized sensors, such as metabolite-sensing or condition-sensing promoters, must be constructed. Second, robust programmable circuits that link input conditions with a specific gene regulation response must be developed. Finally, multi-gene targeting strategies must be integrated with metabolically relevant sensors and complex, robust logic. Achievements in each of these areas, which employ the CRISPR/Cas system, in silico modeling, and dynamic sensor-regulators, among other tools, provide a strong basis for future research. Overall, the future for synthetic biology approaches in metabolic engineering holds immense promise. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Characterization of performance-emission indices of a diesel engine using ANFIS operating in dual-fuel mode with LPG

    NASA Astrophysics Data System (ADS)

    Chakraborty, Amitav; Roy, Sumit; Banerjee, Rahul

    2018-03-01

    This experimental work highlights the inherent capability of an adaptive neuro-fuzzy inference system (ANFIS) based model to act as a robust system identification tool (SIT) for prognosticating the performance and emission parameters of an existing diesel engine running in diesel-LPG dual-fuel mode. The developed model proved its adeptness by successfully harnessing the effects of the input parameters of load, injection duration and LPG energy share on the output parameters of BSFCEQ, BTE, NOX, SOOT, CO and HC. Successive evaluation of the ANFIS model revealed high levels of resemblance with the previously forecasted ANN results for the same input parameters, and it was evident that, like ANN, ANFIS has the innate ability to act as a robust SIT. The ANFIS-predicted data matched the experimental data with high overall accuracy: the correlation coefficient (R) values ranged from 0.99207 to 0.999988, the mean absolute percentage error (MAPE) values were in the range 0.02-0.173%, and the root mean square errors (RMSE) were within acceptable margins. Hence the developed model is capable of emulating the actual engine parameters with commendable accuracy, which in turn makes it a robust prediction platform for future optimization work.
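    The three quoted figures of merit are standard; for reference, this is how they are computed (numpy; the helper name is ours).

    ```python
    # Sketch: correlation coefficient (R), MAPE, and RMSE for predicted vs. measured
    # engine parameters.
    import numpy as np

    def fit_metrics(y_true, y_pred):
        y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
        r = np.corrcoef(y_true, y_pred)[0, 1]
        mape = 100 * np.mean(np.abs((y_true - y_pred) / y_true))
        rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
        return r, mape, rmse

    print(fit_metrics([210.0, 240.0, 265.0], [211.0, 239.5, 264.2]))
    ```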

  16. Capability and Development Risk Management in System-of-Systems Architectures: A Portfolio Approach to Decision-Making

    DTIC Science & Technology

    2012-04-30

    tool that provides a means of balancing capability development against cost and interdependent risks through the use of modern portfolio theory ...Focardi, 2007; Tutuncu & Cornuejols, 2007) that are extensions of modern portfolio and control theory . The reformulation allows for possible changes...Acquisition: Wave Model context • An Investment Portfolio Approach – Mean Variance Approach – Mean - Variance : A Robust Version • Concept

  17. Zymomonas mobilis as a model system for production of biofuels and biochemicals

    DOE PAGES

    Yang, Shihui; Fei, Qiang; Zhang, Yaoping; ...

    2016-09-15

    Zymomonas mobilis is a natural ethanologen with many desirable industrial biocatalyst characteristics. In this review, we will discuss work to develop Z. mobilis as a model system for biofuel production from the perspectives of substrate utilization, development for industrial robustness, potential product spectrum, strain evaluation and fermentation strategies. Lastly, this review also encompasses perspectives related to classical genetic tools and emerging technologies in this context.

  18. Zymomonas mobilis as a model system for production of biofuels and biochemicals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Shihui; Fei, Qiang; Zhang, Yaoping

    Zymomonas mobilis is a natural ethanologen with many desirable industrial biocatalyst characteristics. In this review, we will discuss work to develop Z. mobilis as a model system for biofuel production from the perspectives of substrate utilization, development for industrial robustness, potential product spectrum, strain evaluation and fermentation strategies. Lastly, this review also encompasses perspectives related to classical genetic tools and emerging technologies in this context.

  19. Robust estimation approach for blind denoising.

    PubMed

    Rabie, Tamer

    2005-11-01

    This work develops a new robust statistical framework for blind image denoising. Robust statistics addresses the problem of estimation when the idealized assumptions about a system are occasionally violated. The contaminating noise in an image is considered as a violation of the assumption of spatial coherence of the image intensities and is treated as an outlier random variable. A denoised image is estimated by fitting a spatially coherent stationary image model to the available noisy data using a robust estimator-based regression method within an optimal-size adaptive window. The robust formulation aims at eliminating the noise outliers while preserving the edge structures in the restored image. Several examples demonstrating the effectiveness of this robust denoising technique are reported and a comparison with other standard denoising filters is presented.

  20. A Robust Approach to Risk Assessment Based on Species Sensitivity Distributions.

    PubMed

    Monti, Gianna S; Filzmoser, Peter; Deutsch, Roland C

    2018-05-03

    The guidelines for setting environmental quality standards are increasingly based on probabilistic risk assessment due to a growing general awareness of the need for probabilistic procedures. One of the commonly used tools in probabilistic risk assessment is the species sensitivity distribution (SSD), which represents the proportion of species affected belonging to a biological assemblage as a function of exposure to a specific toxicant. Our focus is on the inverse use of the SSD curve with the aim of estimating the concentration, HCp, of a toxic compound that is hazardous to p% of the biological community under study. Toward this end, we propose the use of robust statistical methods in order to take into account the presence of outliers or apparent skew in the data, which may occur without any ecological basis. A robust approach exploits the full neighborhood of a parametric model, enabling the analyst to account for the typical real-world deviations from ideal models. We examine two classic HCp estimation approaches and consider robust versions of these estimators. In addition, we also use data transformations in conjunction with robust estimation methods in case of heteroscedasticity. Different scenarios using real data sets as well as simulated data are presented in order to illustrate and compare the proposed approaches. These scenarios illustrate that the use of robust estimation methods enhances HCp estimation. © 2018 Society for Risk Analysis.
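    A minimal numerical rendering of the inverse-SSD idea: fit a log-normal SSD and read HCp off as a low quantile, with a robust variant that swaps mean/std for median/MAD. This is an illustration under simplifying assumptions, not the paper's estimators.

    ```python
    # Sketch: HCp from a log-normal SSD; robust=True damps outlying toxicity values.
    import numpy as np
    from scipy import stats

    def hcp_lognormal(tox_values, p=0.05, robust=True):
        x = np.log(np.asarray(tox_values, float))
        if robust:
            mu = np.median(x)
            sigma = stats.median_abs_deviation(x, scale="normal")   # MAD -> sigma
        else:
            mu, sigma = x.mean(), x.std(ddof=1)
        return float(np.exp(mu + sigma * stats.norm.ppf(p)))

    tox = [1.2, 2.5, 3.1, 4.8, 6.0, 7.7, 9.4, 120.0]   # one suspicious high value
    print(hcp_lognormal(tox), hcp_lognormal(tox, robust=False))
    ```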

  1. Massive-scale gene co-expression network construction and robustness testing using random matrix theory.

    PubMed

    Gibson, Scott M; Ficklin, Stephen P; Isaacson, Sven; Luo, Feng; Feltus, Frank A; Smith, Melissa C

    2013-01-01

    The study of gene relationships and their effect on biological function and phenotype is a focal point in systems biology. Gene co-expression networks built using microarray expression profiles are one technique for discovering and interpreting gene relationships. A knowledge-independent thresholding technique, such as Random Matrix Theory (RMT), is useful for identifying meaningful relationships. Highly connected genes in the thresholded network are then grouped into modules that provide insight into their collective functionality. While it has been shown that co-expression networks are biologically relevant, it has not been determined to what extent any given network is functionally robust given perturbations in the input sample set. Such a test requires hundreds of networks, and hence a tool to construct them rapidly. To examine the functional robustness of networks with varying input, we enhanced an existing RMT implementation for improved scalability and tested the functional robustness of human (Homo sapiens), rice (Oryza sativa) and budding yeast (Saccharomyces cerevisiae) networks. We demonstrate a dramatic decrease in network construction time and computational requirements and show that, despite some variation in global properties between networks, functional similarity remains high. Moreover, the biological function captured by co-expression networks thresholded by RMT is highly robust.

  2. A robust ordering strategy for retailers facing a free shipping option.

    PubMed

    Meng, Qing-chun; Wan, Xiao-le; Rong, Xiao-xia

    2015-01-01

    Free shipping with conditions has become one of the most effective marketing tools available. An increasing number of companies, especially e-businesses, prefer to offer free shipping with some predetermined condition, such as a minimum purchase amount by the customer. However, in practice, the demands of buyers are uncertain; they are often affected by many factors, such as the weather and season. We begin by modeling the centralized ordering problem in which the supplier offers a free shipping service and retailers face stochastic demands. As these random data are considered, only partial information such as the known mean, support, and deviation is needed. The model is then analyzed via a robust optimization method, and the two types of equivalent sets of uncertainty constraints that are obtained provide good mathematical properties with consideration of the robustness of solutions. Subsequently, a numerical example is used to compare the results achieved from a robust optimization method and the linear decision rules. Additionally, the robustness of the optimal solution is discussed, as it is affected by the minimum quantity parameters. The increasing cost-threshold relationship is divided into three periods. In addition, the case study shows that the proposed method achieves better stability as well as computational complexity.

  3. SU-E-T-07: 4DCT Robust Optimization for Esophageal Cancer Using Intensity Modulated Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, L; Department of Industrial Engineering, University of Houston, Houston, TX; Yu, J

    2015-06-15

    Purpose: To develop a 4DCT robust optimization method to reduce the dosimetric impact of respiratory motion in intensity modulated proton therapy (IMPT) for esophageal cancer. Methods: Four esophageal cancer patients were selected for this study. The different phases of CT from a set of 4DCT were incorporated into the worst-case dose distribution robust optimization algorithm. 4DCT robust treatment plans were designed and compared with conventional non-robust plans. Resulting doses were calculated on the average and maximum inhale/exhale phases of the 4DCT. Dose volume histogram (DVH) band graphics and ΔD95%, ΔD98%, ΔD5%, ΔD2% of the CTV between different phases were used to evaluate the robustness of the plans. Results: Compared to IMPT plans optimized using conventional methods, the 4DCT robust IMPT plans achieve the same quality in nominal cases while yielding better robustness to breathing motion. The mean ΔD95%, ΔD98%, ΔD5% and ΔD2% of the CTV are 6%, 3.2%, 0.9% and 1% for the robustly optimized plans vs. 16.2%, 11.8%, 1.6% and 3.3% for the conventional non-robust plans. Conclusion: A 4DCT robust optimization method was proposed for esophageal cancer using IMPT. We demonstrate that 4DCT robust optimization can mitigate the dose deviation caused by diaphragm motion.

  4. Comprehensive validation scheme for in situ fiber optics dissolution method for pharmaceutical drug product testing.

    PubMed

    Mirza, Tahseen; Liu, Qian Julie; Vivilecchia, Richard; Joshi, Yatindra

    2009-03-01

    There has been a growing interest during the past decade in the use of fiber optics dissolution testing. Use of this novel technology is mainly confined to research and development laboratories. It has not yet emerged as a tool for end product release testing despite its ability to generate in situ results and efficiency improvement. One potential reason may be the lack of clear validation guidelines that can be applied for the assessment of suitability of fiber optics. This article describes a comprehensive validation scheme and development of a reliable, robust, reproducible and cost-effective dissolution test using fiber optics technology. The test was successfully applied for characterizing the dissolution behavior of a 40-mg immediate-release tablet dosage form that is under development at Novartis Pharmaceuticals, East Hanover, New Jersey. The method was validated for the following parameters: linearity, precision, accuracy, specificity, and robustness. In particular, robustness was evaluated in terms of probe sampling depth and probe orientation. The in situ fiber optic method was found to be comparable to the existing manual sampling dissolution method. Finally, the fiber optic dissolution test was successfully performed by different operators on different days, to further enhance the validity of the method. The results demonstrate that the fiber optics technology can be successfully validated for end product dissolution/release testing. (c) 2008 Wiley-Liss, Inc. and the American Pharmacists Association

  5. ORAC-DR: Pipelining With Other People's Code

    NASA Astrophysics Data System (ADS)

    Economou, Frossie; Bridger, Alan; Wright, Gillian S.; Jenness, Tim; Currie, Malcolm J.; Adamson, Andy

    As part of the UKIRT ORAC project, we have developed a pipeline (orac-dr) for driving on-line data reduction using existing astronomical packages as algorithm engines and display tools. The design is modular and extensible on several levels, allowing it to be easily adapted to a wide variety of instruments. Here we briefly review the design, discuss the robustness and speed of execution issues inherent in such pipelines, and address what constitutes a desirable (in terms of ``buy-in'' effort) engine or tool.

  6. Robust inference under the beta regression model with application to health care studies.

    PubMed

    Ghosh, Abhik

    2017-01-01

    Data on rates, percentages, or proportions arise frequently in many different applied disciplines like medical biology, health care, psychology, and several others. In this paper, we develop a robust inference procedure for the beta regression model, which is used to describe such response variables taking values in (0, 1) through some related explanatory variables. In relation to the beta regression model, the issue of robustness has been largely ignored in the literature so far. The existing maximum likelihood-based inference has serious lack of robustness against outliers in data and generate drastically different (erroneous) inference in the presence of data contamination. Here, we develop the robust minimum density power divergence estimator and a class of robust Wald-type tests for the beta regression model along with several applications. We derive their asymptotic properties and describe their robustness theoretically through the influence function analyses. Finite sample performances of the proposed estimators and tests are examined through suitable simulation studies and real data applications in the context of health care and psychology. Although we primarily focus on the beta regression models with a fixed dispersion parameter, some indications are also provided for extension to the variable dispersion beta regression models with an application.

  7. Machine learning for inverse lithography: using stochastic gradient descent for robust photomask synthesis

    NASA Astrophysics Data System (ADS)

    Jia, Ningning; Lam, Edmund Y.

    2010-04-01

    Inverse lithography technology (ILT) synthesizes photomasks by solving an inverse imaging problem through optimization of an appropriate functional. Much effort on ILT is dedicated to deriving superior masks at a nominal process condition. However, the lower k1 factor causes the mask to be more sensitive to process variations. Robustness to major process variations, such as focus and dose variations, is desired. In this paper, we consider the focus variation as a stochastic variable, and treat the mask design as a machine learning problem. The stochastic gradient descent approach, which is a useful tool in machine learning, is adopted to train the mask design. Compared with previous work, simulation shows that the proposed algorithm is effective in producing robust masks.
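    The training loop the abstract describes reduces, in caricature, to sampling a focus condition each step and descending the corresponding gradient. The sketch below stands a Gaussian blur in for the aerial-image model (an assumption); because Gaussian blur is self-adjoint, the gradient of ||G_s m - t||^2 is 2 G_s(G_s m - t).

    ```python
    # Sketch: SGD over a stochastic focus variable to train a robust "mask".
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(0)
    target = np.zeros((64, 64)); target[24:40, 20:44] = 1.0   # desired pattern
    mask = target.copy()                                       # initial guess
    lr = 0.5
    for _ in range(300):
        s = rng.uniform(1.0, 2.5)                    # sample a focus condition
        resid = gaussian_filter(mask, s) - target
        mask = np.clip(mask - lr * 2 * gaussian_filter(resid, s), 0.0, 1.0)

    worst = max(np.mean((gaussian_filter(mask, s) - target) ** 2)
                for s in (1.0, 1.75, 2.5))
    print("worst-case MSE across focus:", round(worst, 4))
    ```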

  8. A kriging metamodel-assisted robust optimization method based on a reverse model

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Zhou, Qi; Liu, Congwei; Zhou, Taotao

    2018-02-01

    The goal of robust optimization methods is to obtain a solution that is both optimum and relatively insensitive to uncertainty factors. Most existing robust optimization approaches use outer-inner nested optimization structures where a large amount of computational effort is required because the robustness of each candidate solution delivered from the outer level should be evaluated in the inner level. In this article, a kriging metamodel-assisted robust optimization method based on a reverse model (K-RMRO) is first proposed, in which the nested optimization structure is reduced into a single-loop optimization structure to ease the computational burden. Ignoring the interpolation uncertainties from kriging, K-RMRO may yield non-robust optima. Hence, an improved kriging-assisted robust optimization method based on a reverse model (IK-RMRO) is presented to take the interpolation uncertainty of kriging metamodel into consideration. In IK-RMRO, an objective switching criterion is introduced to determine whether the inner level robust optimization or the kriging metamodel replacement should be used to evaluate the robustness of design alternatives. The proposed criterion is developed according to whether or not the robust status of the individual can be changed because of the interpolation uncertainties from the kriging metamodel. Numerical and engineering cases are used to demonstrate the applicability and efficiency of the proposed approach.

  9. Robust infrared targets tracking with covariance matrix representation

    NASA Astrophysics Data System (ADS)

    Cheng, Jian

    2009-07-01

    Robust infrared target tracking is an important and challenging research topic in many military and security applications, such as infrared imaging guidance, infrared reconnaissance, and scene surveillance. To effectively tackle the nonlinear and non-Gaussian state estimation problems involved, particle filtering is introduced to construct the theoretical framework of infrared target tracking. Under this framework, the observation probabilistic model is one of the main factors determining tracking performance. In order to improve tracking performance, covariance matrices are introduced to represent infrared targets with multiple features. The observation probabilistic model can then be constructed by computing the distance between the reference target's covariance matrix and those of the target samples. Because the covariance matrix provides a natural tool for integrating multiple features, and is scale and illumination independent, target representation with covariance matrices offers strong discriminating ability and robustness. Two experiments, on a sensor ego-motion scene and a sea-clutter scene, demonstrate that the proposed method is effective and robust for different infrared target tracking tasks.
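    The covariance descriptor and a common affine-invariant way to compare two of them can be written compactly; the feature set below (position, intensity, gradient magnitudes) is a typical choice we assume for illustration, not necessarily the paper's.

    ```python
    # Sketch: covariance descriptor of a patch, compared with the generalized-
    # eigenvalue (Forstner-type) metric sqrt(sum_i ln^2 lambda_i(C1, C2)).
    import numpy as np
    from scipy.linalg import eigh

    def cov_descriptor(patch):
        h, w = patch.shape
        ys, xs = np.mgrid[0:h, 0:w]
        dy, dx = np.gradient(patch.astype(float))
        F = np.stack([xs.ravel(), ys.ravel(), patch.ravel(),
                      np.abs(dx).ravel(), np.abs(dy).ravel()])
        return np.cov(F)

    def cov_distance(C1, C2):
        lam = eigh(C1, C2, eigvals_only=True)   # generalized eigenvalues
        return float(np.sqrt(np.sum(np.log(lam) ** 2)))

    rng = np.random.default_rng(1)
    a, b = rng.random((20, 20)), rng.random((20, 20))
    print(cov_distance(cov_descriptor(a), cov_descriptor(b)))
    ```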

  10. Robust QRS peak detection by multimodal information fusion of ECG and blood pressure signals.

    PubMed

    Ding, Quan; Bai, Yong; Erol, Yusuf Bugra; Salas-Boni, Rebeca; Zhang, Xiaorong; Hu, Xiao

    2016-11-01

    QRS peak detection is a challenging problem when ECG signal is corrupted. However, additional physiological signals may also provide information about the QRS position. In this study, we focus on a unique benchmark provided by PhysioNet/Computing in Cardiology Challenge 2014 and Physiological Measurement focus issue: robust detection of heart beats in multimodal data, which aimed to explore robust methods for QRS detection in multimodal physiological signals. A dataset of 200 training and 210 testing records are used, where the testing records are hidden for evaluating the performance only. An information fusion framework for robust QRS detection is proposed by leveraging existing ECG and ABP analysis tools and combining heart beats derived from different sources. Results show that our approach achieves an overall accuracy of 90.94% and 88.66% on the training and testing datasets, respectively. Furthermore, we observe expected performance at each step of the proposed approach, as an evidence of the effectiveness of our approach. Discussion on the limitations of our approach is also provided.

  11. Discrete Walsh Hadamard transform based visible watermarking technique for digital color images

    NASA Astrophysics Data System (ADS)

    Santhi, V.; Thangavelu, Arunkumar

    2011-10-01

    As the size of the Internet grows enormously, illegal manipulation of digital multimedia data has become very easy with the advancement of technology tools. In order to protect such multimedia data from unauthorized access, digital watermarking systems are used. In this paper a new Discrete Walsh Hadamard Transform based visible watermarking system is proposed. As the watermark is embedded in the transform domain, the system is robust to many signal processing attacks. Moreover, in the proposed method the watermark is embedded in a tiling manner across the full range of frequencies to make it robust to compression and cropping attacks. The robustness of the algorithm is tested against noise addition, cropping, compression, histogram equalization and resizing attacks. The experimental results show that the algorithm is robust to common signal processing attacks, and the observed peak signal to noise ratio (PSNR) of the watermarked image varies from 20 to 30 dB depending on the size of the watermark.
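    For concreteness, a blockwise 2D Walsh-Hadamard embedding can look like the sketch below; the attenuation and strength factors (alpha, beta) and the block size are assumptions, not the paper's settings.

    ```python
    # Sketch: visible watermarking in the Walsh-Hadamard domain. Because Hn @ Hn
    # = n * I, the map X -> Hn @ X @ Hn / n is its own inverse, keeping the code short.
    import numpy as np
    from scipy.linalg import hadamard

    n = 8
    Hn = hadamard(n).astype(float)
    wht = lambda X: Hn @ X @ Hn / n

    rng = np.random.default_rng(0)
    host = rng.integers(0, 256, (64, 64)).astype(float)
    mark = rng.integers(0, 2, (64, 64)).astype(float) * 255
    out = np.empty_like(host)
    alpha, beta = 0.95, 0.05            # host attenuation, watermark strength
    for i in range(0, 64, n):
        for j in range(0, 64, n):
            T = alpha * wht(host[i:i+n, j:j+n]) + beta * wht(mark[i:i+n, j:j+n])
            out[i:i+n, j:j+n] = wht(T)  # inverse transform back to pixels

    mse = np.mean((out - host) ** 2)
    print("PSNR (dB):", round(10 * np.log10(255**2 / mse), 1))
    ```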

  12. H(2)- and H(infinity)-design tools for linear time-invariant systems

    NASA Technical Reports Server (NTRS)

    Ly, Uy-Loi

    1989-01-01

    Recent advances in optimal control have brought design techniques based on optimization of H(2) and H(infinity) norm criteria closer to being attractive alternatives to single-loop design methods for linear time-invariant systems. Significant steps forward in this technology are a deeper understanding of the performance and robustness issues of these design procedures and means to perform design trade-offs. However, acceptance of the technology is hindered by the lack of convenient design tools for exercising these powerful multivariable techniques while still allowing single-loop design formulations. Presented is a unique computer tool for designing arbitrary low-order linear time-invariant controllers that encompasses both performance and robustness issues via the familiar H(2) and H(infinity) norm optimization. Application to disturbance rejection design for a commercial transport is demonstrated.

  13. Battery Pack Thermal Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pesaran, Ahmad

    This presentation describes the thermal design of battery packs at the National Renewable Energy Laboratory. A battery thermal management system is essential for xEVs, both for normal operation during daily driving (achieving life and performance) and for off-normal operation during abuse conditions (achieving safety). The battery thermal management system needs to be optimized with the right tools for the lowest cost. Experimental tools such as NREL's isothermal battery calorimeter, thermal imaging, and heat transfer setups are needed. Thermal models and computer-aided engineering tools are useful for robust designs. During abuse conditions, designs should prevent cell-to-cell propagation in a module/pack (i.e., keep the fire small and manageable). NREL's battery ISC device can be used for evaluating the robustness of a module/pack to cell-to-cell propagation.

  14. Beyond evidence-based nursing: tools for practice.

    PubMed

    Jutel, Annemarie

    2008-05-01

    This commentary shares my views of evidence-based nursing as a framework for practice, pointing out its limitations and identifying a wider base of appraisal tools required for making good clinical decisions. As the principles of evidence-based nursing take an increasingly greater hold on nursing education, policy and management, it is important to consider the range of other decision-making tools which are subordinated by this approach. This article summarizes nursing's simultaneous reliance on and critique of evidence-based practice (EBP) in a context of inadequate critical reasoning. It then provides an exemplar of the limitations of evidence-based practice and offers an alternative view of important precepts of decision-making. I identify means by which nurses can develop skills to engage in informed and robust critique of practices and their underpinning rationale. Nurses need to be able to locate and assess useful and reliable information for decision-making. This skill is based on a range of tools which include, but also go beyond EBP including: information literacy, humanities, social sciences, public health, statistics, marketing, ethics and much more. This essay prompts nursing managers to reflect upon whether a flurried enthusiasm to adopt EBP neglects other important decision-making skills which provide an even stronger foundation for robust nursing decisions.

  15. Review of the cultivation program within the National Alliance for Advanced Biofuels and Bioproducts

    DOE PAGES

    Lammers, Peter J.; Huesemann, Michael; Boeing, Wiebke; ...

    2016-12-12

    The cultivation efforts within the National Alliance for Advanced Biofuels and Bioproducts (NAABB) were developed to provide four major goals for the consortium, which included biomass production for downstream experimentation, development of new assessment tools for cultivation, development of new cultivation reactor technologies, and development of methods for robust cultivation. The NAABB consortium testbeds produced over 1500 kg of biomass for downstream processing. The biomass production included a number of model production strains, but also took into production some of the more promising strains found through the prospecting efforts of the consortium. Cultivation efforts at large scale are intensive and costly, therefore the consortium developed tools and models to assess the productivity of strains under various environmental conditions, at lab scale, and validated these against scaled outdoor production systems. Two new pond-based bioreactor designs were tested for their ability to minimize energy consumption while maintaining, and even exceeding, the productivity of algae cultivation compared to traditional systems. Also, molecular markers were developed for quality control and to facilitate detection of bacterial communities associated with cultivated algal species, including the Chlorella spp. pathogen, Vampirovibrio chlorellavorus, which was identified in at least two test site locations in Arizona and New Mexico. Finally, the consortium worked on understanding methods to utilize compromised municipal wastewater streams for cultivation. In conclusion, this review provides an overview of the cultivation methods and tools developed by the NAABB consortium to produce algae biomass, in robust low energy systems, for biofuel production.

  16. Evolution of robustness to damage in artificial 3-dimensional development.

    PubMed

    Joachimczak, Michał; Wróbel, Borys

    2012-09-01

    GReaNs is an Artificial Life platform we have built to investigate the general principles that guide the evolution of multicellular development and of artificial gene regulatory networks. Embryos develop in GReaNs in a continuous 3-dimensional (3D) space with simple physics. Developmental trajectories are indirectly encoded in linear genomes. The genomes are not limited in size and determine the topology of gene regulatory networks that are not limited in the number of nodes. The expression of the genes is continuous and can be modified by adding environmental noise. In this paper we evolved the development of structures with a specific shape (an ellipsoid) and asymmetrical patterning (a 3D pattern inspired by the French flag problem), and investigated the emergence of robustness to damage during development and of robustness to noise. Our results indicate that the two types of robustness are related, and that including noise during evolution promotes higher robustness to damage. Interestingly, we have observed that some evolved gene regulatory networks rely on noise for proper behaviour. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  17. Research and development supporting risk-based wildfire effects prediction for fuels and fire management: Status and needs

    Treesearch

    Kevin Hyde; Matthew B. Dickinson; Gil Bohrer; David Calkin; Louisa Evers; Julie Gilbertson-Day; Tessa Nicolet; Kevin Ryan; Christina Tague

    2013-01-01

    Wildland fire management has moved beyond a singular focus on suppression, calling for wildfire management for ecological benefit where no critical human assets are at risk. Processes causing direct effects and indirect, long-term ecosystem changes are complex and multidimensional. Robust risk-assessment tools are required that account for highly variable effects on...

  18. Development and Application of Flow Duration Curves for Stream Restoration

    DTIC Science & Technology

    2016-02-01

    hydrograph (TNC 2009). Colorado State University’s GeoTools offers an FDC computation focusing on the geomorphic implications of hydrology (Bledsoe...processes • Assessment of changes in stream metabolism using temperature duration curves • Evaluation of pollutant or contaminant transport using...major concern associated with stream restoration projects, due to the many chemical, ecological, and geomorphic advantages a robust riparian buffer

  19. The Grape Remote Sensing Atmospheric Profile and Evapotranspiration eXperiment (GRAPEX): a multidisciplinary project to develop a robust remote sensing-based ET modeling tool for vineyards

    USDA-ARS?s Scientific Manuscript database

    The recent drought in much of California, particularly in the Central Valley region, has caused severe reduction in water reservoir levels and a major depletion of ground water by agriculture. Dramatic improvements in water and irrigation management practices are critical for agriculture to remain s...

  20. Robust time and frequency domain estimation methods in adaptive control

    NASA Technical Reports Server (NTRS)

    Lamaire, Richard Orville

    1987-01-01

    A robust identification method was developed for use in an adaptive control system. The type of estimator is called the robust estimator, since it is robust to the effects of both unmodeled dynamics and an unmeasurable disturbance. The development of the robust estimator was motivated by a need to provide guarantees in the identification part of an adaptive controller. To enable the design of a robust control system, a nominal model as well as a frequency-domain bounding function on the modeling uncertainty associated with this nominal model must be provided. Two estimation methods are presented for finding parameter estimates, and, hence, a nominal model. One of these methods is based on the well developed field of time-domain parameter estimation. In a second method of finding parameter estimates, a type of weighted least-squares fitting to a frequency-domain estimated model is used. The frequency-domain estimator is shown to perform better, in general, than the time-domain parameter estimator. In addition, a methodology for finding a frequency-domain bounding function on the disturbance is used to compute a frequency-domain bounding function on the additive modeling error due to the effects of the disturbance and the use of finite-length data. The performance of the robust estimator in both open-loop and closed-loop situations is examined through the use of simulations.

  1. MULTINEST: an efficient and robust Bayesian inference tool for cosmology and particle physics

    NASA Astrophysics Data System (ADS)

    Feroz, F.; Hobson, M. P.; Bridges, M.

    2009-10-01

    We present further development and the first public release of our multimodal nested sampling algorithm, called MULTINEST. This Bayesian inference tool calculates the evidence, with an associated error estimate, and produces posterior samples from distributions that may contain multiple modes and pronounced (curving) degeneracies in high dimensions. The developments presented here lead to further substantial improvements in sampling efficiency and robustness, as compared to the original algorithm presented in Feroz & Hobson, which itself significantly outperformed existing Markov chain Monte Carlo techniques in a wide range of astrophysical inference problems. The accuracy and economy of the MULTINEST algorithm are demonstrated by application to two toy problems and to a cosmological inference problem focusing on the extension of the vanilla Λ cold dark matter model to include spatial curvature and a varying equation of state for dark energy. The MULTINEST software, which is fully parallelized using MPI and includes an interface to COSMOMC, is available at http://www.mrao.cam.ac.uk/software/multinest/. It will also be released as part of the SUPERBAYES package, for the analysis of supersymmetric theories of particle physics, at http://www.superbayes.org.

  2. Robust consensus algorithm for multi-agent systems with exogenous disturbances under convergence conditions

    NASA Astrophysics Data System (ADS)

    Jiang, Yulian; Liu, Jianchang; Tan, Shubin; Ming, Pingsong

    2014-09-01

    In this paper, a robust consensus algorithm is developed and sufficient conditions for convergence to consensus are proposed for a multi-agent system (MAS) with exogenous disturbances subject to partial information. By utilizing H∞ robust control, differential game theory and a design-based approach, the consensus problem of the MAS with exogenous bounded interference is resolved and the disturbances are restrained simultaneously. Attention is focused on designing an H∞ robust controller (the robust consensus algorithm) based on minimisation of the proposed rational and individual cost functions according to the goals of the MAS. Furthermore, sufficient conditions for convergence of the robust consensus algorithm are given. An example demonstrates that the results are effective and more capable of restraining exogenous disturbances than existing approaches.
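    To fix ideas, the underlying consensus dynamics with a bounded disturbance are easy to simulate; this plain first-order protocol is only the setting, not the paper's H∞/differential-game controller design.

    ```python
    # Sketch: x(t+1) = x(t) - eps * L x(t) + w(t) on a 4-cycle, with |w| bounded.
    import numpy as np

    A = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], float)       # adjacency of the communication graph
    L = np.diag(A.sum(1)) - A                 # graph Laplacian
    x = np.array([4.0, -1.0, 2.5, 0.0])
    eps, rng = 0.2, np.random.default_rng(0)
    for _ in range(200):
        w = 0.05 * rng.uniform(-1, 1, 4)      # bounded exogenous disturbance
        x = x - eps * (L @ x) + w
    print(x)   # states cluster near a common value, jittered by the disturbance
    ```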

  3. Quality by Design: Multidimensional exploration of the design space in high performance liquid chromatography method development for better robustness before validation.

    PubMed

    Monks, K; Molnár, I; Rieger, H-J; Bogáti, B; Szabó, E

    2012-04-06

    Robust HPLC separations lead to fewer analysis failures and better method transfer as well as providing an assurance of quality. This work presents the systematic development of an optimal, robust, fast UHPLC method for the simultaneous assay of two APIs of an eye drop sample and their impurities, in accordance with Quality by Design principles. Chromatography software is employed to effectively generate design spaces (Method Operable Design Regions), which are subsequently employed to determine the final method conditions and to evaluate robustness prior to validation. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Robust Fixed-Structure Controller Synthesis

    NASA Technical Reports Server (NTRS)

    Corrado, Joseph R.; Haddad, Wassim M.; Gupta, Kajal (Technical Monitor)

    2000-01-01

    The ability to develop an integrated control system design methodology for robust, high-performance controllers satisfying multiple design criteria and real-world hardware constraints constitutes a challenging task. The increasingly stringent performance specifications required for controlling such systems necessitate a trade-off between controller complexity and robustness. The principal challenge of minimal-complexity robust control design is to arrive at a tractable control design formulation in spite of the extreme complexity of such systems. Hence, the design of minimal-complexity robust controllers for systems in the face of modeling errors has been a major preoccupation of system and control theorists and practitioners for the past several decades.

  5. PredictSNP: Robust and Accurate Consensus Classifier for Prediction of Disease-Related Mutations

    PubMed Central

    Bendl, Jaroslav; Stourac, Jan; Salanda, Ondrej; Pavelka, Antonin; Wieben, Eric D.; Zendulka, Jaroslav; Brezovsky, Jan; Damborsky, Jiri

    2014-01-01

    Single nucleotide variants represent a prevalent form of genetic variation. Mutations in the coding regions are frequently associated with the development of various genetic diseases. Computational tools for the prediction of the effects of mutations on protein function are very important for analysis of single nucleotide variants and their prioritization for experimental characterization. Many computational tools are already widely employed for this purpose. Unfortunately, their comparison and further improvement is hindered by large overlaps between the training datasets and benchmark datasets, which lead to biased and overly optimistic reported performances. In this study, we have constructed three independent datasets by removing all duplicities, inconsistencies and mutations previously used in the training of evaluated tools. The benchmark dataset containing over 43,000 mutations was employed for the unbiased evaluation of eight established prediction tools: MAPP, nsSNPAnalyzer, PANTHER, PhD-SNP, PolyPhen-1, PolyPhen-2, SIFT and SNAP. The six best performing tools were combined into a consensus classifier, PredictSNP, resulting in significantly improved prediction performance while returning results for all mutations, confirming that consensus prediction represents an accurate and robust alternative to the predictions delivered by individual tools. A user-friendly web interface enables easy access to all eight prediction tools, the consensus classifier PredictSNP and annotations from the Protein Mutant Database and the UniProt database. The web server and the datasets are freely available to the academic community at http://loschmidt.chemi.muni.cz/predictsnp. PMID:24453961
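    A stripped-down consensus combiner conveys the idea; PredictSNP's actual classifier weights each tool by transformed confidence scores, so this unweighted majority vote is only a minimal illustration with made-up votes.

    ```python
    # Sketch: majority vote over per-tool calls (+1 deleterious, -1 neutral, 0 none).
    def consensus(votes):
        called = [v for v in votes.values() if v != 0]
        if not called:
            return "no call"
        s = sum(called)
        return "deleterious" if s > 0 else ("neutral" if s < 0 else "tie")

    votes = {"MAPP": 1, "PhD-SNP": 1, "PolyPhen-2": -1,
             "SIFT": 1, "SNAP": 0, "PANTHER": -1}
    print(consensus(votes))   # deleterious (3 votes to 2)
    ```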

  6. Tools for monitoring system suitability in LC MS/MS centric proteomic experiments.

    PubMed

    Bereman, Michael S

    2015-03-01

    With advances in liquid chromatography coupled to tandem mass spectrometry technologies combined with the continued goals of biomarker discovery, clinical applications of established biomarkers, and integrating large multiomic datasets (i.e. "big data"), there remains an urgent need for robust tools to assess instrument performance (i.e. system suitability) in proteomic workflows. To this end, several freely available tools have been introduced that monitor a number of peptide identification (ID) and/or peptide ID free metrics. Peptide ID metrics include numbers of proteins, peptides, or peptide spectral matches identified from a complex mixture. Peptide ID free metrics include retention time reproducibility, full width half maximum, ion injection times, and integrated peptide intensities. The main driving force in the development of these tools is to monitor both intra- and interexperiment performance variability and to identify sources of variation. The purpose of this review is to summarize and evaluate these tools based on versatility, automation, vendor neutrality, metrics monitored, and visualization capabilities. In addition, the implementation of a robust system suitability workflow is discussed in terms of metrics, type of standard, and frequency of evaluation along with the obstacles to overcome prior to incorporating a more proactive approach to overall quality control in liquid chromatography coupled to tandem mass spectrometry based proteomic workflows. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Development of a single nucleotide polymorphism barcode to genotype Plasmodium vivax infections.

    PubMed

    Baniecki, Mary Lynn; Faust, Aubrey L; Schaffner, Stephen F; Park, Daniel J; Galinsky, Kevin; Daniels, Rachel F; Hamilton, Elizabeth; Ferreira, Marcelo U; Karunaweera, Nadira D; Serre, David; Zimmerman, Peter A; Sá, Juliana M; Wellems, Thomas E; Musset, Lise; Legrand, Eric; Melnikov, Alexandre; Neafsey, Daniel E; Volkman, Sarah K; Wirth, Dyann F; Sabeti, Pardis C

    2015-03-01

    Plasmodium vivax, one of the five species of Plasmodium parasites that cause human malaria, is responsible for 25-40% of malaria cases worldwide. Malaria global elimination efforts will benefit from accurate and effective genotyping tools that will provide insight into the population genetics and diversity of this parasite. The recent sequencing of P. vivax isolates from South America, Africa, and Asia presents a new opportunity by uncovering thousands of novel single nucleotide polymorphisms (SNPs). Genotyping a selection of these SNPs provides a robust, low-cost method of identifying parasite infections through their unique genetic signature or barcode. Based on our experience in generating a SNP barcode for P. falciparum using High Resolution Melting (HRM), we have developed a similar tool for P. vivax. We selected globally polymorphic SNPs from available P. vivax genome sequence data that were located in putatively selectively neutral sites (i.e., intergenic, intronic, or 4-fold degenerate coding). From these candidate SNPs we defined a barcode consisting of 42 SNPs. We analyzed the performance of the 42-SNP barcode on 87 P. vivax clinical samples from parasite populations in South America (Brazil, French Guiana), Africa (Ethiopia) and Asia (Sri Lanka). We found that the P. vivax barcode is robust, as it requires only a small quantity of DNA (limit of detection 0.3 ng/μl) to yield reproducible genotype calls, and detects polymorphic genotypes with high sensitivity. The markers are informative across all clinical samples evaluated (average minor allele frequency > 0.1). Population genetic and statistical analyses show the barcode captures high degrees of population diversity and differentiates geographically distinct populations. Our 42-SNP barcode provides a robust, informative, and standardized genetic marker set that accurately identifies a genomic signature for P. vivax infections.

  8. Development of a Single Nucleotide Polymorphism Barcode to Genotype Plasmodium vivax Infections

    PubMed Central

    Baniecki, Mary Lynn; Faust, Aubrey L.; Schaffner, Stephen F.; Park, Daniel J.; Galinsky, Kevin; Daniels, Rachel F.; Hamilton, Elizabeth; Ferreira, Marcelo U.; Karunaweera, Nadira D.; Serre, David; Zimmerman, Peter A.; Sá, Juliana M.; Wellems, Thomas E.; Musset, Lise; Legrand, Eric; Melnikov, Alexandre; Neafsey, Daniel E.; Volkman, Sarah K.; Wirth, Dyann F.; Sabeti, Pardis C.

    2015-01-01

    Plasmodium vivax, one of the five species of Plasmodium parasites that cause human malaria, is responsible for 25–40% of malaria cases worldwide. Malaria global elimination efforts will benefit from accurate and effective genotyping tools that will provide insight into the population genetics and diversity of this parasite. The recent sequencing of P. vivax isolates from South America, Africa, and Asia presents a new opportunity by uncovering thousands of novel single nucleotide polymorphisms (SNPs). Genotyping a selection of these SNPs provides a robust, low-cost method of identifying parasite infections through their unique genetic signature or barcode. Based on our experience in generating a SNP barcode for P. falciparum using High Resolution Melting (HRM), we have developed a similar tool for P. vivax. We selected globally polymorphic SNPs from available P. vivax genome sequence data that were located in putatively selectively neutral sites (i.e., intergenic, intronic, or 4-fold degenerate coding). From these candidate SNPs we defined a barcode consisting of 42 SNPs. We analyzed the performance of the 42-SNP barcode on 87 P. vivax clinical samples from parasite populations in South America (Brazil, French Guiana), Africa (Ethiopia) and Asia (Sri Lanka). We found that the P. vivax barcode is robust, as it requires only a small quantity of DNA (limit of detection 0.3 ng/μl) to yield reproducible genotype calls, and detects polymorphic genotypes with high sensitivity. The markers are informative across all clinical samples evaluated (average minor allele frequency > 0.1). Population genetic and statistical analyses show the barcode captures high degrees of population diversity and differentiates geographically distinct populations. Our 42-SNP barcode provides a robust, informative, and standardized genetic marker set that accurately identifies a genomic signature for P. vivax infections. PMID:25781890

  9. Automatic Masking for Robust 3D-2D Image Registration in Image-Guided Spine Surgery.

    PubMed

    Ketcha, M D; De Silva, T; Uneri, A; Kleinszig, G; Vogt, S; Wolinsky, J-P; Siewerdsen, J H

    During spinal neurosurgery, patient-specific information, planning, and annotation such as vertebral labels can be mapped from preoperative 3D CT to intraoperative 2D radiographs via image-based 3D-2D registration. Such registration has been shown to provide a potentially valuable means of decision support in target localization as well as quality assurance of the surgical product. However, robust registration can be challenged by mismatch in image content between the preoperative CT and intraoperative radiographs, arising, for example, from anatomical deformation or the presence of surgical tools within the radiograph. In this work, we develop and evaluate methods for automatically mitigating the effect of content mismatch by leveraging the surgical planning data to assign greater weight to anatomical regions known to be reliable for registration and vital to the surgical task while removing problematic regions that are highly deformable or often occluded by surgical tools. We investigated two approaches to assigning variable weight (i.e., "masking") to image content and/or the similarity metric: (1) masking the preoperative 3D CT ("volumetric masking"); and (2) masking within the 2D similarity metric calculation ("projection masking"). The accuracy of registration was evaluated in terms of projection distance error (PDE) in 61 cases selected from an IRB-approved clinical study. The best performing of the masking techniques was found to reduce the rate of gross failure (PDE > 20 mm) from 11.48% to 5.57% in this challenging retrospective data set. These approaches provided robustness to content mismatch and eliminated distinct failure modes of registration. Such improvement was gained without additional workflow and has motivated incorporation of the masking methods within a system under development for prospective clinical studies.

  10. Vertebra identification using template matching model and K-means clustering.

    PubMed

    Larhmam, Mohamed Amine; Benjelloun, Mohammed; Mahmoudi, Saïd

    2014-03-01

    Accurate vertebra detection and segmentation are essential steps for automating the diagnosis of spinal disorders. This study is dedicated to vertebra alignment measurement, the first step in a computer-aided diagnosis tool for cervical spine trauma. Automated vertebral segment alignment determination is a challenging task due to low-contrast imaging and noise. A software tool for segmenting vertebrae and detecting subluxations therefore has clinical significance. A robust method was developed and tested for cervical vertebra identification and segmentation that extracts parameters used for vertebra alignment measurement. Our contribution involves a novel combination of a template matching method and an unsupervised clustering algorithm. In this method, we build a geometric vertebra mean model. To achieve vertebra detection, manual selection of the region of interest is performed initially on the input image. Subsequent preprocessing is done to enhance image contrast and detect edges. Candidate vertebra localization is then carried out using a modified generalized Hough transform (GHT). Next, an adapted cost function is used to compute local voted centers and filter boundary data. Thereafter, a K-means clustering algorithm is applied to obtain the cluster distribution corresponding to the targeted vertebrae. These clusters are combined with the vote parameters to detect vertebra centers. Rigid segmentation is then carried out using the GHT parameters. Finally, cervical spine curves are extracted to measure vertebra alignment. The proposed approach was successfully applied to a set of 66 high-resolution X-ray images. Robust detection was achieved in 97.5% of the 330 tested cervical vertebrae. An automated vertebral identification method was developed and demonstrated to be robust to noise and occlusion. This work presents a first step toward an automated computer-aided diagnosis system for cervical spine trauma detection.
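    The clustering step above is ordinary K-means on the 2D voted centers (one cluster per targeted vertebra); a self-contained Lloyd iteration on synthetic votes illustrates it.

    ```python
    # Sketch: K-means grouping of Hough-vote centers into vertebra candidates.
    import numpy as np

    def kmeans(points, k, iters=50, seed=0):
        rng = np.random.default_rng(seed)
        centers = points[rng.choice(len(points), k, replace=False)]
        for _ in range(iters):
            d2 = ((points[:, None] - centers[None]) ** 2).sum(-1)
            labels = d2.argmin(axis=1)
            centers = np.array([points[labels == j].mean(0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])   # guard empty clusters
        return centers, labels

    rng = np.random.default_rng(1)
    votes = np.vstack([rng.normal((cx, cy), 3.0, (40, 2))
                       for cx, cy in [(50, 60), (55, 110), (60, 160)]])
    centers, _ = kmeans(votes, 3)
    print(np.round(centers, 1))   # recovers the three vertebra centers
    ```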

  11. Automatic masking for robust 3D-2D image registration in image-guided spine surgery

    NASA Astrophysics Data System (ADS)

    Ketcha, M. D.; De Silva, T.; Uneri, A.; Kleinszig, G.; Vogt, S.; Wolinsky, J.-P.; Siewerdsen, J. H.

    2016-03-01

    During spinal neurosurgery, patient-specific information, planning, and annotation such as vertebral labels can be mapped from preoperative 3D CT to intraoperative 2D radiographs via image-based 3D-2D registration. Such registration has been shown to provide a potentially valuable means of decision support in target localization as well as quality assurance of the surgical product. However, robust registration can be challenged by mismatch in image content between the preoperative CT and intraoperative radiographs, arising, for example, from anatomical deformation or the presence of surgical tools within the radiograph. In this work, we develop and evaluate methods for automatically mitigating the effect of content mismatch by leveraging the surgical planning data to assign greater weight to anatomical regions known to be reliable for registration and vital to the surgical task while removing problematic regions that are highly deformable or often occluded by surgical tools. We investigated two approaches to assigning variable weight (i.e., "masking") to image content and/or the similarity metric: (1) masking the preoperative 3D CT ("volumetric masking"); and (2) masking within the 2D similarity metric calculation ("projection masking"). The accuracy of registration was evaluated in terms of projection distance error (PDE) in 61 cases selected from an IRB-approved clinical study. The best performing of the masking techniques was found to reduce the rate of gross failure (PDE > 20 mm) from 11.48% to 5.57% in this challenging retrospective data set. These approaches provided robustness to content mismatch and eliminated distinct failure modes of registration. Such improvement was gained without additional workflow and has motivated incorporation of the masking methods within a system under development for prospective clinical studies.
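
    As a rough illustration of why weighting the similarity metric helps, the sketch below computes a mask-weighted normalized cross-correlation in which unreliable pixels (here, a simulated tool region) are down-weighted. The arrays, mismatch model, and metric choice are assumptions made for the sketch; the paper's registration framework is far more elaborate.

        # Weighted NCC: pixels with zero weight (e.g., tool-occluded regions)
        # do not contribute to the similarity score. All data are synthetic.
        import numpy as np

        def masked_ncc(a, b, weights):
            w = weights / weights.sum()
            mu_a, mu_b = (w * a).sum(), (w * b).sum()
            cov = (w * (a - mu_a) * (b - mu_b)).sum()
            var_a = (w * (a - mu_a) ** 2).sum()
            var_b = (w * (b - mu_b) ** 2).sum()
            return cov / np.sqrt(var_a * var_b)

        rng = np.random.default_rng(1)
        fixed = rng.random((64, 64))
        moving = fixed + 0.1 * rng.normal(size=(64, 64))
        moving[40:, :] = rng.random((24, 64))          # content mismatch (a "tool")
        mask = np.ones((64, 64)); mask[40:, :] = 0.0   # down-weight that region
        print(masked_ncc(moving, fixed, np.ones((64, 64))))  # degraded by mismatch
        print(masked_ncc(moving, fixed, mask))               # recovered by masking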

  12. A pre-operative planning for endoprosthetic human tracheal implantation: a decision support system based on robust design of experiments.

    PubMed

    Trabelsi, O; Villalobos, J L López; Ginel, A; Cortes, E Barrot; Doblaré, M

    2014-05-01

    Swallowing depends on physiological variables that have a decisive influence on the swallowing capacity and on the tracheal stress distribution. Prosthetic implantation modifies these values and the overall performance of the trachea. The objective of this work was to develop a decision support system based on experimental, numerical and statistical approaches, with clinical verification, to help the thoracic surgeon in deciding the position and appropriate dimensions of a Dumon prosthesis for a specific patient in an optimal time and with sufficient robustness. A code for mesh adaptation to any tracheal geometry was implemented and used to develop a robust experimental design, based on the Taguchi's method and the analysis of variance. This design was able to establish the main swallowing influencing factors. The equations to fit the stress and the vertical displacement distributions were obtained. The resulting fitted values were compared to those calculated directly by the finite element method (FEM). Finally, a checking and clinical validation of the statistical study were made, by studying two cases of real patients. The vertical displacements and principal stress distribution obtained for the specific tracheal model were in agreement with those calculated by FE simulations with a maximum absolute error of 1.2 mm and 0.17 MPa, respectively. It was concluded that the resulting decision support tool provides a fast, accurate and simple tool for the thoracic surgeon to predict the stress state of the trachea and the reduction in the ability to swallow after implantation. Thus, it will help them in taking decisions during pre-operative planning of tracheal interventions.

  13. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty during critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe program technology elements' uncertainties can only provide a qualitative and non-descriptive estimate of the technology uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and subsequent impacts on capability, budget, and schedule requirements resulted in the conclusion that an analysis process that coupled a probabilistic analysis technique such as Monte Carlo simulation with quantitative and parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific constraints of these requirements to be established. These probability-of-success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a multi-objective genetic algorithm optimization process and a computer-based decision support system, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition as well as formulation of program development and risk management strategies.
    To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques to provide a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. In order to demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of this method was performed on a notional program for acquiring the Carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of this methodology as well as its limitations that should be addressed in the future to narrow the gap between the current state and the desired state.
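
    To make the probability-of-success idea concrete, the sketch below propagates notional technology uncertainties through a Monte Carlo sample and reports the probability of meeting each requirement constraint. Every distribution and threshold is invented for illustration; the ENTERPRISE methodology uses program-specific models.

        # Monte Carlo estimate of requirements robustness as probabilities of
        # meeting capability, budget, and schedule constraints (notional values).
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        perf = rng.triangular(0.85, 1.00, 1.10, n)   # delivered/required capability
        cost = rng.triangular(0.90, 1.20, 1.80, n)   # development cost growth factor
        sched = rng.triangular(0.95, 1.15, 1.60, n)  # schedule growth factor

        ok = (perf >= 1.0) & (cost <= 1.3) & (sched <= 1.25)
        print("P(capability met)   =", np.mean(perf >= 1.0))
        print("P(within budget)    =", np.mean(cost <= 1.3))
        print("P(on schedule)      =", np.mean(sched <= 1.25))
        print("P(all requirements) =", np.mean(ok))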

  14. RISK COMMUNICATION IN ACTION: THE TOOLS OF MESSAGE MAPPING

    EPA Science Inventory

    Risk Communication in Action: The Tools of Message Mapping is a workbook designed to guide risk communicators in crisis situations. The first part of this workbook will review general guidelines for risk communication. The second part will focus on one of the most robust tools o...

  15. F-15B Quiet Spike(TradeMark) Aeroservoelastic Flight-Test Data Analysis

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2007-01-01

    System identification is utilized in the aerospace community for development of simulation models for robust control law design. These models are often described as linear, time-invariant processes and assumed to be uniform throughout the flight envelope. Nevertheless, it is well known that the underlying process is inherently nonlinear. Over the past several decades the controls and biomedical communities have made great advances in developing tools for the identification of nonlinear systems. In this report, we show the application of one such nonlinear system identification technique, structure detection, for the analysis of Quiet Spike(TradeMark) (Gulfstream Aerospace Corporation, Savannah, Georgia) aeroservoelastic flight-test data. Structure detection is concerned with the selection of a subset of candidate terms that best describe the observed output. Structure computation as a tool for black-box modeling may be of critical importance for the development of robust, parsimonious models for the flight-test community. The objectives of this study are to demonstrate, via analysis of Quiet Spike(TradeMark) aeroservoelastic flight-test data for several flight conditions, that: linear models are inefficient for modelling aeroservoelastic data; nonlinear identification provides a parsimonious model description whilst providing a high percent fit for cross-validated data; and the model structure and parameters vary as the flight condition is altered.
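
    Structure detection, as described above, selects from a library of candidate terms the few that best explain the measured output. The toy sketch below applies one common flavor, forward selection by residual correlation, to synthetic input-output data; the term library, data, and truth model are invented, and the report's actual algorithm may differ.

        # Forward selection over a candidate term library: repeatedly add the
        # unused term most correlated with the residual, then refit by least squares.
        import numpy as np

        rng = np.random.default_rng(7)
        u = rng.uniform(-1, 1, 500)
        y = 0.8 * u + 0.4 * u**2 + 0.01 * rng.normal(size=500)  # "measured" output

        library = {"1": np.ones_like(u), "u": u, "u^2": u**2, "u^3": u**3}
        selected, residual = [], y.copy()
        for _ in range(2):  # keep the two most explanatory terms
            name = max((k for k in library if k not in selected),
                       key=lambda k: abs(library[k] @ residual) / np.linalg.norm(library[k]))
            selected.append(name)
            X = np.column_stack([library[k] for k in selected])
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            residual = y - X @ coef
        print(selected, np.round(coef, 2))  # expect ['u', 'u^2'] near [0.8, 0.4]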

  16. How robust is a robust policy? A comparative analysis of alternative robustness metrics for supporting robust decision analysis.

    NASA Astrophysics Data System (ADS)

    Kwakkel, Jan; Haasnoot, Marjolijn

    2015-04-01

    In response to climate and socio-economic change, in various policy domains there is increasingly a call for robust plans or policies, that is, plans or policies that perform well in a very large range of plausible futures. In the literature, a wide range of alternative robustness metrics can be found. The relative merit of these alternative conceptualizations of robustness has, however, received less attention. Evidently, different robustness metrics can result in different plans or policies being adopted. This paper investigates the consequences of several robustness metrics on decision making, illustrated here by the design of a flood risk management plan. A fictitious case, inspired by a river reach in the Netherlands, is used. The performance of this system in terms of casualties, damages, and costs for flood and damage mitigation actions is explored using a time horizon of 100 years, and accounting for uncertainties pertaining to climate change and land use change. A set of candidate policy options is specified up front. This set of options includes dike raising, dike strengthening, creating more space for the river, and flood-proof building and evacuation options. The overarching aim is to design an effective flood risk mitigation strategy that is designed from the outset to be adapted over time in response to how the future actually unfolds. To this end, the plan will be based on the dynamic adaptive policy pathway approach (Haasnoot, Kwakkel et al. 2013) being used in the Dutch Delta Program. The policy problem is formulated as a multi-objective robust optimization problem (Kwakkel, Haasnoot et al. 2014). We solve the multi-objective robust optimization problem using several alternative robustness metrics, including both satisficing robustness metrics and regret-based robustness metrics. Satisficing robustness metrics focus on the performance of candidate plans across a large ensemble of plausible futures. Regret-based robustness metrics compare the performance of a candidate plan with the performance of other candidate plans across a large ensemble of plausible futures. Initial results suggest that the simplest satisficing metric, inspired by the signal-to-noise ratio, results in very risk-averse solutions. Other satisficing metrics, which handle the average performance and the dispersion around the average separately, provide substantial additional insights into the trade-off between the average performance and the dispersion around this average. In contrast, the regret-based metrics enhance insight into the relative merits of candidate plans, while being less clear on the average performance or the dispersion around this performance. These results suggest that it is beneficial to use multiple robustness metrics when doing a robust decision analysis study. Haasnoot, M., J. H. Kwakkel, W. E. Walker and J. Ter Maat (2013). "Dynamic Adaptive Policy Pathways: A New Method for Crafting Robust Decisions for a Deeply Uncertain World." Global Environmental Change 23(2): 485-498. Kwakkel, J. H., M. Haasnoot and W. E. Walker (2014). "Developing Dynamic Adaptive Policy Pathways: A computer-assisted approach for developing adaptive strategies for a deeply uncertain world." Climatic Change.
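
    The satisficing-versus-regret distinction can be made concrete by scoring hypothetical plans over an ensemble of simulated futures with both a signal-to-noise metric and a minimax-regret metric, as below. The plan names, score distributions, and parameters are invented and bear no relation to the study's cases.

        # Two robustness-metric families over an ensemble of plausible futures:
        # satisficing (mean/std, higher better) vs. minimax regret (lower better).
        import numpy as np

        rng = np.random.default_rng(3)
        futures = 1000
        plans = {
            "dike raising":   rng.normal(0.70, 0.05, futures),
            "room for river": rng.normal(0.75, 0.15, futures),
            "evacuation":     rng.normal(0.60, 0.02, futures),
        }
        scores = np.column_stack(list(plans.values()))

        sn = scores.mean(axis=0) / scores.std(axis=0)
        regret = (scores.max(axis=1, keepdims=True) - scores).max(axis=0)

        for name, s, r in zip(plans, sn, regret):
            print(f"{name:>14}: signal/noise={s:5.1f}  max regret={r:.2f}")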

  17. Improving near-infrared prediction model robustness with support vector machine regression: a pharmaceutical tablet assay example.

    PubMed

    Igne, Benoît; Drennen, James K; Anderson, Carl A

    2014-01-01

    Changes in raw materials and process wear and tear can have significant effects on the prediction error of near-infrared calibration models. When the variability that is present during routine manufacturing is not included in the calibration, test, and validation sets, the long-term performance and robustness of the model will be limited. Nonlinearity is a major source of interference. In near-infrared spectroscopy, nonlinearity can arise from light path-length differences that can come from differences in particle size or density. The usefulness of support vector machine (SVM) regression to handle nonlinearity and improve the robustness of calibration models was evaluated in scenarios where the calibration set did not include all the variability present in the test set. Compared to partial least squares (PLS) regression, SVM regression was less affected by physical (particle size) and chemical (moisture) differences. The linearity of the SVM-predicted values was also improved. Nevertheless, although visualization and interpretation tools have been developed to enhance the usability of SVM-based methods, work is yet to be done to provide chemometricians in the pharmaceutical industry with a regression method that can supplement PLS-based methods.
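
    A compact way to see the PLS-versus-SVM contrast is to fit both to a calibration problem with a deliberately nonlinear response, as in the scikit-learn sketch below. The data generator and model settings are invented; they do not reproduce the paper's tablets or spectra.

        # PLS vs. RBF-kernel SVR on synthetic data with a nonlinear response term.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 50))
        y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=200)
        Xtr, Xte, ytr, yte = X[:150], X[150:], y[:150], y[150:]

        pls = PLSRegression(n_components=5).fit(Xtr, ytr)
        svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)).fit(Xtr, ytr)
        print("PLS R^2:", round(pls.score(Xte, yte), 3))
        print("SVR R^2:", round(svr.score(Xte, yte), 3))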

  18. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  19. Using Velocity Anisotropy to Analyze Magnetohydrodynamic Turbulence in Giant Molecular Clouds

    NASA Astrophysics Data System (ADS)

    Madrid, Alecio; Hernandez, Audra

    2018-01-01

    Structure function (SF) analysis is a strong tool for gauging the Alfvénic properties of magnetohydrodynamic (MHD) simulations, yet there is a lack of literature rigorously investigating its limitations in the context of radio spectroscopy. This study takes an in-depth approach to studying the limitations of SF analysis for analyzing MHD turbulence in giant molecular cloud (GMC) spectroscopy data. MHD turbulence plays a critical role in the structure and evolution of GMCs as well as in the formation of sub-structures known to spawn stellar progenitors. Existing methods of detection are neither economical nor robust (e.g., dust polarization), and nowhere is this more clear than in the theoretical-observational divide in current literature. A significant limitation of GMC spectroscopy results from the large variation in methods used for extracting GMCs from survey data. Thus, a robust method for studying MHD turbulence must correctly gauge physical properties regardless of the data extraction method used. While SF analysis has demonstrated strong potential across a range of simulated conditions, this study finds significant concern regarding its feasibility as a robust tool in GMC spectroscopy.

  20. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  1. Recent Progress in the Development of Metabolome Databases for Plant Systems Biology

    PubMed Central

    Fukushima, Atsushi; Kusano, Miyako

    2013-01-01

    Metabolomics has grown greatly as a functional genomics tool, and has become an invaluable diagnostic tool for biochemical phenotyping of biological systems. Over the past decades, a number of databases involving information related to mass spectra, compound names and structures, statistical/mathematical models and metabolic pathways, and metabolite profile data have been developed. Such databases complement each other and support efficient growth in this area, although the data resources remain scattered across the World Wide Web. Here, we review available metabolome databases and summarize the present status of development of related tools, particularly focusing on the plant metabolome. Data sharing discussed here will pave the way for the robust interpretation of metabolomic data and advances in plant systems biology. PMID:23577015

  2. Extending item response theory to online homework

    NASA Astrophysics Data System (ADS)

    Kortemeyer, Gerd

    2014-06-01

    Item response theory (IRT) becomes an increasingly important tool when analyzing "big data" gathered from online educational venues. However, the mechanism was originally developed in traditional exam settings, and several of its assumptions are infringed upon when deployed in the online realm. For a large-enrollment physics course for scientists and engineers, the study compares outcomes from IRT analyses of exam and homework data, and then proceeds to investigate the effects of each confounding factor introduced in the online realm. It is found that IRT yields the correct trends for learner ability and meaningful item parameters, yet overall agreement with exam data is moderate. It is also found that learner ability and item discrimination are robust over a wide range with respect to model assumptions and introduced noise. Item difficulty is also robust, but over a narrower range.
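
    For readers unfamiliar with the model behind such analyses, the sketch below evaluates the two-parameter logistic (2PL) item characteristic curve, the standard IRT form relating learner ability to the probability of a correct response. The abilities and item parameters are invented for illustration.

        # 2PL item characteristic curve: P(correct | theta) for items with
        # discrimination a and difficulty b (all parameters illustrative).
        import numpy as np

        def p_correct(theta, a, b):
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        thetas = np.array([-1.0, 0.0, 1.0])            # learner abilities
        items = [(1.5, -0.5), (0.8, 0.0), (2.0, 1.0)]  # (discrimination, difficulty)
        for a, b in items:
            print(f"a={a:.1f} b={b:+.1f}:", np.round(p_correct(thetas, a, b), 2))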

  3. Robust Regression through Robust Covariances.

    DTIC Science & Technology

    1985-01-01

    we apply (2.3). But first let us examine the influence function (see Hampel (1974)). In order to simplify the formulas we will first consider the case... remember that the influence function is an asymptotic "tool" and that therefore the population values of our estimators appear in the formula. V(GR) is... the parameter (a, V) based on the data Z1, ..., Zn via tp =~ t0. Now we can apply the standard formulas to get the influence function (see Huber (1981

  4. Designed tools for analysis of lithography patterns and nanostructures

    NASA Astrophysics Data System (ADS)

    Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann

    2017-03-01

    We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time-consuming, requiring manual tuning, and lacking robustness and user-friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic, and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nano processes at the research and development level by accelerating the access to knowledge and hence speed up the implementation in product lines.

  5. Stretchable Materials for Robust Soft Actuators towards Assistive Wearable Devices

    NASA Astrophysics Data System (ADS)

    Agarwal, Gunjan; Besuchet, Nicolas; Audergon, Basile; Paik, Jamie

    2016-09-01

    Soft actuators made from elastomeric active materials can find widespread potential implementation in a variety of applications ranging from assistive wearable technologies targeted at biomedical rehabilitation or assistance with activities of daily living, bioinspired and biomimetic systems, to gripping and manipulating fragile objects, and adaptable locomotion. In this manuscript, we propose a novel two-component soft actuator design and design tool that produces actuators targeted towards these applications with enhanced mechanical performance and manufacturability. Our numerical models developed using the finite element method can predict the actuator behavior at large mechanical strains to allow efficient design iterations for system optimization. Based on two distinctive actuator prototypes’ (linear and bending actuators) experimental results that include free displacement and blocked-forces, we have validated the efficacy of the numerical models. The presented extensive investigation of mechanical performance for soft actuators with varying geometric parameters demonstrates the practical application of the design tool, and the robustness of the actuator hardware design, towards diverse soft robotic systems for a wide set of assistive wearable technologies, including replicating the motion of several parts of the human body.

  6. Stretchable Materials for Robust Soft Actuators towards Assistive Wearable Devices

    PubMed Central

    Agarwal, Gunjan; Besuchet, Nicolas; Audergon, Basile; Paik, Jamie

    2016-01-01

    Soft actuators made from elastomeric active materials can find widespread potential implementation in a variety of applications ranging from assistive wearable technologies targeted at biomedical rehabilitation or assistance with activities of daily living, bioinspired and biomimetic systems, to gripping and manipulating fragile objects, and adaptable locomotion. In this manuscript, we propose a novel two-component soft actuator design and design tool that produces actuators targeted towards these applications with enhanced mechanical performance and manufacturability. Our numerical models developed using the finite element method can predict the actuator behavior at large mechanical strains to allow efficient design iterations for system optimization. Based on two distinctive actuator prototypes’ (linear and bending actuators) experimental results that include free displacement and blocked-forces, we have validated the efficacy of the numerical models. The presented extensive investigation of mechanical performance for soft actuators with varying geometric parameters demonstrates the practical application of the design tool, and the robustness of the actuator hardware design, towards diverse soft robotic systems for a wide set of assistive wearable technologies, including replicating the motion of several parts of the human body. PMID:27670953

  7. Stretchable Materials for Robust Soft Actuators towards Assistive Wearable Devices.

    PubMed

    Agarwal, Gunjan; Besuchet, Nicolas; Audergon, Basile; Paik, Jamie

    2016-09-27

    Soft actuators made from elastomeric active materials can find widespread potential implementation in a variety of applications ranging from assistive wearable technologies targeted at biomedical rehabilitation or assistance with activities of daily living, bioinspired and biomimetic systems, to gripping and manipulating fragile objects, and adaptable locomotion. In this manuscript, we propose a novel two-component soft actuator design and design tool that produces actuators targeted towards these applications with enhanced mechanical performance and manufacturability. Our numerical models developed using the finite element method can predict the actuator behavior at large mechanical strains to allow efficient design iterations for system optimization. Based on two distinctive actuator prototypes' (linear and bending actuators) experimental results that include free displacement and blocked-forces, we have validated the efficacy of the numerical models. The presented extensive investigation of mechanical performance for soft actuators with varying geometric parameters demonstrates the practical application of the design tool, and the robustness of the actuator hardware design, towards diverse soft robotic systems for a wide set of assistive wearable technologies, including replicating the motion of several parts of the human body.

  8. A hybrid multi-objective imperialist competitive algorithm and Monte Carlo method for robust safety design of a rail vehicle

    NASA Astrophysics Data System (ADS)

    Nejlaoui, Mohamed; Houidi, Ajmi; Affi, Zouhaier; Romdhane, Lotfi

    2017-10-01

    This paper deals with the robust safety design optimization of a rail vehicle system moving in short-radius curved tracks. A combined multi-objective imperialist competitive algorithm and Monte Carlo method is developed and used for the robust multi-objective optimization of the rail vehicle system. This robust optimization of rail vehicle safety considers simultaneously the derailment angle and its standard deviation, where the design parameters' uncertainties are considered. The obtained results showed that the robust design significantly reduces the sensitivity of the rail vehicle safety to the design parameters' uncertainties compared to the deterministic design and to the literature results.
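
    The core robustness device here, optimizing the mean and standard deviation of the safety criterion together, can be illustrated with a few lines of Monte Carlo. The toy surrogate below stands in for the vehicle model; its form, parameters, and uncertainty levels are invented and carry no engineering meaning.

        # Robust objectives for a design: mean and standard deviation of a toy
        # "derailment angle" under sampled design-parameter uncertainty.
        import numpy as np

        def derailment_angle(stiffness, curve_radius):
            return 0.02 * stiffness / curve_radius  # invented surrogate model

        def robust_objectives(nominal_k, n=20_000, seed=0):
            rng = np.random.default_rng(seed)
            k = nominal_k * (1 + 0.05 * rng.normal(size=n))  # 5% uncertainty
            r = 150.0 * (1 + 0.02 * rng.normal(size=n))      # short-radius curve (m)
            ang = derailment_angle(k, r)
            return ang.mean(), ang.std()  # both to be minimized

        for k in (400.0, 600.0):
            mu, sigma = robust_objectives(k)
            print(f"k={k:.0f}: mean angle={mu:.3f}, std={sigma:.4f}")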

  9. Massive-Scale Gene Co-Expression Network Construction and Robustness Testing Using Random Matrix Theory

    PubMed Central

    Isaacson, Sven; Luo, Feng; Feltus, Frank A.; Smith, Melissa C.

    2013-01-01

    The study of gene relationships and their effect on biological function and phenotype is a focal point in systems biology. Gene co-expression networks built using microarray expression profiles are one technique for discovering and interpreting gene relationships. A knowledge-independent thresholding technique, such as Random Matrix Theory (RMT), is useful for identifying meaningful relationships. Highly connected genes in the thresholded network are then grouped into modules that provide insight into their collective functionality. While it has been shown that co-expression networks are biologically relevant, it has not been determined to what extent any given network is functionally robust given perturbations in the input sample set. Such a test requires hundreds of networks, and hence a tool to rapidly construct them. To examine functional robustness of networks with varying input, we enhanced an existing RMT implementation for improved scalability and tested functional robustness of human (Homo sapiens), rice (Oryza sativa) and budding yeast (Saccharomyces cerevisiae). We demonstrate a dramatic decrease in network construction time and computational requirements and show that despite some variation in global properties between networks, functional similarity remains high. Moreover, the biological function captured by co-expression networks thresholded by RMT is highly robust. PMID:23409071
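
    RMT thresholding rests on an eigenvalue-statistics test: as the correlation threshold rises, nearest-neighbor eigenvalue spacings of the retained matrix shift from the GOE-like statistics of correlated noise toward Poisson-like statistics. The heavily simplified sketch below (synthetic data, no spectral unfolding, an illustrative KS test) conveys only the flavor of that check, not the enhanced implementation the paper describes.

        # Crude RMT-style check: compare normalized eigenvalue spacings of a
        # thresholded correlation matrix against the exponential (Poisson) law.
        import numpy as np
        from scipy.stats import kstest

        rng = np.random.default_rng(0)
        expr = rng.normal(size=(200, 60))  # 200 "genes" x 60 samples
        corr = np.corrcoef(expr)

        for tau in (0.0, 0.3):
            kept = np.where(abs(corr) >= tau, corr, 0.0)
            eig = np.sort(np.linalg.eigvalsh(kept))
            s = np.diff(eig)
            s = s / s.mean()  # normalized nearest-neighbor spacings
            print(f"tau={tau}: KS distance vs Poisson = {kstest(s, 'expon').statistic:.3f}")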

  10. A Robust Ordering Strategy for Retailers Facing a Free Shipping Option

    PubMed Central

    Meng, Qing-chun; Wan, Xiao-le; Rong, Xiao-xia

    2015-01-01

    Free shipping with conditions has become one of the most effective marketing tools available. An increasing number of companies, especially e-businesses, prefer to offer free shipping with some predetermined condition, such as a minimum purchase amount by the customer. However, in practice, the demands of buyers are uncertain; they are often affected by many factors, such as the weather and season. We begin by modeling the centralized ordering problem in which the supplier offers a free shipping service and retailers face stochastic demands. As these demands are random, only partial information such as the known mean, support, and deviation is needed. The model is then analyzed via a robust optimization method, and the two types of equivalent sets of uncertainty constraints that are obtained provide good mathematical properties with consideration of the robustness of solutions. Subsequently, a numerical example is used to compare the results achieved from a robust optimization method and the linear decision rules. Additionally, the robustness of the optimal solution is discussed, as it is affected by the minimum quantity parameters. The increasing cost-threshold relationship is divided into three periods. In addition, the case study shows that the proposed method achieves better stability as well as lower computational complexity. PMID:25993533

  11. Empowering biotechnology in southern Africa: establishment of a robust transformation platform for the production of transgenic industry-preferred cassava.

    PubMed

    Chetty, C C; Rossin, C B; Gruissem, W; Vanderschuren, H; Rey, M E C

    2013-01-25

    Knowledge and technology transfer to African laboratories and farmers is an important objective for achieving food security and sustainable crop production on the sub-Saharan African continent. Cassava (Manihot esculenta Crantz) is a vital source of calories for more than a billion people in developing countries, and its potential industrial use for starch and bioethanol in the tropics is increasingly being recognized. However, cassava production remains constrained by the susceptibility of the crop to several biotic and abiotic stresses. For more than a decade, biotechnology has been considered an attractive tool to improve cassava as it substantially circumvents the limitations of traditional breeding, which is particularly time-consuming and tedious because of the high heterozygosity of the crop. A major constraint to the development of biotechnological approaches for cassava improvement has been the lack of an efficient and robust transformation and regeneration system. Despite some success achieved in genetic modification of the model cassava cultivar Tropical Manihot Series (TMS), TMS 60444, in some European and U.S. laboratories, the lack of a reproducible and robust protocol has not allowed the establishment of a routine transformation system in sub-Saharan Africa. In this study, we optimized a robust and efficient protocol developed at ETH Zurich to successfully establish transformation of a commercially cultivated South African landrace, T200, and compared this with the benchmark model cultivar TMS 60444. Results from our study demonstrated high transformation rates for both T200 (23 transgenic lines from 100 friable embryogenic callus (FEC) clusters) and TMS 60444 (32 transgenic lines from 100 FEC clusters). The success in transforming landraces or farmer-preferred cultivars has been limited, and the high transformation rate of an industry-preferred landrace in this study is encouraging for a feasible transformation program for cassava improvement in South Africa (SA), which can potentially be extended to other countries in southern Africa. The successful establishment of a robust cassava transformation and regeneration system in SA demonstrates the relevance of technology transfer to sub-Saharan Africa and highlights the importance of developing suitable and reliable techniques before their transfer to laboratories offering less optimal conditions. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Guide to NavyFOAM V1.0

    DTIC Science & Technology

    2011-04-01

    NavyFOAM has been developed using an open-source CFD software toolkit (OpenFOAM) that draws heavily upon object-oriented programming. The numerical methods and the physical models in the original version of OpenFOAM have been upgraded in an effort to improve accuracy and robustness.

  13. iTree-Hydro: Snow hydrology update for the urban forest hydrology model

    Treesearch

    Yang Yang; Theodore A. Endreny; David J. Nowak

    2011-01-01

    This article presents snow hydrology updates made to iTree-Hydro, previously called the Urban Forest Effects—Hydrology model. iTree-Hydro Version 1 was a warm climate model developed by the USDA Forest Service to provide a process-based planning tool with robust water quantity and quality predictions given data limitations common to most urban areas. Cold climate...

  14. Development of 3D Oxide Fuel Mechanics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, B. W.; Casagranda, A.; Pitts, S. A.

    This report documents recent work to improve the accuracy and robustness of the mechanical constitutive models used in the BISON fuel performance code. These developments include migration of the fuel mechanics models to be based on the MOOSE Tensor Mechanics module, improving the robustness of the smeared cracking model, implementing a capability to limit the time step size based on material model response, and improving the robustness of the return mapping iterations used in creep and plasticity models.
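
    One of the models named above, the return mapping used by creep and plasticity updates, follows an elastic-predictor/plastic-corrector pattern. The sketch below shows that pattern for the simplest possible case, 1D rate-independent plasticity with linear isotropic hardening; the material constants are illustrative, and the BISON implementations are considerably more general.

        # One-dimensional return mapping: elastic trial stress, yield check,
        # closed-form plastic correction (units: Pa; values illustrative only).
        def return_map(strain_inc, stress, alpha, E=2.0e11, H=1.0e9, sigma_y=3.0e8):
            trial = stress + E * strain_inc           # elastic predictor
            f = abs(trial) - (sigma_y + H * alpha)    # yield function
            if f <= 0.0:
                return trial, alpha                   # step stays elastic
            dgamma = f / (E + H)                      # consistency condition
            sign = 1.0 if trial > 0 else -1.0
            return trial - E * dgamma * sign, alpha + dgamma

        stress, alpha = 0.0, 0.0
        for _ in range(10):                           # monotonic loading
            stress, alpha = return_map(2.0e-4, stress, alpha)
        print(f"stress={stress:.3e} Pa, accumulated plastic strain={alpha:.3e}")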

  15. Share Repository Framework: Component Specification and Ontology

    DTIC Science & Technology

    2008-04-23

    Palantir Technologies has created one such software application to support the DoD intelligence community by providing robust capabilities for managing data from various sources. The Palantir tool is based on user-defined ontologies and supports multiple representation and analysis tools.

  16. The Relationship between Organizational Health and Robust School Vision in Elementary Schools

    ERIC Educational Resources Information Center

    Korkmaz, Mehmet

    2006-01-01

    Teachers play an important role in developing a robust school vision. This study aims to determine the likely relationship between teachers' perception of school health and a robust school vision. It has been found that there is a significant positive relationship between teachers' perceptions of organizational health and the relative…

  17. Robust, Optimal Subsonic Airfoil Shapes

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan

    2014-01-01

    A method has been developed to create an airfoil robust enough to operate satisfactorily in different environments. This method determines a robust, optimal, subsonic airfoil shape, beginning with an arbitrary initial airfoil shape, and imposes the necessary constraints on the design. Also, this method is flexible and extendible to a larger class of requirements and changes in constraints imposed.

  18. Multi-Agent Many-Objective Robust Decision Making: Supporting Cooperative Regional Water Portfolio Planning in the Eastern United States

    NASA Astrophysics Data System (ADS)

    Herman, J. D.; Zeff, H. B.; Reed, P. M.; Characklis, G. W.

    2013-12-01

    In the Eastern United States, water infrastructure and institutional frameworks have evolved in a historically water-rich environment. However, large regional droughts over the past decade combined with continuing population growth have marked a transition to a state of water scarcity, for which current planning paradigms are ill-suited. Significant opportunities exist to improve the efficiency of water infrastructure via regional coordination, namely, regional 'portfolios' of water-related assets such as reservoirs, conveyance, conservation measures, and transfer agreements. Regional coordination offers the potential to improve reliability, cost, and environmental impact in the expected future state of the world, and, with informed planning, to improve robustness to future uncertainty. In support of this challenge, this study advances a multi-agent many-objective robust decision making (multi-agent MORDM) framework that blends novel computational search and uncertainty analysis tools to discover flexible, robust regional portfolios. Our multi-agent MORDM framework is demonstrated for four water utilities in the Research Triangle region of North Carolina, USA. The utilities supply nearly two million customers and have the ability to interact with one another via transfer agreements and shared infrastructure. We show that strategies for this region which are Pareto-optimal in the expected future state of the world remain vulnerable to performance degradation under alternative scenarios of deeply uncertain hydrologic and economic factors. We then apply the Patient Rule Induction Method (PRIM) to identify which of these uncertain factors drives the individual and collective vulnerabilities for the four cooperating utilities. Our results indicate that clear multi-agent tradeoffs emerge for attaining robustness across the utilities. Furthermore, the key factor identified for improving the robustness of the region's water supply is cooperative demand reduction. This type of approach is critically important given the risks and challenges posed by rising supply development costs, limits on new infrastructure, growing water demands and the underlying uncertainties associated with climate change. The proposed framework serves as a planning template for other historically water-rich regions which must now confront the reality of impending water scarcity.

  19. An automated genotyping tool for enteroviruses and noroviruses.

    PubMed

    Kroneman, A; Vennema, H; Deforche, K; v d Avoort, H; Peñaranda, S; Oberste, M S; Vinjé, J; Koopmans, M

    2011-06-01

    Molecular techniques are established as routine in virological laboratories and virus typing through (partial) sequence analysis is increasingly common. Quality assurance for the use of typing data requires harmonization of genotype nomenclature, and agreement on target genes, depending on the level of resolution required, and robustness of methods. To develop and validate web-based open-access typing-tools for enteroviruses and noroviruses. An automated web-based typing algorithm was developed, starting with BLAST analysis of the query sequence against a reference set of sequences from viruses in the family Picornaviridae or Caliciviridae. The second step is phylogenetic analysis of the query sequence and a sub-set of the reference sequences, to assign the enterovirus type or norovirus genotype and/or variant, with profile alignment, construction of phylogenetic trees and bootstrap validation. Typing is performed on VP1 sequences of Human enterovirus A to D, and ORF1 and ORF2 sequences of genogroup I and II noroviruses. For validation, we used the tools to automatically type sequences in the RIVM and CDC enterovirus databases and the FBVE norovirus database. Using the typing-tools, 785 (99%) of 795 Enterovirus VP1 sequences, and 8154 (98.5%) of 8342 norovirus sequences were typed in accordance with previously used methods. Subtyping into variants was achieved for 4439 (78.4%) of 5838 NoV GII.4 sequences. The online typing-tools reliably assign genotypes for enteroviruses and noroviruses. The use of phylogenetic methods makes these tools robust to ongoing evolution. This should facilitate standardized genotyping and nomenclature in clinical and public health laboratories, thus supporting inter-laboratory comparisons. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Robust H(∞) positional control of 2-DOF robotic arm driven by electro-hydraulic servo system.

    PubMed

    Guo, Qing; Yu, Tian; Jiang, Dan

    2015-11-01

    In this paper an H∞ positional feedback controller is developed to improve the robust performance under structural and parametric uncertainty disturbances in an electro-hydraulic servo system (EHSS). The robust control model is described as a linear state-space equation by upper linear fractional transformation. According to the solution of the H∞ sub-optimal control problem, the robust controller is designed and simplified to a lower-order linear model which is easily realized in the EHSS. The simulation and experimental results validate the robustness of the proposed method. The comparison with PI control shows that the robust controller is suitable for this EHSS under the critical condition where the desired system bandwidth is higher and the external load of the hydraulic actuator is close to its capability limit. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Robust Learning Control Design for Quantum Unitary Transformations.

    PubMed

    Wu, Chengzhi; Qi, Bo; Chen, Chunlin; Dong, Daoyi

    2017-12-01

    Robust control design for quantum unitary transformations has been recognized as a fundamental and challenging task in the development of quantum information processing due to unavoidable decoherence or operational errors in the experimental implementation of quantum operations. In this paper, we extend the systematic methodology of the sampling-based learning control (SLC) approach with a gradient flow algorithm for the design of robust quantum unitary transformations. The SLC approach first uses a "training" process to find an optimal control strategy robust against certain ranges of uncertainties. Then a number of randomly selected samples are tested and the performance is evaluated according to their average fidelity. The approach is applied to three typical examples of robust quantum transformation problems including robust quantum transformations in a three-level quantum system, in a superconducting quantum circuit, and in a spin chain system. Numerical results demonstrate the effectiveness of the SLC approach and show its potential applications in various implementations of quantum unitary transformations.
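
    The sampling-based idea, optimizing a control so that fidelity is high on average over sampled uncertainties, can be shown in miniature on a single qubit. The sketch below tunes one pulse amplitude for an X rotation under sampled detuning error, using finite-difference gradient ascent in place of the paper's gradient flow; the system, uncertainty range, and step sizes are all invented for illustration.

        # "Training" a single control amplitude u so the average gate fidelity
        # over sampled detunings d is maximized (toy two-level system).
        import numpy as np
        from scipy.linalg import expm

        sx = np.array([[0, 1], [1, 0]], dtype=complex)
        sz = np.array([[1, 0], [0, -1]], dtype=complex)
        U_target = expm(-1j * (np.pi / 2) * sx)  # X rotation (up to global phase)

        def avg_fidelity(u, deltas, T=1.0):
            fids = [abs(np.trace(U_target.conj().T @ expm(-1j * T * (u * sx + d * sz)))) ** 2 / 4
                    for d in deltas]
            return np.mean(fids)

        deltas = np.linspace(-0.1, 0.1, 11)  # sampled uncertainty in detuning
        u, lr, eps = 1.0, 0.2, 1e-4          # initial guess, step size, FD width
        for _ in range(50):
            grad = (avg_fidelity(u + eps, deltas) - avg_fidelity(u - eps, deltas)) / (2 * eps)
            u += lr * grad
        print(f"optimized amplitude u={u:.4f}, average fidelity={avg_fidelity(u, deltas):.5f}")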

  2. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for the quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and process analytical technology (PAT).

  3. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for the quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and process analytical technology (PAT). PMID:25722723

  4. A Multi-Band Uncertainty Set Based Robust SCUC With Spatial and Temporal Budget Constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Chenxi; Wu, Lei; Wu, Hongyu

    2016-11-01

    The dramatic increase of renewable energy resources in recent years, together with the long-existing load forecast errors and increasingly involved price-sensitive demands, has introduced significant uncertainties into power systems operation. In order to guarantee the operational security of power systems with such uncertainties, robust optimization has been extensively studied in security-constrained unit commitment (SCUC) problems, for immunizing the system against worst uncertainty realizations. However, traditional robust SCUC models with single-band uncertainty sets may yield over-conservative solutions in most cases. This paper proposes a multi-band robust model to accurately formulate various uncertainties with higher resolution. By properly tuning band intervals and weight coefficients of individual bands, the proposed multi-band robust model can rigorously and realistically reflect spatial/temporal relationships and asymmetric characteristics of various uncertainties, and in turn could effectively leverage the tradeoff between robustness and economics of robust SCUC solutions. The proposed multi-band robust SCUC model is solved by Benders decomposition (BD) and outer approximation (OA), while taking advantage of the integral property of the proposed multi-band uncertainty set. In addition, several accelerating techniques are developed for enhancing the computational performance and the convergence speed. Numerical studies on a 6-bus system and the modified IEEE 118-bus system verify the effectiveness of the proposed robust SCUC approach for enhancing uncertainty modeling capabilities and mitigating conservativeness of the robust SCUC solution.

  5. A reliable algorithm for optimal control synthesis

    NASA Technical Reports Server (NTRS)

    Vansteenwyk, Brett; Ly, Uy-Loi

    1992-01-01

    In recent years, powerful design tools for linear time-invariant multivariable control systems have been developed based on direct parameter optimization. In this report, an algorithm for reliable optimal control synthesis using parameter optimization is presented. Specifically, a robust numerical algorithm is developed for the evaluation of the H(sup 2)-like cost functional and its gradients with respect to the controller design parameters. The method is specifically designed to handle defective degenerate systems and is based on the well-known Pade series approximation of the matrix exponential. Numerical test problems in control synthesis for simple mechanical systems and for a flexible structure with densely packed modes positively illustrate the reliability of this method when compared to a method based on diagonalization. Several types of cost functions have been considered: a cost function for robust control consisting of a linear combination of quadratic objectives for deterministic and random disturbances, and one representing an upper bound on the quadratic objective for worst-case initial conditions. Finally, a framework for multivariable control synthesis has been developed combining the concept of closed-loop transfer recovery with numerical parameter optimization. The procedure enables designers to synthesize not only observer-based controllers but also controllers of arbitrary order and structure. Numerical design solutions rely heavily on the robust algorithm due to the high order of the synthesis model and the presence of near-overlapping modes. The design approach is successfully applied to the design of a high-bandwidth control system for a rotorcraft.
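
    As a concrete handle on such quadratic cost evaluation, the sketch below computes an H2-type cost for a small stable system through the observability Gramian of a continuous Lyapunov equation, using SciPy. The matrices are arbitrary illustrations; the report's algorithm (Pade-based and tailored to defective, degenerate systems) is more specialized.

        # H2 norm of G(s) = C (sI - A)^{-1} B via the observability Gramian Q,
        # which solves A^T Q + Q A + C^T C = 0 (example matrices are arbitrary).
        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        A = np.array([[0.0, 1.0], [-4.0, -1.2]])  # stable, lightly damped
        B = np.array([[0.0], [1.0]])
        C = np.array([[1.0, 0.0]])

        Q = solve_continuous_lyapunov(A.T, -C.T @ C)
        h2_cost = np.sqrt(np.trace(B.T @ Q @ B))
        print(f"H2 norm = {h2_cost:.4f}")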

  6. Intelligent Model Management in a Forest Ecosystem Management Decision Support System

    Treesearch

    Donald Nute; Walter D. Potter; Frederick Maier; Jin Wang; Mark Twery; H. Michael Rauscher; Peter Knopp; Scott Thomasma; Mayukh Dass; Hajime Uchiyama

    2002-01-01

    Decision making for forest ecosystem management can include the use of a wide variety of modeling tools. These tools include vegetation growth models, wildlife models, silvicultural models, GIS, and visualization tools. NED-2 is a robust, intelligent, goal-driven decision support system that integrates tools in each of these categories. NED-2 uses a blackboard...

  7. Design of a robust control law for the Vega launcher ballistic phase

    NASA Astrophysics Data System (ADS)

    Valli, Monica; Lavagna, Michèle R.; Panozzo, Thomas

    2012-02-01

    This work presents the design of a robust control law, and the related control system architecture, for the Vega launcher ballistic phase, taking into account the complete six-degrees-of-freedom dynamics. To gain robustness a non-linear control approach has been preferred: more specifically, Lyapunov's second stability theorem has been exploited, being a very powerful tool to guarantee asymptotic stability of the controlled dynamics. The dynamics of Vega's actuators has also been taken into account. The system performance has been checked and analyzed by numerical simulations run on real mission data for different operational and configuration scenarios, and the effectiveness of the synthesized control highlighted: in particular, scenarios including a wide range of the composite's inertial configurations performing various types of maneuvers have been run. The robustness of the controlled dynamics has been validated by a 100-case Monte Carlo analysis campaign: the containment of the dispersion for the controlled variables (the composite roll, yaw, and pitch angles) confirmed the wide validity and generality of the proposed control law. This paper will show the theoretical approach and discuss the obtained results.

  8. Hard Constraints in Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2008-01-01

    This paper proposes a methodology for the analysis and design of systems subject to parametric uncertainty where design requirements are specified via hard inequality constraints. Hard constraints are those that must be satisfied for all parameter realizations within a given uncertainty model. Uncertainty models given by norm-bounded perturbations from a nominal parameter value, i.e., hyper-spheres, and by sets of independently bounded uncertain variables, i.e., hyper-rectangles, are the focus of this paper. These models, which are also quite practical, allow for a rigorous mathematical treatment within the proposed framework. Hard constraint feasibility is determined by sizing the largest uncertainty set for which the design requirements are satisfied. Analytically verifiable assessments of robustness are attained by comparing this set with the actual uncertainty model. Strategies that enable the comparison of the robustness characteristics of competing design alternatives, the description and approximation of the robust design space, and the systematic search for designs with improved robustness are also proposed. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, this methodology is applicable to a broad range of engineering problems.
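
    The hard-constraint sizing step described above, finding the largest uncertainty set on which a requirement holds everywhere, can be approximated numerically when no analytic treatment is at hand. The sketch below bisects on the radius of a hypersphere and checks feasibility by sampling its surface; the requirement function and nominal point are invented, and sampling only approximates the true worst case.

        # Bisection on the largest hypersphere radius for which g(p) <= 0 holds
        # at all sampled boundary points (toy convex requirement function).
        import numpy as np

        def g(p):
            return 1.5 * p[0] ** 2 + p[1] - 1.0  # notional requirement, g <= 0

        def feasible(radius, nominal, n=20_000, seed=0):
            rng = np.random.default_rng(seed)
            d = rng.normal(size=(n, 2))
            d = radius * d / np.linalg.norm(d, axis=1, keepdims=True)  # sphere surface
            return all(g(nominal + di) <= 0.0 for di in d)

        lo, hi, nominal = 0.0, 2.0, np.array([0.2, 0.1])
        for _ in range(30):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if feasible(mid, nominal) else (lo, mid)
        print(f"largest verified radius ~= {lo:.4f}")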

  9. Efficient and Robust Optimization for Building Energy Simulation

    PubMed Central

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-01-01

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component based building system simulation tool. The HVACSIM+ software presently employs Powell’s Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that the Powell’s method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds considerable computational benefits result from replacing the Powell’s Hybrid method solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to the Powell’s Hybrid method presently used in HVACSIM+. PMID:27325907

  10. Efficient and Robust Optimization for Building Energy Simulation.

    PubMed

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-06-15

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component based building system simulation tool. The HVACSIM+ software presently employs Powell's Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that the Powell's method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds considerable computational benefits result from replacing the Powell's Hybrid method solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to the Powell's Hybrid method presently used in HVACSIM+.
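
    Both solvers discussed here are exposed through SciPy's MINPACK bindings, which makes the comparison easy to reproduce in miniature. The sketch below solves a small nonlinear system with 'hybr' (Powell's hybrid method) and 'lm' (a Levenberg-Marquardt variant); the equations are toy stand-ins, not HVACSIM+ component models.

        # Same nonlinear system, two MINPACK solvers via scipy.optimize.root.
        import numpy as np
        from scipy.optimize import root

        def residuals(x):
            return [x[0] + 0.5 * np.tanh(x[1]) - 1.0,
                    x[1] ** 3 + x[0] - 2.0]

        x0 = np.zeros(2)
        for method in ("hybr", "lm"):
            sol = root(residuals, x0, method=method)
            print(f"{method:>4}: success={sol.success}, x={np.round(sol.x, 6)}")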

  11. Generation of mature T cells from human hematopoietic stem and progenitor cells in artificial thymic organoids.

    PubMed

    Seet, Christopher S; He, Chongbin; Bethune, Michael T; Li, Suwen; Chick, Brent; Gschweng, Eric H; Zhu, Yuhua; Kim, Kenneth; Kohn, Donald B; Baltimore, David; Crooks, Gay M; Montel-Hagen, Amélie

    2017-05-01

    Studies of human T cell development require robust model systems that recapitulate the full span of thymopoiesis, from hematopoietic stem and progenitor cells (HSPCs) through to mature T cells. Existing in vitro models induce T cell commitment from human HSPCs; however, differentiation into mature CD3+ TCR-αβ+ single-positive CD8+ or CD4+ cells is limited. We describe here a serum-free, artificial thymic organoid (ATO) system that supports efficient and reproducible in vitro differentiation and positive selection of conventional human T cells from all sources of HSPCs. ATO-derived T cells exhibited mature naive phenotypes, a diverse T cell receptor (TCR) repertoire and TCR-dependent function. ATOs initiated with TCR-engineered HSPCs produced T cells with antigen-specific cytotoxicity and near-complete lack of endogenous TCR Vβ expression, consistent with allelic exclusion of Vβ-encoding loci. ATOs provide a robust tool for studying human T cell differentiation and for the future development of stem-cell-based engineered T cell therapies.

  12. Generation of mature T cells from human hematopoietic stem/progenitor cells in artificial thymic organoids

    PubMed Central

    Seet, Christopher S.; He, Chongbin; Bethune, Michael T.; Li, Suwen; Chick, Brent; Gschweng, Eric H.; Zhu, Yuhua; Kim, Kenneth; Kohn, Donald B.; Baltimore, David; Crooks, Gay M.; Montel-Hagen, Amélie

    2017-01-01

    Studies of human T cell development require robust model systems that recapitulate the full span of thymopoiesis, from hematopoietic stem and progenitor cells (HSPCs) through to mature T cells. Existing in vitro models induce T cell commitment from human HSPCs; however, differentiation into mature CD3+TCRαβ+ single-positive (SP) CD8+ or CD4+ cells is limited. We describe here a serum-free, artificial thymic organoid (ATO) system that supports highly efficient and reproducible in vitro differentiation and positive selection of conventional human T cells from all sources of HSPCs. ATO-derived T cells exhibited mature naïve phenotypes, a diverse TCR repertoire, and TCR-dependent function. ATOs initiated with TCR-engineered HSPCs produced T cells with antigen-specific cytotoxicity and near-complete lack of endogenous TCR Vβ expression, consistent with allelic exclusion of Vβ loci. ATOs provide a robust tool for studying human T cell development and stem-cell-based approaches to engineered T cell therapies. PMID:28369043

  13. A structural model of the VEGF signalling pathway: emergence of robustness and redundancy properties.

    PubMed

    Lignet, Floriane; Calvez, Vincent; Grenier, Emmanuel; Ribba, Benjamin

    2013-02-01

    The vascular endothelial growth factor (VEGF) is known as one of the main promoters of angiogenesis - the process of blood vessel formation. Angiogenesis has been recognized as a key stage of cancer development and metastasis. In this paper, we propose a structural model of the main molecular pathways involved in the endothelial cell response to VEGF stimuli. The model, built on qualitative information from knowledge databases, is composed of 38 ordinary differential equations with 78 parameters and focuses on the signalling driving endothelial cell proliferation, migration and resistance to apoptosis. Following a VEGF stimulus, the model predicts an increase in proliferation and migration capability, and a decrease in apoptosis activity. Model simulations and sensitivity analysis highlight the emergence of robustness and redundancy properties of the pathway. If further calibrated and validated, this model could serve as a tool to analyse and formulate new hypotheses on the VEGF signalling cascade and its role in cancer development and treatment.
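    A hedged sketch of the kind of simulate-then-perturb sensitivity workflow the record describes, using a two-state caricature of VEGF signalling rather than the authors' 38-equation model; all rate constants below are invented.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # x[0] ~ pro-proliferation signal, x[1] ~ pro-apoptosis signal, VEGF input v.
    def rhs(t, x, v, k_act, k_deg, k_inh):
        act, apo = x
        d_act = k_act * v - k_deg * act                 # VEGF drives proliferation
        d_apo = k_deg * (1.0 - apo) - k_inh * v * apo   # VEGF represses apoptosis
        return [d_act, d_apo]

    def steady_response(v, k_act=1.0, k_deg=0.5, k_inh=2.0):
        sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 1.0], args=(v, k_act, k_deg, k_inh))
        return sol.y[:, -1]

    base = steady_response(v=1.0)
    # One-at-a-time sensitivity of the proliferation output to +1% in each parameter.
    for kwargs in ({"k_act": 1.01}, {"k_deg": 0.505}, {"k_inh": 2.02}):
        pert = steady_response(v=1.0, **kwargs)
        print(kwargs, "relative change:", (pert[0] - base[0]) / base[0])
    ```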

  14. MultiNest: Efficient and Robust Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Feroz, F.; Hobson, M. P.; Bridges, M.

    2011-09-01

    We present further development and the first public release of our multimodal nested sampling algorithm, called MultiNest. This Bayesian inference tool calculates the evidence, with an associated error estimate, and produces posterior samples from distributions that may contain multiple modes and pronounced (curving) degeneracies in high dimensions. The developments presented here lead to further substantial improvements in sampling efficiency and robustness, as compared to the original algorithm presented in Feroz & Hobson (2008), which itself significantly outperformed existing MCMC techniques in a wide range of astrophysical inference problems. The accuracy and economy of the MultiNest algorithm are demonstrated by application to two toy problems and to a cosmological inference problem focusing on the extension of the vanilla LambdaCDM model to include spatial curvature and a varying equation of state for dark energy. The MultiNest software is fully parallelized using MPI and includes an interface to CosmoMC. It will also be released as part of the SuperBayeS package for the analysis of supersymmetric theories of particle physics.
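    For orientation, the core nested-sampling recursion behind evidence calculation can be written in a few lines; the toy below handles a single mode on a uniform prior with brute-force rejection sampling, whereas MultiNest's contribution is precisely the efficient, multimodal, ellipsoidal treatment of the constrained draws (plus error estimates).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def loglike(theta):
        return -0.5 * ((theta - 0.5) / 0.05) ** 2      # Gaussian peak, sigma = 0.05

    n_live, n_iter = 200, 1200
    live = rng.uniform(0.0, 1.0, n_live)
    live_logl = loglike(live)
    log_z = -np.inf
    log_shell = np.log(1.0 - np.exp(-1.0 / n_live))    # log prior-volume shell width

    for i in range(n_iter):
        worst = np.argmin(live_logl)
        # Accumulate Z += L_worst * (prior volume shell at iteration i).
        log_z = np.logaddexp(log_z, live_logl[worst] + log_shell - i / n_live)
        while True:                                    # rejection-sample the constraint
            cand = rng.uniform(0.0, 1.0)
            if loglike(cand) > live_logl[worst]:
                break
        live[worst], live_logl[worst] = cand, loglike(cand)

    # Fold in the remaining live points and compare with the analytic answer.
    log_z = np.logaddexp(log_z, np.log(np.mean(np.exp(live_logl))) - n_iter / n_live)
    print(log_z, "vs analytic", np.log(0.05 * np.sqrt(2 * np.pi)))  # both ~ -2.08
    ```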

  15. Revised upper limb module for spinal muscular atrophy: Development of a new module.

    PubMed

    Mazzone, Elena S; Mayhew, Anna; Montes, Jacqueline; Ramsey, Danielle; Fanelli, Lavinia; Young, Sally Dunaway; Salazar, Rachel; De Sanctis, Roberto; Pasternak, Amy; Glanzman, Allan; Coratti, Giorgia; Civitello, Matthew; Forcina, Nicola; Gee, Richard; Duong, Tina; Pane, Marika; Scoto, Mariacristina; Pera, Maria Carmela; Messina, Sonia; Tennekoon, Gihan; Day, John W; Darras, Basil T; De Vivo, Darryl C; Finkel, Richard; Muntoni, Francesco; Mercuri, Eugenio

    2017-06-01

    There is a growing need for a robust clinical measure to assess upper limb motor function in spinal muscular atrophy (SMA), as the available scales lack sensitivity at the extremes of the clinical spectrum. We report the development of the Revised Upper Limb Module (RULM), an assessment specifically designed for upper limb function in SMA patients. An international panel with specific neuromuscular expertise performed a thorough review of scales currently available to assess upper limb function in SMA. This review facilitated a revision of the existing upper limb function scales to make a more robust clinical scale. Multiple revisions of the scale included statistical analysis and captured clinically relevant changes to fulfill requirements by regulators and advocacy groups. The resulting RULM scale shows good reliability and validity, making it a suitable tool to assess upper extremity function in the SMA population for multi-center clinical research. Muscle Nerve 55: 869-874, 2017. © 2016 Wiley Periodicals, Inc.

  16. TU-H-206-04: An Effective Homomorphic Unsharp Mask Filtering Method to Correct Intensity Inhomogeneity in Daily Treatment MR Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D; Gach, H; Li, H

    Purpose: The daily treatment MRIs acquired on MR-IGRT systems, like diagnostic MRIs, suffer from intensity inhomogeneity issues associated with B1 and B0 inhomogeneities. An improved homomorphic unsharp mask (HUM) filtering method, automatic and robust body segmentation, and imaging field-of-view (FOV) detection methods were developed to compute the multiplicative slowly varying correction field and correct the intensity inhomogeneity. The goal is to improve and normalize the voxel intensity so that the images can be processed more accurately by quantitative methods (e.g., segmentation and registration) that require consistent image voxel intensity values. Methods: HUM methods have been widely used for years. A body mask is required; otherwise, the body surface in the corrected image would be incorrectly bright due to the sudden intensity transition at the body surface. In this study, we developed an improved HUM-based correction method that includes three main components: 1) robust body segmentation on the normalized image gradient map, 2) robust FOV detection (needed for body segmentation) using region growing and morphologic filters, and 3) an effective implementation of HUM using repeated Gaussian convolution. Results: The proposed method was successfully tested on patient images of common anatomical sites (H/N, lung, abdomen and pelvis). Initial qualitative comparisons showed that this improved HUM method outperformed three recently published algorithms (FCM, LEMS, MICO) in both computation speed (by 50+ times) and robustness (in intermediate to severe inhomogeneity situations). Currently implemented in MATLAB, it takes 20 to 25 seconds to process a 3D MRI volume. Conclusion: Compared to more sophisticated MRI inhomogeneity correction algorithms, the improved HUM method is simple and effective. The inhomogeneity correction, body mask, and FOV detection methods developed in this study would be useful as preprocessing tools for many MRI-related research and clinical applications in radiotherapy. Authors have received research grants from ViewRay and Varian.
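    A minimal sketch of the HUM idea, assuming `img` is a 2-D MR slice and `body` a boolean body mask (the paper's gradient-based body segmentation and FOV detection are not reproduced); the smoothing is masked so the body-surface brightening described above is avoided.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def hum_correct(img, body, sigma=30.0, eps=1e-6):
        # Masked (normalized) Gaussian smoothing: estimates the slowly varying
        # bias field from voxels inside the body only.
        masked = gaussian_filter(img * body, sigma)
        weight = gaussian_filter(body.astype(float), sigma)
        bias = masked / np.maximum(weight, eps)
        mean_val = img[body].mean()
        # Divide out the bias field inside the body; leave the background alone.
        corrected = np.where(body, img * mean_val / np.maximum(bias, eps), img)
        return corrected, bias

    # Usage (hypothetical): corrected, field = hum_correct(mr_slice, mr_slice > 40)
    ```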

  17. Frailty Index Developed From a Cancer-Specific Geriatric Assessment and the Association With Mortality Among Older Adults With Cancer.

    PubMed

    Guerard, Emily J; Deal, Allison M; Chang, YunKyung; Williams, Grant R; Nyrop, Kirsten A; Pergolotti, Mackenzi; Muss, Hyman B; Sanoff, Hanna K; Lund, Jennifer L

    2017-07-01

    Background: An objective measure is needed to identify frail older adults with cancer who are at increased risk for poor health outcomes. The primary objective of this study was to develop a frailty index from a cancer-specific geriatric assessment (GA) and evaluate its ability to predict all-cause mortality among older adults with cancer. Patients and Methods: Using a unique and novel data set that brings together GA data with cancer-specific and long-term mortality data, we developed the Carolina Frailty Index (CFI) from a cancer-specific GA based on the principles of deficit accumulation. CFI scores (range, 0-1) were categorized as robust (0-0.2), pre-frail (0.2-0.35), and frail (>0.35). The primary outcome for evaluating predictive validity was all-cause mortality. The Kaplan-Meier method and log-rank tests were used to compare survival between frailty groups, and Cox proportional hazards regression models were used to evaluate associations. Results: In our sample of 546 older adults with cancer, the median age was 72 years, 72% were women, 85% were white, and 47% had a breast cancer diagnosis. Overall, 58% of patients were robust, 24% were pre-frail, and 18% were frail. The estimated 5-year survival rate was 72% in robust patients, 58% in pre-frail patients, and 34% in frail patients (log-rank test, P <.0001). Frail patients had more than a 2-fold increased risk of all-cause mortality compared with robust patients (adjusted hazard ratio, 2.36; 95% CI, 1.51-3.68). Conclusions: The CFI was predictive of all-cause mortality in older adults with cancer, a finding that was independent of age, sex, cancer type and stage, and number of medical comorbidities. The CFI has the potential to become a tool that oncologists can use to objectively identify frailty in older adults with cancer. Copyright © 2017 by the National Comprehensive Cancer Network.
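    The deficit-accumulation principle behind the CFI is simple to state in code: the index is the fraction of measured deficits present, thresholded into the three categories quoted above. The items below are illustrative, not the actual CFI items.

    ```python
    import numpy as np

    def frailty_index(deficits):
        """deficits: 0/1 (or graded 0..1) deficit scores from a geriatric assessment."""
        deficits = np.asarray(deficits, dtype=float)
        return deficits.sum() / deficits.size

    def categorize(fi):
        # Cut-offs as quoted in the abstract: robust, pre-frail, frail.
        if fi <= 0.2:
            return "robust"
        elif fi <= 0.35:
            return "pre-frail"
        return "frail"

    fi = frailty_index([0, 1, 0, 0, 1, 0.5, 0, 0, 1, 0])   # 10 hypothetical GA items
    print(fi, categorize(fi))                               # 0.35 pre-frail
    ```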

  18. Multi-point objective-oriented sequential sampling strategy for constrained robust design

    NASA Astrophysics Data System (ADS)

    Zhu, Ping; Zhang, Siliang; Chen, Wei

    2015-03-01

    Metamodelling techniques are widely used to approximate system responses of expensive simulation models. In association with the use of metamodels, objective-oriented sequential sampling methods have been demonstrated to be effective in balancing the need for searching an optimal solution versus reducing the metamodelling uncertainty. However, existing infilling criteria are developed for deterministic problems and restricted to one sampling point in one iteration. To exploit the use of multiple samples and identify the true robust solution in fewer iterations, a multi-point objective-oriented sequential sampling strategy is proposed for constrained robust design problems. In this article, earlier development of objective-oriented sequential sampling strategy for unconstrained robust design is first extended to constrained problems. Next, a double-loop multi-point sequential sampling strategy is developed. The proposed methods are validated using two mathematical examples followed by a highly nonlinear automotive crashworthiness design example. The results show that the proposed method can mitigate the effect of both metamodelling uncertainty and design uncertainty, and identify the robust design solution more efficiently than the single-point sequential sampling approach.

  19. Trade-offs between robustness and small-world effect in complex networks

    PubMed Central

    Peng, Guan-Sheng; Tan, Suo-Yi; Wu, Jun; Holme, Petter

    2016-01-01

    Robustness and the small-world effect are two crucial structural features of complex networks and have attracted increasing attention. However, little is known about the relation between them. Here we demonstrate that there is a conflicting relation between robustness and the small-world effect for a given degree sequence. We suggest that robustness-oriented optimization will weaken the small-world effect and vice versa. We then propose a multi-objective trade-off optimization model and develop a heuristic algorithm to obtain the optimal trade-off topology for robustness and the small-world effect. We show that the optimal network topology exhibits a pronounced core-periphery structure, and we investigate the structural properties of the optimized networks in detail. PMID:27853301
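    Toy versions of the two competing objectives can be computed with NetworkX for a fixed degree sequence; the measures below are illustrative proxies (a Schneider-style targeted-attack score and an inverse mean path length), not the exact functionals optimized in the paper.

    ```python
    import networkx as nx

    def robustness(g):
        """Variant of the Schneider et al. R: mean giant-component fraction
        under repeated removal of the current highest-degree node."""
        h, n, total = g.copy(), g.number_of_nodes(), 0.0
        for _ in range(n - 1):
            h.remove_node(max(h.degree, key=lambda kv: kv[1])[0])
            total += max(len(c) for c in nx.connected_components(h)) / n
        return total / n

    def small_world_score(g):
        giant = g.subgraph(max(nx.connected_components(g), key=len))
        return 1.0 / nx.average_shortest_path_length(giant)  # higher = shorter paths

    g = nx.random_regular_graph(3, 20, seed=1)   # fixed degree sequence (all 3s)
    print("R =", robustness(g), " SW =", small_world_score(g))
    # Degree-preserving rewires (nx.double_edge_swap) that improve one score
    # typically lower the other, which is the trade-off the paper formalizes.
    ```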

  20. 2016 KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carrington, David Bradley; Waters, Jiajia

    Los Alamos National Laboratory and its collaborators are facilitating engine modeling by improving the accuracy and robustness of the models and the robustness of the software. We also continue to improve the physical modeling methods. We are developing and implementing new mathematical algorithms that represent the physics within an engine. We provide software that others may use directly or that they may alter with various models, e.g., sophisticated chemical kinetics, different turbulence closure methods, or other fuel injection and spray systems.

  1. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics

    PubMed Central

    Poeschl, Yvonne; Plötner, Romina

    2017-01-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited because robust methods enabling automatic segmentation and quantification of PC shape parameters that reflect this cellular complexity have been lacking. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects the cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for the classification and analysis of lobes at two-cell and three-cell junctions. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis against manual segmentation and existing quantification tools and demonstrated its usability for analyzing PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. PMID:28931626
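    A few of the global and contour-based descriptors of the kind PaCeQuant extracts can be sketched with scikit-image on a labelled mask; this is an illustrative subset, not PaCeQuant's 27-feature set or its segmentation pipeline.

    ```python
    import numpy as np
    from skimage.measure import label, regionprops

    def pc_shape_features(mask):
        feats = []
        for region in regionprops(label(mask)):
            area, perim = region.area, region.perimeter
            feats.append({
                "area": area,
                "circularity": 4.0 * np.pi * area / perim ** 2,  # 1.0 for a disk
                "solidity": region.solidity,   # lobed PCs have low solidity
            })
        return feats

    mask = np.zeros((64, 64), dtype=int)
    mask[16:48, 16:48] = 1                     # toy "cell" stand-in
    print(pc_shape_features(mask))
    ```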

  2. A Universal and Robust Integrated Platform for the Scalable Production of Human Cardiomyocytes From Pluripotent Stem Cells.

    PubMed

    Fonoudi, Hananeh; Ansari, Hassan; Abbasalizadeh, Saeed; Larijani, Mehran Rezaei; Kiani, Sahar; Hashemizadeh, Shiva; Zarchi, Ali Sharifi; Bosman, Alexis; Blue, Gillian M; Pahlavan, Sara; Perry, Matthew; Orr, Yishay; Mayorchak, Yaroslav; Vandenberg, Jamie; Talkhabi, Mahmood; Winlaw, David S; Harvey, Richard P; Aghdami, Nasser; Baharvand, Hossein

    2015-12-01

    Recent advances in the generation of cardiomyocytes (CMs) from human pluripotent stem cells (hPSCs), in conjunction with the promising outcomes from preclinical and clinical studies, have raised new hopes for cardiac cell therapy. We report the development of a scalable, robust, and integrated differentiation platform for large-scale production of hPSC-CM aggregates in a stirred suspension bioreactor as a single-unit operation. Precise modulation of the differentiation process by small molecule activation of WNT signaling, followed by inactivation of transforming growth factor-β and WNT signaling and activation of sonic hedgehog signaling in hPSCs as size-controlled aggregates led to the generation of approximately 100% beating CM spheroids containing virtually pure (∼90%) CMs in 10 days. Moreover, the developed differentiation strategy was universal, as demonstrated by testing multiple hPSC lines (5 human embryonic stem cell and 4 human inducible PSC lines) without cell sorting or selection. The produced hPSC-CMs successfully expressed canonical lineage-specific markers and showed high functionality, as demonstrated by microelectrode array and electrophysiology tests. This robust and universal platform could become a valuable tool for the mass production of functional hPSC-CMs as a prerequisite for realizing their promising potential for therapeutic and industrial applications, including drug discovery and toxicity assays. Recent advances in the generation of cardiomyocytes (CMs) from human pluripotent stem cells (hPSCs) and the development of novel cell therapy strategies using hPSC-CMs (e.g., cardiac patches) in conjunction with promising preclinical and clinical studies, have raised new hopes for patients with end-stage cardiovascular disease, which remains the leading cause of morbidity and mortality globally. In this study, a simplified, scalable, robust, and integrated differentiation platform was developed to generate clinical grade hPSC-CMs as cell aggregates under chemically defined culture conditions. This approach resulted in approximately 100% beating CM spheroids with virtually pure (∼90%) functional cardiomyocytes in 10 days from multiple hPSC lines. This universal and robust bioprocessing platform can provide sufficient numbers of hPSC-CMs for companies developing regenerative medicine technologies to rescue, replace, and help repair damaged heart tissues and for pharmaceutical companies developing advanced biologics and drugs for regeneration of lost heart tissue using high-throughput technologies. It is believed that this technology can expedite clinical progress in these areas to achieve a meaningful impact on improving clinical outcomes, cost of care, and quality of life for those patients disabled and experiencing heart disease. ©AlphaMed Press.

  3. A Universal and Robust Integrated Platform for the Scalable Production of Human Cardiomyocytes From Pluripotent Stem Cells

    PubMed Central

    Fonoudi, Hananeh; Ansari, Hassan; Abbasalizadeh, Saeed; Larijani, Mehran Rezaei; Kiani, Sahar; Hashemizadeh, Shiva; Zarchi, Ali Sharifi; Bosman, Alexis; Blue, Gillian M.; Pahlavan, Sara; Perry, Matthew; Orr, Yishay; Mayorchak, Yaroslav; Vandenberg, Jamie; Talkhabi, Mahmood; Winlaw, David S.; Harvey, Richard P.; Aghdami, Nasser

    2015-01-01

    Recent advances in the generation of cardiomyocytes (CMs) from human pluripotent stem cells (hPSCs), in conjunction with the promising outcomes from preclinical and clinical studies, have raised new hopes for cardiac cell therapy. We report the development of a scalable, robust, and integrated differentiation platform for large-scale production of hPSC-CM aggregates in a stirred suspension bioreactor as a single-unit operation. Precise modulation of the differentiation process by small molecule activation of WNT signaling, followed by inactivation of transforming growth factor-β and WNT signaling and activation of sonic hedgehog signaling in hPSCs as size-controlled aggregates led to the generation of approximately 100% beating CM spheroids containing virtually pure (∼90%) CMs in 10 days. Moreover, the developed differentiation strategy was universal, as demonstrated by testing multiple hPSC lines (5 human embryonic stem cell and 4 human inducible PSC lines) without cell sorting or selection. The produced hPSC-CMs successfully expressed canonical lineage-specific markers and showed high functionality, as demonstrated by microelectrode array and electrophysiology tests. This robust and universal platform could become a valuable tool for the mass production of functional hPSC-CMs as a prerequisite for realizing their promising potential for therapeutic and industrial applications, including drug discovery and toxicity assays. Significance Recent advances in the generation of cardiomyocytes (CMs) from human pluripotent stem cells (hPSCs) and the development of novel cell therapy strategies using hPSC-CMs (e.g., cardiac patches) in conjunction with promising preclinical and clinical studies, have raised new hopes for patients with end-stage cardiovascular disease, which remains the leading cause of morbidity and mortality globally. In this study, a simplified, scalable, robust, and integrated differentiation platform was developed to generate clinical grade hPSC-CMs as cell aggregates under chemically defined culture conditions. This approach resulted in approximately 100% beating CM spheroids with virtually pure (∼90%) functional cardiomyocytes in 10 days from multiple hPSC lines. This universal and robust bioprocessing platform can provide sufficient numbers of hPSC-CMs for companies developing regenerative medicine technologies to rescue, replace, and help repair damaged heart tissues and for pharmaceutical companies developing advanced biologics and drugs for regeneration of lost heart tissue using high-throughput technologies. It is believed that this technology can expedite clinical progress in these areas to achieve a meaningful impact on improving clinical outcomes, cost of care, and quality of life for those patients disabled and experiencing heart disease. PMID:26511653

  4. Application of multi-factorial design of experiments to successfully optimize immunoassays for robust measurements of therapeutic proteins.

    PubMed

    Ray, Chad A; Patel, Vimal; Shih, Judy; Macaraeg, Chris; Wu, Yuling; Thway, Theingi; Ma, Mark; Lee, Jean W; Desilva, Binodh

    2009-02-20

    Developing a process that generates robust immunoassays that can be used to support studies with tight timelines is a common challenge for bioanalytical laboratories. Design of experiments (DOE) is a tool that has been used by many industries for the purpose of optimizing processes. The approach is capable of identifying critical factors and their interactions with a minimal number of experiments. The challenge in implementing this tool in the bioanalytical laboratory is to develop a user-friendly approach that scientists can understand and apply. We have successfully addressed these challenges by eliminating the screening design, introducing automation, and applying a simple mathematical approach for the output parameter. A modified central composite design (CCD) was applied to three ligand binding assays. The intra-plate factors selected were the coating, detection antibody, and streptavidin-HRP concentrations. The inter-plate factors included incubation times for each step. The objective was to maximize the log signal-to-blank ratio (logS/B) of the low standard relative to the blank. The maximum desirable conditions were determined using JMP 7.0. To verify the validity of the predictions, the logS/B prediction was compared against the observed logS/B during pre-study validation experiments. The three assays were optimized using the multi-factorial DOE. The total error for all three methods was less than 20%, which indicated method robustness. DOE identified interactions in one of the methods. The model predictions for logS/B were within 25% of the observed pre-study validation values for all methods tested. The comparison between the CCD and a hybrid screening design yielded comparable parameter estimates. The user-friendly design enables effective application of multi-factorial DOE to optimize ligand binding assays for therapeutic proteins. The approach allows for identification of interactions between factors, consistency in optimal parameter determination, and reduced method development time.
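    A face-centered central composite design in the three intra-plate factors, followed by a full quadratic fit of logS/B, can be sketched as below; the response function is hypothetical, and in practice the optimum would come from JMP or similar software.

    ```python
    import itertools
    import numpy as np

    # Face-centered CCD in 3 coded factors: 8 cube + 6 axial + 3 center runs.
    cube = list(itertools.product([-1, 1], repeat=3))
    axial = [tuple(a if i == j else 0 for j in range(3))
             for i in range(3) for a in (-1, 1)]
    design = np.array(cube + axial + [(0, 0, 0)] * 3, dtype=float)

    def log_sb(x):   # hypothetical response with curvature and an interaction
        return 1.0 + 0.3 * x[0] + 0.2 * x[1] - 0.15 * x[0] ** 2 + 0.1 * x[0] * x[1]

    y = np.array([log_sb(x) for x in design])

    # Full quadratic model: intercept, linear, two-way interactions, squares.
    def expand(x):
        x1, x2, x3 = x
        return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2]

    beta, *_ = np.linalg.lstsq(np.array([expand(x) for x in design]), y, rcond=None)
    best = max(itertools.product(np.linspace(-1, 1, 21), repeat=3),
               key=lambda x: np.dot(expand(x), beta))
    print("predicted optimum (coded units):", best)
    ```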

  5. Robust Informatics Infrastructure Required For ICME: Combining Virtual and Experimental Data

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Holland, Frederic A. Jr.; Bednarcyk, Brett A.

    2014-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for robust automated materials information management systems enabling sophisticated data mining tools is increasing, as evidenced by the emphasis on Integrated Computational Materials Engineering (ICME) and the recent establishment of the Materials Genome Initiative (MGI). This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic and/or multi-scale models requires both the processing of large volumes of test data and the complex materials data necessary to establish processing-microstructure-property-performance relationships. Fortunately, material information management systems have kept pace with the growing user demands and evolved to enable: (i) the capture of both pointwise data and full spectra of raw data curves; (ii) data management functions such as access, version, and quality controls; (iii) a wide range of data import, export and analysis capabilities; (iv) data pedigree traceability mechanisms; (v) data searching, reporting and viewing tools; and (vi) access to the information via a wide range of interfaces. This paper discusses key principles for the development of a robust materials information management system to enable the connections at various length scales to be made between experimental data and corresponding multiscale modeling toolsets to enable ICME. In particular, NASA Glenn's efforts towards establishing such a database for capturing the constitutive modeling behavior of both monolithic and composite materials are described.

  6. Optimization-Based Robust Nonlinear Control

    DTIC Science & Technology

    2006-08-01

    ABSTRACT New control algorithms were developed for robust stabilization of nonlinear dynamical systems. Novel, linear matrix inequality-based synthesis...was to further advance optimization-based robust nonlinear control design, for general nonlinear systems (especially in discrete time), for linear...Teel, IEEE Transactions on Control Systems Technology, vol. 14, no. 3, p. 398-407, May 2006. 3. "A unified framework for input-to-state stability in

  7. Distribution path robust optimization of electric vehicle with multiple distribution centers

    PubMed Central

    Hao, Wei; He, Ruichun; Jia, Xiaoyan; Pan, Fuquan; Fan, Jing; Xiong, Ruiqi

    2018-01-01

    To identify electric vehicle (EV) distribution paths with high robustness, insensitivity to uncertainty factors, and detailed road-by-road schemes, optimization of the distribution path problem for EVs with multiple distribution centers, considering charging facilities, is necessary. With minimum transport time as the goal, a robust optimization model of EV distribution paths with adjustable robustness is established based on Bertsimas' theory of robust discrete optimization. An enhanced three-segment genetic algorithm is also developed to solve the model, such that the optimal distribution scheme contains all road-by-road path data from the outset, using the three-segment mixed coding and decoding method. During genetic manipulation, different crossover and mutation operations are carried out on different chromosomes, while during population evolution infeasible solutions are naturally avoided. A part of the road network of Xifeng District in Qingyang City is taken as an example to test the model and the algorithm, and concrete transportation paths are produced in the final distribution scheme. More robust EV distribution paths with multiple distribution centers can thus be obtained using the robust optimization model. PMID:29518169
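    The budget-of-uncertainty idea from Bertsimas' robust discrete optimization can be illustrated on a toy network: with budget Γ, the adversary may push at most Γ arcs on a path to their worst-case travel time. The arc data below are invented, and the paper's three-segment genetic algorithm is not reproduced; tiny instances can simply be enumerated.

    ```python
    import networkx as nx

    def robust_cost(path_edges, gamma):
        # Nominal time plus the gamma largest deviations along the path.
        nominal = sum(t for t, _ in path_edges)
        worst = sorted((d for _, d in path_edges), reverse=True)[:gamma]
        return nominal + sum(worst)

    g = nx.DiGraph()
    for u, v, t, d in [("depot", "a", 5, 3), ("depot", "b", 6, 1),
                       ("a", "cust", 4, 4), ("b", "cust", 4, 1), ("a", "b", 1, 0)]:
        g.add_edge(u, v, t=t, d=d)   # t = nominal time, d = worst-case deviation

    for gamma in (0, 1, 2):          # larger Gamma = more conservative routing
        best = min(nx.all_simple_paths(g, "depot", "cust"),
                   key=lambda p: robust_cost([(g[u][v]["t"], g[u][v]["d"])
                                              for u, v in zip(p, p[1:])], gamma))
        print(gamma, best)           # the chosen route changes as Gamma grows
    ```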

  8. An adaptive discontinuous Galerkin solver for aerodynamic flows

    NASA Astrophysics Data System (ADS)

    Burgess, Nicholas K.

    This work considers the accuracy, efficiency, and robustness of an unstructured high-order accurate discontinuous Galerkin (DG) solver for computational fluid dynamics (CFD). Recently, there has been a drive to reduce the discretization error of CFD simulations using high-order methods on unstructured grids. However, high-order methods are often criticized for lacking robustness and having high computational cost. The goal of this work is to investigate methods that enhance the robustness of high-order discontinuous Galerkin (DG) methods on unstructured meshes, while maintaining low computational cost and high accuracy of the numerical solutions. This work investigates robustness enhancement of high-order methods by examining effective non-linear solvers, shock capturing methods, turbulence model discretizations and adaptive refinement techniques. The goal is to develop an all-encompassing solver that can simulate a large range of physical phenomena, where all aspects of the solver work together to achieve a robust, efficient and accurate solution strategy. The components and framework for a robust high-order accurate solver that is capable of solving viscous, Reynolds Averaged Navier-Stokes (RANS) and shocked flows are presented. In particular, this work discusses robust discretizations of the turbulence model equation used to close the RANS equations, as well as stable shock capturing strategies that are applicable across a wide range of discretization orders and applicable to very strong shock waves. Furthermore, refinement techniques are considered as both efficiency and robustness enhancement strategies. Additionally, efficient non-linear solvers based on multigrid and Krylov subspace methods are presented. The accuracy, efficiency, and robustness of the solver are demonstrated using a variety of challenging aerodynamic test problems, which include turbulent high-lift and viscous hypersonic flows. Adaptive mesh refinement was found to play a critical role in obtaining a robust and efficient high-order accurate flow solver. A goal-oriented error estimation technique has been developed to estimate the discretization error of simulation outputs. For high-order discretizations, it is shown that functional output error super-convergence can be obtained, provided the discretization satisfies a property known as dual consistency. The dual consistency of the DG methods developed in this work is shown via mathematical analysis and numerical experimentation. Goal-oriented error estimation is also used to drive an hp-adaptive mesh refinement strategy, where a combination of mesh or h-refinement and order or p-enrichment is employed based on the smoothness of the solution. The results demonstrate that the combination of goal-oriented error estimation and hp-adaptation yields superior accuracy, as well as enhanced robustness and efficiency, for a variety of aerodynamic flows including flows with strong shock waves. This work demonstrates that DG discretizations can be the basis of an accurate, efficient, and robust CFD solver. Furthermore, enhancing the robustness of DG methods does not adversely impact the accuracy or efficiency of the solver for challenging and complex flow problems. In particular, when considering the computation of shocked flows, this work demonstrates that the available shock capturing techniques are sufficiently accurate and robust, particularly when used in conjunction with adaptive mesh refinement.
This work also demonstrates that robust solutions of the Reynolds Averaged Navier-Stokes (RANS) and turbulence model equations can be obtained for complex and challenging aerodynamic flows. In this context, the most robust strategy was determined to be a low-order turbulence model discretization coupled to a high-order discretization of the RANS equations. Although RANS solutions using high-order accurate discretizations of the turbulence model were obtained, the behavior of current-day RANS turbulence models discretized to high order was found to be problematic, leading to solver robustness issues. This suggests that future work is warranted in the area of turbulence model formulation for use with high-order discretizations. Alternatively, the use of Large-Eddy Simulation (LES) subgrid-scale models with high-order DG methods offers the potential to leverage the high accuracy of these methods for very high fidelity turbulent simulations. This thesis has developed the algorithmic improvements that will lay the foundation for the development of a three-dimensional high-order flow solution strategy that can be used as the basis for future LES simulations.

  9. Synthesis Methods for Robust Passification and Control

    NASA Technical Reports Server (NTRS)

    Kelkar, Atul G.; Joshi, Suresh M. (Technical Monitor)

    2000-01-01

    The research effort under this cooperative agreement has been essentially a continuation of the work from previous grants. The ongoing work has primarily focused on developing passivity-based control techniques for Linear Time-Invariant (LTI) systems. During this period, significant progress has been made in the area of passivity-based control of LTI systems, and some preliminary results have been obtained for nonlinear systems as well. The prior work addressed optimal control design for inherently passive as well as non-passive linear systems. To exploit the robustness characteristics of passivity-based controllers, a passification methodology was developed for LTI systems that are not inherently passive. Various methods of passification were first proposed and then further developed. The robustness of passification was addressed for multi-input multi-output (MIMO) systems for certain classes of uncertainties using frequency-domain methods. For MIMO systems, a state-space approach using a Linear Matrix Inequality (LMI)-based formulation was presented for the passification of non-passive LTI systems. An LMI-based robust passification technique was presented for systems with redundant actuators and sensors. The redundancy in actuators and sensors was used effectively for robust passification using the LMI formulation. The passification was designed to be robust to interval-type uncertainties in system parameters. The passification techniques were used to design a robust controller for the Benchmark Active Control Technology wing under parametric uncertainties. The results on passive nonlinear systems, however, are very limited to date. Our recent work in this area was presented, wherein some stability results were obtained for passive nonlinear systems that are affine in control.
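    The flavor of LMI that appears in passivity work can be sketched with CVXPY via the positive-real lemma: an LTI system (A, B, C, D) is passive if the matrix inequality below is feasible for some P ≻ 0. The state-space data are invented, and this is the plain passivity test, not the paper's redundant-actuator/sensor passification formulation.

    ```python
    import numpy as np
    import cvxpy as cp

    A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # invented stable LTI system
    B = np.array([[0.0], [1.0]])
    C = np.array([[1.0, 1.0]])
    D = np.array([[0.1]])

    n = A.shape[0]
    P = cp.Variable((n, n), symmetric=True)
    # Positive-real lemma LMI: feasibility with P > 0 certifies passivity.
    lmi = cp.bmat([[A.T @ P + P @ A, P @ B - C.T],
                   [B.T @ P - C,     -(D + D.T)]])
    prob = cp.Problem(cp.Minimize(0), [P >> 1e-6 * np.eye(n), lmi << 0])
    prob.solve()
    print("passivity certificate found:", prob.status == "optimal")
    ```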

  10. External skeletal robusticity of children and adolescents - European references from birth to adulthood and international comparisons.

    PubMed

    Mumm, Rebekka; Godina, Elena; Koziel, Slawomir; Musalek, Martin; Sedlak, Petr; Wittwer-Backofen, Ursula; Hesse, Volker; Dasgupta, Parasmani; Henneberg, Maciej; Scheffler, Christiane

    2018-06-11

    Background: In the modern world, lifestyles have changed in terms of nutritional and activity behaviour. As a consequence, parallel trends of an epidemic of overweight and a decline in external skeletal robusticity are observed in children and adolescents. Aim: We aim to develop reference centiles for the external skeletal robusticity of European girls and boys aged 0 to 18 years, using the Frame Index as an indicator, and to identify population-specific age-related patterns. Methods: We analysed cross-sectional and longitudinal data on body height and elbow breadth of boys and girls from Europe (0-18 years, n = 41,679), India (7-18 years, n = 3,297) and South Africa (3-18 years, n = 4,346). As an indicator of external skeletal robusticity, the Frame Index after Frisancho (1990) was used. We developed centiles for boys and girls using the LMS method and its extension. Results: Boys have greater external skeletal robusticity than girls. Whereas the Frame Index decreases continuously during growth in girls, an increase from 12 to 16 years can be observed in European boys. Indian and South African boys are similar in Frame Index to European boys. In girls, the pattern is slightly different: whereas South African girls are similar to European girls, Indian girls show lesser external skeletal robusticity. Conclusion: Accurate references for external skeletal robusticity are needed to evaluate whether skeletal development is adequate for age. They should be used to monitor the effects of changes in lifestyle and physical activity levels in children and adolescents, to avoid negative health outcomes such as osteoporosis and arthrosis.
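    The Frame Index after Frisancho (1990) is elbow breadth (mm) over body height (cm), scaled by 100, and the LMS method turns a measurement into a z-score/centile via the L, M and S values at a given age. The L/M/S numbers below are placeholders, not the published reference values.

    ```python
    import numpy as np
    from scipy.stats import norm

    def frame_index(elbow_breadth_mm, height_cm):
        return 100.0 * elbow_breadth_mm / height_cm

    def lms_zscore(x, L, M, S):
        # Box-Cox style LMS z-score; L = 0 reduces to the log-normal case.
        return ((x / M) ** L - 1.0) / (L * S) if L != 0 else np.log(x / M) / S

    fi = frame_index(elbow_breadth_mm=62.0, height_cm=160.0)   # = 38.75
    z = lms_zscore(fi, L=0.4, M=40.0, S=0.08)                   # placeholder L, M, S
    print(f"Frame Index = {fi:.2f}, centile = {100 * norm.cdf(z):.1f}")
    ```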

  11. Investigating the Impact of Off-Nominal Events on High-Density "Green" Arrivals

    NASA Technical Reports Server (NTRS)

    Callatine, Todd J.; Cabrall, Christopher; Kupfer, Michael; Martin, Lynne; Mercer, Joey; Palmer, Everett A.

    2012-01-01

    Trajectory-based controller tools developed to support a schedule-based terminal-area air traffic management (ATM) concept have been shown effective for enabling green arrivals along Area Navigation (RNAV) routes in moderately high-density traffic conditions. A recent human-in-the-loop simulation investigated the robustness of the concept and tools to off-nominal events, i.e., events that lead to situations in which runway arrival schedules require adjustments and controllers can no longer use speed control alone to impose the necessary delays. Study participants included a terminal-area Traffic Management Supervisor responsible for adjusting the schedules. Sector-controller participants could issue alternate RNAV transition routes to absorb large delays. The study also included real-time winds and wind-forecast changes. The results indicate that arrival spacing accuracy, schedule conformance, and tool usage and usefulness are similar to those observed in simulations of nominal operations. However, the time and effort required to recover from an off-nominal event is highly context-sensitive, and is impacted by the required schedule adjustments and the control methods available for managing the evolving situation. The research suggests ways to bolster the off-nominal recovery process, and highlights challenges related to using human-in-the-loop simulation to investigate the safety and robustness of advanced ATM concepts.

  12. The Exoplanet Characterization ToolKit (ExoCTK)

    NASA Astrophysics Data System (ADS)

    Stevenson, Kevin; Fowler, Julia; Lewis, Nikole K.; Fraine, Jonathan; Pueyo, Laurent; Valenti, Jeff; Bruno, Giovanni; Filippazzo, Joseph; Hill, Matthew; Batalha, Natasha E.; Bushra, Rafia

    2018-01-01

    The success of exoplanet characterization depends critically on a patchwork of analysis tools and spectroscopic libraries that currently require extensive development and lack a centralized support system. Due to the complexity of spectroscopic analyses and initial time commitment required to become productive, there are currently a limited number of teams that are actively advancing the field. New teams with significant expertise, but without the proper tools, face prohibitively steep hills to climb before they can contribute. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface focused primarily on atmospheric characterization of exoplanets and exoplanet transit observation planning with JWST. The foundation of these software tools and libraries exist within pockets of the exoplanet community. Our project will gather these seedling tools and grow a robust, uniform, and well maintained exoplanet characterization toolkit.

  13. Robust control algorithms for Mars aerobraking

    NASA Technical Reports Server (NTRS)

    Shipley, Buford W., Jr.; Ward, Donald T.

    1992-01-01

    Four atmospheric guidance concepts have been adapted to control an interplanetary vehicle aerobraking in the Martian atmosphere. The first two offer improvements to the Analytic Predictor Corrector (APC) to increase its robustness to density variations. The second two are variations of a new Liapunov tracking exit-phase algorithm, developed to guide the vehicle along a reference trajectory. These four new controllers are tested using a six-degree-of-freedom computer simulation to evaluate their robustness. MARSGRAM is used to develop realistic atmospheres for the study. When square-wave density pulses perturb the atmosphere, all four controllers are successful. The algorithms are tested against atmospheres where the inbound and outbound density functions are different. Square-wave density pulses are again used, but only for the outbound leg of the trajectory. Additionally, sine waves are used to perturb the density function. The new algorithms are found to be more robust than any previously tested, and a Liapunov controller is selected as the most robust of the control algorithms examined.

  14. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    PubMed

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To promote such modeling, we previously developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors, and analyzes its robustness. To enhance the usability of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.

  15. A subagging regression method for estimating the qualitative and quantitative state of groundwater

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Park, E.; Choi, J.; Han, W. S.; Yun, S. T.

    2016-12-01

    A subagging regression (SBR) method for the analysis of groundwater data, pertaining to trend estimation and the associated uncertainty, is proposed. The SBR method is validated against synthetic data in comparison with other conventional robust and non-robust methods. The results verify that the estimation accuracies of the SBR method are consistent and superior to those of the other methods, and that the uncertainties are reasonably estimated where the others offer no uncertainty analysis option. For further validation, real quantitative and qualitative data are employed and analyzed comparatively with Gaussian process regression (GPR). For all cases, the trend and the associated uncertainties are reasonably estimated by SBR, whereas GPR has limitations in representing the variability of non-Gaussian skewed data. From these implementations, it is determined that the SBR method has the potential to be further developed as an effective tool for anomaly detection or outlier identification in groundwater state data.
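    Subagging (subsample aggregating) fits each ensemble member on a random subsample drawn without replacement and averages the members; scikit-learn's bagging machinery expresses this directly, as sketched below on synthetic groundwater-like data (the paper's SBR estimator details may differ).

    ```python
    import numpy as np
    from sklearn.ensemble import BaggingRegressor

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 200).reshape(-1, 1)      # time (years)
    head = (12.0 - 0.15 * t.ravel()                     # declining trend
            + 0.5 * np.sin(2 * np.pi * t.ravel() / 2.5)  # seasonal signal
            + 0.3 * rng.standard_t(df=3, size=200))      # heavy-tailed noise

    # bootstrap=False + max_samples < 1.0 = subsampling without replacement.
    sbr = BaggingRegressor(n_estimators=300, max_samples=0.5,
                           bootstrap=False, random_state=0).fit(t, head)

    trend = sbr.predict(t)
    # Ensemble spread as a simple uncertainty band around the trend estimate.
    spread = np.stack([e.predict(t) for e in sbr.estimators_]).std(axis=0)
    print(f"trend at t=5: {trend[100]:.2f} +/- {spread[100]:.2f}")
    ```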

  16. Exploiting structure: Introduction and motivation

    NASA Technical Reports Server (NTRS)

    Xu, Zhong Ling

    1994-01-01

    This annual report summarizes the research activities that were performed from 26 Jun. 1993 to 28 Feb. 1994. We continued to investigate the robust stability of systems whose transfer functions or characteristic polynomials are affine multilinear functions of parameters. An approach that differs from 'Stability by Linear Process' and that reduces the computational burden of checking the robust stability of systems with multilinear uncertainty was found for low-order (second- and third-order) cases. We proved a crucial theorem, the so-called Face Theorem. Previously, we had proven Kharitonov's Vertex Theorem and the Edge Theorem by Bartlett. The details of this proof are contained in the Appendix. This theorem provides a tool to describe the boundary of the image of the affine multilinear function. For SPR design, we have developed some new results. The third objective for this period is to design a controller for IHM by the H-infinity optimization technique. The details are presented in the Appendix.

  17. Effective Tools and Resources from the MAVEN Education and Public Outreach Program

    NASA Astrophysics Data System (ADS)

    Mason, T.

    2015-12-01

    Since 2010, NASA's Mars Atmosphere and Volatile Evolution (MAVEN) Education and Public Outreach (E/PO) team has developed and implemented a robust and varied suite of projects, serving audiences of all ages and diverse backgrounds from across the country. With a program designed to reach formal K-12 educators and students, afterschool and summertime communities, museum docents, journalists, and online audiences, we have incorporated an equally varied approach to developing tools, resources, and evaluation methods to specifically reach each target population and to determine the effectiveness of our efforts. This poster will highlight some of the tools and resources we have developed to share the complex science and engineering of the MAVEN mission, as well as initial evaluation results and lessons learned from each of our E/PO projects.

  18. Software Hardware Asset Reuse Enterprise (SHARE) Repository Framework: Related Work and Development Plan

    DTIC Science & Technology

    2009-08-19

    designed to collect the data and assist the analyst in drawing relationships between the data. Palantir Technologies has created one such software application to support the DoD intelligence community by providing robust capabilities for managing data from various sources. The Palantir tool...www.palantirtech.com/ [Figure 17: Palantir Graphical Interface (Gordon-Schlosberg, 2008)] Similar examples of the use of ontologies to support data

  19. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  20. The NASA Constellation University Institutes Project: Thrust Chamber Assembly Virtual Institute

    NASA Technical Reports Server (NTRS)

    Tucker, P. Kevin; Rybak, Jeffry A.; Hulka, James R.; Jones, Gregg W.; Nesman, Tomas; West, Jeffrey S.

    2006-01-01

    This paper documents key aspects of the Constellation University Institutes Project (CUIP) Thrust Chamber Assembly (TCA) Virtual Institute (VI). Specifically, the paper details the TCA VI organizational and functional aspects relative to providing support for Constellation Systems. The TCA VI vision is put forth and discussed in detail. The vision provides the objective and approach for improving thrust chamber assembly design methodologies by replacing the current empirical tools with verified and validated CFD codes. The vision also sets out ignition, performance, thermal environments and combustion stability as focus areas where application of these improved tools is required. Flow physics and a study of the Space Shuttle Main Engine development program are used to conclude that the injector is the key to robust TCA design. Requirements are set out in terms of fidelity, robustness and demonstrated accuracy of the design tool. Lack of demonstrated accuracy is noted as the most significant obstacle to realizing the potential of CFD to be widely used as an injector design tool. A hierarchical decomposition process is outlined to facilitate the validation process. A simulation readiness level tool used to gauge progress toward the goal is described. Finally, there is a description of the current efforts in each focus area. The background of each focus area is discussed. The state of the art in each focus area is noted along with the TCA VI research focus in the area. Brief highlights of work in the area are also included.

  1. Decision Support Methods and Tools

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde r.; Sorokach, Michael R.; Burg, Cecile M.

    2006-01-01

    This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed which highlight the application of these methods within current or recent aerospace research at NASA LaRC. Where applicable, commercially available or government-developed software tools are also discussed.

  2. Block-diagonalization as a tool for the robust diabatization of high-dimensional potential energy surfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venghaus, Florian; Eisfeld, Wolfgang, E-mail: wolfgang.eisfeld@uni-bielefeld.de

    2016-03-21

    Robust diabatization techniques are key for the development of high-dimensional coupled potential energy surfaces (PESs) to be used in multi-state quantum dynamics simulations. In the present study we demonstrate that, besides the actual diabatization technique, common problems with the underlying electronic structure calculations can be the reason why a diabatization fails. After giving a short review of the theoretical background of diabatization, we propose a method based on block-diagonalization to analyse the electronic structure data. This analysis tool can be used in three different ways: First, it allows one to detect issues with the ab initio reference data and is used to optimize the setup of the electronic structure calculations. Second, the data from the block-diagonalization are utilized for the development of optimal parametrized diabatic model matrices by identifying the most significant couplings. Third, the block-diagonalization data are used to fit the parameters of the diabatic model, which yields an optimal initial guess for the non-linear fitting required by standard or more advanced energy-based diabatization methods. The new approach is demonstrated by the diabatization of 9 electronic states of the propargyl radical, yielding fully coupled full-dimensional (12D) PESs in closed form.

  3. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so that the phenotype stability of a biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of the functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through the corresponding phenotype robustness criteria from a systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances is also proposed, together with a simulation example. PMID:23515190

  4. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology.

    PubMed

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so that the phenotype stability of a biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of the functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through the corresponding phenotype robustness criteria from a systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances is also proposed, together with a simulation example.

  5. Multiplex social ecological network analysis reveals how social changes affect community robustness more than resource depletion.

    PubMed

    Baggio, Jacopo A; BurnSilver, Shauna B; Arenas, Alex; Magdanz, James S; Kofinas, Gary P; De Domenico, Manlio

    2016-11-29

    Network analysis provides a powerful tool to analyze complex influences of social and ecological structures on community and household dynamics. Most network studies of social-ecological systems use simple, undirected, unweighted networks. We analyze multiplex, directed, and weighted networks of subsistence food flows collected in three small indigenous communities in Arctic Alaska potentially facing substantial economic and ecological changes. Our analysis of plausible future scenarios suggests that changes to social relations and key households have greater effects on community robustness than changes to specific wild food resources.

  6. An Overview of the Role of Systems Analysis in NASA's Hypersonics Project

    NASA Technical Reports Server (NTRS)

    Robinson, Jeffrey S.; Martin, John G.; Bowles, Jeffrey V.; Mehta, Unmeel B.; Snyder, Christopher A.

    2006-01-01

    NASA's Aeronautics Research Mission Directorate recently restructured its Vehicle Systems Program, refocusing it towards understanding the fundamental physics that govern flight in all speed regimes. Now called the Fundamental Aeronautics Program, it comprises four new projects: Subsonic Fixed Wing, Subsonic Rotary Wing, Supersonics, and Hypersonics. The Aeronautics Research Mission Directorate has charged the Hypersonics Project with developing a basic understanding of all systems that travel at hypersonic speeds within the atmospheres of Earth and other planets. This includes both powered and unpowered systems, such as re-entry vehicles and vehicles powered by rocket or airbreathing propulsion that cruise in and accelerate through the atmosphere. The primary objective of the Hypersonics Project is to develop physics-based predictive tools that enable the design, analysis and optimization of such systems. The Hypersonics Project charges the systems analysis discipline team with providing the decision-making information it needs to properly guide research and technology development. Credible, rapid, and robust multi-disciplinary system analysis processes and design tools are required in order to generate this information. To this end, the principal challenges for the systems analysis team are the introduction of high-fidelity physics into the analysis process and integration into a design environment, quantification of design uncertainty through the use of probabilistic methods, reduction in design cycle time, and the development and implementation of robust processes and tools enabling a wide design space and associated technology assessment capability. This paper will discuss the roles and responsibilities of the systems analysis discipline team within the Hypersonics Project as well as the tools, methods, processes, and approach that the team will undertake in order to perform its project-designated functions.

  7. Developing Optimized Trajectories Derived from Mission and Thermo-Structural Constraints

    NASA Technical Reports Server (NTRS)

    Lear, Matthew H.; McGrath, Brian E.; Anderson, Michael P.; Green, Peter W.

    2008-01-01

    In conjunction with NASA and the Department of Defense, the Johns Hopkins University Applied Physics Laboratory (JHU/APL) has been investigating analytical techniques to address many of the fundamental issues associated with solar exploration spacecraft and high-speed atmospheric vehicle systems. These issues include: thermo-structural response, including the effects of thermal management via the use of surface optical properties for high-temperature composite structures; aerodynamics, with the effects of non-equilibrium chemistry and gas radiation; and aero-thermodynamics, with the effects of material ablation for a wide range of thermal protection system (TPS) materials. The need exists to integrate these discrete tools into a common framework that enables the investigation of interdisciplinary interactions (including analysis tool, applied load, and environment uncertainties) to provide high-fidelity solutions. In addition to developing robust tools for the coupling of aerodynamically induced thermal and mechanical loads, JHU/APL has been studying the optimal design of high-speed vehicles as a function of their trajectory. Under traditional design methodology, the optimization of system-level mission parameters such as range and time of flight is performed independently of the optimization for thermal and mechanical constraints such as stress and temperature. A truly optimal trajectory should optimize over the entire range of mission and thermo-mechanical constraints. Under this research, a framework for the robust analysis of high-speed spacecraft and atmospheric vehicle systems has been developed. It is built around a generic, loosely coupled architecture so that a variety of readily available analysis tools can be used. The methodology immediately addresses many of the current analysis inadequacies and allows for future extension to handle more complex problems.

  8. Reliability testing of a portfolio assessment tool for postgraduate family medicine training in South Africa

    PubMed Central

    Mash, Bob; Derese, Anselme

    2013-01-01

    Background: Competency-based education and the validity and reliability of workplace-based assessment of postgraduate trainees have received increasing attention worldwide. Family medicine was recognised as a speciality in South Africa six years ago, and a satisfactory portfolio of learning is a prerequisite to sit the national exit exam. A massive scaling up of the number of family physicians is needed in order to meet the health needs of the country. Aim: The aim of this study was to develop a reliable, robust and feasible portfolio assessment tool (PAT) for South Africa. Methods: Six raters each rated nine portfolios from the Stellenbosch University programme, using the PAT, to test for inter-rater reliability. This rating was repeated three months later to determine test–retest reliability. Following initial analysis and feedback, the PAT was modified and the inter-rater reliability again assessed on nine new portfolios. An acceptable intra-class correlation was considered to be > 0.80. Results: The total score was found to be reliable, with a coefficient of 0.92. For test–retest reliability, the difference in mean total score was 1.7%, which was not statistically significant. Amongst the subsections, only assessment of the educational meetings and the logbook showed reliability coefficients > 0.80. Conclusion: This was the first attempt to develop a reliable, robust and feasible national portfolio assessment tool to assess postgraduate family medicine training in the South African context. The tool was reliable for the total score, but the low reliability of several sections in the PAT helped us to develop 12 recommendations regarding the use of the portfolio, the design of the PAT and the training of raters.
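
    As a minimal sketch of the reliability machinery behind such a study, the following computes a two-way random-effects, absolute-agreement intra-class correlation, ICC(2,1); the score matrix and the choice of ICC variant are assumptions for illustration, not data or choices from the paper:

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """Two-way random-effects, absolute-agreement ICC(2,1) (Shrout & Fleiss).
    `scores` is an (n_targets, k_raters) array."""
    n, k = scores.shape
    grand = scores.mean()
    rows = scores.mean(axis=1)                            # per-portfolio means
    cols = scores.mean(axis=0)                            # per-rater means
    msr = k * np.sum((rows - grand) ** 2) / (n - 1)       # between targets
    msc = n * np.sum((cols - grand) ** 2) / (k - 1)       # between raters
    resid = scores - rows[:, None] - cols[None, :] + grand
    mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))        # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical illustration: 9 portfolios, each scored by 6 raters.
rng = np.random.default_rng(0)
true_quality = rng.normal(70, 10, size=(9, 1))
ratings = true_quality + rng.normal(0, 3, size=(9, 6))    # rater noise
print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")               # high agreement expected
```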

  9. Robust Control Systems.

    DTIC Science & Technology

    1981-12-01

    time control system algorithms that will perform adequately (i.e., at least maintain closed-loop system stability) when uncertain parameters in the...system design models vary significantly. Such a control algorithm is said to have stability robustness, or more simply is said to be "robust". This...cases above, the performance is analyzed using a covariance analysis. The development of all the controllers and the performance analysis algorithms is

  10. Impact of Spot Size and Spacing on the Quality of Robustly Optimized Intensity Modulated Proton Therapy Plans for Lung Cancer.

    PubMed

    Liu, Chenbin; Schild, Steven E; Chang, Joe Y; Liao, Zhongxing; Korte, Shawn; Shen, Jiajian; Ding, Xiaoning; Hu, Yanle; Kang, Yixiu; Keole, Sameer R; Sio, Terence T; Wong, William W; Sahoo, Narayan; Bues, Martin; Liu, Wei

    2018-06-01

    To investigate how spot size and spacing affect plan quality, robustness, and interplay effects of robustly optimized intensity modulated proton therapy (IMPT) for lung cancer. Two robustly optimized IMPT plans were created for 10 lung cancer patients: first by a large-spot machine with in-air energy-dependent large spot size at isocenter (σ: 6-15 mm) and spacing (1.3 σ), and second by a small-spot machine with in-air energy-dependent small spot size (σ: 2-6 mm) and spacing (5 mm). Both plans were generated by optimizing radiation dose to the internal target volume on averaged 4-dimensional computed tomography scans using an in-house-developed IMPT planning system. The dose-volume histogram band method was used to evaluate plan robustness. Dose evaluation software was developed to model time-dependent spot delivery to incorporate interplay effects with randomized starting phases for each field per fraction. Patient anatomy voxels were mapped phase-to-phase via deformable image registration, and doses were scored using in-house-developed software. Dose-volume histogram indices, including internal target volume dose coverage, homogeneity, and organ-at-risk (OAR) sparing, were compared using the Wilcoxon signed-rank test. Compared with the large-spot machine, the small-spot machine resulted in significantly lower heart and esophagus mean doses, with comparable target dose coverage, homogeneity, and protection of other OARs. Plan robustness was comparable for targets and most OARs. With interplay effects considered, significantly lower heart and esophagus mean doses with comparable target dose coverage and homogeneity were observed using smaller spots. Robust optimization with a small-spot machine significantly improves heart and esophagus sparing, with comparable plan robustness and interplay effects compared with robust optimization with a large-spot machine. A small-spot machine uses a larger number of spots to cover the same tumors compared with a large-spot machine, which gives the planning system more freedom to compensate for the higher sensitivity to uncertainties and interplay effects for lung cancer treatments. Copyright © 2018 Elsevier Inc. All rights reserved.
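
    The paired, non-parametric comparison named above is easy to reproduce in outline; the sketch below applies SciPy's Wilcoxon signed-rank test to entirely hypothetical paired mean-heart-dose values:

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired mean heart doses (Gy) for 10 patients, planned on a
# large-spot vs a small-spot machine.
large_spot = np.array([9.1, 7.4, 11.2, 6.8, 8.9, 10.3, 7.7, 9.6, 8.2, 10.8])
small_spot = np.array([7.9, 6.6, 10.1, 6.1, 7.8, 9.2, 7.0, 8.8, 7.5, 9.7])

stat, p = wilcoxon(large_spot, small_spot)   # paired, non-parametric test
print(f"Wilcoxon signed-rank: W={stat:.1f}, p={p:.4f}")
```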

  11. Info-gap robust-satisficing model of foraging behavior: do foragers optimize or satisfice?

    PubMed

    Carmel, Yohay; Ben-Haim, Yakov

    2005-11-01

    In this note we compare two mathematical models of foraging that reflect two competing theories of animal behavior: optimizing and robust satisficing. The optimal-foraging model is based on the marginal value theorem (MVT). The robust-satisficing model developed here is an application of info-gap decision theory. The info-gap robust-satisficing model relates to the same circumstances described by the MVT. We show how these two alternatives translate into specific predictions that at some points are quite disparate. We test these alternative predictions against available data collected in numerous field studies with a large number of species from diverse taxonomic groups. We show that a large majority of studies appear to support the robust-satisficing model and reject the optimal-foraging model.
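
    The marginal value theorem prescribes leaving a patch when the instantaneous gain rate drops to the long-run average rate, i.e. when g'(t) = g(t)/(T + t). A minimal numerical sketch, assuming a hypothetical saturating gain function not taken from the paper:

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative MVT setup (all functions and parameters hypothetical):
# cumulative gain in a patch g(t) with diminishing returns, travel time T.
G_MAX, H, TRAVEL = 100.0, 2.0, 8.0

def gain(t):        # cumulative energy gained after t time units in the patch
    return G_MAX * t / (t + H)

def marginal(t):    # instantaneous gain rate g'(t)
    return G_MAX * H / (t + H) ** 2

# MVT: leave when the marginal rate equals the long-run average rate,
# i.e. solve g'(t) * (T + t) - g(t) = 0.
t_star = brentq(lambda t: marginal(t) * (TRAVEL + t) - gain(t), 1e-6, 1e3)
print(f"optimal patch residence time t* = {t_star:.2f}")
# For this gain function the closed form is sqrt(H * TRAVEL) = 4.0,
# which the numeric root reproduces.
```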

  12. Analysis and improvements of Adaptive Particle Refinement (APR) through CPU time, accuracy and robustness considerations

    NASA Astrophysics Data System (ADS)

    Chiron, L.; Oger, G.; de Leffe, M.; Le Touzé, D.

    2018-02-01

    While smoothed-particle hydrodynamics (SPH) simulations are usually performed using uniform particle distributions, local particle refinement techniques have been developed to concentrate fine spatial resolutions in identified areas of interest. Although the formalism of this method is relatively easy to implement, its robustness at coarse/fine interfaces can be problematic. Analysis performed in [16] shows that the radius of refined particles should be greater than half the radius of unrefined particles to ensure robustness. In this article, the basics of an Adaptive Particle Refinement (APR) technique, inspired by AMR in mesh-based methods, are presented. This approach ensures robustness with alleviated constraints. Simulations applying the proposed formalism achieve accuracy comparable to fully refined spatial resolutions, together with robustness, low CPU time and maintained parallel efficiency.

  13. Integrating uncertainty into public energy research and development decisions

    NASA Astrophysics Data System (ADS)

    Anadón, Laura Díaz; Baker, Erin; Bosetti, Valentina

    2017-05-01

    Public energy research and development (R&D) is recognized as a key policy tool for transforming the world's energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain.

  14. Robust Light Filters Support Powerful Imaging Devices

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Infrared (IR) light filters developed by Lake Shore Cryotronics Inc. of Westerville, Ohio -- using SBIR funding from NASA's Jet Propulsion Laboratory and Langley Research Center -- employ porous silicon and metal mesh technology to provide optical filtration even at the ultra-low temperatures required by many IR sensors. With applications in the astronomy community, Lake Shore's SBIR-developed filters are also promising tools for use in terahertz imaging, the next wave of technology for applications like medical imaging, the study of fragile artworks, and airport security.

  15. Modal control theory and application to aircraft lateral handling qualities design

    NASA Technical Reports Server (NTRS)

    Srinathkumar, S.

    1978-01-01

    A multivariable synthesis procedure based on eigenvalue/eigenvector assignment is reviewed and is employed to develop a systematic design procedure to meet the lateral handling qualities design objectives of a fighter aircraft over a wide range of flight conditions. The closed loop modal characterization developed provides significant insight into the design process and plays a pivotal role in the synthesis of robust feedback systems. The simplicity of the synthesis algorithm yields an efficient computer aided interactive design tool for flight control system synthesis.
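
    Eigenvalue assignment of this kind can be prototyped with SciPy's place_poles; the state-space matrices below are illustrative placeholders, not the fighter-aircraft model from the study:

```python
import numpy as np
from scipy.signal import place_poles

# Hypothetical 2-state lateral-dynamics fragment (matrices are illustrative).
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])

# Assign closed-loop eigenvalues chosen for desired damping and frequency.
desired = np.array([-3.0 + 2.0j, -3.0 - 2.0j])
res = place_poles(A, B, desired)
K = res.gain_matrix
print("feedback gain K =", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```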

  16. Advanced Design Methodology for Robust Aircraft Sizing and Synthesis

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    1997-01-01

    Contract efforts are focused on refining the Robust Design Methodology for Conceptual Aircraft Design. Robust Design Simulation (RDS) was developed earlier as a potential solution to the need to perform rapid trade-offs while accounting for risk, conflict, and uncertainty. The core of the simulation revolves around Response Surface Equations as approximations of bounded design spaces. An ongoing investigation is concerned with the advantages of using Neural Networks in conceptual design. Thought was also given to the development of a systematic way to choose or create a baseline configuration based on specific mission requirements. An expert system was developed that selects aerodynamics, performance and weights models from several configurations based on the user's mission requirements for a subsonic civil transport. The research has also resulted in a step-by-step illustration of how to use the AMV method for distribution generation and the search for robust design solutions to multivariate constrained problems.
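
    At its simplest, a Response Surface Equation is a quadratic polynomial fitted to sampled design points, which then stands in for the expensive analysis inside trade-off loops. A minimal sketch with synthetic data (all numbers hypothetical):

```python
import numpy as np

# Fit a quadratic response surface y ~ b0 + b1*x1 + b2*x2 + b3*x1^2
# + b4*x2^2 + b5*x1*x2 to sampled design points.
rng = np.random.default_rng(1)
x1, x2 = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 50)   # scaled design vars
y = 5 + 2 * x1 - x2 + 0.8 * x1**2 + 0.3 * x1 * x2 + rng.normal(0, 0.05, 50)

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("RSE coefficients:", np.round(coef, 3))

# The cheap polynomial surrogate can now replace the expensive analysis
# code inside Monte Carlo or optimization loops.
def rse(x1, x2):
    return coef @ np.array([1.0, x1, x2, x1**2, x2**2, x1 * x2])
```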

  17. Medical Imaging Lesion Detection Based on Unified Gravitational Fuzzy Clustering

    PubMed Central

    Vianney Kinani, Jean Marie; Gallegos Funes, Francisco; Mújica Vargas, Dante; Ramos Díaz, Eduardo; Arellano, Alfonso

    2017-01-01

    We develop a swift, robust, and practical tool for detecting brain lesions with minimal user intervention to assist clinicians and researchers in the diagnosis process, radiosurgery planning, and assessment of the patient's response to therapy. We propose a unified gravitational fuzzy clustering-based segmentation algorithm, which integrates the Newtonian concept of gravity into fuzzy clustering. We first perform fuzzy rule-based image enhancement on our database, which comprises T1/T2-weighted magnetic resonance (MR) and fluid-attenuated inversion recovery (FLAIR) images, to facilitate smoother segmentation. The scalar output obtained is fed into a gravitational fuzzy clustering algorithm, which separates healthy structures from the unhealthy. Finally, the lesion contour is automatically outlined through the initialization-free level set evolution method. An advantage of this lesion detection algorithm is its precision and its simultaneous use of features computed from the intensity properties of the MR scan in a cascading pattern, which makes the computation fast, robust, and self-contained. Furthermore, we validate our algorithm with large-scale experiments using clinical and synthetic brain lesion datasets. As a result, an 84%–93% overlap performance is obtained, with an emphasis on robustness with respect to different and heterogeneous types of lesion and a swift computation time. PMID:29158887
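
    Overlap performance of this kind is commonly reported as the Dice coefficient; assuming that convention, a minimal sketch of the metric on toy binary masks:

```python
import numpy as np

def dice(seg: np.ndarray, truth: np.ndarray) -> float:
    """Dice overlap 2|A&B| / (|A|+|B|) between two binary masks."""
    seg, truth = seg.astype(bool), truth.astype(bool)
    denom = seg.sum() + truth.sum()
    return 2.0 * np.logical_and(seg, truth).sum() / denom if denom else 1.0

# Hypothetical 2-D lesion masks for illustration.
truth = np.zeros((64, 64), dtype=bool); truth[20:40, 20:40] = True
seg   = np.zeros((64, 64), dtype=bool); seg[22:42, 21:41] = True
print(f"Dice = {dice(seg, truth):.2f}")
```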

  18. Robustness and cognition in stabilization problem of dynamical systems based on asymptotic methods

    NASA Astrophysics Data System (ADS)

    Dubovik, S. A.; Kabanov, A. A.

    2017-01-01

    The problem of synthesizing stabilizing systems based on principles of cognitive (logical-dynamic) control for mobile objects operating under uncertain conditions is considered. This direction in control theory is based on the principles of guaranteed robust synthesis focused on worst-case scenarios of the controlled process. The guaranteeing approach can provide the required quality and reliability of system operation only under sufficiently small disturbances and in the absence of large deviations from the regular features of the controlled process. The main tool for the analysis of large deviations and the prediction of critical states here is the action functional. Once the forecast is built, the choice of anti-crisis control becomes a supervisory control problem that optimizes the control system in normal mode and prevents the controlled process from escaping into critical states. An essential aspect of the approach presented here is its two-level (logical-dynamic) control: the input data are used not only to generate the synthesized feedback (local robust synthesis) in advance (off-line), but also to make decisions about the current (on-line) quality of stabilization in the global sense. The approach is illustrated by the development of a ship tilting prediction system.
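
    In the Freidlin-Wentzell large-deviations setting, the action functional has a standard form, stated here generically for a diffusion with drift b; the paper's exact functional may differ:

```latex
% One standard (Freidlin-Wentzell) form of the action functional for a
% diffusion dx = b(x) dt + sqrt(eps) dW; paths \varphi with small action
% are the most likely routes into a critical state, so minimizing S over
% paths yields the forecast used by the supervisory level.
\[
  S_{0T}(\varphi) \;=\; \frac{1}{2} \int_0^T
  \bigl\lVert \dot{\varphi}(t) - b\bigl(\varphi(t)\bigr) \bigr\rVert^{2}\,
  \mathrm{d}t .
\]
```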

  19. Design and development of a robust ATP subsystem for the Altair UAV-to-Ground lasercomm 2.5 Gbps demonstration

    NASA Technical Reports Server (NTRS)

    Ortiz, G. G.; Lee, S.; Monacos, S.; Wright, M.; Biswas, A.

    2003-01-01

    A robust acquisition, tracking and pointing (ATP) subsystem is being developed for the 2.5 Gigabit per second (Gbps) Unmanned-Aerial-Vehicle (UAV) to ground free-space optical communications link project.

  20. Borehole Tool for the Comprehensive Characterization of Hydrate-bearing Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Sheng; Santamarina, J. Carlos

    Reservoir characterization and simulation require reliable parameters to anticipate hydrate deposit responses and production rates. The acquisition of the required fundamental properties currently relies on wireline logging, pressure core testing, and/or laboratory observations of synthesized specimens, which are challenged by testing capabilities and innate sampling disturbances. The project reviews hydrate-bearing sediment properties and inherent sampling effects, which are lessened by developments in pressure core technology, in order to develop robust correlations with index parameters. The resulting information is incorporated into a tool for optimal field characterization and parameter selection with uncertainty analyses. Ultimately, the project develops a borehole tool for the comprehensive characterization of hydrate-bearing sediments in situ, with a design that recognizes past developments and characterization experience and draws inspiration from nature and sensor miniaturization.

  1. Aircraft ride quality controller design using new robust root clustering theory for linear uncertain systems

    NASA Technical Reports Server (NTRS)

    Yedavalli, R. K.

    1992-01-01

    Controller design for improving the ride quality of aircraft, in terms of damping ratio and natural frequency specifications on the short-period dynamics, is addressed. The controller is designed to be robust with respect to uncertainties in the real parameters of the control design model, such as uncertainties in the dimensional stability derivatives, imperfections in actuator/sensor locations, and possibly variations in flight conditions. The design is based on a new robust root clustering theory developed by the author by extending the nominal root clustering theory of Gutman and Jury to perturbed matrices. The proposed methodology yields an explicit relationship between the parameters of the root clustering region and the uncertainty radius of the parameter space. The current literature on robust stability becomes a special case of this unified theory. The bounds derived on the parameter perturbations for robust root clustering are then used in selecting the robust controller.

  2. A robust multi-objective global supplier selection model under currency fluctuation and price discount

    NASA Astrophysics Data System (ADS)

    Zarindast, Atousa; Seyed Hosseini, Seyed Mohamad; Pishvaee, Mir Saman

    2017-06-01

    A robust supplier selection problem is proposed in a scenario-based approach for the case in which demand and exchange rates are subject to uncertainty. First, a deterministic multi-objective mixed integer linear programming model is developed; then, the robust counterpart of the proposed model is presented using recent extensions in robust optimization theory. Decision variables are treated, respectively, by a two-stage stochastic planning model, by a robust stochastic optimization planning model that integrates the worst-case scenario in the modeling approach, and finally by an equivalent deterministic planning model. An experimental study is carried out to compare the performances of the three models. The robust model resulted in remarkable cost savings, illustrating that to cope with such uncertainties we should account for them in advance in our planning. In our case study, different suppliers were selected because of these uncertainties, and since supplier selection is a strategic decision, it is crucial to consider such uncertainties in the planning approach.

  3. SU-F-BRB-07: A Plan Comparison Tool to Ensure Robustness and Deliverability in Online-Adaptive Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, P; Labby, Z; Bayliss, R A

    Purpose: To develop a plan comparison tool that will ensure robustness and deliverability through analysis of baseline and online-adaptive radiotherapy plans using similarity metrics. Methods: The ViewRay MRIdian treatment planning system allows export of a plan file that contains plan and delivery information. A software tool was developed to read and compare two plans, providing information and metrics to assess their similarity. In addition to performing direct comparisons (e.g. demographics, ROI volumes, number of segments, total beam-on time), the tool computes and presents histograms of derived metrics (e.g. step-and-shoot segment field sizes, segment average leaf gaps). Such metrics were investigated for their ability to predict that an online-adapted plan is reasonably similar to a baseline plan whose deliverability has already been established. Results: In the realm of online-adaptive planning, comparing ROI volumes offers a sanity check to verify observations found during contouring. Beyond ROI analysis, it has been found that simply editing contours and re-optimizing to adapt treatment can produce a delivery that is substantially different from the baseline plan (e.g. the number of segments increased by 31%), with no changes in optimization parameters and only minor changes in anatomy. Currently the tool can quickly identify large omissions or deviations from baseline expectations. As our online-adaptive patient population increases, we will continue to develop and refine quantitative acceptance criteria for adapted plans and relate them to historical delivery QA measurements. Conclusion: The plan comparison tool is in clinical use and reports a wide range of comparison metrics, illustrating key differences between two plans. This independent check is accomplished in seconds and can be performed in parallel to other tasks in the online-adaptive workflow. Current use prevents large planning or delivery errors from occurring, and ongoing refinements will lead to increased assurance of plan quality.

  4. Robust Variable Selection with Exponential Squared Loss.

    PubMed

    Wang, Xueqin; Jiang, Yunlu; Huang, Mian; Zhang, Heping

    2013-04-01

    Robust variable selection procedures through penalized regression have been gaining increased attention in the literature. They can be used to perform variable selection and are expected to yield robust estimates. However, to the best of our knowledge, the robustness of those penalized regression procedures has not been well characterized. In this paper, we propose a class of penalized robust regression estimators based on exponential squared loss. The motivation for this new procedure is that it enables us to characterize its robustness in a way that has not been done for the existing procedures, while its performance is near optimal and superior to some recently developed methods. Specifically, under defined regularity conditions, our estimators are n-consistent and possess the oracle property. Importantly, we show that our estimators can achieve the highest asymptotic breakdown point of 1/2 and that their influence functions are bounded with respect to the outliers in either the response or the covariate domain. We performed simulation studies to compare our proposed method with some recent methods, using the oracle method as the benchmark. We consider common sources of influential points. Our simulation studies reveal that our proposed method performs similarly to the oracle method in terms of the model error and the positive selection rate even in the presence of influential points. In contrast, other existing procedures have a much lower non-causal selection rate. Furthermore, we re-analyze the Boston Housing Price Dataset and the Plasma Beta-Carotene Level Dataset that are commonly used examples for regression diagnostics of influential points. Our analysis unravels the discrepancies of using our robust method versus the other penalized regression method, underscoring the importance of developing and applying robust penalized regression methods.

  5. Robust Variable Selection with Exponential Squared Loss

    PubMed Central

    Wang, Xueqin; Jiang, Yunlu; Huang, Mian; Zhang, Heping

    2013-01-01

    Robust variable selection procedures through penalized regression have been gaining increased attention in the literature. They can be used to perform variable selection and are expected to yield robust estimates. However, to the best of our knowledge, the robustness of those penalized regression procedures has not been well characterized. In this paper, we propose a class of penalized robust regression estimators based on exponential squared loss. The motivation for this new procedure is that it enables us to characterize its robustness in a way that has not been done for the existing procedures, while its performance is near optimal and superior to some recently developed methods. Specifically, under defined regularity conditions, our estimators are n-consistent and possess the oracle property. Importantly, we show that our estimators can achieve the highest asymptotic breakdown point of 1/2 and that their influence functions are bounded with respect to the outliers in either the response or the covariate domain. We performed simulation studies to compare our proposed method with some recent methods, using the oracle method as the benchmark. We consider common sources of influential points. Our simulation studies reveal that our proposed method performs similarly to the oracle method in terms of the model error and the positive selection rate even in the presence of influential points. In contrast, other existing procedures have a much lower non-causal selection rate. Furthermore, we re-analyze the Boston Housing Price Dataset and the Plasma Beta-Carotene Level Dataset that are commonly used examples for regression diagnostics of influential points. Our analysis unravels the discrepancies of using our robust method versus the other penalized regression method, underscoring the importance of developing and applying robust penalized regression methods. PMID:23913996
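
    A minimal sketch of regression under the exponential squared loss (omitting the penalty term; the data, tuning constant gamma, and optimizer choice are assumptions for illustration) shows the outlier resistance described above:

```python
import numpy as np
from scipy.optimize import minimize

# Exponential squared loss rho(r) = 1 - exp(-r^2 / gamma); smaller gamma
# downweights outliers more aggressively. Data below are hypothetical.
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(0, 0.3, 100)
y[:5] += 15.0                       # gross outliers in the response

def loss(beta, gamma=1.0):
    r = y - X @ beta
    return np.sum(1.0 - np.exp(-r**2 / gamma))

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
beta_rob = minimize(loss, x0=beta_ols, method="Nelder-Mead").x
print("OLS   :", np.round(beta_ols, 2))   # pulled toward the outliers
print("robust:", np.round(beta_rob, 2))   # close to (1, 2)
```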

  6. Robust geostatistical analysis of spatial data

    NASA Astrophysics Data System (ADS)

    Papritz, A.; Künsch, H. R.; Schwierz, C.; Stahel, W. A.

    2012-04-01

    Most geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are rather the rule than the exception, in particular in environmental data sets. Outlying observations may result from errors (e.g. in data transcription) or from local perturbations in the processes that are responsible for a given pattern of spatial variation. As an example, the spatial distribution of some trace metal in the soils of a region may be distorted by emissions from local anthropogenic sources. Outliers affect the modelling of the large-scale spatial variation, the so-called external drift or trend, the estimation of the spatial dependence of the residual variation, and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) [2] proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) [1] for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of the estimating equations for Gaussian REML estimation. Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled and unsampled locations, and kriging variances. The method has been implemented in an R package. Apart from presenting our modelling framework, we shall present selected simulation results by which we explored the properties of the new method. This will be complemented by an analysis of the Tarrawarra soil moisture data set [3].

  7. Evolutionary algorithm based optimization of hydraulic machines utilizing a state-of-the-art block coupled CFD solver and parametric geometry and mesh generation tools

    NASA Astrophysics Data System (ADS)

    Kyriacou, S.; Kontoleontos, E.; Weissenberger, S.; Mangani, L.; Casartelli, E.; Skouteropoulou, I.; Gattringer, M.; Gehrer, A.; Buchmayr, M.

    2014-03-01

    An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (EASY software), a fast solver (block coupled CFD) and a flexible geometry generation tool. EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA (PCA)) that can be used in both single- (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem specific evaluation, here the solution of Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. This method, apart from being robust and fast, also provides a big gain in terms of computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
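
    A toy sketch of the metamodel-assisted evolution loop: a cheap surrogate, here a simple inverse-distance interpolant over the archive of exact evaluations, screens offspring so that only the most promising reach the expensive solver, for which a quadratic placeholder stands in below (all names and settings are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
expensive = lambda x: np.sum((x - 0.3) ** 2)          # placeholder for a CFD run

def surrogate(x, X, y, eps=1e-12):
    """Inverse-distance-weighted prediction from the archive (X, y)."""
    w = 1.0 / (np.linalg.norm(X - x, axis=1) ** 2 + eps)
    return np.dot(w, y) / w.sum()

dim, pop_n, lam, keep = 4, 10, 40, 10
pop = rng.uniform(-1, 1, (pop_n, dim))
archive_X = pop.copy()
archive_y = np.array([expensive(x) for x in pop])

for gen in range(20):
    parents = pop[rng.integers(pop_n, size=lam)]
    offspring = parents + rng.normal(0, 0.1, parents.shape)       # mutation
    scores = np.array([surrogate(x, archive_X, archive_y) for x in offspring])
    best = offspring[np.argsort(scores)[:keep]]                   # surrogate screen
    exact = np.array([expensive(x) for x in best])                # few exact calls
    archive_X = np.vstack([archive_X, best])
    archive_y = np.concatenate([archive_y, exact])
    pop = archive_X[np.argsort(archive_y)[:pop_n]]                # survivors

print("best found:", archive_X[np.argmin(archive_y)].round(2))
```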

  8. SU-F-BRD-04: Robustness Analysis of Proton Breast Treatments Using An Alpha-Stable Distribution Parameterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van den Heuvel, F; Hackett, S; Fiorini, F

    Purpose: Currently, planning systems allow robustness calculations to be performed, but a generalized assessment methodology is not yet available. We introduce and evaluate a methodology to quantify the robustness of a plan on an individual patient basis. Methods: We introduce the notion of characterizing a treatment instance (i.e. one single fraction delivery) by describing the dose distribution within an organ as an alpha-stable distribution. The parameters of the distribution (shape (α), scale (γ), position (δ), and symmetry (β)) vary continuously (in a mathematical sense) as the distributions change with the different positions. The rate of change of the parameters provides a measure of the robustness of the treatment. The methodology is tested in a planning study of 25 patients with known residual errors at each fraction. Each patient was planned using Eclipse with an IBA proton beam model. The residual error space for every patient was sampled 30 times, yielding 31 treatment plans for each patient and dose distributions in 5 organs. The parameters' rate of change as a function of Euclidean distance from the original plan was analyzed. Results: More than 1,000 dose distributions were analyzed. For 4 of the 25 patients the rate of change of the scale parameter (γ) was considerably higher than the lowest rate, indicating a lack of robustness. The sign of the rate of change of the shape parameter (α) also seemed indicative, but the experiment lacked the power to prove significance. Conclusion: There are indications that this robustness measure is a valuable tool for a more patient-individualized approach to the determination of margins. In a further study we will also evaluate this robustness measure using photon treatments, and evaluate the impact of using breath-hold techniques and of a Monte Carlo-based dose deposition calculation. A principal component analysis is also planned.
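
    SciPy's levy_stable distribution exposes the same four-parameter family (shape, symmetry, position, scale); a minimal sketch of fitting it to synthetic dose samples follows. Note that levy_stable.fit performs generic maximum likelihood and can be slow; this is an illustration, not the study's fitting procedure:

```python
import numpy as np
from scipy.stats import levy_stable

# Synthetic stand-in for per-fraction organ dose samples; in the study's
# setting one would track how the fitted parameters drift as residual
# setup errors are resampled.
rng = np.random.default_rng(4)
doses = levy_stable.rvs(alpha=1.7, beta=0.2, loc=50.0, scale=4.0,
                        size=200, random_state=rng)

# Generic MLE fit; can take a while on larger samples.
alpha, beta, loc, scale = levy_stable.fit(doses)
print(f"shape alpha={alpha:.2f}, symmetry beta={beta:.2f}, "
      f"position delta={loc:.1f}, scale gamma={scale:.2f}")
```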

  9. The robustness of ecosystems to the species loss of community

    NASA Astrophysics Data System (ADS)

    Cai, Qing; Liu, Jiming

    2016-10-01

    Studying the robustness of ecosystems is crucial for promoting the sustainable development of human society. This paper analyzes the robustness of ecosystems from the viewpoint of community species loss. Unlike the existing definitions, we first introduce the notion of a community as a population of species belonging to the same trophic level. We then put forward a novel multiobjective optimization model that can be utilized to discover community structures in arbitrary unipartite networks. Because an ecosystem is commonly represented as a multipartite network, we further introduce a mechanism of competition among species whereby a multipartite network is transformed into a unipartite signed network without loss of species-interaction information. Finally, we examine three strategies to test the robustness of an ecosystem. Our experiments indicate that ecosystems are robust to random community species loss but fragile to targeted losses. We also investigate the relationship between the robustness of an ecosystem and that of its community-composed network, both with respect to species loss. Our experiments indicate that the robustness analysis of a large-scale ecosystem under species loss may be akin to that of its community-composed network, which is usually small in size.

  10. Robust optimization modelling with applications to industry and environmental problems

    NASA Astrophysics Data System (ADS)

    Chaerani, Diah; Dewanto, Stanley P.; Lesmana, Eman

    2017-10-01

    Robust Optimization (RO) modelling is one of the existing methodologies for handling data uncertainty in optimization problems. The main challenge in the RO methodology is how and when we can reformulate the robust counterpart of an uncertain problem as a computationally tractable optimization problem, or at least approximate the robust counterpart by a tractable problem. By definition, the robust counterpart depends strongly on how the uncertainty set is chosen; as a consequence, the challenge can be met only if this set is chosen in a suitable way. The development of RO has been rapid; in 2004, a new approach to RO called Adjustable Robust Optimization (ARO) was introduced to handle uncertain problems in which some decision variables must be decided as 'wait and see' variables, in contrast to classic RO, which models decision variables as 'here and now'. In ARO, the uncertain problem can be considered a multistage decision problem whose decision variables are of the wait-and-see type. In this paper we present applications of both RO and ARO, and we present all the key results briefly to underline the importance of RO and ARO in many real-life problems.
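
    As a minimal 'here and now' RO sketch, assuming the cvxpy package is available: under box uncertainty a_nominal ± delta, the robust counterpart of a linear constraint tightens to a tractable convex constraint:

```python
import cvxpy as cp
import numpy as np

# Robust counterpart under box uncertainty: requiring a^T x <= b for every
# a in [a_nom - delta, a_nom + delta] is equivalent to the single convex
# constraint a_nom^T x + delta^T |x| <= b. All numbers are hypothetical.
a_nom = np.array([2.0, 3.0])
delta = np.array([0.4, 0.6])          # uncertainty half-widths
b, c = 10.0, np.array([-1.0, -1.0])   # objective: maximize x1 + x2

x = cp.Variable(2, nonneg=True)
prob = cp.Problem(cp.Minimize(c @ x),
                  [a_nom @ x + delta @ cp.abs(x) <= b])
prob.solve()
print("robust solution:", np.round(x.value, 3))
```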

  11. Robust Operation of Soft Open Points in Active Distribution Networks with High Penetration of Photovoltaic Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Ji, Haoran; Wang, Chengshan

    Distributed generators (DGs), including photovoltaic panels (PVs), have been integrated dramatically into active distribution networks (ADNs). Because of its strong volatility and uncertainty, the high penetration of PV generation immensely exacerbates voltage violations in ADNs. However, the emerging flexible interconnection technology based on soft open points (SOPs) provides increased controllability and flexibility for system operation. To fully exploit the regulation ability of SOPs to address the problems caused by PV, this paper proposes a robust optimization method to achieve robust optimal operation of SOPs in ADNs. A two-stage adjustable robust optimization model is built to tackle the uncertainties of PV outputs, in which robust operation strategies of SOPs are generated to eliminate the voltage violations and reduce the power losses of ADNs. A column-and-constraint generation (C&CG) algorithm is developed to solve the proposed robust optimization model, which is formulated as a second-order cone program (SOCP) to facilitate accuracy and computational efficiency. Case studies on the modified IEEE 33-node system and comparisons with a deterministic optimization approach are conducted to verify the effectiveness and robustness of the proposed method.

  12. Robust climate policies under uncertainty: a comparison of robust decision making and info-gap methods.

    PubMed

    Hall, Jim W; Lempert, Robert J; Keller, Klaus; Hackbarth, Andrew; Mijere, Christophe; McInerney, David J

    2012-10-01

    This study compares two widely used approaches for robustness analysis of decision problems: the info-gap method originally developed by Ben-Haim and the robust decision making (RDM) approach originally developed by Lempert, Popper, and Bankes. The study uses each approach to evaluate alternative paths for climate-altering greenhouse gas emissions given the potential for nonlinear threshold responses in the climate system, significant uncertainty about such a threshold response and a variety of other key parameters, as well as the ability to learn about any threshold responses over time. Info-gap and RDM share many similarities. Both represent uncertainty as sets of multiple plausible futures, and both seek to identify robust strategies whose performance is insensitive to uncertainties. Yet they also exhibit important differences, as they arrange their analyses in different orders, treat losses and gains in different ways, and take different approaches to imprecise probabilistic information. The study finds that the two approaches reach similar but not identical policy recommendations and that their differing attributes raise important questions about their appropriate roles in decision support applications. The comparison not only improves understanding of these specific methods, it also suggests some broader insights into robustness approaches and a framework for comparing them. © 2012 RAND Corporation.

  13. Purple L1 Milestone Review Panel TotalView Debugger Functionality and Performance for ASC Purple

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, M

    2006-12-12

    ASC code teams require a robust software debugging tool to help developers quickly find bugs in their codes and get their codes running. Development debugging commonly runs up to 512 processes. Production jobs run up to full ASC Purple scale, and at times require introspection while running. Developers want a debugger that runs on all their development and production platforms and that works with all compilers and runtimes used with ASC codes. The TotalView Multiprocess Debugger made by Etnus was specified for ASC Purple to address this needed capability. The ASC Purple environment builds on the environment seen by TotalView on ASCI White. The debugger must now operate with the Power5 CPU, Federation switch, AIX 5.3 operating system including large pages, IBM compilers 7 and 9, POE 4.2 parallel environment, and rs6000 SLURM resource manager. Users require robust, basic debugger functionality with acceptable performance at development debugging scale. A TotalView installation must be provided at the beginning of the early user access period that meets these requirements. A functional enhancement, fast conditional data watchpoints, and a scalability enhancement, capability up to 8192 processes, are to be demonstrated.

  14. Developing the science product algorithm testbed for Chinese next-generation geostationary meteorological satellites: Fengyun-4 series

    NASA Astrophysics Data System (ADS)

    Min, Min; Wu, Chunqiang; Li, Chuan; Liu, Hui; Xu, Na; Wu, Xiao; Chen, Lin; Wang, Fu; Sun, Fenglin; Qin, Danyu; Wang, Xi; Li, Bo; Zheng, Zhaojun; Cao, Guangzhen; Dong, Lixin

    2017-08-01

    Fengyun-4A (FY-4A), the first of the Chinese next-generation geostationary meteorological satellites, launched in 2016, offers several advances over the FY-2: more spectral bands, faster imaging, and infrared hyperspectral measurements. To support the major objective of developing the prototypes of FY-4 science algorithms, two science product algorithm testbeds, for imagers and sounders, have been developed by the scientists in the FY-4 Algorithm Working Group (AWG). Both testbeds, written in the FORTRAN and C programming languages for Linux or UNIX systems, have been tested successfully using Intel/g compilers. Some important FY-4 science products, including cloud mask, cloud properties, and temperature profiles, have been retrieved successfully using a proxy imager, the Himawari-8/Advanced Himawari Imager (AHI), and sounder data obtained from the Atmospheric InfraRed Sounder, thus demonstrating their robustness. In addition, in early 2016 the FY-4 AWG developed, based on the imager testbed, a near-real-time processing system for Himawari-8/AHI data for use by Chinese weather forecasters. Consequently, robust and flexible science product algorithm testbeds have provided essential and productive tools for popularizing FY-4 data and developing substantial improvements in FY-4 products.

  15. Vehicle Modeling for use in the CAFE model: Process description and modeling assumptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moawad, Ayman; Kim, Namdoo; Rousseau, Aymeric

    2016-06-01

    The objective of this project is to develop and demonstrate a process that, at a minimum, provides more robust information that can be used to calibrate inputs applicable under the CAFE model’s existing structure. The project will be more fully successful if a process can be developed that minimizes the need for decision trees and replaces the synergy factors by inputs provided directly from a vehicle simulation tool. The report provides a description of the process that was developed by Argonne National Laboratory and implemented in Autonomie.

  16. De-identification of health records using Anonym: effectiveness and robustness across datasets.

    PubMed

    Zuccon, Guido; Kotzur, Daniel; Nguyen, Anthony; Bergheim, Anton

    2014-07-01

    Evaluate the effectiveness and robustness of Anonym, a tool for de-identifying free-text health records based on conditional random fields classifiers informed by linguistic and lexical features, as well as features extracted by pattern matching techniques. De-identification of personal health information in electronic health records is essential for the sharing and secondary usage of clinical data. De-identification tools that adapt to different sources of clinical data are attractive as they would require minimal intervention to guarantee high effectiveness. The effectiveness and robustness of Anonym are evaluated across multiple datasets, including the widely adopted Integrating Biology and the Bedside (i2b2) dataset, used for evaluation in a de-identification challenge. The datasets used here vary in type of health records, source of data, and their quality, with one of the datasets containing optical character recognition errors. Anonym identifies and removes up to 96.6% of personal health identifiers (recall) with a precision of up to 98.2% on the i2b2 dataset, outperforming the best system proposed in the i2b2 challenge. The effectiveness of Anonym across datasets is found to depend on the amount of information available for training. Findings show that Anonym compares to the best approach from the 2006 i2b2 shared task. It is easy to retrain Anonym with new datasets; if retrained, the system is robust to variations of training size, data type and quality in presence of sufficient training data. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  17. A μ analysis-based, controller-synthesis framework for robust bioinspired visual navigation in less-structured environments.

    PubMed

    Keshavan, J; Gremillion, G; Escobar-Alvarez, H; Humbert, J S

    2014-06-01

    Safe, autonomous navigation by aerial microsystems in less-structured environments is a difficult challenge to overcome with current technology. This paper presents a novel visual-navigation approach that combines bioinspired wide-field processing of optic flow information with control-theoretic tools for synthesis of closed loop systems, resulting in robustness and performance guarantees. Structured singular value analysis is used to synthesize a dynamic controller that provides good tracking performance in uncertain environments without resorting to explicit pose estimation or extraction of a detailed environmental depth map. Experimental results with a quadrotor demonstrate the vehicle's robust obstacle-avoidance behaviour in a straight line corridor, an S-shaped corridor and a corridor with obstacles distributed in the vehicle's path. The computational efficiency and simplicity of the current approach offers a promising alternative to satisfying the payload, power and bandwidth constraints imposed by aerial microsystems.

  18. Robust Inference of Risks of Large Portfolios

    PubMed Central

    Fan, Jianqing; Han, Fang; Liu, Han; Vickers, Byron

    2016-01-01

    We propose a bootstrap-based robust high-confidence level upper bound (Robust H-CLUB) for assessing the risks of large portfolios. The proposed approach exploits rank-based and quantile-based estimators, and can be viewed as a robust extension of the H-CLUB procedure (Fan et al., 2015). Such an extension allows us to handle possibly misspecified models and heavy-tailed data, which are stylized features in financial returns. Under mixing conditions, we analyze the proposed approach and demonstrate its advantage over H-CLUB. We further provide thorough numerical results to back up the developed theory, and also apply the proposed method to analyze a stock market dataset. PMID:27818569

  19. Sub-metric Resolution FWI of Ultra-High-Frequency Marine Reflection Seismograms. A Remote Sensing Tool for the Characterisation of Shallow Marine Geohazard

    NASA Astrophysics Data System (ADS)

    Provenzano, G.; Vardy, M. E.; Henstock, T.; Zervos, A.

    2017-12-01

    A quantitative high-resolution physical model of the top 100 meters of the sub-seabed is of key importance for a wide range of shallow geohazard scenarios: identification of potential shallow landsliding, monitoring of gas storage sites, and assessment of offshore structure stability. Currently, engineering-scale sediment characterisation relies heavily on direct sampling of the seabed and in-situ measurements. Such an approach is expensive and time-consuming, as well as liable to alter the sediment properties during the coring process. As opposed to reservoir-scale seismic exploration, ultra-high-frequency (UHF, 0.2-4.0 kHz) multi-channel marine reflection seismic data are most often limited to a semi-quantitative interpretation of the reflection amplitudes and facies geometries, leaving largely unexploited their intrinsic value as a remote characterisation tool. In this work, we develop a seismic inversion methodology to obtain a robust sub-metric resolution elastic model from limited-offset, limited-bandwidth UHF seismic reflection data, with minimal pre-processing and limited a priori information. The Full Waveform Inversion is implemented as a stochastic optimiser based upon a Genetic Algorithm, modified in order to improve the robustness against inaccurate starting model populations. Multiple independent runs are used to create a robust posterior model distribution and quantify the uncertainties on the solution. The methodology has been applied to complex synthetic examples and to real datasets acquired in areas prone to shallow landsliding. The inverted elastic models show a satisfactory match with the ground truths and a good sensitivity to relevant variations in the sediment texture and saturation state. We apply the methodology to a range of synthetic consolidating slopes under different loading conditions and sediment property distributions. Our work demonstrates that the seismic inversion of UHF data has the potential to become an important practical tool for marine ground model building in spatially heterogeneous areas, reducing the reliance on expensive and time-consuming coring campaigns.

  20. Baby-MONITOR: A Composite Indicator of NICU Quality

    PubMed Central

    Kowalkowski, Marc A.; Zupancic, John A. F.; Pietz, Kenneth; Richardson, Peter; Draper, David; Hysong, Sylvia J.; Thomas, Eric J.; Petersen, Laura A.; Gould, Jeffrey B.

    2014-01-01

    BACKGROUND AND OBJECTIVES: NICUs vary in the quality of care delivered to very low birth weight (VLBW) infants. NICU performance on 1 measure of quality only modestly predicts performance on others. Composite measurement of quality of care delivery may provide a more comprehensive assessment of quality. The objective of our study was to develop a robust composite indicator of quality of NICU care provided to VLBW infants that accurately discriminates performance among NICUs. METHODS: We developed a composite indicator, Baby-MONITOR, based on 9 measures of quality chosen by a panel of experts. Measures were standardized, equally weighted, and averaged. We used the California Perinatal Quality Care Collaborative database to perform a cross-sectional analysis of care given to VLBW infants between 2004 and 2010. Performance on the Baby-MONITOR is not an absolute marker of quality but indicates overall performance relative to that of the other NICUs. We used sensitivity analyses to assess the robustness of the composite indicator, by varying assumptions and methods. RESULTS: Our sample included 9023 VLBW infants in 22 California regional NICUs. We found significant variations within and between NICUs on measured components of the Baby-MONITOR. Risk-adjusted composite scores discriminated performance among this sample of NICUs. Sensitivity analysis that included different approaches to normalization, weighting, and aggregation of individual measures showed the Baby-MONITOR to be robust (r = 0.89–0.99). CONCLUSIONS: The Baby-MONITOR may be a useful tool to comprehensively assess the quality of care delivered by NICUs. PMID:24918221
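
    The standardize / equally-weight / average recipe reduces to a few lines; the sketch below uses hypothetical scores and omits the study's risk adjustment and the alignment of each measure's 'good' direction:

```python
import numpy as np

# `scores` holds 9 quality measures (columns) for 22 NICUs (rows);
# all numbers are hypothetical.
rng = np.random.default_rng(5)
scores = rng.normal(size=(22, 9))

z = (scores - scores.mean(axis=0)) / scores.std(axis=0, ddof=1)  # standardize
composite = z.mean(axis=1)                                       # equal weights

ranking = np.argsort(composite)[::-1]   # relative, not absolute, quality
print("top-3 NICUs by composite score:", ranking[:3])
```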

  1. GWAR: robust analysis and meta-analysis of genome-wide association studies.

    PubMed

    Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G

    2017-05-15

    In the context of genome-wide association studies (GWAS), a variety of statistical techniques is available for conducting the analysis but, in most cases, the underlying genetic model is unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize the power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and has never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under a recessive, additive and dominant model of inheritance, as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic and the MIN2, were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration, resulting in a great gain in computational time without losing accuracy. All the aforementioned approaches were employed in a fixed- or random-effects meta-analysis setting using summary data with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web-server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
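
    A minimal implementation of the CATT itself (the asymptotic score-test form; counts are hypothetical) shows how the model-specific scores enter:

```python
import numpy as np
from scipy.stats import norm

def catt(cases, controls, scores=(0, 1, 2)):
    """Cochran-Armitage trend test on a 2x3 genotype table.
    scores=(0,1,2) gives the additive model; (0,1,1) dominant and
    (0,0,1) recessive. Returns (Z, two-sided p-value)."""
    x = np.asarray(scores, float)
    r = np.asarray(cases, float)            # case counts per genotype
    n = r + np.asarray(controls, float)     # genotype column totals
    N, R = n.sum(), r.sum()
    p = R / N
    u = (x * r).sum() - p * (x * n).sum()   # score statistic
    var = p * (1 - p) * ((x**2 * n).sum() - (x * n).sum() ** 2 / N)
    z = u / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))

# Hypothetical counts for genotypes (aa, aA, AA):
z, pval = catt(cases=[100, 150, 50], controls=[120, 130, 30])
print(f"CATT (additive): Z={z:.2f}, p={pval:.4f}")
```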

  2. Quantitative Biofractal Feedback Part II ’Devices, Scalability & Robust Control’

    DTIC Science & Technology

    2008-05-01

    in the modelling of proton exchange membrane fuel cells (PEMFC) may work as a powerful tool in the development and widespread testing of alternative...energy sources in the next decade [9], where biofractal controllers will be used to control these complex systems. The dynamic model of the PEMFC is...dynamic response of the PEMFC. In the Iftukhar model, the fuel cell is represented by an equivalent circuit, whose components are identified with

  3. Topological robustness analysis of protein interaction networks reveals key targets for overcoming chemotherapy resistance in glioma

    NASA Astrophysics Data System (ADS)

    Azevedo, Hátylas; Moreira-Filho, Carlos Alberto

    2015-11-01

    Biological networks display high robustness against random failures but are vulnerable to targeted attacks on central nodes. Thus, network topology analysis represents a powerful tool for investigating network susceptibility against targeted node removal. Here, we built protein interaction networks associated with chemoresistance to temozolomide, an alkylating agent used in glioma therapy, and analyzed their modular structure and robustness against intentional attack. These networks showed functional modules related to DNA repair, immunity, apoptosis, cell stress, proliferation and migration. Subsequently, network vulnerability was assessed by means of centrality-based attacks based on the removal of node fractions in descending orders of degree, betweenness, or the product of degree and betweenness. This analysis revealed that removing nodes with high degree and high betweenness was more effective in altering networks’ robustness parameters, suggesting that their corresponding proteins may be particularly relevant to target temozolomide resistance. In silico data was used for validation and confirmed that central nodes are more relevant for altering proliferation rates in temozolomide-resistant glioma cell lines and for predicting survival in glioma patients. Altogether, these results demonstrate how the analysis of network vulnerability to topological attack facilitates target prioritization for overcoming cancer chemoresistance.
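
    A toy illustration of a centrality-based targeted attack of the kind described above, using networkx on a synthetic scale-free graph rather than the curated protein-interaction networks of the study.

      import networkx as nx

      # Illustrative scale-free network standing in for a protein network.
      G = nx.barabasi_albert_graph(500, 3, seed=1)
      n0 = G.number_of_nodes()

      def giant_fraction(H):
          # Size of the largest connected component, relative to the
          # original network size.
          return max(len(c) for c in nx.connected_components(H)) / n0

      # Rank nodes by the product of degree and betweenness; remove the top 5%.
      deg = dict(G.degree())
      btw = nx.betweenness_centrality(G)
      rank = sorted(G.nodes, key=lambda v: deg[v] * btw[v], reverse=True)
      H = G.copy()
      H.remove_nodes_from(rank[: n0 // 20])
      print(f"giant component: {giant_fraction(G):.2f} -> {giant_fraction(H):.2f}")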

  4. Improving tablet coating robustness by selecting critical process parameters from retrospective data.

    PubMed

    Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M

    2016-09-01

    Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness. Up-scaling can be challenging, as minor changes in parameters can lead to varying quality results. The objectives were to select critical process parameters (CPP) using retrospective data of a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. A retrospective analysis was performed on data from 36 commercial batches. Batches were selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine critical process parameters and to propose new working ranges. This study confirms that it is possible to determine the critical process parameters and create design spaces based on retrospective data of commercial batches. This type of analysis thus becomes a tool for optimizing the robustness of existing processes. Our results show that a design space can be established with minimum investment in experiments, since current commercial batch data are processed statistically.

  5. Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)

    NASA Astrophysics Data System (ADS)

    Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia

    2018-06-01

    Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source nor sense of unified style or methods. The complexity of spectroscopic analysis and the initial time commitment required to become competitive are prohibitive to new researchers entering the field, and remain an obstacle for established groups hoping to contribute in a manner comparable to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundation of these software tools and libraries exists within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.

  6. Ground System Architectures Workshop GMSEC SERVICES SUITE (GSS): an Agile Development Story

    NASA Technical Reports Server (NTRS)

    Ly, Vuong

    2017-01-01

    The GMSEC (Goddard Mission Services Evolution Center) Services Suite (GSS) is a collection of tools and software services along with a robust customizable web-based portal that enables the user to capture, monitor, report, and analyze system-wide GMSEC data. Given our plug-and-play architecture and the needs for rapid system development, we opted to follow the Scrum Agile Methodology for software development. Being one of the first few projects to implement the Agile methodology at NASA GSFC, in this presentation we will present our approaches, tools, successes, and challenges in implementing this methodology. The GMSEC architecture provides a scalable, extensible ground and flight system for existing and future missions. GMSEC comes with a robust Application Programming Interface (GMSEC API) and a core set of Java-based GMSEC components that facilitate the development of a GMSEC-based ground system. Over the past few years, we have seen an uptick in the number of customers who are moving from a native desktop application environment to a web-based environment, particularly for data monitoring and analysis. We also see a need to provide separation of the business logic from the GUI display for our Java-based components and also to consolidate all the GUI displays into one interface. This combination of separation and consolidation brings immediate value to a GMSEC-based ground system through increased ease of data access via a uniform interface, built-in security measures, centralized configuration management, and ease of feature extensibility.

  7. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics.

    PubMed

    Möller, Birgit; Poeschl, Yvonne; Plötner, Romina; Bürstenbinder, Katharina

    2017-11-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. © 2017 American Society of Plant Biologists. All Rights Reserved.
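
    PaCeQuant itself is an ImageJ-based tool with 27 descriptors; as a rough illustration of two of the global shape features it computes, the sketch below derives circularity and solidity from a toy binary mask with scikit-image.

      import numpy as np
      from skimage.draw import ellipse
      from skimage.measure import label, regionprops

      # Toy binary mask standing in for a segmented pavement cell.
      mask = np.zeros((200, 200), dtype=np.uint8)
      rr, cc = ellipse(100, 100, 40, 70)
      mask[rr, cc] = 1

      for region in regionprops(label(mask)):
          area, perim = region.area, region.perimeter
          circularity = 4 * np.pi * area / perim ** 2   # 1.0 for a perfect disk
          print(f"area={area}, circularity={circularity:.2f}, "
                f"solidity={region.solidity:.2f}")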

  8. Strong Inference in Mathematical Modeling: A Method for Robust Science in the Twenty-First Century.

    PubMed

    Ganusov, Vitaly V

    2016-01-01

    While there are many opinions on what mathematical modeling in biology is, in essence, modeling is a mathematical tool, like a microscope, which allows consequences to logically follow from a set of assumptions. Only when this tool is applied appropriately, as a microscope is used to look at small items, may it allow one to understand the importance of specific mechanisms/assumptions in biological processes. Mathematical modeling can be less useful or even misleading if used inappropriately, for example, when a microscope is used to study stars. According to some philosophers (Oreskes et al., 1994), the best use of mathematical models is not when a model is used to confirm a hypothesis but rather when a model shows inconsistency between the model (defined by a specific set of assumptions) and data. Following the principle of strong inference for experimental sciences proposed by Platt (1964), I suggest "strong inference in mathematical modeling" as an effective and robust way of using mathematical modeling to understand mechanisms driving dynamics of biological systems. The major steps of strong inference in mathematical modeling are (1) to develop multiple alternative models for the phenomenon in question; (2) to compare the models with available experimental data and to determine which of the models are not consistent with the data; (3) to determine reasons why rejected models failed to explain the data, and (4) to suggest experiments which would allow one to discriminate between remaining alternative models. The use of strong inference is likely to provide better robustness of predictions of mathematical models, and it should be strongly encouraged in mathematical modeling-based publications in the Twenty-First century.
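
    A minimal sketch of step (2), confronting multiple alternative models with the same data and ranking them by AIC; the two candidate growth models and the simulated data are invented stand-ins.

      import numpy as np
      from scipy.optimize import curve_fit

      # Illustrative data (e.g., pathogen load over time) from a logistic truth.
      t = np.linspace(0, 10, 25)
      rng = np.random.default_rng(3)
      y = 100 / (1 + 99 * np.exp(-1.2 * t)) + rng.normal(0, 3, t.size)

      models = {
          "exponential": (lambda t, a, r: a * np.exp(r * t), [1.0, 0.5]),
          "logistic": (lambda t, K, r: K / (1 + (K - 1) * np.exp(-r * t)), [50.0, 1.0]),
      }
      for name, (f, p0) in models.items():
          p, _ = curve_fit(f, t, y, p0=p0, maxfev=20000)
          rss = np.sum((y - f(t, *p)) ** 2)
          aic = t.size * np.log(rss / t.size) + 2 * len(p)  # lower is better
          print(f"{name}: AIC = {aic:.1f}")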

  9. Strong Inference in Mathematical Modeling: A Method for Robust Science in the Twenty-First Century

    PubMed Central

    Ganusov, Vitaly V.

    2016-01-01

    While there are many opinions on what mathematical modeling in biology is, in essence, modeling is a mathematical tool, like a microscope, which allows consequences to logically follow from a set of assumptions. Only when this tool is applied appropriately, as a microscope is used to look at small items, may it allow one to understand the importance of specific mechanisms/assumptions in biological processes. Mathematical modeling can be less useful or even misleading if used inappropriately, for example, when a microscope is used to study stars. According to some philosophers (Oreskes et al., 1994), the best use of mathematical models is not when a model is used to confirm a hypothesis but rather when a model shows inconsistency between the model (defined by a specific set of assumptions) and data. Following the principle of strong inference for experimental sciences proposed by Platt (1964), I suggest “strong inference in mathematical modeling” as an effective and robust way of using mathematical modeling to understand mechanisms driving dynamics of biological systems. The major steps of strong inference in mathematical modeling are (1) to develop multiple alternative models for the phenomenon in question; (2) to compare the models with available experimental data and to determine which of the models are not consistent with the data; (3) to determine reasons why rejected models failed to explain the data, and (4) to suggest experiments which would allow one to discriminate between remaining alternative models. The use of strong inference is likely to provide better robustness of predictions of mathematical models, and it should be strongly encouraged in mathematical modeling-based publications in the Twenty-First century. PMID:27499750

  10. A Robust Design Methodology for Optimal Microscale Secondary Flow Control in Compact Inlet Diffusers

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Keller, Dennis J.

    2001-01-01

    It is the purpose of this study to develop an economical Robust design methodology for microscale secondary flow control in compact inlet diffusers. To illustrate the potential of economical Robust Design methodology, two different mission strategies were considered for the subject inlet, namely Maximum Performance and Maximum HCF Life Expectancy. The Maximum Performance mission maximized total pressure recovery while the Maximum HCF Life Expectancy mission minimized the mean of the first five Fourier harmonic amplitudes, i.e., 'collectively' reduced all the harmonic 1/2 amplitudes of engine face distortion. Each of the mission strategies was subject to a low engine face distortion constraint, i.e., DC60<0.10, which is a level acceptable for commercial engines. For each of these mission strategies, an 'Optimal Robust' (open loop control) and an 'Optimal Adaptive' (closed loop control) installation was designed over a twenty degree angle-of-incidence range. The Optimal Robust installation used economical Robust Design methodology to arrive at a single design which operated over the entire angle-of-incidence range (open loop control). The Optimal Adaptive installation optimized all the design parameters at each angle-of-incidence. Thus, the Optimal Adaptive installation would require a closed loop control system to sense a proper signal for each effector and modify that effector device, whether mechanical or fluidic, for optimal inlet performance. In general, the performance differences between the Optimal Adaptive and Optimal Robust installation designs were found to be marginal. This suggests that Optimal Robust open loop installation designs can be very competitive with Optimal Adaptive closed loop designs. Secondary flow control in inlets is inherently robust, provided it is optimally designed. Therefore, the new methodology presented in this paper, a combined array 'Lower Order' approach to Robust DOE, offers the aerodynamicist a very viable and economical way of exploring the concept of Robust inlet design, where the mission variables are brought directly into the inlet design process and insensitivity or robustness to the mission variables becomes a design objective.

  11. Practical Results from the Application of Model Checking and Test Generation from UML/SysML Models of On-Board Space Applications

    NASA Astrophysics Data System (ADS)

    Faria, J. M.; Mahomad, S.; Silva, N.

    2009-05-01

    The deployment of complex safety-critical applications requires rigorous techniques and powerful tools for both the development and V&V stages. Model-based technologies are increasingly being used to develop safety-critical software, and arguably, turning to them can bring significant benefits to such processes, along with new challenges. This paper presents the results of a research project where we tried to extend current V&V methodologies to be applied on UML/SysML models, aiming to answer the demands related to validation issues. Two quite different but complementary approaches were investigated: (i) model checking and (ii) the extraction of robustness test cases from the same models. These two approaches don't overlap and, when combined, provide a wider-reaching model/design validation ability than either one alone, thus offering improved safety assurance. Results are very encouraging, even though they either fell short of the desired outcome, as for model checking, or are not yet fully mature, as for robustness test case extraction. In the case of model checking, it was verified that the automatic model validation process can become fully operational and even expanded in scope once tool vendors help (inevitably) to improve the XMI standard interoperability situation. For the robustness test case extraction methodology, the early approach produced interesting results but needs further systematisation and consolidation effort in order to produce results in a more predictable fashion and reduce reliance on experts' heuristics. Finally, further improvements and innovation research projects were immediately apparent for both investigated approaches, which point to circumventing current limitations in XMI interoperability on the one hand, and to bringing test case specification onto the same graphical level as the models themselves and then attempting to automate the generation of executable test cases from its standard UML notation on the other.

  12. On the Development of a Hospital-Patient Web-Based Communication Tool: A Case Study From Norway.

    PubMed

    Granja, Conceição; Dyb, Kari; Bolle, Stein Roald; Hartvigsen, Gunnar

    2015-01-01

    Surgery cancellations are undesirable in hospital settings as they increase costs, reduce productivity and efficiency, and directly affect the patient. The problem of elective surgery cancellations in a North Norwegian university hospital is addressed. Based on a three-step methodology conducted at the hospital, the preoperative planning process was modeled taking into consideration the narratives from different health professions. From the analysis of the generated process models, it is concluded that in order to develop a useful patient-centered web-based communication tool, it is necessary to fully understand how hospitals plan and organize surgeries today. Moreover, process reengineering is required to generate a standard process that can serve as a tool for health ICT designers to define the requirements for a robust and useful system.

  13. Improvement of medical content in the curriculum of biomedical engineering based on assessment of students outcomes.

    PubMed

    Abdulhay, Enas; Khnouf, Ruba; Haddad, Shireen; Al-Bashir, Areen

    2017-08-04

    Improvement of medical content in Biomedical Engineering curricula based on a qualitative assessment process or on a comparison with another high-standard program has been approached by a number of studies. However, quantitative assessment tools have not been emphasized. Quantitative assessment tools can be more accurate and robust in cases of challenging multidisciplinary fields like that of Biomedical Engineering, which includes biomedicine elements mixed with technology aspects. The major limitations of the previous research are the high dependence on surveys or pure qualitative approaches, as well as the absence of a strong focus on medical outcomes without implicit confusion with the technical ones. The proposed work presents the development and evaluation of an accurate and robust quantitative approach to the improvement of the medical content in the challenging multidisciplinary BME curriculum. The work presents quantitative assessment tools and subsequent improvement of curriculum medical content applied, as an illustrative example, to the ABET (Accreditation Board for Engineering and Technology, USA) accredited biomedical engineering BME department at Jordan University of Science and Technology. The quantitative results of assessment of curriculum/course, capstone, exit exam, course assessment by student (CAS), as well as of surveys filled in by alumni, seniors, employers and training supervisors, were first mapped to the expected students' outcomes related to the medical field (SOsM). The collected data were then analyzed and discussed to find curriculum weakness points by tracking shortcomings in the degree of achievement of every outcome. Finally, actions were taken to fill in the gaps of the curriculum. Actions were also mapped to the students' medical outcomes (SOsM). Weighted averages of the obtained quantitative values, mapped to SOsM, indicated accurately the achievement levels of all outcomes as well as the necessary improvements to be performed in the curriculum. Mapping the improvements to SOsM also helps in the assessment of the following cycle. The suggested assessment tools can be generalized and extended to any other BME department. Robust improvement of medical content in BME curricula can subsequently be achieved.

  14. A 3D interactive multi-object segmentation tool using local robust statistics driven active contours.

    PubMed

    Gao, Yi; Kikinis, Ron; Bouix, Sylvain; Shenton, Martha; Tannenbaum, Allen

    2012-08-01

    Extracting anatomically and functionally significant structures is one of the important tasks for both the theoretical study of medical image analysis and the clinical and practical community. In the past, much work has been dedicated only to algorithmic development. Nevertheless, for clinical end users, a well-designed algorithm with interactive software is necessary for an algorithm to be utilized in their daily work. Furthermore, the software should preferably be open source in order to be used and validated by not only the authors but also the entire community. Therefore, the contribution of the present work is twofold: first, we propose a new robust statistics based conformal metric and the conformal area driven multiple active contour framework, to simultaneously extract multiple targets from MR and CT medical imagery in 3D. Second, an open source graphically interactive 3D segmentation tool based on the aforementioned contour evolution is implemented and is publicly available for end users on multiple platforms. In using this software for the segmentation task, the process is initiated by user-drawn strokes (seeds) in the target region in the image. Then, the local robust statistics are used to describe the object features, and such features are learned adaptively from the seeds under a non-parametric estimation scheme. Subsequently, several active contours evolve simultaneously with their interactions being motivated by the principles of action and reaction: this not only guarantees mutual exclusiveness among the contours, but also no longer relies upon the assumption that the multiple objects fill the entire image domain, which was tacitly or explicitly assumed in many previous works. In doing so, the contours interact and converge to equilibrium at the desired positions of the desired multiple objects. Furthermore, with the aim of not only validating the algorithm and the software, but also demonstrating how the tool is to be used, we provide the reader with reproducible experiments that demonstrate the capability of the proposed segmentation tool on several publicly available data sets. Copyright © 2012 Elsevier B.V. All rights reserved.
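
    A toy numpy illustration of the seed-driven robust statistics idea described above: the median and MAD of user-seeded pixels characterize the object, and every pixel is scored by its robust distance to those statistics. This is a stand-in, not the paper's conformal active-contour machinery.

      import numpy as np

      rng = np.random.default_rng(7)
      image = rng.normal(100, 5, (64, 64))                # background
      image[20:40, 20:40] = rng.normal(160, 8, (20, 20))  # bright object

      # User-drawn seed strokes inside the object (here: a few fixed pixels).
      seeds = image[25:30, 25:30].ravel()

      # Local robust statistics of the seeded region: median and MAD.
      med = np.median(seeds)
      mad = np.median(np.abs(seeds - med))

      # Score every pixel by its robust distance to the seed statistics;
      # small values indicate object-like intensities.
      distance = np.abs(image - med) / (1.4826 * mad + 1e-9)
      object_mask = distance < 3
      print("object pixels found:", int(object_mask.sum()))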

  15. A 3D Interactive Multi-object Segmentation Tool using Local Robust Statistics Driven Active Contours

    PubMed Central

    Gao, Yi; Kikinis, Ron; Bouix, Sylvain; Shenton, Martha; Tannenbaum, Allen

    2012-01-01

    Extracting anatomically and functionally significant structures is one of the important tasks for both the theoretical study of medical image analysis and the clinical and practical community. In the past, much work has been dedicated only to algorithmic development. Nevertheless, for clinical end users, a well-designed algorithm with interactive software is necessary for an algorithm to be utilized in their daily work. Furthermore, the software should preferably be open source in order to be used and validated by not only the authors but also the entire community. Therefore, the contribution of the present work is twofold: First, we propose a new robust statistics based conformal metric and the conformal area driven multiple active contour framework, to simultaneously extract multiple targets from MR and CT medical imagery in 3D. Second, an open source graphically interactive 3D segmentation tool based on the aforementioned contour evolution is implemented and is publicly available for end users on multiple platforms. In using this software for the segmentation task, the process is initiated by user-drawn strokes (seeds) in the target region in the image. Then, the local robust statistics are used to describe the object features, and such features are learned adaptively from the seeds under a non-parametric estimation scheme. Subsequently, several active contours evolve simultaneously with their interactions being motivated by the principles of action and reaction: this not only guarantees mutual exclusiveness among the contours, but also no longer relies upon the assumption that the multiple objects fill the entire image domain, which was tacitly or explicitly assumed in many previous works. In doing so, the contours interact and converge to equilibrium at the desired positions of the desired multiple objects. Furthermore, with the aim of not only validating the algorithm and the software, but also demonstrating how the tool is to be used, we provide the reader with reproducible experiments that demonstrate the capability of the proposed segmentation tool on several publicly available data sets. PMID:22831773

  16. Reconfigurable Robust Routing for Mobile Outreach Network

    NASA Technical Reports Server (NTRS)

    Lin, Ching-Fang

    2010-01-01

    The Reconfigurable Robust Routing for Mobile Outreach Network (R3MOON) provides advanced communications networking technologies suitable for the lunar surface environment and applications. The R3MOON technology is based on a detailed concept of operations tailored for lunar surface networks, and includes intelligent routing algorithms and wireless mesh network implementation on AGNC's Coremicro Robots. The product's features include an integrated communication solution incorporating energy efficiency and disruption-tolerance in a mobile ad hoc network, and a real-time control module to provide researchers and engineers a convenient tool for reconfiguration, investigation, and management.

  17. Smart Growth Self-Assessment for Rural Communities

    EPA Pesticide Factsheets

    Tool to help small towns and rural communities assess their existing policies, plans, codes, and zoning regulations to determine how well they work to create healthy, environmentally resilient, and economically robust places.

  18. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  19. Teacher Research as a Robust and Reflective Path to Professional Development

    ERIC Educational Resources Information Center

    Roberts, Sherron Killingsworth; Crawford, Patricia A.; Hickmann, Rosemary

    2010-01-01

    This article explores the role of teacher research as part of a robust program of professional development. Teacher research offers teachers at every stage of development a recursive and reflective means of bridging the gap between current practice and potential professional growth. The purpose of this dual level inquiry was to probe the concept…

  20. Horsetail matching: a flexible approach to optimization under uncertainty

    NASA Astrophysics Data System (ADS)

    Cook, L. W.; Jarrett, J. P.

    2018-04-01

    It is important to design engineering systems to be robust with respect to uncertainties in the design process. Often, this is done by considering statistical moments, but over-reliance on statistical moments when formulating a robust optimization can produce designs that are stochastically dominated by other feasible designs. This article instead proposes a formulation for optimization under uncertainty that minimizes the difference between a design's cumulative distribution function and a target. A standard target is proposed that produces stochastically non-dominated designs, but the formulation also offers enough flexibility to recover existing approaches for robust optimization. A numerical implementation is developed that employs kernels to give a differentiable objective function. The method is applied to algebraic test problems and a robust transonic airfoil design problem where it is compared to multi-objective, weighted-sum and density matching approaches to robust optimization; several advantages over these existing methods are demonstrated.
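
    A minimal sketch of the core horsetail-matching idea under invented stand-ins for the design variable, quantity of interest, and target: a kernel-smoothed empirical CDF is compared against a target CDF, giving a smooth objective to minimize over the design variable.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      u = rng.normal(size=2000)                 # samples of the uncertain input

      def qoi(x, u):
          # Toy quantity of interest for design variable x under uncertainty u.
          return (x - 2.0) ** 2 + (0.5 + 0.3 * x) * u

      def horsetail_metric(x, grid=np.linspace(-5, 15, 200), h=0.3):
          q = qoi(x, u)
          # Gaussian-kernel-smoothed empirical CDF: differentiable in x.
          H = norm.cdf((grid[:, None] - q[None, :]) / h).mean(axis=1)
          target = norm.cdf(grid, loc=0.0, scale=1.0)   # idealized target CDF
          return np.trapz((H - target) ** 2, grid)

      xs = np.linspace(0, 4, 41)
      best = min(xs, key=horsetail_metric)
      print("best design:", best)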

  1. Characterization of stem cells and cancer cells on the basis of gene expression profile stability, plasticity, and robustness: dynamical systems theory of gene expressions under cell-cell interaction explains mutational robustness of differentiated cells and suggests how cancer cells emerge.

    PubMed

    Kaneko, Kunihiko

    2011-06-01

    Here I present and discuss a model that, among other things, appears able to describe the dynamics of cancer cell origin from the perspective of stable and unstable gene expression profiles. In identifying such aberrant gene expression profiles as lying outside the normal stable states attracted through development and normal cell differentiation, the hypothesis explains why cancer cells accumulate mutations, to which they are not robust, and why these mutations create a new stable state far from the normal gene expression profile space. Such cells are in strong contrast with normal cell types that appeared as an attractor state in the gene expression dynamical system under cell-cell interaction and achieved robustness to noise through evolution, which in turn also conferred robustness to mutation. In complex gene regulation networks, other aberrant cellular states lacking such high robustness are expected to remain, which would correspond to cancer cells. Copyright © 2011 WILEY Periodicals, Inc.

  2. Robust input design for nonlinear dynamic modeling of AUV.

    PubMed

    Nouri, Nowrouz Mohammad; Valadi, Mehrdad

    2017-09-01

    Input design has a dominant role in developing the dynamic model of autonomous underwater vehicles (AUVs) through system identification. Optimal input design is the process of generating informative inputs that can be used to generate a good quality dynamic model of AUVs. In a problem with optimal input design, the desired input signal depends on the unknown system which is intended to be identified. In this paper, an input design approach which is robust to uncertainties in model parameters is used. The Bayesian robust design strategy is applied to design input signals for dynamic modeling of AUVs. The employed approach can design multiple inputs and apply constraints on an AUV system's inputs and outputs. Particle swarm optimization (PSO) is employed to solve the constrained robust optimization problem. The presented algorithm is used for designing the input signals for an AUV, and the estimate obtained by robust input design is compared with that of the optimal input design. According to the results, the proposed input design can satisfy both robustness of constraints and optimality. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
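
    A rough sketch of the Bayesian robust design idea with a basic global-best PSO: the design criterion is averaged over parameter draws from a prior before being minimized. The two-parameter input, prior, and criterion below are invented stand-ins for the paper's AUV models.

      import numpy as np

      rng = np.random.default_rng(1)
      theta_prior = rng.normal([1.0, 0.5], [0.2, 0.1], size=(50, 2))  # prior draws

      def neg_information(x, theta):
          # Toy criterion: smaller = more informative input (amplitude, frequency).
          amp, freq = x
          a, b = theta
          return -(amp ** 2 * freq / (1 + a * freq ** 2) + b * amp)

      def robust_objective(x):
          # Bayesian robust design: expectation of the criterion over the prior.
          return np.mean([neg_information(x, th) for th in theta_prior])

      # Basic global-best PSO with box constraints on the input parameters.
      lo, hi = np.array([0.0, 0.1]), np.array([2.0, 5.0])
      pos = rng.uniform(lo, hi, (30, 2)); vel = np.zeros_like(pos)
      pbest = pos.copy()
      pval = np.array([robust_objective(p) for p in pos])
      for _ in range(100):
          g = pbest[pval.argmin()]
          vel = (0.7 * vel + 1.5 * rng.random((30, 2)) * (pbest - pos)
                           + 1.5 * rng.random((30, 2)) * (g - pos))
          pos = np.clip(pos + vel, lo, hi)
          val = np.array([robust_objective(p) for p in pos])
          better = val < pval
          pbest[better], pval[better] = pos[better], val[better]
      print("robust input design:", pbest[pval.argmin()])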

  3. Emergence of robustness in networks of networks

    NASA Astrophysics Data System (ADS)

    Roth, Kevin; Morone, Flaviano; Min, Byungjoon; Makse, Hernán A.

    2017-06-01

    A model of interdependent networks of networks (NONs) was introduced recently [Proc. Natl. Acad. Sci. (USA) 114, 3849 (2017), 10.1073/pnas.1620808114] in the context of brain activation to identify the neural collective influencers in the brain NON. Here we investigate the emergence of robustness in such a model, and we develop an approach to derive an exact expression for the random percolation transition in Erdös-Rényi NONs of this kind. Analytical calculations are in agreement with numerical simulations, and highlight the robustness of the NON against random node failures, which thus presents a new robust universality class of NONs. The key aspect of this robust NON model is that a node can be activated even if it does not belong to the giant mutually connected component, thus allowing the NON to be built from below the percolation threshold, which is not possible in previous models of interdependent networks. Interestingly, the phase diagram of the model unveils particular patterns of interconnectivity for which the NON is most vulnerable, thereby marking the boundary above which the robustness of the system improves with increasing dependency connections.
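
    The paper derives exact expressions for the NON model; as a numerical illustration of the underlying quantity, the sketch below estimates the giant-component fraction of a single Erdős-Rényi network under random node failure, where the transition is expected near 1/c for mean degree c.

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(0)
      n, c = 2000, 4.0                        # nodes and mean degree
      G = nx.gnp_random_graph(n, c / n, seed=0)

      for p in [0.1, 0.25, 0.5, 0.75, 1.0]:   # fraction of nodes kept
          keep = [v for v in G if rng.random() < p]
          H = G.subgraph(keep)
          giant = max((len(s) for s in nx.connected_components(H)), default=0)
          print(f"p={p:.2f}  giant fraction={giant / n:.3f}")
      # For this graph the percolation threshold sits near p_c = 1/c = 0.25.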

  4. Robust allocation of a defensive budget considering an attacker's private information.

    PubMed

    Nikoofal, Mohammad E; Zhuang, Jun

    2012-05-01

    Attackers' private information is one of the main issues in defensive resource allocation games in homeland security. The outcome of a defense resource allocation decision critically depends on the accuracy of estimations about the attacker's attributes. However, terrorists' goals may be unknown to the defender, necessitating robust decisions by the defender. This article develops a robust-optimization game-theoretical model for identifying optimal defense resource allocation strategies for a rational defender facing a strategic attacker while the attacker's valuation of targets, being the most critical attribute of the attacker, is unknown but belongs to bounded distribution-free intervals. To the best of our knowledge, no previous research has applied robust optimization in homeland security resource allocation when uncertainty is defined in bounded distribution-free intervals. The key features of our model include (1) modeling uncertainty in attackers' attributes, where uncertainty is characterized by bounded intervals; (2) finding the robust-optimization equilibrium for the defender using concepts dealing with the budget of uncertainty and the price of robustness; and (3) applying the proposed model to real data. © 2011 Society for Risk Analysis.
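
    A deliberately simplified max-min linear program in the spirit of interval-uncertainty robust allocation (not the paper's game-theoretic model): the defender minimizes the worst-case loss assuming each target's valuation sits at the top of its interval. All numbers are invented.

      import numpy as np
      from scipy.optimize import linprog

      # Upper ends of interval-valued attacker valuations for 4 targets.
      v_hi = np.array([10.0, 8.0, 5.0, 3.0])
      cost = np.array([1.0, 1.0, 1.0, 1.0])    # cost of fully protecting a target
      budget = 2.0

      # Variables: x_1..x_4 (protection levels in [0,1]) and z (worst-case loss).
      # Minimize z  subject to  z >= v_hi[i]*(1 - x_i)  and  cost @ x <= budget.
      n = len(v_hi)
      c = np.r_[np.zeros(n), 1.0]
      A_ub = np.vstack([np.c_[-np.diag(v_hi), -np.ones((n, 1))],
                        np.r_[cost, 0.0][None, :]])
      b_ub = np.r_[-v_hi, budget]
      res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                    bounds=[(0, 1)] * n + [(0, None)])
      print("protection levels:", np.round(res.x[:n], 2),
            "worst-case loss:", round(res.x[n], 2))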

  5. Doubly robust nonparametric inference on the average treatment effect.

    PubMed

    Benkeser, D; Carone, M; Laan, M J Van Der; Gilbert, P B

    2017-12-01

    Doubly robust estimators are widely used to draw inference about the average effect of a treatment. Such estimators are consistent for the effect of interest if either one of two nuisance parameters is consistently estimated. However, if flexible, data-adaptive estimators of these nuisance parameters are used, double robustness does not readily extend to inference. We present a general theoretical study of the behaviour of doubly robust estimators of an average treatment effect when one of the nuisance parameters is inconsistently estimated. We contrast different methods for constructing such estimators and investigate the extent to which they may be modified to also allow doubly robust inference. We find that while targeted minimum loss-based estimation can be used to solve this problem very naturally, common alternative frameworks appear to be inappropriate for this purpose. We provide a theoretical study and a numerical evaluation of the alternatives considered. Our simulations highlight the need for and usefulness of these approaches in practice, while our theoretical developments have broad implications for the construction of estimators that permit doubly robust inference in other problems.
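
    A minimal simulation of the classical doubly robust (AIPW) estimator of the average treatment effect with parametric nuisance models via scikit-learn; the paper's targeted minimum loss-based refinements are not shown.

      import numpy as np
      from sklearn.linear_model import LinearRegression, LogisticRegression

      rng = np.random.default_rng(0)
      n = 5000
      X = rng.normal(size=(n, 2))
      e = 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.25 * X[:, 1])))    # true propensity
      A = rng.binomial(1, e)
      Y = 1.0 * A + X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n)  # true ATE = 1

      # Nuisance estimates: propensity e(X) and outcome regressions m_a(X).
      ps = LogisticRegression().fit(X, A).predict_proba(X)[:, 1].clip(0.01, 0.99)
      m1 = LinearRegression().fit(X[A == 1], Y[A == 1]).predict(X)
      m0 = LinearRegression().fit(X[A == 0], Y[A == 0]).predict(X)

      # AIPW: consistent if either the propensity or the outcome model is right.
      aipw = np.mean(m1 - m0
                     + A * (Y - m1) / ps
                     - (1 - A) * (Y - m0) / (1 - ps))
      print(f"doubly robust ATE estimate: {aipw:.3f}")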

  6. Design and Experimental Evaluation of a Robust Position Controller for an Electrohydrostatic Actuator Using Adaptive Antiwindup Sliding Mode Scheme

    PubMed Central

    Lee, Ji Min; Park, Sung Hwan; Kim, Jong Shik

    2013-01-01

    A robust control scheme is proposed for the position control of the electrohydrostatic actuator (EHA) when considering hardware saturation, load disturbance, and lumped system uncertainties and nonlinearities. To reduce overshoot due to saturation of the electric motor and to realize robustness against load disturbance and lumped system uncertainties such as varying parameters and modeling error, this paper proposes an adaptive antiwindup PID sliding mode scheme as a robust position controller for the EHA system. An optimal PID controller and an optimal anti-windup PID controller are also designed to compare control performance. An EHA prototype is developed, with system modeling and parameter identification carried out in designing the position controller. The simply identified linear model serves as the basis for the design of the position controllers, while the robustness of the control systems is compared by experiments. The adaptive anti-windup PID sliding mode controller has been found to have the desired performance and to be robust against hardware saturation, load disturbance, and lumped system uncertainties and nonlinearities. PMID:23983640
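
    A toy back-calculation anti-windup PI loop on a first-order plant, illustrating the integrator-bleeding idea behind antiwindup schemes; the gains and plant are invented, and the paper's sliding-mode and adaptive terms are omitted.

      import numpy as np

      # Toy first-order plant x' = (-x + u) / tau with actuator saturation.
      dt, tau, u_max = 0.001, 0.05, 10.0
      Kp, Ki, Kb = 8.0, 40.0, 1.0          # Kb: back-calculation gain
      x, integ, setpoint = 0.0, 0.0, 1.0

      for _ in range(5000):
          e = setpoint - x
          u_raw = Kp * e + Ki * integ      # PI control before saturation
          u = np.clip(u_raw, -u_max, u_max)
          # Anti-windup: bleed the integrator by the amount of saturation.
          integ += (e + Kb * (u - u_raw)) * dt
          x += (-x + u) / tau * dt
      print(f"final position: {x:.3f} (setpoint {setpoint})")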

  7. Stochastic Noise and Synchronisation during Dictyostelium Aggregation Make cAMP Oscillations Robust

    PubMed Central

    Kim, Jongrae; Heslop-Harrison, Pat; Postlethwaite, Ian; Bates, Declan G

    2007-01-01

    Stable and robust oscillations in the concentration of adenosine 3′, 5′-cyclic monophosphate (cAMP) are observed during the aggregation phase of starvation-induced development in Dictyostelium discoideum. In this paper we use mathematical modelling together with ideas from robust control theory to identify two factors which appear to make crucial contributions to ensuring the robustness of these oscillations. Firstly, we show that stochastic fluctuations in the molecular interactions play an important role in preserving stable oscillations in the face of variations in the kinetics of the intracellular network. Secondly, we show that synchronisation of the aggregating cells through the diffusion of extracellular cAMP is a key factor in ensuring robustness of the oscillatory waves of cAMP observed in Dictyostelium cell cultures to cell-to-cell variations. A striking and quite general implication of the results is that the robustness analysis of models of oscillating biomolecular networks (circadian clocks, Ca2+ oscillations, etc.) can only be done reliably by using stochastic simulations, even in the case where molecular concentrations are very high. PMID:17997595

  8. Robust control of combustion instabilities

    NASA Astrophysics Data System (ADS)

    Hong, Boe-Shong

    Several interactive dynamical subsystems, each of which has its own time scale and physical significance, are decomposed to build a feedback-controlled combustion-fluid robust dynamics. On the fast time scale, the phenomenon of combustion instability corresponds to the internal feedback of two subsystems: acoustic dynamics and flame dynamics, which are parametrically dependent on the slow-time-scale mean-flow dynamics controlled for global performance by a mean-flow controller. This dissertation constructs such a control system, through modeling, analysis and synthesis, to deal with model uncertainties, environmental noise and time-varying mean-flow operation. The conservation laws are decomposed into fast-time acoustic dynamics and slow-time mean-flow dynamics, serving to synthesize an LPV (linear parameter varying) L2-gain robust control law, in which a robust observer is embedded for estimating and controlling the internal status while achieving trade-offs among robustness, performance and operation. The robust controller is formulated as two LPV-type Linear Matrix Inequalities (LMIs), whose numerical solver is developed by the finite-element method. Some important issues related to physical understanding and engineering application are discussed in simulated results of the control system.

  9. Temperature-Robust Neural Function from Activity-Dependent Ion Channel Regulation.

    PubMed

    O'Leary, Timothy; Marder, Eve

    2016-11-07

    Many species of cold-blooded animals experience substantial and rapid fluctuations in body temperature. Because biological processes are differentially temperature dependent, it is difficult to understand how physiological processes in such animals can be temperature robust [1-8]. Experiments have shown that core neural circuits, such as the pyloric circuit of the crab stomatogastric ganglion (STG), exhibit robust neural activity in spite of large (20°C) temperature fluctuations [3, 5, 7, 8]. This robustness is surprising because (1) each neuron has many different kinds of ion channels with different temperature dependencies (Q10s) that interact in a highly nonlinear way to produce firing patterns and (2) across animals there is substantial variability in conductance densities that nonetheless produce almost identical firing properties. The high variability in conductance densities in these neurons [9, 10] appears to contradict the possibility that robustness is achieved through precise tuning of key temperature-dependent processes. In this paper, we develop a theoretical explanation for how temperature robustness can emerge from a simple regulatory control mechanism that is compatible with highly variable conductance densities [11-13]. The resulting model suggests a general mechanism for how nervous systems and excitable tissues can exploit degenerate relationships among temperature-sensitive processes to achieve robust function. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. A robust fractional-order PID controller design based on active queue management for TCP network

    NASA Astrophysics Data System (ADS)

    Hamidian, Hamideh; Beheshti, Mohammad T. H.

    2018-01-01

    In this paper, a robust fractional-order controller is designed to control congestion in transmission control protocol (TCP) networks with time-varying parameters. Fractional controllers can increase stability and robustness. Despite these advantages, fractional controllers are still not common in congestion control for TCP networks. The network parameters are time-varying, so robust stability is important in congestion controller design. Therefore, we focused on robust controller design. The fractional PID controller is developed based on active queue management (AQM), using the D-partition technique. The most important property of the designed controller is its robustness to the time-varying parameters of the TCP network. The vertex quasi-polynomials of the closed-loop characteristic equation are obtained, and the stability boundaries are calculated for each vertex quasi-polynomial. The intersection of all stability regions is insensitive to network parameter variations, and results in robust stability of the TCP/AQM system. NS-2 simulations show that the proposed algorithm provides a stable queue length. Moreover, simulations show smaller oscillations of the queue length and less packet drop probability for the FPID compared to PI and PID controllers. We can conclude from the NS-2 simulations that the average packet loss probability variations are negligible when the network parameters change.
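
    A sketch of the D-partition idea for an integer-order PI controller (the paper designs a fractional PID, which traces boundaries the same way) on the linearized TCP/AQM plant of Hollot et al.; the operating point below is illustrative.

      import numpy as np

      # Linearized TCP/AQM plant (Hollot et al.): N flows, RTT R, capacity C.
      N, R, C = 60.0, 0.25, 3750.0   # illustrative network operating point
      def P(s):
          return (C ** 2 / (2 * N)) * np.exp(-s * R) / \
                 ((s + 2 * N / (R ** 2 * C)) * (s + 1 / R))

      # D-partition: on the stability boundary s = j*w, 1 + C(s)P(s) = 0 with
      # C(s) = Kp + Ki/s, so Kp + Ki/(j*w) = -1/P(j*w).
      for w in [0.5, 1.0, 2.0, 4.0, 8.0]:
          G = -1.0 / P(1j * w)
          Kp, Ki = G.real, -w * G.imag   # boundary point in the (Kp, Ki) plane
          print(f"w={w:4.1f}:  Kp={Kp:.2e}  Ki={Ki:.2e}")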

  11. Adaptive Critic Nonlinear Robust Control: A Survey.

    PubMed

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    Adaptive dynamic programming (ADP) and reinforcement learning are quite relevant to each other when performing intelligent optimization. They are both regarded as promising methods involving important components of evaluation and improvement, against the background of information technology, such as artificial intelligence, big data, and deep learning. Although great progress has been achieved and surveyed in addressing nonlinear optimal control problems, the research on robustness of ADP-based control strategies under uncertain environments has not been fully summarized. Hence, this survey reviews the recent main results of adaptive-critic-based robust control design of continuous-time nonlinear systems. The ADP-based nonlinear optimal regulation is reviewed, followed by robust stabilization of nonlinear systems with matched uncertainties, guaranteed cost control design of unmatched plants, and decentralized stabilization of interconnected systems. Additionally, further comprehensive discussions are presented, including event-based robust control design, improvement of the critic learning rule, nonlinear H ∞ control design, and several notes on future perspectives. By applying the ADP-based optimal and robust control methods to a practical power system and an overhead crane plant, two typical examples are provided to verify the effectiveness of theoretical results. Overall, this survey is beneficial to promote the development of adaptive critic control methods with robustness guarantee and the construction of higher level intelligent systems.

  12. Palmprint Based Verification System Using SURF Features

    NASA Astrophysics Data System (ADS)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype robust biometric verification system. The system uses features of the human hand extracted using the Speeded Up Robust Features (SURF) operator. The hand image used for feature extraction is acquired with a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images, and is found to be robust with respect to translation and rotation. It has an FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
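
    A minimal SURF matching sketch with OpenCV; SURF lives in the opencv-contrib package and may require a build with nonfree modules enabled, and the image file names are placeholders.

      import cv2

      # SURF requires opencv-contrib (and possibly OPENCV_ENABLE_NONFREE builds).
      surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

      img1 = cv2.imread("enrolled_palm.png", cv2.IMREAD_GRAYSCALE)
      img2 = cv2.imread("query_palm.png", cv2.IMREAD_GRAYSCALE)
      kp1, des1 = surf.detectAndCompute(img1, None)
      kp2, des2 = surf.detectAndCompute(img2, None)

      # Ratio-test matching; the match count serves as a similarity score.
      matcher = cv2.BFMatcher(cv2.NORM_L2)
      good = [m for m, n2 in matcher.knnMatch(des1, des2, k=2)
              if m.distance < 0.7 * n2.distance]
      print("good matches:", len(good))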

  13. Feedback system design with an uncertain plant

    NASA Technical Reports Server (NTRS)

    Milich, D.; Valavani, L.; Athans, M.

    1986-01-01

    A method is developed to design a fixed-parameter compensator for a linear, time-invariant, SISO (single-input single-output) plant model characterized by significant structured, as well as unstructured, uncertainty. The controller minimizes the H(infinity) norm of the worst-case sensitivity function over the operating band and the resulting feedback system exhibits robust stability and robust performance. It is conjectured that such a robust nonadaptive control design technique can be used on-line in an adaptive control system.

  14. Research in robust control for hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Calise, A. J.

    1993-01-01

    The research during the second reporting period has focused on robust control design for hypersonic vehicles. An already existing design for the Hypersonic Winged-Cone Configuration has been enhanced. Uncertainty models for the effects of propulsion system perturbations due to angle of attack variations, structural vibrations, and uncertainty in control effectiveness were developed. Using H(sub infinity) and mu-synthesis techniques, various control designs were performed in order to investigate the impact of these effects on achievable robust performance.

  15. Robust Stability and Control of Multi-Body Ground Vehicles with Uncertain Dynamics and Failures

    DTIC Science & Technology

    2010-01-01

    and N. Zhang, 2008. “Robust stability control of vehicle rollover subject to actuator time delay”. Proc. IMechE Part I: J. of systems and control ...Dynamic Systems and Control Conference, Boston, MA, Sept 2010. R.K. Yedavalli, “Robust Stability of Linear Interval Parameter Matrix Family Problem...for control coupled output regulation for a class of systems is presented. In section 2.1.7, the control design algorithm developed in section

  16. Trading Robustness Requirements in Mars Entry Trajectory Design

    NASA Technical Reports Server (NTRS)

    Lafleur, Jarret M.

    2009-01-01

    One of the most important metrics characterizing an atmospheric entry trajectory in preliminary design is the size of its predicted landing ellipse. Often, requirements for this ellipse are set early in design and significantly influence both the expected scientific return from a particular mission and the cost of development. Requirements typically specify a certain probability level (σ-level) for the prescribed ellipse, and frequently this latter requirement is taken at 3σ. However, searches for the justification of 3σ as a robustness requirement suggest it is an empirical rule of thumb borrowed from non-aerospace fields. This paper presents an investigation into the sensitivity of trajectory performance to varying robustness (σ-level) requirements. The treatment of robustness as a distinct objective is discussed, and an analysis framework is presented involving the manipulation of design variables to effect trades between performance and robustness objectives. The scenario for which this method is illustrated is the ballistic entry of an MSL-class Mars entry vehicle. Here, the design variable is entry flight path angle, and objectives are parachute deploy altitude performance and error ellipse robustness. Resulting plots show the sensitivities between these objectives and trends in the entry flight path angles required to design to these objectives. Relevance to the trajectory designer is discussed, as are potential steps for further development and use of this type of analysis.

  17. Robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming.

    PubMed

    Baran, Richard; Northen, Trent R

    2013-10-15

    Untargeted metabolite profiling using liquid chromatography and mass spectrometry coupled via electrospray ionization is a powerful tool for the discovery of novel natural products, metabolic capabilities, and biomarkers. However, the elucidation of the identities of uncharacterized metabolites from spectral features remains challenging. A critical step in the metabolite identification workflow is the assignment of redundant spectral features (adducts, fragments, multimers) and calculation of the underlying chemical formula. Inspection of the data by experts using computational tools solving partial problems (e.g., chemical formula calculation for individual ions) can be performed to disambiguate alternative solutions and provide reliable results. However, manual curation is tedious and not readily scalable or standardized. Here we describe an automated procedure for the robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming optimization (RAMSI). Chemical rules among related ions are expressed as linear constraints and both the spectra interpretation and chemical formula calculation are performed in a single optimization step. This approach is unbiased in that it does not require predefined sets of neutral losses and positive and negative polarity spectra can be combined in a single optimization. The procedure was evaluated with 30 experimental mass spectra and was found to effectively identify the protonated or deprotonated molecule ([M + H](+) or [M - H](-)) while being robust to the presence of background ions. RAMSI provides a much-needed standardized tool for interpreting ions for subsequent identification in untargeted metabolomics workflows.
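
    A toy version of chemical formula calculation as mathematical optimization, using PuLP: integer element counts are chosen so the computed monoisotopic mass matches an observed neutral mass within tolerance. The single valence-style constraint is a crude stand-in for RAMSI's full rule set.

      import pulp

      # Monoisotopic masses and a toy observed neutral mass (glucose, C6H12O6).
      masses = {"C": 12.0, "H": 1.007825, "N": 14.003074, "O": 15.994915}
      observed = 180.06339
      tol = 0.005  # Da

      prob = pulp.LpProblem("formula", pulp.LpMinimize)
      n = {e: pulp.LpVariable(e, 0, 60, cat="Integer") for e in masses}
      dev = pulp.LpVariable("dev", 0)

      calc = pulp.lpSum(masses[e] * n[e] for e in masses)
      prob += dev                              # minimize |calc - observed|
      prob += calc - observed <= dev
      prob += observed - calc <= dev
      prob += dev <= tol
      # Crude valence-style bound: H cannot exceed 2C + N + 2.
      prob += n["H"] <= 2 * n["C"] + n["N"] + 2

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      print({e: int(n[e].value()) for e in masses}, "dev:", dev.value())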

  18. MicroRNA function in Drosophila melanogaster.

    PubMed

    Carthew, Richard W; Agbu, Pamela; Giri, Ritika

    2017-05-01

    Over the last decade, microRNAs have emerged as critical regulators in the expression and function of animal genomes. This review article discusses the relationship between microRNA-mediated regulation and the biology of the fruit fly Drosophila melanogaster. We focus on the roles that microRNAs play in tissue growth, germ cell development, hormone action, and the development and activity of the central nervous system. We also discuss the ways in which microRNAs affect robustness. Many gene regulatory networks are robust; they are relatively insensitive to the precise values of reaction constants and concentrations of molecules acting within the networks. MicroRNAs involved in robustness appear to be nonessential under uniform conditions used in conventional laboratory experiments. However, the robust functions of microRNAs can be revealed when environmental or genetic variation otherwise has an impact on developmental outcomes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Stability characterization and modeling of robust distributed benthic microbial fuel cell (DBMFC) system.

    PubMed

    Karra, Udayarka; Huang, Guoxian; Umaz, Ridvan; Tenaglier, Christopher; Wang, Lei; Li, Baikun

    2013-09-01

    A novel and robust distributed benthic microbial fuel cell (DBMFC) was developed to address the energy supply issues for oceanographic sensor network applications, especially under scouring and bioturbation by aquatic life. A multi-anode/cathode configuration was employed in the DBMFC system for enhanced robustness and stability in the harsh ocean environment. The results showed that the DBMFC system achieved peak power and current densities of 190 mW/m(2) and 125 mA/m(2), respectively. Stability characterization tests indicated the DBMFC with multiple anodes achieved higher power generation than systems with a single anode. A computational model that integrated physical, electrochemical and biological factors of MFCs was developed to validate the overall performance of the DBMFC system. The model simulation corresponded well with the experimental results, and confirmed the hypothesis that using a multi-anode/cathode MFC configuration results in reliable and robust power generation. Published by Elsevier Ltd.

  20. WaveJava: Wavelet-based network computing

    NASA Astrophysics Data System (ADS)

    Ma, Kun; Jiao, Licheng; Shi, Zhuoer

    1997-04-01

    Wavelet theory is powerful, but its successful application still needs suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multi-threaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed with object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. Data are transmitted as multi-resolution packets. At distributed sites around the net, these data packets undergo matching or recognition processing in parallel, and the results are fed back to determine the next operation, so more robust results can be obtained quickly. WaveJava is easy to use and to extend for special applications. This paper gives a solution for a distributed fingerprint information processing system. The approach also fits other net-based multimedia information processing, such as network libraries, remote teaching and filmless picture archiving and communications.

  1. Allogeneic Cell Therapy Bioprocess Economics and Optimization: Single-Use Cell Expansion Technologies

    PubMed Central

    Simaria, Ana S; Hassan, Sally; Varadaraju, Hemanthram; Rowley, Jon; Warren, Kim; Vanek, Philip; Farid, Suzanne S

    2014-01-01

    For allogeneic cell therapies to reach their therapeutic potential, challenges related to achieving scalable and robust manufacturing processes will need to be addressed. A particular challenge is producing lot-sizes capable of meeting commercial demands of up to 10(9) cells/dose for large patient numbers due to the current limitations of expansion technologies. This article describes the application of a decisional tool to identify the most cost-effective expansion technologies for different scales of production as well as current gaps in the technology capabilities for allogeneic cell therapy manufacture. The tool integrates bioprocess economics with optimization to assess the economic competitiveness of planar and microcarrier-based cell expansion technologies. Visualization methods were used to identify the production scales where planar technologies will cease to be cost-effective and where microcarrier-based bioreactors become the only option. The tool outputs also predict that for the industry to be sustainable for high demand scenarios, significant increases will likely be needed in the performance capabilities of microcarrier-based systems. These data are presented using a technology S-curve as well as windows of operation to identify the combination of cell productivities and scale of single-use bioreactors required to meet future lot sizes. The modeling insights can be used to identify where future R&D investment should be focused to improve the performance of the most promising technologies so that they become a robust and scalable option that enables the cell therapy industry to reach commercially relevant lot sizes. The tool outputs can facilitate decision-making very early on in development and be used to predict, and better manage, the risk of process changes needed as products proceed through the development pathway. Biotechnol. Bioeng. 2014;111: 69–83. © 2013 Wiley Periodicals, Inc. PMID:23893544

  2. Allogeneic cell therapy bioprocess economics and optimization: single-use cell expansion technologies.

    PubMed

    Simaria, Ana S; Hassan, Sally; Varadaraju, Hemanthram; Rowley, Jon; Warren, Kim; Vanek, Philip; Farid, Suzanne S

    2014-01-01

    For allogeneic cell therapies to reach their therapeutic potential, challenges related to achieving scalable and robust manufacturing processes will need to be addressed. A particular challenge is producing lot-sizes capable of meeting commercial demands of up to 10⁹ cells/dose for large patient numbers due to the current limitations of expansion technologies. This article describes the application of a decisional tool to identify the most cost-effective expansion technologies for different scales of production as well as current gaps in the technology capabilities for allogeneic cell therapy manufacture. The tool integrates bioprocess economics with optimization to assess the economic competitiveness of planar and microcarrier-based cell expansion technologies. Visualization methods were used to identify the production scales where planar technologies will cease to be cost-effective and where microcarrier-based bioreactors become the only option. The tool outputs also predict that for the industry to be sustainable for high demand scenarios, significant increases will likely be needed in the performance capabilities of microcarrier-based systems. These data are presented using a technology S-curve as well as windows of operation to identify the combination of cell productivities and scale of single-use bioreactors required to meet future lot sizes. The modeling insights can be used to identify where future R&D investment should be focused to improve the performance of the most promising technologies so that they become a robust and scalable option that enables the cell therapy industry to reach commercially relevant lot sizes. The tool outputs can facilitate decision-making very early on in development and be used to predict, and better manage, the risk of process changes needed as products proceed through the development pathway. © 2013 Wiley Periodicals, Inc.
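
    The decisional tool itself is not public, but the crossover analysis it automates can be caricatured with a toy cost model. In the sketch below, every cost figure is an invented assumption, not a value from the study; the point is only the shape of the comparison across lot sizes.

```python
# Toy cost-of-goods comparison across lot sizes; all figures are
# illustrative assumptions, not values from the study.
CELLS_PER_DOSE = 1e9

def planar_cost(lot_cells):
    # Hypothetical: 2e9 cells per multi-layer vessel, fixed + per-vessel cost.
    vessels = -(-lot_cells // 2e9)  # ceiling division
    return 5_000 + 1_200 * vessels

def microcarrier_cost(lot_cells):
    # Hypothetical: 5e10 cells per single-use bioreactor run.
    runs = -(-lot_cells // 5e10)
    return 30_000 + 15_000 * runs

for doses in (10, 100, 1_000, 10_000):
    lot = doses * CELLS_PER_DOSE
    p = planar_cost(lot) / doses
    m = microcarrier_cost(lot) / doses
    best = "planar" if p < m else "microcarrier"
    print(f"{doses:>6} doses/lot: planar ${p:,.0f}/dose, "
          f"microcarrier ${m:,.0f}/dose -> {best}")
```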

  3. TH-CD-209-05: Impact of Spot Size and Spacing On the Quality of Robustly-Optimized Intensity-Modulated Proton Therapy Plans for Lung Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, W; Ding, X; Hu, Y

    Purpose: To investigate how spot size and spacing affect plan quality, especially plan robustness and the impact of the interplay effect, in robustly-optimized intensity-modulated proton therapy (IMPT) plans for lung cancer. Methods: Two robustly-optimized IMPT plans were created for 10 lung cancer patients: (1) one for a proton beam with in-air energy-dependent large spot size at isocenter (σ: 5–15 mm) and spacing (1.53σ); (2) the other for a proton beam with small spot size (σ: 2–6 mm) and spacing (5 mm). Both plans were generated on the average CTs with internal-gross-tumor-volume density overridden to irradiate the internal target volume (ITV). The root-mean-square-dose volume histograms (RVH) measured the sensitivity of the dose to uncertainties, and the areas under the RVH curves were used to evaluate plan robustness. Dose evaluation software was developed to model time-dependent spot delivery to incorporate the interplay effect with randomized starting phases of each field per fraction. Patient anatomy voxels were mapped from phase to phase via deformable image registration to score doses. Dose-volume-histogram indices including ITV coverage, homogeneity, and organs-at-risk (OAR) sparing were compared using Student's t-test. Results: Compared to large spots, small spots resulted in significantly better OAR sparing with comparable ITV coverage and homogeneity in the nominal plan. Plan robustness was comparable for the ITV and most OARs. With the interplay effect considered, significantly better OAR sparing with comparable ITV coverage and homogeneity was observed using smaller spots. Conclusion: Robust optimization with smaller spots significantly improves OAR sparing with comparable plan robustness and a similar impact of the interplay effect compared to larger spots. Small spot size requires the use of a larger number of spots, which gives the optimizer more freedom to render a plan more robust. The ratio between spot size and spacing was found to be more relevant in determining plan robustness and the impact of the interplay effect than spot size alone. This research was supported by the National Cancer Institute Career Developmental Award K25CA168984, by the Fraternal Order of Eagles Cancer Research Fund Career Development Award, by The Lawrence W. and Marilyn W. Matteson Fund for Cancer Research, by Mayo Arizona State University Seed Grant, and by The Kemper Marley Foundation.
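
    The plan-robustness metric used above, the area under the RVH curve, is straightforward to compute once perturbed dose distributions exist. A minimal numpy sketch, with random numbers standing in for real scenario doses:

```python
# Minimal sketch of a root-mean-square-dose volume histogram (RVH),
# using random numbers in place of real perturbed dose distributions.
import numpy as np

rng = np.random.default_rng(0)
n_scenarios, n_voxels = 9, 5000          # e.g. setup/range uncertainty scenarios
nominal = rng.uniform(50, 70, n_voxels)  # hypothetical nominal dose (Gy)
doses = nominal + rng.normal(0, 2, (n_scenarios, n_voxels))

# Per-voxel RMS deviation of dose across scenarios (relative to the mean).
rms = doses.std(axis=0)

# RVH: fraction of voxels whose RMS deviation exceeds each threshold.
thresholds = np.linspace(0, rms.max(), 100)
rvh = np.array([(rms > t).mean() for t in thresholds])

# Area under the RVH curve (trapezoid rule): a scalar plan-robustness
# index where smaller means more robust.
auc = float(np.sum(np.diff(thresholds) * (rvh[:-1] + rvh[1:]) / 2))
print(f"RVH area = {auc:.3f} Gy")
```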

  4. Assessing Instructional Sensitivity Using the Pre-Post Difference Index: A Nontechnical Tool for Extension Educators

    ERIC Educational Resources Information Center

    Adedokun, Omolola A.

    2018-01-01

    This article provides an illustrative description of the pre-post difference index (PPDI), a simple, nontechnical yet robust tool for examining the instructional sensitivity of assessment items. Extension educators often design pretest-posttest instruments to assess the impact of their curricula on participants' knowledge and understanding of the…
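
    Although the record above is truncated, the PPDI is conventionally the item-level difference between post- and pre-instruction percent correct. A minimal sketch with invented response matrices:

```python
# Pre-post difference index (PPDI) per item: percent correct after
# instruction minus percent correct before. Response data are invented.
import numpy as np

# rows = participants, columns = assessment items (1 = correct, 0 = incorrect)
pre = np.array([[0, 1, 0, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 0],
                [0, 1, 0, 1]])
post = np.array([[1, 1, 0, 1],
                 [1, 0, 1, 1],
                 [1, 1, 0, 1],
                 [1, 1, 1, 1]])

ppdi = post.mean(axis=0) - pre.mean(axis=0)
for item, d in enumerate(ppdi, start=1):
    # Items with a small or negative PPDI were likely insensitive to instruction.
    print(f"item {item}: PPDI = {d:+.2f}")
```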

  5. Integrated Testlets: A New Form of Expert-Student Collaborative Testing

    ERIC Educational Resources Information Center

    Shiell, Ralph C.; Slepkov, Aaron D.

    2015-01-01

    Integrated testlets are a new assessment tool that encompass the procedural benefits of multiple-choice testing, the pedagogical advantages of free-response-based tests, and the collaborative aspects of a viva voce or defence examination format. The result is a robust assessment tool that provides a significant formative aspect for students.…

  6. Evaluation of Cross-Protocol Stability of a Fully Automated Brain Multi-Atlas Parcellation Tool.

    PubMed

    Liang, Zifei; He, Xiaohai; Ceritoglu, Can; Tang, Xiaoying; Li, Yue; Kutten, Kwame S; Oishi, Kenichi; Miller, Michael I; Mori, Susumu; Faria, Andreia V

    2015-01-01

    Brain parcellation tools based on multiple-atlas algorithms have recently emerged as a promising method with which to accurately define brain structures. When dealing with data from various sources, it is crucial that these tools are robust for many different imaging protocols. In this study, we tested the robustness of a multiple-atlas, likelihood fusion algorithm using Alzheimer's Disease Neuroimaging Initiative (ADNI) data with six different protocols, comprising three manufacturers and two magnetic field strengths. The entire brain was parceled into five different levels of granularity. In each level, which defines a set of brain structures, ranging from eight to 286 regions, we evaluated the variability of brain volumes related to the protocol, age, and diagnosis (healthy or Alzheimer's disease). Our results indicated that, with proper pre-processing steps, the impact of different protocols is minor compared to biological effects, such as age and pathology. A precise knowledge of the sources of data variation enables sufficient statistical power and ensures the reliability of an anatomical analysis when using this automated brain parcellation tool on datasets from various imaging protocols, such as clinical databases.

  7. Potentials for the use of tool-integrated in-line data acquisition systems in press shops

    NASA Astrophysics Data System (ADS)

    Maier, S.; Schmerbeck, T.; Liebig, A.; Kautz, T.; Volk, W.

    2017-09-01

    Robust in-line data acquisition systems are required for the realization of process monitoring and control systems in press shops. A promising approach is to integrate sensors into the downstream press tools, where they can be easily integrated and maintained and where the necessary robustness for the rough press environment is achieved. Such concepts have already been investigated for measuring geometrical accuracy as well as material flow in inner part areas; they enable the quality of each produced part to be monitored. An important success factor is practical approaches to using this new process information in press shops. This work presents various applications of these measuring concepts, based on real car body components from the BMW Group. For example, the procedure of retroactive error analysis is explained for a side frame. It also shows how this data acquisition can be used to optimize drawing tools in tool shops. With the skid line, there is a continuous value that can be monitored from planning through to serial production.

  8. Linear Covariance Analysis for a Lunar Lander

    NASA Technical Reports Server (NTRS)

    Jang, Jiann-Woei; Bhatt, Sagar; Fritz, Matthew; Woffinden, David; May, Darryl; Braden, Ellen; Hannan, Michael

    2017-01-01

    A next-generation lunar lander Guidance, Navigation, and Control (GNC) system, which includes a state-of-the-art optical sensor suite, is proposed in a concept design cycle. The design goal is to allow the lander to softly land within the prescribed landing precision. The achievement of this precision landing requirement depends on proper selection of the sensor suite. In this paper, a robust sensor selection procedure is demonstrated using a Linear Covariance (LinCov) analysis tool developed by Draper.
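
    Draper's LinCov tool is proprietary, but the essence of any linear covariance analysis — propagating the state covariance through linearized dynamics and measurement updates instead of running Monte Carlo trajectories — fits in a few lines. A toy two-state sketch (all matrices are illustrative):

```python
# Minimal linear covariance (LinCov) propagate/update cycle for a toy
# two-state system; all matrices are illustrative assumptions.
import numpy as np

P = np.diag([100.0, 4.0])            # initial covariance (position, velocity)
F = np.array([[1.0, 1.0],            # state transition over one time step
              [0.0, 1.0]])
Q = np.diag([0.0, 0.01])             # process noise
H = np.array([[1.0, 0.0]])           # position-only sensor
R = np.array([[25.0]])               # measurement noise

for step in range(10):
    P = F @ P @ F.T + Q                       # time propagation
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    P = (np.eye(2) - K @ H) @ P               # measurement update

# Landing-type dispersion comes straight from the covariance,
# with no Monte Carlo runs required.
print("3-sigma position uncertainty:", 3 * np.sqrt(P[0, 0]))
```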

  9. Reconstruction of a piecewise constant conductivity on a polygonal partition via shape optimization in EIT

    NASA Astrophysics Data System (ADS)

    Beretta, Elena; Micheletti, Stefano; Perotto, Simona; Santacesaria, Matteo

    2018-01-01

    In this paper, we develop a shape optimization-based algorithm for the electrical impedance tomography (EIT) problem of determining a piecewise constant conductivity on a polygonal partition from boundary measurements. The key tool is to use a distributed shape derivative of a suitable cost functional with respect to movements of the partition. Numerical simulations showing the robustness and accuracy of the method are presented for simulated test cases in two dimensions.

  10. Contextual analysis of immunological response through whole-organ fluorescent imaging.

    PubMed

    Woodruff, Matthew C; Herndon, Caroline N; Heesters, B A; Carroll, Michael C

    2013-09-01

    As fluorescent microscopy has developed, significant insights have been gained into the establishment of immune response within secondary lymphoid organs, particularly in draining lymph nodes. While established techniques such as confocal imaging and intravital multi-photon microscopy have proven invaluable, they provide limited insight into the architectural and structural context in which these responses occur. To interrogate the role of the lymph node environment in immune response effectively, a new set of imaging tools taking into account broader architectural context must be implemented into emerging immunological questions. Using two different methods of whole-organ imaging, optical clearing and three-dimensional reconstruction of serially sectioned lymph nodes, fluorescent representations of whole lymph nodes can be acquired at cellular resolution. Using freely available post-processing tools, images of unlimited size and depth can be assembled into cohesive, contextual snapshots of immunological response. Through the implementation of robust iterative analysis techniques, these highly complex three-dimensional images can be objectified into sortable object data sets. These data can then be used to interrogate complex questions at the cellular level within the broader context of lymph node biology. By combining existing imaging technology with complex methods of sample preparation and capture, we have developed efficient systems for contextualizing immunological phenomena within lymphatic architecture. In combination with robust approaches to image analysis, these advances provide a path to integrating scientific understanding of basic lymphatic biology into the complex nature of immunological response.

  11. Noise in Neuronal and Electronic Circuits: A General Modeling Framework and Non-Monte Carlo Simulation Techniques.

    PubMed

    Kilinc, Deniz; Demir, Alper

    2017-08-01

    The brain is extremely energy efficient and remarkably robust in what it does despite the considerable variability and noise caused by the stochastic mechanisms in neurons and synapses. Computational modeling is a powerful tool that can help us gain insight into this important aspect of brain mechanism. A deep understanding and computational design tools can help develop robust neuromorphic electronic circuits and hybrid neuroelectronic systems. In this paper, we present a general modeling framework for biological neuronal circuits that systematically captures the nonstationary stochastic behavior of ion channels and synaptic processes. In this framework, fine-grained, discrete-state, continuous-time Markov chain models of both ion channels and synaptic processes are treated in a unified manner. Our modeling framework features a mechanism for the automatic generation of the corresponding coarse-grained, continuous-state, continuous-time stochastic differential equation models for neuronal variability and noise. Furthermore, we repurpose non-Monte Carlo noise analysis techniques, which were previously developed for analog electronic circuits, for the stochastic characterization of neuronal circuits both in time and frequency domain. We verify that the fast non-Monte Carlo analysis methods produce results with the same accuracy as computationally expensive Monte Carlo simulations. We have implemented the proposed techniques in a prototype simulator, where both biological neuronal and analog electronic circuits can be simulated together in a coupled manner.

  12. Ultra Lightweight Ballutes for Return to Earth from the Moon

    NASA Technical Reports Server (NTRS)

    Masciarelli, James P.; Lin, John K. H.; Ware, Joanne S.; Rohrschneider, Reuben R.; Braun, Robert D.; Bartels, Robert E.; Moses, Robert W.; Hall, Jeffery L.

    2006-01-01

    Ultra lightweight ballutes offer revolutionary mass and cost benefits along with flexibility in flight system design compared to traditional entry system technologies. Under funding provided by NASA's Exploration Systems Research & Technology program, our team was able to make progress in developing this technology through systems analysis and design, evaluation of materials and construction methods, and development of critical analysis tools. Results show that once this technology is mature, significant launch mass savings, operational simplicity, and mission robustness will be available to help carry out NASA's Vision for Space Exploration.

  13. Quantum Communications Systems

    DTIC Science & Technology

    2012-09-21

    metrology practical. The strategy was to develop robust photonic quantum states and sensors serving as an archetype for loss-tolerant information...communications and metrology. Our strategy consisted of developing robust photonic quantum states and sensors serving as an archetype for loss-tolerant...developed atomic memories in caesium vapour, based on a stimulated Raman transition, that have demonstrated a TBP greater than 1000 and are uniquely suited

  14. Interrogating the topological robustness of gene regulatory circuits by randomization

    PubMed Central

    Levine, Herbert; Onuchic, Jose N.

    2017-01-01

    One of the most important roles of cells is performing their cellular tasks properly for survival. Cells usually achieve robust functionality, for example, cell-fate decision-making and signal transduction, through multiple layers of regulation involving many genes. Despite the combinatorial complexity of gene regulation, its quantitative behavior has been typically studied on the basis of experimentally verified core gene regulatory circuitry, composed of a small set of important elements. It is still unclear how such a core circuit operates in the presence of many other regulatory molecules and in a crowded and noisy cellular environment. Here we report a new computational method, named random circuit perturbation (RACIPE), for interrogating the robust dynamical behavior of a gene regulatory circuit even without accurate measurements of circuit kinetic parameters. RACIPE generates an ensemble of random kinetic models corresponding to a fixed circuit topology, and utilizes statistical tools to identify generic properties of the circuit. By applying RACIPE to simple toggle-switch-like motifs, we observed that the stable states of all models converge to experimentally observed gene state clusters even when the parameters are strongly perturbed. RACIPE was further applied to a proposed 22-gene network of the Epithelial-to-Mesenchymal Transition (EMT), from which we identified four experimentally observed gene states, including the states that are associated with two different types of hybrid Epithelial/Mesenchymal phenotypes. Our results suggest that dynamics of a gene circuit is mainly determined by its topology, not by detailed circuit parameters. Our work provides a theoretical foundation for circuit-based systems biology modeling. We anticipate RACIPE to be a powerful tool to predict and decode circuit design principles in an unbiased manner, and to quantitatively evaluate the robustness and heterogeneity of gene expression. PMID:28362798
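
    RACIPE's core loop is easy to caricature: fix a topology, sample kinetic parameters at random, and collect the stable states of each sampled model. The sketch below does this for a two-gene toggle switch; the Hill-function model and parameter ranges are generic assumptions, not the published RACIPE settings.

```python
# Caricature of random circuit perturbation (RACIPE) on a toggle switch:
# sample random kinetics, integrate to steady state, collect the states.
# Model form and parameter ranges are generic assumptions.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(1)

def toggle(t, s, gA, gB, kA, kB, n):
    A, B = s
    dA = gA / (1.0 + B**n) - kA * A   # B represses A
    dB = gB / (1.0 + A**n) - kB * B   # A represses B
    return [dA, dB]

states = []
for _ in range(200):                      # ensemble of random models
    p = (rng.uniform(1, 50), rng.uniform(1, 50),   # production rates
         rng.uniform(0.1, 1), rng.uniform(0.1, 1), # degradation rates
         rng.integers(1, 5))                       # Hill coefficient
    for _ in range(3):                    # several random initial conditions
        s0 = rng.uniform(0, 50, 2)
        sol = solve_ivp(toggle, (0, 500), s0, args=p, rtol=1e-6)
        states.append(sol.y[:, -1])

states = np.log10(np.maximum(np.array(states), 1e-9))
# Crude "clustering": which gene dominates at steady state?
frac_A_high = float((states[:, 0] > states[:, 1]).mean())
print(f"A-high states: {frac_A_high:.2f}, B-high states: {1 - frac_A_high:.2f}")
```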

  15. The cardiovascular robustness hypothesis: Unmasking young adults' hidden risk for premature cardiovascular death.

    PubMed

    Kraushaar, Lutz E; Dressel, Alexander

    2018-03-01

    An undetected high risk for premature death of cardiovascular disease (CVD) among individuals with low-to-moderate risk factor levels is an acknowledged obstacle to CVD prevention. In this paper, we present the hypothesis that the vasculature's robustness against risk factor load will complement conventional risk factor models as a novel stratifier of risk. Figuratively speaking, mortality risk prediction without robustness scoring is akin to predicting the breaking risk of a lake's ice sheet considering load only while disregarding the sheet's bearing strength. Taking the cue from systems biology, which defines robustness as the ability to maintain function against internal and external challenges, we develop a robustness score from the physical parameters that comprehensively quantitate cardiovascular function. We derive the functional parameters using a recently introduced novel system, VascAssist 2 (iSYMED GmbH, Butzbach, Germany). VascAssist 2 (VA) applies the electronic-hydraulic analogy to a digital model of the arterial tree, replicating non-invasively acquired pulse pressure waves by modulating the electronic equivalents of the physical parameters that describe in vivo arterial hemodynamics. As the latter is also subject to aging-associated degeneration, which (a) progresses at inter-individually different rates and (b) affects the biomarker-mortality association, we express the robustness score as a correction factor to calendar age (CA), the dominant risk factor in all CVD risk factor models. We then propose a method for the validation of the score against known time-to-event data in reference populations. Our conceptualization of robustness implies that risk factor-challenged individuals with low robustness scores will face preferential elimination from the population, resulting in a significant robustness-CA correlation in this stratum that is absent in the unchallenged stratum. Hence, we also present an outline of a cross-sectional study design suitable to test this hypothesis. We finally discuss the objections that may validly be raised against our robustness hypothesis, and how available evidence encourages us to refute these objections. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Designing robust control laws using genetic algorithms

    NASA Technical Reports Server (NTRS)

    Marrison, Chris

    1994-01-01

    The purpose of this research is to create a method of finding practical, robust control laws. The robustness of a controller is judged by Stochastic Robustness metrics and the level of robustness is optimized by searching for design parameters that minimize a robustness cost function.
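
    In miniature, the approach looks like this: estimate the probability of closed-loop instability by Monte Carlo over sampled plant parameters, and let a search minimize a cost blending that probability with control effort. The sketch below uses a toy first-order plant and a random-mutation loop standing in for a full genetic algorithm; all numbers are invented.

```python
# Toy stochastic-robustness design: probability of closed-loop instability,
# estimated by Monte Carlo, drives a random-mutation search (a stand-in
# for a genetic algorithm). Plant and cost parameters are invented.
import numpy as np

rng = np.random.default_rng(2)
A_SAMPLES = rng.normal(2.0, 0.7, 2000)   # uncertain unstable plant pole a

def cost(k):
    # Plant x' = a*x + u with feedback u = -k*x: unstable when a - k > 0.
    p_unstable = float((A_SAMPLES - k > 0).mean())
    return p_unstable + 0.01 * k**2       # robustness metric + effort penalty

k = 0.0
best = cost(k)
for _ in range(500):                      # mutate-and-select search
    trial = k + rng.normal(0, 0.5)
    c = cost(trial)
    if c < best:
        k, best = trial, c

print(f"selected gain k = {k:.2f}, "
      f"P(instability) = {(A_SAMPLES - k > 0).mean():.4f}")
```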

  17. Reliability, robustness, and reproducibility in mouse behavioral phenotyping: a cross-laboratory study

    PubMed Central

    Mandillo, Silvia; Tucci, Valter; Hölter, Sabine M.; Meziane, Hamid; Banchaabouchi, Mumna Al; Kallnik, Magdalena; Lad, Heena V.; Nolan, Patrick M.; Ouagazzal, Abdel-Mouttalib; Coghill, Emma L.; Gale, Karin; Golini, Elisabetta; Jacquot, Sylvie; Krezel, Wojtek; Parker, Andy; Riet, Fabrice; Schneider, Ilka; Marazziti, Daniela; Auwerx, Johan; Brown, Steve D. M.; Chambon, Pierre; Rosenthal, Nadia; Tocchini-Valentini, Glauco; Wurst, Wolfgang

    2008-01-01

    Establishing standard operating procedures (SOPs) as tools for the analysis of behavioral phenotypes is fundamental to mouse functional genomics. It is essential that the tests designed provide reliable measures of the process under investigation but most importantly that these are reproducible across both time and laboratories. For this reason, we devised and tested a set of SOPs to investigate mouse behavior. Five research centers were involved across France, Germany, Italy, and the UK in this study, as part of the EUMORPHIA program. All the procedures underwent a cross-validation experimental study to investigate the robustness of the designed protocols. Four inbred reference strains (C57BL/6J, C3HeB/FeJ, BALB/cByJ, 129S2/SvPas), reflecting their use as common background strains in mutagenesis programs, were analyzed to validate these tests. We demonstrate that the operating procedures employed, which includes open field, SHIRPA, grip-strength, rotarod, Y-maze, prepulse inhibition of acoustic startle response, and tail flick tests, generated reproducible results between laboratories for a number of the test output parameters. However, we also identified several uncontrolled variables that constitute confounding factors in behavioral phenotyping. The EUMORPHIA SOPs described here are an important start-point for the ongoing development of increasingly robust phenotyping platforms and their application in large-scale, multicentre mouse phenotyping programs. PMID:18505770

  18. Ecobat: An online resource to facilitate transparent, evidence-based interpretation of bat activity data.

    PubMed

    Lintott, Paul R; Davison, Sophie; van Breda, John; Kubasiewicz, Laura; Dowse, David; Daisley, Jonathan; Haddy, Emily; Mathews, Fiona

    2018-01-01

    Acoustic surveys of bats are one of the techniques most commonly used by ecological practitioners. The results are used in Ecological Impact Assessments to assess the likely impacts of future developments on species that are widely protected in law, and to monitor developments post-construction. However, there is no standardized methodology for analyzing or interpreting these data, which can make the assessment of the ecological value of a site very subjective. Comparisons of sites and projects are therefore difficult for ecologists and decision-makers, for example, when trying to identify the best location for a new road based on relative bat activity levels along alternative routes. Here, we present a new web-based, data-driven tool, Ecobat, which addresses the need for a more robust way of interpreting ecological data. Ecobat offers users an easy, standardized, and objective method for analyzing bat activity data. It allows ecological practitioners to compare bat activity data at regional and national scales and to generate a numerical indicator of the relative importance of a night's worth of bat activity. The tool is free and open-source; because the underlying algorithms are already developed, it could easily be expanded to new geographical regions and species. Data donation is required to ensure the robustness of the analyses; we use a positive feedback mechanism to encourage ecological practitioners to share data by providing in return high-quality, contextualized data analysis and graphical visualizations for direct use in ecological reports.

  19. Low-Cost, Robust, and Field Portable Smartphone Platform Photometric Sensor for Fluoride Level Detection in Drinking Water.

    PubMed

    Hussain, Iftak; Ahamad, Kamal Uddin; Nath, Pabitra

    2017-01-03

    Groundwater is the major source of drinking water for people living in rural areas of India. Pollutants such as fluoride in groundwater may be present in much higher concentration than the permissible limit. Fluoride does not give any visible coloration to water, and hence, no effort is made to remove or reduce the concentration of this chemical present in drinking water. This may lead to a serious health hazard for those people taking groundwater as their primary source of drinking water. Sophisticated laboratory grade tools such as ion selective electrodes (ISE) and portable spectrophotometers are commercially available for in-field detection of fluoride level in drinking water. However, such tools are generally expensive and require expertise to handle. In this paper, we demonstrate the working of a low cost, robust, and field portable smartphone platform fluoride sensor that can detect and analyze fluoride concentration level in drinking water. For development of the proposed sensor, we utilize the ambient light sensor (ALS) of the smartphone as light intensity detector and its LED flash light as an optical source. An android application "FSense" has been developed which can detect and analyze the fluoride concentration level in water samples. The custom developed application can be used for sharing of in-field sensing data from any remote location to the central water quality monitoring station. We envision that the proposed sensing technique could be useful for initiating a fluoride removal program undertaken by governmental and nongovernmental organizations here in India.
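
    The photometric principle behind such a sensor is standard Beer–Lambert colorimetry: convert intensity readings to absorbance against a blank, then map absorbance to concentration through a calibration line. A sketch with invented readings (a linear reagent response over the range is assumed):

```python
# Smartphone-photometer style quantification via Beer-Lambert absorbance
# and a linear calibration; all intensity readings are invented.
import numpy as np

I_blank = 980.0                             # ALS reading through the blank

# Calibration standards: known fluoride (mg/L) vs. measured intensity.
conc_std = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
I_std = np.array([980.0, 870.0, 775.0, 690.0, 615.0])

absorbance = -np.log10(I_std / I_blank)
slope, intercept = np.polyfit(conc_std, absorbance, 1)  # linear fit

def fluoride_mg_per_L(I_sample):
    A = -np.log10(I_sample / I_blank)
    return (A - intercept) / slope

print(f"sample at I=720 -> {fluoride_mg_per_L(720.0):.2f} mg/L fluoride")
```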

  20. Robust location and spread measures for nonparametric probability density function estimation.

    PubMed

    López-Rubio, Ezequiel

    2009-10-01

    Robustness against outliers is a desirable property of any unsupervised learning scheme. In particular, probability density estimators benefit from incorporating this feature. A possible strategy to achieve this goal is to substitute the sample mean and the sample covariance matrix by more robust location and spread estimators. Here we use the L1-median to develop a nonparametric probability density function (PDF) estimator. We prove its most relevant properties, and we show its performance in density estimation and classification applications.
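
    The L1-median has no closed form, but Weiszfeld's fixed-point iteration converges quickly and shows the robustness being exploited: a gross outlier barely moves the estimate. A short sketch:

```python
# Weiszfeld iteration for the L1-median (geometric median), the robust
# location estimator used in place of the sample mean.
import numpy as np

def l1_median(X, tol=1e-8, max_iter=500):
    y = X.mean(axis=0)                     # initialize at the sample mean
    for _ in range(max_iter):
        d = np.linalg.norm(X - y, axis=1)
        d = np.maximum(d, 1e-12)           # guard against division by zero
        w = 1.0 / d
        y_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < tol:
            return y_new
        y = y_new
    return y

rng = np.random.default_rng(3)
X = rng.normal(0, 1, (100, 2))
X[0] = [100.0, 100.0]                      # one gross outlier
print("mean     :", X.mean(axis=0))        # dragged toward the outlier
print("L1-median:", l1_median(X))          # barely affected
```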

  1. Using biological effects tools to define Good Environmental Status under the European Union Marine Strategy Framework Directive.

    PubMed

    Lyons, B P; Thain, J E; Stentiford, G D; Hylland, K; Davies, I M; Vethaak, A D

    2010-10-01

    The use of biological effects tools offers enormous potential to meet the challenges outlined by the European Union Marine Strategy Framework Directive (MSFD), whereby Member States are required to develop a robust set of tools for defining 11 qualitative descriptors of Good Environmental Status (GES), such as demonstrating that "Concentrations of contaminants are at levels not giving rise to pollution effects" (GES Descriptor 8). This paper discusses the combined approach of monitoring chemical contaminant levels, alongside biological effect measurements relating to the effect of pollutants, for undertaking assessments of GES across European marine regions. We outline the minimum standards that biological effects tools should meet if they are to be used for defining GES in relation to Descriptor 8 and describe the current international initiatives underway to develop assessment criteria for these biological effects techniques. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.

  2. Tuning and Robustness Analysis for the Orion Absolute Navigation System

    NASA Technical Reports Server (NTRS)

    Holt, Greg N.; Zanetti, Renato; D'Souza, Christopher

    2013-01-01

    The Orion Multi-Purpose Crew Vehicle (MPCV) is currently under development as NASA's next-generation spacecraft for exploration missions beyond Low Earth Orbit. The MPCV is set to perform an orbital test flight, termed Exploration Flight Test 1 (EFT-1), some time in late 2014. The navigation system for the Orion spacecraft is being designed in a Multi-Organizational Design Environment (MODE) team including contractor and NASA personnel. The system uses an Extended Kalman Filter to process measurements and determine the state. The design of the navigation system has undergone several iterations and modifications since its inception, and continues as a work-in-progress. This paper seeks to show the efforts made to date in tuning the filter for the EFT-1 mission and instilling appropriate robustness into the system to meet the requirements of manned spaceflight. Filter performance is affected by many factors: data rates, sensor measurement errors, tuning, and others. This paper focuses mainly on the error characterization and tuning portion. Traditional efforts at tuning a navigation filter have centered around the observation/measurement noise and Gaussian process noise of the Extended Kalman Filter. While the Orion MODE team must certainly address those factors, the team is also looking at residual edit thresholds and measurement underweighting as tuning tools. Tuning analysis is presented with open-loop Monte Carlo simulation results showing statistical errors bounded by the 3-sigma filter uncertainty covariance. The Orion filter design uses 24 Exponentially Correlated Random Variable (ECRV) parameters to estimate the accel/gyro misalignment and nonorthogonality. By design, the time constant and noise terms of these ECRV parameters were set to manufacturer specifications and not used as tuning parameters. They are included in the filter as a more analytically correct method of modeling uncertainties than ad-hoc tuning of the process noise. Tuning is explored for the powered-flight ascent phase, where measurements are scarce and unmodelled vehicle accelerations dominate. On orbit, there are important trade-off cases between process and measurement noise. On entry, there are considerations about trading performance accuracy for robustness. Process noise is divided into powered flight and coasting flight and can be adjusted for each phase and mode of the Orion EFT-1 mission. Measurement noise is used for the integrated velocity measurements during pad alignment. It is also used for Global Positioning System (GPS) pseudorange and delta-range measurements during the rest of the flight. The robustness effort has been focused on maintaining filter convergence and performance in the presence of unmodeled error sources. These include unmodeled forces on the vehicle and uncorrected errors on the sensor measurements. Orion uses a single-frequency, non-keyed GPS receiver, so the effects due to signal distortion in Earth's ionosphere and troposphere are present in the raw measurements. Results are presented showing the efforts to compensate for these errors as well as characterize the residual effect for measurement noise tuning. Another robustness tool in use is tuning the residual edit thresholds. The trade-off between noise tuning and edit thresholds is explored in the context of robustness to errors in dynamics models and sensor measurements.
Measurement underweighting is also presented as a method of additional robustness when processing highly accurate measurements in the presence of large filter uncertainties.
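
    The two robustness tools named above, residual edit thresholds and measurement underweighting, fit in a few lines of a scalar Kalman update. In this toy one-state sketch, the gating constant and the underweighting form (inflating the predicted measurement variance when it dominates the sensor noise) are common conventions, not the Orion implementation:

```python
# Toy scalar Kalman update showing a residual edit test and measurement
# underweighting; gating and underweighting constants are illustrative.
def update(x, P, z, R, gate_sigmas=3.0, underweight=0.2):
    innov = z - x                       # residual (H = 1 for this toy filter)
    S = P + R                           # innovation variance
    if innov**2 > (gate_sigmas**2) * S:
        return x, P, "edited"           # residual edit: reject the measurement
    # Underweighting: inflate the mapped state variance when it dominates R,
    # so highly accurate measurements do not over-contract the covariance.
    if P > R:
        S = (1.0 + underweight) * P + R
    K = P / S
    return x + K * innov, (1.0 - K) * P, "accepted"

x, P = 0.0, 100.0
for z in (1.2, 0.8, 55.0, 1.1):         # third value is a gross outlier
    x, P, status = update(x, P, z, R=1.0)
    print(f"z={z:6.1f} -> {status:8s} x={x:6.3f} P={P:7.3f}")
```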

  3. Management applications of discontinuity theory

    EPA Science Inventory

    1.Human impacts on the environment are multifaceted and can occur across distinct spatiotemporal scales. Ecological responses to environmental change are therefore difficult to predict, and entail large degrees of uncertainty. Such uncertainty requires robust tools for management...

  4. Development and utilization of a web-based application as a robust radiology teaching tool (RadStax) for medical student anatomy teaching.

    PubMed

    Colucci, Philip G; Kostandy, Petro; Shrauner, William R; Arleo, Elizabeth; Fuortes, Michele; Griffin, Andrew S; Huang, Yun-Han; Juluru, Krishna; Tsiouris, Apostolos John

    2015-02-01

    Rationale and Objectives: The primary role of radiology in the preclinical setting is the use of imaging to improve students' understanding of anatomy. Many currently available Web-based anatomy programs include either suboptimal or overwhelming levels of detail for medical students. Our objective was to develop a user-friendly software program that anatomy instructors can completely tailor to match the desired level of detail for their curriculum, meets the unique needs of the first- and the second-year medical students, and is compatible with most Internet browsers and tablets. Materials and Methods: RadStax is a Web-based application developed using free, open-source, ubiquitous software. RadStax was first introduced as an interactive resource for independent study and later incorporated into lectures. First- and second-year medical students were surveyed for quantitative feedback regarding their experience. Results: RadStax was successfully introduced into our medical school curriculum. It allows the creation of learning modules with labeled multiplanar (MPR) image sets, basic anatomic information, and a self-assessment feature. The program received overwhelmingly positive feedback from students. Of 115 students surveyed, 87.0% found it highly effective as a study tool and 85.2% reported high user satisfaction with the program. Conclusions: RadStax is a novel application for instructors wishing to create an atlas of labeled MPR radiologic studies tailored to meet the specific needs of their curriculum. Simple and focused, it provides an interactive experience for students similar to the practice of radiologists. This program is a robust anatomy teaching tool that effectively aids in educating the preclinical medical student.

  5. Development and Utilization of a Web-Based Application as a Robust Radiology Teaching Tool (RadStax) for Medical Student Anatomy Teaching

    PubMed Central

    Colucci, Philip G.; Kostandy, Petro; Shrauner, William R.; Arleo, Elizabeth; Fuortes, Michele; Griffin, Andrew S.; Huang, Yun-Han; Juluru, Krishna; Tsiouris, Apostolos John

    2016-01-01

    Rationale and Objectives The primary role of radiology in the preclinical setting is the use of imaging to improve students’ understanding of anatomy. Many currently available Web-based anatomy programs include either suboptimal or overwhelming levels of detail for medical students. Our objective was to develop a user-friendly software program that anatomy instructors can completely tailor to match the desired level of detail for their curriculum, meets the unique needs of the first- and the second-year medical students, and is compatible with most Internet browsers and tablets. Materials and Methods RadStax is a Web-based application developed using free, open-source, ubiquitous software. RadStax was first introduced as an interactive resource for independent study and later incorporated into lectures. First- and second-year medical students were surveyed for quantitative feedback regarding their experience. Results RadStax was successfully introduced into our medical school curriculum. It allows the creation of learning modules with labeled multiplanar (MPR) image sets, basic anatomic information, and a self-assessment feature. The program received overwhelmingly positive feedback from students. Of 115 students surveyed, 87.0% found it highly effective as a study tool and 85.2% reported high user satisfaction with the program. Conclusions RadStax is a novel application for instructors wishing to create an atlas of labeled MPR radiologic studies tailored to meet the specific needs of their curriculum. Simple and focused, it provides an interactive experience for students similar to the practice of radiologists. This program is a robust anatomy teaching tool that effectively aids in educating the preclinical medical student. PMID:25964956

  6. Increased robustness of single-molecule counting with microfluidics, digital isothermal amplification, and a mobile phone versus real-time kinetic measurements.

    PubMed

    Selck, David A; Karymov, Mikhail A; Sun, Bing; Ismagilov, Rustem F

    2013-11-19

    Quantitative bioanalytical measurements are commonly performed in a kinetic format and are known to not be robust to perturbation that affects the kinetics itself or the measurement of kinetics. We hypothesized that the same measurements performed in a "digital" (single-molecule) format would show increased robustness to such perturbations. Here, we investigated the robustness of an amplification reaction (reverse-transcription loop-mediated amplification, RT-LAMP) in the context of fluctuations in temperature and time when this reaction is used for quantitative measurements of HIV-1 RNA molecules under limited-resource settings (LRS). The digital format that counts molecules using dRT-LAMP chemistry detected a 2-fold change in concentration of HIV-1 RNA despite a 6 °C temperature variation (p-value = 6.7 × 10⁻⁷), whereas the traditional kinetic (real-time) format did not (p-value = 0.25). Digital analysis was also robust to a 20 min change in reaction time, to poor imaging conditions obtained with a consumer cell-phone camera, and to automated cloud-based processing of these images (R² = 0.9997 vs true counts over a 100-fold dynamic range). Fluorescent output of multiplexed PCR amplification could also be imaged with the cell phone camera using flash as the excitation source. Many nonlinear amplification schemes based on organic, inorganic, and biochemical reactions have been developed, but their robustness is not well understood. This work implies that these chemistries may be significantly more robust in the digital, rather than kinetic, format. It also calls for theoretical studies to predict robustness of these chemistries and, more generally, to design robust reaction architectures. The SlipChip that we used here and other digital microfluidic technologies already exist to enable testing of these predictions. Such work may lead to identification or creation of robust amplification chemistries that enable rapid and precise quantitative molecular measurements under LRS. Furthermore, it may provide more general principles describing robustness of chemical and biological networks in digital formats.
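
    The robustness of the digital format comes from counting rather than timing: each partition is only positive or negative, and concentration follows from Poisson statistics. A minimal sketch of that conversion, with invented well counts and geometry:

```python
# Digital quantification: convert the fraction of positive partitions to
# a concentration via Poisson statistics. Well geometry is invented.
import math

n_wells = 1280           # total partitions on a hypothetical SlipChip
positives = 412          # wells that amplified
well_volume_nl = 5.0     # partition volume in nanoliters

p = positives / n_wells
lam = -math.log(1.0 - p)                 # mean molecules per well (Poisson)
copies_per_ul = lam / (well_volume_nl * 1e-3)

print(f"fraction positive = {p:.3f}")
print(f"lambda = {lam:.3f} molecules/well")
print(f"concentration ~ {copies_per_ul:,.0f} copies/uL")
```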

  7. Robust, Optimal Water Infrastructure Planning Under Deep Uncertainty Using Metamodels

    NASA Astrophysics Data System (ADS)

    Maier, H. R.; Beh, E. H. Y.; Zheng, F.; Dandy, G. C.; Kapelan, Z.

    2015-12-01

    Optimal long-term planning plays an important role in many water infrastructure problems. However, this task is complicated by deep uncertainty about future conditions, such as the impact of population dynamics and climate change. One way to deal with this uncertainty is by means of robustness, which aims to ensure that water infrastructure performs adequately under a range of plausible future conditions. However, as robustness calculations require computationally expensive system models to be run for a large number of scenarios, it is generally computationally intractable to include robustness as an objective in the development of optimal long-term infrastructure plans. In order to overcome this shortcoming, an approach is developed that uses metamodels instead of computationally expensive simulation models in robustness calculations. The approach is demonstrated for the optimal sequencing of water supply augmentation options for the southern portion of the water supply for Adelaide, South Australia. A 100-year planning horizon is subdivided into ten equal decision stages for the purpose of sequencing various water supply augmentation options, including desalination, stormwater harvesting and household rainwater tanks. The objectives include the minimization of average present value of supply augmentation costs, the minimization of average present value of greenhouse gas emissions and the maximization of supply robustness. The uncertain variables are rainfall, per capita water consumption and population. Decision variables are the implementation stages of the different water supply augmentation options. Artificial neural networks are used as metamodels to enable all objectives to be calculated in a computationally efficient manner at each of the decision stages. The results illustrate the importance of identifying optimal staged solutions to ensure robustness and sustainability of water supply into an uncertain long-term future.
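
    The metamodel trick is generic: fit a cheap regressor to a modest number of expensive simulator runs, then score robustness over many scenarios against the surrogate. A hedged scikit-learn sketch, in which the "expensive simulator" is a stand-in function rather than the actual Adelaide supply model:

```python
# Surrogate-based robustness evaluation: train an ANN metamodel on a few
# "expensive" simulator runs, then screen many scenarios cheaply.
# The simulator below is a stand-in, not the actual supply model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)

def expensive_simulator(x):
    # x = (rainfall factor, per-capita demand factor, population factor)
    rain, demand, pop = x
    return 100.0 * rain - 60.0 * demand * pop   # toy supply-minus-demand margin

# A limited budget of full simulator runs provides the training set.
X_train = rng.uniform(0.5, 1.5, (200, 3))
y_train = np.array([expensive_simulator(x) for x in X_train])

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X_train, y_train)

# Robustness: fraction of plausible futures in which the margin stays positive.
scenarios = rng.uniform(0.5, 1.5, (100_000, 3))
robustness = float((surrogate.predict(scenarios) > 0.0).mean())
print(f"estimated robustness = {robustness:.3f}")
```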

  8. Individualized relapse prediction: Personality measures and striatal and insular activity during reward-processing robustly predict relapse.

    PubMed

    Gowin, Joshua L; Ball, Tali M; Wittmann, Marc; Tapert, Susan F; Paulus, Martin P

    2015-07-01

    Nearly half of individuals with substance use disorders relapse in the year after treatment. A diagnostic tool to help clinicians make decisions regarding treatment does not exist for psychiatric conditions. Identifying individuals with high risk for relapse to substance use following abstinence has profound clinical consequences. This study aimed to develop neuroimaging as a robust tool to predict relapse. 68 methamphetamine-dependent adults (15 female) were recruited from 28-day inpatient treatment. During treatment, participants completed a functional MRI scan that examined brain activation during reward processing. Patients were followed 1 year later to assess abstinence. We examined brain activation during reward processing between relapsing and abstaining individuals and employed three random forest prediction models (clinical and personality measures, neuroimaging measures, a combined model) to generate predictions for each participant regarding their relapse likelihood. 18 individuals relapsed. There were significant group by reward-size interactions for neural activation in the left insula and right striatum for rewards. Abstaining individuals showed increased activation for large, risky relative to small, safe rewards, whereas relapsing individuals failed to show differential activation between reward types. All three random forest models yielded good test characteristics such that a positive test for relapse yielded a likelihood ratio of 2.63, whereas a negative test had a likelihood ratio of 0.48. These findings suggest that neuroimaging can be developed in combination with other measures as an instrument to predict relapse, advancing tools providers can use to make decisions about individualized treatment of substance use disorders. Published by Elsevier Ireland Ltd.
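
    The reported pipeline reduces to a standard pattern: fit a random forest on the measures, then express test characteristics as likelihood ratios, LR+ = sensitivity/(1 − specificity) and LR− = (1 − sensitivity)/specificity. A sketch on synthetic features (not the study's data):

```python
# Random-forest relapse prediction with likelihood-ratio test statistics,
# demonstrated on synthetic features (not the study's data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 400
X = rng.normal(0, 1, (n, 6))             # e.g. personality + activation scores
logit = 1.5 * X[:, 0] - 1.0 * X[:, 3]    # two informative features
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)  # 1 = relapse

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)

tp = ((pred == 1) & (y_te == 1)).sum()
fn = ((pred == 0) & (y_te == 1)).sum()
tn = ((pred == 0) & (y_te == 0)).sum()
fp = ((pred == 1) & (y_te == 0)).sum()
sens, spec = tp / (tp + fn), tn / (tn + fp)
print(f"LR+ = {sens / (1 - spec):.2f}, LR- = {(1 - sens) / spec:.2f}")
```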

  9. A simultaneous determination of related substances by high performance liquid chromatography in a drug product using quality by design approach.

    PubMed

    Tol, Trupti; Kadam, Nilesh; Raotole, Nilesh; Desai, Anita; Samanta, Gautam

    2016-02-05

    The combination of Abacavir, Lamivudine and Dolutegravir is an anti-retroviral formulation that displays high efficacy and superiority in comparison to other anti-retroviral combinations. Analysis of related substances in this combination drug product was very challenging due to the presence of nearly thirty peaks, including the three active pharmaceutical ingredients (APIs), eleven known impurities, and other pharmaceutical excipients. The objective of this study was to develop a single, selective, and robust high performance liquid chromatography method for the efficient separation of all peaks. Initially, a one-factor-at-a-time (OFAT) approach was adopted to develop the method, but it could not resolve all the critical peaks in such a complex matrix. This led to two different HPLC methods for the determination of related substances, one for Abacavir and Lamivudine and the other for Dolutegravir. Since analysis of a single sample using two methods instead of one is time- and resource-consuming, and thus expensive, an attempt was made to develop a single, robust method by adopting quality by design (QbD) principles. Design of Experiments (DoE) was applied as a tool to achieve the optimum conditions through response surface methodology with three method variables: pH, temperature, and mobile phase composition. As the study progressed, it was discovered that establishing a single design space was not viable due to the completely distant pH requirements of the two responses, i.e., (i) retention time for Lamivudine carboxylic acid and (ii) resolution between Abacavir impurity B and an unknown impurity. Eventually, neglecting one of these two responses each time, two distinct design spaces were established and verified. Edges of failure at both design spaces indicate a high probability of failure. It therefore becomes very important to identify the most robust zone, or normal operating range (NOR), within the design space with a low risk of failure and high quality assurance. For NOR establishment, Monte Carlo simulation was performed, on the basis of which the process capability index (Cpk) was derived. Finally, the selectivity problem caused by the pH dependency and the dissimilar pH needs of the two critical responses was resolved by introducing a pH gradient into the program. This new ternary gradient program has provided a single robust method. Thus, two HPLC methods for the analysis of the combination drug product have been replaced with a selective, robust, and cost-effective single method. Copyright © 2015 Elsevier B.V. All rights reserved.
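
    The NOR step described above, Monte Carlo simulation over the method factors followed by a capability index, can be sketched directly. Here a toy response surface stands in for the real chromatographic resolution model, and Cpk is computed against a lower specification limit; every coefficient is invented:

```python
# Monte Carlo evaluation of a normal operating range: simulate method
# factors, compute a (toy) resolution response, and derive Cpk against
# a lower specification limit. All model coefficients are invented.
import numpy as np

rng = np.random.default_rng(6)
N = 100_000

# Factor variation around a candidate operating point within the design space.
pH   = rng.normal(4.5, 0.05, N)    # buffer pH
temp = rng.normal(30.0, 0.5, N)    # column temperature (degC)
org  = rng.normal(18.0, 0.4, N)    # organic modifier (%)

# Hypothetical response surface for the critical resolution.
Rs = 2.0 + 1.5 * (pH - 4.5) - 0.05 * (temp - 30.0) - 0.10 * (org - 18.0)

LSL = 1.5                                         # minimum acceptable resolution
cpk = (Rs.mean() - LSL) / (3.0 * Rs.std())        # one-sided capability index
print(f"mean Rs = {Rs.mean():.3f}, Cpk = {cpk:.2f}, "
      f"P(Rs < LSL) = {(Rs < LSL).mean():.4f}")
```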

  10. Vehicle active steering control research based on two-DOF robust internal model control

    NASA Astrophysics Data System (ADS)

    Wu, Jian; Liu, Yahui; Wang, Fengbo; Bao, Chunjiang; Sun, Qun; Zhao, Youqun

    2016-07-01

    Because of a vehicle's external disturbances and model uncertainties, robust control algorithms have gained popularity in vehicle stability control. Robust control usually sacrifices performance in order to guarantee robustness, so an improved robust internal model control (IMC) algorithm blending model tracking and internal model control is put forward for the active steering system in order to achieve high yaw-rate tracking performance with guaranteed robustness. The proposed algorithm inherits the good model-tracking ability of IMC and guarantees robustness to model uncertainties. In order to separate the model-tracking design from the robustness design, the improved two-degree-of-freedom (2-DOF) robust internal model controller structure is derived from the standard Youla parameterization. Simulations of a double-lane-change maneuver and of crosswind disturbances are conducted to evaluate the robust control algorithm, on the basis of a nonlinear vehicle simulation model with a Magic Formula tyre model. Results show that the 2-DOF robust IMC method has better model-tracking ability and a guaranteed level of robustness and robust performance, which can enhance vehicle stability and handling regardless of variations in the vehicle model parameters and external crosswind interference. The contradiction between performance and robustness of the active steering control algorithm is thus resolved, and higher control performance with guaranteed robustness to model uncertainties is obtained.

  11. On the robustness of vocal development: an examination of infants with moderate-to-severe hearing loss and additional risk factors.

    PubMed

    Nathani, Suneeti; Oller, D Kimbrough; Neal, A Rebecca

    2007-12-01

    Onset of canonical babbling by 10 months of age is surprisingly robust in infancy, suggesting that there must be deep biological forces that keep the development of this key vocal capability on course. This study further evaluated the robustness of canonical babbling and other aspects of prelinguistic vocal development. Longitudinal observation was conducted on 4 infants who were at risk for abnormal vocal development because of bilateral moderate-to-severe sensorineural hearing loss and additional risk factors for developmental delay. Two of the infants were delayed in the onset of canonical babbling and showed greater fluctuation in canonical babbling ratios following its onset than did typically developing infants. On the same measures, the remaining 2 infants were within normal limits, although their age of onset for canonical babbling was later than the mean for typically developing infants. Volubility was not notably different from typically developing infants. Differences from typically developing infants were, however, observed in proportions of various prelinguistic syllable types produced across time. Results provided further evidence of robustness of canonical babbling and indicated the need for a large parametric study evaluating effects of varying degrees of hearing loss and other risk factors on vocal development.

  12. Difet: Distributed Feature Extraction Tool for High Spatial Resolution Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Eken, S.; Aydın, E.; Sayar, A.

    2017-11-01

    In this paper, we propose a distributed feature extraction tool for high spatial resolution remote sensing images. The tool is based on the Apache Hadoop framework and the Hadoop Image Processing Interface. Two corner detection algorithms (Harris and Shi-Tomasi) and five feature descriptors (SIFT, SURF, FAST, BRIEF, and ORB) are considered. The robustness of the tool for feature extraction from Landsat-8 imagery is evaluated in terms of horizontal scalability.

  13. Flux control-based design of furfural-resistance strains of Saccharomyces cerevisiae for lignocellulosic biorefinery.

    PubMed

    Unrean, Pornkamol

    2017-04-01

    We have previously developed a dynamic flux balance analysis of Saccharomyces cerevisiae for elucidation of the genome-wide flux response to furfural perturbation (Unrean and Franzen, Biotechnol J 10(8):1248-1258, 2015). Herein, the dynamic flux distributions were analyzed by flux control analysis to identify overexpression targets for improved yeast robustness against furfural. Flux control coefficient (FCC) analysis identified overexpression of isocitrate dehydrogenase (IDH1), a rate-controlling flux for ethanol fermentation, and of the dicarboxylate carrier (DIC1), a limiting flux for cell growth, as keys to a furfural-resistance phenotype. Consistent with the model prediction, strain characterization showed 1.2- and 2.0-fold improvements in ethanol synthesis and furfural detoxification rates, respectively, for the IDH1-overexpressing mutant compared to the control. The DIC1-overexpressing mutant grew 1.3-fold faster and reduced furfural 1.4-fold faster than the control under the furfural challenge. This study hence demonstrated the FCC-based approach as an effective tool for guiding the design of robust yeast strains.

  14. Differentiation and characterization of human pluripotent stem cell-derived brain microvascular endothelial cells.

    PubMed

    Stebbins, Matthew J; Wilson, Hannah K; Canfield, Scott G; Qian, Tongcheng; Palecek, Sean P; Shusta, Eric V

    2016-05-15

    The blood-brain barrier (BBB) is a critical component of the central nervous system (CNS) that regulates the flux of material between the blood and the brain. Because of its barrier properties, the BBB creates a bottleneck to CNS drug delivery. Human in vitro BBB models offer a potential tool to screen pharmaceutical libraries for CNS penetration as well as for BBB modulators in development and disease, yet primary and immortalized models respectively lack scalability and robust phenotypes. Recently, in vitro BBB models derived from human pluripotent stem cells (hPSCs) have helped overcome these challenges by providing a scalable and renewable source of human brain microvascular endothelial cells (BMECs). We have demonstrated that hPSC-derived BMECs exhibit robust structural and functional characteristics reminiscent of the in vivo BBB. Here, we provide a detailed description of the methods required to differentiate and functionally characterize hPSC-derived BMECs to facilitate their widespread use in downstream applications. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Design and test of three active flutter suppression controllers

    NASA Technical Reports Server (NTRS)

    Christhilf, David M.; Waszak, Martin R.; Adams, William M.; Srinathkumar, S.; Mukhopadhyay, Vivek

    1991-01-01

    Three flutter suppression control law design techniques are presented. Each uses multiple control surfaces and/or sensors. The first uses linear combinations of several accelerometer signals together with dynamic compensation to synthesize the modal rate of the critical mode for feedback to distributed control surfaces. The second uses traditional tools (pole/zero loci and Nyquist diagrams) to develop a good understanding of the flutter mechanism and produce a controller with minimal complexity and good robustness to plant uncertainty. The third starts with a minimum energy Linear Quadratic Gaussian controller, applies controller order reduction, and then modifies weight and noise covariance matrices to improve multi-variable robustness. The resulting designs were implemented digitally and tested subsonically on the Active Flexible Wing (AFW) wind tunnel model. Test results presented here include plant characteristics, maximum attained closed-loop dynamic pressure, and Root Mean Square control surface activity. A key result is that simultaneous symmetric and antisymmetric flutter suppression was achieved by the second control law, with a 24 percent increase in attainable dynamic pressure.

  16. Monocyte Chemotactic Protein 1 in Plasma from Soluble Leishmania Antigen-Stimulated Whole Blood as a Potential Biomarker of the Cellular Immune Response to Leishmania infantum

    PubMed Central

    Ibarra-Meneses, Ana V.; Sanchez, Carmen; Alvar, Jorge; Moreno, Javier; Carrillo, Eugenia

    2017-01-01

    New biomarkers are needed to identify asymptomatic Leishmania infection as well as immunity following vaccination or treatment. With the aim of finding a robust biomarker to assess an effective cellular immune response, monocyte chemotactic protein 1 (MCP-1) was examined in plasma from soluble Leishmania antigen (SLA)-stimulated whole blood collected from subjects living in a Leishmania infantum-endemic area. MCP-1, expressed 110 times more strongly than IL-2, identified 87.5% of asymptomatic subjects and verified some asymptomatic subjects close to the cutoff. MCP-1 was also significantly elevated in all patients cured of visceral leishmaniasis (VL), unlike IL-2, indicating the specific memory response generated against Leishmania. These results show MCP-1 to be a robust candidate biomarker of immunity that could be used as a marker of cure and, once rapid, easy-to-use field tools are developed, to both select and follow populations in phase I–III human vaccine clinical trials. PMID:29033933

  17. The Teaching of Anthropogenic Climate Change and Earth Science via Technology-Enabled Inquiry Education

    NASA Technical Reports Server (NTRS)

    Bush, Drew; Sieber, Renee; Seiler, Gale; Chandler, Mark

    2016-01-01

    A gap has existed between the tools and processes of scientists working on anthropogenic global climate change (AGCC) and the technologies and curricula available to educators teaching the subject through student inquiry. Designing realistic scientific inquiry into AGCC poses a challenge because research on it relies on complex computer models, globally distributed data sets, and complex laboratory and data collection procedures. Here we examine efforts by the scientific community and educational researchers to design new curricula and technology that close this gap and impart robust AGCC and Earth Science understanding. We find technology-based teaching shows promise in promoting robust AGCC understandings if associated curricula address mitigating factors such as time constraints in incorporating technology and the need to support teachers implementing AGCC and Earth Science inquiry. We recommend the scientific community continue to collaborate with educational researchers to focus on developing those inquiry technologies and curricula that use realistic scientific processes from AGCC research and/or the methods for determining how human society should respond to global change.

  18. Cyanobacterial chassis engineering for enhancing production of biofuels and chemicals.

    PubMed

    Gao, Xinyan; Sun, Tao; Pei, Guangsheng; Chen, Lei; Zhang, Weiwen

    2016-04-01

To reduce dependence on fossil fuels and curb the greenhouse effect, cyanobacteria have emerged as an important chassis candidate for producing biofuels and chemicals due to their capability to directly utilize sunlight and CO2 as the sole energy and carbon sources, respectively. Recent progress in developing and applying various synthetic biology tools has led to the successful construction of novel pathways for several dozen green fuels and chemicals in cyanobacterial chassis. Meanwhile, it is increasingly recognized that, to enhance the productivity of these synthetic cyanobacterial systems, more robust and efficient cyanobacterial chassis must also be engineered and optimized. In recent years, numerous studies have sought to enhance production of green fuels and chemicals through cyanobacterial chassis modifications involving photosynthesis, CO2 uptake and fixation, product export, tolerance, and cellular regulation. In this article, we critically review recent progress and general strategies in cyanobacterial chassis engineering aimed at making the chassis more robust and effective for bio-chemical production.

  19. A framework for sensitivity analysis of decision trees.

    PubMed

    Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław

    2018-01-01

In this paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making, and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
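
    The perturbation analysis is straightforward to prototype. The following is a minimal sketch, not the authors' software tool: the one-stage tree, its payoffs, and the Dirichlet-style jitter are hypothetical, but the script shows how an expected-value-maximizing strategy can be stress-tested against pessimistic probability perturbations.

```python
import numpy as np

# A one-stage decision tree: each strategy leads to a chance node
# with outcome probabilities and payoffs (hypothetical numbers).
strategies = {
    "A": {"probs": np.array([0.6, 0.4]), "payoffs": np.array([100.0, -50.0])},
    "B": {"probs": np.array([0.3, 0.7]), "payoffs": np.array([200.0, -20.0])},
}

def expected_value(probs, payoffs):
    return float(probs @ payoffs)

def worst_case_ev(probs, payoffs, eps, n=1000, seed=0):
    """Sample perturbed probability vectors concentrated around `probs`
    (Dirichlet jitter; tighter as eps -> 0) and return the most
    pessimistic expected value encountered."""
    rng = np.random.default_rng(seed)
    worst = np.inf
    for _ in range(n):
        p = rng.dirichlet(probs / eps)
        worst = min(worst, expected_value(p, payoffs))
    return worst

for name, node in strategies.items():
    ev = expected_value(node["probs"], node["payoffs"])
    wc = worst_case_ev(node["probs"], node["payoffs"], eps=0.05)
    print(f"strategy {name}: EV = {ev:8.2f}   worst-case EV = {wc:8.2f}")
```

    A strategy whose expected value barely beats the alternatives but whose worst-case value collapses under perturbation is exactly the kind of non-robust choice the framework is designed to flag.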

  20. Efficient and Robust Paramyxoviridae Reverse Genetics Systems

    PubMed Central

    Beaty, Shannon M.; Won, Sohui T.; Hong, Patrick; Lyons, Michael; Vigant, Frederic; Freiberg, Alexander N.; tenOever, Benjamin R.; Duprex, W. Paul

    2017-01-01

ABSTRACT The notoriously low efficiency of Paramyxoviridae reverse genetics systems has posed a limiting barrier to the study of viruses in this family. Previous approaches to reverse genetics have utilized a wide variety of techniques to overcome the technical hurdles. Although robustness (i.e., the number of attempts that result in successful rescue) has been improved in some systems with the use of stable cell lines, the efficiency of rescue (i.e., the proportion of transfected cells that yield at least one successful rescue event) has remained low. We have substantially increased rescue efficiency for representative viruses from all five major Paramyxoviridae genera (from ~1 in 10⁶–10⁷ to ~1 in 10²–10³ transfected cells) by the addition of a self-cleaving hammerhead ribozyme (Hh-Rbz) sequence immediately preceding the start of the recombinant viral antigenome and the use of a codon-optimized T7 polymerase (T7opt) gene to drive paramyxovirus rescue. Here, we report a strategy for robust, reliable, and high-efficiency rescue of paramyxovirus reverse genetics systems, featuring several major improvements: (i) a vaccinia virus-free method, (ii) freedom to use any transfectable cell type for viral rescue, (iii) a single-step transfection protocol, and (iv) use of the optimal T7 promoter sequence for high transcription levels from the antigenomic plasmid without incorporation of nontemplated G residues. The robustness of our T7opt-HhRbz system also allows for greater latitude in the ratios of transfected accessory plasmids used that result in successful rescue. Thus, our system may facilitate the rescue and interrogation of the increasing number of emerging paramyxoviruses. IMPORTANCE The ability to manipulate the genome of paramyxoviruses and evaluate the effects of these changes at the phenotypic level is a powerful tool for the investigation of specific aspects of the viral life cycle and viral pathogenesis. However, reverse genetics systems for paramyxoviruses are notoriously inefficient, even when successful. The ability to efficiently and robustly rescue paramyxovirus reverse genetics systems can be used to answer basic questions about the biology of paramyxoviruses, as well as to facilitate the considerable translational efforts being devoted to developing live attenuated paramyxovirus vaccine vectors. PMID:28405630

  1. Development of an Intracellular Screen for New Compounds Able To Inhibit Mycobacterium tuberculosis Growth in Human Macrophages.

    PubMed

    Sorrentino, Flavia; Gonzalez del Rio, Ruben; Zheng, Xingji; Presa Matilla, Jesus; Torres Gomez, Pedro; Martinez Hoyos, Maria; Perez Herran, Maria Esther; Mendoza Losana, Alfonso; Av-Gay, Yossef

    2016-01-01

    Here we describe the development and validation of an intracellular high-throughput screening assay for finding new antituberculosis compounds active in human macrophages. The assay consists of a luciferase-based primary identification assay, followed by a green fluorescent protein-based secondary profiling assay. Standard tuberculosis drugs and 158 previously recognized active antimycobacterial compounds were used to evaluate assay robustness. Data show that the assay developed is a short and valuable tool for the discovery of new antimycobacterial compounds. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  2. Efficient Robust Regression via Two-Stage Generalized Empirical Likelihood

    PubMed Central

    Bondell, Howard D.; Stefanski, Leonard A.

    2013-01-01

    Large- and finite-sample efficiency and resistance to outliers are the key goals of robust statistics. Although often not simultaneously attainable, we develop and study a linear regression estimator that comes close. Efficiency obtains from the estimator’s close connection to generalized empirical likelihood, and its favorable robustness properties are obtained by constraining the associated sum of (weighted) squared residuals. We prove maximum attainable finite-sample replacement breakdown point, and full asymptotic efficiency for normal errors. Simulation evidence shows that compared to existing robust regression estimators, the new estimator has relatively high efficiency for small sample sizes, and comparable outlier resistance. The estimator is further illustrated and compared to existing methods via application to a real data set with purported outliers. PMID:23976805
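
    For readers who want a feel for outlier-resistant fitting, here is a minimal sketch. It deliberately uses a generic Huber-loss fit via iteratively reweighted least squares rather than the paper's two-stage generalized-empirical-likelihood estimator, and the simulated data are hypothetical.

```python
import numpy as np

def huber_irls(X, y, delta=1.345, iters=50):
    """Huber-loss regression via iteratively reweighted least squares.
    Illustrates outlier resistance only; this is NOT the two-stage
    generalized-empirical-likelihood estimator from the paper."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]        # OLS start
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12      # robust scale (MAD)
        u = r / (delta * s)
        w = np.where(np.abs(u) <= 1.0, 1.0, 1.0 / np.abs(u))  # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([2.0, 3.0]) + rng.normal(scale=0.5, size=n)
y[:10] += 30.0                                         # gross outliers
print("OLS:  ", np.linalg.lstsq(X, y, rcond=None)[0])
print("Huber:", huber_irls(X, y))
```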

  3. Percolation of localized attack on complex networks

    NASA Astrophysics Data System (ADS)

    Shao, Shuai; Huang, Xuqing; Stanley, H. Eugene; Havlin, Shlomo

    2015-02-01

    The robustness of complex networks against node failure and malicious attack has been of interest for decades, while most of the research has focused on random attack or hub-targeted attack. In many real-world scenarios, however, attacks are neither random nor hub-targeted, but localized, where a group of neighboring nodes in a network are attacked and fail. In this paper we develop a percolation framework to analytically and numerically study the robustness of complex networks against such localized attack. In particular, we investigate this robustness in Erdős-Rényi networks, random-regular networks, and scale-free networks. Our results provide insight into how to better protect networks, enhance cybersecurity, and facilitate the design of more robust infrastructures.
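
    The basic experiment is simple to reproduce numerically. A minimal sketch, assuming networkx is available and taking "localized attack" to mean removing nodes in increasing hop distance from a seed node; network size and attack fractions are arbitrary.

```python
import networkx as nx

def localized_attack(G, seed_node, n_remove):
    """Fail a seed node and then its neighbors, in increasing hop
    distance, until n_remove nodes have been removed."""
    dist = nx.single_source_shortest_path_length(G, seed_node)
    order = sorted(dist, key=dist.get)[:n_remove]
    H = G.copy()
    H.remove_nodes_from(order)
    return H

def giant_fraction(G, n_original):
    if G.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(G)) / n_original

N, k_avg = 10000, 4
G = nx.erdos_renyi_graph(N, k_avg / (N - 1), seed=42)
for frac in (0.1, 0.3, 0.5):
    H = localized_attack(G, seed_node=0, n_remove=int(frac * N))
    print(f"removed {frac:.0%}: giant component = "
          f"{giant_fraction(H, N):.2%} of the original network")
```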

  4. Handling Uncertain Gross Margin and Water Demand in Agricultural Water Resources Management using Robust Optimization

    NASA Astrophysics Data System (ADS)

    Chaerani, D.; Lesmana, E.; Tressiana, N.

    2018-03-01

In this paper, an application of robust optimization to an agricultural water resource management problem under gross margin and water demand uncertainty is presented. Water resource management is a series of activities that includes planning, developing, distributing, and managing the use of water resources optimally. Water resource management for agriculture can be one of the efforts to optimize the benefits of agricultural output. The objective of the agricultural water resource management problem is to maximize total benefits from water allocation to the agricultural areas covered by the irrigation network over the planning horizon. Due to gross margin and water demand uncertainty, we assume that the uncertain data lie within an ellipsoidal uncertainty set. We employ the robust counterpart methodology to obtain the robust optimal solution.
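
    The key step in the robust counterpart is that the worst case of an uncertain linear objective over an ellipsoid has a closed form: for margins c = c_bar + S u with ||u|| <= Omega, the worst-case benefit of allocation x is c_bar'x - Omega*||S'x||. A minimal sketch with hypothetical margins and a single water-budget constraint, not the paper's model:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data: 3 crop areas, nominal gross margins per unit of
# water, and an ellipsoidal uncertainty set c = c_bar + S u, ||u|| <= Omega.
c_bar = np.array([4.0, 5.0, 3.0])       # nominal gross margins
S = np.diag([0.5, 1.5, 0.2])            # shape of the ellipsoid
Omega = 1.0                             # size of the uncertainty set
water_budget = 10.0

def worst_case_profit(x):
    # Robust counterpart: the minimum over the ellipsoid has the closed
    # form c_bar@x - Omega*||S.T @ x|| (a second-order-cone objective).
    return c_bar @ x - Omega * np.linalg.norm(S.T @ x)

cons = ({"type": "ineq", "fun": lambda x: water_budget - x.sum()},)
bnds = [(0.0, None)] * 3
x0 = np.full(3, water_budget / 3)

nominal = minimize(lambda x: -(c_bar @ x), x0, bounds=bnds, constraints=cons)
robust = minimize(lambda x: -worst_case_profit(x), x0, bounds=bnds, constraints=cons)
print("nominal allocation:", nominal.x.round(2), " profit:", -nominal.fun)
print("robust  allocation:", robust.x.round(2), " worst-case profit:", -robust.fun)
```

    In this toy instance the nominal optimum concentrates all water on the highest-margin area, while the robust solution diversifies away from the area with the most uncertain margin.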

  5. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    NASA Astrophysics Data System (ADS)

    McGowan, S. E.; Albertini, F.; Thomas, S. J.; Lomax, A. J.

    2015-04-01

We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The robustness of 16 skull base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was done by calculating the error-bar dose distribution (ebDD) for all the plans and by defining metrics used to build protocols that aid plan assessment. Additionally, an example of how to clinically use the resulting robustness database is given, whereby a plan with sub-optimal brainstem robustness was identified; the advantage of using different beam arrangements to improve plan robustness was then analysed. Using the ebDD it was found that range errors had a smaller effect on the dose distribution than the corresponding set-up error in a single fraction, and that organs at risk were most robust to range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in setting plan robustness aims in these volumes, resulting in the definition of site-specific robustness protocols. The use of robustness constraints allowed the identification of a specific patient who may have benefited from a more individualized treatment; a new beam arrangement proved preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. The process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to identify plans that, although delivering a dosimetrically adequate dose distribution, have sub-optimal robustness to these uncertainties; for such cases the use of different beam start conditions may improve plan robustness to set-up and range uncertainties.
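
    Per voxel, the error-bar dose concept reduces to the spread of dose across error scenarios. A minimal sketch on synthetic dose grids; the clinical ebDD definition and scenario generation are more involved than this.

```python
import numpy as np

# Hypothetical dose arrays: one nominal plan plus recalculations under
# systematic range and random set-up error scenarios (same voxel grid).
rng = np.random.default_rng(0)
nominal = rng.uniform(0.0, 2.0, size=(50, 50, 50))          # Gy, toy values
scenarios = [nominal + rng.normal(0, 0.05, nominal.shape)   # perturbed doses
             for _ in range(8)]

doses = np.stack([nominal, *scenarios])        # (n_scenarios+1, nx, ny, nz)
eb_dd = doses.max(axis=0) - doses.min(axis=0)  # error-bar dose per voxel

# Error-bar volume histogram: fraction of voxels whose dose uncertainty
# exceeds a given error-bar level.
levels = np.linspace(0.0, eb_dd.max(), 50)
ebvh = [(eb_dd >= t).mean() for t in levels]
print("median error-bar dose: %.3f Gy" % np.median(eb_dd))
```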

  6. The Development of a Novel High Throughput Computational Tool for Studying Individual and Collective Cellular Migration

    PubMed Central

    Chapnick, Douglas A.; Jacobsen, Jeremy; Liu, Xuedong

    2013-01-01

Understanding how cells migrate individually and collectively during development and cancer metastasis can be significantly aided by a computational tool that accurately measures not only cellular migration speed, but also migration direction and changes in migration direction in a temporal and spatial manner. We have developed such a tool for cell migration researchers, named Pathfinder, which is capable of simultaneously measuring the migration speed, migration direction, and changes in migration direction of thousands of cells both instantaneously and over long periods of time from fluorescence microscopy data. Additionally, we demonstrate how the Pathfinder software can be used to quantify collective cell migration. The novel capability of the Pathfinder software to measure the changes in migration direction of large populations of cells in a spatiotemporal manner will aid cellular migration research by providing a robust method for determining the mechanisms of cellular guidance during individual and collective cell migration. PMID:24386097
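
    The underlying per-track quantities are easy to state precisely. A minimal sketch of speed, heading, and change of heading computed from positions; the random-walk data are hypothetical and this is not the Pathfinder implementation.

```python
import numpy as np

def migration_metrics(track, dt=1.0):
    """Per-step speed, heading, and change in heading for one cell track.
    `track` is an (n_frames, 2) array of x, y positions."""
    steps = np.diff(track, axis=0)
    speed = np.linalg.norm(steps, axis=1) / dt
    heading = np.arctan2(steps[:, 1], steps[:, 0])
    dheading = np.diff(heading)
    # wrap heading changes into (-pi, pi]
    dheading = (dheading + np.pi) % (2 * np.pi) - np.pi
    return speed, heading, dheading

rng = np.random.default_rng(3)
track = np.cumsum(rng.normal(size=(100, 2)), axis=0)   # a random-walk track
speed, heading, dheading = migration_metrics(track)
print(f"mean speed {speed.mean():.2f}, mean |turn| {np.abs(dheading).mean():.2f} rad")
```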

  7. Contour plot assessment of existing meta-analyses confirms robust association of statin use and acute kidney injury risk.

    PubMed

    Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W

    2015-10-01

The robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the considered example data, the pooled effect estimates and heterogeneity indices proved considerably robust to the addition of a future study. Moreover, for some previously inconclusive meta-analyses, a study update might yield a statistically significant kidney injury risk increase associated with higher statin exposure. The illustrated contour approach should become a standard tool for the assessment of the robustness of meta-analyses. It can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.
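
    The mechanics behind such an augmentation can be illustrated with fixed-effect inverse-variance pooling: scan over possible results for the additional study and see where the updated pooled estimate lands. A minimal sketch with made-up effect sizes; the paper draws contour plots over exactly this kind of grid.

```python
import numpy as np

def pooled_fixed_effect(effects, variances):
    """Inverse-variance fixed-effect pooling (e.g., of log relative risks);
    returns the pooled estimate and its variance."""
    w = 1.0 / np.asarray(variances)
    est = np.sum(w * effects) / np.sum(w)
    return est, 1.0 / np.sum(w)

# Hypothetical existing meta-analysis: log relative risks and variances.
effects = np.array([0.15, 0.25, 0.05, 0.30])
variances = np.array([0.02, 0.05, 0.01, 0.04])
base, base_var = pooled_fixed_effect(effects, variances)
print(f"current pooled estimate {base:+.3f}, z = {base/np.sqrt(base_var):+.2f}")

# Scan a grid of possible new-study results and see how the pooled
# estimate and its z statistic would move.
for new_effect in (-0.2, 0.0, 0.2):
    for new_var in (0.01, 0.05):
        est, var = pooled_fixed_effect(
            np.append(effects, new_effect), np.append(variances, new_var))
        print(f"new study ({new_effect:+.1f}, var {new_var}): "
              f"pooled {est:+.3f}, z = {est/np.sqrt(var):+.2f}")
```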

  8. Metabolic Engineering of Oleaginous Yeasts for Production of Fuels and Chemicals.

    PubMed

    Shi, Shuobo; Zhao, Huimin

    2017-01-01

Oleaginous yeasts have been increasingly explored for the production of chemicals and fuels via metabolic engineering. In particular, there is growing interest in using oleaginous yeasts for the synthesis of lipid-related products due to their high lipogenesis capability, robustness, and ability to utilize a variety of substrates. Most metabolic engineering studies in oleaginous yeasts have focused on Yarrowia, which already has plentiful genetic engineering tools. However, recent advances in systems biology and synthetic biology have provided new strategies and tools to engineer those oleaginous yeasts that have naturally high lipid accumulation but lack genetic tools, such as Rhodosporidium, Trichosporon, and Lipomyces. This review highlights recent accomplishments in metabolic engineering of oleaginous yeasts and recent advances in the development of genetic engineering tools in oleaginous yeasts within the last 3 years.

  9. Beyond optimality: Multistakeholder robustness tradeoffs for regional water portfolio planning under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Herman, Jonathan D.; Zeff, Harrison B.; Reed, Patrick M.; Characklis, Gregory W.

    2014-10-01

    While optimality is a foundational mathematical concept in water resources planning and management, "optimal" solutions may be vulnerable to failure if deeply uncertain future conditions deviate from those assumed during optimization. These vulnerabilities may produce severely asymmetric impacts across a region, making it vital to evaluate the robustness of management strategies as well as their impacts for regional stakeholders. In this study, we contribute a multistakeholder many-objective robust decision making (MORDM) framework that blends many-objective search and uncertainty analysis tools to discover key tradeoffs between water supply alternatives and their robustness to deep uncertainties (e.g., population pressures, climate change, and financial risks). The proposed framework is demonstrated for four interconnected water utilities representing major stakeholders in the "Research Triangle" region of North Carolina, U.S. The utilities supply well over one million customers and have the ability to collectively manage drought via transfer agreements and shared infrastructure. We show that water portfolios for this region that compose optimal tradeoffs (i.e., Pareto-approximate solutions) under expected future conditions may suffer significantly degraded performance with only modest changes in deeply uncertain hydrologic and economic factors. We then use the Patient Rule Induction Method (PRIM) to identify which uncertain factors drive the individual and collective vulnerabilities for the four cooperating utilities. Our framework identifies key stakeholder dependencies and robustness tradeoffs associated with cooperative regional planning, which are critical to understanding the tensions between individual versus regional water supply goals. Cooperative demand management was found to be the key factor controlling the robustness of regional water supply planning, dominating other hydroclimatic and economic uncertainties through the 2025 planning horizon. Results suggest that a modest reduction in the projected rate of demand growth (from approximately 3% per year to 2.4%) will substantially improve the utilities' robustness to future uncertainty and reduce the potential for regional tensions. The proposed multistakeholder MORDM framework offers critical insights into the risks and challenges posed by rising water demands and hydrological uncertainties, providing a planning template for regions now forced to confront rapidly evolving water scarcity risks.
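
    A bare-bones version of the robustness evaluation step is sampling many states of the world and counting the fraction in which a portfolio still satisfies its criteria. The sketch below is a toy satisficing calculation with invented supply and demand relations, standing in for the study's far richer MORDM and PRIM workflow.

```python
import numpy as np

# Sample deeply uncertain factors and count the fraction of sampled
# states of the world (SOWs) in which a portfolio meets a toy
# reliability criterion. All numbers here are illustrative only.
rng = np.random.default_rng(7)
n = 10000
demand_growth = rng.uniform(0.015, 0.035, n)   # fraction per year
inflow_factor = rng.uniform(0.7, 1.2, n)       # scaling on historical inflows

def meets_criteria(dg, inflow, transfer_capacity=0.25):
    supply = inflow * 1.0 + transfer_capacity  # normalized baseline + transfers
    demand = (1.0 + dg) ** 11                  # demand at the planning horizon
    return supply >= 0.9 * demand              # toy reliability criterion

ok = meets_criteria(demand_growth, inflow_factor)
print(f"robustness (fraction of SOWs satisfied): {ok.mean():.2%}")
```

    Re-running the count with a reduced demand-growth range shows directly how much a modest demand-management assumption buys in robustness, which is the flavor of the paper's headline finding.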

  10. A computational image analysis glossary for biologists.

    PubMed

    Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M

    2012-09-01

    Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies.

  11. Chaos and Robustness in a Single Family of Genetic Oscillatory Networks

    PubMed Central

    Fu, Daniel; Tan, Patrick; Kuznetsov, Alexey; Molkov, Yaroslav I.

    2014-01-01

Genetic oscillatory networks can be mathematically modeled with delay differential equations (DDEs). Interpreting genetic networks with DDEs gives a more intuitive understanding from a biological standpoint. However, it presents a problem mathematically, for DDEs are by construction infinite-dimensional and thus cannot be analyzed using methods common for systems of ordinary differential equations (ODEs). In our study, we address this problem by developing a method for reducing infinite-dimensional DDEs to two- and three-dimensional systems of ODEs. We find that the three-dimensional reductions provide qualitative improvements over the two-dimensional reductions. We find that the reducibility of a DDE corresponds to its robustness. For non-robust DDEs that exhibit high-dimensional dynamics, we calculate analytic dimension lines to predict the dependence of the DDEs' correlation dimension on parameters. From these lines, we deduce that the correlation dimension of non-robust DDEs grows linearly with the delay. On the other hand, for robust DDEs, we find that the period of oscillation grows linearly with delay. We find that DDEs with exclusively negative feedback are robust, whereas DDEs with feedback that changes its sign are not robust. We find that non-saturable degradation damps oscillations and narrows the range of parameter values for which oscillations exist. Finally, we deduce that natural genetic oscillators with highly regular periods likely have solely negative feedback. PMID:24667178
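
    Delayed negative feedback of this kind can be simulated directly by Euler integration with a history buffer. A minimal sketch of a generic delayed-repression oscillator; this is a stand-in, not the specific DDE family analyzed in the paper.

```python
import numpy as np

def simulate_delayed_repression(tau=10.0, beta=4.0, n_hill=4, dt=0.01, T=300.0):
    """Euler integration of a delayed negative-feedback circuit,
        x'(t) = beta / (1 + x(t - tau)**n_hill) - x(t),
    using a buffer of past values for the delayed state."""
    lag = int(round(tau / dt))
    n = int(round(T / dt))
    x = np.empty(n)
    x[: lag + 1] = 0.5                       # constant initial history
    for i in range(lag, n - 1):
        x[i + 1] = x[i] + dt * (beta / (1.0 + x[i - lag] ** n_hill) - x[i])
    return np.arange(n) * dt, x

t, x = simulate_delayed_repression()
# For this purely negative-feedback circuit the oscillation period grows
# roughly linearly with the delay tau, matching the robust-DDE behavior
# described in the abstract.
print("min/max of x on the attractor:", x[-5000:].min(), x[-5000:].max())
```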

  12. Robust on-off pulse control of flexible space vehicles

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Sinha, Ravi

    1993-01-01

The on-off reaction jet control system is often used for attitude and orbital maneuvering of various spacecraft. Future space vehicles such as orbital transfer vehicles, orbital maneuvering vehicles, and the space station will extensively use reaction jets for orbital maneuvering and attitude stabilization. The proposed robust fuel- and time-optimal control algorithm is used for a three-mass spring model of flexible spacecraft. A fuel-efficient on-off control logic is developed for robust rest-to-rest maneuvers of a flexible vehicle with minimum excitation of structural modes. The first part of this report is concerned with the problem of selecting a proper pair of jets for practical trade-offs among the maneuvering time, fuel consumption, structural mode excitation, and performance robustness. A time-optimal control problem subject to parameter robustness constraints is formulated and solved. The second part of this report deals with obtaining parameter-insensitive fuel- and time-optimal control inputs by solving a constrained optimization problem subject to robustness constraints. It is shown that sensitivity to modeling errors can be significantly reduced by the proposed, robustified open-loop control approach. The final part of this report deals with sliding mode control design for uncertain flexible structures. A benchmark flexible-structure problem is used as an example; feedback sliding mode controller design with bounded control inputs and robustness to parameter variations is investigated.

  13. Robustness analysis of non-ordinary Petri nets for flexible assembly systems

    NASA Astrophysics Data System (ADS)

    Hsieh, Fu-Shiung

    2010-05-01

Non-ordinary controlled Petri nets (NCPNs) have the advantage of being able to model flexible assembly systems in which multiple identical resources may be required to perform an operation. However, existing studies on NCPNs are still limited; for example, their robustness properties have not been studied. This motivates us to develop an analysis method for NCPNs. Robustness analysis concerns the ability of a system to maintain operation in the presence of uncertainties, and provides a way to analyse a perturbed system without reanalysis. In our previous research, we analysed the robustness properties of several subclasses of ordinary controlled Petri nets. To study the robustness properties of NCPNs, we augment them with an uncertainty model, which specifies an upper bound on the uncertainties for each reachable marking. The resulting models are called non-ordinary controlled Petri nets with uncertainties (NCPNU). Based on an NCPNU, the problem is to characterise the maximal tolerable uncertainties for each reachable marking. The computational complexity of this characterisation grows exponentially with the size of the net. Instead of considering general NCPNU, we therefore limit our scope to a subclass of models for assembly systems, called non-ordinary controlled flexible assembly Petri nets with uncertainties (NCFAPNU), and extend the robustness analysis to this subclass. We identify two types of uncertainties under which the liveness of an NCFAPNU can be maintained.

  14. Robust biological parametric mapping: an improved technique for multimodal brain image analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xue; Beason-Held, Lori; Resnick, Susan M.; Landman, Bennett A.

    2011-03-01

    Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and robust inference in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provides a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities.
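
    The voxelwise robust-regression idea can be sketched with an off-the-shelf Huber-weighted M-estimator, here via statsmodels. The data are simulated, and this is the general idea only, not the released software package.

```python
import numpy as np
import statsmodels.api as sm

# Voxelwise robust GLM sketch: at each voxel, regress modality A on
# modality B (plus intercept) with a Huber-weighted robust fit, which
# down-weights outlying subjects (e.g., slight mis-registration).
rng = np.random.default_rng(0)
n_subj, n_vox = 40, 500
B = rng.normal(size=(n_subj, n_vox))               # regressor image values
A = 0.5 * B + rng.normal(size=B.shape)             # regressand image values
A[:3] += 8.0                                       # a few outlier subjects

t_robust = np.empty(n_vox)
for v in range(n_vox):
    X = sm.add_constant(B[:, v])
    fit = sm.RLM(A[:, v], X, M=sm.robust.norms.HuberT()).fit()
    t_robust[v] = fit.params[1] / fit.bse[1]       # robust t statistic

print("median robust t across voxels:", np.median(t_robust))
```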

  15. Robust control of burst suppression for medical coma

    NASA Astrophysics Data System (ADS)

    Westover, M. Brandon; Kim, Seong-Eun; Ching, ShiNung; Purdon, Patrick L.; Brown, Emery N.

    2015-08-01

Objective. Medical coma is an anesthetic-induced state of brain inactivation, manifest in the electroencephalogram by burst suppression. Feedback control can be used to regulate burst suppression, however, previous designs have not been robust. Robust control design is critical under real-world operating conditions, subject to substantial pharmacokinetic and pharmacodynamic parameter uncertainty and unpredictable external disturbances. We sought to develop a robust closed-loop anesthesia delivery (CLAD) system to control medical coma. Approach. We developed a robust CLAD system to control the burst suppression probability (BSP). We developed a novel BSP tracking algorithm based on realistic models of propofol pharmacokinetics and pharmacodynamics. We also developed a practical method for estimating patient-specific pharmacodynamics parameters. Finally, we synthesized a robust proportional-integral controller. Using a factorial design spanning patient age, mass, height, and gender, we tested whether the system performed within clinically acceptable limits. Throughout all experiments we subjected the system to disturbances, simulating treatment of refractory status epilepticus in a real-world intensive care unit environment. Main results. In 5400 simulations, CLAD behavior remained within specifications. Transient behavior after a step in target BSP from 0.2 to 0.8 exhibited a rise time (median [min, max]) of 1.4 [1.1, 1.9] min; settling time, 7.8 [4.2, 9.0] min; and percent overshoot of 9.6 [2.3, 10.8]%. Under steady-state conditions the CLAD system exhibited a median error of 0.1 [-0.5, 0.9]%; inaccuracy of 1.8 [0.9, 3.4]%; oscillation index of 1.8 [0.9, 3.4]%; and maximum instantaneous propofol dose of 4.3 [2.1, 10.5] mg kg⁻¹. The maximum hourly propofol dose was 4.3 [2.1, 10.3] mg kg⁻¹ h⁻¹. Performance fell within clinically acceptable limits for all measures. Significance. A CLAD system designed using robust control theory achieves clinically acceptable performance in the presence of realistic unmodeled disturbances and in spite of realistic model uncertainty, while maintaining infusion rates within acceptable safety limits.
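
    To see the control-loop structure, a toy version suffices: a PI controller driving the BSP of a crude first-order effect-site model toward a stepped target. The plant, gains, and dose-response below are hypothetical stand-ins for the paper's propofol PK/PD models and robust synthesis.

```python
import numpy as np

dt, T = 0.05, 60.0                      # time step and horizon (minutes)
t = np.arange(0.0, T, dt)
target = np.where(t < 5.0, 0.2, 0.8)    # step in target BSP at t = 5 min

tau_p, gain = 4.0, 1.0                  # effect-site time constant, drug gain
Kp, Ki = 8.0, 2.0                       # hand-tuned PI gains for the toy plant

c, integ = 0.25, 0.0                    # effect-site state, error integral
bsp = np.empty_like(t)
for k in range(t.size):
    bsp[k] = c / (1.0 + c)              # crude saturating dose-response
    e = target[k] - bsp[k]
    integ += e * dt
    u = max(0.0, Kp * e + Ki * integ)   # infusion rate, clipped non-negative
    c += dt * (-c + gain * u) / tau_p   # first-order effect-site kinetics

print(f"final BSP {bsp[-1]:.3f} vs target {target[-1]:.1f}")
```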

  16. The effects of variations in parameters and algorithm choices on calculated radiomics feature values: initial investigations and comparisons to feature variability across CT image acquisition conditions

    NASA Astrophysics Data System (ADS)

    Emaminejad, Nastaran; Wahi-Anwar, Muhammad; Hoffman, John; Kim, Grace H.; Brown, Matthew S.; McNitt-Gray, Michael

    2018-02-01

Translation of radiomics into clinical practice requires confidence in its interpretations, which may be obtained by understanding and overcoming the limitations of current radiomic approaches. Currently there is a lack of standardization in radiomic feature extraction. In this study we examined several factors that are potential sources of inconsistency in characterizing lung nodules: (1) different choices of parameters and algorithms in feature calculation, (2) two CT image dose levels, and (3) different CT reconstruction algorithms (WFBP, denoised WFBP, and iterative). We investigated the effect of variation of these factors on the entropy textural features of lung nodules. CT images of 19 lung nodules from our lung cancer screening program were identified by a CAD tool, which also provided contours. The radiomics features were extracted by calculating 36 GLCM-based and 4 histogram-based entropy features, in addition to 2 intensity-based features. A robustness index was calculated across different image acquisition parameters to illustrate the reproducibility of features. Most GLCM-based and all histogram-based entropy features were robust across the two CT image dose levels. Denoising of images slightly improved the robustness of some entropy features at WFBP. Iterative reconstruction improved robustness less often and caused more variation in entropy feature values and their robustness. Across different choices of parameters and algorithms, texture features showed a wide range of variation, as much as 75% for individual nodules. The results indicate the need for harmonization of feature calculations and identification of optimum parameters and algorithms in a radiomics study.
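
    A robustness index across acquisition conditions can be as simple as a normalized per-feature spread. A toy sketch with simulated feature values; the study's exact index definition may differ from this one.

```python
import numpy as np

# Toy robustness index across two acquisition conditions: for each
# nodule and feature, compare values at full dose vs. reduced dose
# using a spread normalized by the mean feature value.
rng = np.random.default_rng(0)
full_dose = rng.normal(5.0, 1.0, size=(19, 4))       # 19 nodules, 4 features
reduced = full_dose + rng.normal(0, 0.1, size=full_dose.shape)

values = np.stack([full_dose, reduced])              # (condition, nodule, feature)
spread = values.max(axis=0) - values.min(axis=0)
robustness = 1.0 - spread / np.abs(values.mean(axis=0))
print("median robustness per feature:", np.median(robustness, axis=0).round(3))
```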

  17. Recovery Schemes for Primitive Variables in General-relativistic Magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Siegel, Daniel M.; Mösta, Philipp; Desai, Dhruv; Wu, Samantha

    2018-05-01

    General-relativistic magnetohydrodynamic (GRMHD) simulations are an important tool to study a variety of astrophysical systems such as neutron star mergers, core-collapse supernovae, and accretion onto compact objects. A conservative GRMHD scheme numerically evolves a set of conservation equations for “conserved” quantities and requires the computation of certain primitive variables at every time step. This recovery procedure constitutes a core part of any conservative GRMHD scheme and it is closely tied to the equation of state (EOS) of the fluid. In the quest to include nuclear physics, weak interactions, and neutrino physics, state-of-the-art GRMHD simulations employ finite-temperature, composition-dependent EOSs. While different schemes have individually been proposed, the recovery problem still remains a major source of error, failure, and inefficiency in GRMHD simulations with advanced microphysics. The strengths and weaknesses of the different schemes when compared to each other remain unclear. Here we present the first systematic comparison of various recovery schemes used in different dynamical spacetime GRMHD codes for both analytic and tabulated microphysical EOSs. We assess the schemes in terms of (i) speed, (ii) accuracy, and (iii) robustness. We find large variations among the different schemes and that there is not a single ideal scheme. While the computationally most efficient schemes are less robust, the most robust schemes are computationally less efficient. More robust schemes may require an order of magnitude more calls to the EOS, which are computationally expensive. We propose an optimal strategy of an efficient three-dimensional Newton–Raphson scheme and a slower but more robust one-dimensional scheme as a fall-back.
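
    The proposed strategy, a fast scheme backed by a robust fall-back, can be illustrated in one dimension. A minimal sketch with a generic nonlinear relation rather than a tabulated EOS: Newton-Raphson is tried first, and bisection takes over if Newton stalls or leaves its bracket.

```python
import numpy as np

def recover(f, df, x0, lo, hi, tol=1e-12, max_newton=20):
    """Solve f(x) = 0: fast Newton-Raphson first, robust (but slower)
    bisection as a fall-back. Mirrors the 'efficient scheme + robust
    fall-back' strategy in 1D with a generic function."""
    x = x0
    for _ in range(max_newton):
        fx = f(x)
        if abs(fx) < tol:
            return x, "newton"
        x_new = x - fx / df(x)
        if not (lo <= x_new <= hi) or not np.isfinite(x_new):
            break                        # Newton failed; fall back
        x = x_new
    a, b = lo, hi                        # bisection needs a sign change on [lo, hi]
    for _ in range(200):
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0.0:
            b = m
        else:
            a = m
        if b - a < tol:
            break
    return 0.5 * (a + b), "bisection"

# Example: recover a 'primitive' x from a stiff nonlinear relation.
f = lambda x: np.cosh(x) - 2.0 - 0.1 * x
df = lambda x: np.sinh(x) - 0.1
print(recover(f, df, x0=5.0, lo=0.0, hi=10.0))
```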

  18. Teaching Evaluation Tools as Robust Ethical Codes

    ERIC Educational Resources Information Center

    Talanker, Sergei

    2018-01-01

    I argue that teaching evaluation tools (TETs) may function as ethical codes (ECs), and answer certain demands that ECs cannot sufficiently fulfill. In order to be viable, an EC related to the teaching profession must assume a different form, and such a form is already present in several of the contemporary TETs. The TET matrix form allows for…

  19. Six degree of freedom simulation system for evaluating automated rendezvous and docking spacecraft

    NASA Technical Reports Server (NTRS)

    Rourke, Kenneth H.; Tsugawa, Roy K.

    1991-01-01

    Future logistics supply and servicing vehicles such as cargo transfer vehicles (CTV) must have full 6 degree of freedom (6DOF) capability in order to perform requisite rendezvous, proximity operations, and capture operations. The design and performance issues encountered when developing a 6DOF maneuvering spacecraft are very complex with subtle interactions which are not immediately obvious or easily anticipated. In order to deal with these complexities and develop robust maneuvering spacecraft designs, a simulation system and associated family of tools are used at TRW for generating and validating spacecraft performance requirements and guidance algorithms. An overview of the simulator and tools is provided. These are used by TRW for autonomous rendezvous and docking research projects including CTV studies.

  1. A Short Review of Ablative-Material Response Models and Simulation Tools

    NASA Technical Reports Server (NTRS)

    Lachaud, Jean; Magin, Thierry E.; Cozmuta, Ioana; Mansour, Nagi N.

    2011-01-01

    A review of the governing equations and boundary conditions used to model the response of ablative materials submitted to a high-enthalpy flow is proposed. The heritage of model-development efforts undertaken in the 1960s is extremely clear: the bases of the models used in the community are mathematically equivalent. Most of the material-response codes implement a single model in which the equation parameters may be modified to model different materials or conditions. The level of fidelity of the models implemented in design tools only slightly varies. Research and development codes are generally more advanced but often not as robust. The capabilities of each of these codes are summarized in a color-coded table along with research and development efforts currently in progress.

  2. Capacity planning for waste management systems: an interval fuzzy robust dynamic programming approach.

    PubMed

    Nie, Xianghui; Huang, Guo H; Li, Yongping

    2009-11-01

    This study integrates the concepts of interval numbers and fuzzy sets into optimization analysis by dynamic programming as a means of accounting for system uncertainty. The developed interval fuzzy robust dynamic programming (IFRDP) model improves upon previous interval dynamic programming methods. It allows highly uncertain information to be effectively communicated into the optimization process through introducing the concept of fuzzy boundary interval and providing an interval-parameter fuzzy robust programming method for an embedded linear programming problem. Consequently, robustness of the optimization process and solution can be enhanced. The modeling approach is applied to a hypothetical problem for the planning of waste-flow allocation and treatment/disposal facility expansion within a municipal solid waste (MSW) management system. Interval solutions for capacity expansion of waste management facilities and relevant waste-flow allocation are generated and interpreted to provide useful decision alternatives. The results indicate that robust and useful solutions can be obtained, and the proposed IFRDP approach is applicable to practical problems that are associated with highly complex and uncertain information.

  3. A study of the temporal robustness of the growing global container-shipping network

    PubMed Central

    Wang, Nuo; Wu, Nuan; Dong, Ling-ling; Yan, Hua-kun; Wu, Di

    2016-01-01

Whether a constantly expanding network remains robust as it grows must be determined for all such networks. However, few studies have focused on this important network feature or on the development of quantitative analytical methods for it. Considering the formation and growth of the global container-shipping network, we propose the concept of network temporal robustness and a quantitative method for measuring it. As an example, we collected container liner companies' data at two time points (2004 and 2014) and built a shipping network with ports as nodes and routes as links, thus obtaining a quantitative value of the temporal robustness. Temporal robustness is a significant network property because, for the first time, we can clearly recognize that the shipping network has become more vulnerable to damage over the last decade: when the node failure scale reached 50% of the entire network, the temporal robustness was approximately −0.51% for random errors and −12.63% for intentional attacks. The proposed concept and analytical method described in this paper are significant for other network studies. PMID:27713549

  4. Robust fractional order sliding mode control of doubly-fed induction generator (DFIG)-based wind turbines.

    PubMed

    Ebrahimkhani, Sadegh

    2016-07-01

Wind power plants have nonlinear dynamics and contain many uncertainties such as unknown nonlinear disturbances and parameter uncertainties. Thus, it is a difficult task to design a robust, reliable controller for this system. This paper proposes a novel robust fractional-order sliding mode (FOSM) controller for maximum power point tracking (MPPT) control of doubly fed induction generator (DFIG)-based wind energy conversion systems. In order to enhance the robustness of the control system, uncertainties and disturbances are estimated using a fractional-order uncertainty estimator. In the proposed method a continuous control strategy is developed to achieve chattering-free fractional-order sliding-mode control, and no knowledge of the uncertainties and disturbances or their bounds is assumed. The boundedness and convergence properties of the closed-loop signals are proven using Lyapunov's stability theory. Simulation results in the presence of various uncertainties were carried out to evaluate the effectiveness and robustness of the proposed control scheme. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  5. Integration of the Response Surface Methodology with the Compromise Decision Support Problem in Developing a General Robust Design Procedure

    NASA Technical Reports Server (NTRS)

    Chen, Wei; Tsui, Kwok-Leung; Allen, Janet K.; Mistree, Farrokh

    1994-01-01

    In this paper we introduce a comprehensive and rigorous robust design procedure to overcome some limitations of the current approaches. A comprehensive approach is general enough to model the two major types of robust design applications, namely, robust design associated with the minimization of the deviation of performance caused by the deviation of noise factors (uncontrollable parameters), and robust design due to the minimization of the deviation of performance caused by the deviation of control factors (design variables). We achieve mathematical rigor by using, as a foundation, principles from the design of experiments and optimization. Specifically, we integrate the Response Surface Method (RSM) with the compromise Decision Support Problem (DSP). Our approach is especially useful for design problems where there are no closed-form solutions and system performance is computationally expensive to evaluate. The design of a solar powered irrigation system is used as an example. Our focus in this paper is on illustrating our approach rather than on the results per se.
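
    The integration can be miniaturized: fit a quadratic response surface to a small designed experiment, then trade off the predicted mean against the variance induced by a noise factor. A minimal sketch with a toy response standing in for the expensive simulation; the design, weights, and noise level are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(x, z):
    """Toy 'expensive' response: control factor x, noise factor z."""
    return (x - 1.5) ** 2 + 0.8 * x * z + z ** 2

# Small factorial design over x in [0, 3], z in [-1, 1].
X = np.array([(x, z) for x in (0.0, 1.5, 3.0) for z in (-1.0, 0.0, 1.0)])
y = np.array([simulate(x, z) for x, z in X])

# Quadratic response-surface basis: 1, x, z, x^2, z^2, xz.
Phi = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                       X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
b = np.linalg.lstsq(Phi, y, rcond=None)[0]

# With z ~ N(0, sigma^2): mean(x) = b0 + b1 x + b3 x^2 + b4 sigma^2 and
# var(x) = (b2 + b5 x)^2 sigma^2 (up to an x-independent term from z^2).
sigma2, w = 0.25, 1.0
xs = np.linspace(0.0, 3.0, 301)
mean = b[0] + b[1] * xs + b[3] * xs ** 2 + b[4] * sigma2
var = (b[2] + b[5] * xs) ** 2 * sigma2
best = xs[np.argmin(mean + w * var)]           # compromise of mean and variance
nominal_best = xs[np.argmin(mean)]
print(f"nominal optimum x = {nominal_best:.2f}, robust compromise x* = {best:.2f}")
```

    In this toy problem the robust setting shifts below the nominal optimum of x = 1.5 because the sensitivity to the noise factor grows with x.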

  6. Thermal cracking performance prediction and asset management integration.

    DOT National Transportation Integrated Search

    2011-03-01

    With shrinking maintenance budgets and the need to do more with less, accurate, robust asset management tools are greatly needed for the transportation engineering community. In addition, the increased use of recycled materials and low energy p...

  7. Robust iterative learning contouring controller with disturbance observer for machine tool feed drives.

    PubMed

    Simba, Kenneth Renny; Bui, Ba Dinh; Msukwa, Mathew Renny; Uchiyama, Naoki

    2018-04-01

In feed drive systems, particularly machine tools, the contour error is more significant than the individual axial tracking errors from the viewpoint of enhancing precision in manufacturing and production systems. The contour error must be within the permissible tolerance of given products. In machining complex or sharp-corner products, large contour errors occur mainly owing to discontinuous trajectories and the existence of nonlinear uncertainties. Therefore, it is indispensable to design robust controllers that can enhance the tracking ability of feed drive systems. In this study, an iterative learning contouring controller consisting of a classical proportional-derivative (PD) controller and a disturbance observer is proposed. The proposed controller was evaluated experimentally using a typical sharp-corner trajectory, and its performance was compared with that of conventional controllers. The results revealed that the maximum contour error can be reduced by about 37% on average. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
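
    The trial-to-trial learning update at the heart of such a controller is compact. A minimal sketch of P-type iterative learning control on a hypothetical discrete axis model with a trial-repeating disturbance; the paper's controller additionally includes PD feedback and a disturbance observer.

```python
import numpy as np

N, trials, L = 200, 8, 0.9
ref = np.sin(np.linspace(0, 2 * np.pi, N))     # repeated reference trajectory
a, b = 0.2, 1.0                                # toy axis model: x+ = a x + b u + d
d = 0.05 * np.sign(np.cos(np.linspace(0, 2 * np.pi, N)))  # repeating disturbance

u = np.zeros(N)
for k in range(trials):
    x = np.zeros(N)
    for t in range(N - 1):                     # run one trial of the plant
        x[t + 1] = a * x[t] + b * u[t] + d[t]
    e = ref - x
    print(f"trial {k}: max |tracking error| = {np.abs(e).max():.5f}")
    u[:-1] += L * e[1:]                        # P-type ILC update, one-step shift
```

    Because the disturbance repeats from trial to trial, the learned feedforward input absorbs it, and the tracking error shrinks with each repetition.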

  8. Robust validation of approximate 1-matrix functionals with few-electron harmonium atoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cioslowski, Jerzy, E-mail: jerzy@wmf.univ.szczecin.pl; Piris, Mario; Matito, Eduard

    2015-12-07

A simple comparison between the exact and approximate correlation components U of the electron-electron repulsion energy of several states of few-electron harmonium atoms with varying confinement strengths provides a stringent validation tool for 1-matrix functionals. The robustness of this tool is clearly demonstrated in a survey of 14 known functionals, which reveals their substandard performance within different electron correlation regimes. Unlike spot-testing that employs dissociation curves of diatomic molecules or more extensive benchmarking against experimental atomization energies of molecules comprising some standard set, the present approach not only uncovers the flaws and patent failures of the functionals but, even more importantly, also allows for pinpointing their root causes. Since the approximate values of U are computed at exact 1-densities, the testing requires minimal programming and thus is particularly suitable for rapid screening of new functionals.

  9. Resolving Off-Nominal Situations in Schedule-Based Terminal Area Operations: Results from a Human-in-the-Loop Simulation

    NASA Technical Reports Server (NTRS)

    Mercer, Joey; Callantine, Todd; Martin, Lynne

    2012-01-01

A recent human-in-the-loop simulation in the Airspace Operations Laboratory (AOL) at NASA's Ames Research Center investigated the robustness of Controller-Managed Spacing (CMS) operations. CMS refers to AOL-developed controller tools and procedures for enabling arrivals to conduct efficient Optimized Profile Descents with sustained high throughput. The simulation provided a rich data set for examining how a traffic management supervisor and terminal-area controller participants used the CMS tools and coordinated to respond to off-nominal events. This paper proposes quantitative measures for characterizing the participants' responses. Case studies of go-around events, replicated during the simulation, provide insights into the strategies employed and the role the CMS tools played in supporting them.

  10. DNA Electrochemistry and Electrochemical Sensors for Nucleic Acids.

    PubMed

    Ferapontova, Elena E

    2018-06-12

    Sensitive, specific, and fast analysis of nucleic acids (NAs) is strongly needed in medicine, environmental science, biodefence, and agriculture for the study of bacterial contamination of food and beverages and genetically modified organisms. Electrochemistry offers accurate, simple, inexpensive, and robust tools for the development of such analytical platforms that can successfully compete with other approaches for NA detection. Here, electrode reactions of DNA, basic principles of electrochemical NA analysis, and their relevance for practical applications are reviewed and critically discussed.

  11. Fracturing And Liquid CONvection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-02-29

FALCON has been developed to enable simulation of the tightly coupled fluid-rock behavior in hydrothermal and engineered geothermal system (EGS) reservoirs, targeting the dynamics of fracture stimulation, fluid flow, rock deformation, and heat transport in a single integrated code, with the ultimate goal of providing a tool that can be used to test the viability of EGS in the United States and worldwide. Reliable reservoir performance predictions for EGS require accurate and robust modeling of the coupled thermal-hydrological-mechanical processes.

  12. Multi-Scale Homogenization for 3D Multiphase Composites: Development of Robust Software Tools for Material/Structural Characterization Across Length Scales

    DTIC Science & Technology

    2013-11-01

  13. Simulation of Inviscid Compressible Multi-Phase Flow with Condensation

    NASA Technical Reports Server (NTRS)

    Kelleners, Philip

    2003-01-01

Condensation of vapours in rapid expansions of compressible gases is investigated. In the case of high temperature gradients the condensation will start at conditions well away from thermodynamic equilibrium of the fluid. In those cases homogeneous condensation is dominant over heterogeneous condensation. The present work is concerned with the development of a simulation tool for computation of high-speed compressible flows with homogeneous condensation. The resulting flow solver should preferably be accurate and robust, to be used for simulation of industrial flows in general geometries.

  14. Believing Your Eyes: Strengthening the Reliability of Tags and Seals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brim, Cornelia P.; Denlinger, Laura S.

    2013-07-01

    NNSA’s Office of Nonproliferation and International Security (NIS) is working together with scientific experts at the DOE national laboratories to develop the tools needed to safeguard and secure nuclear material from diversion, theft, and sabotage--tasks critical to support future arms control treaties that may involve the new challenge of monitoring nuclear weapons dismantlement. Use of optically stimulated luminescent material is one method to enhance the security and robustness of existing tamper indicating devices such as tags and seals.

  15. Robust doubly charged nodal lines and nodal surfaces in centrosymmetric systems

    NASA Astrophysics Data System (ADS)

    Bzdušek, Tomáš; Sigrist, Manfred

    2017-10-01

Weyl points in three spatial dimensions are characterized by a Z-valued charge—the Chern number—which makes them stable against a wide range of perturbations. A set of Weyl points can mutually annihilate only if their net charge vanishes, a property we refer to as robustness. While nodal loops are usually not robust in this sense, it has recently been shown using homotopy arguments that in the centrosymmetric extension of the AI symmetry class they nevertheless develop a Z2 charge analogous to the Chern number. Nodal loops carrying a nontrivial value of this Z2 charge are robust, i.e., they can be gapped out only by a pairwise annihilation and not on their own. As this is an additional charge independent of the Berry π-phase flowing along the band degeneracy, such nodal loops are, in fact, doubly charged. In this manuscript, we generalize the homotopy discussion to the centrosymmetric extensions of all Altland-Zirnbauer classes. We develop a tailored mathematical framework dubbed the AZ+I classification and show that in three spatial dimensions such robust and multiply charged nodes appear in four of these centrosymmetric extensions; namely, AZ+I classes CI and AI lead to doubly charged nodal lines, while D and BDI support doubly charged nodal surfaces. We remark that no further crystalline symmetries apart from spatial inversion are necessary for their stability. We provide a description of the corresponding topological charges, and develop simple tight-binding models of various semimetallic and superconducting phases that exhibit these nodes. We also indicate how the concept of robust and multiply charged nodes generalizes to other spatial dimensions.

  16. Robust and Simple Non-Reflecting Boundary Conditions for the Euler Equations: A New Approach Based on the Space-Time CE/SE Method

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung; Himansu, Ananda; Loh, Ching-Yuen; Wang, Xiao-Yen; Yu, Shang-Tao

    2003-01-01

    This paper reports on a significant advance in the area of non-reflecting boundary conditions (NRBCs) for unsteady flow computations. As a part of the development of the space-time conservation element and solution element (CE/SE) method, sets of NRBCs for 1D Euler problems are developed without using any characteristics-based techniques. These conditions are much simpler than those commonly reported in the literature, yet so robust that they are applicable to subsonic, transonic and supersonic flows even in the presence of discontinuities. In addition, the straightforward multidimensional extensions of the present 1D NRBCs have been shown numerically to be equally simple and robust. The paper details the theoretical underpinning of these NRBCs, and explains their unique robustness and accuracy in terms of the conservation of space-time fluxes. Some numerical results for an extended Sod's shock-tube problem, illustrating the effectiveness of the present NRBCs are included, together with an associated simple Fortran computer program. As a preliminary to the present development, a review of the basic CE/SE schemes is also included.

  17. Testing the robustness of optimal access vessel fleet selection for operation and maintenance of offshore wind farms

    DOE PAGES

    Sperstad, Iver Bakken; Stålhane, Magnus; Dinwoodie, Iain; ...

    2017-09-23

Optimising the operation and maintenance (O&M) and logistics strategy of offshore wind farms implies the decision problem of selecting the vessel fleet for O&M. Different strategic decision support tools can be applied to this problem, but much uncertainty remains regarding both input data and modelling assumptions. Our paper aims to investigate and ultimately reduce this uncertainty by comparing four simulation tools, one mathematical optimisation tool and one analytic spreadsheet-based tool applied to select the O&M access vessel fleet that minimizes the total O&M cost of a reference wind farm. The comparison shows that the tools generally agree on the optimal vessel fleet, but only partially agree on the relative ranking of the different vessel fleets in terms of total O&M cost. The robustness of the vessel fleet selection to various input data assumptions was tested, and the ranking was found to be particularly sensitive to the vessels' limiting significant wave height for turbine access. As this is also the parameter with the greatest discrepancy between the tools, accurate quantification and modelling of this parameter is crucial. The ranking is moderately sensitive to turbine failure rates and vessel day rates but less sensitive to electricity price and vessel transit speed.

  18. Adaptation of a Simple Microfluidic Platform for High-Dimensional Quantitative Morphological Analysis of Human Mesenchymal Stromal Cells on Polystyrene-Based Substrates.

    PubMed

    Lam, Johnny; Marklein, Ross A; Jimenez-Torres, Jose A; Beebe, David J; Bauer, Steven R; Sung, Kyung E

    2017-12-01

    Multipotent stromal cells (MSCs, often called mesenchymal stem cells) have garnered significant attention within the field of regenerative medicine because of their purported ability to differentiate down musculoskeletal lineages. Given the inherent heterogeneity of MSC populations, recent studies have suggested that cell morphology may be indicative of MSC differentiation potential. Toward improving current methods and developing simple yet effective approaches for the morphological evaluation of MSCs, we combined passive pumping microfluidic technology with high-dimensional morphological characterization to produce robust tools for standardized high-throughput analysis. Using ultraviolet (UV) light as a modality for reproducible polystyrene substrate modification, we show that MSCs seeded on microfluidic straight channel devices incorporating UV-exposed substrates exhibited morphological changes that responded accordingly to the degree of substrate modification. Substrate modification also effected greater morphological changes in MSCs seeded at a lower rather than higher density within microfluidic channels. Despite largely comparable trends in morphology, MSCs seeded in microscale as opposed to traditional macroscale platforms displayed much higher sensitivity to changes in substrate properties. In summary, we adapted and qualified microfluidic cell culture platforms comprising simple straight channel arrays as a viable and robust tool for high-throughput quantitative morphological analysis to study cell-material interactions.

  20. Robust Coefficients Alpha and Omega and Confidence Intervals With Outlying Observations and Missing Data: Methods and Software.

    PubMed

    Zhang, Zhiyong; Yuan, Ke-Hai

    2016-06-01

    Cronbach's coefficient alpha is a widely used reliability measure in social, behavioral, and education sciences. It is reported in nearly every study that involves measuring a construct through multiple items. With non-tau-equivalent items, McDonald's omega has been used as a popular alternative to alpha in the literature. Traditional estimation methods for alpha and omega often implicitly assume that data are complete and normally distributed. This study proposes robust procedures to estimate both alpha and omega as well as corresponding standard errors and confidence intervals from samples that may contain potential outlying observations and missing values. The influence of outlying observations and missing data on the estimates of alpha and omega is investigated through two simulation studies. Results show that the newly developed robust method yields substantially improved alpha and omega estimates as well as better coverage rates of confidence intervals than the conventional nonrobust method. An R package coefficientalpha is developed and demonstrated to obtain robust estimates of alpha and omega.
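
    For context, the two classical estimands being robustified are standard. For k items with item variances \sigma_{ii}, total-score variance \sigma_X^2, one-factor loadings \lambda_i and uniquenesses \psi_i:

    \[
    \alpha \;=\; \frac{k}{k-1}\left(1-\frac{\sum_{i=1}^{k}\sigma_{ii}}{\sigma_X^{2}}\right),
    \qquad
    \omega \;=\; \frac{\bigl(\sum_{i=1}^{k}\lambda_i\bigr)^{2}}{\bigl(\sum_{i=1}^{k}\lambda_i\bigr)^{2}+\sum_{i=1}^{k}\psi_i}.
    \]

    Roughly speaking, the robust procedures replace the ordinary sample covariance matrix feeding these formulas with a robust, missing-data-aware estimate; the exact weighting scheme is the paper's contribution and is not reproduced here.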

  1. Robust Coefficients Alpha and Omega and Confidence Intervals With Outlying Observations and Missing Data

    PubMed Central

    Zhang, Zhiyong; Yuan, Ke-Hai

    2015-01-01

    Cronbach’s coefficient alpha is a widely used reliability measure in social, behavioral, and education sciences. It is reported in nearly every study that involves measuring a construct through multiple items. With non-tau-equivalent items, McDonald’s omega has been used as a popular alternative to alpha in the literature. Traditional estimation methods for alpha and omega often implicitly assume that data are complete and normally distributed. This study proposes robust procedures to estimate both alpha and omega as well as corresponding standard errors and confidence intervals from samples that may contain potential outlying observations and missing values. The influence of outlying observations and missing data on the estimates of alpha and omega is investigated through two simulation studies. Results show that the newly developed robust method yields substantially improved alpha and omega estimates as well as better coverage rates of confidence intervals than the conventional nonrobust method. An R package coefficientalpha is developed and demonstrated to obtain robust estimates of alpha and omega. PMID:29795870

  2. Morphological change in machines accelerates the evolution of robust behavior

    PubMed Central

    Bongard, Josh

    2011-01-01

    Most animals exhibit significant neurological and morphological change throughout their lifetime. No robots to date, however, grow new morphological structure while behaving. This is due to technological limitations but also because it is unclear that morphological change provides a benefit to the acquisition of robust behavior in machines. Here I show that in evolving populations of simulated robots, if robots grow from anguilliform into legged robots during their lifetime in the early stages of evolution, and the anguilliform body plan is gradually lost during later stages of evolution, gaits are evolved for the final, legged form of the robot more rapidly—and the evolved gaits are more robust—compared to evolving populations of legged robots that do not transition through the anguilliform body plan. This suggests that morphological change, as well as the evolution of development, are two important processes that improve the automatic generation of robust behaviors for machines. It also provides an experimental platform for investigating the relationship between the evolution of development and robust behavior in biological organisms. PMID:21220304

  3. Sample manipulation and data assembly for robust microcrystal synchrotron crystallography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Gongrui; Fuchs, Martin R.; Shi, Wuxian

    With the recent developments in microcrystal handling, synchrotron microdiffraction beamline instrumentation and data analysis, microcrystal crystallography with crystal sizes of less than 10 µm is appealing at synchrotrons. However, challenges remain in sample manipulation and data assembly for robust microcrystal synchrotron crystallography. Here, the development of micro-sized polyimide well-mounts for the manipulation of microcrystals of a few micrometres in size and the implementation of a robust data-analysis method for the assembly of rotational microdiffraction data sets from many microcrystals are described. The method demonstrates that microcrystals may be routinely utilized for the acquisition and assembly of complete data sets from synchrotron microdiffraction beamlines.

  4. Sample manipulation and data assembly for robust microcrystal synchrotron crystallography

    DOE PAGES

    Guo, Gongrui; Fuchs, Martin R.; Shi, Wuxian; ...

    2018-04-19

    With the recent developments in microcrystal handling, synchrotron microdiffraction beamline instrumentation and data analysis, microcrystal crystallography with crystal sizes of less than 10 µm is appealing at synchrotrons. However, challenges remain in sample manipulation and data assembly for robust microcrystal synchrotron crystallography. Here, the development of micro-sized polyimide well-mounts for the manipulation of microcrystals of a few micrometres in size and the implementation of a robust data-analysis method for the assembly of rotational microdiffraction data sets from many microcrystals are described. The method demonstrates that microcrystals may be routinely utilized for the acquisition and assembly of complete data sets from synchrotron microdiffraction beamlines.

  5. A topological quantum optics interface

    NASA Astrophysics Data System (ADS)

    Barik, Sabyasachi; Karasahin, Aziz; Flower, Christopher; Cai, Tao; Miyake, Hirokazu; DeGottardi, Wade; Hafezi, Mohammad; Waks, Edo

    2018-02-01

    The application of topology in optics has led to a new paradigm in developing photonic devices with robust properties against disorder. Although considerable progress on topological phenomena has been achieved in the classical domain, the realization of strong light-matter coupling in the quantum domain remains unexplored. We demonstrate a strong interface between single quantum emitters and topological photonic states. Our approach creates robust counterpropagating edge states at the boundary of two distinct topological photonic crystals. We demonstrate the chiral emission of a quantum emitter into these modes and establish their robustness against sharp bends. This approach may enable the development of quantum optics devices with built-in protection, with potential applications in quantum simulation and sensing.

  6. BROJA-2PID: A Robust Estimator for Bivariate Partial Information Decomposition

    NASA Astrophysics Data System (ADS)

    Makkeh, Abdullah; Theis, Dirk; Vicente, Raul

    2018-04-01

    Makkeh, Theis, and Vicente found in [8] that the Cone Programming model is the most robust for computing the Bertschinger et al. partial information decomposition (BROJA PID) measure [1]. We developed production-quality, robust software that computes the BROJA PID measure based on the Cone Programming model. In this paper, we prove the important property of strong duality for the Cone Program and prove an equivalence between the Cone Program and the original convex problem. We then describe our software in detail and explain how to use it.

  7. Robust functional regression model for marginal mean and subject-specific inferences.

    PubMed

    Cao, Chunzheng; Shi, Jian Qing; Lee, Youngjo

    2017-01-01

    We introduce flexible robust functional regression models, using various heavy-tailed processes, including a Student t-process. We propose efficient algorithms in estimating parameters for the marginal mean inferences and in predicting conditional means as well as interpolation and extrapolation for the subject-specific inferences. We develop bootstrap prediction intervals (PIs) for conditional mean curves. Numerical studies show that the proposed model provides a robust approach against data contamination or distribution misspecification, and the proposed PIs maintain the nominal confidence levels. A real data application is presented as an illustrative example.

  8. Development and application of a modified dynamic time warping algorithm (DTW-S) to analyses of primate brain expression time series

    PubMed Central

    2011-01-01

    Background: Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Results: Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. Conclusions: The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html. PMID:21851598
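
    For readers unfamiliar with the underlying algorithm, the classic DTW recurrence that DTW-S extends can be sketched in a few lines. This is illustrative Python, not the TimeShift R package, and it omits the interpolation and significance machinery.

    # Minimal classic dynamic time warping (DTW): the recurrence that DTW-S
    # builds on, shown for two 1-D series.
    import numpy as np

    def dtw_distance(x, y):
        """Return the DTW alignment cost between 1-D series x and y."""
        n, m = len(x), len(y)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(x[i - 1] - y[j - 1])
                # Each cell extends the cheapest of the three admissible moves.
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    # Example: a series aligned against a time-shifted copy of itself has a
    # much lower DTW cost than against an unrelated series.
    t = np.linspace(0, 2 * np.pi, 50)
    print(dtw_distance(np.sin(t), np.sin(t - 0.5)))  # small
    print(dtw_distance(np.sin(t), np.cos(3 * t)))    # larger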

  9. Development and application of a modified dynamic time warping algorithm (DTW-S) to analyses of primate brain expression time series.

    PubMed

    Yuan, Yuan; Chen, Yi-Ping Phoebe; Ni, Shengyu; Xu, Augix Guohua; Tang, Lin; Vingron, Martin; Somel, Mehmet; Khaitovich, Philipp

    2011-08-18

    Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html.

  10. Robust Kalman filter design for predictive wind shear detection

    NASA Technical Reports Server (NTRS)

    Stratton, Alexander D.; Stengel, Robert F.

    1991-01-01

    Severe, low-altitude wind shear is a threat to aviation safety. Airborne sensors under development measure the radial component of wind along a line directly in front of an aircraft. In this paper, optimal estimation theory is used to define a detection algorithm to warn of hazardous wind shear from these sensors. To achieve robustness, a wind shear detection algorithm must distinguish threatening wind shear from less hazardous gustiness, despite variations in wind shear structure. This paper presents statistical analysis methods to refine wind shear detection algorithm robustness. Computational methods predict the ability to warn of severe wind shear and avoid false warning. Comparative capability of the detection algorithm as a function of its design parameters is determined, identifying designs that provide robust detection of severe wind shear.
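
    A generic innovations-test sketch conveys the detection idea; a scalar random-walk filter and all numbers below are assumptions, and the paper's hazard algorithm and thresholds are more elaborate.

    # Generic innovations-based detection with a scalar Kalman filter: flag an
    # event when the innovation exceeds a multiple of its predicted standard
    # deviation. Illustrative only.
    import numpy as np

    def detect(measurements, q=0.01, r=0.25, k_sigma=4.0):
        x, p = 0.0, 1.0           # state estimate and variance (random walk)
        flags = []
        for z in measurements:
            p += q                # predict: variance grows by process noise q
            s = p + r             # innovation variance
            innov = z - x         # innovation: measurement minus prediction
            flags.append(abs(innov) > k_sigma * np.sqrt(s))
            k = p / s             # Kalman gain
            x += k * innov        # update state estimate
            p *= 1.0 - k          # update variance
        return flags

    rng = np.random.default_rng(1)
    z = rng.normal(0.0, 0.5, 200)
    z[120:] += 3.0                # synthetic sudden wind change at step 120
    flags = detect(z)
    print(sum(flags[:120]), sum(flags[120:]))  # alarms before vs after onset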

  11. Aperiodic Robust Model Predictive Control for Constrained Continuous-Time Nonlinear Systems: An Event-Triggered Approach.

    PubMed

    Liu, Changxin; Gao, Jian; Li, Huiping; Xu, Demin

    2018-05-01

    Event-triggered control is a promising solution for cyber-physical systems, such as networked control systems, multiagent systems, and large-scale intelligent systems. In this paper, we propose an event-triggered model predictive control (MPC) scheme for constrained continuous-time nonlinear systems with bounded disturbances. First, a time-varying tightened state constraint is computed to achieve robust constraint satisfaction, and an event-triggered scheduling strategy is designed in the framework of dual-mode MPC. Second, sufficient conditions for ensuring feasibility and closed-loop robust stability are developed. We show that robust stability can be ensured and communication load can be reduced with the proposed MPC algorithm. Finally, numerical simulations and comparison studies are performed to verify the theoretical results.
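
    A common form of the triggering rule in robust event-triggered MPC re-solves the optimization only when the measured state drifts too far from the last nominal prediction; this is a generic textbook condition, not necessarily the exact rule of this paper:

    \[
    t_{k+1} \;=\; \inf\bigl\{\, t > t_k \;:\; \lVert x(t) - \bar{x}(t \mid t_k) \rVert \ge \varepsilon \,\bigr\},
    \]

    where \bar{x}(\cdot \mid t_k) is the trajectory predicted at the last trigger time t_k and the threshold \varepsilon is tied to the disturbance bound and the constraint tightening.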

  12. A novel robust speed controller scheme for PMBLDC motor.

    PubMed

    Thirusakthimurugan, P; Dananjayan, P

    2007-10-01

    The design of speed and position controllers for permanent magnet brushless DC motor (PMBLDC) drives remains an open problem in the field of motor drives. Precise speed control of a PMBLDC motor is complex due to the nonlinear coupling between winding currents and rotor speed. In addition, the nonlinearity present in the developed torque due to magnetic saturation of the rotor further complicates the issue. This paper presents a novel control scheme for the conventional PMBLDC motor drive, which aims to improve robustness by completely decoupling the design and minimizing the mutual influence between the speed and current control loops. An interesting feature of this robust control scheme is its suitability for both static and dynamic aspects. The effectiveness of the proposed robust speed control scheme is verified through simulations.

  13. Robust Constrained Optimization Approach to Control Design for International Space Station Centrifuge Rotor Auto Balancing Control System

    NASA Technical Reports Server (NTRS)

    Postma, Barry Dirk

    2005-01-01

    This thesis discusses the application of a robust constrained optimization approach to control design to develop an Auto Balancing Controller (ABC) for a centrifuge rotor to be implemented on the International Space Station. The design goal is to minimize a performance objective of the system while guaranteeing stability and proper performance for a range of uncertain plants. The performance objective is to minimize the translational response of the centrifuge rotor due to a fixed worst-case rotor imbalance. The robustness constraints are posed with respect to parametric uncertainty in the plant. The proposed approach to control design allows both of these objectives to be handled within the framework of constrained optimization. The resulting controller achieves acceptable performance and robustness characteristics.

  14. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    NASA Astrophysics Data System (ADS)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation-funded PermaData project, led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P), aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly speed up the manual processing needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage this data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets. Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative data manipulation and quality control process: visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool on GitHub, packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets (270 sites), and is scheduled to translate 10 more datasets (1000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.

  15. Robust Control Design for Uncertain Nonlinear Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Andrews, Lindsey; Giesy, Daniel P.

    2012-01-01

    Robustness to parametric uncertainty is fundamental to successful control system design and as such it has been at the core of many design methods developed over the decades. Despite its prominence, most of the work on robust control design has focused on linear models and uncertainties that are non-probabilistic in nature. Recently, researchers have acknowledged this disparity and have been developing theory to address a broader class of uncertainties. This paper presents an experimental application of robust control design for a hybrid class of probabilistic and non-probabilistic parametric uncertainties. The experimental apparatus is based upon the classic inverted pendulum on a cart. The physical uncertainty is realized by a known additional lumped mass at an unknown location on the pendulum. This unknown location has the effect of substantially altering the nominal frequency and controllability of the nonlinear system, and in the limit has the capability to make the system neutrally stable and uncontrollable. Another uncertainty to be considered is a direct current motor parameter. The control design objective is to design a controller that satisfies stability, tracking error, control power, and transient behavior requirements for the largest range of parametric uncertainties. This paper presents an overview of the theory behind the robust control design methodology and the experimental results.

  16. Best Practices for Reliable and Robust Spacecraft Structures

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Murthy, P. L. N.; Patel, Naresh R.; Bonacuse, Peter J.; Elliott, Kenny B.; Gordon, S. A.; Gyekenyesi, J. P.; Daso, E. O.; Aggarwal, P.; Tillman, R. F.

    2007-01-01

    A study was undertaken to capture the best practices for the development of reliable and robust spacecraft structures for NASA's next generation cargo and crewed launch vehicles. In this study, the NASA heritage programs such as Mercury, Gemini, Apollo, and the Space Shuttle program were examined. A series of lessons learned during the NASA and DoD heritage programs are captured. The processes that "make the right structural system" are examined along with the processes to "make the structural system right". The impact of technology advancements in materials and analysis and testing methods on reliability and robustness of spacecraft structures is studied. The best practices and lessons learned are extracted from these studies. Since the first human space flight, the best practices for reliable and robust spacecraft structures appear to be well established, understood, and articulated by each generation of designers and engineers. However, these best practices apparently have not always been followed. When the best practices are ignored or short cuts are taken, risks accumulate, and reliability suffers. Thus program managers need to be vigilant of circumstances and situations that tend to violate best practices. Adherence to the best practices may help develop spacecraft systems with high reliability and robustness against certain anomalies and unforeseen events.

  17. Chemical Microsensor Development for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Xu, Jennifer C.; Hunter, Gary W.; Lukco, Dorothy; Chen, Liangyu; Biaggi-Labiosa, Azlin M.

    2013-01-01

    Numerous aerospace applications, including low-false-alarm fire detection, environmental monitoring, fuel leak detection, and engine emission monitoring, would benefit greatly from robust and low weight, cost, and power consumption chemical microsensors. NASA Glenn Research Center has been working to develop a variety of chemical microsensors with these attributes to address the aforementioned applications. Chemical microsensors using different material platforms and sensing mechanisms have been produced. Approaches using electrochemical cells, resistors, and Schottky diode platforms, combined with nano-based materials, high temperature solid electrolytes, and room temperature polymer electrolytes have been realized to enable different types of microsensors. By understanding the application needs and chemical gas species to be detected, sensing materials and unique microfabrication processes were selected and applied. The chemical microsensors were designed utilizing simple structures and the least number of microfabrication processes possible, while maintaining high yield and low cost. In this presentation, an overview of carbon dioxide (CO2), oxygen (O2), and hydrogen/hydrocarbon (H2/CxHy) microsensors and their fabrication, testing results, and applications will be given. Particular challenges associated with improving the H2/CxHy microsensor contact wire-bonding pad will be discussed. These microsensors represent our research approach and serve as major tools as we expand our sensor development toolbox. Our ultimate goal is to develop robust chemical microsensor systems for aerospace and commercial applications.

  18. Enabling breakthroughs in Parkinson’s disease with wearable technologies and big data analytics

    PubMed Central

    Cohen, Shahar; Martig, Adria K.

    2016-01-01

    Parkinson’s disease (PD) is a progressive, degenerative disorder of the central nervous system that is diagnosed and measured clinically by the Unified Parkinson’s Disease Rating Scale (UPDRS). Tools for continuous and objective monitoring of PD motor symptoms are needed to complement clinical assessments of symptom severity to further inform PD therapeutic development across several arenas, from developing more robust clinical trial outcome measures to establishing biomarkers of disease progression. The Michael J. Fox Foundation for Parkinson’s Disease Research and Intel Corporation have joined forces to develop a mobile application and an Internet of Things (IoT) platform to support large-scale studies of objective, continuously sampled sensory data from people with PD. This platform provides both population and per-patient analyses, measuring gait, activity level, nighttime activity, tremor, as well as other structured assessments and tasks. All data collected will be available to researchers on an open-source platform. Development of the IoT platform raised a number of engineering considerations, including wearable sensor choice, data management and curation, and algorithm validation. This project has successfully demonstrated proof of concept that IoT platforms, wearable technologies and the data they generate offer exciting possibilities for more robust, reliable, and low-cost research methodologies and patient care strategies. PMID:28293596

  19. Rapid tests for sexually transmitted infections (STIs): the way forward

    PubMed Central

    Peeling, R W; Holmes, K K; Mabey, D

    2006-01-01

    In the developing world, laboratory services for sexually transmitted infections (STIs) are either not available, or where limited services are available, patients may not be able to pay for or physically access those services. Despite the existence of national policy for antenatal screening to prevent congenital syphilis and substantial evidence that antenatal screening is cost‐effective, implementation of syphilis screening programmes remains unacceptably low because of lack of screening tools that can be used in primary health care settings. The World Health Organization Sexually Transmitted Diseases Diagnostics Initiative (SDI) has developed the ASSURED criteria as a benchmark to decide if tests address disease control needs: Affordable, Sensitive, Specific, User‐friendly, Rapid and robust, Equipment‐free and Deliverable to end‐users. Rapid syphilis tests that can be used with whole blood approach the ASSURED criteria and can now be deployed in areas where no previous screening has been possible. Although rapid tests for chlamydia and gonorrhoea lack sensitivity, more tests are in development. The way forward for STI diagnostics requires a continuing quest for ASSURED tests, the development of a road map for test introduction, sustainable programmes for quality assurance, and the creation of a robust infrastructure linked to HIV prevention that ensures sustainability of STI control efforts that includes viral STIs. PMID:17151023

  20. Rapid tests for sexually transmitted infections (STIs): the way forward.

    PubMed

    Peeling, R W; Holmes, K K; Mabey, D; Ronald, A

    2006-12-01

    In the developing world, laboratory services for sexually transmitted infections (STIs) are either not available, or where limited services are available, patients may not be able to pay for or physically access those services. Despite the existence of national policy for antenatal screening to prevent congenital syphilis and substantial evidence that antenatal screening is cost-effective, implementation of syphilis screening programmes remains unacceptably low because of lack of screening tools that can be used in primary health care settings. The World Health Organization Sexually Transmitted Diseases Diagnostics Initiative (SDI) has developed the ASSURED criteria as a benchmark to decide if tests address disease control needs: Affordable, Sensitive, Specific, User-friendly, Rapid and robust, Equipment-free and Deliverable to end-users. Rapid syphilis tests that can be used with whole blood approach the ASSURED criteria and can now be deployed in areas where no previous screening has been possible. Although rapid tests for chlamydia and gonorrhoea lack sensitivity, more tests are in development. The way forward for STI diagnostics requires a continuing quest for ASSURED tests, the development of a road map for test introduction, sustainable programmes for quality assurance, and the creation of a robust infrastructure linked to HIV prevention that ensures sustainability of STI control efforts that includes viral STIs.

  1. Enabling breakthroughs in Parkinson's disease with wearable technologies and big data analytics.

    PubMed

    Cohen, Shahar; Bataille, Lauren R; Martig, Adria K

    2016-01-01

    Parkinson's disease (PD) is a progressive, degenerative disorder of the central nervous system that is diagnosed and measured clinically by the Unified Parkinson's Disease Rating Scale (UPDRS). Tools for continuous and objective monitoring of PD motor symptoms are needed to complement clinical assessments of symptom severity to further inform PD therapeutic development across several arenas, from developing more robust clinical trial outcome measures to establishing biomarkers of disease progression. The Michael J. Fox Foundation for Parkinson's Disease Research and Intel Corporation have joined forces to develop a mobile application and an Internet of Things (IoT) platform to support large-scale studies of objective, continuously sampled sensory data from people with PD. This platform provides both population and per-patient analyses, measuring gait, activity level, nighttime activity, tremor, as well as other structured assessments and tasks. All data collected will be available to researchers on an open-source platform. Development of the IoT platform raised a number of engineering considerations, including wearable sensor choice, data management and curation, and algorithm validation. This project has successfully demonstrated proof of concept that IoT platforms, wearable technologies and the data they generate offer exciting possibilities for more robust, reliable, and low-cost research methodologies and patient care strategies.

  2. TARCMO: Theory and Algorithms for Robust, Combinatorial, Multicriteria Optimization

    DTIC Science & Technology

    2016-11-28

    Report excerpt (recovered from table-of-contents and text fragments): topics include a bicriteria approach to robust optimization and the recoverable robust traveling salesman problem. The traveling salesman problem (TSP) is a well-known combinatorial optimization problem; an iterative procedure is presented for the recoverable robust TSP that results in an optimal solution to the robust TSP.

  3. Scatterometry or imaging overlay: a comparative study

    NASA Astrophysics Data System (ADS)

    Hsu, Simon C. C.; Pai, Yuan Chi; Chen, Charlie; Yu, Chun Chi; Hsing, Henry; Wu, Hsing-Chien; Kuo, Kelly T. L.; Amir, Nuriel

    2015-03-01

    Most fabrication facilities today use imaging overlay measurement methods, as imaging has been the industry's reliable workhorse for decades. In the last few years, third-generation Scatterometry Overlay (SCOL™) or Diffraction Based Overlay (DBO-1) technology was developed, alongside another DBO technology (DBO-2). This development led to the question of where the DBO technology should be implemented for overlay measurements. Scatterometry has been adopted for high volume production in only a few cases, always with imaging as a backup, but scatterometry overlay is considered by many to be the technology of the future. In this paper we compare imaging overlay and DBO technologies by means of measurements and simulations. We outline issues and sensitivities for both technologies, providing guidelines for the best implementation of each. For several of the presented cases, data from two different DBO technologies are compared as well, the first with pupil data access (DBO-1) and the other without pupil data access (DBO-2). Key indicators of overlay measurement quality include: layer coverage, accuracy, TMU, process robustness and robustness to process changes. Measurement data from real cases across the industry are compared and the conclusions are also backed by simulations. Accuracy is benchmarked against reference overlay and self-consistency, showing good results for imaging and DBO-1 technology. Process sensitivity and metrology robustness are mostly simulated with MTD (Metrology Target Designer), comparing the same process variations for both technologies. The experimental data presented in this study were acquired on ten advanced-node layers and three production-node layers, covering all phases of the IC fabrication process (FEOL, MEOL and BEOL). The metrology tool used for most of the study is KLA-Tencor's Archer 500LCM system (scatterometry-based and imaging-based measurement technologies on the same tool); another type of tool is used for DBO-2 measurements. Finally, we conclude that both imaging overlay technology and DBO-1 technology are fully successful and have a valid roadmap for the next few design nodes, with some use cases better suited to one or the other measurement technology. Having both imaging and DBO technology options available in parallel allows overlay engineers a mix-and-match overlay measurement strategy, providing backup when encountering difficulties with one of the technologies and benefiting from the best of both technologies for every use case.

  4. The Stochastic Evolutionary Game for a Population of Biological Networks Under Natural Selection

    PubMed Central

    Chen, Bor-Sen; Ho, Shih-Ju

    2014-01-01

    In this study, a population of evolutionary biological networks is described by a stochastic dynamic system with intrinsic random parameter fluctuations due to genetic variations and external disturbances caused by environmental changes in the evolutionary process. Since information on environmental changes is unavailable and their occurrence is unpredictable, they can be considered as a game player with the potential to destroy phenotypic stability. The biological network needs to develop an evolutionary strategy to improve phenotypic stability as much as possible, so it can be considered as another game player in the evolutionary process, i.e., a stochastic Nash game of minimizing the maximum network evolution level caused by the worst environmental disturbances. Based on the nonlinear stochastic evolutionary game strategy, we find that some genetic variations can be used in natural selection to construct negative feedback loops, efficiently improving network robustness. This provides larger genetic robustness as a buffer against neutral genetic variations, as well as larger environmental robustness to resist environmental disturbances and maintain network phenotypic traits in the evolutionary process. In this situation, the robust phenotypic traits of stochastic biological networks can be more frequently selected by natural selection in evolution. However, if the harbored neutral genetic variations accumulate to a sufficiently large degree, and environmental disturbances are strong enough that the network robustness can no longer confer enough genetic robustness and environmental robustness, then the phenotype robustness might break down. In this case, a network phenotypic trait may be pushed from one equilibrium point to another, changing the phenotypic trait and starting a new phase of network evolution through the hidden neutral genetic variations harbored in network robustness by adaptive evolution. Further, the proposed evolutionary game is extended to an n-tuple evolutionary game of stochastic biological networks with m players (competitive populations) and k environmental dynamics. PMID:24558296

  5. Robustness of radiomic breast features of benign lesions and luminal A cancers across MR magnet strengths

    NASA Astrophysics Data System (ADS)

    Whitney, Heather M.; Drukker, Karen; Edwards, Alexandra; Papaioannou, John; Giger, Maryellen L.

    2018-02-01

    Radiomics features extracted from breast lesion images have shown potential in diagnosis and prognosis of breast cancer. As clinical institutions transition from 1.5 T to 3.0 T magnetic resonance imaging (MRI), it is helpful to identify robust features across these field strengths. In this study, dynamic contrast-enhanced MR images were acquired retrospectively under IRB/HIPAA compliance, yielding 738 cases: 241 and 124 benign lesions imaged at 1.5 T and 3.0 T and 231 and 142 luminal A cancers imaged at 1.5 T and 3.0 T, respectively. Lesions were segmented using a fuzzy C-means method. Extracted radiomic values for each group of lesions by cancer status and field strength of acquisition were compared using a Kolmogorov-Smirnov test for the null hypothesis that two groups being compared came from the same distribution, with p-values being corrected for multiple comparisons by the Holm-Bonferroni method. Two shape features, one texture feature, and three enhancement variance kinetics features were found to be potentially robust. All potentially robust features had areas under the receiver operating characteristic curve (AUC) statistically greater than 0.5 in the task of distinguishing between lesion types (range of means 0.57-0.78). The significant difference in voxel size between field strength of acquisition limits the ability to affirm more features as robust or not robust according to field strength alone, and inhomogeneities in static field strength and radiofrequency field could also have affected the assessment of kinetic curve features as robust or not. Vendor-specific image scaling could have also been a factor. These findings will contribute to the development of radiomic signatures that use features identified as robust across field strength.
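
    The per-feature comparison described above can be sketched as follows; the data are synthetic, the feature count is arbitrary, and only the two cohort sizes (241 and 124 lesions) are taken from the abstract.

    # Sketch of the statistical screen described above: a two-sample
    # Kolmogorov-Smirnov test per radiomic feature, with Holm-Bonferroni
    # correction across features.
    import numpy as np
    from scipy.stats import ks_2samp
    from statsmodels.stats.multitest import multipletests

    rng = np.random.default_rng(2)
    n_features = 10
    # Synthetic feature matrices: lesions imaged at 1.5 T vs 3.0 T.
    f15 = rng.normal(0.0, 1.0, size=(241, n_features))
    f30 = rng.normal(0.1, 1.0, size=(124, n_features))

    pvals = [ks_2samp(f15[:, j], f30[:, j]).pvalue for j in range(n_features)]
    reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="holm")
    # A feature is a "potentially robust" candidate when the null (same
    # distribution across field strengths) is NOT rejected.
    print(f"{(~reject).sum()} of {n_features} features not rejected")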

  6. Click-On-Diagram Questions: a New Tool to Study Conceptions Using Classroom Response Systems

    NASA Astrophysics Data System (ADS)

    LaDue, Nicole D.; Shipley, Thomas F.

    2018-06-01

    Geoscience instructors depend upon photos, diagrams, and other visualizations to depict geologic structures and processes that occur over a wide range of temporal and spatial scales. This proof-of-concept study tests click-on-diagram (COD) questions, administered using a classroom response system (CRS), as a research tool for identifying spatial misconceptions. First, we propose a categorization of spatial conceptions associated with geoscience concepts. Second, we implemented the COD questions in an undergraduate introductory geology course. Each question was implemented three times: pre-instruction, post-instruction, and at the end of the course to evaluate the stability of students' conceptual understanding. We classified each instance as (1) a false belief that was easily remediated, (2) a flawed mental model that was not fully transformed, or (3) a robust misconception that persisted despite targeted instruction. Geographic Information System (GIS) software facilitated spatial analysis of students' answers. The COD data confirmed known misconceptions about Earth's structure, geologic time, and base level and revealed a novel robust misconception about hot spot formation. Questions with complex spatial attributes were less likely to change following instruction and more likely to be classified as a robust misconception. COD questions provided efficient access to students' conceptual understanding. CRS-administered COD questions present an opportunity to gather spatial conceptions with large groups of students, immediately, building the knowledge base about students' misconceptions and providing feedback to guide instruction.

  7. Robust model predictive control for multi-step short range spacecraft rendezvous

    NASA Astrophysics Data System (ADS)

    Zhu, Shuyi; Sun, Ran; Wang, Jiaolong; Wang, Jihe; Shao, Xiaowei

    2018-07-01

    This work presents a robust model predictive control (MPC) approach for the multi-step short range spacecraft rendezvous problem. During the specific short range phase concerned, the chaser is supposed to be initially outside the line-of-sight (LOS) cone. Therefore, the rendezvous process naturally includes two steps: the first step is to transfer the chaser into the LOS cone and the second step is to transfer the chaser into the aimed region with its motion confined within the LOS cone. A novel MPC framework named Mixed MPC (M-MPC) is proposed, which combines the Variable-Horizon MPC (VH-MPC) framework and the Fixed-Instant MPC (FI-MPC) framework. The M-MPC framework enables the optimization for the two steps to be implemented jointly rather than separated artificially, and its computation workload is acceptable for the usually low-power processors onboard spacecraft. Then, considering that disturbances including modeling error, sensor noise and thrust uncertainty may induce undesired constraint violations, a robust technique is developed and attached to the above M-MPC framework to form a robust M-MPC approach. The robust technique is based on the chance-constrained idea, which ensures that constraints can be satisfied with a prescribed probability. It improves on the robust technique proposed by Gavilan et al. because it eliminates unnecessary conservativeness by explicitly incorporating known statistical properties of the navigation uncertainty. The efficacy of the robust M-MPC approach is shown in a simulation study.
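
    The chance-constrained idea admits a standard closed form for a scalar linear constraint under Gaussian uncertainty; this is a generic textbook result, not the paper's specific construction. For x ~ N(x̄, Σ) and ε < 0.5:

    \[
    \Pr\bigl\{\, g^{\top}x \le h \,\bigr\} \ge 1-\epsilon
    \quad\Longleftrightarrow\quad
    g^{\top}\bar{x} \;\le\; h - \Phi^{-1}(1-\epsilon)\,\sqrt{g^{\top}\Sigma\, g},
    \]

    where \Phi^{-1} is the standard normal quantile function. The probabilistic constraint thus becomes a deterministic one on the mean, tightened by a margin that grows with the required confidence and with the uncertainty projected onto the constraint direction.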

  8. Robust Control Design for Systems With Probabilistic Uncertainty

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a reliability- and robustness-based formulation for robust control synthesis for systems with probabilistic uncertainty. In a reliability-based formulation, the probability of violating design requirements prescribed by inequality constraints is minimized. In a robustness-based formulation, a metric which measures the tendency of a random variable/process to cluster close to a target scalar/function is minimized. A multi-objective optimization procedure, which combines stability and performance requirements in time and frequency domains, is used to search for robustly optimal compensators. Some of the fundamental differences between the proposed strategy and conventional robust control methods are: (i) unnecessary conservatism is eliminated since there is no need for convex supports, (ii) the most likely plants are favored during synthesis allowing for probabilistic robust optimality, (iii) the tradeoff between robust stability and robust performance can be explored numerically, (iv) the uncertainty set is closely related to parameters with clear physical meaning, and (v) compensators with improved robust characteristics for a given control structure can be synthesized.
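
    A toy Monte Carlo version of the reliability-based formulation is sketched below; the plant, the requirement and all numbers are invented for illustration.

    # Toy reliability-based design: choose a feedback gain that minimizes the
    # probability of violating a requirement, with the plant parameter treated
    # probabilistically.
    import numpy as np

    rng = np.random.default_rng(3)
    a = rng.normal(2.0, 0.8, size=10_000)   # uncertain pole of dx/dt = -a*x + u

    # Requirement: with u = -k*x, the closed-loop pole -(a + k) must lie at or
    # left of -3 rad/s.
    for k in (0.5, 1.5, 2.5, 3.5):
        p_violate = np.mean(a + k < 3.0)
        print(f"k = {k:.1f}: P(requirement violated) = {p_violate:.3f}")
    # A reliability-based synthesis would search over k (and richer controller
    # structures) to drive this violation probability toward a target.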

  9. Robust detection of rare species using environmental DNA: The importance of primer specificity

    Treesearch

    Taylor M. Wilcox; Kevin S. McKelvey; Michael K. Young; Stephen F. Jane; Winsor H. Lowe; Andrew R. Whiteley; Michael K. Schwartz

    2013-01-01

    Environmental DNA (eDNA) is being rapidly adopted as a tool to detect rare animals. Quantitative PCR (qPCR) using probe-based chemistries may represent a particularly powerful tool because of the method's sensitivity, specificity, and potential to quantify target DNA. However, there has been little work understanding the performance of these assays in the presence...

  10. Analysis of Family Structures Reveals Robustness or Sensitivity of Bursting Activity to Parameter Variations in a Half-Center Oscillator (HCO) Model.

    PubMed

    Doloc-Mihu, Anca; Calabrese, Ronald L

    2016-01-01

    The underlying mechanisms that support robustness in neuronal networks are as yet unknown. However, recent studies provide evidence that neuronal networks are robust to natural variations, modulation, and environmental perturbations of parameters, such as maximal conductances of intrinsic membrane and synaptic currents. Here we sought a method for assessing robustness, which might easily be applied to large brute-force databases of model instances. Starting with groups of instances with appropriate activity (e.g., tonic spiking), our method classifies instances into much smaller subgroups, called families, in which all members vary only by the one parameter that defines the family. By analyzing the structures of families, we developed measures of robustness for activity type. Then, we applied these measures to our previously developed model database, HCO-db, of a two-neuron half-center oscillator (HCO), a neuronal microcircuit from the leech heartbeat central pattern generator where the appropriate activity type is alternating bursting. In HCO-db, the maximal conductances of five intrinsic and two synaptic currents were varied over eight values (leak reversal potential also varied, five values). We focused on how variations of particular conductance parameters maintain normal alternating bursting activity while still allowing for functional modulation of period and spike frequency. We explored the trade-off between robustness of activity type and desirable change in activity characteristics when intrinsic conductances are altered and identified the hyperpolarization-activated (h) current as an ideal target for modulation. We also identified ensembles of model instances that closely approximate physiological activity and can be used in future modeling studies.

  11. Preliminary results of miniaturized and robust ultrasound guided diffuse optical tomography system for breast cancer detection

    NASA Astrophysics Data System (ADS)

    Vavadi, Hamed; Mostafa, Atahar; Li, Jinglong; Zhou, Feifei; Uddin, Shihab; Xu, Chen; Zhu, Quing

    2017-02-01

    According to the World Health Organization, breast cancer is the most common cancer among women worldwide, claiming the lives of hundreds of thousands of women each year. Near-infrared diffuse optical tomography (DOT) has demonstrated great potential as an adjunct modality for differentiation of malignant and benign breast lesions and for monitoring treatment response of patients with locally advanced breast cancers. The path toward commercialization of DOT techniques depends upon improving the robustness and user-friendliness of the technique in both hardware and software. In the past, our group has developed three frequency-domain prototype systems, which were used in several clinical studies. In this study, we introduce our new US-guided DOT system under development, which improves on its predecessors in size, robustness and user-friendliness through several custom electronic and mechanical designs. A new and robust probe was designed to reduce preparation time in the clinical workflow. The processing procedure, data selection and user interface software were also updated. With all these improvements, the new system is more robust and accurate, bringing it one step closer to commercialization and wide use of this technology in clinical settings. The system is intended to be used by minimally trained users in clinical settings with robust performance. Its performance has been tested in phantom experiments, and initial results are presented in this study. We are currently finalizing the system and performing further tests to validate its performance, with the aim of using it in clinical settings for patients with breast cancer.

  12. Robust and Effective Component-based Banknote Recognition for the Blind

    PubMed Central

    Hasanuzzaman, Faiz M.; Yang, Xiaodong; Tian, YingLi

    2012-01-01

    We develop a novel camera-based computer vision technology to automatically recognize banknotes for assisting visually impaired people. Our banknote recognition system is robust and effective with the following features: 1) high accuracy: high true recognition rate and low false recognition rate, 2) robustness: handles a variety of currency designs and bills in various conditions, 3) high efficiency: recognizes banknotes quickly, and 4) ease of use: helps blind users to aim the target for image capture. To make the system robust to a variety of conditions including occlusion, rotation, scaling, cluttered background, illumination change, viewpoint variation, and worn or wrinkled bills, we propose a component-based framework by using Speeded Up Robust Features (SURF). Furthermore, we employ the spatial relationship of matched SURF features to detect if there is a bill in the camera view. This process largely alleviates false recognition and can guide the user to correctly aim at the bill to be recognized. The robustness and generalizability of the proposed system is evaluated on a dataset including both positive images (with U.S. banknotes) and negative images (no U.S. banknotes) collected under a variety of conditions. The proposed algorithm, achieves 100% true recognition rate and 0% false recognition rate. Our banknote recognition system is also tested by blind users. PMID:22661884
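
    The component-matching idea can be sketched with OpenCV. SURF itself is patented and only ships with opencv-contrib (cv2.xfeatures2d.SURF_create), so the freely available ORB detector stands in for it here; the file names are placeholders.

    # Feature-matching sketch in the spirit of the system above, using OpenCV
    # with ORB as a free stand-in for SURF.
    import cv2

    reference = cv2.imread("bill_20usd_reference.png", cv2.IMREAD_GRAYSCALE)
    query = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=1000)
    kp_r, des_r = orb.detectAndCompute(reference, None)
    kp_q, des_q = orb.detectAndCompute(query, None)

    # Hamming-distance brute-force matching with a Lowe-style ratio test.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    good = [m for m, n in matcher.knnMatch(des_r, des_q, k=2)
            if m.distance < 0.75 * n.distance]

    # A spatial-consistency check (as the paper does with matched features)
    # would follow, e.g. fitting a homography with RANSAC, before declaring
    # that a banknote is present.
    print(f"{len(good)} ratio-test matches")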

  13. Robust synthetic biology design: stochastic game theory approach.

    PubMed

    Chen, Bor-Sen; Chang, Chia-Hung; Lee, Hsiao-Ching

    2009-07-15

    Synthetic biology aims to engineer artificial biological systems to investigate natural biological phenomena and for a variety of applications. However, the development of synthetic gene networks is still difficult and most newly created gene networks are non-functioning due to uncertain initial conditions and disturbances of extra-cellular environments on the host cell. At present, how to design a robust synthetic gene network to work properly under these uncertain factors is the most important topic of synthetic biology. A robust regulation design is proposed for a stochastic synthetic gene network to achieve the prescribed steady states under these uncertain factors from the minimax regulation perspective. This minimax regulation design problem can be transformed to an equivalent stochastic game problem. Since it is not easy to solve the robust regulation design problem of synthetic gene networks by the non-linear stochastic game method directly, the Takagi-Sugeno (T-S) fuzzy model is proposed to approximate the non-linear synthetic gene network via the linear matrix inequality (LMI) technique through the Robust Control Toolbox in MATLAB. Finally, an in silico example is given to illustrate the design procedure and to confirm the efficiency and efficacy of the proposed robust gene design method. http://www.ee.nthu.edu.tw/bschen/SyntheticBioDesign_supplement.pdf.
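
    The minimax regulation idea takes the generic stochastic-game form below; this is an illustrative textbook formulation, and the paper's exact cost functional may differ:

    \[
    \min_{u(\cdot)}\ \max_{v(\cdot)}\ \mathbb{E}\!\left[\int_{0}^{t_f}\left(\tilde{x}^{\top}Q\,\tilde{x}+u^{\top}R\,u-\rho^{2}\,v^{\top}v\right)dt\right],
    \]

    where \tilde{x} is the deviation from the prescribed steady state, u the design input, v the worst-case environmental disturbance, and \rho the prescribed attenuation level; the T-S fuzzy approximation then turns the resulting design conditions into LMIs.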

  14. Robust multivariate nonparametric tests for detection of two-sample location shift in clinical trials

    PubMed Central

    Jiang, Xuejun; Guo, Xu; Zhang, Ning; Wang, Bo

    2018-01-01

    This article presents and investigates the performance of a series of robust multivariate nonparametric tests for detecting a location shift between two multivariate samples in randomized controlled trials. The tests are built upon robust estimators of distribution locations (medians, Hodges-Lehmann estimators, and an extended U statistic), in both unscaled and scaled versions. The nonparametric tests are robust to outliers and do not assume that the two samples are drawn from multivariate normal distributions. Bootstrap and permutation approaches are introduced for determining the p-values of the proposed test statistics. Simulation studies are conducted and numerical results are reported to examine performance of the proposed statistical tests. The numerical results demonstrate that the robust multivariate nonparametric tests constructed from the Hodges-Lehmann estimators are more efficient than those based on medians and the extended U statistic. The permutation approach can provide a more stringent control of Type I error and is generally more powerful than the bootstrap procedure. The proposed robust nonparametric tests are applied to detect multivariate distributional difference between the intervention and control groups in the Thai Healthy Choices study and examine the intervention effect of a four-session motivational interviewing-based intervention developed in the study to reduce risk behaviors among youth living with HIV. PMID:29672555
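
    A minimal version of the permutation approach is sketched below, using the difference of componentwise medians as an unscaled statistic; the paper's statistics also include Hodges-Lehmann estimators and scaled variants.

    # Minimal permutation test for a multivariate location shift.
    import numpy as np

    def perm_test(x, y, n_perm=5000, seed=4):
        rng = np.random.default_rng(seed)
        stat = np.linalg.norm(np.median(x, axis=0) - np.median(y, axis=0))
        pooled = np.vstack([x, y])
        n = len(x)
        count = 0
        for _ in range(n_perm):
            idx = rng.permutation(len(pooled))      # relabel subjects at random
            s = np.linalg.norm(np.median(pooled[idx[:n]], axis=0)
                               - np.median(pooled[idx[n:]], axis=0))
            count += s >= stat
        return (count + 1) / (n_perm + 1)           # permutation p-value

    rng = np.random.default_rng(5)
    control = rng.standard_t(df=3, size=(40, 4))           # heavy-tailed data
    treated = rng.standard_t(df=3, size=(40, 4)) + 0.8     # shifted by 0.8
    print(perm_test(control, treated))                      # small p-value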

  15. On a computational model of building thermal dynamic response

    NASA Astrophysics Data System (ADS)

    Jarošová, Petra; Vala, Jiří

    2016-07-01

    Development and exploitation of advanced materials, structures and technologies in civil engineering, both for buildings with carefully controlled interior temperature and for common residential houses, together with new European and national directives and technical standards, stimulate the development of rather complex and robust, but sufficiently simple and inexpensive computational tools, supporting their design and optimization of energy consumption. This paper demonstrates the possibility of consideration of such seemingly contradictory requirements, using the simplified non-stationary thermal model of a building, motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of solutions come from the method of lines.
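
    The electric-circuit analogy mentioned above can be made concrete with a one-resistance, one-capacitance (1R1C) zone model; the parameter values below are invented for illustration.

    # Minimal lumped-parameter thermal model of a single-zone building: one
    # capacitance C, one resistance R to outdoor air, plus internal heat Q.
    import numpy as np

    R = 0.005   # K/W, envelope thermal resistance (assumed)
    C = 2.0e7   # J/K, effective thermal capacitance (assumed)
    Q = 3000.0  # W, internal gains plus heating (assumed)
    dt = 600.0  # s, time step

    hours = np.arange(0, 48, dt / 3600.0)
    t_out = 5.0 + 5.0 * np.sin(2 * np.pi * hours / 24.0)  # outdoor temperature

    t_in = np.empty_like(t_out)
    t_in[0] = 20.0
    for k in range(len(hours) - 1):
        # dT/dt = (T_out - T_in)/(R*C) + Q/C, integrated by explicit Euler
        dTdt = (t_out[k] - t_in[k]) / (R * C) + Q / C
        t_in[k + 1] = t_in[k] + dt * dTdt

    print(f"indoor temperature after 48 h: {t_in[-1]:.1f} degC")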

  16. Flexible design in water and wastewater engineering--definitions, literature and decision guide.

    PubMed

    Spiller, Marc; Vreeburg, Jan H G; Leusbrock, Ingo; Zeeman, Grietje

    2015-02-01

    Urban water and wastewater systems face uncertain developments including technological progress, climate change and urban development. To ensure the sustainability of these systems under dynamic conditions, it has been proposed that technologies and infrastructure should be flexible, adaptive and robust. However, in the literature it is often unclear what these technologies and infrastructure are. Furthermore, the terms flexible, adaptive and robust are often used interchangeably, despite important differences. In this paper we will i) define the terminology, ii) provide an overview of the status of flexible infrastructure design alternatives for water and wastewater networks and treatment, and iii) develop guidelines for the selection of flexible design alternatives. Results indicate that, with the exception of Net Present Valuation methods, there is little research available on the design and evaluation of technologies that can enable flexibility. Flexible design alternatives reviewed include robust design, phased design, modular design, modular/component platform design and design for remanufacturing. As developments in the water sector are driven by slow variables (climate change, urban development), rather than market forces, it is suggested that phased design or component platform designs are suitable for responding to change, while robust design is an option when operations face highly dynamic variability. Copyright © 2014 Elsevier Ltd. All rights reserved.
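
    The Net Present Valuation approach mentioned above reduces, at its core, to discounting cash flows. A toy comparison of an upfront versus a phased build (all figures invented) illustrates why deferring capacity can pay off.

    # Toy Net Present Value comparison of an all-at-once build versus a
    # phased build under discounting.
    def npv(cash_flows, rate):
        """Discount a list of (year, amount) cash flows to present value."""
        return sum(amount / (1 + rate) ** year for year, amount in cash_flows)

    rate = 0.05
    upfront = [(0, -10.0e6)]                  # build full capacity in year 0
    phased = [(0, -6.0e6), (10, -6.0e6)]      # defer half the capacity 10 years

    print(f"upfront NPV of costs: {npv(upfront, rate)/1e6:.2f} M")
    print(f"phased  NPV of costs: {npv(phased, rate)/1e6:.2f} M")
    # Deferral discounts the second tranche (6.0/1.05**10 = 3.68 M), so phasing
    # is cheaper in present-value terms despite a higher total nominal cost.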

  17. Integrated approaches to the application of advanced modeling technology in process development and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are needed within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  18. Robust network design for multispecies conservation

    Treesearch

    Ronan Le Bras; Bistra Dilkina; Yexiang Xue; Carla P. Gomes; Kevin S. McKelvey; Michael K. Schwartz; Claire A. Montgomery

    2013-01-01

    Our work is motivated by an important network design application in computational sustainability concerning wildlife conservation. In the face of human development and climate change, it is important that conservation plans for protecting landscape connectivity exhibit a certain level of robustness. While previous work has focused on conservation strategies that result...

  19. A Robust Bayesian Approach for Structural Equation Models with Missing Data

    ERIC Educational Resources Information Center

    Lee, Sik-Yum; Xia, Ye-Mao

    2008-01-01

    In this paper, normal/independent distributions, including but not limited to the multivariate t distribution, the multivariate contaminated distribution, and the multivariate slash distribution, are used to develop a robust Bayesian approach for analyzing structural equation models with complete or missing data. In the context of a nonlinear…

  20. Free-floating epithelial micro-tissue arrays: a low cost and versatile technique.

    PubMed

    Flood, P; Alvarez, L; Reynaud, E G

    2016-10-11

    Three-dimensional (3D) tissue models are invaluable tools that can closely reflect the in vivo physiological environment. However, they are usually difficult to develop, have a low throughput and are often costly, limiting their utility to most laboratories. The recent availability of inexpensive additive manufacturing printers and open source 3D design software offers the possibility of easily creating affordable 3D cell culture platforms. To demonstrate this, we established a simple, inexpensive and robust method for producing arrays of free-floating epithelial micro-tissues. Using a combination of 3D computer-aided design and 3D printing, hydrogel micro-moulding and collagen cell encapsulation, we engineered microenvironments that consistently direct the growth of micro-tissue arrays. We demonstrated the adaptability of this technique by testing several immortalised epithelial cell lines (MDCK, A549, Caco-2) and by generating branching morphology and micron- to millimetre-scale micro-tissues. We established by fluorescence and electron microscopy that micro-tissues are polarised, have cell-type-specific differentiated phenotypes and regain native in vivo tissue qualities. Finally, using Salmonella typhimurium we show that micro-tissues display a more physiologically relevant infection response compared to epithelial monolayers grown on permeable filter supports. In summary, we have developed a robust and adaptable technique for producing arrays of epithelial micro-tissues. This in vitro model has the potential to be a valuable tool for studying epithelial cell and tissue function/architecture in a physiologically relevant context.

  1. A multimodal interface for real-time soldier-robot teaming

    NASA Astrophysics Data System (ADS)

    Barber, Daniel J.; Howard, Thomas M.; Walter, Matthew R.

    2016-05-01

    Recent research and advances in robotics have led to the development of novel platforms leveraging new sensing capabilities for semantic navigation. As these systems become increasingly robust, they support highly complex commands beyond direct teleoperation and waypoint finding, facilitating a transition away from robots as tools to robots as teammates. Supporting future Soldier-Robot teaming requires communication capabilities on par with those of human-human teams for successful integration of robots. Therefore, as robots increase in functionality, it is equally important that the interface between the Soldier and robot advances as well. Multimodal communication (MMC) enables human-robot teaming through redundancy and levels of communication more robust than single-mode interaction. Commercial-off-the-shelf (COTS) technologies released in recent years for smart-phones and gaming provide tools for the creation of portable interfaces incorporating MMC through the use of speech, gestures, and visual displays. However, for multimodal interfaces to be successfully used in the military domain, they must be able to classify speech and gestures and process natural language in real time with high accuracy. For the present study, a prototype multimodal interface supporting real-time interactions with an autonomous robot was developed. This device integrated COTS Automated Speech Recognition (ASR), a custom gesture recognition glove, and natural language understanding on a tablet. This paper presents performance results (e.g. response times, accuracy) of the integrated device when commanding an autonomous robot to perform reconnaissance and surveillance activities in an unknown outdoor environment.

  2. Quantitative Story Telling: Initial steps towards bridging perspectives and tools for a robust nexus assessment

    NASA Astrophysics Data System (ADS)

    Cabello, Violeta

    2017-04-01

    This communication will present the development of an innovative analytical framework for the analysis of the Water-Energy-Food-Climate Nexus, termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools into an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focusses on the identification of the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST is a story-driven rather than a data-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to the MAGIC project, the basic concepts of QST and a case study focussed on agricultural production in a semi-arid region in Southern Spain. Data requirements for this case study, and the limitations in finding, accessing or estimating them, will be presented alongside a reflection on the relation between analytical scales and data availability.

  3. APPLICATION OF STABLE ISOTOPE TECHNIQUES TO AIR POLLUTION RESEARCH

    EPA Science Inventory

    Stable isotope techniques provide a robust, yet under-utilized tool for examining pollutant effects on plant growth and ecosystem function. Here, we survey a range of mixing model, physiological and system level applications for documenting pollutant effects. Mixing model examp...

  4. Robust digital image watermarking using distortion-compensated dither modulation

    NASA Astrophysics Data System (ADS)

    Li, Mianjie; Yuan, Xiaochen

    2018-04-01

    In this paper, we propose a robust feature extraction based digital image watermarking method using Distortion-Compensated Dither Modulation (DC-DM). Our proposed local watermarking method provides stronger robustness and better flexibility than traditional global watermarking methods. We improve robustness by introducing feature extraction and the DC-DM method. To extract robust feature points, we propose a DAISY-based Robust Feature Extraction (DRFE) method that employs the DAISY descriptor and applies filtering based on entropy calculation. The experimental results show that the proposed method achieves satisfactory robustness while preserving watermark imperceptibility, compared to other existing methods.
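
    The dither-modulation core of such a scheme is compact enough to sketch. The following is a minimal, textbook-style DC-DM embedder and decoder for one bit per sample, not the authors' DRFE-based pipeline: the bit selects a dithered quantizer, and the compensation factor alpha in (0, 1] trades embedding distortion against robustness (alpha = 1 reduces to plain dither modulation).

```python
import numpy as np

DELTA = 8.0   # quantizer step
ALPHA = 0.7   # distortion-compensation factor

def embed(x, bits, delta=DELTA, alpha=ALPHA):
    """DC-DM embedding: quantize to the bit's dithered lattice, then step back by (1 - alpha)."""
    d = bits * delta / 2.0                       # dither: 0 for bit 0, delta/2 for bit 1
    q = np.round((x + d) / delta) * delta - d    # dithered lattice quantization
    return x + alpha * (q - x)                   # partial move = distortion compensation

def decode(y, delta=DELTA):
    """Nearest-coset decoding: choose the bit whose dithered lattice lies closer."""
    dist = []
    for m in (0, 1):
        d = m * delta / 2.0
        q = np.round((y + d) / delta) * delta - d
        dist.append(np.abs(y - q))
    return (dist[1] < dist[0]).astype(int)

rng = np.random.default_rng(0)
x = rng.normal(0, 50, 1000)                      # host samples (e.g. transform coefficients)
bits = rng.integers(0, 2, 1000)
y = embed(x, bits) + rng.normal(0, 1.0, 1000)    # embedding followed by a mild noise attack
print("bit error rate:", np.mean(decode(y) != bits))
```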

  5. Mechanisms for Robust Cognition.

    PubMed

    Walsh, Matthew M; Gluck, Kevin A

    2015-08-01

    To function well in an unpredictable environment using unreliable components, a system must have a high degree of robustness. Robustness is fundamental to biological systems and is an objective in the design of engineered systems such as airplane engines and buildings. Cognitive systems, like biological and engineered systems, exist within variable environments. This raises the question: how do cognitive systems achieve similarly high degrees of robustness? The aim of this study was to identify a set of mechanisms that enhance robustness in cognitive systems. We identify three mechanisms that enhance robustness in biological and engineered systems: system control, redundancy, and adaptability. After surveying the psychological literature for evidence of these mechanisms, we provide simulations illustrating how each contributes to robust cognition in a different psychological domain: psychomotor vigilance, semantic memory, and strategy selection. These simulations highlight features of a mathematical approach for quantifying robustness, and they provide concrete examples of mechanisms for robust cognition. © 2014 Cognitive Science Society, Inc.

  6. A Comparative Theoretical and Computational Study on Robust Counterpart Optimization: II. Probabilistic Guarantees on Constraint Satisfaction

    PubMed Central

    Li, Zukui; Floudas, Christodoulos A.

    2012-01-01

    Probabilistic guarantees on constraint satisfaction for robust counterpart optimization are studied in this paper. The robust counterpart optimization formulations studied are derived from box, ellipsoidal, polyhedral, “interval+ellipsoidal” and “interval+polyhedral” uncertainty sets (Li, Z., Ding, R., and Floudas, C.A., A Comparative Theoretical and Computational Study on Robust Counterpart Optimization: I. Robust Linear and Robust Mixed Integer Linear Optimization, Ind. Eng. Chem. Res., 2011, 50, 10567). For these robust counterpart optimization formulations, the corresponding probability bounds on constraint satisfaction are derived for different types of uncertainty characteristics (i.e., bounded or unbounded uncertainty, with or without detailed probability distribution information). The findings of this work extend the results in the literature and provide greater flexibility for robust optimization practitioners in choosing tighter probability bounds so as to find less conservative robust solutions. Extensive numerical studies are performed to compare the tightness of the different probability bounds and the conservatism of different robust counterpart optimization formulations. Guiding rules for the selection of robust counterpart optimization models and for the determination of the size of the uncertainty set are discussed. Applications in production planning and process scheduling problems are presented. PMID:23329868
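
    The flavor of a robust counterpart is easiest to see for the simplest case treated in the series, box uncertainty on a single linear constraint (a sketch of the standard construction, not the paper's full formulations): if each coefficient may deviate from its nominal value a_j by at most d_j, the worst case over the box is a^T x + sum_j d_j |x_j| <= b, a deterministic condition that can be checked directly or imposed in an LP through auxiliary variables.

```python
import numpy as np

def robustly_feasible(x, a, d, b):
    """True iff a~^T x <= b for every a~ in the box [a - d, a + d]."""
    return a @ x + d @ np.abs(x) <= b            # worst case adds d_j |x_j| per coefficient

a = np.array([2.0, 3.0])   # nominal constraint coefficients
d = np.array([0.5, 0.5])   # half-widths of the uncertainty box
b = 10.0

print(robustly_feasible(np.array([2.0, 1.5]), a, d, b))  # nominal 8.5, worst case 10.25 -> False
print(robustly_feasible(np.array([2.0, 1.0]), a, d, b))  # nominal 7.0, worst case  8.5  -> True
```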

  7. Metabolic Engineering of Oleaginous Yeasts for Production of Fuels and Chemicals

    PubMed Central

    Shi, Shuobo; Zhao, Huimin

    2017-01-01

    Oleaginous yeasts have been increasingly explored for the production of chemicals and fuels via metabolic engineering. Particularly, there is a growing interest in using oleaginous yeasts for the synthesis of lipid-related products due to their high lipogenesis capability, robustness, and ability to utilize a variety of substrates. Most metabolic engineering studies in oleaginous yeasts have focused on Yarrowia, which already has plenty of genetic engineering tools. However, recent advances in systems biology and synthetic biology have provided new strategies and tools to engineer those oleaginous yeasts that have naturally high lipid accumulation but lack genetic tools, such as Rhodosporidium, Trichosporon, and Lipomyces. This review highlights recent accomplishments in the metabolic engineering of oleaginous yeasts and recent advances in the development of genetic engineering tools in oleaginous yeasts within the last 3 years. PMID:29167664

  8. Optimum Design of Forging Process Parameters and Preform Shape under Uncertainties

    NASA Astrophysics Data System (ADS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2004-06-01

    Forging is a highly complex non-linear process that is vulnerable to various uncertainties, such as variations in billet geometry, die temperature, material properties, workpiece and forging equipment positional errors, and process parameters. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion and production risk. Identifying the sources of uncertainties, and quantifying and controlling them, will reduce risk in the manufacturing environment and minimize the overall cost of production. In this paper, various uncertainties that affect forging tool life and preform design are identified, and their cumulative effect on the forging process is evaluated. Since forging process simulation is computationally intensive, the response surface approach is used to reduce computational time by establishing a relationship between the system performance and the critical process design parameters. Variability in system performance due to randomness in the parameters is computed by applying Monte Carlo Simulation (MCS) to the generated Response Surface Models (RSM). Finally, a robust design methodology is developed to optimize forging process parameters and preform shape. The developed method is demonstrated by applying it to an axisymmetric H-cross-section disk forging to improve product quality and robustness.
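
    The surrogate-plus-sampling idea can be sketched with a toy response standing in for the forging simulation (the function and all numbers below are invented for illustration): a quadratic response surface is fitted to a small designed set of "expensive" runs, and Monte Carlo sampling is then performed on the cheap surrogate to estimate the variability of the response under random process-parameter scatter.

```python
import numpy as np

def expensive_simulation(temp, speed):
    """Stand-in for a costly forging run: a smooth response to two process parameters."""
    return 50 + 0.8 * temp + 1.5 * speed + 0.02 * temp * speed + 0.01 * temp**2

# design of experiments: a small grid of simulator runs
T, S = np.meshgrid(np.linspace(-10, 10, 5), np.linspace(-10, 10, 5))
t, s = T.ravel(), S.ravel()
y = expensive_simulation(t, s)

# least-squares fit of a full quadratic response surface model
X = np.column_stack([np.ones_like(t), t, s, t * s, t**2, s**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Monte Carlo on the surrogate: random deviations of the process parameters
rng = np.random.default_rng(0)
tm, sm = rng.normal(0, 3, 100_000), rng.normal(0, 3, 100_000)
Xm = np.column_stack([np.ones_like(tm), tm, sm, tm * sm, tm**2, sm**2])
ym = Xm @ coef
print(f"response mean {ym.mean():.1f}, standard deviation {ym.std():.2f}")
```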

  9. Robust autofocus algorithm for ISAR imaging of moving targets

    NASA Astrophysics Data System (ADS)

    Li, Jian; Wu, Renbiao; Chen, Victor C.

    2000-08-01

    A robust autofocus approach, referred to as AUTOCLEAN (AUTOfocus via CLEAN), is proposed for motion compensation in ISAR (inverse synthetic aperture radar) imaging of moving targets. It is a parametric algorithm based on a very flexible data model which takes into account arbitrary range migration and arbitrary phase errors across the synthetic aperture that may be induced by unwanted radial motion of the target as well as by propagation or system instability. AUTOCLEAN can be classified as a multiple scatterer algorithm (MSA), but it differs considerably from other existing MSAs in several aspects: (1) dominant scatterers are selected automatically in the two-dimensional (2-D) image domain; (2) the scatterers need not be well-isolated or very dominant; (3) phase and RCS (radar cross section) information from each selected scatterer are combined in an optimal way; (4) the troublesome phase unwrapping step is avoided. AUTOCLEAN is computationally efficient and involves only a sequence of FFTs (fast Fourier transforms). Another good feature of AUTOCLEAN is that its performance can be progressively improved by assuming a larger number of dominant scatterers for the target. Hence it can be easily configured for real-time applications, such as ATR (automatic target recognition) of non-cooperative moving targets, as well as for applications where image quality, rather than computation time, is the major concern, such as the development and maintenance of low-observable aircraft. Numerical and experimental results have shown that AUTOCLEAN is a very robust autofocus tool for ISAR imaging.
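
    AUTOCLEAN combines phase and RCS information from many automatically selected scatterers; the sketch below shows only the classical single-dominant-scatterer idea it generalizes, on synthetic data with no range migration (and it uses the phase unwrapping that AUTOCLEAN itself avoids): estimate the common pulse-to-pulse phase error from the strongest range bin and remove it from all bins before the azimuth FFT.

```python
import numpy as np

n_pulses, n_bins = 128, 64
rng = np.random.default_rng(0)
k = np.arange(n_pulses)

# two point scatterers: (range bin, azimuth frequency, amplitude)
data = np.zeros((n_pulses, n_bins), complex)
for rbin, f, amp in [(20, 0.11, 5.0), (45, -0.07, 1.0)]:
    data[:, rbin] += amp * np.exp(2j * np.pi * f * k)

phase_err = 2 * np.pi * 0.3 * (k / n_pulses) ** 2    # quadratic error from radial motion
data *= np.exp(1j * phase_err)[:, None]
data += 0.1 * (rng.standard_normal(data.shape) + 1j * rng.standard_normal(data.shape))

strongest = np.argmax(np.sum(np.abs(data) ** 2, axis=0))   # dominant-scatterer range bin
est = np.unwrap(np.angle(data[:, strongest]))              # its phase history across pulses
est -= np.polyval(np.polyfit(k, est, 1), k)                # keep only the non-linear error part
focused = data * np.exp(-1j * est)[:, None]                # compensate every range bin

image = np.fft.fft(focused, axis=0)                        # azimuth compression
print("peak gain after autofocus:",
      np.abs(image).max() / np.abs(np.fft.fft(data, axis=0)).max())
```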

  10. Developing Uncertainty Models for Robust Flutter Analysis Using Ground Vibration Test Data

    NASA Technical Reports Server (NTRS)

    Potter, Starr; Lind, Rick; Kehoe, Michael W. (Technical Monitor)

    2001-01-01

    A ground vibration test can be used to obtain information about structural dynamics that is important for flutter analysis. Traditionally, this information, such as natural frequencies of modes, is used to update analytical models used to predict flutter speeds. The ground vibration test can also be used to obtain uncertainty models, such as natural frequencies and their associated variations, that can update analytical models for the purpose of predicting robust flutter speeds. Analyzing test data using the ∞-norm, rather than the traditional 2-norm, is shown to lead to a minimum-size uncertainty description and, consequently, a least-conservative robust flutter speed. This approach is demonstrated using ground vibration test data for the Aerostructures Test Wing. Different norms are used to formulate uncertainty models and their associated robust flutter speeds to evaluate which norm is least conservative.
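
    The effect of the norm choice on the size of the uncertainty description fits in two lines (illustrative numbers, not the Aerostructures Test Wing data): the ∞-norm radius needed to cover a set of observed modal-frequency deviations can never exceed the 2-norm radius, which is why it leads to a smaller uncertainty set and a less conservative robust flutter speed.

```python
import numpy as np

# deviations of a mode's identified natural frequency from the model, across tests [Hz]
deviations = np.array([0.12, -0.05, 0.30, 0.08, -0.02])

print("2-norm radius  :", np.linalg.norm(deviations, 2))       # ~0.34
print("inf-norm radius:", np.linalg.norm(deviations, np.inf))  # 0.30, a smaller set
```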

  11. Optimal design of stimulus experiments for robust discrimination of biochemical reaction networks.

    PubMed

    Flassig, R J; Sundmacher, K

    2012-12-01

    Biochemical reaction networks in the form of coupled ordinary differential equations (ODEs) provide a powerful modeling tool for understanding the dynamics of biochemical processes. During the early phase of modeling, scientists have to deal with a large pool of competing nonlinear models. At this point, discrimination experiments can be designed and conducted to obtain optimal data for selecting the most plausible model. Since biological ODE models have widely distributed parameters due to, e.g., biological variability or experimental variations, model responses become distributed. Therefore, a robust optimal experimental design (OED) for model discrimination can be used to discriminate models based on their response probability distribution functions (PDFs). In this work, we present an optimal control-based methodology for designing optimal stimulus experiments aimed at robust model discrimination. For estimating the time-varying model response PDF, which results from the nonlinear propagation of the parameter PDF under the ODE dynamics, we suggest using the sigma-point approach. Using the model overlap (expected likelihood) as a robust discrimination criterion to measure dissimilarities between expected model response PDFs, we benchmark the proposed nonlinear design approach against linearization with respect to prediction accuracy and design quality for two nonlinear biological reaction networks. As shown, the sigma-point approach outperforms the linearization approach in the case of widely distributed parameter sets and/or existing multiple steady states. Since the sigma-point approach scales linearly with the number of model parameters, it can be applied to large systems for robust experimental planning. An implementation of the method in MATLAB/AMPL is available at http://www.uni-magdeburg.de/ivt/svt/person/rf/roed.html. Contact: flassig@mpi-magdeburg.mpg.de. Supplementary data are available at Bioinformatics online.
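
    The sigma-point construction itself is independent of any particular network, so it can be sketched with a toy response function (the model and numbers below are invented for illustration): 2n+1 deterministically chosen parameter points are propagated through the nonlinear model, and their weighted sample moments approximate the mean and covariance of the response far better than linearization when the nonlinearity is strong.

```python
import numpy as np

def model(theta):
    """Toy nonlinear steady-state response of a two-parameter reaction model."""
    k1, k2 = theta
    return np.array([k1 * k2 / (1.0 + k1), k1 * np.exp(-k2)])

def unscented_propagate(mu, P, f, kappa=1.0):
    """Propagate mean mu and covariance P through f using 2n+1 sigma points."""
    n = len(mu)
    L = np.linalg.cholesky((n + kappa) * P)   # matrix square root of the scaled covariance
    sigma = [mu] + [mu + L[:, i] for i in range(n)] + [mu - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    Y = np.array([f(s) for s in sigma])       # model response at each sigma point
    mean = w @ Y
    cov = (w[:, None] * (Y - mean)).T @ (Y - mean)
    return mean, cov

mu = np.array([2.0, 0.5])                     # parameter mean
P = np.diag([0.4**2, 0.1**2])                 # parameter covariance
mean, cov = unscented_propagate(mu, P, model)
print("response mean:", mean)
print("response covariance:\n", cov)
```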

  12. Processing tracking in jMRUI software for magnetic resonance spectra quantitation reproducibility assurance.

    PubMed

    Jabłoński, Michał; Starčuková, Jana; Starčuk, Zenon

    2017-01-23

    Proton magnetic resonance spectroscopy is a non-invasive measurement technique which provides information about the concentrations of up to 20 metabolites participating in intracellular biochemical processes. To obtain any metabolic information from measured spectra, the data must be processed in specialized software such as jMRUI. The processing is interactive and complex and often requires many trials before a correct result is obtained. This paper proposes a jMRUI enhancement for efficient and unambiguous history tracking and file identification. A database storing all processing steps, parameters and files used in processing was developed for jMRUI. The solution was developed in Java; the authors used an SQL database for robust storage of parameters and SHA-256 hash codes for unambiguous file identification. The developed system was integrated directly into jMRUI and will be publicly available. A graphical user interface was implemented to make the user experience more comfortable. The database operation is invisible to the ordinary user; all tracking operations are performed in the background. The implemented jMRUI database is a tool that can significantly help the user track the processing history performed on data in jMRUI. The created tool is designed to be user-friendly, robust and easy to use. The database GUI allows the user to browse the whole processing history of a selected file and learn, e.g., what processing led to the results and where the original data are stored, and to obtain the list of all processing actions performed on spectra.
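
    The tracking mechanism can be sketched in a few lines of Python (a hypothetical schema with invented helper names, not jMRUI's actual Java implementation): every file is identified by the SHA-256 hash of its content, and each processing action is appended to an SQL table, so the full history behind any result file can be reconstructed later regardless of renames or copies.

```python
import hashlib
import sqlite3

def sha256_of(path):
    """Content-based file identifier: identical content always yields the same hash."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

db = sqlite3.connect("history.db")
db.execute("""CREATE TABLE IF NOT EXISTS steps (
    input_hash TEXT, output_hash TEXT, action TEXT, parameters TEXT,
    timestamp TEXT DEFAULT CURRENT_TIMESTAMP)""")

def record_step(input_path, output_path, action, parameters):
    """Log one processing action linking an input file to the file it produced."""
    db.execute("INSERT INTO steps (input_hash, output_hash, action, parameters) "
               "VALUES (?, ?, ?, ?)",
               (sha256_of(input_path), sha256_of(output_path), action, parameters))
    db.commit()

def history_of(path):
    """All recorded actions that produced a file with exactly this content."""
    return db.execute("SELECT action, parameters, timestamp FROM steps "
                      "WHERE output_hash = ?", (sha256_of(path),)).fetchall()
```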

  13. Effects of CT resolution and radiodensity threshold on the CFD evaluation of nasal airflow.

    PubMed

    Quadrio, Maurizio; Pipolo, Carlotta; Corti, Stefano; Messina, Francesco; Pesci, Chiara; Saibene, Alberto M; Zampini, Samuele; Felisati, Giovanni

    2016-03-01

    The article focuses on the robustness of a CFD-based procedure for the quantitative evaluation of nasal airflow. The ability of CFD to yield robust results despite unavoidable procedural and modeling inaccuracies must be demonstrated before this tool can become part of clinical practice in this field. The present article specifically addresses the sensitivity of the CFD procedure to the spatial resolution of the available CT scans, as well as to the choice of the segmentation level of the CT images. We found no critical problems concerning these issues; nevertheless, the choice of the segmentation level is potentially delicate if carried out by an untrained operator.

  14. Enterovirus A71 DNA-Launched Infectious Clone as a Robust Reverse Genetic Tool

    PubMed Central

    Tan, Chee Wah; Tee, Han Kang; Lee, Michelle Hui Pheng; Sam, I-Ching; Chan, Yoke Fun

    2016-01-01

    Enterovirus A71 (EV-A71) causes major outbreaks of hand, foot and mouth disease, and is occasionally associated with neurological complications and death in children. Reverse genetics is widely used in the field of virology for functional study of viral genes. For EV-A71, such tools are limited to clones that are transcriptionally controlled by T7/SP6 bacteriophage promoters. This is often time-consuming and expensive. Here, we describe the development of infectious plasmid DNA-based EV-A71 clones, in which EV-A71 genome expression is under transcriptional control of the CMV immediate-early promoter and the SV40 transcription-termination signal. Transfection of this EV-A71 infectious DNA produces virus yields similar to those of in vitro-transcribed EV-A71 infectious RNA: 6.4 and 5.8 log10 PFU/ml, respectively. Infectious plasmids with enhanced green fluorescent protein and Nano luciferase reporter genes also produced good virus titers, 4.3 and 5.0 log10 PFU/ml, respectively. Another infectious plasmid with both CMV and T7 promoters was also developed for easy manipulation of in vitro transcription or direct plasmid transfection. Transfection with either dual-promoter infectious plasmid DNA or infectious RNA derived from this dual-promoter clone produced infectious viral particles. Incorporation of the hepatitis delta virus ribozyme, which yields precise 3' ends of the DNA-launched EV-A71 genomic transcripts, increased infectious virus production. In contrast, incorporation of the hammerhead ribozyme in the DNA-launched EV-A71 resulted in lower virus yield, but improved virus titers for T7 promoter-derived infectious RNA. This study describes rapid and robust reverse genetic tools for EV-A71. PMID:27617744

  15. THE REAL McCOIL: A method for the concurrent estimation of the complexity of infection and SNP allele frequency for malaria parasites

    PubMed Central

    Chang, Hsiao-Han; Worby, Colin J.; Yeka, Adoke; Nankabirwa, Joaniter; Kamya, Moses R.; Staedke, Sarah G.; Hubbart, Christina; Amato, Roberto; Kwiatkowski, Dominic P.

    2017-01-01

    As many malaria-endemic countries move towards elimination of Plasmodium falciparum, the most virulent human malaria parasite, effective tools for monitoring malaria epidemiology are urgent priorities. P. falciparum population genetic approaches offer promising tools for understanding transmission and spread of the disease, but a high prevalence of multi-clone or polygenomic infections can render estimation of even the most basic parameters, such as allele frequencies, challenging. A previous method, COIL, was developed to estimate complexity of infection (COI) from single nucleotide polymorphism (SNP) data, but it relies on monogenomic infections to estimate allele frequencies or requires external allele frequency data, which may not be available. Estimates limited to monogenomic infections may not be representative, however, and when the average COI is high, they can be difficult or impossible to obtain. Therefore, we developed THE REAL McCOIL, Turning HEterozygous SNP data into Robust Estimates of ALlele frequency, via Markov chain Monte Carlo, and Complexity Of Infection using Likelihood, to incorporate polygenomic samples and simultaneously estimate allele frequency and COI. This approach was tested via simulations and then applied to SNP data from cross-sectional surveys performed in three Ugandan sites with varying malaria transmission. We show that THE REAL McCOIL consistently outperforms COIL on simulated data, particularly when most infections are polygenomic. Using field data we show that, unlike with COIL, we can distinguish epidemiologically relevant differences in COI between and within these sites. Surprisingly, for example, we estimated high average COI in a peri-urban subregion with lower transmission intensity, suggesting that many of these cases were imported from surrounding regions with higher transmission intensity. THE REAL McCOIL therefore provides a robust tool for understanding the molecular epidemiology of malaria across transmission settings. PMID:28125584
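
    The likelihood at the heart of the method can be illustrated in a deliberately simplified setting, with known allele frequencies and a grid search instead of MCMC (a toy sketch, not THE REAL McCOIL itself): if an infection contains c unrelated clones and the population frequency of one allele at a SNP is p, the sample appears heterozygous there unless all c clones share an allele, so P(het) = 1 - p^c - (1 - p)^c, and COI can be estimated by maximizing the likelihood of the observed heterozygosity pattern.

```python
import numpy as np

def loglik(c, het, p):
    """Log-likelihood of het/hom calls across SNPs for a candidate COI c."""
    p_het = 1.0 - p**c - (1.0 - p) ** c
    p_het = np.clip(p_het, 1e-12, 1 - 1e-12)
    return np.sum(np.where(het, np.log(p_het), np.log(1.0 - p_het)))

rng = np.random.default_rng(0)
n_snps, true_coi = 100, 3
p = rng.uniform(0.2, 0.8, n_snps)               # known population allele frequencies
clones = rng.random((true_coi, n_snps)) < p     # allele carried by each clone at each SNP
het = clones.any(axis=0) & ~clones.all(axis=0)  # mixed alleles -> heterozygous call

cois = np.arange(1, 11)
print("estimated COI:", cois[np.argmax([loglik(c, het, p) for c in cois])])
```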

  16. Developments in the Tools and Methodologies of Synthetic Biology

    PubMed Central

    Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul

    2014-01-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788

  17. Robustness of meta-analyses in finding gene × environment interactions

    PubMed Central

    Shi, Gang; Nehorai, Arye

    2017-01-01

    Meta-analyses that synthesize statistical evidence across studies have become important analytical tools for genetic studies. Inspired by the success of genome-wide association studies of the genetic main effect, researchers are searching for gene × environment interactions. Confounders are routinely included in the genome-wide gene × environment interaction analysis as covariates; however, this does not control for any confounding effects on the results if covariate × environment interactions are present. We carried out simulation studies to evaluate the robustness to the covariate × environment confounder for meta-regression and joint meta-analysis, which are two commonly used meta-analysis methods for testing the gene × environment interaction or the genetic main effect and interaction jointly. Here we show that meta-regression is robust to the covariate × environment confounder while joint meta-analysis is subject to the confounding effect with inflated type I error rates. Given vast sample sizes employed in genome-wide gene × environment interaction studies, non-significant covariate × environment interactions at the study level could substantially elevate the type I error rate at the consortium level. When covariate × environment confounders are present, type I errors can be controlled in joint meta-analysis by including the covariate × environment terms in the analysis at the study level. Alternatively, meta-regression can be applied, which is robust to potential covariate × environment confounders. PMID:28362796
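
    Meta-regression itself reduces to a weighted least-squares fit across studies, which the toy sketch below illustrates (invented data, not the paper's genome-wide simulations): per-study effect estimates are regressed on the per-study mean exposure with inverse-variance weights, and the slope estimates the gene × environment interaction.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 30                                          # number of studies
env = rng.uniform(0, 1, K)                      # per-study mean environmental exposure
se = rng.uniform(0.05, 0.15, K)                 # standard errors of the study estimates
beta = 0.10 + 0.25 * env + rng.normal(0, se)    # study estimates; true interaction slope 0.25

# weighted least squares: beta_k = b0 + b1 * env_k + error, weights 1/se_k^2
X = np.column_stack([np.ones(K), env])
W = np.diag(1.0 / se**2)
coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ beta)
cov = np.linalg.inv(X.T @ W @ X)
print(f"interaction estimate {coef[1]:.3f} (SE {np.sqrt(cov[1, 1]):.3f})")
```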

  18. Robust Surface Reconstruction via Laplace-Beltrami Eigen-Projection and Boundary Deformation

    PubMed Central

    Shi, Yonggang; Lai, Rongjie; Morra, Jonathan H.; Dinov, Ivo; Thompson, Paul M.; Toga, Arthur W.

    2010-01-01

    In medical shape analysis, a critical problem is reconstructing a smooth surface of correct topology from a binary mask that typically has spurious features due to segmentation artifacts. The challenge is the robust removal of these outliers without affecting the accuracy of other parts of the boundary. In this paper, we propose a novel approach for this problem based on the Laplace-Beltrami (LB) eigen-projection and properly designed boundary deformations. Using the metric distortion during the LB eigen-projection, our method automatically detects the location of outliers and feeds this information to a well-composed and topology-preserving deformation. By iterating between these two steps of outlier detection and boundary deformation, we can robustly filter out the outliers without moving the smooth part of the boundary. The final surface is the eigen-projection of the filtered mask boundary that has the correct topology, desired accuracy and smoothness. In our experiments, we illustrate the robustness of our method on different input masks of the same structure, and compare with the popular SPHARM tool and the topology preserving level set method to show that our method can reconstruct accurate surface representations without introducing artificial oscillations. We also successfully validate our method on a large data set of more than 900 hippocampal masks and demonstrate that the reconstructed surfaces retain volume information accurately. PMID:20624704
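
    The eigen-projection step can be illustrated on a simpler object than a brain surface: a closed 2-D contour with a cycle-graph Laplacian standing in for the Laplace-Beltrami operator (a conceptual sketch, not the authors' pipeline, which additionally detects outliers via metric distortion and deforms the boundary). Projecting the boundary coordinates onto the low-frequency Laplacian eigenvectors yields a smooth reconstruction that suppresses a spurious spike.

```python
import numpy as np

n, k = 200, 12                                   # boundary points, eigenvectors kept
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
contour = np.column_stack([np.cos(theta), np.sin(theta)])
contour[50] += [0.8, 0.0]                        # a segmentation-artifact spike

# Laplacian of the cycle graph connecting consecutive boundary points
L = 2 * np.eye(n)
idx = np.arange(n)
L[idx, (idx + 1) % n] = -1
L[idx, (idx - 1) % n] = -1

eigvals, eigvecs = np.linalg.eigh(L)             # eigenvectors = discrete Fourier modes
basis = eigvecs[:, :k]                           # low-frequency subspace
smooth = basis @ (basis.T @ contour)             # eigen-projection of the coordinates

print("spike radius before:", np.linalg.norm(contour[50]),
      "after:", np.linalg.norm(smooth[50]))
```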

  19. LMI-Based Generation of Feedback Laws for a Robust Model Predictive Control Algorithm

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Carson, John M., III

    2007-01-01

    This technical note provides a mathematical proof of Corollary 1 from the paper 'A Nonlinear Model Predictive Control Algorithm with Proven Robustness and Resolvability' that appeared in the 2006 Proceedings of the American Control Conference. The proof was omitted for brevity in the publication. The paper was based on algorithms developed for the FY2005 R&TD (Research and Technology Development) project for Small-body Guidance, Navigation, and Control [2]. The framework established by the Corollary is for a robustly stabilizing MPC (model predictive control) algorithm for uncertain nonlinear systems that guarantees the resolvability of the associated finite-horizon optimal control problem in a receding-horizon implementation. Additional details of the framework are available in the publication.

  20. p3d--Python module for structural bioinformatics.

    PubMed

    Fufezan, Christian; Specht, Michael

    2009-08-21

    High-throughput bioinformatic analysis tools are needed to mine the large amount of structural data via knowledge-based approaches. The development of such tools requires a robust interface to access the structural data in an easy way. For this, the Python scripting language is an optimal choice, since its philosophy emphasizes understandable source code. p3d is an object-oriented Python module that adds a simple yet powerful interface to the Python interpreter to process and analyse three-dimensional protein structure files (PDB files). p3d's strength arises from the combination of (a) very fast spatial access to the structural data due to the implementation of a binary space partitioning (BSP) tree, (b) set theory, and (c) functions that combine (a) and (b) and use human-readable language in search queries rather than complex computer syntax. All these factors combined facilitate the rapid development of bioinformatic tools that can perform quick and complex analyses of protein structures. p3d is thus well suited for quickly developing tools for structural bioinformatics using the Python scripting language.
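
    p3d's BSP tree and query language are internal to the library, so rather than guess at its API, the sketch below reproduces the same kind of fast spatial query with scipy's k-d tree (a swapped-in spatial index, not p3d's interface) on atom coordinates read from a local PDB file; the file name and the parse_pdb_atoms helper are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def parse_pdb_atoms(path):
    """Coordinates and residue names from ATOM/HETATM records (fixed PDB columns)."""
    coords, resnames = [], []
    with open(path) as f:
        for line in f:
            if line.startswith(("ATOM", "HETATM")):
                coords.append([float(line[30:38]), float(line[38:46]), float(line[46:54])])
                resnames.append(line[17:20].strip())
    return np.array(coords), np.array(resnames)

coords, resnames = parse_pdb_atoms("structure.pdb")   # any local PDB file
tree = cKDTree(coords)                                # spatial index over all atoms

# e.g. all atoms within 4.5 A of any heme atom: the kind of query p3d
# expresses in human-readable language
ligand = coords[resnames == "HEM"]
near = sorted(set().union(*tree.query_ball_point(ligand, r=4.5)))
print(f"{len(near)} atoms within 4.5 A of the ligand")
```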
