Conceptual information processing: A robust approach to KBS-DBMS integration
NASA Technical Reports Server (NTRS)
Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond
1987-01-01
Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivate the investigation of information processing paradigms. One such paradigm is concept-based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.
NASA Astrophysics Data System (ADS)
Szelag, Bertrand; Abraham, Alexis; Brision, Stéphane; Gindre, Paul; Blampey, Benjamin; Myko, André; Olivier, Segolene; Kopp, Christophe
2017-05-01
Silicon photonics is becoming a reality for next-generation communication systems, addressing the increasing needs of HPC (High Performance Computing) systems and datacenters. CMOS-compatible photonic platforms integrating passive and active devices are being developed in many foundries. The use of existing, qualified microelectronics processes guarantees cost-efficient and mature photonic technologies. Meanwhile, photonic devices have fabrication constraints of their own, distinct from those of CMOS devices, which can affect their performance. In this paper, we address the integration of a PN-junction Mach-Zehnder modulator in a 200 mm CMOS-compatible photonic platform. The characteristics of implantation-based devices are affected by many process variations, among them screening layer thickness, dopant diffusion, and implantation mask overlay. CMOS devices are generally quite robust with respect to these variations thanks to dedicated design rules. For photonic devices the situation is different: most of the time, doped areas must be carefully located within waveguides, and CMOS solutions such as self-alignment to the gate cannot be applied. In this work, we present different robust integration solutions for junction-based modulators. A simulation setup has been built to optimize the process conditions. It consists of a Matlab interface coupling process and device electro-optic simulators so that many iterations can be run. Variations of modulator characteristics with process parameters are illustrated using this simulation setup. Parameters under study include X- and Y-direction lithography shifts and screening oxide and slab thicknesses. A robust process and design approach leading to a PN-junction Mach-Zehnder modulator insensitive to lithography misalignment is then proposed. Simulation results are compared with experimental data: various modulators have been fabricated with different process conditions and integration schemes.
Extensive electro-optic characterization of these components will be presented.
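The sensitivity study described above can be sketched with a toy Monte Carlo model. The quadratic efficiency-loss law, the overlay spread, and all constants below are illustrative assumptions for the sketch, not the authors' process/device simulation setup:

```python
import random

def phase_efficiency(dx_nm, eff0=1.0, k=1e-4):
    # Toy model (assumption): modulation efficiency falls off
    # quadratically with the lateral junction misalignment dx (nm).
    return eff0 - k * dx_nm ** 2

random.seed(0)
# Sample lithography overlay errors, here with a 5 nm standard deviation.
shifts = [random.gauss(0.0, 5.0) for _ in range(10_000)]
effs = [phase_efficiency(dx) for dx in shifts]
mean_eff = sum(effs) / len(effs)   # expected efficiency across the lot
worst_eff = min(effs)              # worst sampled die
```

A misalignment-insensitive design, in this picture, is one whose efficiency curve is flat around dx = 0, so the lot statistics barely degrade.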
ERIC Educational Resources Information Center
Neumann, Yoram; Neumann, Edith; Lewis, Shelia
2017-01-01
This study integrated the Spiral Curriculum approach into the Robust Learning Model as part of a continuous improvement process that was designed to improve educational effectiveness and then assessed the differences between the initial and integrated models as well as the predictability of the first course in the integrated learning model on a…
Stochastic simulation and robust design optimization of integrated photonic filters
NASA Astrophysics Data System (ADS)
Weng, Tsui-Wei; Melati, Daniele; Melloni, Andrea; Daniel, Luca
2017-01-01
Manufacturing variations are becoming an unavoidable issue in modern fabrication processes; therefore, it is crucial to be able to include stochastic uncertainties in the design phase. In this paper, integrated photonic coupled ring resonator filters are considered as an example of significant interest. The sparsity structure in photonic circuits is exploited to construct a sparse combined generalized polynomial chaos model, which is then used to analyze related statistics and perform robust design optimization. Simulation results show that the optimized circuits are more robust to fabrication process variations and achieve a reduction of 11%-35% in the mean square errors of the 3 dB bandwidth compared to unoptimized nominal designs.
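The kind of statistic quoted above (mean square error of the 3 dB bandwidth under fabrication variation) can be illustrated with a minimal Monte Carlo sketch. The linear sensitivity model and the two sensitivity coefficients are invented for illustration; the paper itself uses a sparse generalized polynomial chaos model, not brute-force sampling:

```python
import random

def bandwidth_ghz(width_err_nm, nominal_bw=10.0, sens=0.05):
    # Toy model (assumption): the 3 dB bandwidth shifts linearly with a
    # waveguide-width fabrication error.
    return nominal_bw + sens * width_err_nm

random.seed(1)
errs = [random.gauss(0.0, 10.0) for _ in range(20_000)]

def mse(sens):
    # Mean square error of the bandwidth relative to its nominal value.
    return sum((bandwidth_ghz(e, sens=sens) - 10.0) ** 2 for e in errs) / len(errs)

mse_nominal = mse(0.05)   # unoptimized design
mse_robust = mse(0.03)    # desensitized design (assumed lower sensitivity)
reduction = 1.0 - mse_robust / mse_nominal
```

With a linear sensitivity model the MSE scales with the square of the sensitivity, so desensitizing the design from 0.05 to 0.03 cuts the MSE by 64%.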
Precup, Radu-Emil; David, Radu-Codrut; Petriu, Emil M; Radac, Mircea-Bogdan; Preitl, Stefan
2014-11-01
This paper suggests a new generation of optimal PI controllers for a class of servo systems characterized by saturation and dead-zone static nonlinearities and by second-order models with an integral component. The objective functions are expressed as the integral of time multiplied by absolute error (ITAE) plus the weighted sum of the integrals of output sensitivity functions of the state sensitivity models with respect to two process parametric variations. The PI controller tuning conditions, applied to a simplified linear process model, involve a single design parameter specific to the extended symmetrical optimum (ESO) method, which offers the desired tradeoff among several control system performance indices. An original back-calculation and tracking anti-windup scheme is proposed to prevent integrator wind-up and to compensate for the dead-zone nonlinearity of the process. The minimization of the objective functions is carried out in the framework of optimization problems with inequality constraints which guarantee robust stability with respect to the process parametric variations and controller robustness. An adaptive gravitational search algorithm (GSA) solves the optimization problems, focused on the optimal tuning of the design parameter specific to the ESO method and of the anti-windup tracking gain. The resulting tuning method for PI controllers is proposed as an efficient approach to the design of resilient control systems. The tuning method and the PI controllers are experimentally validated by the adaptive GSA-based tuning of PI controllers for the angular position control of a laboratory servo system.
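The ITAE-type cost at the heart of the objective function above can be shown with a minimal simulation. The integrator plant, the PI gains, and the step size are arbitrary stand-ins, not the servo model or the GSA tuning from the paper:

```python
# Integrating plant dy/dt = u under PI control of a unit step reference,
# accumulating the ITAE cost J = integral of t * |e(t)| dt along the run.
dt, T = 0.01, 20.0
Kp, Ki = 2.0, 1.0            # assumed stabilizing gains (poles at s = -1, -1)
r = 1.0                      # unit step reference
y = integ = itae = t = 0.0
for _ in range(int(T / dt)):
    e = r - y
    integ += e * dt
    u = Kp * e + Ki * integ  # PI law
    y += u * dt              # forward-Euler step of the integrator plant
    t += dt
    itae += t * abs(e) * dt  # ITAE accumulates late errors more heavily
final_error = abs(r - y)
```

Minimizing this cost over the controller gains (here by hand; by GSA in the paper) penalizes slow or oscillatory settling because errors late in the run are weighted by time.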
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.
The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models needed to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.
Pechenick, Dov A.; Payne, Joshua L.; Moore, Jason H.
2011-01-01
Gene regulatory networks (GRNs) drive the cellular processes that sustain life. To do so reliably, GRNs must be robust to perturbations, such as gene deletion and the addition or removal of regulatory interactions. GRNs must also be robust to genetic changes in regulatory regions that define the logic of signal-integration, as these changes can affect how specific combinations of regulatory signals are mapped to particular gene expression states. Previous theoretical analyses have demonstrated that the robustness of a GRN is influenced by its underlying topological properties, such as degree distribution and modularity. Another important topological property is assortativity, which measures the propensity with which nodes of similar connectivity are connected to one another. How assortativity influences the robustness of the signal-integration logic of GRNs remains an open question. Here, we use computational models of GRNs to investigate this relationship. We separately consider each of the three dynamical regimes of this model for a variety of degree distributions. We find that in the chaotic regime, robustness exhibits a pronounced increase as assortativity becomes more positive, while in the critical and ordered regimes, robustness is generally less sensitive to changes in assortativity. We attribute the increased robustness to a decrease in the duration of the gene expression pattern, which is caused by a reduction in the average size of a GRN’s in-components. This study provides the first direct evidence that assortativity influences the robustness of the signal-integration logic of computational models of GRNs, illuminates a mechanistic explanation for this influence, and furthers our understanding of the relationship between topology and robustness in complex biological systems. PMID:22155134
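The assortativity measure discussed above is, in its standard degree form (due to Newman), the Pearson correlation of the degrees found at the two ends of each edge. A minimal pure-Python sketch, independent of the GRN model in the paper:

```python
def assortativity(edges):
    # Degree assortativity: Pearson correlation of the degrees at the
    # two endpoints of each edge, counting each undirected edge in both
    # directions so the measure is symmetric.
    deg = {}
    for a, b in edges:
        deg[a] = deg.get(a, 0) + 1
        deg[b] = deg.get(b, 0) + 1
    xs, ys = [], []
    for a, b in edges:
        xs += [deg[a], deg[b]]
        ys += [deg[b], deg[a]]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

# A star graph is maximally disassortative: the hub connects only to leaves.
r_star = assortativity([(0, 1), (0, 2), (0, 3)])
```

Positive values mean high-degree nodes tend to link to other high-degree nodes (the regime where the abstract reports the largest robustness gains in chaotic dynamics); negative values mean hubs attach to low-degree nodes.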
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fong, Erika J.; Huang, Chao; Hamilton, Julie
Here, a major advantage of microfluidic devices is the ability to manipulate small sample volumes, thus reducing reagent waste and preserving precious sample. However, to achieve robust sample manipulation it is necessary to address device integration with the macroscale environment. To realize repeatable, sensitive particle separation with microfluidic devices, this protocol presents a complete automated and integrated microfluidic platform that enables precise processing of 0.15–1.5 ml samples using microfluidic devices. Important aspects of this system include modular device layout and robust fixtures resulting in reliable and flexible world-to-chip connections, and fully automated fluid handling which accomplishes closed-loop sample collection, system cleaning and priming steps to ensure repeatable operation. Different microfluidic devices can be used interchangeably with this architecture. Here we incorporate an acoustofluidic device, detail its characterization, performance optimization, and demonstrate its use for size-separation of biological samples. By using real-time feedback during separation experiments, sample collection is optimized to conserve and concentrate sample. Although requiring the integration of multiple pieces of equipment, advantages of this architecture include the ability to process unknown samples with no additional system optimization, ease of device replacement, and precise, robust sample processing.
Smith predictor based-sliding mode controller for integrating processes with elevated deadtime.
Camacho, Oscar; De la Cruz, Francisco
2004-04-01
An approach to control integrating processes with elevated deadtime using a Smith predictor sliding mode controller is presented. A PID sliding surface and an integrating first-order plus deadtime model have been used to synthesize the controller. Since the performance of existing Smith predictor based controllers decreases in the presence of modeling errors, this paper presents a simple approach to combining the Smith predictor with the sliding mode concept, which is a proven, simple, and robust procedure. The proposed scheme has a set of tuning equations as a function of the characteristic parameters of the model. For implementation of our proposed approach, computer-based industrial controllers that execute PID algorithms can be used. The performance and robustness of the proposed controller are compared with those of the Matausek-Micić scheme for linear systems using simulations.
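The Smith predictor structure underlying the scheme above can be sketched in a few lines. For simplicity this sketch uses a plain PI primary controller in place of the sliding-mode law, assumes a perfect internal model, and uses illustrative gains and deadtime; it shows only how the predictor removes the deadtime from the feedback path:

```python
from collections import deque

# Integrating plus deadtime plant: dy/dt = u(t - theta).
dt, T, theta = 0.01, 40.0, 1.0
N = int(theta / dt)
Kp, Ki = 1.0, 0.2            # assumed PI gains for the delay-free loop
r = 1.0                      # step setpoint
y = ym = integ = 0.0
ubuf = deque([0.0] * N)      # plant input delay line
ymbuf = deque([0.0] * N)     # internal-model output delay line
for _ in range(int(T / dt)):
    ymd = ymbuf[0]
    # Predictor feedback: measured output plus (delay-free model minus
    # delayed model). With a perfect model this equals the delay-free output.
    e = r - (y + ym - ymd)
    integ += e * dt
    u = Kp * e + Ki * integ
    y += ubuf[0] * dt        # plant integrates the delayed input
    ym += u * dt             # delay-free internal model
    ubuf.append(u); ubuf.popleft()
    ymbuf.append(ym); ymbuf.popleft()
```

Because the controller effectively sees the delay-free model output, the PI gains can be tuned as if the deadtime were absent; the sliding-mode layer in the paper adds robustness when the model is imperfect.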
AIRSAR Automated Web-based Data Processing and Distribution System
NASA Technical Reports Server (NTRS)
Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen
2005-01-01
In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.
Integration of Off-Track Sonic Boom Analysis in Conceptual Design of Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Li, Wu
2011-01-01
A highly desired capability for the conceptual design of aircraft is the ability to rapidly and accurately evaluate new concepts to avoid adverse trade decisions that may hinder the development process in the later stages of design. Evaluating the robustness of new low-boom concepts is important for the conceptual design of supersonic aircraft. Here, robustness means that the aircraft configuration has a low-boom ground signature at both under- and off-track locations. An integrated process for off-track boom analysis is developed to facilitate the design of robust low-boom supersonic aircraft. The integrated off-track analysis can also be used to study the sonic boom impact and to plan future flight trajectories where flight conditions and ground elevation might have a significant effect on ground signatures. The key enabler for off-track sonic boom analysis is accurate computational fluid dynamics (CFD) solutions for off-body pressure distributions. To ensure the numerical accuracy of the off-body pressure distributions, a mesh study is performed with Cart3D to determine the mesh requirements for off-body CFD analysis and comparisons are made between the Cart3D and USM3D results. The variations in ground signatures that result from changes in the initial location of the near-field waveform are also examined. Finally, a complete under- and off-track sonic boom analysis is presented for two distinct supersonic concepts to demonstrate the capability of the integrated analysis process.
An Improved Strong Tracking Cubature Kalman Filter for GPS/INS Integrated Navigation Systems.
Feng, Kaiqiang; Li, Jie; Zhang, Xi; Zhang, Xiaoming; Shen, Chong; Cao, Huiliang; Yang, Yanyu; Liu, Jun
2018-06-12
The cubature Kalman filter (CKF) is widely used in GPS/INS integrated navigation systems. However, its performance may decline in accuracy and even diverge in the presence of process uncertainties. To solve this problem, a new algorithm named the improved strong tracking seventh-degree spherical simplex-radial cubature Kalman filter (IST-7thSSRCKF) is proposed in this paper. In the proposed algorithm, the effect of process uncertainty is mitigated by an improved strong tracking Kalman filter technique, in which a hypothesis-testing method is adopted to identify the process uncertainty and the prior state estimate covariance in the CKF is further modified online according to the change in vehicle dynamics. In addition, a new seventh-degree spherical simplex-radial rule is employed to further improve the estimation accuracy of the strong tracking cubature Kalman filter. In this way, the proposed algorithm combines the high accuracy of the 7thSSRCKF with the strong tracking filter's robustness against process uncertainties. The GPS/INS integrated navigation problem with significant dynamic model errors is used to validate the performance of the proposed IST-7thSSRCKF. Results demonstrate that the improved strong tracking cubature Kalman filter achieves higher accuracy than the existing CKF and ST-CKF, and is more robust for the GPS/INS integrated navigation system.
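The strong-tracking idea above, inflating the predicted covariance by a fading factor when the innovation is inconsistent with it, can be shown in a scalar sketch. This is a deliberately simplified 1-D stand-in (assumed fading-factor rule, no cubature rule, invented constants), not the paper's IST-7thSSRCKF:

```python
def st_kf_step(x, P, z, q=0.01, r_var=1.0):
    # One scalar Kalman step for a random-walk state with direct
    # measurement z, plus a crude strong-tracking fading factor: the
    # prior covariance is inflated when the squared innovation exceeds
    # its predicted variance (illustrative form of the rule).
    P_pred = P + q                   # time update
    innov = z - x                    # innovation
    S = P_pred + r_var               # predicted innovation variance
    lam = max(1.0, innov ** 2 / S)   # fading factor >= 1
    P_pred *= lam                    # inflate prior when model is doubted
    K = P_pred / (P_pred + r_var)    # Kalman gain
    return x + K * innov, (1.0 - K) * P_pred

x, P = 0.0, 10.0
for _ in range(50):
    x, P = st_kf_step(x, P, z=5.0)   # noise-free measurements of 5.0
```

When vehicle dynamics change abruptly, the inflated covariance raises the gain so the filter re-converges quickly instead of diverging.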
Desai, Parind M; Hogan, Rachael C; Brancazio, David; Puri, Vibha; Jensen, Keith D; Chun, Jung-Hoon; Myerson, Allan S; Trout, Bernhardt L
2017-10-05
This study provides a framework for robust tablet development using an integrated hot-melt extrusion-injection molding (IM) continuous manufacturing platform. Griseofulvin, maltodextrin, xylitol and lactose were employed as the drug, carrier, plasticizer and reinforcing agent, respectively. A pre-blended drug-excipient mixture was fed from a loss-in-weight feeder to a twin-screw extruder. The extrudate was subsequently injected directly into the integrated IM unit and molded into tablets. Tablets were stored under different storage conditions for up to 20 weeks to monitor physical stability and were evaluated by polarized light microscopy, DSC, SEM, XRD and dissolution analysis. Optimized injection pressure provided robust tablet formulations. Tablets manufactured at low and high injection pressures exhibited the flaws of sink marks and flashing, respectively. A higher solidification temperature during the IM process reduced the thermally induced residual stress and prevented chipping and cracking issues. Polarized light microscopy revealed a homogeneous dispersion of crystalline griseofulvin in an amorphous matrix. DSC underpinned the effect of high tablet residual moisture on maltodextrin-xylitol phase separation, which resulted in dimensional instability. Tablets with low residual moisture demonstrated long-term dimensional stability. This study serves as a model for IM tablet formulations, providing mechanistic understanding of the critical process parameters and formulation attributes required for optimal product performance. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Limin; Shen, Yiteng; Yu, Jingxian; Li, Ping; Zhang, Ridong; Gao, Furong
2018-01-01
In order to cope with system disturbances in multi-phase batch processes with different dimensions, a hybrid robust control scheme combining iterative learning control with feedback control is proposed in this paper. First, with a hybrid iterative learning control law designed by introducing the state error, the tracking error and the extended information, the multi-phase batch process is converted into a two-dimensional Fornasini-Marchesini (2D-FM) switched system with different dimensions. Second, a switching signal is designed using the average dwell-time method, integrated with the related switching conditions, to give sufficient conditions ensuring stable operation of the system. Finally, the minimum running time of the subsystems and the control law gains are calculated by solving linear matrix inequalities. Meanwhile, a compound 2D controller with robust performance is obtained, which includes a robust extended feedback control ensuring that the steady-state tracking error converges rapidly. Application to an injection molding process demonstrates the effectiveness and superiority of the proposed strategy.
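The iterative learning component above rests on the classic batch-to-batch update u_{k+1}(t) = u_k(t) + L·e_k(t). A minimal sketch on a toy static repetitive plant (assumed gains and trajectory; nothing of the 2D-FM switched design is reproduced here):

```python
# Iterative learning control on a toy repetitive process: each batch
# applies input profile u, observes y = G * u pointwise, and updates u
# with learning gain L between batches. Convergence needs |1 - L*G| < 1.
G, L = 0.5, 1.0
ref = [1.0, 2.0, 3.0, 2.0]        # desired trajectory over one batch
u = [0.0] * len(ref)
for batch in range(30):
    y = [G * ui for ui in u]                         # run the batch
    e = [ri - yi for ri, yi in zip(ref, y)]          # batch tracking error
    u = [ui + L * ei for ui, ei in zip(u, e)]        # learning update
max_err = max(abs(ri - G * ui) for ri, ui in zip(ref, u))
```

Here the error contracts by the factor |1 - L·G| = 0.5 every batch, which is the mechanism the feedback layer in the paper augments with within-batch disturbance rejection.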
Nie, Xianghui; Huang, Guo H; Li, Yongping
2009-11-01
This study integrates the concepts of interval numbers and fuzzy sets into optimization analysis by dynamic programming as a means of accounting for system uncertainty. The developed interval fuzzy robust dynamic programming (IFRDP) model improves upon previous interval dynamic programming methods. It allows highly uncertain information to be effectively communicated into the optimization process through introducing the concept of fuzzy boundary interval and providing an interval-parameter fuzzy robust programming method for an embedded linear programming problem. Consequently, robustness of the optimization process and solution can be enhanced. The modeling approach is applied to a hypothetical problem for the planning of waste-flow allocation and treatment/disposal facility expansion within a municipal solid waste (MSW) management system. Interval solutions for capacity expansion of waste management facilities and relevant waste-flow allocation are generated and interpreted to provide useful decision alternatives. The results indicate that robust and useful solutions can be obtained, and the proposed IFRDP approach is applicable to practical problems that are associated with highly complex and uncertain information.
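The interval numbers underpinning the IFRDP model obey simple bound-propagation arithmetic. A minimal sketch of that arithmetic (not the IFRDP formulation itself; the cost figures are invented for illustration):

```python
class Interval:
    # Closed interval [lo, hi]: the representation used when a model
    # coefficient is known only as a pair of bounds.
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sum of intervals: add the bounds endpoint-wise.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product: take the extremes over all endpoint combinations,
        # which covers sign changes correctly.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Illustrative waste-flow cost: uncertain unit cost times uncertain flow,
# plus a fixed charge.
cost = Interval(2.0, 3.0) * Interval(10.0, 12.0) + Interval(5.0, 5.0)
```

Propagating bounds this way is what lets highly uncertain inputs flow through the optimization while the solution remains an interval that decision makers can interpret.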
Micromachined Thin-Film Sensors for SOI-CMOS Co-Integration
NASA Astrophysics Data System (ADS)
Laconte, Jean; Flandre, D.; Raskin, Jean-Pierre
Co-integration of sensors with their associated electronics on a single silicon chip may provide many significant benefits regarding performance, reliability, miniaturization and process simplicity without significantly increasing the total cost. Micromachined Thin-Film Sensors for SOI-CMOS Co-integration covers the challenges and benefits of this approach and demonstrates the successful co-integration of gas flow sensors on a dielectric membrane, with their associated electronics, in CMOS-SOI technology. We first investigate the extraction of residual stress in thin layers and in their stacking and the release, in post-processing, of a 1 μm-thick robust and flat dielectric multilayered membrane using a Tetramethyl Ammonium Hydroxide (TMAH) silicon micromachining solution.
Demonstration of an N7 integrated fab process for metal oxide EUV photoresist
NASA Astrophysics Data System (ADS)
De Simone, Danilo; Mao, Ming; Kocsis, Michael; De Schepper, Peter; Lazzarino, Frederic; Vandenberghe, Geert; Stowers, Jason; Meyers, Stephen; Clark, Benjamin L.; Grenville, Andrew; Luong, Vinh; Yamashita, Fumiko; Parnell, Doni
2016-03-01
Inpria has developed a directly patternable metal oxide hard-mask as a robust, high-resolution photoresist for EUV lithography. In this paper we demonstrate the full integration of a baseline Inpria resist into an imec N7 BEOL block mask process module. We examine in detail both the lithography and etch patterning results. By leveraging the high differential etch resistance of metal oxide photoresists, we explore opportunities for process simplification and cost reduction. We review the imaging results from the imec N7 block mask patterns and their process windows, as well as routes to maximize the process latitude, underlayer integration, etch transfer, cross sections, etch equipment integration from a cross-metal-contamination standpoint, and a selective resist strip process. Finally, initial results from a higher-sensitivity Inpria resist are also reported. A dose to size of 19 mJ/cm2 was achieved to print pillars as small as 21 nm.
Competence-Based Approach in Value Chain Processes
NASA Astrophysics Data System (ADS)
Azevedo, Rodrigo Cambiaghi; D'Amours, Sophie; Rönnqvist, Mikael
There is a gap between competence theory and value chain processes frameworks. While individually considered as core elements in contemporary management thinking, the integration of the two concepts is still lacking. We claim that this integration would allow for the development of more robust business models by structuring value chain activities around aspects such as capabilities and skills, as well as individual and organizational knowledge. In this context, the objective of this article is to reduce this gap and consequently open a field for further improvements of value chain processes frameworks.
MeDICi Software Superglue for Data Analysis Pipelines
Ian Gorton
2017-12-09
The Middleware for Data-Intensive Computing (MeDICi) Integration Framework is an integrated middleware platform developed to meet the data analysis and processing needs of scientists across many domains. MeDICi is scalable, easily modified, and robust across multiple languages, protocols, and hardware platforms, and is in use today by PNNL scientists for bioinformatics, power grid failure analysis, and text analysis.
An Intercompany Perspective on Biopharmaceutical Drug Product Robustness Studies.
Morar-Mitrica, Sorina; Adams, Monica L; Crotts, George; Wurth, Christine; Ihnat, Peter M; Tabish, Tanvir; Antochshuk, Valentyn; DiLuzio, Willow; Dix, Daniel B; Fernandez, Jason E; Gupta, Kapil; Fleming, Michael S; He, Bing; Kranz, James K; Liu, Dingjiang; Narasimhan, Chakravarthy; Routhier, Eric; Taylor, Katherine D; Truong, Nobel; Stokes, Elaine S E
2018-02-01
The Biophorum Development Group (BPDG) is an industry-wide consortium enabling networking and sharing of best practices for the development of biopharmaceuticals. To gain a better understanding of current industry approaches for establishing biopharmaceutical drug product (DP) robustness, the BPDG-Formulation Point Share group conducted an intercompany collaboration exercise, which included a benchmarking survey and extensive group discussions around the scope, design, and execution of robustness studies. The results of this industry collaboration revealed several key common themes: (1) overall DP robustness is defined by both the formulation and the manufacturing process robustness; (2) robustness integrates the principles of quality by design (QbD); (3) DP robustness is an important factor in setting critical quality attribute control strategies and commercial specifications; (4) most companies employ robustness studies, along with prior knowledge, risk assessments, and statistics, to develop the DP design space; (5) studies are tailored to commercial development needs and the practices of each company. Three case studies further illustrate how a robustness study design for a biopharmaceutical DP balances experimental complexity, statistical power, scientific understanding, and risk assessment to provide the desired product and process knowledge. The BPDG-Formulation Point Share discusses identified industry challenges with regard to biopharmaceutical DP robustness and presents some recommendations for best practices. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Parada, M.; Sbarbaro, D.; Borges, R. A.; Peres, P. L. D.
2017-01-01
The use of robust design techniques such as those based on the H2 and H∞ norms for tuning proportional-integral (PI) and proportional-integral-derivative (PID) controllers has been limited to a small set of processes. This work addresses the problem by considering a wide set of possible plants, both first- and second-order continuous-time systems with time delays and zeros, leading to PI and PID controllers. The use of structured uncertainties to handle neglected dynamics allows the range of processes considered to be expanded. The proposed approach takes into account the robustness of the controller with respect to these structured uncertainties by using the small-gain theorem. In addition, improved performance is sought through the minimisation of an upper bound on the closed-loop system H∞ norm. A Lyapunov-Krasovskii-type functional is used to obtain delay-dependent design conditions. The controller design is accomplished by means of a convex optimisation procedure formulated using linear matrix inequalities. In order to illustrate the flexibility of the approach, several examples considering recycle compensation, reduced-order controller design and a practical implementation are addressed. Numerical experiments are provided in each case to highlight the main characteristics of the proposed design method.
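The small-gain test invoked above asserts robust stability of an uncertainty loop when the loop gain of norms is below one. A crude sketch, estimating an H-infinity norm by frequency sampling for an illustrative first-order transfer function and an assumed uncertainty bound (not the LMI machinery of the paper):

```python
import math

def hinf_norm(mag, grid=None):
    # Crude H-infinity norm estimate: peak magnitude over a
    # logarithmic frequency grid from 1e-4 to 1e4 rad/s.
    if grid is None:
        grid = [10 ** (k / 50.0) for k in range(-200, 201)]
    return max(mag(w) for w in grid)

# Illustrative nominal loop M(s) = 1/(s + 1): |M(jw)| = 1/sqrt(w^2 + 1),
# which peaks at 1 as w -> 0.
gM = hinf_norm(lambda w: 1.0 / math.sqrt(w * w + 1.0))
delta_bound = 0.5          # assumed bound on the structured uncertainty
robust = gM * delta_bound < 1.0   # small-gain condition
```

The design procedure in the paper effectively pushes gM down (subject to performance constraints) so this product stays below one for the stated uncertainty class.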
A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification
Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; ...
2013-01-01
Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches: a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.
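The combination of a data-driven score with an expert-knowledge filter can be sketched in a few lines. The feature names, measurements, and the use of variance as a stand-in for the cluster-separation score are all hypothetical; this is not the ISIC pipeline itself:

```python
from statistics import pvariance

# Hypothetical marker table: feature -> measurements across samples.
data = {
    "prot_A": [1.0, 5.0, 1.2, 5.1],   # large spread: separates groups well
    "prot_B": [2.0, 2.1, 2.0, 2.05],  # flat: uninformative
    "prot_C": [0.5, 4.0, 0.4, 4.2],   # informative but not expert-curated
    "prot_D": [3.0, 3.1, 2.9, 3.0],
}
expert_panel = {"prot_A", "prot_B", "prot_D"}  # expert-curated candidates

# Data-driven score (variance here, standing in for cluster separation),
# intersected with the expert panel; keep the top-ranked candidates.
scores = {f: pvariance(v) for f, v in data.items()}
candidates = sorted(expert_panel, key=lambda f: scores[f], reverse=True)
selected = candidates[:2]
```

Note that prot_C scores highly but is excluded by the expert filter, while prot_B survives the filter but ranks last on the data: the final selection requires both kinds of support, which is the robustness argument made in the abstract.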
Joseph J. Bozell; Berenger Biannic; Diana Cedeno; Thomas Elder; Omid Hosseinaei; Lukas Delbeck; Jae-Woo Kim; C.J. O' Lenick; Timothy Young
2014-01-01
The concept of the integrated biorefinery is critical to developing a robust biorefining industry in the USA. Within this model, the biorefinery will produce fuel as a high-volume output addressing domestic energy needs, and biobased chemical products (high-value organics) as an output providing the necessary economic support for fuel production. This paper will...
Robust fusion-based processing for military polarimetric imaging systems
NASA Astrophysics Data System (ADS)
Hickman, Duncan L.; Smith, Moira I.; Kim, Kyung Su; Choi, Hyun-Jin
2017-05-01
Polarisation information within a scene can be exploited in military systems to give enhanced automatic target detection and recognition (ATD/R) performance. However, the performance gain achieved is highly dependent on factors such as the geometry, viewing conditions, and the surface finish of the target. Such performance sensitivities are highly undesirable in many tactical military systems, where operational conditions can vary significantly and rapidly during a mission. Within this paper, a range of processing architectures and fusion methods is considered in terms of their practical viability and operational robustness for systems requiring ATD/R. It is shown that polarisation information can give useful performance gains but that, to retain system robustness, polarimetric processing should be introduced so as not to compromise other discriminatory scene information in the spectral and spatial domains. The analysis concludes that polarimetric data can be effectively integrated with conventional intensity-based ATD/R either by adapting the ATD/R processing function based on the scene polarisation or by detection-level fusion. Both of these approaches avoid the introduction of processing bottlenecks and limit the impact of processing on system latency.
Robust Foot Clearance Estimation Based on the Integration of Foot-Mounted IMU Acceleration Data
Benoussaad, Mourad; Sijobert, Benoît; Mombaur, Katja; Azevedo Coste, Christine
2015-01-01
This paper introduces a method for the robust estimation of foot clearance during walking, using a single inertial measurement unit (IMU) placed on the subject’s foot. The proposed solution is based on double integration and drift cancellation of foot acceleration signals. The method is insensitive to misalignment of IMU axes with respect to foot axes. Details are provided regarding calibration and signal processing procedures. Experimental validation was performed on 10 healthy subjects under three walking conditions: normal, fast and with obstacles. Foot clearance estimation results were compared to measurements from an optical motion capture system. The mean error between them is significantly less than 15% under the various walking conditions. PMID:26703622
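The double-integration and drift-cancellation scheme can be sketched on synthetic data. The swing profile, sample rate, and constant accelerometer bias below are illustrative assumptions, and linear de-trending between two foot-flat instants stands in for the paper's full calibration and signal-processing procedures:

```python
import numpy as np

# Synthetic swing-phase data: a smooth vertical foot trajectory z(t), its
# exact acceleration, and a constant accelerometer bias standing in for drift.
fs, T, h = 1000.0, 0.5, 0.12          # sample rate [Hz], swing time [s], true clearance [m]
n = int(fs * T) + 1
t = np.linspace(0.0, T, n)
dt = t[1] - t[0]
z_true = h * np.sin(np.pi * t / T) ** 2
a_true = 2 * h * (np.pi / T) ** 2 * np.cos(2 * np.pi * t / T)
a_meas = a_true + 0.2                 # 0.2 m/s^2 bias

def cumtrapz(y, dt):
    """Cumulative trapezoidal integral starting at zero."""
    return np.concatenate(([0.0], np.cumsum((y[1:] + y[:-1]) / 2.0) * dt))

# Acceleration -> velocity, cancelling drift by forcing zero velocity at the
# foot-flat instants that bracket the swing (linear de-trending).
v = cumtrapz(a_meas, dt)
v -= (t / T) * v[-1]
# Velocity -> height, with the same zero-height boundary condition.
z = cumtrapz(v, dt)
z -= (t / T) * z[-1]

clearance = z.max()                   # estimated foot clearance [m]
```

Because a constant bias enters the first integral linearly, de-trending against the known zero-velocity boundary conditions removes it exactly here; real IMU noise and axis misalignment would leave a larger residual.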
Lee, Won Seok; Won, Sejeong; Park, Jeunghee; Lee, Jihye; Park, Inkyu
2012-06-07
Controlled alignment and mechanically robust bonding between nanowires (NWs) and electrodes are essential requirements for the reliable operation of functional NW-based electronic devices. In this work, we developed a novel process for the alignment and bonding of NWs to metal electrodes by using thermo-compressive transfer printing. In this process, bottom-up synthesized NWs were aligned in parallel by shear loading onto the intermediate substrate and then finally transferred onto the target substrate with low-melting-temperature metal electrodes. In particular, multi-layer (e.g. Cr/Au/In/Au and Cr/Cu/In/Au) metal electrodes are softened at low temperatures (below 100 °C) and facilitate submergence of the aligned NWs into the surface of the electrodes at a moderate pressure (∼5 bar). By using this thermo-compressive transfer printing process, robust electrical and mechanical contact between NWs and metal electrodes can be realized. This method is believed to be very useful for the large-area fabrication of NW-based electrical devices with improved mechanical robustness, reduced electrical contact resistance, and enhanced reliability.
Veneziano, Alessio; Meloro, Carlo; Irish, Joel D; Stringer, Chris; Profico, Antonio; De Groote, Isabelle
2018-05-08
Although the evolution of the hominin masticatory apparatus has been linked to diet and food processing, the physical connection between neurocranium and lower jaw suggests a role of encephalization in the trend of dental and mandibular reduction. Here, the hypothesis that tooth size and mandibular robusticity are influenced by morphological changes in the neurocranium was tested. Three-dimensional landmarks, alveolar lengths, and mandibular robusticity data were recorded on a sample of chimpanzee and human skulls. The morphological integration between the neurocranium and the lower jaw was analyzed by means of Singular Warps Analysis. Redundancy Analysis was performed to understand whether the pattern of neuromandibular integration affects tooth size and mandibular robusticity. There is significant morphological covariation between neurocranium and lower jaw in both chimpanzees and humans. In humans, changes in the temporal fossa seem to produce alterations of the relative orientation of jaw parts, while the influence of similar neurocranial changes in chimpanzees is more localized. In both species, postcanine alveolar lengths and mandibular robusticity are associated with shape changes of the temporal fossa. The results of this study support the hypothesis that the neurocranium is able to affect the evolution and development of the lower jaw, although most likely through functional integration of mandible, teeth, and muscles within the masticatory apparatus. This study highlights the relative influence of structural constraints and adaptive factors in the evolution of the human skull. © 2018 Wiley Periodicals, Inc.
Potentials for the use of tool-integrated in-line data acquisition systems in press shops
NASA Astrophysics Data System (ADS)
Maier, S.; Schmerbeck, T.; Liebig, A.; Kautz, T.; Volk, W.
2017-09-01
Robust in-line data acquisition systems are required for the realization of process monitoring and control systems in press shops. A promising approach is the integration of sensors in the following press tools. There they can be easily integrated and maintained, and the necessary robustness for the rough press environment is achieved. Such concepts have already been investigated for the measurement of geometrical accuracy as well as of the material flow of inner part areas. They enable the monitoring of each produced part's quality. An important success factor is a practical approach to using this new process information in press shops. This work presents various applications of these measuring concepts, based on real car body components of the BMW Group. For example, the procedure of retroactive error analysis is explained for a side frame. It is also shown how this data acquisition can be used for the optimization of drawing tools in tool shops. With the skid-line, there is a continuous value that can be monitored from planning to serial production.
Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel
NASA Astrophysics Data System (ADS)
Xie, Yanmin
2011-08-01
Nowadays, sheet metal forming process design is not a trivial task due to the complex issues to be taken into account (conflicting design goals, forming of complex shapes, and so on). Optimization methods have therefore been widely applied in sheet metal forming, and proper design methods to reduce time and costs have to be developed, mostly based on computer-aided procedures. At the same time, the existence of variations during manufacturing processes may significantly influence final product quality, rendering optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software LS-DYNA is used to simulate the complex sheet metal stamping processes. The Kriging metamodel is adopted to map the relation between input process parameters and part quality. The robust design model for the sheet metal forming process integrates adaptive importance sampling with the Kriging model, in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to indicate the direction in which additional training samples should be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. Final results indicate the feasibility of applying the proposed method to multi-response robust design.
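The Kriging (Gaussian-process) metamodel step can be sketched as follows. The one-dimensional "quality" function and the squared-exponential correlation with a fixed length scale are illustrative assumptions standing in for the paper's LS-DYNA responses and tuned correlation model:

```python
import numpy as np

def sq_exp(a, b, length=0.15):
    """Squared-exponential correlation between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def quality(x):
    """Hypothetical forming-quality response (stand-in for a simulation)."""
    return np.sin(6 * x) + 0.5 * x ** 2

x_train = np.linspace(0.0, 1.0, 9)     # small design of experiments
y_train = quality(x_train)

# Fit the interpolating Kriging model: solve K alpha = y once, predict with
# the cross-correlation vector.  A tiny nugget keeps K well conditioned.
K = sq_exp(x_train, x_train) + 1e-10 * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)

def predict(x_new):
    return sq_exp(np.atleast_1d(float(x_new)), x_train) @ alpha

x_test = 0.37
err = abs(predict(x_test)[0] - quality(x_test))
```

In the paper's scheme such a metamodel would be refitted as adaptive importance sampling adds training points where the current fit is poor, and robust process parameters would then be sought on the cheap surrogate rather than on the finite element model.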
Integrated Cognitive Architectures For Robust Decision Making
2010-09-20
groups differed significantly from the other three [W(5) > 5, p > 0.13, uncorrected]. Performance by Condition It is useful to look at the average...the research that pursues integrated theories of human cognition, two approaches have become particularly influential: ACT-R and Leabra. ACT-R...a wide range of tasks involving attention, learning, memory, problem solving, decision making, and language processing. Under the pressure of
An integrated biotechnology platform for developing sustainable chemical processes.
Barton, Nelson R; Burgard, Anthony P; Burk, Mark J; Crater, Jason S; Osterhout, Robin E; Pharkya, Priti; Steer, Brian A; Sun, Jun; Trawick, John D; Van Dien, Stephen J; Yang, Tae Hoon; Yim, Harry
2015-03-01
Genomatica has established an integrated computational/experimental metabolic engineering platform to design, create, and optimize novel high performance organisms and bioprocesses. Here we present our platform and its use to develop E. coli strains for production of the industrial chemical 1,4-butanediol (BDO) from sugars. A series of examples are given to demonstrate how a rational approach to strain engineering, including carefully designed diagnostic experiments, provided critical insights about pathway bottlenecks, byproducts, expression balancing, and commercial robustness, leading to a superior BDO production strain and process.
On the fragility of fractional-order PID controllers for FOPDT processes.
Padula, Fabrizio; Visioli, Antonio
2016-01-01
This paper analyzes the fragility of fractional-order proportional-integral-derivative controllers applied to integer-order first-order-plus-dead-time processes. In particular, the effects of variations of the controller parameters on the achieved control system robustness and performance are investigated. Results show that these controllers are more fragile than standard proportional-integral-derivative controllers, and therefore significant attention should be paid by the user to their tuning. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
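Fragility, i.e. sensitivity of closed-loop behavior to perturbations of the controller's own parameters, can be illustrated numerically. The sketch below uses an integer-order PI controller with the common SIMC tuning on a first-order-plus-dead-time loop (assumptions for simplicity; the paper's fractional-order controllers need dedicated solvers), then re-simulates with mistuned parameters and compares the integrated absolute error:

```python
# FOPDT plant  G(s) = K e^{-Ls} / (tau s + 1)  under PI control,
# simulated with forward Euler and a delay buffer.
K, tau, L = 1.0, 5.0, 1.0
dt, t_end = 0.01, 60.0
n_steps = int(t_end / dt)
delay = int(L / dt)

def simulate(kc, ti, setpoint=1.0):
    y, integ, iae = 0.0, 0.0, 0.0
    u_buf = [0.0] * delay                 # dead-time buffer
    for _ in range(n_steps):
        e = setpoint - y
        integ += e * dt
        u = kc * (e + integ / ti)         # PI control law
        u_buf.append(u)
        u_del = u_buf.pop(0)              # delayed input reaches the plant
        y += dt * (-y + K * u_del) / tau  # first-order lag
        iae += abs(e) * dt
    return y, iae

kc0, ti0 = tau / (K * 2 * L), tau         # nominal SIMC tuning (tau_c = L)
y_nom, iae_nom = simulate(kc0, ti0)
y_pert, iae_pert = simulate(1.3 * kc0, 0.7 * ti0)   # mistuned controller
```

Comparing `iae_pert` against `iae_nom`, and the shape of the two transients, gives a crude fragility measure: the more the performance indices move under modest tuning perturbations, the more fragile the design.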
Monitoring of laser material processing using machine integrated low-coherence interferometry
NASA Astrophysics Data System (ADS)
Kunze, Rouwen; König, Niels; Schmitt, Robert
2017-06-01
Laser material processing has become an indispensable tool in modern production. With the availability of high power pico- and femtosecond laser sources, laser material processing is advancing into applications which demand the highest accuracies, such as laser micro milling or laser drilling. In order to enable narrow tolerance windows, closed-loop monitoring of the geometrical properties of the processed work piece is essential for achieving a robust manufacturing process. Low-coherence interferometry (LCI) is a high-precision measuring principle well known from surface metrology. In recent years, we demonstrated successful integrations of LCI into several different laser material processing methods. Within this paper, we give an overview of the different machine integration strategies, which always aim at a complete and ideally telecentric integration of the measurement device into the existing beam path of the processing laser. Thus, highly accurate depth measurements within machine coordinates and subsequent process control and quality assurance are possible. The first products using this principle have already found their way to the market, which underlines the potential of this technology for the monitoring of laser material processing.
He, Fei; Fromion, Vincent; Westerhoff, Hans V
2013-11-21
Metabolic control analysis (MCA) and supply-demand theory have led to appreciable understanding of the systems properties of metabolic networks that are subject exclusively to metabolic regulation. Supply-demand theory has not yet considered gene-expression regulation explicitly whilst a variant of MCA, i.e. Hierarchical Control Analysis (HCA), has done so. Existing analyses based on control engineering approaches have not been very explicit about whether metabolic or gene-expression regulation would be involved, but designed different ways in which regulation could be organized, with the potential of causing adaptation to be perfect. This study integrates control engineering and classical MCA augmented with supply-demand theory and HCA. Because gene-expression regulation involves time integration, it is identified as a natural instantiation of the 'integral control' (or near integral control) known in control engineering. This study then focuses on robustness against and adaptation to perturbations of process activities in the network, which could result from environmental perturbations, mutations or slow noise. It is shown however that this type of 'integral control' should rarely be expected to lead to the 'perfect adaptation': although the gene-expression regulation increases the robustness of important metabolite concentrations, it rarely makes them infinitely robust. For perfect adaptation to occur, the protein degradation reactions should be zero order in the concentration of the protein, which may be rare biologically for cells growing steadily. A proposed new framework integrating the methodologies of control engineering and metabolic and hierarchical control analysis, improves the understanding of biological systems that are regulated both metabolically and by gene expression. 
In particular, the new approach enables one to address the issue whether the intracellular biochemical networks that have been and are being identified by genomics and systems biology, correspond to the 'perfect' regulatory structures designed by control engineering vis-à-vis optimal functions such as robustness. To the extent that they are not, the analyses suggest how they may become so and this in turn should facilitate synthetic biology and metabolic engineering.
2013-01-01
Background Metabolic control analysis (MCA) and supply–demand theory have led to appreciable understanding of the systems properties of metabolic networks that are subject exclusively to metabolic regulation. Supply–demand theory has not yet considered gene-expression regulation explicitly whilst a variant of MCA, i.e. Hierarchical Control Analysis (HCA), has done so. Existing analyses based on control engineering approaches have not been very explicit about whether metabolic or gene-expression regulation would be involved, but designed different ways in which regulation could be organized, with the potential of causing adaptation to be perfect. Results This study integrates control engineering and classical MCA augmented with supply–demand theory and HCA. Because gene-expression regulation involves time integration, it is identified as a natural instantiation of the ‘integral control’ (or near integral control) known in control engineering. This study then focuses on robustness against and adaptation to perturbations of process activities in the network, which could result from environmental perturbations, mutations or slow noise. It is shown however that this type of ‘integral control’ should rarely be expected to lead to the ‘perfect adaptation’: although the gene-expression regulation increases the robustness of important metabolite concentrations, it rarely makes them infinitely robust. For perfect adaptation to occur, the protein degradation reactions should be zero order in the concentration of the protein, which may be rare biologically for cells growing steadily. Conclusions A proposed new framework integrating the methodologies of control engineering and metabolic and hierarchical control analysis, improves the understanding of biological systems that are regulated both metabolically and by gene expression. 
In particular, the new approach enables one to address the issue whether the intracellular biochemical networks that have been and are being identified by genomics and systems biology, correspond to the ‘perfect’ regulatory structures designed by control engineering vis-à-vis optimal functions such as robustness. To the extent that they are not, the analyses suggest how they may become so and this in turn should facilitate synthetic biology and metabolic engineering. PMID:24261908
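The adaptation result above can be reproduced in a toy supply-demand model: gene expression acts as an integral controller on the metabolite, and perfect adaptation to a demand perturbation survives only when protein degradation is zero order in the protein concentration. All rate constants below are illustrative assumptions, not fitted values:

```python
# Toy supply-demand module: metabolite x is produced by enzyme e and consumed
# with demand delta; enzyme synthesis is repressed by x, so gene expression
# effectively integrates the deviation of x from its set point.
def run(zero_order, a=0.15, b=0.05, kd=0.1, dt=0.01, t_end=400.0):
    x, e = 1.0, 1.0                            # common pre-perturbation steady state
    d0 = a - b * 1.0                           # zero-order degradation rate
    for i in range(int(t_end / dt)):
        delta = 1.0 if i * dt < 50.0 else 1.5  # step perturbation of demand
        deg = d0 if zero_order else kd * e     # protein degradation law
        x += dt * (e - delta * x)              # metabolite balance
        e += dt * ((a - b * x) - deg)          # enzyme balance (synthesis repressed by x)
    return x

x_zero = run(zero_order=True)    # zero-order degradation: x returns exactly to 1
x_first = run(zero_order=False)  # first-order degradation: steady-state offset
```

With zero-order degradation the enzyme balance vanishes only when synthesis equals the fixed degradation rate, which pins x at its set point regardless of demand; with first-order degradation the steady state couples x to the enzyme level and an offset remains, matching the abstract's conclusion.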
Liu, Jian; Cheng, Yuhu; Wang, Xuesong; Zhang, Lin; Liu, Hui
2017-08-17
Diagnosing colorectal cancer at an early stage is urgent. Some feature genes that are important to colorectal cancer development have been identified. However, for the early stage of colorectal cancer, less is known about the identity of specific cancer genes that are associated with advanced clinical stage. In this paper, we conducted a feature extraction method named the Optimal Mean based Block Robust Feature Extraction method (OMBRFE) to identify feature genes associated with advanced colorectal cancer in clinical stage by using integrated colorectal cancer data. Firstly, based on the optimal mean and the L2,1-norm, a novel feature extraction method called the Optimal Mean based Robust Feature Extraction method (OMRFE) is proposed to identify feature genes. Then the OMBRFE method, which introduces the block ideology into the OMRFE method, is put forward to process the integrated colorectal cancer data, which includes multiple genomic data types: copy number alterations, somatic mutations, methylation expression alterations, and gene expression changes. Experimental results demonstrate that OMBRFE is more effective than previous methods in identifying the feature genes. Moreover, genes identified by OMBRFE are verified to be closely associated with advanced colorectal cancer in clinical stage.
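The row-sparsity idea behind the L2,1-norm can be sketched as follows. This is not the OMBRFE optimisation itself (which minimises an optimal-mean L2,1 objective) but a simple ridge-based stand-in that scores features by the l2 norms of the rows of a projection matrix, on synthetic data with two planted informative features:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 20
X = rng.standard_normal((n, d))
labels = (X[:, 0] + X[:, 1] > 0).astype(int)   # features 0 and 1 carry the signal
Y = np.eye(2)[labels]                          # one-hot class targets

# Ridge-regularised multi-target projection W: X W ~ Y.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)   # shape (d, 2)

# L2,1 norm of W = sum of the l2 norms of its rows; a row's norm measures
# how strongly that feature contributes across all targets.
row_norms = np.linalg.norm(W, axis=1)
l21_norm = row_norms.sum()
top2 = set(np.argsort(row_norms)[-2:])         # highest-scoring features
```

In L2,1-regularised methods the same row norms are driven toward zero for uninformative features, producing row-sparse solutions whose surviving rows are the selected feature genes.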
Liu, Tao; Gao, Furong
2011-04-01
In view of the deficiencies in existing internal model control (IMC)-based methods for load disturbance rejection for integrating and unstable processes with slow dynamics, a modified IMC-based controller design is proposed to deal with step- or ramp-type load disturbance that is often encountered in engineering practices. By classifying the ways through which such load disturbance enters into the process, analytical controller formulae are correspondingly developed, based on a two-degree-of-freedom (2DOF) control structure that allows for separate optimization of load disturbance rejection from setpoint tracking. An obvious merit is that there is only a single adjustable parameter in the proposed controller, which in essence corresponds to the time constant of the closed-loop transfer function for load disturbance rejection, and can be monotonically tuned to meet a good trade-off between disturbance rejection performance and closed-loop robust stability. At the same time, robust tuning constraints are given to accommodate process uncertainties in practice. Illustrative examples from the recent literature are used to show effectiveness and merits of the proposed method for different cases of load disturbance. Copyright © 2010. Published by Elsevier Ltd.
Robust parameter design for automatically controlled systems and nanostructure synthesis
NASA Astrophysics Data System (ADS)
Dasgupta, Tirthankar
2007-12-01
This research focuses on developing comprehensive frameworks for robust parameter design methodology for dynamic systems with automatic control and for the synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa, and therefore an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered, and the best proportional-integral and minimum mean squared error control strategies are developed by using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large-scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, in large quantity and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of process variables around set values, are derived from the fitted models using Monte Carlo simulations. 
The second part of the research deals with the development of an experimental design methodology tailor-made to address the unique phenomena associated with nanostructure synthesis. A sequential space-filling design, called Sequential Minimum Energy Design (SMED), is proposed for exploring the best process conditions for the synthesis of nanowires. SMED is a novel approach to generating sequential designs that are model independent, can quickly "carve out" regions with no observable nanostructure morphology, and allow for the exploration of complex response surfaces.
NASA Astrophysics Data System (ADS)
Ezz-Eldien, S. S.; Doha, E. H.; Bhrawy, A. H.; El-Kalaawy, A. A.; Machado, J. A. T.
2018-04-01
In this paper, we propose a new accurate and robust numerical technique to approximate the solutions of fractional variational problems (FVPs) depending on indefinite integrals with a type of fixed Riemann-Liouville fractional integral. The proposed technique is based on the shifted Chebyshev polynomials as basis functions for the fractional integral operational matrix (FIOM). Together with the Lagrange multiplier method, these problems are then reduced to a system of algebraic equations, which greatly simplifies the solution process. Numerical examples are carried out to confirm the accuracy, efficiency and applicability of the proposed algorithm.
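As a sanity check of the operator that the operational matrix represents, the Riemann-Liouville fractional integral can be evaluated by brute-force midpoint quadrature and compared with a known closed form. This is direct quadrature, not the shifted-Chebyshev spectral method of the paper:

```python
import numpy as np
from math import gamma

# Riemann-Liouville fractional integral of order a:
#   I^a f(t) = (1/Gamma(a)) * integral_0^t (t - s)^(a-1) f(s) ds
# Midpoints avoid the integrable endpoint singularity at s = t.
def rl_integral(f, t, a, n=100_000):
    h = t / n
    s = (np.arange(n) + 0.5) * h
    return h * np.sum((t - s) ** (a - 1) * f(s)) / gamma(a)

# Closed form for f(s) = s:  I^a t = t^(1+a) / Gamma(2+a)
t, a = 1.0, 0.5
numeric = rl_integral(lambda s: s, t, a)
exact = t ** (1 + a) / gamma(2 + a)
rel_err = abs(numeric - exact) / exact
```

The operational-matrix approach replaces this quadrature by a precomputed matrix acting on Chebyshev coefficients, which is what makes the reduction to algebraic equations possible.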
Design optimization for cost and quality: The robust design approach
NASA Technical Reports Server (NTRS)
Unal, Resit
1990-01-01
Designing reliable, low cost, and operable space systems has become the key to future space operations. Designing high quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach for design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to the variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The robust design methodology uses a mathematical tool called an orthogonal array, from design of experiments theory, to study a large number of decision variables with a relatively small number of experiments. Robust design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of the space system design process.
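The orthogonal-array and signal-to-noise machinery can be sketched numerically. The L4(2^3) array and the larger-the-better S/N formula are standard Taguchi tools, while the response model and replicate noise values below are invented for illustration:

```python
import numpy as np

# L4(2^3) orthogonal array: 4 runs, 3 two-level factors A, B, C.
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

def response(a_lvl, b_lvl, c_lvl, noise):
    # Hypothetical performance: factor A helps strongly, B hurts slightly,
    # C is inert; `noise` models replicate-to-replicate variation.
    return 10.0 + 5.0 * a_lvl - 1.0 * b_lvl + noise

reps = np.array([-0.2, 0.0, 0.2])          # replicate noise conditions
Y = np.array([[response(*run, nz) for nz in reps] for run in L4])

# Larger-the-better signal-to-noise ratio per run: -10 log10(mean(1/y^2)).
sn = -10.0 * np.log10(np.mean(1.0 / Y ** 2, axis=1))

# Main effect of each factor on S/N: pick the level with the higher mean S/N.
best_level = [int(sn[L4[:, j] == 1].mean() > sn[L4[:, j] == 0].mean())
              for j in range(3)]
```

Averaging S/N over the runs at each factor level is what lets four experiments rank the levels of all three factors at once; the recovered `best_level` correctly sets A high and B low for this toy response.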
de Carvalho, Júlio Cesar; Borghetti, Ivo Alberto; Cartas, Liliana Carrilo; Woiciechowski, Adenise Lorenci; Soccol, Vanete Thomaz; Soccol, Carlos Ricardo
2018-01-01
Cassava, the 5th most important staple crop, generates at least 600 L of wastewater per ton of processed root. This residue, cassava processing wastewater (CPW), has a high chemical oxygen demand, which can reach 56 g/L, and also has high concentrations of several mineral nutrients. The cultivation of microalgae such as Chlorella, Spirulina and wild strains has been evaluated in recent years in raw, minimally processed and partially digested CPW. Concentrations of 2-4 g/L of these microalgae, comparable to those obtained in synthetic media, could be reached. The BOD of the residue was reduced by up to 92%. This process can be integrated into cassava processing industries if challenges such as the toxicity of the concentrated residue, bacterial contamination, and the isolation of robust strains are addressed. Because CPW carries about 11% of the crop energy, integrating biogas production and microalgal cultivation into the cassava processing chain is promising. Copyright © 2017 Elsevier Ltd. All rights reserved.
Robust Vocabulary Instruction in a Readers' Workshop
ERIC Educational Resources Information Center
Feezell, Greg
2012-01-01
This article presents strategies for integrating explicit vocabulary instruction within a reading workshop. The author begins by describing a process for involving students in word selection. The author then provides a weeklong instructional sequence using student-selected words. Finally, the author briefly examines the role of vocabulary…
Neural integrators for decision making: a favorable tradeoff between robustness and sensitivity
Cain, Nicholas; Barreiro, Andrea K.; Shadlen, Michael
2013-01-01
A key step in many perceptual decision tasks is the integration of sensory inputs over time, but fundamental questions remain about how this is accomplished in neural circuits. One possibility is to balance decay modes of membranes and synapses with recurrent excitation. To allow integration over long timescales, however, this balance must be exceedingly precise. The need for fine tuning can be overcome via a "robust integrator" mechanism in which momentary inputs must be above a preset limit to be registered by the circuit. The degree of this limiting embodies a tradeoff between sensitivity to the input stream and robustness against parameter mistuning. Here, we analyze the consequences of this tradeoff for decision-making performance. For concreteness, we focus on the well-studied random dot motion discrimination task and constrain stimulus parameters by experimental data. We show that mistuning feedback in an integrator circuit decreases decision performance but that the robust integrator mechanism can limit this loss. Intriguingly, even for perfectly tuned circuits with no immediate need for a robustness mechanism, including one often does not impose a substantial penalty for decision-making performance. The implication is that robust integrators may be well suited to subserve the basic function of evidence integration in many cognitive tasks. We develop these ideas using simulations of coupled neural units and the mathematics of sequential analysis. PMID:23446688
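The robust-integrator tradeoff can be illustrated with a small accumulation-to-bound simulation. The thresholded registration rule and all parameter values below are illustrative assumptions rather than the authors' fitted circuit model:

```python
import numpy as np

rng = np.random.default_rng(1)

def accuracy(theta=0.0, drift=0.1, bound=10.0, n_trials=1000, n_steps=1500):
    """Fraction of trials decided in the direction of the positive drift."""
    correct = 0
    for _ in range(n_trials):
        x = 0.0
        for inc in rng.normal(drift, 1.0, n_steps):
            if abs(inc) >= theta:   # robust integrator: register only
                x += inc            # momentary inputs above the limit theta
            if abs(x) >= bound:
                break               # decision bound reached
        correct += x > 0            # drift > 0, so positive is correct
    return correct / n_trials

acc_tuned = accuracy(theta=0.0)     # perfectly tuned, fully sensitive integrator
acc_robust = accuracy(theta=0.5)    # thresholded "robust integrator"
```

In this toy setting the thresholded integrator discards roughly a third of the momentary samples yet loses little accuracy, consistent with the abstract's observation that including a robustness mechanism often imposes no substantial performance penalty.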
Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N
2017-03-01
High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.
2016-01-01
High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692
Arciszewski, Tim J; Munkittrick, Kelly R; Scrimgeour, Garry J; Dubé, Monique G; Wrona, Fred J; Hazewinkel, Rod R
2017-09-01
The primary goals of environmental monitoring are to indicate whether unexpected changes related to development are occurring in the physical, chemical, and biological attributes of ecosystems and to inform meaningful management intervention. Although achieving these objectives is conceptually simple, varying scientific and social challenges often result in their breakdown. Conceptualizing, designing, and operating programs that better delineate monitoring, management, and risk assessment processes supported by hypothesis-driven approaches, strong inference, and adverse outcome pathways can overcome many of the challenges. Generally, a robust monitoring program is characterized by hypothesis-driven questions associated with potential adverse outcomes and feedback loops informed by data. Specifically, key and basic features are predictions of future observations (triggers) and mechanisms to respond to success or failure of those predictions (tiers). The adaptive processes accelerate or decelerate the effort to highlight and overcome ignorance while preventing the potentially unnecessary escalation of unguided monitoring and management. The deployment of the mutually reinforcing components can allow for more meaningful and actionable monitoring programs that better associate activities with consequences. Integr Environ Assess Manag 2017;13:877-891. © 2017 The Authors. Integrated Environmental Assessment and Management Published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
Equality in Education: An Equality of Condition Perspective
ERIC Educational Resources Information Center
Lynch, Kathleen; Baker, John
2005-01-01
Transforming schools into truly egalitarian institutions requires a holistic and integrated approach. Using a robust conception of "equality of condition", we examine key dimensions of equality that are central to both the purposes and processes of education: equality in educational and related resources; equality of respect and recognition;…
Deployment Process, Mechanization, and Testing for the Mars Exploration Rovers
NASA Technical Reports Server (NTRS)
Iskenderian, Ted
2004-01-01
NASA's Mars Exploration Rover (MER) robotic prospectors were produced in an environment of unusually challenging schedule, volume, and mass restrictions. The technical challenges pushed the system's design towards extensive integration of function, which resulted in complex system engineering issues. One example of the system's integrated complexity can be found in the deployment process for the rover. Part of this process, rover "standup", is outlined in this paper. Particular attention is given to the Rover Lift Mechanism's (RLM) role and its design. Analysis methods are presented and compared to test results. It is shown that because prudent design principles were followed, a robust mechanism was created that minimized the duration of integration and test, and enabled recovery without perturbing related systems when reasonably foreseeable problems did occur. Examples of avoidable, unnecessary difficulty are also presented.
Optimized Sleeping Beauty transposons rapidly generate stable transgenic cell lines.
Kowarz, Eric; Löscher, Denise; Marschalek, Rolf
2015-04-01
Stable gene expression in mammalian cells is a prerequisite for many in vitro and in vivo experiments. However, both the integration of plasmids into mammalian genomes and the use of retro-/lentiviral systems have intrinsic limitations. The use of transposable elements, e.g. the Sleeping Beauty (SB) system, circumvents most of these drawbacks (integration sites, size limitations) and allows the quick generation of stable cell lines. The integration process of SB is catalyzed by a transposase, and the handling of this gene transfer system is easy, fast and safe. Here, we report our improvements to the existing SB vector system and present two new vector types for robust constitutive or inducible expression of any gene of interest. Both types are available in 16 variants with different selection markers (puromycin, hygromycin, blasticidin, neomycin) and fluorescent protein expression (GFP, RFP, BFP) to fit most experimental requirements. With this system it is possible to generate cell lines from stably transfected cells quickly and reliably in a medium-throughput setting (three to five days). Cell lines robustly express any gene of interest, either constitutively or tightly regulated by doxycycline, speeding up the generation of data in many laboratory experiments. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Plastic masters-rigid templates for soft lithography.
Desai, Salil P; Freeman, Dennis M; Voldman, Joel
2009-06-07
We demonstrate a simple process for the fabrication of rigid plastic master molds for soft lithography directly from poly(dimethylsiloxane) devices. Plastic masters (PMs) provide a cost-effective alternative to silicon-based masters and can be easily replicated without the need for cleanroom facilities. We have successfully demonstrated the use of plastic micromolding to generate both single- and dual-layer plastic structures, and have characterized the fidelity of the molding process. Using the PM fabrication technique, world-to-chip connections can be integrated directly into the master, enabling devices with robust, well-aligned fluidic ports directly after molding. PMs provide an easy technique for the fabrication of microfluidic devices and a simple route for scaling up the fabrication of robust masters for soft lithography.
Vectorial mask optimization methods for robust optical lithography
NASA Astrophysics Data System (ADS)
Ma, Xu; Li, Yanqiu; Guo, Xuejia; Dong, Lisong; Arce, Gonzalo R.
2012-10-01
Continuous shrinkage of critical dimension in an integrated circuit impels the development of resolution enhancement techniques for low k1 lithography. Recently, several pixelated optical proximity correction (OPC) and phase-shifting mask (PSM) approaches were developed under scalar imaging models to account for the process variations. However, the lithography systems with larger-NA (NA>0.6) are predominant for current technology nodes, rendering the scalar models inadequate to describe the vector nature of the electromagnetic field that propagates through the optical lithography system. In addition, OPC and PSM algorithms based on scalar models can compensate for wavefront aberrations, but are incapable of mitigating polarization aberrations in practical lithography systems, which can only be dealt with under the vector model. To this end, we focus on developing robust pixelated gradient-based OPC and PSM optimization algorithms aimed at canceling defocus, dose variation, wavefront and polarization aberrations under a vector model. First, an integrative and analytic vector imaging model is applied to formulate the optimization problem, where the effects of process variations are explicitly incorporated in the optimization framework. A steepest descent algorithm is then used to iteratively optimize the mask patterns. Simulations show that the proposed algorithms can effectively improve the process windows of the optical lithography systems.
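The steepest-descent loop described above can be illustrated with a deliberately simplified scalar 1-D toy (not the paper's vector imaging model): the aerial image is approximated by a Gaussian blur of the mask, the resist by a sigmoid threshold, and gray-scale mask pixels are updated by finite-difference gradient descent toward a target pattern. All constants are illustrative.

```python
import math

# Toy 1-D gradient-based mask optimization: Gaussian blur stands in for the
# projection optics, a sigmoid for the resist, steepest descent for the OPC
# update. Grid size, blur width, resist steepness, and step size are made up.
N, SIGMA, STEEP, THRESH, LR = 32, 2.0, 10.0, 0.5, 0.5

# Circular Gaussian kernel, normalized to unit sum.
kernel = [math.exp(-(min(d, N - d) ** 2) / (2 * SIGMA ** 2)) for d in range(N)]
ksum = sum(kernel)
kernel = [k / ksum for k in kernel]

def resist_image(mask):
    """Blur the mask (optics), then apply a sigmoid resist threshold."""
    blurred = [sum(kernel[(i - j) % N] * mask[j] for j in range(N))
               for i in range(N)]
    return [1.0 / (1.0 + math.exp(-STEEP * (b - THRESH))) for b in blurred]

def cost(mask, target):
    """Squared error between the printed pattern and the target."""
    return sum((r - t) ** 2 for r, t in zip(resist_image(mask), target))

def optimize(target, iters=15, eps=1e-4):
    """Steepest descent on gray-scale mask pixels, clipped to [0, 1]."""
    mask = [0.5] * N  # start from a uniform gray mask
    for _ in range(iters):
        grad = []
        for j in range(N):  # central finite-difference gradient, per pixel
            mask[j] += eps
            up = cost(mask, target)
            mask[j] -= 2 * eps
            down = cost(mask, target)
            mask[j] += eps
            grad.append((up - down) / (2 * eps))
        mask = [min(1.0, max(0.0, m - LR * g)) for m, g in zip(mask, grad)]
    return mask
```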
Zhang, Bitao; Pi, YouGuo
2013-07-01
The traditional integer-order proportional-integral-derivative (IO-PID) controller is sensitive to parameter variation and/or external load disturbance of the permanent magnet synchronous motor (PMSM). The fractional-order proportional-integral-derivative (FO-PID) control scheme based on a robustness tuning method has been proposed to enhance robustness, but that robustness addresses only open-loop gain variation of the controlled plant. In this paper, an enhanced robust fractional-order proportional-plus-integral (ERFOPI) controller based on a neural network is proposed. The control law of the ERFOPI controller acts on a fractional-order implement function (FOIF) of the tracking error rather than on the tracking error directly, which, according to theoretical analysis, can enhance the robust performance of the system. Tuning rules and approaches, based on phase margin, crossover frequency specification, and robustness to gain variation, are introduced to obtain the parameters of the ERFOPI controller, and a neural network algorithm is used to adjust the parameters of the FOIF. Simulation and experimental results show that the proposed method not only achieves favorable tracking performance but is also robust with regard to external load disturbance and parameter variation. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
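A fractional-order term such as the FOIF is typically realized with a discrete approximation of the fractional derivative; one common choice is the Grünwald-Letnikov (GL) expansion. The sketch below is a generic GL approximation, checked against the closed-form half-derivative of f(t) = t, not the paper's ERFOPI implementation; step size and test point are illustrative.

```python
import math

# Grünwald-Letnikov approximation of the fractional derivative D^alpha f(t):
#   D^alpha f(t) ~ h^(-alpha) * sum_k w_k * f(t - k*h),
# with weights w_0 = 1 and w_{k+1} = w_k * (1 - (alpha + 1)/(k + 1)).
def gl_derivative(f, t, alpha, h=1e-3):
    """Approximate D^alpha f at time t with step h (GL definition)."""
    n = int(round(t / h))
    acc, w = 0.0, 1.0  # w carries the recursive GL binomial weight
    for k in range(n + 1):
        acc += w * f(t - k * h)
        w *= 1.0 - (alpha + 1.0) / (k + 1.0)
    return acc / h ** alpha

# The half-derivative of f(t) = t has the closed form 2*sqrt(t/pi).
approx = gl_derivative(lambda t: t, 1.0, 0.5)
exact = 2.0 * math.sqrt(1.0 / math.pi)
```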
Growth Control and Disease Mechanisms in Computational Embryogeny
NASA Technical Reports Server (NTRS)
Shapiro, Andrew A.; Yogev, Or; Antonsson, Erik K.
2008-01-01
This paper presents a novel approach to applying growth control and disease mechanisms in computational embryogeny. Our method, which mimics fundamental processes from biology, enables individuals to reach maturity in a controlled process within a stochastic environment. Three different mechanisms were implemented: disease mechanisms, gene suppression, and thermodynamic balancing. This approach was integrated as part of a structural evolutionary model. The model evolved continuum 3-D structures which support an external load. By using these mechanisms we were able to evolve individuals that reached a fixed size limit through the growth process. The growth process was an integral part of the complete development process. The size of the individuals was determined purely by the evolutionary process, where different individuals matured to different sizes. Individuals which evolved with these characteristics have been found to be very robust for supporting a wide range of external loads.
A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.
2013-10-01
Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches: a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.
A Neurocomputational Model of Goal-Directed Navigation in Insect-Inspired Artificial Agents
Goldschmidt, Dennis; Manoonpong, Poramate; Dasgupta, Sakyasingha
2017-01-01
Despite their small size, insect brains are able to produce robust and efficient navigation in complex environments. Specifically in social insects, such as ants and bees, these navigational capabilities are guided by orientation directing vectors generated by a process called path integration. During this process, they integrate compass and odometric cues to estimate their current location as a vector, called the home vector for guiding them back home on a straight path. They further acquire and retrieve path integration-based vector memories globally to the nest or based on visual landmarks. Although existing computational models reproduced similar behaviors, a neurocomputational model of vector navigation including the acquisition of vector representations has not been described before. Here we present a model of neural mechanisms in a modular closed-loop control—enabling vector navigation in artificial agents. The model consists of a path integration mechanism, reward-modulated global learning, random search, and action selection. The path integration mechanism integrates compass and odometric cues to compute a vectorial representation of the agent's current location as neural activity patterns in circular arrays. A reward-modulated learning rule enables the acquisition of vector memories by associating the local food reward with the path integration state. A motor output is computed based on the combination of vector memories and random exploration. In simulation, we show that the neural mechanisms enable robust homing and localization, even in the presence of external sensory noise. The proposed learning rules lead to goal-directed navigation and route formation performed under realistic conditions. Consequently, we provide a novel approach for vector learning and navigation in a simulated, situated agent linking behavioral observations to their possible underlying neural substrates. PMID:28446872
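The core path-integration idea, accumulating cosine-tuned compass-and-odometry input in a circular array of cells and decoding the home vector by population-vector averaging, can be sketched minimally as follows. The cell count and tuning are illustrative, not the model's actual parameters.

```python
import math

# Minimal path-integration sketch: a circular array of heading-tuned cells
# accumulates cosine-tuned odometric input; the home vector is decoded by
# population-vector averaging. N_CELLS and the tuning curve are illustrative.
N_CELLS = 18
PREFS = [2.0 * math.pi * i / N_CELLS for i in range(N_CELLS)]  # preferred headings

def integrate_path(steps):
    """steps: list of (heading_rad, distance). Returns accumulated activities."""
    activity = [0.0] * N_CELLS
    for heading, dist in steps:
        for i, pref in enumerate(PREFS):
            # Compass cue (heading) gated by odometric cue (distance).
            activity[i] += dist * math.cos(heading - pref)
    return activity

def home_vector(activity):
    """Population-vector decode; homing reverses the integrated outbound vector."""
    x = sum(a * math.cos(p) for a, p in zip(activity, PREFS)) * 2.0 / N_CELLS
    y = sum(a * math.sin(p) for a, p in zip(activity, PREFS)) * 2.0 / N_CELLS
    return -x, -y
```

With equally spaced preferred headings the decode is exact: an outbound walk of 3 units east then 4 units north yields a home vector of (-3, -4), length 5.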
Engineering Elegant Systems: Postulates, Principles, and Hypotheses of Systems Engineering
NASA Technical Reports Server (NTRS)
Watson, Michael D.
2018-01-01
Definition: System Engineering is the engineering discipline which integrates the system functions, system environment, and the engineering disciplines necessary to produce and/or operate an elegant system; Elegant System - A system that is robust in application, fully meeting specified and adumbrated intent, is well structured, and is graceful in operation. Primary Focus: System Design and Integration: Identify system couplings and interactions; Identify system uncertainties and sensitivities; Identify emergent properties; Manage the effectiveness of the system. Engineering Discipline Integration: Manage flow of information for system development and/or operations; Maintain system activities within budget and schedule. Supporting Activities: Process application and execution.
NASA Astrophysics Data System (ADS)
Dos Santos Ferreira, Olavio; Sadat Gousheh, Reza; Visser, Bart; Lie, Kenrick; Teuwen, Rachel; Izikson, Pavel; Grzela, Grzegorz; Mokaberi, Babak; Zhou, Steve; Smith, Justin; Husain, Danish; Mandoy, Ram S.; Olvera, Raul
2018-03-01
Ever increasing need for tighter on-product overlay (OPO), as well as enhanced accuracy in overlay metrology and methodology, is driving semiconductor industry's technologists to innovate new approaches to OPO measurements. In case of High Volume Manufacturing (HVM) fabs, it is often critical to strive for both accuracy and robustness. Robustness, in particular, can be challenging in metrology since overlay targets can be impacted by proximity of other structures next to the overlay target (asymmetric effects), as well as symmetric stack changes such as photoresist height variations. Both symmetric and asymmetric contributors have impact on robustness. Furthermore, tweaking or optimizing wafer processing parameters for maximum yield may have an adverse effect on physical target integrity. As a result, measuring and monitoring physical changes or process abnormalities/artefacts in terms of new Key Performance Indicators (KPIs) is crucial for the end goal of minimizing true in-die overlay of the integrated circuits (ICs). IC manufacturing fabs often relied on CD-SEM in the past to capture true in-die overlay. Due to destructive and intrusive nature of CD-SEMs on certain materials, it's desirable to characterize asymmetry effects for overlay targets via inline KPIs utilizing YieldStar (YS) metrology tools. These KPIs can also be integrated as part of (μDBO) target evaluation and selection for final recipe flow. In this publication, the Holistic Metrology Qualification (HMQ) flow was extended to account for process induced (asymmetric) effects such as Grating Imbalance (GI) and Bottom Grating Asymmetry (BGA). Local GI typically contributes to the intrafield OPO whereas BGA typically impacts the interfield OPO, predominantly at the wafer edge. Stack height variations highly impact overlay metrology accuracy, in particular in case of multi-layer LithoEtch Litho-Etch (LELE) overlay control scheme. 
Introducing a GI-impact-on-overlay (in nm) KPI check quantifies the grating imbalance impact on overlay, whereas optimizing for accuracy using self-reference captures the bottom grating asymmetry effect. Measuring BGA after each process step before exposure of the top grating helps to identify which specific step introduces the asymmetry in the bottom grating. By applying this set of KPIs to a BEOL LELE overlay scheme, we can enhance the robustness of recipe and target selection. Furthermore, these KPIs can be utilized to highlight process and equipment abnormalities. In this work, we also quantified OPO results with a self-contained methodology called the Triangle Method. This method can be utilized for LELE layers with a common target and reference. This allows validating general μDBO accuracy, hence reducing the need for CD-SEM verification.
Atencio, Reinaldo; Chacón, Mirbel; González, Teresa; Briceño, Alexander; Agrifoglio, Giuseppe; Sierraalta, Anibal
2004-02-21
A robust heteromeric hydrogen-bonded synthon [R2(2) (9)-Id] is exploited to drive the modular self-assembly of four coordination complexes [M(H2biim)2(OH2)2]2+ (M = Co2+, Ni2+) and carboxylate counterions. This strategy allowed us to build molecular architectures of 0-, 1-, and 2-dimensions. A hydrogen-bonded 2D-network with cavities has been designed, which maintains its striking integrity after reversible water desorption-resorption processes.
Kim, Jongpal; Kim, Jihoon; Ko, Hyoungho
2015-12-31
To overcome light interference, including a large DC offset and ambient light variation, a robust photoplethysmogram (PPG) readout chip is fabricated using a 0.13-μm complementary metal-oxide-semiconductor (CMOS) process. Against the large DC offset, a saturation detection and current feedback circuit is proposed to compensate for an offset current of up to 30 μA. For robustness against optical path variation, an automatic emitted light compensation method is adopted. To prevent ambient light interference, an alternating sampling and charge redistribution technique is also proposed. In the proposed technique, no additional power is consumed, and only three differential switches and one capacitor are required. The PPG readout channel consumes 26.4 μW and has an input referred current noise of 260 pArms.
Integrated Demonstration of Instrument Placement , Robust Execution and Contingent Planning
NASA Technical Reports Server (NTRS)
Pedersen, L.; Bualat, M.; Lees, D.; Smith, D. E.; Korsmeyer, David (Technical Monitor); Washington, R.
2003-01-01
This paper describes an integrated demonstration of ground-based contingent planning, robust execution and autonomous instrument placement for the efficient exploration of a site by a prototype Mars rover.
Simulation Based Low-Cost Composite Process Development at the US Air Force Research Laboratory
NASA Technical Reports Server (NTRS)
Rice, Brian P.; Lee, C. William; Curliss, David B.
2003-01-01
Low-cost composite research in the US Air Force Research Laboratory, Materials and Manufacturing Directorate, Organic Matrix Composites Branch has focused on the theme of affordable performance. Practically, this means that we take a very broad view when considering the affordability of composites. Factors such as material costs, labor costs, and recurring and nonrecurring manufacturing costs are balanced against performance to arrive at the relative affordability-versus-performance measure of merit. The research efforts discussed here are two projects focused on affordable processing of composites. The first topic is the use of a neural network scheme to model cure reaction kinetics, then utilize the kinetics coupled with simple heat transport models to predict, in real time, future exotherms and control them. The neural network scheme is demonstrated to be very robust and a much more efficient method than the mechanistic cure-modeling approach. This enables very practical low-cost processing of thick composite parts. The second project is liquid composite molding (LCM) process simulation. LCM processing of large 3D integrated composite parts has been demonstrated to be a very cost-effective way to produce large integrated aerospace components. Specific examples of LCM processes are resin transfer molding (RTM), vacuum-assisted resin transfer molding (VARTM), and other similar approaches. LCM process simulation is a critical part of developing an LCM process approach. Flow simulation enables the development of the most robust approach to introducing resin into complex preforms. Furthermore, LCM simulation can be used in conjunction with flow-front sensors to control the LCM process in real time to account for preform or resin variability.
Towards a manufacturing ecosystem for integrated photonic sensors (Conference Presentation)
NASA Astrophysics Data System (ADS)
Miller, Benjamin L.
2017-03-01
Laboratory-scale demonstrations of optical biosensing employing structures compatible with CMOS fabrication, including waveguides, Mach-Zehnder interferometers, ring resonators, and photonic crystals, have provided ample validation of the promise of these technologies. However, to date there are relatively few examples of integrated photonic biosensors in the commercial sphere. The lack of successful translation from the laboratory to the marketplace is due in part to a lack of robust manufacturing processes for integrated photonics overall. This talk will describe efforts within the American Institute for Manufacturing Integrated Photonics (AIM Photonics), a public-private consortium funded by the Department of Defense, state governments, universities, and corporate partners, to accelerate manufacturing of integrated photonic sensors.
Tuning rules for robust FOPID controllers based on multi-objective optimization with FOPDT models.
Sánchez, Helem Sabina; Padula, Fabrizio; Visioli, Antonio; Vilanova, Ramon
2017-01-01
In this paper a set of optimally balanced tuning rules for fractional-order proportional-integral-derivative controllers is proposed. The control problem of minimizing at once the integrated absolute error for both the set-point and the load disturbance responses is addressed. The control problem is stated as a multi-objective optimization problem where a first-order-plus-dead-time process model subject to a robustness, maximum sensitivity based, constraint has been considered. A set of Pareto optimal solutions is obtained for different normalized dead times and then the optimal balance between the competing objectives is obtained by choosing the Nash solution among the Pareto-optimal ones. A curve fitting procedure has then been applied in order to generate suitable tuning rules. Several simulation results show the effectiveness of the proposed approach. Copyright © 2016. Published by Elsevier Ltd.
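The selection step, choosing the Nash solution among the Pareto-optimal tunings, can be illustrated on made-up cost pairs (set-point IAE, load-disturbance IAE). Here the Nash solution is taken as the front point maximizing the product of improvements over the worst observed cost in each objective, a common Nash-bargaining reading that may differ in detail from the paper's formulation.

```python
# Toy illustration of Pareto filtering plus Nash selection over candidate
# controller tunings scored by two competing costs. The numbers in the test
# are fabricated; this is not the paper's optimization setup.
def pareto_front(points):
    """Keep points not dominated (<= in both costs, < in at least one)."""
    front = []
    for p in points:
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

def nash_solution(points):
    """Nash bargaining pick: maximize the product of improvements over the
    disagreement point (worst observed cost in each objective)."""
    front = pareto_front(points)
    d0 = max(p[0] for p in points)
    d1 = max(p[1] for p in points)
    return max(front, key=lambda p: (d0 - p[0]) * (d1 - p[1]))
```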
Color sensor and neural processor on one chip
NASA Astrophysics Data System (ADS)
Fiesler, Emile; Campbell, Shannon R.; Kempem, Lother; Duong, Tuan A.
1998-10-01
A low-cost, compact, and robust color sensor that can operate in real time under various environmental conditions can benefit many applications, including quality control, chemical sensing, food production, medical diagnostics, energy conservation, monitoring of hazardous waste, and recycling. Unfortunately, existing color sensors are either bulky and expensive or do not provide the required speed and accuracy. In this publication we describe the design of an accurate real-time color classification sensor, together with preprocessing and a subsequent neural network processor, integrated on a single complementary metal oxide semiconductor (CMOS) integrated circuit. This one-chip sensor and information processor will be low in cost, robust, and mass-producible using standard commercial CMOS processes. The performance of the chip and the feasibility of its manufacturing are proven through computer simulations based on CMOS hardware parameters. Comparisons with competing methodologies show a significantly higher performance for our device.
Abdelkarim, Noha; Mohamed, Amr E; El-Garhy, Ahmed M; Dorrah, Hassen T
2016-01-01
The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Mostly, such a process is to be decoupled into several input/output pairings (loops), so that a single controller can be assigned for each loop. In the frame of this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for decoupling and control schemes, which ensures robust control behavior. In this regard, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized for the minimization of summation of the integral time-weighted squared errors (ITSEs) for all control loops. This optimization technique constitutes a hybrid between two techniques, which are the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, this hybridized technique ensures low mathematical burdens and high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in the time domain behavior and robustness over the conventional PID controller.
A Novel Robust H∞ Filter Based on Krein Space Theory in the SINS/CNS Attitude Reference System.
Yu, Fei; Lv, Chongyang; Dong, Qianhui
2016-03-18
Owing to their numerous merits, such as compactness, autonomy, and independence, the strapdown inertial navigation system (SINS) and celestial navigation system (CNS) can be used in marine applications. Moreover, owing to the complementary navigation information obtained from two different kinds of sensors, the accuracy of the SINS/CNS integrated navigation system can be effectively enhanced; thus, the SINS/CNS system is widely used in the marine navigation field. However, the CNS is easily interfered with by its surroundings, which can make its output discontinuous, and the uncertainty caused by the lost measurements reduces system accuracy. In this paper, a robust H∞ filter based on Krein space theory is proposed. Krein space theory is introduced first, and then the linear state and observation models of the SINS/CNS integrated navigation system are established. Taking the uncertainty problem into account, a new robust H∞ filter is proposed to improve the robustness of the integrated system. Finally, this new robust filter based on Krein space theory is evaluated by numerical simulations and actual experiments. The simulation and experiment results show that the attitude errors can be reduced effectively by utilizing the proposed robust filter when measurements are lost intermittently. Compared to the traditional Kalman filter (KF) method, the accuracy of the SINS/CNS integrated system is improved, verifying the robustness and availability of the proposed robust H∞ filter.
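For contrast with the proposed filter, a common baseline handling of lost measurements is a conventional Kalman filter that simply propagates its prediction through dropouts. The sketch below is that baseline on an illustrative scalar random-walk model, not the Krein-space H∞ filter itself; all noise values are made up.

```python
# Baseline illustration of the missing-measurement problem: a scalar Kalman
# filter that propagates the prediction when a measurement is dropped.
# The paper's Krein-space H-infinity filter replaces this update step;
# the process/measurement noise values here are illustrative.
def kalman_with_dropouts(measurements, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Random-walk state model; `None` entries mark lost measurements."""
    x, p, estimates = x0, p0, []
    for z in measurements:
        p += q                      # predict: state is a slow random walk
        if z is not None:           # update only when a measurement arrived
            k = p / (p + r)         # Kalman gain
            x += k * (z - x)
            p *= 1.0 - k
        estimates.append(x)
    return estimates
```

During a dropout the estimate holds its last value while the covariance grows, so the next valid measurement is weighted more heavily.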
High-throughput electrical characterization for robust overlay lithography control
NASA Astrophysics Data System (ADS)
Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.
2017-03-01
Realizing sensitive, high-throughput, and robust overlay measurement is a challenge in current 14 nm and upcoming advanced nodes with the transition to 300 mm and upcoming 450 mm semiconductor manufacturing, where a slight deviation in overlay has a significant impact on reliability and yield [1]. The exponentially increasing number of critical masks in multi-patterning litho-etch litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specifications [2]. Here, we discuss the limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements, due to sensitivity, throughput, and low contrast [3]. We demonstrate a new electrical measurement based technique where resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by a parabolic fitting model to resistance, where minima and inflection points are extracted to characterize overlay control and process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and overlay obtained from fitting. Additionally, excellent correlation of overlay from electrical measurements to existing image- and diffraction-based techniques is found. We also discuss challenges of integrating the electrical measurement based approach in semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for accessing simultaneous overlay as well as process window and margins from a robust, high-throughput electrical measurement approach.
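The parabolic fitting described above can be sketched with closed-form least squares: resistance versus programmed misalignment is fit to a quadratic, and the vertex estimates the overlay. The sample numbers in the usage below are fabricated for illustration.

```python
# Sketch of the parabolic fit: resistance measured at intentionally programmed
# misalignments is fit to R = a*d^2 + b*d + c, and the vertex d* = -b/(2a)
# estimates the overlay. Plain least squares via the 3x3 normal equations.
def parabola_vertex(offsets, resistances):
    """Least-squares quadratic fit; returns the vertex position -b/(2a)."""
    n = len(offsets)
    s1 = sum(offsets); s2 = sum(d * d for d in offsets)
    s3 = sum(d ** 3 for d in offsets); s4 = sum(d ** 4 for d in offsets)
    t0 = sum(resistances)
    t1 = sum(d * r for d, r in zip(offsets, resistances))
    t2 = sum(d * d * r for d, r in zip(offsets, resistances))
    # Augmented normal equations [[s4,s3,s2],[s3,s2,s1],[s2,s1,n]] @ [a,b,c].
    m = [[s4, s3, s2, t2], [s3, s2, s1, t1], [s2, s1, n, t0]]
    for i in range(3):                      # Gaussian elimination w/ pivoting
        piv = max(range(i, 3), key=lambda row: abs(m[row][i]))
        m[i], m[piv] = m[piv], m[i]
        for row in range(i + 1, 3):
            f = m[row][i] / m[i][i]
            m[row] = [x - f * y for x, y in zip(m[row], m[i])]
    c = m[2][3] / m[2][2]                   # back substitution
    b = (m[1][3] - m[1][2] * c) / m[1][1]
    a = (m[0][3] - m[0][2] * c - m[0][1] * b) / m[0][0]
    return -b / (2.0 * a)
```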
Quantum interference between transverse spatial waveguide modes.
Mohanty, Aseema; Zhang, Mian; Dutt, Avik; Ramelow, Sven; Nussenzveig, Paulo; Lipson, Michal
2017-01-20
Integrated quantum optics has the potential to markedly reduce the footprint and resource requirements of quantum information processing systems, but its practical implementation demands broader utilization of the available degrees of freedom within the optical field. To date, integrated photonic quantum systems have primarily relied on path encoding. However, in the classical regime, the transverse spatial modes of a multi-mode waveguide have been easily manipulated using the waveguide geometry to densely encode information. Here, we demonstrate quantum interference between the transverse spatial modes within a single multi-mode waveguide using quantum circuit-building blocks. This work shows that spatial modes can be controlled to an unprecedented level and have the potential to enable practical and robust quantum information processing.
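The kind of two-photon interference demonstrated above can be previewed with a textbook Hong-Ou-Mandel calculation (a generic two-mode example of our own, not the device's actual circuit): for a balanced 50/50 splitter between two modes, the two-photon coincidence amplitude vanishes.

```python
import numpy as np

# Unitary of a 50/50 beamsplitter acting on two modes; here the two "arms"
# stand in for two transverse spatial modes of one multi-mode waveguide.
bs = np.array([[1.0, 1.0j],
               [1.0j, 1.0]]) / np.sqrt(2.0)

# For a two-photon input |1,1>, the coincidence amplitude <1,1|U|1,1> is the
# permanent of the 2x2 unitary: U00*U11 + U01*U10.
coincidence = bs[0, 0] * bs[1, 1] + bs[0, 1] * bs[1, 0]
print(abs(coincidence) ** 2)  # -> 0.0 (Hong-Ou-Mandel suppression)
```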
Varga, Nicole L.; Stewart, Rebekah A.; Bauer, Patricia J.
2016-01-01
Semantic memory, defined as our store of knowledge about the world, provides representational support for all of our higher order cognitive functions. As such, it is crucial that the contents of semantic memory remain accessible over time. Although memory for knowledge learned through direct observation has been previously investigated, we know very little about the retention of knowledge derived through integration of information acquired across separate learning episodes. The present research investigated cross-episode integration in 4-year-old children. Participants were presented with novel facts via distinct story episodes and tested for knowledge extension through cross-episode integration, as well as for retention of the information over a 1-week delay. In Experiment 1, children retained the self-derived knowledge over the delay, though performance was primarily evidenced in a forced-choice format. In Experiment 2, we sought to facilitate the accessibility and robustness of self-derived knowledge by providing a verbal reminder after the delay. The accessibility of self-derived knowledge increased, irrespective of whether participants successfully demonstrated knowledge of the integration facts during the first visit. The results suggest knowledge extended through integration remains accessible after delays, even in a population in which this learning process is less robust. The findings also demonstrate the facilitative effect of reminders on the accessibility and further extension of knowledge over extended time periods. PMID:26774259
Integrating 3D geological information with a national physically-based hydrological modelling system
NASA Astrophysics Data System (ADS)
Lewis, Elizabeth; Parkin, Geoff; Kessler, Holger; Whiteman, Mark
2016-04-01
Robust numerical models are an essential tool for informing flood and water management and policy around the world. Physically-based hydrological models have traditionally not been used for such applications due to prohibitively large data, time and computational resource requirements. Given recent advances in computing power and data availability, a robust, physically-based hydrological modelling system for Great Britain using the SHETRAN model and national datasets has been created. Such a model has several advantages over less complex systems. Firstly, compared with conceptual models, a national physically-based model is more readily applicable to ungauged catchments, in which hydrological predictions are also required. Secondly, the results of a physically-based system may be more robust under changing conditions such as climate and land cover, as physical processes and relationships are explicitly accounted for. Finally, a fully integrated surface and subsurface model such as SHETRAN offers a wider range of applications compared with simpler schemes, such as assessments of groundwater resources, sediment and nutrient transport and flooding from multiple sources. As such, SHETRAN provides a robust means of simulating numerous terrestrial system processes which will add physical realism when coupled to the JULES land surface model. 306 catchments spanning Great Britain have been modelled using this system. The standard configuration of this system performs satisfactorily (NSE > 0.5) for 72% of catchments and well (NSE > 0.7) for 48%. Many of the remaining 28% of catchments that performed relatively poorly (NSE < 0.5) are located in the chalk in the south east of England. As such, the British Geological Survey 3D geology model for Great Britain (GB3D) has been incorporated, for the first time in any hydrological model, to pave the way for improvements to be made to simulations of catchments with important groundwater regimes. 
This coupling has involved development of software to allow for easy incorporation of geological information into SHETRAN for any model setup. The addition of more realistic subsurface representation following this approach is shown to greatly improve model performance in areas dominated by groundwater processes. The resulting modelling system has great potential to be used as a resource at national, regional and local scales in an array of different applications, including climate change impact assessments, land cover change studies and integrated assessments of groundwater and surface water resources.
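The NSE thresholds quoted above are Nash-Sutcliffe efficiencies; a minimal sketch of the metric (with made-up observed and simulated flow series) is:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance-about-mean.
    NSE = 1 is a perfect fit; NSE <= 0 means the model is no better
    than simply predicting the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    sse = np.sum((observed - simulated) ** 2)
    sst = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - sse / sst

obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
sim = np.array([1.1, 1.9, 3.2, 3.8, 5.1])
nse = nash_sutcliffe(obs, sim)
print(nse > 0.7)  # a "well performing" run under the criterion above
```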
A Novel Image Retrieval Based on Visual Words Integration of SIFT and SURF
Ali, Nouman; Bajwa, Khalid Bashir; Sablatnig, Robert; Chatzichristofis, Savvas A.; Iqbal, Zeshan; Rashid, Muhammad; Habib, Hafiz Adnan
2016-01-01
With the recent evolution of technology, the number of image archives has increased exponentially. In Content-Based Image Retrieval (CBIR), high-level visual information is represented in the form of low-level features. The semantic gap between the low-level features and the high-level image concepts is an open research problem. In this paper, we present a novel visual words integration of Scale Invariant Feature Transform (SIFT) and Speeded-Up Robust Features (SURF). The two local features representations are selected for image retrieval because SIFT is more robust to the change in scale and rotation, while SURF is robust to changes in illumination. The visual words integration of SIFT and SURF adds the robustness of both features to image retrieval. The qualitative and quantitative comparisons conducted on Corel-1000, Corel-1500, Corel-2000, Oliva and Torralba and Ground Truth image benchmarks demonstrate the effectiveness of the proposed visual words integration. PMID:27315101
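The visual-words integration can be sketched as concatenating per-detector bag-of-visual-words histograms (a schematic with random stand-in descriptors and vocabularies; a real pipeline would extract SIFT and SURF descriptors from images and learn each vocabulary by clustering):

```python
import numpy as np

def bovw_histogram(descriptors, vocabulary):
    """Quantize local descriptors to their nearest visual word and
    return a normalized word-frequency histogram."""
    d2 = ((descriptors[:, None, :] - vocabulary[None, :, :]) ** 2).sum(-1)
    words = d2.argmin(axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    return hist / hist.sum()

def integrated_signature(sift_desc, surf_desc, sift_vocab, surf_vocab):
    """Visual-words integration: concatenate the per-detector histograms
    so the image signature carries both SIFT and SURF evidence."""
    return np.concatenate([bovw_histogram(sift_desc, sift_vocab),
                           bovw_histogram(surf_desc, surf_vocab)])

rng = np.random.default_rng(0)
sift_vocab = rng.normal(size=(50, 128))   # 50 SIFT words (128-D descriptors)
surf_vocab = rng.normal(size=(50, 64))    # 50 SURF words (64-D descriptors)
sig = integrated_signature(rng.normal(size=(200, 128)),
                           rng.normal(size=(150, 64)),
                           sift_vocab, surf_vocab)
print(sig.shape, round(sig.sum(), 6))  # (100,) 2.0
```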
Robust averaging protects decisions from noise in neural computations
Herce Castañón, Santiago; Solomon, Joshua A.; Vandormael, Hildward
2017-01-01
An ideal observer will give equivalent weight to sources of information that are equally reliable. However, when averaging visual information, human observers tend to downweight or discount features that are relatively outlying or deviant (‘robust averaging’). Why humans adopt an integration policy that discards important decision information remains unknown. Here, observers were asked to judge the average tilt in a circular array of high-contrast gratings, relative to an orientation boundary defined by a central reference grating. Observers showed robust averaging of orientation, but the extent to which they did so was a positive predictor of their overall performance. Using computational simulations, we show that although robust averaging is suboptimal for a perfect integrator, it paradoxically enhances performance in the presence of “late” noise, i.e., noise that corrupts decisions during integration. In other words, robust decision strategies increase the brain’s resilience to noise arising in neural computations during decision-making. PMID:28841644
Robust Integration Schemes for Generalized Viscoplasticity with Internal-State Variables
NASA Technical Reports Server (NTRS)
Saleeb, Atef F.; Li, W.; Wilt, Thomas E.
1997-01-01
The scope of the work in this presentation focuses on the development of algorithms for the integration of rate-dependent constitutive equations. In view of their robustness, i.e., their superior stability and convergence properties for isotropic and anisotropic coupled viscoplastic-damage models, implicit integration schemes have been selected. The scheme adopted is the simplest in its class and is one of the most widely used implicit integrators at present.
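The stability advantage of implicit integration on stiff rate equations can be seen in a minimal backward Euler sketch (a generic scalar example of our own, not the viscoplastic-damage model itself):

```python
import math

def backward_euler(f, dfdy, y0, t0, t1, n_steps):
    """Integrate y' = f(t, y) with the implicit (backward) Euler scheme:
    each step solves  y_new = y_old + h*f(t_new, y_new)  by Newton iteration,
    which is what gives the method its stability on stiff problems."""
    h = (t1 - t0) / n_steps
    t, y = t0, float(y0)
    for _ in range(n_steps):
        t += h
        z = y                                   # Newton initial guess
        for _ in range(50):
            residual = z - y - h * f(t, z)
            step = residual / (1.0 - h * dfdy(t, z))
            z -= step
            if abs(step) < 1e-12:
                break
        y = z
    return y

# Stiff test problem: y' = -1000*(y - cos(t)). An explicit Euler step of
# h = 0.05 would blow up (amplification |1 - 1000*h| = 49); implicit does not.
f = lambda t, y: -1000.0 * (y - math.cos(t))
dfdy = lambda t, y: -1000.0
y_end = backward_euler(f, dfdy, 1.0, 0.0, 1.0, 20)
print(abs(y_end - math.cos(1.0)))  # small: the solution relaxes onto cos(t)
```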
Impact of self-healing capability on network robustness
NASA Astrophysics Data System (ADS)
Shang, Yilun
2015-04-01
A wide spectrum of real-life systems ranging from neurons to botnets display spontaneous recovery ability. Using the generating function formalism applied to static uncorrelated random networks with arbitrary degree distributions, the microscopic mechanism underlying the depreciation-recovery process is characterized and the effect of varying self-healing capability on network robustness is revealed. It is found that the self-healing capability of nodes has a profound impact on the phase transition in the emergence of percolating clusters, and that a salient difference exists in upholding network integrity under random failures and intentional attacks. The results provide a theoretical framework for quantitatively understanding the self-healing phenomenon in varied complex systems.
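A small Monte-Carlo sketch of the effect (an Erdős-Rényi stand-in of our own, not the paper's generating-function analysis): letting failed nodes self-heal with some probability restores the percolating cluster.

```python
import random

def giant_component_fraction(n, avg_degree, p_fail, p_heal, seed=7):
    """Fail each node with prob p_fail, let each failed node self-heal
    (recover) with prob p_heal, and return the fraction of all nodes in
    the largest connected cluster of an Erdos-Renyi graph (union-find)."""
    rng = random.Random(seed)
    alive = [rng.random() >= p_fail or rng.random() < p_heal for _ in range(n)]
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a
    for _ in range(int(avg_degree * n / 2)):    # random edges
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v and alive[u] and alive[v]:
            parent[find(u)] = find(v)
    sizes = {}
    for i in range(n):
        if alive[i]:
            r = find(i)
            sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values(), default=0) / n

no_heal = giant_component_fraction(20000, 4.0, p_fail=0.7, p_heal=0.0)
healed = giant_component_fraction(20000, 4.0, p_fail=0.7, p_heal=0.5)
print(no_heal < healed)  # healing restores the percolating cluster -> True
```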
NASA Astrophysics Data System (ADS)
Zhou, Yajun
This thesis employs the topological concept of compactness to deduce robust solutions to two integral equations arising from chemistry and physics: the inverse Laplace problem in chemical kinetics and the vector wave scattering problem in dielectric optics. The inverse Laplace problem occurs in the quantitative understanding of biological processes that exhibit complex kinetic behavior: different subpopulations of transition events from the "reactant" state to the "product" state follow distinct reaction rate constants, which results in a weighted superposition of exponential decay modes. Reconstruction of the rate constant distribution from kinetic data is often critical for mechanistic understandings of chemical reactions related to biological macromolecules. We devise a "phase function approach" to recover the probability distribution of rate constants from decay data in the time domain. The robustness (numerical stability) of this reconstruction algorithm builds upon the continuity of the transformations connecting the relevant function spaces, which are compact metric spaces. The robust "phase function approach" not only is useful for the analysis of heterogeneous subpopulations of exponential decays within a single transition step, but also is generalizable to the kinetic analysis of complex chemical reactions that involve multiple intermediate steps. A quantitative characterization of light scattering is central to many meteorological, optical, and medical applications. We give a rigorous treatment to electromagnetic scattering on arbitrarily shaped dielectric media via the Born equation: an integral equation with a strongly singular convolution kernel that corresponds to a non-compact Green operator. By constructing a quadratic polynomial of the Green operator that cancels out the kernel singularity and satisfies the compactness criterion, we reveal the universality of a real resonance mode in dielectric optics. 
Meanwhile, exploiting the properties of compact operators, we outline the geometric and physical conditions that guarantee a robust solution to the light scattering problem, and devise an asymptotic solution to the Born equation of electromagnetic scattering for arbitrarily shaped dielectric in a non-perturbative manner.
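The thesis's phase-function approach is not reproduced here, but the inverse Laplace problem itself can be sketched with a common baseline: ridge-regularized nonnegative least squares on a grid of candidate rate constants, solved by projected gradient descent (all names and values below are illustrative).

```python
import numpy as np

def rate_spectrum(t, decay, rates, ridge=1e-3, n_iter=20000):
    """Recover nonnegative weights w_k in  y(t) = sum_k w_k*exp(-rates_k*t)
    by projected gradient descent on a ridge-regularized least-squares
    objective (a simple stand-in for more sophisticated inversions)."""
    A = np.exp(-np.outer(t, rates))              # design matrix
    w = np.zeros(len(rates))
    lr = 1.0 / (np.linalg.norm(A, 2) ** 2 + ridge)   # 1/Lipschitz step
    for _ in range(n_iter):
        grad = A.T @ (A @ w - decay) + ridge * w
        w = np.maximum(0.0, w - lr * grad)       # project onto w >= 0
    return w

t = np.linspace(0.0, 5.0, 200)
decay = 0.7 * np.exp(-1.0 * t) + 0.3 * np.exp(-10.0 * t)   # two populations
rates = np.logspace(-1, 2, 40)
w = rate_spectrum(t, decay, rates)
print(rates[w.argmax()])   # the peak should sit near the dominant slow rate
```

The problem is ill-posed, so recovered mass spreads over neighboring grid rates; the regularization strength trades resolution against stability.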
Robust Representation of Integrated Surface-subsurface Hydrology at Watershed Scales
NASA Astrophysics Data System (ADS)
Painter, S. L.; Tang, G.; Collier, N.; Jan, A.; Karra, S.
2015-12-01
A representation of integrated surface-subsurface hydrology is the central component of process-rich watershed models that are emerging as alternatives to traditional reduced-complexity models. These physically based systems are important for assessing potential impacts of climate change and human activities on groundwater-dependent ecosystems and water supply and quality. Integrated surface-subsurface models typically couple three-dimensional solutions for variably saturated flow in the subsurface with the kinematic- or diffusion-wave equation for surface flows. The computational scheme for coupling the surface and subsurface systems is key to the robustness, computational performance, and ease-of-implementation of the integrated system. A new, robust approach for coupling the subsurface and surface systems is developed from the assumption that the vertical gradient in head is negligible at the surface. This tight-coupling assumption allows the surface flow system to be incorporated directly into the subsurface system; effects of surface flow and surface water accumulation are represented as modifications to the subsurface flow and accumulation terms but are not triggered until the subsurface pressure reaches a threshold value corresponding to the appearance of water on the surface. The new approach has been implemented in the highly parallel PFLOTRAN (www.pflotran.org) code. Several synthetic examples and three-dimensional examples from the Walker Branch Watershed in Oak Ridge, TN demonstrate the utility and robustness of the new approach using unstructured computational meshes. Representation of solute transport in the new approach is also discussed. Notice: This manuscript has been authored by UT-Battelle, LLC, under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy. 
The United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for the United States Government purposes.
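The threshold-triggered coupling can be caricatured with a single-cell water balance (a conceptual sketch of the switching logic only, not the PFLOTRAN formulation): nothing reaches the surface store until subsurface storage hits a ponding threshold, and mass is conserved across the switch.

```python
def route_rainfall(rain_series, capacity=50.0, k_drain=0.05, dt=1.0):
    """Toy single-cell illustration of threshold-switched coupling: all
    forcing enters the subsurface term, and surface water appears only
    once subsurface storage reaches its 'ponding' capacity, mimicking
    the pressure-threshold switch described above."""
    sub, surf, drained = 0.0, 0.0, 0.0
    for rain in rain_series:
        sub += rain * dt            # forcing enters the subsurface store
        out = k_drain * sub * dt    # linear-reservoir drainage
        sub -= out
        drained += out
        if sub > capacity:          # threshold reached: water surfaces
            surf += sub - capacity
            sub = capacity
    return sub, surf, drained

rain = [10.0] * 20                  # a persistent storm
sub, surf, drained = route_rainfall(rain)
print(abs((sub + surf + drained) - sum(rain)) < 1e-9)  # mass conserved -> True
```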
NASA Technical Reports Server (NTRS)
D'souza, Sarah N.; Kinney, David J.; Garcia, Joseph A.; Sarigul-Klijn, Nesrin
2014-01-01
The state-of-the-art in vehicle design decouples flight-feasible trajectory generation from the optimization process of an entry spacecraft shape. The disadvantage of this decoupled process is seen when a particular aeroshell does not meet in-flight requirements when integrated into Guidance, Navigation, and Control simulations. It is postulated that the integration of a guidance algorithm into the design process will provide a real-time, rapid trajectory generation technique to enhance the robustness of vehicle design solutions. The potential benefit of this integration is a reduction in design cycles (possible cost savings) and increased accuracy in the aerothermal environment (possible mass savings). This work examines two aspects: 1) the performance of a reference tracking guidance algorithm for five different geometries with the same reference trajectory and 2) the potential of mass savings from improved aerothermal predictions. An Apollo Derived Guidance (ADG) algorithm is used in this study. The baseline geometry and five test case geometries were flown using the same baseline trajectory. The guided trajectory results are compared to separate trajectories determined in a vehicle optimization study conducted for NASA's Mars Entry, Descent, and Landing System Analysis. This study revealed several aspects regarding the potential gains and required developments for integrating a guidance algorithm into the vehicle optimization environment. First, the generation of flight-feasible trajectories is only as good as the robustness of the guidance algorithm. The set of dispersed geometries modelled aerodynamic dispersions that ranged from +/-1% to +/-17%, and a single extreme case was modelled where the aerodynamics were approximately 80% less than the baseline geometry. The ADG, as expected, was able to guide the vehicle into the aeroshell separation box at the target location for dispersions up to 17%, but failed for the 80% dispersion case. 
Finally, the results revealed that including flight feasible trajectories for a set of dispersed geometries has the potential to save mass up to 430 kg.
Brown, Andrew D; Tollin, Daniel J
2016-09-21
In mammals, localization of sound sources in azimuth depends on sensitivity to interaural differences in sound timing (ITD) and level (ILD). Paradoxically, while typical ILD-sensitive neurons of the auditory brainstem require millisecond synchrony of excitatory and inhibitory inputs for the encoding of ILDs, human and animal behavioral ILD sensitivity is robust to temporal stimulus degradations (e.g., interaural decorrelation due to reverberation), or, in humans, bilateral clinical device processing. Here we demonstrate that behavioral ILD sensitivity is only modestly degraded with even complete decorrelation of left- and right-ear signals, suggesting the existence of a highly integrative ILD-coding mechanism. Correspondingly, we find that a majority of auditory midbrain neurons in the central nucleus of the inferior colliculus (of chinchilla) effectively encode ILDs despite complete decorrelation of left- and right-ear signals. We show that such responses can be accounted for by relatively long windows of bilateral excitatory-inhibitory interaction, which we explicitly measure using trains of narrowband clicks. Neural and behavioral data are compared with the outputs of a simple model of ILD processing with a single free parameter, the duration of excitatory-inhibitory interaction. Behavioral, neural, and modeling data collectively suggest that ILD sensitivity depends on binaural integration of excitation and inhibition within a ≳3 ms temporal window, significantly longer than observed in lower brainstem neurons. This relatively slow integration potentiates a unique role for the ILD system in spatial hearing that may be of particular importance when informative ITD cues are unavailable. In mammalian hearing, interaural differences in the timing (ITD) and level (ILD) of impinging sounds carry critical information about source location. 
However, natural sounds are often decorrelated between the ears by reverberation and background noise, degrading the fidelity of both ITD and ILD cues. Here we demonstrate that behavioral ILD sensitivity (in humans) and neural ILD sensitivity (in single neurons of the chinchilla auditory midbrain) remain robust under stimulus conditions that render ITD cues undetectable. This result can be explained by "slow" temporal integration arising from several-millisecond-long windows of excitatory-inhibitory interaction evident in midbrain, but not brainstem, neurons. Such integrative coding can account for the preservation of ILD sensitivity despite even extreme temporal degradations in ecological acoustic stimuli. Copyright © 2016 the authors 0270-6474/16/369908-14$15.00/0.
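The excitatory-inhibitory interaction window, the model's single free parameter, can be sketched with a toy veto rule (uniform random event trains of our own choosing, not recorded spike data): widening the window lets more inhibitory events suppress excitation, even when the two trains are uncorrelated.

```python
import numpy as np

def ei_response(exc_times, inh_times, window_ms):
    """Count excitatory events that escape inhibition: an excitatory
    event is vetoed if any inhibitory event occurred within the
    preceding window_ms (the E-I interaction window)."""
    exc = np.asarray(exc_times)
    inh = np.asarray(inh_times)
    vetoed = [np.any((inh <= t) & (inh > t - window_ms)) for t in exc]
    return int(np.sum(~np.array(vetoed)))

rng = np.random.default_rng(3)
exc = np.sort(rng.uniform(0.0, 500.0, 100))   # contralateral (excitatory) events
inh = np.sort(rng.uniform(0.0, 500.0, 100))   # ipsilateral (inhibitory), uncorrelated
short = ei_response(exc, inh, window_ms=0.5)
long_ = ei_response(exc, inh, window_ms=3.0)
print(short, long_)   # the longer window vetoes more excitatory events
```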
Kim, Chur; Kwon, Dohyeon; Kim, Dohyun; Choi, Sun Young; Cha, Sang Jun; Choi, Ki Sun; Yeom, Dong-Il; Rotermund, Fabian; Kim, Jungwon
2017-04-15
We demonstrate a new planar lightwave circuit (PLC)-based device, integrated with a 980/1550 wavelength division multiplexer, an evanescent-field-interaction-based saturable absorber, and an output tap coupler, which can be employed as a multi-functional element in mode-locked fiber lasers. Using this multi-functional PLC device, we demonstrate a simple, robust, low-noise, and polarization-maintaining mode-locked Er-fiber laser. The measured full-width at half-maximum bandwidth is 6 nm centered at 1555 nm, corresponding to 217 fs transform-limited pulse duration. The measured RIN and timing jitter are 0.22% [10 Hz-10 MHz] and 6.6 fs [10 kHz-1 MHz], respectively. Our results show that the non-gain section of mode-locked fiber lasers can be easily implemented as a single PLC chip that can be manufactured by a wafer-scale fabrication process. The use of PLC processes in mode-locked lasers has the potential for higher manufacturability of low-cost and robust fiber and waveguide lasers.
A Novel Robust H∞ Filter Based on Krein Space Theory in the SINS/CNS Attitude Reference System
Yu, Fei; Lv, Chongyang; Dong, Qianhui
2016-01-01
Owing to their numerous merits, such as compactness, autonomy and independence, the strapdown inertial navigation system (SINS) and the celestial navigation system (CNS) are well suited to marine applications. Moreover, because the two kinds of sensors provide complementary navigation information, the accuracy of the SINS/CNS integrated navigation system can be enhanced effectively. Thus, the SINS/CNS system is widely used in the marine navigation field. However, the CNS is easily interfered with by the surroundings, which leads to discontinuous output. The uncertainty caused by the lost measurements therefore reduces system accuracy. In this paper, a robust H∞ filter based on Krein space theory is proposed. Krein space theory is introduced first, and then the linear state and observation models of the SINS/CNS integrated navigation system are established. Taking the uncertainty problem into account, a new robust H∞ filter is proposed to improve the robustness of the integrated system. Finally, this new robust filter based on Krein space theory is evaluated by numerical simulations and real experiments. The simulation and experiment results and analysis show that the attitude errors can be reduced effectively by the proposed robust filter when measurements are intermittently missing. Compared to the traditional Kalman filter (KF) method, the accuracy of the SINS/CNS integrated system is improved, verifying the robustness and availability of the proposed robust H∞ filter. PMID:26999153
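The core difficulty, intermittent loss of the CNS measurement, can be illustrated with a scalar Kalman filter that skips the measurement update during outages (a simplified stand-in; the paper's Krein-space H∞ construction is not reproduced here):

```python
import numpy as np

def kalman_1d(zs, available, q=0.01, r=0.25):
    """Scalar Kalman filter for x_k = x_{k-1} + w, z_k = x_k + v.
    When a measurement is unavailable (e.g. the CNS is blocked), only
    the time update runs, so the error covariance grows until
    measurement updates resume."""
    x, p = 0.0, 1.0
    xs, ps = [], []
    for z, ok in zip(zs, available):
        p += q                         # time update
        if ok:                         # measurement update only if present
            k = p / (p + r)
            x += k * (z - x)
            p *= (1.0 - k)
        xs.append(x)
        ps.append(p)
    return np.array(xs), np.array(ps)

rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0.0, 0.1, 200))   # random-walk attitude proxy
zs = truth + rng.normal(0.0, 0.5, 200)
avail = np.ones(200, bool)
avail[80:120] = False                  # a 40-sample measurement outage
xs, ps = kalman_1d(zs, avail)
print(ps[119] > ps[79])                # covariance inflates during the outage -> True
```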
ERIC Educational Resources Information Center
Lee, Carol D.
2017-01-01
This chapter addresses how fundamental principles regarding how people learn, established over the last decade, open up possibilities for conceptualizing a broad, ecological, culturally rooted framework for the design of robust learning environments in a variety of settings, especially schools. These cross-disciplinary principles emerging from across relevant…
Simulation Based Acquisition for NASA's Office of Exploration Systems
NASA Technical Reports Server (NTRS)
Hale, Joe
2004-01-01
In January 2004, President George W. Bush unveiled his vision for NASA to advance U.S. scientific, security, and economic interests through a robust space exploration program. This vision includes the goal to extend human presence across the solar system, starting with a human return to the Moon no later than 2020, in preparation for human exploration of Mars and other destinations. In response to this vision, NASA has created the Office of Exploration Systems (OExS) to develop the innovative technologies, knowledge, and infrastructures to explore and support decisions about human exploration destinations, including the development of a new Crew Exploration Vehicle (CEV). Within the OExS organization, NASA is implementing Simulation Based Acquisition (SBA), a robust Modeling & Simulation (M&S) environment integrated across all acquisition phases and programs/teams, to make the realization of the President's vision more certain. Executed properly, SBA will foster better informed, timelier, and more defensible decisions throughout the acquisition life cycle. By doing so, SBA will improve the quality of NASA systems and speed their development, at less cost and risk than would otherwise be the case. SBA is a comprehensive, Enterprise-wide endeavor that necessitates an evolved culture, a revised spiral acquisition process, and an infrastructure of advanced Information Technology (IT) capabilities. SBA encompasses all project phases (from requirements analysis and concept formulation through design, manufacture, training, and operations), professional disciplines, and activities that can benefit from employing SBA capabilities. 
SBA capabilities include: developing and assessing system concepts and designs; planning manufacturing, assembly, transport, and launch; training crews, maintainers, launch personnel, and controllers; planning and monitoring missions; responding to emergencies by evaluating effects and exploring solutions; and communicating across the OExS enterprise, within the Government, and with the general public. The SBA process features empowered collaborative teams (including industry partners) to integrate requirements, acquisition, training, operations, and sustainment. The SBA process also utilizes an increased reliance on and investment in M&S to reduce design risk. SBA originated as a joint Industry and Department of Defense (DoD) initiative to define and integrate an acquisition process that employs robust, collaborative use of M&S technology across acquisition phases and programs. The SBA process was successfully implemented in the Air Force's Joint Strike Fighter (JSF) Program.
Robust feature detection and local classification for surfaces based on moment analysis.
Clarenz, Ulrich; Rumpf, Martin; Telea, Alexandru
2004-01-01
The stable local classification of discrete surfaces with respect to features such as edges and corners or concave and convex regions, respectively, is quite difficult as well as indispensable for many surface processing applications. Usually, feature detection is done via a local curvature analysis. When dealing with large, irregular triangular grids, e.g., generated via a marching cubes algorithm, the detectors are tedious to treat and a robust classification is hard to achieve. Here, a local classification method on surfaces is presented which avoids the evaluation of discretized curvature quantities. Moreover, it provides an indicator for the smoothness of a given discrete surface and comes with a built-in multiscale. The proposed classification tool is based on local zero and first moments on the discrete surface. The corresponding integral quantities are stable to compute and they give less noisy results compared to discrete curvature quantities. The stencil width for the integration of the moments turns out to be the scale parameter. Prospective surface processing applications are segmentation on surfaces, surface comparison and matching, and surface modeling. Here, a method for feature-preserving fairing of surfaces is discussed to underline the applicability of the presented approach.
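The moment-based indicator can be sketched as follows (a minimal point-set caricature of the zeroth and first moments; the paper integrates over surface patches rather than averaging sample points):

```python
import numpy as np

def local_moments(patch):
    """Zeroth and first moments of a local surface patch, taken here as
    a set of sample points: m0 is the sample mass, m1/m0 the barycenter."""
    m0 = len(patch)
    m1 = patch.sum(axis=0)
    return m0, m1 / m0

def feature_indicator(center, patch):
    """Distance between a vertex and the barycenter of its neighborhood.
    On a flat region the barycenter stays on the surface (indicator ~ 0);
    near an edge or corner it pulls away, flagging a feature."""
    _, bary = local_moments(patch)
    return np.linalg.norm(bary - center)

# flat patch: four neighbors in the z = 0 plane around the origin
flat = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0], [0, -1, 0]], float)
# crease: the two x-neighbors are folded upward out of the plane
crease = np.array([[1, 0, 1], [-1, 0, 1], [0, 1, 0], [0, -1, 0]], float)
v = np.zeros(3)
print(feature_indicator(v, flat), feature_indicator(v, crease))  # 0.0 0.5
```

Enlarging the neighborhood over which the moments are taken plays the role of the stencil-width scale parameter mentioned above.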
Vehicle System Integration, Optimization, and Robustness
This thrust area addresses not only optimal design of the vehicle components, but also an optimization of the interactions between
Robustness, evolvability, and the logic of genetic regulation.
Payne, Joshua L; Moore, Jason H; Wagner, Andreas
2014-01-01
In gene regulatory circuits, the expression of individual genes is commonly modulated by a set of regulating gene products, which bind to a gene's cis-regulatory region. This region encodes an input-output function, referred to as signal-integration logic, that maps a specific combination of regulatory signals (inputs) to a particular expression state (output) of a gene. The space of all possible signal-integration functions is vast and the mapping from input to output is many-to-one: For the same set of inputs, many functions (genotypes) yield the same expression output (phenotype). Here, we exhaustively enumerate the set of signal-integration functions that yield identical gene expression patterns within a computational model of gene regulatory circuits. Our goal is to characterize the relationship between robustness and evolvability in the signal-integration space of regulatory circuits, and to understand how these properties vary between the genotypic and phenotypic scales. Among other results, we find that the distributions of genotypic robustness are skewed, so that the majority of signal-integration functions are robust to perturbation. We show that the connected set of genotypes that make up a given phenotype are constrained to specific regions of the space of all possible signal-integration functions, but that as the distance between genotypes increases, so does their capacity for unique innovations. In addition, we find that robust phenotypes are (i) evolvable, (ii) easily identified by random mutation, and (iii) mutationally biased toward other robust phenotypes. We explore the implications of these latter observations for mutation-based evolution by conducting random walks between randomly chosen source and target phenotypes. We demonstrate that the time required to identify the target phenotype is independent of the properties of the source phenotype.
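A miniature version of such an enumeration can be run directly (with our own simplified robustness proxy, the fraction of single-input flips that leave the output unchanged, rather than the paper's circuit-level genotype-phenotype map):

```python
from itertools import product

def input_robustness(truth_table, k):
    """Fraction of (input state, single-bit flip) pairs that leave the
    output of a k-input Boolean signal-integration function unchanged."""
    same = total = 0
    for idx in range(2 ** k):
        for bit in range(k):
            neighbor = idx ^ (1 << bit)   # flip one regulatory input
            total += 1
            same += truth_table[idx] == truth_table[neighbor]
    return same / total

k = 3
scores = [input_robustness(bits, k)
          for bits in product([0, 1], repeat=2 ** k)]   # all 256 functions
# parity is least robust (0), constants are most robust (1)
print(min(scores), max(scores), sum(scores) / len(scores))
```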
Model reference tracking control of an aircraft: a robust adaptive approach
NASA Astrophysics Data System (ADS)
Tanyer, Ilker; Tatlicioglu, Enver; Zergeroglu, Erkan
2017-05-01
This work presents the design and the corresponding analysis of a nonlinear robust adaptive controller for model reference tracking of an aircraft that has parametric uncertainties in its system matrices and additive state- and/or time-dependent nonlinear disturbance-like terms in its dynamics. Specifically, a robust integral of the sign of the error (RISE) feedback term and an adaptive term are fused with a proportional integral controller. Lyapunov-based stability analysis techniques are utilised to prove global asymptotic convergence of the output tracking error. Extensive numerical simulations are presented to illustrate the performance of the proposed robust adaptive controller.
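The RISE-plus-integral structure described in this abstract can be illustrated with a minimal single-state tracking sketch. Everything below (the scalar plant, gains, disturbance, and reference) is a hypothetical stand-in, not the paper's aircraft model:

```python
import numpy as np

def simulate_rise(T=20.0, dt=1e-3, ks=10.0, alpha=2.0, beta=2.0):
    """Track x_d(t) with a scalar plant xdot = a*x + d(t) + u using a
    RISE-type law: u = (ks+1)*(e - e(0)) + int[(ks+1)*alpha*e + beta*sgn(e)] dt.
    The sign term lives under the integral, so u itself stays continuous."""
    a = 0.5                       # unknown plant parameter (never used by the controller)
    n = int(T / dt)
    t = np.arange(n) * dt
    xd = np.sin(0.5 * t)          # reference trajectory
    x, integ, e0 = 0.5, 0.0, None
    err = np.empty(n)
    for k in range(n):
        e = xd[k] - x             # tracking error
        if e0 is None:
            e0 = e                # ensures u(0) = 0
        integ += ((ks + 1.0) * alpha * e + beta * np.sign(e)) * dt
        u = (ks + 1.0) * (e - e0) + integ
        d = 0.3 * np.sin(t[k])    # additive disturbance, unknown to the controller
        x += (a * x + d + u) * dt # Euler step of the plant
        err[k] = e
    return t, err

t, err = simulate_rise()
# the tracking error should settle near zero despite the unmodelled a*x and d(t)
print(abs(err[-1000:]).mean())
```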
Systems Integration Processes for NASA Ares I Crew Launch Vehicle
NASA Technical Reports Server (NTRS)
Taylor, James L.; Reuter, James L.; Sexton, Jeffrey D.
2006-01-01
NASA's Exploration Initiative will require development of many new elements to constitute a robust system of systems. New launch vehicles are needed to place cargo and crew in stable Low Earth Orbit (LEO). This paper examines the systems integration processes NASA is utilizing to ensure integration and control of propulsion and nonpropulsion elements within NASA's Crew Launch Vehicle (CLV), now known as the Ares I. The objective of the Ares I is to provide the transportation capabilities to meet the Constellation Program requirements for delivering a Crew Exploration Vehicle (CEV) or other payload to LEO in support of the lunar and Mars missions. The Ares I must successfully provide this capability within cost and schedule, and with an acceptable risk approach. This paper will describe the systems engineering management processes that will be applied to assure Ares I Project success through complete and efficient technical integration. Discussion of technical review and management processes for requirements development and verification, integrated design and analysis, integrated simulation and testing, and the integration of reliability, maintainability and supportability (RMS) into the design will also be included. The Ares I Project is logically divided into elements by the major hardware groupings, and associated management, system engineering, and integration functions. The processes to be described herein are designed to integrate within these Ares I elements and among the other Constellation projects. Also discussed is launch vehicle stack integration (Ares I to CEV, and Ground and Flight Operations integration) throughout the life cycle, including integrated vehicle performance through orbital insertion, recovery of the first stage, and reentry of the upper stage. 
The processes for decomposing requirements to the elements and ensuring that requirements have been correctly validated, decomposed, and allocated, and that the verification requirements are properly defined to ensure that the system design meets requirements, will be discussed.
Robust numerical electromagnetic eigenfunction expansion algorithms
NASA Astrophysics Data System (ADS)
Sainath, Kamalesh
This thesis summarizes developments in rigorous, full-wave, numerical spectral-domain (integral plane wave eigenfunction expansion [PWE]) evaluation algorithms concerning time-harmonic electromagnetic (EM) fields radiated by generally-oriented and positioned sources within planar and tilted-planar layered media exhibiting general anisotropy, thickness, layer number, and loss characteristics. The work is motivated by the need to accurately and rapidly model EM fields radiated by subsurface geophysical exploration sensors probing layered, conductive media, where complex geophysical and man-made processes can lead to micro-laminate and micro-fractured geophysical formations exhibiting, at the lower (sub-2MHz) frequencies typically employed for deep EM wave penetration through conductive geophysical media, bulk-scale anisotropic (i.e., directional) electrical conductivity characteristics. When the planar-layered approximation (layers of piecewise-constant material variation and transversely-infinite spatial extent) is locally, near the sensor region, considered valid, numerical spectral-domain algorithms are suitable due to their strong low-frequency stability characteristic, and ability to numerically predict time-harmonic EM field propagation in media with response characterized by arbitrarily lossy and (diagonalizable) dense, anisotropic tensors. If certain practical limitations are addressed, PWE can robustly model sensors with general position and orientation that probe generally numerous, anisotropic, lossy, and thick layers. 
The main thesis contributions, leading to a sensor and geophysical environment-robust numerical modeling algorithm, are as follows: (1) Simple, rapid estimator of the region (within the complex plane) containing poles, branch points, and branch cuts (critical points) (Chapter 2), (2) Sensor and material-adaptive azimuthal coordinate rotation, integration contour deformation, integration domain sub-region partition and sub-region-dependent integration order (Chapter 3), (3) Integration partition-extrapolation-based (Chapter 3) and Gauss-Laguerre Quadrature (GLQ)-based (Chapter 4) evaluations of the deformed, semi-infinite-length integration contour tails, (4) Robust in-situ-based (i.e., at the spectral-domain integrand level) direct/homogeneous-medium field contribution subtraction and analytical curbing of the source current spatial spectrum function's ill behavior (Chapter 5), and (5) Analytical re-casting of the direct-field expressions when the source is embedded within a non-birefringent anisotropic medium (NBAM) (Chapter 6). The benefits of these contributions are, respectively, (1) Avoiding computationally intensive critical-point location and tracking (computation time savings), (2) Sensor and material-robust curbing of the integrand's oscillatory and slow decay behavior, as well as preventing undesirable critical-point migration within the complex plane (computation speed, precision, and instability-avoidance benefits), (3) sensor and material-robust reduction (or, for GLQ, elimination) of integral truncation error, (4) robustly stable modeling of scattered fields and/or fields radiated from current sources modeled as spatially distributed (10 to 1000-fold compute-speed acceleration also realized for distributed-source computations), and (5) numerically stable modeling of fields radiated from sources within NBAM layers. Having addressed these limitations, can PWE algorithms also be applied to modeling EM waves in tilted planar-layered geometries?
This question is explored in Chapter 7 using a Transformation Optics-based approach, which allows one to model wave propagation through layered media that (in the sensor's vicinity) possess tilted planar interfaces. However, the technique introduces spurious wave scattering, whose impact on computational accuracy requires analysis. Chapter 7's main contribution is the mathematical exposition, and exhaustive simulation-based study, of the limitations of this novel tilted-layer modeling formulation.
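As a toy illustration of the GLQ-based tail evaluation in contribution (3): Gauss-Laguerre nodes and weights evaluate semi-infinite integrals of the form ∫₀^∞ e⁻ˣ f(x) dx without any truncation of the contour tail. The integrands below are generic stand-ins, not the thesis's spectral-domain kernels:

```python
import numpy as np

# Gauss-Laguerre quadrature: int_0^inf e^{-x} f(x) dx ~= sum_i w_i f(x_i),
# so the semi-infinite tail incurs no truncation error at all.
x, w = np.polynomial.laguerre.laggauss(40)

# Exact for any polynomial f of degree <= 2n-1: int e^{-x} x^3 dx = 3! = 6
poly_val = np.sum(w * x**3)

# Smooth oscillatory integrand, analytic value 1/2
cos_val = np.sum(w * np.cos(x))

print(poly_val, cos_val)   # ~6.0 and ~0.5
```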
Deep Neural Networks for Speech Separation With Application to Robust Speech Recognition
acoustic-phonetic features. The second objective is integration of spectrotemporal context for improved separation performance. Conditional random fields ... will be used to encode contextual constraints. The third objective is to achieve robust ASR in the DNN framework through integrated acoustic modeling
Adaptive integral robust control and application to electromechanical servo systems.
Deng, Wenxiang; Yao, Jianyong
2017-03-01
This paper proposes a continuous adaptive integral robust control with robust integral of the sign of the error (RISE) feedback for a class of uncertain nonlinear systems, in which the RISE feedback gain is adapted online to ensure robustness against disturbances without prior knowledge of the bound of the additive disturbances. In addition, an adaptive compensation integrated with the proposed adaptive RISE feedback term is also constructed to further reduce design conservatism when the system also contains parametric uncertainties. Lyapunov analysis reveals that the proposed controllers guarantee that the tracking errors asymptotically converge to zero with continuous control efforts. To illustrate the high performance nature of the developed controllers, numerical simulations are provided. At the end, an application case of an actual electromechanical servo system driven by a motor is also studied, with some specific design considerations, and comparative experimental results are obtained to verify the effectiveness of the proposed controllers. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Amyris, Inc. Integrated Biorefinery Project Summary Final Report - Public Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, David; Sato, Suzanne; Garcia, Fernando
The Amyris pilot-scale Integrated Biorefinery (IBR) leveraged Amyris synthetic biology and process technology experience to upgrade Amyris’s existing Emeryville, California pilot plant and fermentation labs to enable development of US-based production capabilities for renewable diesel fuel and alternative chemical products. These products were derived semi-synthetically from high-impact biomass feedstocks via microbial fermentation to the 15-carbon intermediate farnesene, with subsequent chemical finishing to farnesane. The Amyris IBR team tested and provided methods for production of diesel and alternative chemical products from sweet sorghum, and other high-impact lignocellulosic feedstocks, at pilot scale. This enabled robust techno-economic analysis (TEA), regulatory approvals, and a basis for full-scale manufacturing processes and facility design.
Two-dimensional phase unwrapping using robust derivative estimation and adaptive integration.
Strand, Jarle; Taxt, Torfinn
2002-01-01
The adaptive integration (ADI) method for two-dimensional (2-D) phase unwrapping is presented. The method uses an algorithm for noise robust estimation of partial derivatives, followed by a noise robust adaptive integration process. The ADI method can easily unwrap phase images with moderate noise levels, and the resulting images are congruent modulo 2pi with the observed, wrapped, input images. In a quantitative evaluation, both the ADI and the BLS methods (Strand et al.) were better than the least-squares methods of Ghiglia and Romero (GR), and of Marroquin and Rivera (MRM). In a qualitative evaluation, the ADI, the BLS, and a conjugate gradient version of the MRM method (MRMCG), were all compared using a synthetic image with shear, using 115 magnetic resonance images, and using 22 fiber-optic interferometry images. For the synthetic image and the interferometry images, the ADI method gave consistently visually better results than the other methods. For the MR images, the MRMCG method was best, and the ADI method second best. The ADI method was less sensitive to the mask definition and the block size than the BLS method, and successfully unwrapped images with shears that were not marked in the masks. The computational requirements of the ADI method for images of nonrectangular objects were comparable to only two iterations of many least-squares-based methods (e.g., GR). We believe the ADI method provides a powerful addition to the ensemble of tools available for 2-D phase unwrapping.
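The two-step idea behind ADI — estimate the phase derivative from the wrapped data, then integrate it — can be sketched in one dimension. This is the textbook wrapped-finite-difference scheme, not the paper's noise-robust estimators or its 2-D adaptive integration:

```python
import numpy as np

def wrap(p):
    """Wrap phase values into [-pi, pi)."""
    return (p + np.pi) % (2 * np.pi) - np.pi

def unwrap_1d(psi):
    """Estimate the phase derivative as the wrapped finite difference,
    then integrate (cumulative sum) to recover the unwrapped phase.
    Valid when the true sample-to-sample phase change stays below pi."""
    d = wrap(np.diff(psi))
    return psi[0] + np.concatenate(([0.0], np.cumsum(d)))

phi = 0.3 * np.arange(100) ** 1.2     # smooth "true" phase, exceeds 2*pi
psi = wrap(phi)                        # observed wrapped phase
rec = unwrap_1d(psi)
print(np.max(np.abs(rec - phi)))       # ~0: exact recovery in the noiseless case
```

The 2-D case adds the real difficulty: derivative estimates must be made robust to noise and the integration path made consistent, which is what ADI addresses.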
NASA Astrophysics Data System (ADS)
Aye, S. A.; Heyns, P. S.
2017-02-01
This paper proposes an optimal Gaussian process regression (GPR) for the prediction of remaining useful life (RUL) of slow speed bearings based on a novel degradation assessment index obtained from acoustic emission signals. The optimal GPR is obtained from an integration or combination of existing simple mean and covariance functions in order to capture the observed trend of the bearing degradation as well as the irregularities in the data. The resulting integrated GPR model provides an excellent fit to the data and improves over the simple GPR models that are based on simple mean and covariance functions. In addition, it achieves a low percentage error in prediction of the remaining useful life of slow speed bearings. These findings are robust under varying operating conditions such as loading and speed and can be applied to nonlinear and nonstationary machine response signals useful for effective preventive machine maintenance purposes.
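The idea of summing simple covariance functions so one term captures the degradation trend and another the irregularities can be sketched with a plain-numpy GP posterior mean. The kernel choices, hyperparameters, and data below are illustrative, not those fitted in the paper:

```python
import numpy as np

def kern(a, b):
    """Composite covariance: linear (dot-product) term for the overall
    degradation trend plus an RBF term for local irregularities."""
    lin = 0.5 * np.outer(a, b)
    rbf = np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2)
    return lin + rbf

X = np.linspace(0.0, 10.0, 30)
y = 0.4 * X + np.sin(X)                  # degradation-like trend + wiggles

K = kern(X, X) + 1e-6 * np.eye(len(X))   # jitter / tiny noise variance
alpha = np.linalg.solve(K, y)

Xs = np.array([2.5, 7.25])               # query points between observations
mean = kern(Xs, X) @ alpha               # GP posterior mean
print(mean)
```

Extrapolating this posterior mean forward until it crosses a failure threshold is the usual route from such a fit to an RUL estimate.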
Harnessing glycomics technologies: integrating structure with function for glycan characterization
Robinson, Luke N.; Artpradit, Charlermchai; Raman, Rahul; Shriver, Zachary H.; Ruchirawat, Mathuros; Sasisekharan, Ram
2013-01-01
Glycans, or complex carbohydrates, are a ubiquitous class of biological molecules which impinge on a variety of physiological processes ranging from signal transduction to tissue development and microbial pathogenesis. In comparison to DNA and proteins, glycans present unique challenges to the study of their structure and function owing to their complex and heterogeneous structures and the dominant role played by multivalency in their sequence-specific biological interactions. Arising from these challenges, there is a need to integrate information from multiple complementary methods to decode structure-function relationships. Focusing on acidic glycans, we describe here key glycomics technologies for characterizing their structural attributes, including linkage, modifications, and topology, as well as for elucidating their role in biological processes. Two case studies, one involving sialylated branched glycans and the other sulfated glycosaminoglycans, are used to highlight how integration of orthogonal information from diverse datasets enables rapid convergence of glycan characterization for development of robust structure-function relationships. PMID:22522536
Fourier transform spectrometer controller for partitioned architectures
NASA Astrophysics Data System (ADS)
Tamas-Selicean, D.; Keymeulen, D.; Berisford, D.; Carlson, R.; Hand, K.; Pop, P.; Wadsworth, W.; Levy, R.
The current trend in spacecraft computing is to integrate applications of different criticality levels on the same platform using no separation. This approach increases the complexity of the development, verification and integration processes, with an impact on the whole system life cycle. Researchers at ESA and NASA advocated for the use of partitioned architecture to reduce this complexity. Partitioned architectures rely on platform mechanisms to provide robust temporal and spatial separation between applications. Such architectures have been successfully implemented in several industries, such as avionics and automotive. In this paper we investigate the challenges of developing and the benefits of integrating a scientific instrument, namely a Fourier Transform Spectrometer, in such a partitioned architecture.
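Temporal separation in such partitioned architectures is typically enforced by a fixed cyclic schedule over a repeating major frame: each partition may only run inside its pre-assigned windows. A minimal sketch of that bookkeeping follows; the partition names and window durations are invented, not taken from the paper's platform:

```python
# Static time-partition table: each window is (partition_name, duration_ms).
# The major frame repeats forever; a partition can only run inside its own
# windows, which is what provides temporal separation between criticality levels.
MAJOR_FRAME = [("flight_ctrl", 20), ("spectrometer", 15), ("housekeeping", 5)]

def partition_at(t_ms):
    """Return which partition owns CPU time t_ms (ms since boot)."""
    frame_len = sum(d for _, d in MAJOR_FRAME)
    t = t_ms % frame_len
    for name, dur in MAJOR_FRAME:
        if t < dur:
            return name
        t -= dur
    raise AssertionError("unreachable")

print(partition_at(0), partition_at(21), partition_at(39), partition_at(40))
```

Because the table is static, a misbehaving low-criticality partition (here, the hypothetical spectrometer) can overrun its own window but can never steal time from the others.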
Mahieu, Nathaniel G.; Spalding, Jonathan L.; Patti, Gary J.
2016-01-01
Motivation: Current informatic techniques for processing raw chromatography/mass spectrometry data break down under several common, non-ideal conditions. Importantly, hydrophilic liquid interaction chromatography (a key separation technology for metabolomics) produces data which are especially challenging to process. We identify three critical points of failure in current informatic workflows: compound specific drift, integration region variance, and naive missing value imputation. We implement the Warpgroup algorithm to address these challenges. Results: Warpgroup adds peak subregion detection, consensus integration bound detection, and intelligent missing value imputation steps to the conventional informatic workflow. When compared with the conventional workflow, Warpgroup made major improvements to the processed data. The coefficient of variation for peaks detected in replicate injections of a complex Escherichia coli extract was halved (a reduction of 19%). Integration regions across samples were much more robust. Additionally, many signals lost by the conventional workflow were ‘rescued’ by the Warpgroup refinement, thereby resulting in greater analyte coverage in the processed data. Availability and implementation: Warpgroup is an open source R package available on GitHub at github.com/nathaniel-mahieu/warpgroup. The package includes example data and XCMS compatibility wrappers for ease of use. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: nathaniel.mahieu@wustl.edu or gjpattij@wustl.edu PMID:26424859
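The consensus-integration-bound idea — agree on one integration region across samples, then integrate every sample over it, even samples where peak detection failed — can be sketched as below. This is a drastic simplification of Warpgroup (which works on warped, grouped chromatograms), with made-up bounds and signals:

```python
import numpy as np

# Per-sample detected peak bounds (start, end indices); None = peak missed.
detected = [(18, 42), (20, 45), None, (19, 41)]

# Consensus bounds: median of the bounds that were detected.
starts = [b[0] for b in detected if b is not None]
ends = [b[1] for b in detected if b is not None]
lo, hi = int(np.median(starts)), int(np.median(ends))

# Re-integrate every sample over the consensus region, "rescuing" sample 3:
# intelligent imputation integrates the region instead of reporting NA.
rng = np.random.default_rng(1)
x = np.arange(64)
signals = [np.exp(-0.5 * ((x - 30) / 5) ** 2) + 0.01 * rng.standard_normal(64)
           for _ in detected]
areas = [s[lo:hi + 1].sum() for s in signals]   # simple rectangle-rule integral
print((lo, hi), areas)
```

Using one shared region removes the integration-region variance between replicates that the abstract identifies as a failure point.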
Fully Integrated Optical Spectrometer in Visible and Near-IR in CMOS.
Hong, Lingyu; Sengupta, Kaushik
2017-12-01
Optical spectrometry in the visible and near-infrared range has a wide range of applications in healthcare, sensing, imaging, and diagnostics. This paper presents the first fully integrated optical spectrometer in a standard bulk CMOS process without custom fabrication, postprocessing, or any external optical passive structure such as lenses, gratings, collimators, or mirrors. The architecture exploits metal interconnect layers available in CMOS processes with subwavelength feature sizes to guide, manipulate, control, and diffract light; integrated photodetectors and read-out circuitry to detect the dispersed light; and back-end signal processing for robust spectral estimation. The chip, realized in a bulk 65-nm low-power CMOS process, measures 0.64 mm × 0.56 mm in active area, and achieves 1.4 nm peak detection accuracy for continuous wave excitations between 500 and 830 nm. This paper demonstrates the ability to use these metal-optic nanostructures to miniaturize complex optical instrumentation into a new class of optics-free CMOS-based systems-on-chip in the visible and near-IR for various sensing and imaging applications.
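The back-end spectral estimation step can be sketched as a linear inverse problem: each photodetector reading is the inner product of that detector's (nanostructure-defined) spectral response with the incident spectrum, and with more detectors than wavelength bins a least-squares solve recovers the spectrum. The response matrix and sizes below are synthetic, not the chip's calibrated responses:

```python
import numpy as np

rng = np.random.default_rng(7)
n_det, n_wl = 40, 20                 # detectors x wavelength bins (overdetermined)

R = rng.uniform(0.0, 1.0, (n_det, n_wl))   # per-detector spectral responses
true = np.zeros(n_wl)
true[8] = 1.0                              # a narrow spectral line...
true[9] = 0.4                              # ...with a weaker shoulder

y = R @ true                               # noiseless detector readings
est, *_ = np.linalg.lstsq(R, y, rcond=None)
print(np.argmax(est))                      # the line is located at bin 8
```

In practice the solve is regularized against detector noise, but the structure of the estimation problem is the same.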
NASA Astrophysics Data System (ADS)
Frits, Andrew P.
In the current Navy environment of undersea weapons development, the engineering aspect of design is decoupled from the development of the tactics with which the weapon is employed. Tactics are developed by intelligence experts, warfighters, and wargamers, while torpedo design is handled by engineers and contractors. This dissertation examines methods by which the conceptual design process of undersea weapon systems, including both torpedo systems and mine counter-measure systems, can be improved. It is shown that by simultaneously designing the torpedo and the tactics with which undersea weapons are used, a more effective overall weapon system can be created. In addition to integrating torpedo tactics with design, the thesis also looks at design methods to account for uncertainty. The uncertainty is attributable to multiple sources, including: lack of detailed analysis tools early in the design process, incomplete knowledge of the operational environments, and uncertainty in the performance of potential technologies. A robust design process is introduced to account for this uncertainty in the analysis and optimization of torpedo systems through the combination of Monte Carlo simulation with response surface methodology and metamodeling techniques. Additionally, various other methods that are appropriate to uncertainty analysis are discussed and analyzed. The thesis also advances a new approach towards examining robustness and risk: the treatment of probability of success (POS) as an independent variable. Examining the cost and performance tradeoffs between high and low probability of success designs, the decision-maker can make better informed decisions as to what designs are most promising and determine the optimal balance of risk, cost, and performance. Finally, the thesis examines the use of non-dimensionalization of parameters for torpedo design. 
The thesis shows that the use of non-dimensional torpedo parameters leads to increased knowledge about the scalability of torpedo systems and improved performance of designs of experiments.
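The robust-design loop the thesis describes — run a handful of expensive analyses, fit a cheap response-surface metamodel, then Monte Carlo over the uncertain inputs to read off a probability of success — can be sketched as follows. The performance function, requirement, and input distribution here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def expensive_model(x):
    """Stand-in for a high-fidelity torpedo performance analysis."""
    return 3.0 + 2.0 * x - 0.5 * x ** 2

# 1) Fit a quadratic response surface from a handful of "expensive" runs.
xs = np.linspace(-2.0, 4.0, 15)
A = np.vstack([np.ones_like(xs), xs, xs ** 2]).T
coef, *_ = np.linalg.lstsq(A, expensive_model(xs), rcond=None)
surrogate = lambda x: coef[0] + coef[1] * x + coef[2] * x ** 2

# 2) Monte Carlo over the uncertain design input using the cheap surrogate.
samples = rng.normal(loc=2.0, scale=0.5, size=100_000)
perf = surrogate(samples)
pos = np.mean(perf >= 4.0)     # probability of success vs. a requirement of 4.0
print(coef, pos)
```

Treating this POS value as an independent design variable, as the thesis proposes, means sweeping the requirement (or the design) and examining the resulting cost/performance trade at each POS level.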
Integrated direct/indirect adaptive robust motion trajectory tracking control of pneumatic cylinders
NASA Astrophysics Data System (ADS)
Meng, Deyuan; Tao, Guoliang; Zhu, Xiaocong
2013-09-01
This paper studies the precision motion trajectory tracking control of a pneumatic cylinder driven by a proportional-directional control valve. An integrated direct/indirect adaptive robust controller is proposed. The controller employs a physical model based indirect-type parameter estimation to obtain reliable estimates of unknown model parameters, and utilises a robust control method with dynamic compensation type fast adaptation to attenuate the effects of parameter estimation errors, unmodelled dynamics and disturbances. Due to the use of projection mapping, the robust control law and the parameter adaptation algorithm can be designed separately. Since the system model uncertainties are unmatched, the recursive backstepping technique is adopted to design the robust control law. Extensive comparative experimental results are presented to illustrate the effectiveness of the proposed controller and its performance robustness to parameter variations and sudden disturbances.
Optical temperature compensation schemes of spectral modulation sensors for aircraft engine control
NASA Astrophysics Data System (ADS)
Berkcan, Ertugrul
1993-02-01
Optical temperature compensation schemes for the ratiometric interrogation of spectral modulation sensors for source temperature robustness are presented. We have obtained better than a 50- to 100-fold decrease in the temperature coefficient of the sensitivity using these types of compensation. We have also developed a spectrographic interrogation scheme that provides increased source temperature robustness; this affords significantly improved accuracy over FADEC temperature ranges as well as a temperature coefficient of the sensitivity that is substantially further reduced. This latter compensation scheme can be integrated in a small E/O package including the detection, analog and digital signal processing. We find that these interrogation schemes can be used within a detector spatially multiplexed architecture.
Automating an integrated spatial data-mining model for landfill site selection
NASA Astrophysics Data System (ADS)
Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul
2017-10-01
An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in the integrated model is the complicated processing and modelling due to the programming stages and several limitations. An automation process helps avoid the limitations and improve the interoperability between integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) and a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. 22 criteria were selected to utilise as input data and to build the training and testing datasets. The outcomes show a high accuracy of 98.2% on the testing dataset using 10-fold cross validation. The automated spatial data-mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.
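The 10-fold cross-validation behind the reported testing accuracy works as sketched below, with a synthetic two-class "site suitability" dataset and a nearest-centroid classifier standing in for the paper's neural network:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic, well-separated "unsuitable" (0) vs "suitable" (1) sites, 6 criteria each.
X = np.vstack([rng.normal(0.0, 1.0, (100, 6)), rng.normal(4.0, 1.0, (100, 6))])
y = np.array([0] * 100 + [1] * 100)

idx = rng.permutation(len(y))
folds = np.array_split(idx, 10)          # 10 disjoint folds

accs = []
for k in range(10):
    test = folds[k]
    train = np.concatenate([folds[j] for j in range(10) if j != k])
    # Nearest-centroid classifier as a minimal stand-in for the ANN.
    c0 = X[train][y[train] == 0].mean(axis=0)
    c1 = X[train][y[train] == 1].mean(axis=0)
    d0 = np.linalg.norm(X[test] - c0, axis=1)
    d1 = np.linalg.norm(X[test] - c1, axis=1)
    pred = (d1 < d0).astype(int)
    accs.append(np.mean(pred == y[test]))
print(np.mean(accs))                     # mean held-out accuracy over the 10 folds
```

Each sample is scored exactly once while held out, so the mean accuracy is an honest estimate of generalization, which is the property the 98.2% figure is meant to convey.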
Control Systems Cyber Security:Defense in Depth Strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Kuipers; Mark Fabro
2006-05-01
Information infrastructures across many public and private domains share several common attributes regarding IT deployments and data communications. This is particularly true in the control systems domain. A majority of the systems use robust architectures to enhance business and reduce costs by increasing the integration of external, business, and control system networks. However, multi-network integration strategies often lead to vulnerabilities that greatly reduce the security of an organization, and can expose mission-critical control systems to cyber threats. This document provides guidance and direction for developing ‘defense-in-depth’ strategies for organizations that use control system networks while maintaining a multi-tier information architecture that requires: • Maintenance of various field devices, telemetry collection, and/or industrial-level process systems • Access to facilities via remote data link or modem • Public facing services for customer or corporate operations • A robust business environment that requires connections among the control system domain, the external Internet, and other peer organizations.
Control Systems Cyber Security: Defense-in-Depth Strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mark Fabro
2007-10-01
Information infrastructures across many public and private domains share several common attributes regarding IT deployments and data communications. This is particularly true in the control systems domain. A majority of the systems use robust architectures to enhance business and reduce costs by increasing the integration of external, business, and control system networks. However, multi-network integration strategies often lead to vulnerabilities that greatly reduce the security of an organization, and can expose mission-critical control systems to cyber threats. This document provides guidance and direction for developing ‘defense-in-depth’ strategies for organizations that use control system networks while maintaining a multi-tier information architecture that requires: • Maintenance of various field devices, telemetry collection, and/or industrial-level process systems • Access to facilities via remote data link or modem • Public facing services for customer or corporate operations • A robust business environment that requires connections among the control system domain, the external Internet, and other peer organizations.
Technical note: Dynamic INtegrated Gap-filling and partitioning for OzFlux (DINGO)
NASA Astrophysics Data System (ADS)
Beringer, Jason; McHugh, Ian; Hutley, Lindsay B.; Isaac, Peter; Kljun, Natascha
2017-03-01
Standardised, quality-controlled and robust data from flux networks underpin the understanding of ecosystem processes and tools necessary to support the management of natural resources, including water, carbon and nutrients for environmental and production benefits. The Australian regional flux network (OzFlux) currently has 23 active sites and aims to provide a continental-scale national research facility to monitor and assess Australia's terrestrial biosphere and climate for improved predictions. Given the need for standardised and effective data processing of flux data, we have developed a software suite, called the Dynamic INtegrated Gap-filling and partitioning for OzFlux (DINGO), that enables gap-filling and partitioning of the primary fluxes into ecosystem respiration (Fre) and gross primary productivity (GPP) and subsequently provides diagnostics and results. We outline the processing pathways and methodologies that are applied in DINGO (v13) to OzFlux data, including (1) gap-filling of meteorological and other drivers; (2) gap-filling of fluxes using artificial neural networks; (3) the u* threshold determination; (4) partitioning into ecosystem respiration and gross primary productivity; (5) random, model and u* uncertainties; and (6) diagnostic, footprint calculation, summary and results outputs. DINGO was developed for Australian data, but the framework is applicable to any flux data or regional network. Quality data from robust systems like DINGO ensure the utility and uptake of the flux data and facilitate synergies between flux, remote sensing and modelling.
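A minimal version of the driver gap-filling in step (1) — interpolating across a dropout in a meteorological series — looks like this. DINGO itself also draws on alternative sites and ANN fills; this sketch shows only the simple interpolation idea, with a synthetic half-hourly temperature record:

```python
import numpy as np

def fill_gaps(series):
    """Linearly interpolate across NaN gaps in a 1-D driver series."""
    t = np.arange(len(series))
    ok = ~np.isnan(series)
    filled = series.copy()
    filled[~ok] = np.interp(t[~ok], t[ok], series[ok])
    return filled

t = np.arange(48)                       # half-hourly air temperature, one day
temp = 15.0 + 5.0 * np.sin(2 * np.pi * t / 48)
obs = temp.copy()
obs[20:24] = np.nan                     # a two-hour instrument dropout
rec = fill_gaps(obs)
print(np.isnan(rec).any(), np.max(np.abs(rec - temp)))
```

Gap-filled drivers like this are what then feed the artificial neural networks that fill the flux series in step (2).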
A Highly Stretchable and Robust Non-fluorinated Superhydrophobic Surface.
Ju, Jie; Yao, Xi; Hou, Xu; Liu, Qihan; Zhang, Yu Shrike; Khademhosseini, Ali
2017-08-21
A superhydrophobic surface simultaneously possessing exceptional stretchability, robustness, and non-fluorination is highly desirable in applications ranging from wearable devices to artificial skins. While conventional superhydrophobic surfaces typically feature stretchability, robustness, or non-fluorination individually, the co-existence of all these features still remains a great challenge. Here we report a multi-performance superhydrophobic surface achieved through incorporating hydrophilic micro-sized particles with a pre-stretched silicone elastomer. The commercial silicone elastomer (Ecoflex) endowed the resulting surface with high stretchability; the densely packed micro-sized particles in multi-layers contributed to the preservation of the large surface roughness even under large strains; and the physical encapsulation of the microparticles by the silicone elastomer, due to the capillary dragging effect and the chemical interaction between the hydrophilic silica and the elastomer, gave rise to the robust and non-fluorinated superhydrophobicity. It was demonstrated that the as-prepared fluorine-free surface could preserve its superhydrophobicity under repeated stretching-relaxing cycles. Most importantly, the surface's superhydrophobicity can be well maintained after a severe rubbing process, indicating wear resistance. Our novel superhydrophobic surface integrating multiple key properties, i.e. stretchability, robustness, and non-fluorination, is expected to provide unique advantages for a wide range of applications in biomedicine, energy, and electronics.
Implementing dashboards as a business intelligence tool in the forest inventory and analysis program
Scott A. Pugh; Randall S. Morin; Barbara A. Johnson
2015-01-01
Today is the era of “big data” where businesses have access to enormous amounts of often complex and sometimes unwieldy data. Businesses are using business intelligence (BI) systems to transform this data into useful information for management decisions. BI systems integrate applications, processes, data, and people to deliver prompt and robust analyses. A number of...
NASA Astrophysics Data System (ADS)
Hot, Aurélien; Weisser, Thomas; Cogan, Scott
2017-07-01
Uncertainty quantification is an integral part of the model validation process and is important to take into account during the design of mechanical systems. Sources of uncertainty are diverse but generally fall into two categories: aleatory, due to random processes, and epistemic, resulting from a lack of knowledge. This work focuses on the behavior of solar arrays in their stowed configuration. To avoid impacts during launch, snubbers are used to prestress the panels. Since the mechanical properties of the snubbers and the associated preload configurations are difficult to characterize precisely, an info-gap approach is proposed to investigate the influence of such uncertainties on design configurations obtained for different values of safety factors. This eventually allows one to revise the typical values of these factors and to reevaluate them with respect to a targeted robustness level. The proposed methodology is illustrated using a simplified finite element model of a solar array.
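An info-gap robustness evaluation of the kind described reduces, in the simplest scalar case, to asking how large the uncertainty horizon h can grow before the worst-case performance violates the requirement. The toy performance model and numbers below are illustrative stand-ins, not the paper's solar-array finite element model:

```python
import numpy as np

def robustness(perf, u_hat, r_crit, h_grid):
    """Info-gap robustness: largest horizon h such that the worst case of
    perf(u) over u in [u_hat - h, u_hat + h] still satisfies perf <= r_crit.
    Scanning h upward and stopping at the first failure is valid because
    the worst case can only grow as the interval widens."""
    best_h = 0.0
    for h in h_grid:
        u = np.linspace(u_hat - h, u_hat + h, 2001)
        if perf(u).max() <= r_crit:
            best_h = h
        else:
            break
    return best_h

perf = lambda u: u ** 2            # toy performance measure (e.g. a peak stress)
h_hat = robustness(perf, u_hat=1.0, r_crit=4.0, h_grid=np.linspace(0, 3, 601))
print(h_hat)                       # analytic answer: sqrt(4) - 1 = 1.0
```

Plotting ĥ against the required performance level gives the robustness curve used to trade safety factors against a targeted robustness level.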
Synthetic Biology and Microbial Fuel Cells: Towards Self-Sustaining Life Support Systems
NASA Technical Reports Server (NTRS)
Hogan, John Andrew
2014-01-01
NASA ARC and the J. Craig Venter Institute (JCVI) collaborated to investigate the development of advanced microbial fuel cells (MFCs) for biological wastewater treatment and electricity production (electrogenesis). Synthetic biology techniques and integrated hardware advances were investigated to increase system efficiency and robustness, with the intent of increasing power self-sufficiency and potential product formation from carbon dioxide. MFCs possess numerous advantages for space missions, including rapid processing, reduced biomass and effective removal of organics, nitrogen and phosphorus. Project efforts include developing space-based MFC concepts, integration analyses, increasing energy efficiency, and investigating novel bioelectrochemical system applications.
Spike processing with a graphene excitable laser
Shastri, Bhavin J.; Nahmias, Mitchell A.; Tait, Alexander N.; Rodriguez, Alejandro W.; Wu, Ben; Prucnal, Paul R.
2016-01-01
Novel materials and devices in photonics have the potential to revolutionize optical information processing, beyond conventional binary-logic approaches. Laser systems offer a rich repertoire of useful dynamical behaviors, including the excitable dynamics also found in the time-resolved “spiking” of neurons. Spiking reconciles the expressiveness and efficiency of analog processing with the robustness and scalability of digital processing. We demonstrate a unified platform for spike processing with a graphene-coupled laser system. We show that this platform can simultaneously exhibit logic-level restoration, cascadability and input-output isolation—fundamental challenges in optical information processing. We also implement low-level spike-processing tasks that are critical for higher level processing: temporal pattern detection and stable recurrent memory. We study these properties in the context of a fiber laser system and also propose and simulate an analogous integrated device. The addition of graphene leads to a number of advantages which stem from its unique properties, including high absorption and fast carrier relaxation. These could lead to significant speed and efficiency improvements in unconventional laser processing devices, and ongoing research on graphene microfabrication promises compatibility with integrated laser platforms. PMID:26753897
Neural circuits in Auditory and Audiovisual Memory
Plakke, B.; Romanski, L.M.
2016-01-01
Working memory is the ability to employ recently seen or heard stimuli and apply them to changing cognitive contexts. Although much is known about language processing and visual working memory, the neurobiological basis of auditory working memory is less clear. Historically, part of the problem has been the difficulty of obtaining a robust animal model in which to study auditory short-term memory. In recent years, neurophysiological and lesion studies have pointed to a cortical network involving both temporal and frontal cortices. Studies specifically targeting the role of the prefrontal cortex (PFC) in auditory working memory have suggested that dorsal and ventral prefrontal regions perform different roles during the processing of auditory mnemonic information, with the dorsolateral PFC performing similar functions for both auditory and visual working memory. In contrast, the ventrolateral PFC (VLPFC), which contains cells that respond robustly to auditory stimuli and that process both face and vocal stimuli, may be an essential locus for both auditory and audiovisual working memory. These findings suggest a critical role for the VLPFC in the processing, integration, and retention of communication information. PMID:26656069
Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration
NASA Astrophysics Data System (ADS)
Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.
2017-12-01
Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex, because each additional process added to a model brings inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty that can be quantified for parameters but not for MHU. Model inter-comparison projects do allow for some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and to interpret the results mechanistically: because such models combine sub-models of many systems and processes, each of which may be conceptualised and represented mathematically in various ways, it is not simple to determine exactly why a model produces the results it does or to identify which model assumptions are key. We present a novel modelling framework—the multi-assumption architecture and testbed (MAAT)—that automates the combination, generation, and execution of a model ensemble built with different representations of process. We argue that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyzer, PEcAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid efforts in robust data-model integration and to enhance our predictive understanding of biological systems.
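The core MAAT idea (automatically enumerating and running every combination of alternative process representations) can be sketched in a few lines. The two processes and their four toy representations below are hypothetical stand-ins, not MAAT's actual science modules:

```python
from itertools import product

# Hypothetical alternative representations of two model processes.
def photosynthesis_farquhar(env): return 0.7 * env["light"]
def photosynthesis_lue(env): return 0.5 * env["light"] + 1.0
def respiration_q10(env): return 2.0 ** ((env["temp"] - 25.0) / 10.0)
def respiration_linear(env): return 0.05 * env["temp"]

PROCESSES = {
    "photosynthesis": [photosynthesis_farquhar, photosynthesis_lue],
    "respiration": [respiration_q10, respiration_linear],
}

def run_ensemble(env):
    """Run every combination of process representations (the multi-assumption idea)."""
    results = {}
    for combo in product(*(PROCESSES[name] for name in PROCESSES)):
        label = "+".join(f.__name__ for f in combo)
        gpp, resp = (f(env) for f in combo)
        results[label] = gpp - resp  # net flux under this hypothesis set
    return results

ensemble = run_ensemble({"light": 10.0, "temp": 25.0})
```

The spread across the resulting 2 x 2 ensemble is a direct, quantifiable expression of model hypothesis uncertainty, separate from parametric uncertainty.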
NASA Technical Reports Server (NTRS)
Shin, Jong-Yeob; Belcastro, Christine
2008-01-01
Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. As a part of the validation process, this paper describes an analysis method for determining a reliable flight regime in the flight envelope within which an integrated resilient control system can achieve the desired performance of tracking command signals and detecting additive faults in the presence of parameter uncertainty and unmodeled dynamics. To calculate a reliable flight regime, a structured singular value analysis method is applied to analyze the closed-loop system over the entire flight envelope. To use the structured singular value analysis method, a linear fractional transformation (LFT) model of the longitudinal dynamics of a transport aircraft is developed over the flight envelope by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The developed LFT model can capture the original nonlinear dynamics over the flight envelope with the Δ block, which contains the key varying parameters (angle of attack and velocity) and the real parameter uncertainties (aerodynamic coefficient uncertainty and moment of inertia uncertainty). Using the developed LFT model and a formal robustness analysis method, a reliable flight regime is calculated for a transport aircraft closed-loop system.
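As a rough illustration of the "reliable flight regime" idea, the sketch below grids two scheduling parameters and samples the real uncertainties, keeping the flight points that remain stable for every sampled perturbation. This brute-force sampling is only a crude surrogate for the structured singular value computation in the paper, and the 2-state model and all coefficients are invented for illustration:

```python
import numpy as np

def closed_loop_stable(alpha, v, d_aero, d_inertia):
    # Hypothetical 2-state short-period closed loop; entries depend on the
    # scheduling parameters (alpha, v) and real uncertainties (d_aero, d_inertia).
    A = np.array([
        [-0.5 * (1 + d_aero) * v, 1.0],
        [-2.0 * (1 + d_aero) * v * alpha - 1.0, -0.8 / (1 + d_inertia)],
    ])
    return np.real(np.linalg.eigvals(A)).max() < 0.0

def reliable_region(alphas, velocities, delta=0.2, samples=50):
    """Flight points stable for all sampled uncertainties (sampling stand-in for mu)."""
    rng = np.random.default_rng(0)
    region = []
    for a in alphas:
        for v in velocities:
            perturbs = rng.uniform(-delta, delta, size=(samples, 2))
            if all(closed_loop_stable(a, v, da, di) for da, di in perturbs):
                region.append((a, v))
    return region

region = reliable_region(alphas=[0.05, 0.1], velocities=[0.5, 1.0])
```

A mu-based analysis replaces the sampling loop with a guaranteed bound over the whole Δ block, which is what makes the computed regime formally reliable rather than merely untested-counterexample-free.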
NASA Astrophysics Data System (ADS)
Chou, Shuo-Ju
2011-12-01
In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems providing critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are the tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty at critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level (TRL) metric employed by the TRA to describe the uncertainties of program technology elements provides only a qualitative, non-descriptive estimate of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed.
This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and their subsequent impacts on capability, budget, and schedule requirements led to the conclusion that an analysis process coupling a probabilistic technique such as Monte Carlo simulation with quantitative, parametric models of technology performance impact and of technology development time and cost requirements would allow the probabilities of meeting specific constraints on these requirements to be established. These probability-of-success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a multi-objective genetic algorithm optimization process and a computer-based decision support system, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition, as well as in the formulation of program development and risk management strategies. To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques into a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties.
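The Monte Carlo step described above reduces to sampling parametric cost and schedule models and counting the fraction of trials that satisfy the program constraints. The triangular distributions, budget, and schedule thresholds below are invented placeholders for the parametric models the methodology would supply:

```python
import random

def prob_requirements_met(n_trials=100_000, budget=120.0, schedule=36.0, seed=1):
    """P(development cost and time meet constraints) under assumed uncertainty models."""
    rng = random.Random(seed)
    met = 0
    for _ in range(n_trials):
        # Hypothetical triangular models: (low, high, mode) for cost ($M) and time (months).
        cost = rng.triangular(80.0, 160.0, 100.0)
        months = rng.triangular(24.0, 48.0, 30.0)
        if cost <= budget and months <= schedule:
            met += 1
    return met / n_trials

p = prob_requirements_met()
```

The resulting probability is exactly the kind of quantitative requirements-robustness metric the text proposes feeding into the genetic algorithm optimizer and decision support system.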
In order to demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of the method was performed on a notional program for acquiring a carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of the methodology, as well as limitations that should be addressed in the future to narrow the gap between the current state and the desired state.
A robust and hierarchical approach for the automatic co-registration of intensity and visible images
NASA Astrophysics Data System (ADS)
González-Aguilera, Diego; Rodríguez-Gonzálvez, Pablo; Hernández-López, David; Luis Lerma, José
2012-09-01
This paper presents a new robust approach to integrate intensity and visible images which have been acquired with a terrestrial laser scanner and a calibrated digital camera, respectively. In particular, an automatic and hierarchical method for the co-registration of both sensors is developed. The approach integrates several existing solutions to improve the performance of the co-registration between range-based and visible images: the Affine Scale-Invariant Feature Transform (A-SIFT), the epipolar geometry, the collinearity equations, the Groebner basis solution and the RANdom SAmple Consensus (RANSAC), integrating a voting scheme. The approach presented herein improves the existing co-registration approaches in automation, robustness, reliability and accuracy.
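Of the components listed above, the RANSAC voting scheme is the most self-contained to illustrate. The sketch below applies it to 2D line fitting rather than the paper's camera-to-scanner registration model, so the model, tolerance, and data are illustrative only; the sample-hypothesize-vote structure is the same:

```python
import random

def ransac_line(points, n_iter=500, tol=0.1, seed=0):
    """Fit y = m*x + b robustly: random minimal samples, then consensus voting."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)   # minimal sample for a line
        if x1 == x2:
            continue
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        inliers = [(x, y) for x, y in points if abs(y - (m * x + b)) < tol]
        if len(inliers) > len(best_inliers):          # voting: keep the best consensus
            best_model, best_inliers = (m, b), inliers
    return best_model, best_inliers

# 20 points on y = 2x + 1 plus two gross outliers (mimicking false A-SIFT matches).
pts = [(x * 0.1, 2.0 * x * 0.1 + 1.0) for x in range(20)] + [(0.5, 9.0), (1.2, -3.0)]
model, inliers = ransac_line(pts)
```

In the co-registration pipeline the minimal sample would instead feed the Groebner basis pose solver, and the residual test would use the collinearity or epipolar constraints, but the outlier rejection logic is unchanged.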
Robust high-performance control for robotic manipulators
NASA Technical Reports Server (NTRS)
Seraji, H.
1989-01-01
A robust control scheme to accomplish accurate trajectory tracking for an integrated system of manipulator-plus-actuators is proposed. The control scheme comprises a feedforward and a feedback controller. The feedforward controller contains any known part of the manipulator dynamics that can be used for online control. The feedback controller consists of adaptive position and velocity feedback gains and an auxiliary signal which is simply generated by a fixed-gain proportional/integral/derivative controller. The feedback controller is updated by very simple adaptation laws which contain both proportional and integral adaptation terms. By introducing a simple sigma modification into the adaptation laws, robustness is guaranteed in the presence of unmodeled dynamics and disturbances.
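The sigma modification amounts to adding a small leakage term to the integral adaptation law so the adaptive gain cannot drift unboundedly under persistent disturbances. The sketch below reduces the manipulator to a hypothetical first-order single-joint plant and keeps only a proportional fixed-gain term, so it illustrates the adaptation mechanism rather than the paper's full scheme:

```python
import numpy as np

def simulate(gamma=5.0, sigma=0.05, dt=1e-3, t_end=10.0):
    """Adaptive error feedback k(t) with a sigma-modification leak term."""
    x, k = 0.0, 0.0
    kp_fixed = 2.0                          # fixed-gain auxiliary controller (P term only)
    errs = []
    for i in range(int(t_end / dt)):
        e = 1.0 - x                          # step-command tracking error
        u = kp_fixed * e + k * e             # auxiliary signal + adaptive feedback
        d = 0.2 * np.sin(5.0 * i * dt)       # unmodeled disturbance
        x += dt * (-x + u + d)               # first-order stand-in for one joint
        k += dt * (gamma * e * e - sigma * k)  # integral adaptation with sigma leakage
        errs.append(abs(e))
    return k, float(np.mean(errs[-1000:]))

k_final, tail_err = simulate()
```

With `sigma = 0`, the same disturbance would make `k` ratchet upward without bound; the `-sigma * k` leak trades a small steady-state bias for that boundedness, which is the robustness guarantee in miniature.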
Engineering Robustness of Microbial Cell Factories.
Gong, Zhiwei; Nielsen, Jens; Zhou, Yongjin J
2017-10-01
Metabolic engineering and synthetic biology offer great prospects in developing microbial cell factories capable of converting renewable feedstocks into fuels, chemicals, food ingredients, and pharmaceuticals. However, prohibitively low production rates and mass concentrations remain the major hurdles in industrial processes, even when the biosynthetic pathways are comprehensively optimized. These limitations are caused by a variety of factors unfavorable to host cell survival, such as harsh industrial conditions, fermentation inhibitors from biomass hydrolysates, and toxic compounds including metabolic intermediates and valuable target products. Therefore, engineered microbes with robust phenotypes are essential for achieving higher yield and productivity. In this review, recent advances in engineering the robustness and tolerance of cell factories to cope with these issues are described, and novel strategies with great potential to enhance the robustness of cell factories are briefly introduced, including metabolic pathway balancing, transporter engineering, and adaptive laboratory evolution. This review also highlights the integration of advanced systems and synthetic biology principles toward engineering the harmony of overall cell function, rather than specific pathways or enzymes alone. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Nie, Chuanxiong; Peng, Zihang; Yang, Ye; Cheng, Chong; Ma, Lang; Zhao, Changsheng
2016-11-15
Developing robust and recyclable absorbents for water purification is in great demand to control water pollution and to provide sustainable water resources. Herein, for the first time, we report the fabrication of Kevlar nanofiber (KNF) based composite particles for water purification. Both the KNF and KNF-carbon nanotube composite particles can be produced at large scale by automatic injection of a casting solution into ethanol. The resulting nanofibrous particles showed high adsorption capacities towards various pollutants, including metal ions, phenolic compounds, and various dyes. Meanwhile, the adsorption process towards dyes was found to fit well with the pseudo-second-order model, while the adsorption speed was controlled by intraparticle diffusion. Furthermore, the adsorption capacities of the nanofibrous particles could be easily recovered by washing with ethanol. In general, the KNF-based particles integrate the advantages of easy production, robust and effective adsorption performance, and good recyclability, and can be used as robust absorbents to remove toxic molecules and to advance the application of absorbents in water purification. Copyright © 2016 Elsevier B.V. All rights reserved.
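The pseudo-second-order model mentioned above is commonly fitted in its linearized form, t/q(t) = 1/(k*qe^2) + t/qe, so a straight-line fit of t/q against t recovers the equilibrium capacity qe and the rate constant k. The uptake data below are synthetic (generated from assumed qe and k), not the paper's measurements:

```python
import numpy as np

def pseudo_second_order(t, qe, k):
    # q(t) = k*qe^2*t / (1 + k*qe*t)  — integrated pseudo-second-order kinetics
    return k * qe ** 2 * t / (1.0 + k * qe * t)

# Synthetic uptake data q (mg/g) at times t (min), from assumed qe and k.
t = np.array([5.0, 10, 20, 40, 60, 90, 120])
q = pseudo_second_order(t, qe=150.0, k=0.002)

# Linearized fit: t/q = 1/(k*qe^2) + t/qe
slope, intercept = np.polyfit(t, t / q, 1)
qe_fit = 1.0 / slope              # slope = 1/qe
k_fit = slope ** 2 / intercept    # intercept = 1/(k*qe^2)
```

A high linear correlation of t/q versus t is the usual evidence that adsorption follows pseudo-second-order kinetics, as reported for the dye data here.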
He, Yan-Lin; Xu, Yuan; Geng, Zhi-Qiang; Zhu, Qun-Xiong
2016-03-01
In this paper, a hybrid robust model based on an improved functional link neural network integrated with partial least squares (IFLNN-PLS) is proposed. Firstly, an improved functional link neural network with a small norm of expanded weights and high input-output correlation (SNEWHIOC-FLNN) is proposed to enhance the generalization performance of the FLNN. Unlike in the traditional FLNN, the expanded variables of the original inputs are not directly used as the inputs in the proposed SNEWHIOC-FLNN model. The original inputs are attached to some small norm of expanded weights. As a result, the correlation coefficient between some of the expanded variables and the outputs is enhanced. The larger the correlation coefficient is, the more relevant the expanded variables tend to be. In the end, the expanded variables with larger correlation coefficients are selected as the inputs to improve the performance of the traditional FLNN. In order to test the proposed SNEWHIOC-FLNN model, three UCI (University of California, Irvine) regression datasets named Housing, Concrete Compressive Strength (CCS), and Yacht Hydrodynamics (YHD) are selected. Then a hybrid model based on the improved FLNN integrated with partial least squares (IFLNN-PLS) is built. In the IFLNN-PLS model, the connection weights are calculated using the partial least squares method rather than the error back-propagation algorithm. Lastly, IFLNN-PLS is developed as an intelligent measurement model for accurately predicting the key variables in the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. Simulation results illustrate that IFLNN-PLS can significantly improve the prediction performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
Integrative analyses of human reprogramming reveal dynamic nature of induced pluripotency
Cacchiarelli, Davide; Trapnell, Cole; Ziller, Michael J.; Soumillon, Magali; Cesana, Marcella; Karnik, Rahul; Donaghey, Julie; Smith, Zachary D.; Ratanasirintrawoot, Sutheera; Zhang, Xiaolan; Ho Sui, Shannan J.; Wu, Zhaoting; Akopian, Veronika; Gifford, Casey A.; Doench, John; Rinn, John L.; Daley, George Q.; Meissner, Alexander; Lander, Eric S.; Mikkelsen, Tarjei S.
2015-01-01
Summary Induced pluripotency is a promising avenue for disease modeling and therapy, but the molecular principles underlying this process, particularly in human cells, remain poorly understood due to donor-to-donor variability and intercellular heterogeneity. Here we constructed and characterized a clonal, inducible human reprogramming system that provides a reliable source of cells at any stage of the process. This system enabled integrative transcriptional and epigenomic analysis across the human reprogramming timeline at high resolution. We observed distinct waves of gene network activation, including the ordered reactivation of broad developmental regulators followed by early embryonic patterning genes and culminating in the emergence of a signature reminiscent of pre-implantation stages. Moreover, complementary functional analyses allowed us to identify and validate novel regulators of the reprogramming process. Altogether, this study sheds light on the molecular underpinnings of induced pluripotency in human cells and provides a robust cell platform for further studies. PMID:26186193
Master of Puppets: Cooperative Multitasking for In Situ Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukic, Zarija
2016-01-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables, we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
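The cooperative-multitasking pattern can be illustrated with Python generator coroutines, which interleave a producer and a consumer in one process much as Henson interleaves compiled simulation and analysis codes (Henson itself works with coroutines and position-independent executables at the systems level; this is only an analogy, with stand-in data):

```python
def simulation(n_steps):
    """Produce one time step of (stand-in) field data at a time, yielding in between."""
    for step in range(n_steps):
        yield step, [step * 0.1 * i for i in range(4)]

def analysis():
    """Coroutine that consumes steps as they arrive, keeping a running maximum."""
    peak = float("-inf")
    while True:
        step, data = yield peak        # hand control back until the next step arrives
        peak = max(peak, *data)

def couple(n_steps=5):
    """Cooperative multitasking in one process: simulation and analysis take turns."""
    sink = analysis()
    next(sink)                         # prime the coroutine up to its first yield
    peak = None
    for step, data in simulation(n_steps):
        peak = sink.send((step, data)) # run analysis on this step, then resume here
    return peak

peak = couple()
```

Because the analysis sees each step while it is "in memory" and nothing is written out between steps, this captures the essential advantage over store-then-post-process workflows.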
Muñoz, C; Young, H; Antileo, C; Bornhardt, C
2009-01-01
This paper presents a sliding mode controller (SMC) for dissolved oxygen (DO) in an integrated nitrogen removal process carried out in a suspended biomass sequencing batch reactor (SBR). The SMC performance was compared against an auto-tuning PI controller with parameters adjusted at the beginning of the batch cycle. A method for cancelling the slow DO sensor dynamics was implemented by using a first-order model of the sensor. Tests in a lab-scale reactor showed that the SMC offers a better disturbance rejection capability than the auto-tuning PI controller, furthermore providing reasonable performance over a wide range of operation. Thus, SMC becomes an effective and robust nonlinear tool for DO control in this process, while remaining simple from a computational point of view, allowing its implementation in devices such as industrial programmable logic controllers (PLCs).
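A minimal sliding mode DO loop can be sketched as below, using a standard mass-balance form d(DO)/dt = kLa*(DO_sat - DO) - OUR(t) with the oxygen transfer coefficient kLa as the manipulated input. The model, gains, and the boundary-layer saturation (in place of the discontinuous sign function) are illustrative assumptions, not the authors' reactor model:

```python
import math

def sat(x):
    """Boundary-layer saturation used instead of sign() to avoid chattering."""
    return max(-1.0, min(1.0, x))

def simulate_do(t_end=2.0, dt=1e-4, setpoint=2.0, k_gain=50.0, phi=0.05):
    """DO balance d(DO)/dt = kLa*(DO_sat - DO) - OUR(t), with kLa as the control."""
    do_sat, do = 8.0, 6.0                        # mg/L
    errs = []
    n = int(t_end / dt)
    for i in range(n):
        our = 20.0 + 5.0 * math.sin(3.0 * i * dt)   # time-varying oxygen uptake rate
        s = do - setpoint                            # sliding surface
        kla = (20.0 - k_gain * sat(s / phi)) / (do_sat - do)  # nominal + switching term
        kla = max(0.0, min(100.0, kla))              # aeration actuator limits
        do += dt * (kla * (do_sat - do) - our)
        errs.append(abs(s))
    return do, max(errs[n // 2:])

do_final, tail_err = simulate_do()
```

The switching term overpowers the unknown, time-varying OUR without needing its model, which is the disturbance-rejection property the lab-scale comparison against the PI controller highlights; the loop is also simple enough for a PLC scan cycle.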
Gsflow-py: An integrated hydrologic model development tool
NASA Astrophysics Data System (ADS)
Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.
2017-12-01
Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHMs) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model-resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorological parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHMs has required extensive GIS and computer programming expertise, which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular-grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverage, and meteorological distribution over the model domain.
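The cascade-routing step mentioned above can be reduced to its simplest form: for each DEM cell, route flow to the lowest of its eight neighbors. The toy 3 x 3 DEM and the single-receiver rule below are a deliberate simplification of what the GSFLOW toolkit does (it must also handle multi-cell cascades, sinks, and stream connections):

```python
import numpy as np

def d8_cascades(dem):
    """Route each cell to its lowest neighbor (simplified D8-style cascade routing)."""
    rows, cols = dem.shape
    links = {}
    for r in range(rows):
        for c in range(cols):
            best = None
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (dr, dc) != (0, 0) and 0 <= rr < rows and 0 <= cc < cols:
                        if best is None or dem[rr, cc] < dem[best]:
                            best = (rr, cc)
            if dem[best] < dem[r, c]:
                links[(r, c)] = best     # downhill receiver
            # else: the cell is a local sink or the domain outlet
    return links

dem = np.array([[3.0, 2.5, 2.0],
                [2.8, 2.0, 1.5],
                [2.5, 1.8, 1.0]])
links = d8_cascades(dem)
```

Keeping this topology consistent with the model-resolution DEM is exactly the consistency requirement the abstract flags for robust surface water and groundwater exchange simulation.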
Self-paced model learning for robust visual tracking
NASA Astrophysics Data System (ADS)
Huang, Wenhui; Gu, Jason; Ma, Xin; Li, Yibin
2017-01-01
In visual tracking, learning a robust and efficient appearance model is a challenging task. Model learning determines both the strategy and the frequency of model updating, which involves many details that can affect the tracking results. Self-paced learning (SPL) has recently been attracting considerable interest in the fields of machine learning and computer vision. SPL is inspired by the learning principle underlying the cognitive process of humans, whose learning generally proceeds from easier samples to more complex aspects of a task. We propose a tracking method that integrates the learning paradigm of SPL into visual tracking, so that reliable samples can be automatically selected for model learning. In contrast to many existing model learning strategies in visual tracking, we discover the missing link between sample selection and model learning, which are combined into a single objective function in our approach. Sample weights and model parameters can be learned by minimizing this single objective function. Additionally, to solve for the real-valued learning weights of samples, an error-tolerant self-paced function that considers the characteristics of visual tracking is proposed. We demonstrate the robustness and efficiency of our tracker on a recent tracking benchmark data set with 50 video sequences.
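The alternating structure of SPL (optimize sample weights given the model, then the model given the weights, while a "pace" parameter gradually admits harder samples) can be shown on a toy location-estimation problem. The hard 0/1 weighting below is the classic SPL regularizer, not the paper's error-tolerant real-valued one, and the data are synthetic:

```python
import numpy as np

def spl_mean(samples, lam0=1.0, growth=2.0, rounds=5):
    """Estimate a location parameter, admitting samples from easy to hard (SPL)."""
    theta = np.median(samples)                  # crude robust initialization
    lam = lam0                                  # pace parameter ("model age")
    for _ in range(rounds):
        losses = (samples - theta) ** 2
        weights = (losses < lam).astype(float)  # hard SPL step: keep easy samples only
        if weights.sum() > 0:
            theta = (weights * samples).sum() / weights.sum()  # refit on easy samples
        lam *= growth                           # gradually admit harder samples
    return theta

rng = np.random.default_rng(0)
inliers = rng.normal(5.0, 0.3, size=200)        # reliable samples near the target
outliers = rng.uniform(20.0, 30.0, size=20)     # corrupted samples (e.g. occlusions)
theta = spl_mean(np.concatenate([inliers, outliers]))
```

Because the corrupted samples never fall below the loss threshold, they never contaminate the model update: this is the sample-selection/model-learning coupling that the tracker exploits to avoid drifting onto occluders.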
Thalamocortical mechanisms for integrating musical tone and rhythm
Musacchia, Gabriella; Large, Edward
2014-01-01
Studies over several decades have identified many of the neuronal substrates of music perception by pursuing pitch and rhythm perception separately. Here, we address the question of how these mechanisms interact, starting with the observation that the peripheral pathways of the so-called “Core” and “Matrix” thalamocortical system provide the anatomical bases for tone and rhythm channels. We then examine the hypothesis that these specialized inputs integrate tonal content within rhythm context in auditory cortex using classical types of “driving” and “modulatory” mechanisms. This hypothesis provides a framework for deriving testable predictions about the early stages of music processing. Furthermore, because thalamocortical circuits are shared by speech and music processing, such a model provides concrete implications for how music experience contributes to the development of robust speech encoding mechanisms. PMID:24103509
NASA Technical Reports Server (NTRS)
Hale, Mark A.
1996-01-01
Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure: with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. 
An architecture for exploring design called DREAMS (Developing Robust Engineering Analysis Models and Specifications) has been developed which supports the activities of both meta-design and actual design execution. This is accomplished through a systematic process comprising the stages of Formulation, Translation, and Evaluation. During this process, elements from a Design Specification are integrated into Design Processes. In addition, a software infrastructure was developed, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment). This represents a virtual apparatus in the Design Experiment conducted in this research. IMAGE is an innovative architecture because it explicitly supports design-related activities. This is accomplished through a GUI-driven and agent-based implementation of DREAMS. An HSCT design has been adopted from the Framework for Interdisciplinary Design Optimization (FIDO) and is implemented in IMAGE. This problem shows how Design Processes and Specification interact in a design system. In addition, the problem utilizes two different solution models concurrently: optimal and satisfying. The satisfying model allows for more design flexibility and allows a designer to maintain design freedom. As a result of following this experimental procedure, this infrastructure is an open system that is robust to changes in both corporate needs and computer technologies. The development of this infrastructure leads to a number of significant intellectual contributions: 1) A new approach to implementing IPPD with the aid of a computer; 2) A formal Design Experiment; 3) A combined Process and Specification architecture that is language-based; 4) An infrastructure for exploring design; 5) An integration strategy for implementing computer resources; and 6) A seamless modeling language. The need for these contributions is emphasized by the demand by industry and government agencies for the development of these technologies.
Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E.; Tkachenko, Valery; Torcivia-Rodriguez, John; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja
2016-01-01
The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu PMID:26989153
Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja
2016-01-01
NASA Astrophysics Data System (ADS)
Fan, X. Z.; Naves, L.; Siwak, N. P.; Brown, A.; Culver, J.; Ghodssi, R.
2015-05-01
A novel virus-like particle (TMV-VLP) receptor layer has been integrated with an optical microdisk resonator transducer for biosensing applications. This bioreceptor layer is functionalized with selective peptides that encode unique recognition affinities. Integration of bioreceptors with sensor platforms is very challenging due to their very different compatibility regimes. The TMV-VLP nanoreceptor exhibits integration robustness, including the ability for self-assembly alongside traditional top-down microfabrication processes. An optical microdisk resonator has been functionalized for antibody binding with this receptor, demonstrating resonant wavelength shifts (Δλo) of 0.79 nm and 5.95 nm after primary antibody binding and enzyme-linked immunosorbent assay (ELISA), respectively, illustrating label-free sensing of this binding event. This demonstration of label-free sensing with genetically engineered TMV-VLP shows the flexibility and utility of this receptor coating when considering integration with other existing transducer platforms.
Process Performance of Optima XEx Single Wafer High Energy Implanter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, J. H.; Yoon, Jongyoon; Kondratenko, S.
2011-01-07
To meet the process requirements for well formation in future CMOS memory production, high energy implanters require more robust angle, dose, and energy control while maintaining high productivity. The Optima XEx high energy implanter meets these requirements by integrating a traditional LINAC beamline with a robust single wafer handling system. To achieve beam angle control, Optima XEx can control both the horizontal and vertical beam angles to within 0.1 degrees using advanced beam angle measurement and correction. Accurate energy calibration and energy trim functions accelerate process matching by eliminating energy calibration errors. The large volume process chamber and UDC (upstream dose control) using Faraday cups outside of the process chamber precisely control implant dose regardless of any chamber pressure increase due to PR (photoresist) outgassing. An optimized RF LINAC accelerator improves reliability and enables singly charged phosphorus and boron energies up to 1200 keV and 1500 keV, respectively, with higher beam currents. A new single wafer endstation combined with increased beam performance leads to overall increased productivity. We report on the advanced performance of Optima XEx observed during tool installation and volume production at an advanced memory fab.
León, Larry F; Cai, Tianxi
2012-04-01
In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.
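The core idea above, simulating null realizations of a cumulative-sum residual process by perturbing residuals with zero-mean Gaussian multipliers (in the spirit of Lin, Wei & Ying), can be sketched as follows. This is a simplified, uncensored illustration; `cumsum_process` and `multiplier_pvalue` are hypothetical names, not the authors' implementation.

```python
import numpy as np

def cumsum_process(z, residuals):
    """Cumulative sum of residuals over the ordered covariate z, scaled by sqrt(n)."""
    order = np.argsort(z)
    return np.cumsum(residuals[order]) / np.sqrt(len(z))

def multiplier_pvalue(z, residuals, n_sim=1000, seed=0):
    """Approximate the null distribution of sup|W(z)| by perturbing the
    residuals with iid standard normal multipliers; returns the observed
    sup statistic and an approximate p-value."""
    rng = np.random.default_rng(seed)
    obs = np.max(np.abs(cumsum_process(z, residuals)))
    sims = np.empty(n_sim)
    for b in range(n_sim):
        g = rng.standard_normal(len(z))  # zero-mean multipliers destroy any trend
        sims[b] = np.max(np.abs(cumsum_process(z, residuals * g)))
    return obs, np.mean(sims >= obs)
```

A systematic trend in the residuals (model misspecification) inflates the observed sup statistic relative to the multiplier realizations, yielding a small p-value; pure noise residuals do not.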
System interface for an integrated intelligent safety system (ISS) for vehicle applications.
Hannan, Mahammad A; Hussain, Aini; Samad, Salina A
2010-01-01
This paper deals with the interface-relevant activity of a vehicle integrated intelligent safety system (ISS) that includes an airbag deployment decision system (ADDS) and a tire pressure monitoring system (TPMS). A program is developed in LabWindows/CVI, using C for prototype implementation. The prototype is primarily concerned with the interconnection between hardware objects such as a load cell, web camera, accelerometer, TPM tire module and receiver module, DAQ card, CPU card and a touch screen. Several safety subsystems, including image processing, weight sensing and crash detection systems, are integrated, and their outputs are combined to yield intelligent decisions regarding airbag deployment. The integrated safety system also monitors tire pressure and temperature. Testing and experimentation with this ISS suggests that the system is unique, robust, intelligent, and appropriate for in-vehicle applications.
NASA Technical Reports Server (NTRS)
Rakoczy, John; Heater, Daniel; Lee, Ashley
2013-01-01
Marshall Space Flight Center's (MSFC) Small Projects Rapid Integration and Test Environment (SPRITE) is a Hardware-In-The-Loop (HWIL) facility that provides rapid development, integration, and testing capabilities for small projects (CubeSats, payloads, spacecraft, and launch vehicles). This facility environment focuses on efficient processes and modular design to support rapid prototyping, integration, testing and verification of small projects at an affordable cost, especially compared to larger type HWIL facilities. SPRITE (Figure 1) consists of a "core" capability or "plant" simulation platform utilizing a graphical programming environment capable of being rapidly re-configured for any potential test article's space environments, as well as a standard set of interfaces (i.e. Mil-Std 1553, Serial, Analog, Digital, etc.). SPRITE also allows this level of interface testing of components and subsystems very early in a program, thereby reducing program risk.
NASA Astrophysics Data System (ADS)
Burton, A. R.; Lynch, J. P.; Kurata, M.; Law, K. H.
2017-09-01
Multifunctional thin film materials have opened many opportunities for novel sensing strategies for structural health monitoring. While past work has established methods of optimizing multifunctional materials to exhibit sensing properties, comparatively less work has focused on their integration into fully functional sensing systems capable of being deployed in the field. This study focuses on the advancement of a scalable fabrication process for the integration of multifunctional thin films into a fully integrated sensing system. This is achieved through the development of an optimized fabrication process that can create a broad range of sensing systems using multifunctional materials. A layer-by-layer deposited multifunctional composite, consisting of single walled carbon nanotubes (SWNT) in a polyvinyl alcohol and polysodium-4-styrene sulfonate matrix, is incorporated with a lithography process to produce a fully integrated sensing system deposited on a flexible substrate. To illustrate the process, a strain sensing platform consisting of a patterned SWNT-composite thin film as a strain-sensitive element within an amplified Wheatstone bridge sensing circuit is presented. Strain sensing is selected because it presents many of the design and processing challenges that are core to patterning multifunctional thin film materials into sensing systems. Strain sensors fabricated on a flexible polyimide substrate are experimentally tested under cyclic loading using standard four-point bending coupons and a partial-scale steel frame assembly under lateral loading. The study reveals the material process is highly repeatable, producing fully integrated strain sensors with linearity and sensitivity exceeding 0.99 and 5 V/ε, respectively. The thin film strain sensors are robust and capable of high strain measurements beyond 3000 με.
NASA Technical Reports Server (NTRS)
Li, Jun; Cassell, Alan; Koehne, Jessica; Chen, Hua; Ng, Hou Tee; Ye, Qi; Stevens, Ramsey; Han, Jie; Meyyappan, M.
2003-01-01
We report on our recent breakthroughs in two different applications using well-aligned carbon nanotube (CNT) arrays on Si chips: (1) a novel processing solution for highly robust electrical interconnects in integrated circuit manufacturing, and (2) the development of ultrasensitive electrochemical DNA sensors. Both rely on the invention of a bottom-up fabrication scheme with six steps: (a) lithographic patterning, (b) depositing bottom conducting contacts, (c) depositing metal catalysts, (d) CNT growth by plasma enhanced chemical vapor deposition (PECVD), (e) dielectric gap-filling, and (f) chemical mechanical polishing (CMP). This process produces a stable planarized surface with only the open ends of the CNTs exposed, which can be further processed or modified for different applications. By depositing patterned top contacts, the CNTs can serve as vertical interconnects between the two conducting layers. This method is fundamentally different from current damascene processes and avoids problems associated with etching and filling of high aspect ratio holes at the nanoscale. In addition, multiwalled CNTs (MWCNTs) are highly robust and can carry a current density of 10^9 A/cm^2 without degradation, giving the approach great potential to help extend current Si technology. The embedded MWCNT array without the top contact layer can also be used as a nanoelectrode array in electrochemical biosensors, dramatically improving the cell time constant and sensitivity. By functionalizing the tube ends with specific oligonucleotide probes, specific DNA targets can be detected electrochemically down to subattomole levels.
Schwaibold, M; Schöchlin, J; Bolz, A
2002-01-01
For classification tasks in biosignal processing, several strategies and algorithms can be used. Knowledge-based systems allow prior knowledge about the decision process to be integrated, both by the developer and by self-learning capabilities. For the classification stages in a sleep stage detection framework, three inference strategies were compared regarding their specific strengths: a classical signal processing approach, artificial neural networks and neuro-fuzzy systems. Methodological aspects were assessed to attain optimum performance and maximum transparency for the user. Due to their effective and robust learning behavior, artificial neural networks could be recommended for pattern recognition, while neuro-fuzzy systems performed best for the processing of contextual information.
A Data-Driven Solution for Performance Improvement
NASA Technical Reports Server (NTRS)
2002-01-01
Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user- friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.
Artificial Neuron Based on Integrated Semiconductor Quantum Dot Mode-Locked Lasers
NASA Astrophysics Data System (ADS)
Mesaritakis, Charis; Kapsalis, Alexandros; Bogris, Adonis; Syvridis, Dimitris
2016-12-01
Neuro-inspired implementations have attracted strong interest as a power efficient and robust alternative to the digital model of computation with a broad range of applications. In particular, neuro-mimetic systems able to produce and process spike-encoding schemes can offer merits like high noise-resiliency and increased computational efficiency. Towards this direction, integrated photonics can be an auspicious platform due to its multi-GHz bandwidth, its high wall-plug efficiency and the strong similarity of its dynamics under excitation with biological spiking neurons. Here, we propose an integrated all-optical neuron based on an InAs/InGaAs semiconductor quantum-dot passively mode-locked laser. The multi-band emission capabilities of these lasers allow, through waveband switching, the emulation of the excitation and inhibition modes of operation. Frequency-response effects, similar to biological neural circuits, are observed just as in a typical two-section excitable laser. The demonstrated optical building block can pave the way for high-speed photonic integrated systems able to address tasks ranging from pattern recognition to cognitive spectrum management and multi-sensory data processing.
Joining and Integration of Silicon Carbide-Based Materials for High Temperature Applications
NASA Technical Reports Server (NTRS)
Halbig, Michael C.; Singh, Mrityunjay
2016-01-01
Advanced joining and integration technologies of silicon carbide-based ceramics and ceramic matrix composites are enabling for their implementation into wide scale aerospace and ground-based applications. The robust joining and integration technologies allow for large and complex shapes to be fabricated and integrated with the larger system. Potential aerospace applications include lean-direct fuel injectors, thermal actuators, turbine vanes, blades, shrouds, combustor liners and other hot section components. Ground based applications include components for energy and environmental systems. Performance requirements and processing challenges are identified for the successful implementation of different joining technologies. An overview will be provided of several joining approaches which have been developed for high temperature applications. In addition, various characterization approaches were pursued to provide an understanding of the processing-microstructure-property relationships. Microstructural analysis of the joint interfaces was conducted using optical, scanning electron, and transmission electron microscopy to identify phases and evaluate the bond quality. Mechanical testing results will be presented along with the need for new standardized test methods. The critical need for tailoring interlayer compositions for optimum joint properties will also be highlighted.
A robust real-time abnormal region detection framework from capsule endoscopy images
NASA Astrophysics Data System (ADS)
Cheng, Yanfen; Liu, Xu; Li, Huiping
2009-02-01
In this paper we present a novel method to detect abnormal regions in capsule endoscopy images. Wireless Capsule Endoscopy (WCE) is a recent technology in which a capsule with an embedded camera is swallowed by the patient to visualize the gastrointestinal tract. One challenge is that a single diagnostic procedure produces over 50,000 images, making the physicians' reviewing process expensive: it involves identifying images containing abnormal regions (tumor, bleeding, etc.) within this large image sequence. In this paper we construct a novel framework for robust, real-time abnormal region detection from large numbers of capsule endoscopy images. Detected potential abnormal regions can be labeled automatically for further physician review, thereby reducing the overall reviewing effort. Our abnormal region detection framework has the following advantages: 1) Trainable. Users can define and label any type of abnormal region they want to find; abnormal regions, such as tumor or bleeding, can be pre-defined and labeled using the graphical user interface tool we provide. 2) Efficient. Given the large volume of image data, detection speed is very important; our system can detect very efficiently at different scales thanks to the integral image features we use. 3) Robust. After feature selection we use a cascade of classifiers to further enforce detection accuracy.
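The integral image features mentioned in point 2 are what make multi-scale scanning affordable: after one linear pass, the sum of any rectangular region is available in constant time from four table lookups. A minimal sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero border: ii[r, c] = sum of img[:r, :c]."""
    return np.pad(img, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)

def region_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] in O(1) via four lookups in the table."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]
```

Because `region_sum` costs the same regardless of rectangle size, rectangle-based features can be evaluated at every scale without rescaling the image.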
Managing biotechnology in a network-model health plan: a U.S. private payer perspective.
Watkins, John B; Choudhury, Sanchita Roy; Wong, Ed; Sullivan, Sean D
2006-01-01
Emerging biotechnology poses challenges to payers, including access, coverage, reimbursement, patient selection, and affordability. Premera Blue Cross, a private regional health plan, developed an integrated cross-functional approach to managing biologics, built around a robust formulary process that is fast, flexible, fair, and transparent to stakeholders. Results are monitored by cost and use reporting from merged pharmacy and medical claims. Utilization management and case management strategies will integrate with specialty pharmacy programs to improve outcomes and cost-effectiveness. Creative approaches to provider reimbursement can align providers' incentives with those of the plan. Redesign of member benefits can also encourage appropriate use of biotechnology.
Integrating Fiber Optic Strain Sensors into Metal Using Ultrasonic Additive Manufacturing
NASA Astrophysics Data System (ADS)
Hehr, Adam; Norfolk, Mark; Wenning, Justin; Sheridan, John; Leser, Paul; Leser, Patrick; Newman, John A.
2018-03-01
Ultrasonic additive manufacturing, a rather new three-dimensional (3D) printing technology, uses ultrasonic energy to produce metallurgical bonds between layers of metal foils near room temperature. This low temperature attribute of the process enables integration of temperature sensitive components, such as fiber optic strain sensors, directly into metal structures. This may be an enabling technology for Digital Twin applications, i.e., virtual model interaction and feedback with live load data. This study evaluates the consolidation quality, interface robustness, and load sensing limits of commercially available fiber optic strain sensors embedded into aluminum alloy 6061. Lastly, an outlook on the technology and its applications is described.
An integrated microcombustor and photonic crystal emitter for thermophotovoltaics
NASA Astrophysics Data System (ADS)
Chan, Walker R.; Stelmakh, Veronika; Allmon, William R.; Waits, Christopher M.; Soljacic, Marin; Joannopoulos, John D.; Celanovic, Ivan
2016-11-01
Thermophotovoltaic (TPV) energy conversion is appealing for portable millimeter-scale generators because of its simplicity, but it relies on high temperatures. The performance and reliability of the high-temperature components, a microcombustor and a photonic crystal emitter, have proven challenging because they are subjected to 1000-1200°C and to stresses arising from thermal expansion mismatches. In this paper, we adopt the industrial process of diffusion brazing to fabricate an integrated microcombustor and photonic crystal by bonding stacked metal layers. Diffusion brazing is simpler and faster than previous approaches based on silicon MEMS and welded metal, and the end result is more robust.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elmore, Joshua R.; Furches, Anna; Wolff, Gara N.
Pseudomonas putida strains are highly robust bacteria known for their ability to efficiently utilize a variety of carbon sources, including aliphatic and aromatic hydrocarbons. Recently, P. putida has been engineered to valorize the lignin stream of a lignocellulosic biomass pretreatment process. Nonetheless, when compared to platform organisms such as Escherichia coli, the toolkit for engineering P. putida is underdeveloped. Heterologous gene expression in particular is problematic. Plasmid instability and copy number variance provide challenges for replicative plasmids, while use of homologous recombination for insertion of DNA into the chromosome is slow and laborious. Furthermore, heterologous expression efforts to date typically rely on overexpression of exogenous pathways using a handful of poorly characterized promoters. In order to improve the P. putida toolkit, we developed a rapid genome integration system using the site-specific recombinase from bacteriophage Bxb1 to enable rapid, high-efficiency integration of DNA into the P. putida chromosome. We also developed a library of synthetic promoters with various UP elements, -35 sequences, and -10 sequences, as well as different ribosomal binding sites. We tested these promoters using a fluorescent reporter gene, mNeonGreen, to characterize the strength of each promoter, and identified UP-element-promoter-ribosomal binding site combinations capable of driving a ~150-fold range of protein expression levels. One additional integrating vector was developed that confers more robust kanamycin resistance when integrated at single copy into the chromosome. These genome integration and reporter systems are extensible for testing other genetic parts, such as examining terminator strength, and will allow rapid integration of heterologous pathways for metabolic engineering.
Mining manufacturing data for discovery of high productivity process characteristics.
Charaniya, Salim; Le, Huong; Rangwala, Huzefa; Mills, Keri; Johnson, Kevin; Karypis, George; Hu, Wei-Shou
2010-06-01
Modern manufacturing facilities for bioproducts are highly automated, with advanced process monitoring and data archiving systems. The time dynamics of hundreds of process parameters and outcome variables over a large number of production runs are archived in the data warehouse. This vast amount of data is a vital resource for comprehending the complex characteristics of bioprocesses and enhancing production robustness. Cell culture process data from 108 'trains', comprising production as well as inoculum bioreactors from Genentech's manufacturing facility, were investigated. Each run comprises over one hundred on-line and off-line temporal parameters. A kernel-based approach combined with a maximum margin-based support vector regression algorithm was used to integrate all the process parameters and develop predictive models for a key cell culture performance parameter. The model was also used to identify and rank process parameters according to their relevance in predicting process outcome. Evaluation of cell culture stage-specific models indicates that production performance can be reliably predicted days prior to harvest. Strong associations between several temporal parameters at various manufacturing stages and final process outcome were uncovered. This model-based data mining represents an important step forward in establishing process data-driven knowledge discovery in bioprocesses. Implementation of this methodology on the manufacturing floor can facilitate real-time decision making and thereby improve the robustness of large scale bioprocesses. 2010 Elsevier B.V. All rights reserved.
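To illustrate the kernel-based regression idea in the abstract: the study used a maximum-margin support vector regression over kernelized process parameters; the stand-in below is a minimal RBF kernel ridge regressor (all names hypothetical, and ridge rather than SVR, chosen only to keep the sketch self-contained):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KernelRidge:
    """Minimal kernel regressor: a ridge stand-in for the paper's SVR."""
    def __init__(self, gamma=1.0, lam=1e-3):
        self.gamma, self.lam = gamma, lam
    def fit(self, X, y):
        self.X_ = X
        K = rbf_kernel(X, X, self.gamma)
        # Regularized least squares in the kernel feature space.
        self.alpha_ = np.linalg.solve(K + self.lam * np.eye(len(X)), y)
        return self
    def predict(self, X):
        return rbf_kernel(X, self.X_, self.gamma) @ self.alpha_
```

In the paper's setting, each row of `X` would collect the (kernelized) temporal process parameters of one run and `y` the final performance measure; ranking parameters by their contribution to prediction is then a separate analysis step.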
On robust parameter estimation in brain-computer interfacing
NASA Astrophysics Data System (ADS)
Samek, Wojciech; Nakajima, Shinichi; Kawanabe, Motoaki; Müller, Klaus-Robert
2017-12-01
Objective. The reliable estimation of parameters such as mean or covariance matrix from noisy and high-dimensional observations is a prerequisite for successful application of signal processing and machine learning algorithms in brain-computer interfacing (BCI). This challenging task becomes significantly more difficult if the data set contains outliers, e.g. due to subject movements, eye blinks or loose electrodes, as they may heavily bias the estimation and the subsequent statistical analysis. Although various robust estimators have been developed to tackle the outlier problem, they ignore important structural information in the data and thus may not be optimal. Typical structural elements in BCI data are the trials consisting of a few hundred EEG samples and indicating the start and end of a task. Approach. This work discusses the parameter estimation problem in BCI and introduces a novel hierarchical view on robustness which naturally comprises different types of outlierness occurring in structured data. Furthermore, the class of minimum divergence estimators is reviewed and a robust mean and covariance estimator for structured data is derived and evaluated with simulations and on a benchmark data set. Main results. The results show that state-of-the-art BCI algorithms benefit from robustly estimated parameters. Significance. Since parameter estimation is an integral part of various machine learning algorithms, the presented techniques are applicable to many problems beyond BCI.
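To see why robustness matters for the mean estimate discussed above: a handful of outlier trials (eye blinks, loose electrodes) can drag the sample mean arbitrarily far, while even a crude robust estimator stays near the bulk of the data. The sketch below uses a simple coordinate-wise winsorized mean purely as an illustration; it is not the minimum-divergence estimator proposed in the paper, and the function name is hypothetical.

```python
import numpy as np

def winsorized_mean(X, frac=0.1):
    """Clip each coordinate to its [frac, 1-frac] quantile range, then average.
    Gross outliers are pulled back to the quantile boundary before averaging."""
    lo, hi = np.quantile(X, [frac, 1 - frac], axis=0)
    return np.clip(X, lo, hi).mean(axis=0)
```

With 5% of samples shifted far away, the naive mean moves substantially while the winsorized mean barely does; structured (trial-wise) robust estimators as in the paper extend this idea to whole blocks of EEG samples.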
Baronsky-Probst, J; Möltgen, C-V; Kessler, W; Kessler, R W
2016-05-25
Hot melt extrusion (HME) is a well-known process within the plastic and food industries that has been utilized for the past several decades and is increasingly accepted by the pharmaceutical industry for continuous manufacturing. For tamper-resistant formulations of e.g. opioids, HME is the most efficient production technique. The focus of this study is thus to evaluate the manufacturability of the HME process for tamper-resistant formulations. Parameters such as the specific mechanical energy (SME), as well as the melt pressure and its standard deviation, are important and will be discussed in this study. In the first step, the existing process data are analyzed by means of multivariate data analysis. Key critical process parameters such as feed rate, screw speed, and the concentration of the API in the polymers are identified, and critical quality parameters of the tablet are defined. In the second step, a relationship between the critical material, product and process quality attributes are established by means of Design of Experiments (DoEs). The resulting SME and the temperature at the die are essential data points needed to indirectly qualify the degradation of the API, which should be minimal. NIR-spectroscopy is used to monitor the material during the extrusion process. In contrast to most applications in which the probe is directly integrated into the die, the optical sensor is integrated into the cooling line of the strands. This saves costs in the probe design and maintenance and increases the robustness of the chemometric models. Finally, a process measurement system is installed to monitor and control all of the critical attributes in real-time by means of first principles, DoE models, soft sensor models, and spectroscopic information. Overall, the process is very robust as long as the screw speed is kept low. Copyright © 2015 Elsevier B.V. All rights reserved.
Miyoshi, Newton Shydeo Brandão; Pinheiro, Daniel Guariz; Silva, Wilson Araújo; Felipe, Joaquim Cezar
2013-06-06
The use of the knowledge produced by the sciences to promote human health is the main goal of translational medicine. To make it feasible we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however it lacks support for representing clinical and socio-demographic information. We have implemented an extension of Chado - the Clinical Module - to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: data level, to store the data; semantic level, to integrate and standardize the data by the use of ontologies; application level, to manage clinical databases, ontologies and the data integration process; and web interface level, to allow interaction between the user and the system. The clinical module was built based on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a factual clinical research database. Clinical and demographic data as well as biomaterial data were obtained from patients with tumors of the head and neck.
We implemented the IPTrans tool, a complete environment for data migration, which comprises: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as to other applications. Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different "omics" technologies with patients' clinical and socio-demographic data. This framework should present several features: flexibility, compression and robustness. The experiments performed on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
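The Entity-Attribute-Value layout on which the Clinical Module is built can be sketched minimally. The table names, attribute names, and ontology-term placeholders below are hypothetical, not the actual Chado schema:

```python
import sqlite3

# Minimal Entity-Attribute-Value (EAV) sketch of the kind the Clinical Module
# is built on. Table, attribute, and ontology-term names are hypothetical
# placeholders, not the actual Chado schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE attribute (id INTEGER PRIMARY KEY,"
             " name TEXT, ontology_term TEXT)")
conn.execute("CREATE TABLE eav (patient_id INTEGER,"
             " attribute_id INTEGER, value TEXT)")

# Attributes are defined once and linked to ontology terms for standardization.
conn.execute("INSERT INTO attribute VALUES (1, 'tumor_site', 'ONT:0000001')")
conn.execute("INSERT INTO attribute VALUES (2, 'age_at_diagnosis', 'ONT:0000002')")

# Each clinical fact is one row, so new attributes need no schema change.
conn.execute("INSERT INTO eav VALUES (101, 1, 'larynx')")
conn.execute("INSERT INTO eav VALUES (101, 2, '58')")

rows = conn.execute(
    "SELECT a.name, e.value FROM eav e"
    " JOIN attribute a ON a.id = e.attribute_id"
    " WHERE e.patient_id = 101 ORDER BY a.id").fetchall()
# rows == [('tumor_site', 'larynx'), ('age_at_diagnosis', '58')]
```

The design trade-off EAV makes is visible here: adding a new clinical variable is a data insert rather than a schema migration, at the cost of more complex queries.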
Single photon emission from plasma treated 2D hexagonal boron nitride.
Xu, Zai-Quan; Elbadawi, Christopher; Tran, Toan Trong; Kianinia, Mehran; Li, Xiuling; Liu, Daobin; Hoffman, Timothy B; Nguyen, Minh; Kim, Sejeong; Edgar, James H; Wu, Xiaojun; Song, Li; Ali, Sajid; Ford, Mike; Toth, Milos; Aharonovich, Igor
2018-05-03
Artificial atomic systems in solids are becoming increasingly important building blocks in quantum information processing and scalable quantum nanophotonic networks. Amongst numerous candidates, 2D hexagonal boron nitride has recently emerged as a promising platform hosting single photon emitters. Here, we report a number of robust plasma and thermal annealing methods for fabrication of emitters in tape-exfoliated hexagonal boron nitride (hBN) crystals. A two-step process comprising Ar plasma etching and subsequent annealing in Ar is highly robust, and yields an eight-fold increase in the concentration of emitters in hBN. The initial plasma-etching step generates emitters that suffer from blinking and bleaching, whereas the two-step process yields emitters that are photostable at room temperature with emission wavelengths greater than ∼700 nm. Density functional theory modeling suggests that the emitters might be associated with defect complexes that contain oxygen. This is further confirmed by generating the emitters via annealing hBN in air. Our findings advance the present understanding of the structure of quantum emitters in hBN and enhance the nanofabrication toolkit needed to realize integrated quantum nanophotonic circuits.
Developments in Nano-Satellite Structural Subsystem Design at NASA-GSFC
NASA Technical Reports Server (NTRS)
Rossoni, Peter; Panetta, Peter V.
1999-01-01
The NASA-GSFC Nano-satellite Technology Development Program will enable flying constellations of tens to hundreds of nano-satellites for future NASA Space and Earth Science missions. Advanced technology components must be developed to make these future spacecraft compact, lightweight, low-power, low-cost, and survivable to a radiation environment over a two-year mission lifetime. This paper describes the efforts underway to develop lightweight, low cost, and multi-functional structures, serviceable designs, and robust mechanisms. As designs shrink, the integration of various subsystems becomes a vital necessity. This paper also addresses structurally integrated electrical power, attitude control, and thermal systems. These innovations bring associated fabrication, integration, and test challenges. Candidate structural materials and processes are examined and the merits of each are discussed. Design and fabrication processes include flat stock composite construction, cast aluminum-beryllium alloy, and an injection molded fiber-reinforced plastic. A viable constellation deployment scenario is described as well as a Phase-A Nano-satellite Pathfinder study.
Using fuzzy logic to integrate neural networks and knowledge-based systems
NASA Technical Reports Server (NTRS)
Yen, John
1991-01-01
Outlined here is a novel hybrid architecture that uses fuzzy logic to integrate neural networks and knowledge-based systems. The author's approach offers important synergistic benefits to neural nets, approximate reasoning, and symbolic processing. Fuzzy inference rules extend symbolic systems with approximate reasoning capabilities, which are used for integrating and interpreting the outputs of neural networks. The symbolic system captures meta-level information about neural networks and defines its interaction with neural networks through a set of control tasks. Fuzzy action rules provide a robust mechanism for recognizing the situations in which neural networks require certain control actions. The neural nets, on the other hand, offer flexible classification and adaptive learning capabilities, which are crucial for dynamic and noisy environments. By combining neural nets and symbolic systems at their system levels through the use of fuzzy logic, the author's approach alleviates current difficulties in reconciling differences between low-level data processing mechanisms of neural nets and artificial intelligence systems.
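The idea of fuzzy action rules interpreting neural-network outputs can be sketched as follows. The membership functions, the confidence input, and the two control actions are invented illustrations, not the author's actual rule base:

```python
# Toy sketch of fuzzy action rules interpreting a neural network's output.
# The membership functions, thresholds, and the two control actions are
# invented illustrations, not the author's actual rule base.

def mu_low(x):
    """Membership of 'low confidence' over [0, 1]."""
    return max(0.0, min(1.0, (0.5 - x) / 0.5))

def mu_high(x):
    """Membership of 'high confidence' over [0, 1]."""
    return max(0.0, min(1.0, (x - 0.5) / 0.5))

def interpret(nn_confidence):
    """Fuzzy action rules: IF confidence is low THEN retrain the net;
    IF confidence is high THEN accept its classification.
    Returns the firing strength of each control action."""
    return {"retrain": round(mu_low(nn_confidence), 3),
            "accept": round(mu_high(nn_confidence), 3)}

actions = interpret(0.9)  # {'retrain': 0.0, 'accept': 0.8}
```

Graded firing strengths, rather than a hard threshold, are what let the symbolic layer respond smoothly to borderline network outputs.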
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physically based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contributions of domain users. Analysis of the hydrologic domain is accomplished using Formal Concept Analysis (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip methods, were used to demonstrate the implementation of information in the HP ontology.
Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
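The formal concept analysis step used to organize the domain can be sketched with a toy context. The attribute labels below are invented illustrations, not the study's actual context; only the two infiltration methods come from the text:

```python
from itertools import combinations

# Toy Formal Concept Analysis (FCA) sketch of the kind used to analyze the
# hydrologic domain. The attribute labels are invented illustrations; only
# the two infiltration methods are taken from the text.
context = {
    "Green-Ampt": {"infiltration", "physically-based"},
    "Philip":     {"infiltration", "analytical"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return {o for o, a in context.items() if attrs <= a}

def intent(objs):
    """Attributes shared by every object in objs."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

# Brute-force enumeration: a formal concept is a closed (objects, attributes)
# pair, i.e., a fixed point of the extent/intent maps.
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        objs = extent(set(attrs))
        concepts.add((frozenset(objs), frozenset(intent(objs))))

# Both methods share "infiltration", which forms the top concept of the lattice.
top = (frozenset(context), frozenset({"infiltration"}))
```

The resulting concept lattice is what an ontology engineer reads off to decide which shared attributes become superclasses (here, a general "infiltration process" above both methods).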
1999-01-01
Study (the Restudy), assessing the hydrologic and ecological results of the Restudy modifications through pre- and postmodification monitoring and...of the south Florida ecosystem and its response to restoration activities. • Model Development—Robust models of ecological processes and the... ecological interactions are all underway. • Data Synthesis and Information Dissemination—Topical syntheses will analyze, summarize, and integrate
Bio-Inspired Microsystem for Robust Genetic Assay Recognition
Lue, Jaw-Chyng; Fang, Wai-Chi
2008-01-01
A compact integrated system-on-chip (SoC) architecture solution for robust, real-time, and on-site genetic analysis has been proposed. This microsystem solution is noise-tolerant and suitable for analyzing the weak fluorescence patterns from a PCR-prepared, dual-labeled DNA microchip assay. In the architecture, a preceding VLSI differential logarithm microchip is designed to effectively compute the logarithm of the normalized input fluorescence signals. A subsequent VLSI artificial neural network (ANN) processor chip is used for analyzing the processed signals from the differential logarithm stage. A single-channel logarithmic circuit was fabricated and characterized. A prototype ANN chip with an unsupervised winner-take-all (WTA) function was designed, fabricated, and tested. An ANN learning algorithm using a novel sigmoid-logarithmic transfer function based on the supervised backpropagation (BP) algorithm is proposed for robustly recognizing low-intensity patterns. Our results show that an ANN trained with the new transfer function can recognize low-fluorescence patterns better than an ANN using the conventional sigmoid function. PMID:18566679
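One way to read "sigmoid-logarithmic transfer function" is a sigmoid applied to a log-compressed input. The exact functional form used on the chip is not given in the abstract, so the variant below is an assumption; it only illustrates why log compression helps separate weak signals:

```python
import math

# Sketch of a sigmoid-logarithmic transfer function: a sigmoid applied to a
# log-compressed input. The exact form used on the chip is not given in the
# abstract, so this variant is an assumption for illustration only.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_log(intensity, gain=1.0, eps=1e-6):
    """Log compression expands resolution at low intensities before the
    sigmoid, so weak signals are separated better than by a plain sigmoid."""
    return sigmoid(gain * math.log(intensity + eps))

# Two weak fluorescence levels that a plain sigmoid maps almost identically:
a, b = 0.01, 0.02
plain_gap = abs(sigmoid(a) - sigmoid(b))        # small separation
log_gap = abs(sigmoid_log(a) - sigmoid_log(b))  # clearly larger separation
```

This mirrors the abstract's claim at the algorithm level: expanding the low-intensity end of the input range gives the classifier more to work with on weak patterns.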
Narasimhan, S; Chiel, H J; Bhunia, S
2011-04-01
Implantable microsystems for monitoring or manipulating brain activity typically require on-chip real-time processing of multichannel neural data using ultra low-power, miniaturized electronics. In this paper, we propose an integrated-circuit/architecture-level hardware design framework for neural signal processing that exploits the nature of the signal-processing algorithm. First, we consider different power reduction techniques and compare the energy efficiency between the ultra-low frequency subthreshold and conventional superthreshold design. We show that the superthreshold design operating at a much higher frequency can achieve comparable energy dissipation by taking advantage of extensive power gating. It also provides significantly higher robustness of operation and yield under large process variations. Next, we propose an architecture level preferential design approach for further energy reduction by isolating the critical computation blocks (with respect to the quality of the output signal) and assigning them higher delay margins compared to the noncritical ones. Possible delay failures under parameter variations are confined to the noncritical components, allowing graceful degradation in quality under voltage scaling. Simulation results using prerecorded neural data from the sea-slug (Aplysia californica) show that the application of the proposed design approach can lead to significant improvement in total energy, without compromising the output signal quality under process variations, compared to conventional design approaches.
Multiple methods integration for structural mechanics analysis and design
NASA Technical Reports Server (NTRS)
Housner, J. M.; Aminpour, M. A.
1991-01-01
A new research area of multiple methods integration is proposed for joining diverse methods of structural mechanics analysis which interact with one another. Three categories of multiple methods are defined: those in which a physical interface is well-defined; those in which a physical interface is not well-defined but must be selected; and those in which the interface is a mathematical transformation. Two fundamental integration procedures are presented that can be extended to integrate various methods (e.g., finite elements, Rayleigh-Ritz, Galerkin, and integral methods) with one another. Since the finite element method will likely be the major method to be integrated, its enhanced robustness under element distortion is also examined and a new robust shell element is demonstrated.
Space Situational Awareness Data Processing Scalability Utilizing Google Cloud Services
NASA Astrophysics Data System (ADS)
Greenly, D.; Duncan, M.; Wysack, J.; Flores, F.
Space Situational Awareness (SSA) is a fundamental and critical component of current space operations. The term SSA encompasses the awareness, understanding and predictability of all objects in space. As the population of orbital space objects and debris increases, the number of collision avoidance maneuvers grows and prompts the need for accurate and timely process measures. The SSA mission continually evolves toward near-real-time assessment and analysis, demanding higher processing capabilities. By conventional methods, meeting these demands requires the integration of new hardware to keep pace with the growing complexity of maneuver planning algorithms. SpaceNav has implemented a highly scalable architecture that will track satellites and debris by utilizing powerful virtual machines on the Google Cloud Platform. SpaceNav algorithms for processing conjunction data messages (CDMs) outpace conventional means. A robust processing environment for tracking data, collision avoidance maneuvers and various other aspects of SSA can be created and deleted on demand. The migration of SpaceNav tools and algorithms into the Google Cloud Platform will be discussed, along with the trials and tribulations involved. Information will be shared on how and why certain cloud products were used, as well as the integration techniques that were implemented. Key items to be presented are:
1. Scientific algorithms and SpaceNav tools integrated into a scalable architecture
   a) Maneuver Planning
   b) Parallel Processing
   c) Monte Carlo Simulations
   d) Optimization Algorithms
   e) SW Application Development/Integration into the Google Cloud Platform
2. Compute Engine Processing
   a) Application Engine Automated Processing
   b) Performance Testing and Performance Scalability
   c) Cloud MySQL Databases and Database Scalability
   d) Cloud Data Storage
   e) Redundancy and Availability
Integrating human stem cell expansion and neuronal differentiation in bioreactors
Serra, Margarida; Brito, Catarina; Costa, Eunice M; Sousa, Marcos FQ; Alves, Paula M
2009-01-01
Background Human stem cells are cellular resources with outstanding potential for cell therapy. However, for the fulfillment of this application, major challenges remain to be met. Of paramount importance is the development of robust systems for in vitro stem cell expansion and differentiation. In this work, we successfully developed an efficient scalable bioprocess for the fast production of human neurons. Results The expansion of undifferentiated human embryonal carcinoma stem cells (NTera2/cl.D1 cell line) as 3D aggregates was first optimized in a spinner vessel. The media-exchange operation mode with an inoculum concentration of 4 × 10^5 cells/mL was the most efficient strategy tested, with a 4.6-fold increase in cell concentration achieved in 5 days. These results were validated in a bioreactor, where a similar profile and metabolic performance were obtained. Furthermore, characterization of the expanded population by immunofluorescence microscopy and flow cytometry showed that NT2 cells maintained their stem cell characteristics over the bioreactor culture time. Finally, the neuronal differentiation step was integrated into the bioreactor process by addition of retinoic acid when cells were in the middle of the exponential phase. Neurosphere composition was monitored and neuronal differentiation efficiency evaluated along the culture time. The results show that, for bioreactor cultures, we were able to increase the neuronal differentiation efficiency significantly, by 10-fold, while drastically reducing, by 30%, the time required for the differentiation process. Conclusion The culture systems developed herein are robust and represent one step forward towards the development of integrated bioprocesses, bridging stem cell expansion and differentiation in fully controlled bioreactors. PMID:19772662
NASA Astrophysics Data System (ADS)
Zhu, Xiaoyuan; Zhang, Hui; Fang, Zongde
2015-12-01
This paper presents a robust speed synchronization controller design for an integrated motor-transmission powertrain system in which the driving motor and multi-gearbox are directly coupled. As the controller area network (CAN) is commonly used in vehicle powertrain systems, the possible network-induced random delays in both the feedback and forward channels are considered and modeled by using two Markov chains in the controller design process. From an application perspective, the control law adopted here is a generalized proportional-integral (PI) control. By employing the system-augmentation technique, a delay-free stochastic closed-loop system is obtained and the generalized PI controller design problem is converted to a static output feedback (SOF) controller design problem. Since there are external disturbances involved in the closed-loop system, the energy-to-peak performance is considered to guarantee the robustness of the controller, and the controlled output is chosen as the speed synchronization error. To further improve the transient response of the closed-loop system, pole placement is also employed in the energy-to-peak-performance-based speed synchronization control. The mode-dependent control gains are obtained by using an iterative linear matrix inequality (LMI) algorithm. Simulation results show the effectiveness of the proposed control approach.
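The generalized PI control law acting on the speed synchronization error can be sketched on a toy first-order motor model. The plant, gains, and horizon below are invented; the paper's actual mode-dependent gains come from the LMI design and also handle the CAN-induced random delays, which this sketch omits:

```python
# Toy discrete-time sketch of a generalized PI control law driving the speed
# synchronization error to zero on a first-order motor model. The plant,
# gains, and horizon are invented; the paper's mode-dependent gains come from
# an LMI design and handle CAN-induced random delays, omitted here.

def simulate(kp=2.0, ki=4.0, target=100.0, dt=0.001, steps=5000):
    speed, integ = 0.0, 0.0
    for _ in range(steps):
        err = target - speed           # speed synchronization error
        integ += err * dt
        u = kp * err + ki * integ      # generalized PI control law
        speed += dt * (-speed + u)     # toy motor: dx/dt = -x + u
    return speed

final_speed = simulate()               # settles close to the target speed
```

The integral term is what removes the steady-state synchronization error; a pure proportional law would settle with a constant offset on this plant.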
AEGIS: a robust and scalable real-time public health surveillance system.
Reis, Ben Y; Kirby, Chaim; Hadden, Lucy E; Olson, Karen; McMurry, Andrew J; Daniel, James B; Mandl, Kenneth D
2007-01-01
In this report, we describe the Automated Epidemiological Geotemporal Integrated Surveillance system (AEGIS), developed for real-time population health monitoring in the state of Massachusetts. AEGIS provides public health personnel with automated near-real-time situational awareness of utilization patterns at participating healthcare institutions, supporting surveillance of bioterrorism and naturally occurring outbreaks. As real-time public health surveillance systems become integrated into regional and national surveillance initiatives, the challenges of scalability, robustness, and data security become increasingly prominent. A modular and fault tolerant design helps AEGIS achieve scalability and robustness, while a distributed storage model with local autonomy helps to minimize risk of unauthorized disclosure. The report includes a description of the evolution of the design over time in response to the challenges of a regional and national integration environment.
Zheng, Weijia; Pi, Youguo
2016-07-01
A tuning method for the fractional-order proportional-integral speed controller of a permanent magnet synchronous motor is proposed in this paper. Taking a combination of the integral of time-weighted absolute error (ITAE) and the phase margin as the optimization index, and the robustness specification as the constraint condition, the differential evolution algorithm is applied to search for the optimal controller parameters. The dynamic response performance and robustness of the obtained optimal controller are verified by motor speed-tracking experiments on a motor speed control platform. Experimental results show that the proposed tuning method enables the obtained control system to achieve both optimal dynamic response performance and robustness to gain variations. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
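A differential evolution search of the kind used for parameter tuning can be sketched as follows. The cost function here is a stand-in quadratic, not the paper's ITAE/phase-margin index, and the DE constants are conventional illustrative choices:

```python
import random

# Toy differential evolution (DE/rand/1/bin-style) search of the kind used to
# tune controller parameters. The cost is a stand-in quadratic, not the
# paper's ITAE/phase-margin index; DE constants are conventional choices.

def de_minimize(cost, bounds, pop_size=20, f=0.6, cr=0.9, gens=150, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutate with three distinct donors, then crossover per dimension.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [min(max(a[d] + f * (b[d] - c[d]), bounds[d][0]), bounds[d][1])
                     if rng.random() < cr else pop[i][d]
                     for d in range(dim)]
            if cost(trial) < cost(pop[i]):   # greedy selection
                pop[i] = trial
    return min(pop, key=cost)

# Stand-in index with its optimum at (kp, ki) = (3.0, 1.5):
cost = lambda p: (p[0] - 3.0) ** 2 + (p[1] - 1.5) ** 2
best = de_minimize(cost, [(0.0, 10.0), (0.0, 10.0)])
```

In the paper's setting, `cost` would evaluate the ITAE/phase-margin index of a candidate (Kp, Ki, fractional order) triple, with the robustness specification enforced as a constraint.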
A Tightly-Coupled GPS/INS/UWB Cooperative Positioning Sensors System Supported by V2I Communication
Wang, Jian; Gao, Yang; Li, Zengke; Meng, Xiaolin; Hancock, Craig M.
2016-01-01
This paper investigates a tightly-coupled Global Positioning System (GPS)/Ultra-Wideband (UWB)/Inertial Navigation System (INS) cooperative positioning scheme using a Robust Kalman Filter (RKF) supported by V2I communication. The scheme proposes a method that uses range measurements from UWB units transmitted among the terminals as augmentation inputs to the observations. The UWB range inputs are used to reform the GPS observation equations, which consist of pseudo-range and Doppler measurements, and the updated observation equation is processed in a tightly-coupled GPS/UWB/INS integrated positioning equation using an adaptive Robust Kalman Filter. The results of the trial conducted on the roof of the Nottingham Geospatial Institute (NGI) at the University of Nottingham show that the integrated solution provides better accuracy and improves the availability of the system in GPS-denied environments. The RKF can eliminate the effects of gross errors. Additionally, the internal and external reliabilities of the system are enhanced when the UWB observables received from the moving terminals are involved in the positioning algorithm. PMID:27355947
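The gross-error handling of a robust Kalman filter can be illustrated in one dimension: when the normalized innovation is implausibly large, the measurement covariance is inflated so the outlier is down-weighted. The Huber-style threshold and noise values below are illustrative assumptions, not the paper's algorithm:

```python
# One-dimensional sketch of a robust Kalman filter measurement update: when
# the normalized innovation is implausibly large, the measurement covariance
# is inflated so the gross error is down-weighted. The Huber-style threshold
# and the noise values are illustrative assumptions, not the paper's algorithm.

def rkf_update(x, p, z, r, k0=1.345):
    innov = z - x                      # innovation (measurement residual)
    s = p + r                          # innovation covariance
    t = abs(innov) / s ** 0.5          # normalized innovation
    if t > k0:                         # suspected gross error:
        r = r * (t / k0)               #   inflate R to shrink the gain
        s = p + r
    k = p / s                          # Kalman gain
    return x + k * innov, (1.0 - k) * p

x0, p0 = 0.0, 1.0
x1, p1 = rkf_update(x0, p0, z=0.2, r=1.0)    # normal range measurement
x2, p2 = rkf_update(x0, p0, z=50.0, r=1.0)   # gross error, heavily down-weighted
```

A plain Kalman update would pull the state halfway to the outlier (to 25.0 here); the robust variant moves it only slightly and keeps the covariance large, which is the behavior that protects the GPS/UWB/INS fusion from corrupted ranges.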
Neural circuits in auditory and audiovisual memory.
Plakke, B; Romanski, L M
2016-06-01
Working memory is the ability to employ recently seen or heard stimuli and apply them to a changing cognitive context. Although much is known about language processing and visual working memory, the neurobiological basis of auditory working memory is less clear. Historically, part of the problem has been the difficulty in obtaining a robust animal model to study auditory short-term memory. In recent years there have been neurophysiological and lesion studies indicating a cortical network involving both temporal and frontal cortices. Studies specifically targeting the role of the prefrontal cortex (PFC) in auditory working memory have suggested that dorsal and ventral prefrontal regions perform different roles during the processing of auditory mnemonic information, with the dorsolateral PFC performing similar functions for both auditory and visual working memory. In contrast, the ventrolateral PFC (VLPFC), which contains cells that respond robustly to auditory stimuli and that process both face and vocal stimuli, may be an essential locus for both auditory and audiovisual working memory. These findings suggest a critical role for the VLPFC in the processing, integration, and retention of communication information. This article is part of a Special Issue entitled SI: Auditory working memory. Copyright © 2015 Elsevier B.V. All rights reserved.
Tape transfer printing of a liquid metal alloy for stretchable RF electronics.
Jeong, Seung Hee; Hjort, Klas; Wu, Zhigang
2014-09-03
Liquid-alloy-based microfluidic stretchable electronics makes it possible to fabricate conductors with large cross sections for low-impedance radio frequency (RF) electronics while still retaining high stretchability. It offers stretchable electronic systems the unique opportunity to combine various sensors on our bodies or organs with high-quality wireless communication with the external world (devices/systems), without sacrificing user comfort. This microfluidic approach, based on printed circuit board technology, allows large-area processing of large-cross-section conductors and robust contacts, which can handle substantial stretching between the embedded rigid active components and the surrounding system. Although it provides such benefits, further development is needed to realize its potential as a high-throughput, cost-effective process technology. In this paper, tape transfer printing is proposed to supply a rapid prototyping batch process at low cost, albeit at a low resolution of 150 μm. In particular, isolated patterns can be obtained in a simple one-step process. Finally, a stretchable radio frequency identification (RFID) tag is demonstrated. The measured results show the robustness of the hybrid integrated system when the tag is stretched at 50% for 3000 cycles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gottwald, M.; Kan, J. J.; Lee, K.
Thermal budget, stack thickness, and dipolar offset field control are crucial for seamless integration of perpendicular magnetic tunnel junctions (pMTJ) into semiconductor integrated circuits to build scalable spin-transfer-torque magnetoresistive random access memory. This paper is concerned with materials and process tuning to deliver thermally robust (400 °C, 30 min) and thin (i.e., fewer layers and integration-friendly) pMTJs utilizing Co/Pt-based bottom pinned layers. Interlayer roughness control is identified as a key enabler for achieving high thermal budgets. The dipolar offset fields of the developed film stacks at scaled dimensions are evaluated by micromagnetic simulations. This paper shows a path towards achieving sub-15 nm-thick pMTJs with tunneling magnetoresistance ratios higher than 150% after 30 min of thermal excursion at 400 °C.
Curating and Integrating Data from Multiple Sources to Support Healthcare Analytics.
Ng, Kenney; Kakkanatt, Chris; Benigno, Michael; Thompson, Clay; Jackson, Margaret; Cahan, Amos; Zhu, Xinxin; Zhang, Ping; Huang, Paul
2015-01-01
As the volume and variety of healthcare related data continues to grow, the analysis and use of this data will increasingly depend on the ability to appropriately collect, curate and integrate disparate data from many different sources. We describe our approach to and highlight our experiences with the development of a robust data collection, curation and integration infrastructure that supports healthcare analytics. This system has been successfully applied to the processing of a variety of data types including clinical data from electronic health records and observational studies, genomic data, microbiomic data, self-reported data from surveys and self-tracked data from wearable devices from over 600 subjects. The curated data is currently being used to support healthcare analytic applications such as data visualization, patient stratification and predictive modeling.
GenInfoGuard--a robust and distortion-free watermarking technique for genetic data.
Iftikhar, Saman; Khan, Sharifullah; Anwar, Zahid; Kamran, Muhammad
2015-01-01
Genetic data, in digital format, is used in different biological phenomena such as DNA translation, mRNA transcription and protein synthesis. The accuracy of these biological phenomena depends on genetic codes and all subsequent processes. To computerize the biological procedures, different domain experts are provided with authorized access to the genetic codes; as a consequence, the ownership protection of such data is inevitable. For this purpose, watermarks serve as proof of ownership of data. While protecting data, embedded hidden messages (watermarks) influence the genetic data; therefore, the accurate execution of the relevant processes and the overall result become questionable. Most DNA-based watermarking techniques modify the genetic data and are therefore vulnerable to information loss. Distortion-free techniques make sure that no modifications occur during watermarking; however, they are fragile to malicious attacks and therefore cannot be used for ownership protection (particularly in the presence of a threat model). Therefore, there is a need for a technique that is robust and also prevents unwanted modifications. In this spirit, a watermarking technique with the aforementioned characteristics is proposed in this paper. The proposed technique makes sure that: (i) the ownership rights are protected by means of a robust watermark; and (ii) the integrity of the genetic data is preserved. The proposed technique, GenInfoGuard, ensures its robustness through "watermark encoding" in permuted values, and exhibits high decoding accuracy against various malicious attacks.
Women process multisensory emotion expressions more efficiently than men.
Collignon, O; Girard, S; Gosselin, F; Saint-Amour, D; Lepore, F; Lassonde, M
2010-01-01
Despite claims in the popular press, experiments investigating whether female observers are more efficient than male observers at processing expressions of emotion have produced inconsistent findings. In the present study, participants were asked to categorize fear and disgust expressions displayed auditorily, visually, or audio-visually. Results revealed an advantage for women in all conditions of stimulus presentation. We also observed more nonlinear probabilistic summation in the bimodal conditions in female than in male observers, indicating greater neural integration of different sensory-emotional information. These findings indicate robust differences between genders in the multisensory perception of emotion expression.
2011-01-01
Background Integration of compatible or incompatible emotional valence and semantic information is an essential aspect of complex social interactions. A modified version of the Implicit Association Test (IAT), called the Dual Valence Association Task (DVAT), was designed in order to measure conflict resolution processing arising from the compatibility/incompatibility of semantic and facial valence. The DVAT involves two emotional valence evaluative tasks which elicit two forms of emotional compatible/incompatible associations (facial and semantic). Methods Behavioural measures and event-related potentials were recorded while participants performed the DVAT. Results Behavioural data showed a robust effect that distinguished compatible/incompatible tasks. The effects of valence and contextual association (between facial and semantic stimuli) showed early discrimination in the N170 of faces. The LPP component was modulated by the compatibility of the DVAT. Conclusions Results suggest that the DVAT is a robust paradigm for studying the emotional interference effect in the processing of simultaneous information from semantic and facial stimuli. PMID:21489277
Demonstration of a robust magnonic spin wave interferometer.
Kanazawa, Naoki; Goto, Taichi; Sekiguchi, Koji; Granovsky, Alexander B; Ross, Caroline A; Takagi, Hiroyuki; Nakamura, Yuichi; Inoue, Mitsuteru
2016-07-22
Magnonics is an emerging field dealing with ultralow power consumption logic circuits, in which the flow of spin waves, rather than electric charges, transmits and processes information. Waves, including spin waves, excel at encoding information via their phase using interference. This enables a number of inputs to be processed in one device, which offers the promise of multi-input multi-output logic gates. To realize such an integrated device, it is essential to demonstrate spin wave interferometers using spatially isotropic spin waves with high operational stability. However, spin wave reflection at the waveguide edge has previously limited the stability of interfering waves, precluding the use of isotropic spin waves, i.e., forward volume waves. Here, a spin wave absorber is demonstrated comprising a yttrium iron garnet waveguide partially covered by gold. This device is shown experimentally to be a robust spin wave interferometer using the forward volume mode, with a large ON/OFF isolation value of 13.7 dB even in magnetic fields over 30 Oe.
Physical constraints on biological integral control design for homeostasis and sensory adaptation.
Ang, Jordan; McMillen, David R
2013-01-22
Synthetic biology includes an effort to use design-based approaches to create novel controllers, biological systems aimed at regulating the output of other biological processes. The design of such controllers can be guided by results from control theory, including the strategy of integral feedback control, which is central to regulation, sensory adaptation, and long-term robustness. Realization of integral control in a synthetic network is an attractive prospect, but the nature of biochemical networks can make the implementation of even basic control structures challenging. Here we present a study of the general challenges and important constraints that will arise in efforts to engineer biological integral feedback controllers or to analyze existing natural systems. Constraints arise from the need to identify target output values that the combined process-plus-controller system can reach, and to ensure that the controller implements a good approximation of integral feedback control. These constraints depend on mild assumptions about the shape of input-output relationships in the biological components, and thus will apply to a variety of biochemical systems. We summarize our results as a set of variable constraints intended to provide guidance for the design or analysis of a working biological integral feedback controller. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.
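A minimal numerical sketch can make the integral feedback idea above concrete. The one-state plant, gain, and disturbance below are invented for illustration; the point is only that integrating the error drives the output back to its target despite a constant disturbance, the hallmark of robust homeostasis:

```python
# Hypothetical one-state plant with integral feedback control.
# The controller state z accumulates the error (y - y_set); at any steady
# state, dz/dt = 0 forces y == y_set, regardless of the constant disturbance d.

def simulate(y_set=1.0, d=0.5, k_i=2.0, dt=0.001, steps=100_000):
    y, z = 0.0, 0.0
    for _ in range(steps):
        u = -k_i * z            # actuation derived from the integrated error
        y += (u + d - y) * dt   # first-order plant dynamics plus disturbance
        z += (y - y_set) * dt   # integral action
    return y
```

Running `simulate()` with any constant `d` settles near `y_set`, illustrating the perfect adaptation integral feedback provides; the constraints discussed in the paper concern when a biochemical network can actually realize such an integrator.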
An integrative approach for measuring semantic similarities using gene ontology.
Peng, Jiajie; Li, Hongxiang; Jiang, Qinghua; Wang, Yadong; Chen, Jin
2014-01-01
Gene Ontology (GO) provides rich information and a convenient way to study gene functional similarity, which has been successfully used in various applications. However, the existing GO-based similarity measures have limited utility, since each considers only a subset of the information in GO. An appropriate integration of the existing measures that takes more of the information in GO into account is therefore needed. We propose a novel integrative measure called InteGO2 to automatically select appropriate seed measures and then integrate them using a metaheuristic search method. The experimental results show that InteGO2 significantly improves the performance of gene similarity measurement in human, Arabidopsis, and yeast on both the molecular function and biological process GO categories. InteGO2 computes gene-to-gene similarities more accurately than the tested existing measures and has high robustness. The supplementary document and software are available at http://mlg.hit.edu.cn:8082/.
Qiu, Shi; Yang, Wen-Zhi; Yao, Chang-Liang; Qiu, Zhi-Dong; Shi, Xiao-Jian; Zhang, Jing-Xian; Hou, Jin-Jun; Wang, Qiu-Rong; Wu, Wan-Ying; Guo, De-An
2016-07-01
A key step in the authentication of herbal medicines is the establishment of robust biomarkers that embody the intrinsic metabolite differences independent of the growing environment or processing techniques. We present a strategy combining nontargeted metabolomics and "commercial-homophyletic" comparison-induced biomarker verification with new bioinformatic vehicles to improve the efficiency and reliability of herbal medicine authentication. The chemical differentiation of five different parts (root, leaf, flower bud, berry, and seed) of Panax ginseng is illustrated as a case study. First, an optimized ultra-performance liquid chromatography/quadrupole time-of-flight-MS(E) (UPLC/QTOF-MS(E)) approach was established for global metabolite profiling. Second, UNIFI™ combined with a search of an in-house library was employed to automatically characterize the metabolites. Third, pattern-recognition multivariate statistical analyses of the MS(E) data of different parts of commercial and homophyletic samples were performed separately to explore potential biomarkers. Fourth, potential biomarkers deduced from commercial and homophyletic root and leaf samples were cross-compared to infer robust biomarkers. Fifth, discriminating models by artificial neural network (ANN) were established to identify different parts of P. ginseng. Consequently, 164 compounds were characterized, and 11 robust biomarkers enabling the differentiation among root, leaf, flower bud, and berry were discovered by removing those that were structurally unstable or possibly processing-related. The ANN models using the robust biomarkers managed to discriminate exactly the four different parts, as well as root adulterated with leaf. In conclusion, biomarker verification using homophyletic samples contributes to the discovery of robust biomarkers. The integrated strategy facilitates authentication of herbal medicines in a more efficient and more intelligent manner. Copyright © 2016 Elsevier B.V. All rights reserved.
Integrated Process Modeling-A Process Validation Life Cycle Companion.
Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph
2017-10-17
During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
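As a rough illustration of the Monte Carlo idea described above, the sketch below stacks two invented unit operations, samples scatter in their process parameters, and estimates the probability of an out-of-specification result. The models, parameter distributions, and specification limit are placeholders, not the IPM from the study:

```python
import random

def unit_op_1(titer, temp):
    # hypothetical upstream step: yield degrades away from 37 degC
    return titer * (1.0 - 0.02 * abs(temp - 37.0))

def unit_op_2(amount, ph):
    # hypothetical purification step: recovery degrades away from pH 7
    return amount * (0.9 - 0.05 * abs(ph - 7.0))

def oos_probability(n=20_000, spec_min=8.9, seed=1):
    rng = random.Random(seed)
    oos = 0
    for _ in range(n):
        temp = rng.gauss(37.0, 0.5)   # process-parameter scatter
        ph = rng.gauss(7.0, 0.2)
        cqa = unit_op_2(unit_op_1(10.0, temp), ph)  # CQA after both steps
        if cqa < spec_min:
            oos += 1
    return oos / n
```

Tightening the parameter scatter or relaxing the specification limit changes the estimated OOS rate, which is exactly the kind of question an IPM answers across many stacked unit operations.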
Cunha, Bárbara; Aguiar, Tiago; Silva, Marta M; Silva, Ricardo J S; Sousa, Marcos F Q; Pineda, Earl; Peixoto, Cristina; Carrondo, Manuel J T; Serra, Margarida; Alves, Paula M
2015-11-10
The integration of up- and downstream unit operations can eliminate hold steps, decreasing the footprint, and ultimately can create robust closed-system operations. This type of design is desirable for the bioprocessing of human mesenchymal stem cells (hMSC), where high numbers of pure cells, at low volumes, need to be delivered for therapy applications. This study reports a proof of concept of the integration of a continuous perfusion culture in bioreactors with a tangential flow filtration (TFF) system for the concentration and washing of hMSC. Moreover, we have also explored a continuous alternative for concentrating hMSC. Results show that expanding cells in a continuous perfusion operation mode provided a higher expansion ratio and led to a shift in the cells' metabolism. TFF, operated either continuously or discontinuously, allowed cells to be concentrated with high cell recovery (>80%) and viability (>95%); furthermore, continuous TFF permitted longer operation at higher cell concentrations. Continuous diafiltration led to higher protein clearance (98%) with lower cell death compared to discontinuous diafiltration. Overall, the integrated process allowed for a shorter process time, recovering 70% of viable hMSC (>95%), with no changes in morphology, immunophenotype, proliferation capacity, or multipotent differentiation potential. Copyright © 2015 Elsevier B.V. All rights reserved.
Robust passive control for a class of uncertain neutral systems based on sliding mode observer.
Liu, Zhen; Zhao, Lin; Kao, Yonggui; Gao, Cunchen
2017-01-01
The passivity-based sliding mode control (SMC) problem for a class of uncertain neutral systems with unmeasured states is investigated. Firstly, a particular non-fragile state observer is designed to generate the estimations of the system states, based upon which a novel integral-type sliding surface function is established for the control process. Secondly, a new sufficient condition for robust asymptotic stability and passivity of the resultant sliding mode dynamics (SMDs) is obtained in terms of linear matrix inequalities (LMIs). Thirdly, the finite-time reachability of the predesigned sliding surface is ensured by resorting to a novel adaptive SMC law. Finally, the validity and superiority of the scheme are justified via several examples. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
A robust low-rate coding scheme for packet video
NASA Technical Reports Server (NTRS)
Chen, Y. C.; Sayood, Khalid; Nelson, D. J.; Arikan, E. (Editor)
1991-01-01
With the rapid evolution of image processing and networking, video promises to be an important part of telecommunication systems. Although video transmission has so far been carried mainly over circuit-switched networks, packet-switched networks are likely to dominate the communication world in the near future. Asynchronous transfer mode (ATM) techniques in broadband ISDN can provide a flexible, independent, and high-performance environment for video communication. In this paper, the network simulator is used only as a channel. Mixture block coding with progressive transmission (MBCPT) has been investigated for use over packet networks and has been found to provide a high compression ratio with good visual performance, robustness to packet loss, tractable integration with network mechanics, and simplicity in parallel implementation.
Tomographical process monitoring of laser transmission welding with OCT
NASA Astrophysics Data System (ADS)
Ackermann, Philippe; Schmitt, Robert
2017-06-01
Process control of laser processes still encounters many obstacles. Although these processes are stable, narrow process parameter windows and process deviations have increased the requirements on the process itself and on monitoring devices. Laser transmission welding, as a contactless and locally limited joining technique, is well established in a variety of demanding production areas. For example, sensitive parts demand a particle-free joining technique which does not affect the inner components. Inline-integrated, non-destructive optical measurement systems capable of providing non-invasive tomographical images of the transparent material, the weld seam, and its surrounding areas with micron resolution would improve the overall process. The obtained measurement data enable qualitative feedback into the system to adapt parameters for a more robust process. Within this paper we present the inline monitoring device based on Fourier-domain optical coherence tomography developed within the European-funded research project "Manunet Weldable". This device, after adaptation to the laser transmission welding process, is optically and mechanically integrated into the existing laser system. The main target is inline process control designed to extract tomographical geometrical measurement data from the weld seam forming process. Usage of this technology makes offline destructive testing of produced parts obsolete.
Flexible distributed architecture for semiconductor process control and experimentation
NASA Astrophysics Data System (ADS)
Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.
1997-01-01
Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD interferometry based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: specific implementation of any one task does not restrict the implementation of another. The low-level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between these equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server manages connections from internet/intranet (web) based clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket-based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers independent of hardware or software platform.
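The "predefined set of TCP/IP socket based messages" pattern can be pictured with a small sketch. The newline-delimited JSON framing and the message name below are invented for illustration, not the actual protocol used between the cell controller and the equipment controllers:

```python
import json
import socket

def encode(msg_type, payload):
    # frame a control message as one newline-terminated JSON line
    return (json.dumps({"type": msg_type, "payload": payload}) + "\n").encode()

def decode(line):
    # parse one received line back into (type, payload)
    obj = json.loads(line)
    return obj["type"], obj["payload"]

# loopback demonstration standing in for a controller-to-controller link
a, b = socket.socketpair()
a.sendall(encode("GET_STATUS", {"chamber": 1}))
reader = b.makefile("rb")
msg_type, payload = decode(reader.readline())
reader.close()
a.close()
b.close()
```

Line-delimited framing keeps each component free to run on a different machine and platform, as the architecture above intends; only the byte format of the messages is shared.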
Genomic medicine in the military
De Castro, Mauricio; Biesecker, Leslie G; Turner, Clesson; Brenner, Ruth; Witkop, Catherine; Mehlman, Maxwell; Bradburne, Chris; Green, Robert C
2016-01-01
The announcement of the Precision Medicine Initiative was an important step towards establishing the use of genomic information as part of the wider practice of medicine. The US military has been exploring the role that genomic information will have in health care for service members (SMs) and its integration into the continuum of military medicine. An important part of this process is establishing robust safeguards to protect SMs from genetic discrimination in the era of exome/genome sequencing. PMID:29263806
A robust functional-data-analysis method for data recovery in multichannel sensor systems.
Sun, Jian; Liao, Haitao; Upadhyaya, Belle R
2014-08-01
Multichannel sensor systems are widely used in condition monitoring for effective failure prevention of critical equipment or processes. However, loss of sensor readings due to malfunctions of sensors and/or communication has long been a hurdle to reliable operations of such integrated systems. Moreover, asynchronous data sampling and/or limited data transmission are usually seen in multiple sensor channels. To reliably perform fault diagnosis and prognosis in such operating environments, a data recovery method based on functional principal component analysis (FPCA) can be utilized. However, traditional FPCA methods are not robust to outliers and their capabilities are limited in recovering signals with strongly skewed distributions (i.e., lack of symmetry). This paper provides a robust data-recovery method based on functional data analysis to enhance the reliability of multichannel sensor systems. The method not only considers the possibly skewed distribution of each channel of signal trajectories, but is also capable of recovering missing data for both individual and correlated sensor channels with asynchronous data that may be sparse as well. In particular, grand median functions, rather than classical grand mean functions, are utilized for robust smoothing of sensor signals. Furthermore, the relationship between the functional scores of two correlated signals is modeled using multivariate functional regression to enhance the overall data-recovery capability. An experimental flow-control loop that mimics the operation of coolant-flow loop in a multimodular integral pressurized water reactor is used to demonstrate the effectiveness and adaptability of the proposed data-recovery method. The computational results illustrate that the proposed method is robust to outliers and more capable than the existing FPCA-based method in terms of the accuracy in recovering strongly skewed signals. 
In addition, turbofan engine data are analyzed to verify the capability of the proposed method in recovering non-skewed signals.
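The robustness argument for grand median functions over grand mean functions can be seen in a toy pointwise comparison. The synthetic trajectories below, including one malfunctioning channel, are invented for illustration only:

```python
import statistics

trajectories = [
    [1.0, 2.0, 3.0, 4.0],
    [1.1, 2.1, 3.1, 4.1],
    [0.9, 1.9, 2.9, 3.9],
    [9.0, 9.0, 9.0, 9.0],  # malfunctioning sensor channel (outliers)
]

# pointwise summaries across channels at each sampling time
grand_mean = [statistics.mean(col) for col in zip(*trajectories)]
grand_median = [statistics.median(col) for col in zip(*trajectories)]
```

The median curve stays with the three healthy channels while the mean curve is dragged toward the faulty one, which is the property the proposed FPCA variant exploits for robust smoothing.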
Breakdown of interdependent directed networks.
Liu, Xueming; Stanley, H Eugene; Gao, Jianxi
2016-02-02
Increasing evidence shows that real-world systems interact with one another via dependency connectivities. Failing connectivities are the mechanism behind the breakdown of interacting complex systems, e.g., blackouts caused by the interdependence of power grids and communication networks. Previous research analyzing the robustness of interdependent networks has been limited to undirected networks. However, most real-world networks are directed, their in-degrees and out-degrees may be correlated, and they are often coupled to one another as interdependent directed networks. To understand the breakdown and robustness of interdependent directed networks, we develop a theoretical framework based on generating functions and percolation theory. We find that for interdependent Erdős-Rényi networks the directionality within each network increases their vulnerability and produces hybrid phase transitions. We also find that the percolation behavior of interdependent directed scale-free networks with and without degree correlations is so complex that two criteria are needed to quantify and compare their robustness: the percolation threshold and the integrated size of the giant component during an entire attack process. Interestingly, we find that the in-degree and out-degree correlations in each network layer increase the robustness of interdependent degree-heterogeneous networks, which most real networks are, but decrease the robustness of interdependent networks with homogeneous degree distributions and strong coupling strengths. Moreover, by applying our theoretical analysis to real interdependent international trade networks, we find that the robustness of these real-world systems increases with the in-degree and out-degree correlations, confirming our theoretical analysis.
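One of the two robustness criteria named above, the integrated size of the giant component during an entire attack process, can be sketched on a single Erdős-Rényi graph; the full interdependent directed analysis needs the generating-function machinery of the paper. All parameters below are illustrative:

```python
import random

def giant_fraction(n, edges, removed):
    # largest connected component among surviving nodes, via union-find
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        if u in removed or v in removed:
            continue
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    sizes = {}
    for x in range(n):
        if x not in removed:
            r = find(x)
            sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values(), default=0) / n

def integrated_giant_size(n=300, p=0.02, seed=7):
    rng = random.Random(seed)
    edges = [(u, v) for u in range(n) for v in range(u + 1, n)
             if rng.random() < p]
    order = list(range(n))
    rng.shuffle(order)
    removed = set()
    total = 0.0
    for node in order:               # random attack, one node at a time
        total += giant_fraction(n, edges, removed)
        removed.add(node)
    return total / n                 # area under the giant-component curve
```

A larger integrated size means the giant component survives further into the attack; the paper pairs this criterion with the percolation threshold because neither alone ranks the complex transitions of directed scale-free layers.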
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Ji, Haoran; Wang, Chengshan
Distributed generators (DGs), including photovoltaic panels (PVs), have been integrated dramatically in active distribution networks (ADNs). Due to strong volatility and uncertainty, the high penetration of PV generation immensely exacerbates voltage violations in ADNs. However, the emerging flexible interconnection technology based on soft open points (SOPs) provides increased controllability and flexibility to system operation. To fully exploit the regulation ability of SOPs to address the problems caused by PV, this paper proposes a robust optimization method to achieve robust optimal operation of SOPs in ADNs. A two-stage adjustable robust optimization model is built to tackle the uncertainties of PV outputs, in which robust operation strategies of SOPs are generated to eliminate voltage violations and reduce the power losses of ADNs. A column-and-constraint generation (C&CG) algorithm is developed to solve the proposed robust optimization model, which is formulated as a second-order cone program (SOCP) to improve accuracy and computational efficiency. Case studies on the modified IEEE 33-node system and comparisons with a deterministic optimization approach are conducted to verify the effectiveness and robustness of the proposed method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiebenga, J. H.; Atzema, E. H.; Boogaard, A. H. van den
Robust design of forming processes using numerical simulations is gaining attention throughout the industry. In this work, it is demonstrated how robust optimization can assist in further stretching the limits of metal forming processes. A deterministic and a robust optimization study are performed, considering a stretch-drawing process of a hemispherical cup product. For the robust optimization study, the effects of both material and process scatter are taken into account. For quantifying the material scatter, samples of 41 coils of a drawing-quality forming steel were collected. The stochastic material behavior is obtained by a hybrid approach, combining mechanical testing and texture analysis, and efficiently implemented in a metamodel-based optimization strategy. The deterministic and robust optimization results are subsequently presented and compared, demonstrating increased process robustness and a decreased number of product rejects with the robust optimization approach.
Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation
NASA Technical Reports Server (NTRS)
Roberts, Barry; Bhanu, Bir
1992-01-01
Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.
An integrated CMOS bio-potential amplifier with a feed-forward DC cancellation topology.
Parthasarathy, Jayant; Erdman, Arthur G; Redish, Aaron D; Ziaie, Babak
2006-01-01
This paper describes a novel technique to realize an integrated CMOS bio-potential amplifier with a feed-forward DC cancellation topology. The amplifier is designed to provide substantial DC cancellation even while amplifying very low frequency signals. More than 80 dB offset rejection ratio is achieved without any external capacitors. The cancellation scheme is robust against process and temperature variations. The amplifier was fabricated through the MOSIS AMI 1.5 microm technology (0.05 mm2 area). Measurement results show a gain of 43.5 dB in the pass band (<1 mHz to 5 kHz), an input-referred noise of 3.66 microVrms, and a current consumption of 22 microA.
Fugitive Methane Gas Emission Monitoring in oil and gas industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Levente
Identifying fugitive methane leaks allows optimization of the extraction process, can extend gas extraction equipment lifetime, and eliminates hazardous work conditions. We demonstrate a wireless sensor network based on cost-effective and robust chemi-resistive methane sensors combined with real-time analytics to identify leaks from 2 scfh to 10000 scfh. The chemi-resistive sensors were validated for sensitivity better than 1 ppm for methane plume detection. The real-time chemical sensor and wind data are integrated into an inversion model to identify the location and magnitude of the methane leak. This integrated solution can be deployed in outdoor environments for long-term monitoring of chemical plumes.
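The inversion step described above can be caricatured with a toy model: a 1/r² concentration falloff and a brute-force grid search over candidate source locations and strengths. The real system incorporates wind fields and an atmospheric dispersion model; everything below is an invented stand-in:

```python
def concentration(strength, src, sensor):
    # toy dispersion model: reading falls off as 1/r^2 from the source
    dx, dy = sensor[0] - src[0], sensor[1] - src[1]
    r2 = dx * dx + dy * dy + 1.0   # +1 avoids the singularity at the source
    return strength / r2

def locate_leak(sensors, readings, grid=21, size=10.0):
    # grid search over candidate source locations and a few leak strengths,
    # minimizing squared error against the sensor readings
    best = None
    for i in range(grid):
        for j in range(grid):
            src = (size * i / (grid - 1), size * j / (grid - 1))
            for strength in (50.0, 100.0, 200.0):
                err = sum((concentration(strength, src, s) - r) ** 2
                          for s, r in zip(sensors, readings))
                if best is None or err < best[0]:
                    best = (err, src, strength)
    return best[1], best[2]

# synthetic ground truth: a leak of strength 100 at (5, 5)
sensors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0), (5.0, 2.0)]
readings = [concentration(100.0, (5.0, 5.0), s) for s in sensors]
```

With noise-free synthetic readings the grid search recovers the source exactly; with real sensor noise and wind, a probabilistic inversion would be used instead.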
Park, Chan Woo; Moon, Yu Gyeong; Seong, Hyejeong; Jung, Soon Won; Oh, Ji-Young; Na, Bock Soon; Park, Nae-Man; Lee, Sang Seok; Im, Sung Gap; Koo, Jae Bon
2016-06-22
We demonstrate a new patterning technique for gallium-based liquid metals on flat substrates, which can provide both high pattern resolution (∼20 μm) and the alignment precision required for highly integrated circuits. In a manner very similar to the patterning of solid metal films by photolithography and lift-off processes, a liquid metal layer painted over the whole substrate area can be selectively removed by dissolving the underlying photoresist layer, leaving behind robust liquid patterns as defined by the photolithography. This quick and simple method makes it possible to precisely integrate fine-scale interconnects with preformed devices, which is indispensable for realizing monolithically integrated stretchable circuits. As a way of constructing stretchable integrated circuits, we propose a hybrid configuration composed of rigid device regions and liquid interconnects, which is constructed on a rigid substrate first but is highly stretchable after being transferred onto an elastomeric substrate. This new method can be useful in various applications requiring both high-resolution and precisely aligned patterning of gallium-based liquid metals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.
2011-03-01
This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow, transport, and mechanical deformation remains challenging.
The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. The accessibility of the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements. Based on the gap analysis results, we have made the following recommendations for code selection and development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments.
The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.
Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M
2013-06-01
Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other critical targets for process technology and process scale implementation, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large scale operation. The process conditions identified are finally translated to microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions, and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse, and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukic, Zarija
2016-04-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables, they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of this design, and illustrate its versatility by tackling analysis tasks with different computational requirements. The design differs significantly from existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.
Gap geometry dictates epithelial closure efficiency
Ravasio, Andrea; Cheddadi, Ibrahim; Chen, Tianchi; Pereira, Telmo; Ong, Hui Ting; Bertocchi, Cristina; Brugues, Agusti; Jacinto, Antonio; Kabla, Alexandre J.; Toyama, Yusuke; Trepat, Xavier; Gov, Nir; Neves de Almeida, Luís; Ladoux, Benoit
2015-01-01
Closure of wounds and gaps in tissues is fundamental for the correct development and physiology of multicellular organisms and, when misregulated, may lead to inflammation and tumorigenesis. To re-establish tissue integrity, epithelial cells exhibit coordinated motion into the void by active crawling on the substrate and by constricting a supracellular actomyosin cable. Coexistence of these two mechanisms strongly depends on the environment. However, the nature of their coupling remains elusive because of the complexity of the overall process. Here we demonstrate that gap geometry regulates these collective mechanisms both in vitro and in vivo. In addition, the mechanical coupling between actomyosin cable contraction and cell crawling acts as a large-scale regulator to control the dynamics of gap closure. Finally, our computational modelling clarifies the respective roles of the two mechanisms during this process, providing a robust and universal mechanism to explain how epithelial tissues restore their integrity. PMID:26158873
Analytic network process model for sustainable lean and green manufacturing performance indicator
NASA Astrophysics Data System (ADS)
Aminuddin, Adam Shariff Adli; Nawawi, Mohd Kamal Mohd; Mohamed, Nik Mohd Zuki Nik
2014-09-01
Sustainable manufacturing is regarded as the most complex manufacturing paradigm to date as it holds the widest scope of requirements. In addition, its three major pillars of economy, environment and society, though distinct, have some overlap among their elements. Even though the concept of sustainability is not new, the development of the performance indicator still needs a lot of improvement due to its multifaceted nature, which requires an integrated approach to solve the problem. This paper proposes the best combination of criteria for forming a robust sustainable manufacturing performance indicator via the Analytic Network Process (ANP). The integrated lean, green and sustainable ANP model can be used to comprehend the complex decision system of the sustainability assessment. The findings show that green manufacturing is more sustainable than lean manufacturing. They also illustrate that procurement practice is the most important criterion in the sustainable manufacturing performance indicator.
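The ANP computation that underlies such an indicator can be sketched as follows; this is a minimal illustration, not the paper's actual model, and the supermatrix values are hypothetical:

```python
# Hedged sketch of the Analytic Network Process (ANP): global priorities are
# read from the limit of a column-stochastic "supermatrix" of pairwise-
# comparison weights. The 3x3 matrix below couples three hypothetical
# criteria clusters (lean, green, procurement); values are illustrative.

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def limit_supermatrix(w, powers=50):
    """Raise the weighted supermatrix to a high power until columns converge."""
    m = w
    for _ in range(powers):
        m = mat_mul(m, w)
    return m

W = [[0.2, 0.5, 0.3],   # each column sums to 1 (column-stochastic)
     [0.5, 0.2, 0.4],
     [0.3, 0.3, 0.3]]

limit = limit_supermatrix(W)
priorities = [row[0] for row in limit]  # any column of the limit matrix
```

In a full ANP model the supermatrix entries come from eigenvector-derived pairwise comparisons across clusters; in the limit all columns become identical and give the global priority of each element.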
NASA Astrophysics Data System (ADS)
Saunders, Vance M.
1999-06-01
The downsizing of the Department of Defense (DoD) and the associated reduction in budgets has re-emphasized the need for commonality, reuse, and standards with respect to the way DoD does business. DoD has implemented significant changes in how it buys weapon systems. The new emphasis is on concurrent engineering with Integrated Product and Process Development and collaboration with Integrated Product Teams. The new DoD vision includes Simulation Based Acquisition (SBA), a process supported by robust, collaborative use of simulation technology that is integrated across acquisition phases and programs. This paper discusses the Air Force Research Laboratory's efforts to use Modeling and Simulation (M&S) resources within a Collaborative Enterprise Environment to support SBA and other Collaborative Enterprise and Virtual Prototyping (CEVP) applications. The paper will discuss four technology areas: (1) a Processing Ontology that defines a hierarchically nested set of collaboration contexts needed to organize and support multi-disciplinary collaboration using M&S, (2) a partial taxonomy of intelligent agents needed to manage different M&S resource contributions to advancing the state of product development, (3) an agent-based process for interfacing disparate M&S resources into a CEVP framework, and (4) a Model-View-Control based approach to defining 'a new way of doing business' for users of CEVP frameworks/systems.
NASA Technical Reports Server (NTRS)
Chen, Wei; Tsui, Kwok-Leung; Allen, Janet K.; Mistree, Farrokh
1994-01-01
In this paper we introduce a comprehensive and rigorous robust design procedure to overcome some limitations of the current approaches. A comprehensive approach is general enough to model the two major types of robust design applications, namely, robust design associated with the minimization of the deviation of performance caused by the deviation of noise factors (uncontrollable parameters), and robust design associated with the minimization of the deviation of performance caused by the deviation of control factors (design variables). We achieve mathematical rigor by using, as a foundation, principles from the design of experiments and optimization. Specifically, we integrate the Response Surface Method (RSM) with the compromise Decision Support Problem (DSP). Our approach is especially useful for design problems where there are no closed-form solutions and system performance is computationally expensive to evaluate. The design of a solar powered irrigation system is used as an example. Our focus in this paper is on illustrating our approach rather than on the results per se.
Integrating Terrain Maps Into a Reactive Navigation Strategy
NASA Technical Reports Server (NTRS)
Howard, Ayanna; Werger, Barry; Seraji, Homayoun
2006-01-01
An improved method of processing information for autonomous navigation of a robotic vehicle across rough terrain involves the integration of terrain maps into a reactive navigation strategy. Somewhat more precisely, the method involves the incorporation, into navigation logic, of data equivalent to regional traversability maps. The terrain characteristic is mapped using a fuzzy-logic representation of the difficulty of traversing the terrain. The method is robust in that it integrates a global path-planning strategy with sensor-based regional and local navigation strategies to ensure a high probability of success in reaching a destination and avoiding obstacles along the way. The sensor-based strategies use cameras aboard the vehicle to observe the regional terrain, defined as the area of terrain extending from the immediate vicinity of the vehicle to a specified distance a few meters away.
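A fuzzy-logic traversability measure of the kind described can be sketched as follows; the membership shapes, input ranges, and rule are hypothetical, not taken from the paper:

```python
# Sketch of a fuzzy-logic traversability index: terrain roughness and slope
# are mapped through membership functions and combined with a fuzzy AND.
# Membership shapes and ranges are hypothetical, not taken from the paper.

def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def traversability(roughness, slope_deg):
    """0..1 index; higher means easier terrain. Fuzzy AND = min."""
    smooth = tri(roughness, -0.5, 0.0, 0.5)   # normalized roughness near 0
    flat = tri(slope_deg, -15.0, 0.0, 15.0)   # slope within +/- 15 degrees
    # Rule: terrain is traversable if it is smooth AND flat.
    return min(smooth, flat)
```

A navigation strategy can then rank candidate headings by this index, steering toward regions whose mapped difficulty is lowest.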
Audio-visual affective expression recognition
NASA Astrophysics Data System (ADS)
Huang, Thomas S.; Zeng, Zhihong
2007-11-01
Automatic affective expression recognition has attracted increasing attention from researchers in different disciplines, and will significantly contribute to a new paradigm for human-computer interaction (affect-sensitive interfaces, socially intelligent environments) and advance research in affect-related fields including psychology, psychiatry, and education. Multimodal information integration is a process that enables humans to assess affective states robustly and flexibly. In order to understand the richness and subtleness of human emotion behavior, the computer should be able to integrate information from multiple sensors. We introduce in this paper our efforts toward machine understanding of audio-visual affective behavior, based on both deliberate and spontaneous displays. Some promising methods are presented to integrate information from both audio and visual modalities. Our experiments show the advantage of audio-visual fusion in affective expression recognition over audio-only or visual-only approaches.
Single-use thermoplastic microfluidic burst valves enabling on-chip reagent storage
Rahmanian, Omid D.
2014-01-01
A simple and reliable method for fabricating single-use normally closed burst valves in thermoplastic microfluidic devices is presented, using a process flow that is readily integrated into established workflows for the fabrication of thermoplastic microfluidics. An experimental study of valve performance reveals the relationships between valve geometry and burst pressure. The technology is demonstrated in a device employing multiple valves engineered to actuate at different inlet pressures that can be generated using integrated screw pumps. On-chip storage and reconstitution of fluorescein salt sealed within defined reagent chambers are demonstrated. By taking advantage of the low gas and water permeability of cyclic olefin copolymer, the robust burst valves allow on-chip hermetic storage of reagents, making the technology well suited for the development of integrated and disposable assays for use at the point of care. PMID:25972774
A fully-integrated aptamer-based affinity assay platform for monitoring astronaut health in space.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xianbin; Durland, Ross H.; Hecht, Ariel H.
2010-07-01
Here we demonstrate the suitability of robust nucleic acid affinity reagents in an integrated point-of-care diagnostic platform for monitoring proteomic biomarkers indicative of astronaut health in spaceflight applications. A model thioaptamer targeting nuclear factor-kappa B (NF-κB) is evaluated in an on-chip electrophoretic gel-shift assay for human serum. Key steps of (i) mixing sample with the aptamer, (ii) buffer exchange, and (iii) preconcentration of sample were successfully integrated upstream of fluorescence-based detection. Challenges due to (i) nonspecific interactions with serum, and (ii) preconcentration at a nanoporous membrane are discussed and successfully resolved to yield a robust, rapid, and fully-integrated diagnostic system.
Sinclair, D.; Oranje, B.; Razak, K.A.; Siegel, S.J.; Schmid, S.
2017-01-01
Brains are constantly flooded with sensory information that needs to be filtered at the pre-attentional level and integrated into endogenous activity in order to allow for detection of salient information and an appropriate behavioral response. People with Autism Spectrum Disorder (ASD) or Fragile X Syndrome (FXS) are often over- or under-reactive to stimulation, leading to a wide range of behavioral symptoms. This altered sensitivity may be caused by disrupted sensory processing, signal integration and/or gating, and is often neglected. Here, we review translational experimental approaches that are used to investigate sensory processing in humans with ASD and FXS, and in relevant rodent models. This includes electroencephalographic measurement of event related potentials, neural oscillations and mismatch negativity, as well as habituation and pre-pulse inhibition of startle. We outline robust evidence of disrupted sensory processing in individuals with ASD and FXS, and in respective animal models, focusing on the auditory sensory domain. Animal models provide an excellent opportunity to examine common mechanisms of sensory pathophysiology in order to develop therapeutics. PMID:27235081
Evaluating integrated strategies for robust treatment of high saline piggery wastewater.
Kim, Hyun-Chul; Choi, Wook Jin; Chae, A Na; Park, Joonhong; Kim, Hyung Joo; Song, Kyung Guen
2016-02-01
In this study, we integrated physicochemical and biological strategies for the robust treatment of piggery effluent in which high levels of organic constituents, inorganic nutrients, color, and salts remained. Piggery effluent that was stabilized in an anaerobic digester was sequentially coagulated, micro-filtered, and air-stripped prior to biological treatment with mixotrophic algal species that showed tolerance to high salinity (up to 4.8% as Cl⁻). The algae treatment was conducted with continuous O2 supplementation instead of using the combination of high lighting and CO2 injection. The microalga Scenedesmus quadricauda employed as a bio-agent was capable of assimilating both nitrogen (222 mg N g cell⁻¹ d⁻¹) and phosphorus (9.3 mg P g cell⁻¹ d⁻¹) and utilizing dissolved organics (2053 mg COD g cell⁻¹ d⁻¹) as a carbon source in a single treatment process under the heterotrophic growth conditions. The heterotrophic growth of S. quadricauda proceeded rapidly by directly incorporating organic substrate in the oxidative assimilation process, which coincided with the high productivity of algal biomass, accounting for 2.4 g cell L⁻¹ d⁻¹. The algae-treated wastewater was subsequently ozonated to comply with discharge permits that limit color in the effluent, which also resulted in improved biodegradability of residual organics. The integrated treatment scheme proposed in this study also achieved 89% removal of COD, 88% removal of TN, and 60% removal of TP. The advantage of using the hybrid configuration suggests that this would be a promising strategy in full-scale treatment facilities for piggery effluent. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Vanderlinden, J. P.; Baztan, J.
2014-12-01
The purpose of this paper is to present the "Adaptation Research: a Transdisciplinary community and policy centered approach" (ARTisticc) project. ARTisticc's goal is to apply innovative, standardized transdisciplinary art and science integrative approaches to foster community-centred adaptation to climate change that is socially, culturally and scientifically robust. The approach used in the project is based on the strong understanding that adaptation is: (a) still "a concept of uncertain form"; (b) a concept dealing with uncertainty; (c) a concept that calls for an analysis that goes beyond the traditional disciplinary organization of science; and (d) an unconventional process in the realm of science and policy integration. The project is centered on case studies in France, Greenland, Russia, India, Canada, Alaska, and Senegal. In every site we jointly develop artwork while analyzing how natural science, essentially the geosciences, can be used to better adapt in the future, how society adapts to current changes, and how memories of past adaptations frame current and future processes. Artforms are mobilized to share scientific results with local communities and policy makers in a way that respects cultural specificities while empowering stakeholders; ARTisticc translates these "real life experiments" into stories and artwork that are meaningful to those affected by climate change. The scientific results and the culturally mediated productions will thereafter be used to co-construct, with NGOs and policy makers, policy briefs, i.e., robust and scientifically legitimate policy recommendations regarding coastal adaptation. This co-construction process will itself be analysed with the goal of increasing art and science's performative functions in the universe of evidence-based policy making.
The project involves scientists from the natural sciences, the social sciences and the humanities, as well as artists from the performing arts (playwrights, film directors) and the visual arts (photographers, designers, sculptors), working in France, Senegal, India, Russia, Greenland, Alaska, and Canada.
Fracturing And Liquid CONvection
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-02-29
FALCON has been developed to enable simulation of the tightly coupled fluid-rock behavior in hydrothermal and engineered geothermal system (EGS) reservoirs, targeting the dynamics of fracture stimulation, fluid flow, rock deformation, and heat transport in a single integrated code, with the ultimate goal of providing a tool that can be used to test the viability of EGS in the United States and worldwide. Reliable reservoir performance predictions for EGS require accurate and robust modeling of the coupled thermal-hydrological-mechanical processes.
Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke
2018-04-09
To meet the autonomy and reliability requirements of navigation systems, a new scheme consisting of a Strapdown Inertial Navigation System (SINS), Spectral Redshift (SRS) navigation, and a Geomagnetic Navigation System (GNS) is designed for autonomous integrated navigation, incorporating a method that measures velocity from the spectral redshift of natural celestial bodies. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. Simulation experiments show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness and high reliability, providing a new solution for autonomous navigation technology.
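The filtering step can be illustrated with a minimal bootstrap particle filter for a scalar system; this is a simplified stand-in for the paper's robust adaptive central difference particle filter, and all models and parameters are illustrative:

```python
import math
import random

# Minimal bootstrap particle filter for a scalar random walk with noisy
# observations -- a simplified stand-in for the robust adaptive central
# difference particle filter proposed in the paper; parameters illustrative.

def particle_filter(observations, n=500, q=0.1, r=0.5, seed=0):
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for z in observations:
        # Predict: propagate each particle through the motion model.
        particles = [p + rng.gauss(0.0, q) for p in particles]
        # Update: weight each particle by its Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((z - p) / r) ** 2) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # Resample (multinomial) to combat weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n)
    return estimates
```

In an integrated navigation setting the state would be the multi-dimensional error vector of the SINS, with SRS and GNS measurements supplying the observation likelihoods.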
CMOS-compatible photonic devices for single-photon generation
NASA Astrophysics Data System (ADS)
Xiong, Chunle; Bell, Bryn; Eggleton, Benjamin J.
2016-09-01
Sources of single photons are one of the key building blocks for quantum photonic technologies such as quantum secure communication and powerful quantum computing. To bring the proof-of-principle demonstration of these technologies from the laboratory to the real world, complementary metal-oxide-semiconductor (CMOS)-compatible photonic chips are highly desirable for photon generation, manipulation, processing and even detection because of their compactness, scalability, robustness, and the potential for integration with electronics. In this paper, we review the development of photonic devices made from materials (e.g., silicon) and processes that are compatible with CMOS fabrication facilities for the generation of single photons.
Fiber Optic Tamper Indicating Enclosure (TIE); A Case Study in Authentication
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anheier, Norman C.; Benz, Jacob M.; Tanner, Jennifer E.
2015-07-15
A robust fiber optic-based tamper-indicating enclosure (TIE) has been developed by PNNL through funding by the National Nuclear Security Administration Office of Nuclear Verification over the past few years. The objective of this work is to allow monitors to have confidence in both the authenticity and integrity of the TIE and the monitoring equipment inside, throughout the time it may be located at a host facility. Incorporating authentication features into the design was the focus of fiscal year 2014 development efforts. Throughout the development process, modifications have been made to the physical TIE design based on lessons learned via exercises and expert elicitation. The end result is a robust and passive TIE which can be utilized to protect monitoring party equipment left in a host facility.
E-Textile Antennas for Space Environments
NASA Technical Reports Server (NTRS)
Kennedy, Timothy F.; Fink, Patrick W.; Chu, Andrew W.
2007-01-01
The ability to integrate antennas and other radio frequency (RF) devices into wearable systems is increasingly important as wireless voice, video, and data sources become ubiquitous. Consumer applications including mobile computing, communications, and entertainment, as well as military and space applications for integration of biotelemetry, detailed tracking information and status of handheld tools, devices and on-body inventories are driving forces for research into wearable antennas and other e-textile devices. Operational conditions for military and space applications of wireless systems are often such that antennas are a limiting factor in wireless performance. The changing antenna platform, i.e. the dynamic wearer, can detune and alter the radiation characteristics of e-textile antennas, making antenna element selection and design challenging. Antenna designs and systems that offer moderate bandwidth, perform well with flexure, and are electronically reconfigurable are ideally suited to wearable applications. Several antennas, shown in Figure 1, have been created using a NASA-developed process for e-textiles that show promise in being integrated into a robust wireless system for space-based applications. Preliminary characterization of the antennas with flexure indicates that antenna performance can be maintained, and that a combination of antenna design and placement are useful in creating robust designs. Additionally, through utilization of modern smart antenna techniques, even greater flexibility can be achieved since antenna performance can be adjusted in real-time to compensate for the antenna's changing environment.
Towards Integrated Health Technology Assessment for Improving Decision Making in Selected Countries.
Oortwijn, Wija; Determann, Domino; Schiffers, Krijn; Tan, Siok Swan; van der Tuin, Jeroen
2017-09-01
To assess the level of comprehensiveness of health technology assessment (HTA) practices around the globe and to formulate recommendations for enhancing legitimacy and fairness of related decision-making processes. To identify best practices, we developed an evaluation framework consisting of 13 criteria on the basis of the INTEGRATE-HTA model (integrative perspective on assessing health technologies) and the Accountability for Reasonableness framework (deliberative appraisal process). We examined different HTA systems in middle-income countries (Argentina, Brazil, and Thailand) and high-income countries (Australia, Canada, England, France, Germany, Scotland, and South Korea). For this purpose, desk research and structured interviews with relevant key stakeholders (N = 32) in the selected countries were conducted. HTA systems in Canada, England, and Scotland appear relatively well aligned with our framework, followed by Australia, Germany, and France. Argentina and South Korea are at an early stage, whereas Brazil and Thailand are at an intermediate level. Both desk research and interviews revealed that scoping is often not part of the HTA process. In contrast, providing evidence reports for assessment is well established. Indirect and unintended outcomes are increasingly considered, but there is room for improvement. Monitoring and evaluation of the HTA process is not well established across countries. Finally, adopting transparent and robust processes, including stakeholder consultation, takes time. This study presents a framework for assessing the level of comprehensiveness of the HTA process in a country. On the basis of applying the framework, we formulate recommendations on how the HTA community can move toward a more integrated decision-making process using HTA. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Airborne Transducer Integrity under Operational Environment for Structural Health Monitoring
Salmanpour, Mohammad Saleh; Sharif Khodaei, Zahra; Aliabadi, Mohammad Hossein
2016-01-01
This paper investigates the robustness of permanently mounted transducers used in airborne structural health monitoring systems, when exposed to the operational environment. Typical airliners operate in a range of conditions, hence, structural health monitoring (SHM) transducer robustness and integrity must be demonstrated for these environments. A set of extreme temperature, altitude and vibration environment test profiles are developed using the existing Radio Technical Commission for Aeronautics (RTCA)/DO-160 test methods. Commercially available transducers and manufactured versions bonded to carbon fibre reinforced polymer (CFRP) composite materials are tested. It was found that the DuraAct transducer is robust to environmental conditions tested, while the other transducer types degrade under the same conditions. PMID:27973450
NASA Technical Reports Server (NTRS)
Tri, Terry O.; Kennedy, Kriss J.; Toups, Larry; Gill, Tracy R.; Howe, A. Scott
2011-01-01
This paper describes the construction, assembly, subsystem integration, transportation, and field testing operations associated with the Habitat Demonstration Unit (HDU) Pressurized Excursion Module (PEM) and discusses lessons learned. In a one-year period beginning summer 2009, a tightly scheduled design-develop-build process was utilized by a small NASA "tiger team" to produce the functional HDU-PEM prototype in time to participate in the 2010 Desert Research and Technology Studies (Desert RATS) field campaign. The process required the coordination of multiple teams, subcontractors, facility management and safety staff. It also required a well-choreographed material handling and transportation process to deliver the finished product from the NASA-Johnson Space Center facilities to the remote Arizona desert locations of the field test. Significant findings of this paper include the team's greater understanding of the HDU-PEM's many integration issues and the in-field training the team acquired, which will enable the implementation of the next generation of improvements and development of high-fidelity field operations in a harsh environment. The Desert RATS analog environment is being promoted by NASA as an efficient means to design, build, and integrate multiple technologies in a mission architecture context, with the eventual goal of evolving the technologies into robust flight hardware systems. The HDU-PEM in-field demonstration at Desert RATS 2010 provided a validation process for the integration team, which has already begun to retool for the 2011 field tests that require an adapted architecture.
Skin-inspired hydrogel-elastomer hybrids with robust interfaces and functional microstructures
NASA Astrophysics Data System (ADS)
Yuk, Hyunwoo; Zhang, Teng; Parada, German Alberto; Liu, Xinyue; Zhao, Xuanhe
2016-06-01
Inspired by mammalian skins, soft hybrids integrating the merits of elastomers and hydrogels have potential applications in diverse areas including stretchable and bio-integrated electronics, microfluidics, tissue engineering, soft robotics and biomedical devices. However, existing hydrogel-elastomer hybrids have limitations such as weak interfacial bonding, low robustness and difficulties in patterning microstructures. Here, we report a simple yet versatile method to assemble hydrogels and elastomers into hybrids with extremely robust interfaces (interfacial toughness over 1,000 J m⁻²) and functional microstructures such as microfluidic channels and electrical circuits. The proposed method is generally applicable to various types of tough hydrogels and diverse commonly used elastomers including polydimethylsiloxane Sylgard 184, polyurethane, latex, VHB and Ecoflex. We further demonstrate applications enabled by the robust and microstructured hydrogel-elastomer hybrids including anti-dehydration hydrogel-elastomer hybrids, stretchable and reactive hydrogel-elastomer microfluidics, and stretchable hydrogel circuit boards patterned on elastomer.
Unified dead-time compensation structure for SISO processes with multiple dead times.
Normey-Rico, Julio E; Flesch, Rodolfo C C; Santos, Tito L M
2014-11-01
This paper proposes a dead-time compensation structure for processes with multiple dead times. The controller is based on the filtered Smith predictor (FSP) dead-time compensator structure and is able to control stable, integrating, and unstable processes with multiple input/output dead times. An equivalent model of the process is first computed in order to define the predictor structure. Using this equivalent model, the primary controller and the predictor filter are tuned to obtain an internally stable closed-loop system that also meets closed-loop specifications for set-point tracking, disturbance rejection, and robustness. Simulation case studies illustrate the good properties of the proposed approach. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
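The dead-time compensation idea can be sketched for the simplest case, a stable first-order process with a single input dead time; this is a toy discrete-time illustration rather than the paper's FSP tuning, and the gains and model parameters are hypothetical:

```python
# Toy discrete-time Smith predictor for a stable first-order process with
# input dead time: y[k] = a*y[k-1] + b*u[k-1-d]. The predictor feeds back a
# delay-free model output plus a correction from the measured output, so the
# primary controller can be tuned as if the plant had no delay. Gains and
# model parameters are hypothetical, not taken from the paper.

def simulate(steps=200, a=0.9, b=0.1, d=10, kp=2.0, ref=1.0):
    y = 0.0                     # plant output
    yf = 0.0                    # delay-free internal model output
    u_hist = [0.0] * (d + 1)    # past controls (the plant's dead time)
    ym_hist = []                # model outputs, delayed for the correction
    out = []
    for _ in range(steps):
        ym_hist.append(yf)
        y_model_delayed = ym_hist[-d - 1] if len(ym_hist) > d else 0.0
        y_pred = yf + (y - y_model_delayed)  # predictor output
        u = kp * (ref - y_pred)              # primary controller sees no delay
        u_hist.append(u)
        y = a * y + b * u_hist[-d - 1]       # plant with input dead time
        yf = a * yf + b * u                  # delay-free model
        out.append(y)
    return out
```

With a perfect model the correction term vanishes and the loop behaves like a delay-free plant under proportional control; the proportional-only controller used here leaves a steady-state offset, which the FSP's primary controller and predictor filter would be designed to remove while also providing robustness to model mismatch.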
Optimized pulses for the control of uncertain qubits
Grace, Matthew D.; Dominy, Jason M.; Witzel, Wayne M.; ...
2012-05-18
The construction of high-fidelity control fields that are robust to control, system, and/or surrounding environment uncertainties is a crucial objective for quantum information processing. Using the two-state Landau-Zener model for illustrative simulations of a controlled qubit, we generate optimal controls for π/2 and π pulses and investigate their inherent robustness to uncertainty in the magnitude of the drift Hamiltonian. Next, we construct a quantum-control protocol to improve system-drift robustness by combining environment-decoupling pulse criteria and optimal control theory for unitary operations. By perturbatively expanding the unitary time-evolution operator for an open quantum system, previous analysis of environment-decoupling control pulses has calculated explicit control-field criteria to suppress environment-induced errors up to (but not including) third order from π/2 and π pulses. We systematically integrate these criteria with optimal control theory, incorporating an estimate of the uncertain parameter to produce improvements in gate fidelity and robustness, demonstrated via a numerical example based on double quantum dot qubits. For the qubit model used in this work, post facto analysis of the resulting controls suggests that realistic control-field fluctuations and noise may contribute just as significantly to gate errors as system and environment fluctuations.
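The robustness question posed here can be illustrated with the textbook Rabi formula for a two-level system, showing how uncertainty in the drift term degrades a naively calibrated pulse; this is a toy calculation, and the paper's optimized and environment-decoupling pulses are beyond this sketch:

```python
import math

# Toy robustness calculation: population transfer of a nominal pi pulse for a
# two-level system H = (delta*sigma_z + omega*sigma_x)/2, as a function of an
# uncertain drift term delta. Values are illustrative; the paper's optimized
# and environment-decoupling pulses are beyond this sketch.

def flip_probability(delta, omega=1.0):
    """Rabi formula: P(|0> -> |1>) for a pulse of duration t = pi/omega."""
    g = math.sqrt(delta ** 2 + omega ** 2)  # generalized Rabi frequency
    t = math.pi / omega                      # duration calibrated at delta = 0
    return (omega / g) ** 2 * math.sin(g * t / 2.0) ** 2

ideal = flip_probability(0.0)    # perfect flip on resonance
detuned = flip_probability(0.2)  # fidelity lost to drift uncertainty
```

Scanning `flip_probability` over a range of `delta` values gives exactly the kind of robustness profile against which optimized pulses are compared.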
Stochastic Integration H∞ Filter for Rapid Transfer Alignment of INS.
Zhou, Dapeng; Guo, Lei
2017-11-18
The performance of an inertial navigation system (INS) operated on a moving base greatly depends on the accuracy of rapid transfer alignment (RTA). However, in practice, the coexistence of large initial attitude errors and uncertain observation noise statistics poses a great challenge for the estimation accuracy of misalignment angles. This study aims to develop a novel robust nonlinear filter, namely the stochastic integration H∞ filter (SIH∞F), for improving both the accuracy and robustness of RTA. In this new nonlinear H∞ filter, the stochastic spherical-radial integration rule is incorporated with the framework of the derivative-free H∞ filter for the first time, and the resulting SIH∞F simultaneously attenuates the negative effect in estimations caused by significant nonlinearity and large uncertainty. Comparisons between the SIH∞F and previously well-known methodologies are carried out by means of numerical simulation and a van test. The results demonstrate that the newly-proposed method outperforms the cubature H∞ filter. Moreover, the SIH∞F inherits the benefit of the traditional stochastic integration filter, but with more robustness in the presence of uncertainty.
Canales, Javier; Moyano, Tomás C.; Villarroel, Eva; Gutiérrez, Rodrigo A.
2014-01-01
Nitrogen (N) is an essential macronutrient for plant growth and development. Plants adapt to changes in N availability partly by changes in global gene expression. We integrated publicly available root microarray data under contrasting nitrate conditions to identify new genes and functions important for adaptive nitrate responses in Arabidopsis thaliana roots. Overall, more than 2000 genes exhibited changes in expression in response to nitrate treatments in Arabidopsis thaliana root organs. Global regulation of gene expression by nitrate depends largely on the experimental context. However, despite significant differences from experiment to experiment in the identity of regulated genes, there is a robust nitrate response of specific biological functions. Integrative gene network analysis uncovered relationships between nitrate-responsive genes and 11 highly co-expressed gene clusters (modules). Four of these gene network modules have robust nitrate responsive functions such as transport, signaling, and metabolism. Network analysis suggested that G2-like transcription factors are key regulatory factors controlling transport and signaling functions. Our meta-analysis highlights the role of biological processes not studied before in the context of the nitrate response, such as root hair development, and provides testable hypotheses to advance our understanding of nitrate responses in plants. PMID:24570678
Empirical study using network of semantically related associations in bridging the knowledge gap.
Abedi, Vida; Yeasin, Mohammed; Zand, Ramin
2014-11-27
The data overload has created a new set of challenges in finding meaningful and relevant information with minimal cognitive effort. However, designing robust and scalable knowledge discovery systems remains a challenge. Recent innovations in (biological) literature mining tools have opened new avenues to understand the confluence of various diseases, genes, risk factors as well as biological processes in bridging the gaps between the massive amounts of scientific data and harvesting useful knowledge. In this paper, we highlight some of the findings using a text analytics tool called ARIANA--Adaptive Robust and Integrative Analysis for finding Novel Associations. Empirical study using ARIANA reveals knowledge discovery instances that illustrate the efficacy of such a tool. For example, ARIANA can capture the connection between the drug hexamethonium and pulmonary inflammation and fibrosis that caused the tragic death of a healthy volunteer in a 2001 Johns Hopkins asthma study, even though the abstract of the study was not part of the semantic model. An integrated system, such as ARIANA, could assist the human expert in exploratory literature search by bringing forward hidden associations, promoting data reuse and knowledge discovery as well as stimulating interdisciplinary projects by connecting information across the disciplines.
Robust performance of multiple tasks by a mobile robot
NASA Technical Reports Server (NTRS)
Beckerman, Martin; Barnett, Deanna L.; Dickens, Mike; Weisbin, Charles R.
1989-01-01
While there have been many successful mobile robot experiments, only a few papers have addressed issues pertaining to the range of applicability, or robustness, of robotic systems. The purpose of this paper is to report results of a series of benchmark experiments done to determine and quantify the robustness of an integrated hardware and software system of a mobile robot.
Lin, Zhicheng; He, Sheng
2012-10-25
Object identities ("what") and their spatial locations ("where") are processed in distinct pathways in the visual system, raising the question of how the what and where information is integrated. Because of object motions and eye movements, the retina-based representations are unstable, necessitating nonretinotopic representation and integration. A potential mechanism is to code and update objects according to their reference frames (i.e., frame-centered representation and integration). To isolate frame-centered processes, in a frame-to-frame apparent motion configuration, we (a) presented two preceding or trailing objects on the same frame, equidistant from the target on the other frame, to control for object-based (frame-based) and space-based effects, and (b) manipulated the target's relative location within its frame to probe the frame-centered effect. We show that iconic memory, visual priming, and backward masking depend on objects' relative frame locations, orthogonal to the retinotopic coordinate. These findings not only reveal that iconic memory, visual priming, and backward masking can be nonretinotopic but also demonstrate that these processes are automatically constrained by contextual frames through a frame-centered mechanism. Thus, object representation is robustly and automatically coupled to its reference frame and continuously updated through a frame-centered, location-specific mechanism. These findings lead to an object cabinet framework, in which objects ("files") within the reference frame ("cabinet") are orderly coded relative to the frame.
Chhatre, Sunil; Jones, Carl; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel; Newcombe, Anthony; Keshavarz-Moore, Eli
2006-01-01
Growing commercial pressures in the pharmaceutical industry are establishing a need for robust computer simulations of whole bioprocesses to allow rapid prediction of the effects of changes made to manufacturing operations. This paper presents an integrated process simulation that models the cGMP manufacture of the FDA-approved biotherapeutic CroFab, an IgG fragment used to treat rattlesnake envenomation (Protherics U.K. Limited, Blaenwaun, Ffostrasol, Llandysul, Wales, U.K.). Initially, the product is isolated from ovine serum by precipitation and centrifugation, before enzymatic digestion of the IgG to produce FAB and FC fragments. These are purified by ion exchange and affinity chromatography to remove the FC and non-specific FAB fragments from the final venom-specific FAB product. The model was constructed in a discrete event simulation environment and used to determine the potential impact of a series of changes to the process, such as increasing the step efficiencies or volumes of chromatographic matrices, upon product yields and process times. The study indicated that the overall FAB yield was particularly sensitive to changes in the digestive and affinity chromatographic step efficiencies, which have a predicted 30% greater impact on process FAB yield than do the precipitation or centrifugation stages. The study showed that increasing the volume of affinity matrix has a negligible impact upon total process time. Although results such as these would require experimental verification within the physical constraints of the process and the facility, the model predictions are still useful in allowing rapid "what-if" scenario analysis of the likely impacts of process changes within such an integrated production process.
Sehgal, Vasudha; Seviour, Elena G; Moss, Tyler J; Mills, Gordon B; Azencott, Robert; Ram, Prahlad T
2015-01-01
MicroRNAs (miRNAs) play a crucial role in the maintenance of cellular homeostasis by regulating the expression of their target genes. As such, the dysregulation of miRNA expression has been frequently linked to cancer. With rapidly accumulating molecular data linked to patient outcome, the need for identification of robust multi-omic molecular markers is critical in order to provide clinical impact. While previous bioinformatic tools have been developed to identify potential biomarkers in cancer, these methods do not allow for rapid classification of oncogenes versus tumor suppressors taking into account robust differential expression, cutoffs, p-values and non-normality of the data. Here, we propose a methodology, Robust Selection Algorithm (RSA) that addresses these important problems in big data omics analysis. The robustness of the survival analysis is ensured by identification of optimal cutoff values of omics expression, strengthened by p-value computed through intensive random resampling taking into account any non-normality in the data and integration into multi-omic functional networks. Here we have analyzed pan-cancer miRNA patient data to identify functional pathways involved in cancer progression that are associated with selected miRNA identified by RSA. Our approach demonstrates the way in which existing survival analysis techniques can be integrated with a functional network analysis framework to efficiently identify promising biomarkers and novel therapeutic candidates across diseases.
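The cutoff-plus-resampling idea behind RSA can be sketched compactly. The snippet below is a simplified, hypothetical stand-in, not the published algorithm: it scans candidate expression cutoffs for the split that best separates outcomes (here just a difference of group means rather than a survival statistic) and attaches a permutation p-value, which is one way to avoid normality assumptions, as the abstract emphasizes.

```python
import random

def best_cutoff(expr, outcome, n_perm=200, seed=0):
    # Scan candidate cutoffs on a marker's expression values and keep the
    # one giving the largest separation of outcome means; then estimate a
    # p-value by random permutation of the outcome labels (resampling).
    rng = random.Random(seed)

    def score(e, o):
        best = 0.0
        for c in sorted(set(e))[:-1]:          # exclude max so "high" is nonempty
            lo = [y for x, y in zip(e, o) if x <= c]
            hi = [y for x, y in zip(e, o) if x > c]
            best = max(best, abs(sum(hi) / len(hi) - sum(lo) / len(lo)))
        return best

    obs = score(expr, outcome)
    extreme = 0
    for _ in range(n_perm):
        perm = outcome[:]
        rng.shuffle(perm)                      # break any real association
        if score(expr, perm) >= obs:
            extreme += 1
    return obs, (extreme + 1) / (n_perm + 1)   # add-one permutation p-value
```

A real application would replace the mean-difference score with a log-rank or similar survival statistic computed at each cutoff.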
Structure to self-structuring: infrastructures and processes in neurobehavioural rehabilitation.
Jackson, Howard F; Hague, Gemma; Daniels, Leanne; Aguilar, Ralph; Carr, Darren; Kenyon, William
2014-01-01
The importance of structure in post-acute brain injury rehabilitation is repeatedly mentioned in clinical practice. However, there has been little exploration of the key elements of structure that promote greater levels of functioning and emotional/behavioural stability and how these elements are optimally integrated within the infrastructure of a rehabilitation service. The nature of structure and why it is helpful is explored initially. Thereafter, the processes involved in transition from externally supported structure to the client 'self-structuring' are described. The infrastructure for facilitating these transitional processes is considered in terms of the design of services for systemic neurorehabilitation encompassing environmental factors (e.g. living environments, vocational and recreational options, step-up services and social milieus), therapeutic alliances (rehabilitation professionals, family, friends), organisational structures (service delivery, rehabilitation coaching, transdisciplinary teams) and rehabilitation philosophies and practice. It is concluded that the process of supporting individuals to transition from the 'structure' of the environment and other people towards self-structuring skills is a critical process in rehabilitation. This is reliant upon a comprehensive and robust organisational infrastructure that can successfully and flexibly integrate the core elements of structure across a transitional pathway towards increased independence and self-structuring.
NASA Astrophysics Data System (ADS)
Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.
2016-12-01
The United States National Science Foundation funded PermaData project led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P) aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly speed up the manual processing needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage this data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets. Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative, data manipulation and quality control process of visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led first to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool in GitHub packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager.
DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets (270 sites), and is scheduled to translate 10 more datasets (~1000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.
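The widget-pipeline pattern the DIT abstract describes reduces to composing small single-purpose operations in a user-chosen order. A minimal sketch (the function name and widget set are illustrative, not DIT's actual API):

```python
def run_workflow(data, widgets):
    # Each widget is a small, single-purpose callable (read, scale, sort,
    # plot, write, ...); the workflow applies them in the order the user
    # arranged them, passing each widget's output to the next.
    for widget in widgets:
        data = widget(data)
    return data
```

For example, a two-widget pipeline that sorts readings and converts their units could be expressed as `run_workflow(raw, [sorted, convert_units])`, where `convert_units` is any user-supplied callable.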
Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy
NASA Astrophysics Data System (ADS)
Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.
2011-08-01
The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
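The robustness measure used in such metamodel-based strategies is commonly a "mean plus k sigma" objective evaluated over sampled noise-variable realizations. A minimal hedged sketch (the simulator interface and the choice k = 3 are assumptions for illustration; in practice `simulate` would be the metamodel fitted to FE results, not the FE code itself):

```python
import numpy as np

def robust_objective(simulate, x, noise_samples, k=3.0):
    # Robust cost = mean + k * std of the response over realizations of the
    # noise variables; minimizing this trades off nominal performance
    # against sensitivity to uncontrolled variation.
    y = np.array([simulate(x, z) for z in noise_samples])
    return float(y.mean() + k * y.std())
```

A sequential improvement step would then add new FE evaluations only near the current robust optimum, refitting the metamodel before re-minimizing this objective.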
Picosecond and femtosecond lasers for industrial material processing
NASA Astrophysics Data System (ADS)
Mayerhofer, R.; Serbin, J.; Deeg, F. W.
2016-03-01
Cold laser materials processing using ultra short pulsed lasers has become one of the most promising new technologies for high-precision cutting, ablation, drilling and marking of almost all types of material, without causing unwanted thermal damage to the part. These characteristics have opened up new application areas and materials for laser processing, allowing previously impossible features to be created and also reducing the amount of post-processing required to an absolute minimum, saving time and cost. However, short pulse widths are only one part of the story for industrial manufacturing processes, which focus on total costs and maximum productivity and production yield. Like every other production tool, ultra-short pulse lasers have to provide high quality results with maximum reliability. Robustness and global on-site support are vital factors, as well as easy system integration.
NASA Astrophysics Data System (ADS)
Chang, Che-Chia; Liu, Po-Tsun; Chien, Chen-Yu; Fan, Yang-Shun
2018-04-01
This study demonstrates the integration of a thin film transistor (TFT) and resistive random-access memory (RRAM) to form a one-transistor-one-resistor (1T1R) configuration. With the concept of the current conducting direction in RRAM and TFT, a triple-layer stack design of Pt/InGaZnO/Al2O3 is proposed for both the switching layer of RRAM and the channel layer of TFT. This proposal decreases the complexity of fabrication and the numbers of photomasks required. Also, the robust endurance and stable retention characteristics are exhibited by the 1T1R architecture for promising applications in memory-embedded flat panel displays.
JACOB: an enterprise framework for computational chemistry.
Waller, Mark P; Dresselhaus, Thomas; Yang, Jack
2013-06-15
Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.
A practical approach to automate randomized design of experiments for ligand-binding assays.
Tsoi, Jennifer; Patel, Vimal; Shih, Judy
2014-03-01
Design of experiments (DOE) is utilized in optimizing ligand-binding assay by modeling factor effects. To reduce the analyst's workload and error inherent with DOE, we propose the integration of automated liquid handlers to perform the randomized designs. A randomized design created from statistical software was imported into custom macro converting the design into a liquid-handler worklist to automate reagent delivery. An optimized assay was transferred to a contract research organization resulting in a successful validation. We developed a practical solution for assay optimization by integrating DOE and automation to increase assay robustness and enable successful method transfer. The flexibility of this process allows it to be applied to a variety of assay designs.
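The design-to-worklist conversion described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' macro: the well naming (A1, A2, ...) and the (well, factor, level) row format are assumptions, and a real worklist would carry volumes and source positions for the liquid handler.

```python
import random
from itertools import product

def design_to_worklist(factors, seed=0):
    # Expand a full-factorial design from a dict of factor levels,
    # randomize the run order (the step that is error-prone by hand),
    # and emit one row per factor per well for the liquid handler.
    runs = list(product(*factors.values()))
    random.Random(seed).shuffle(runs)          # randomized run order
    rows = []
    for well, levels in enumerate(runs, start=1):
        for name, level in zip(factors, levels):
            rows.append((f"A{well}", name, level))
    return rows
```

In the actual workflow the randomized design would come from the statistical software's export rather than `itertools.product`, with this conversion layer only mapping runs to deck positions.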
Robust and compact entanglement generation from diode-laser-pumped four-wave mixing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawrie, B. J.; Yang, Y.; Eaton, M.
Four-wave-mixing processes are now routinely used to demonstrate multi-spatial-mode Einstein-Podolsky-Rosen entanglement and intensity difference squeezing. Recently, diode-laser-pumped four-wave mixing processes have been shown to provide an affordable, compact, and stable source for intensity difference squeezing, but it was unknown if excess phase noise present in power amplifier pump configurations would be an impediment to achieving quadrature entanglement. Here, we demonstrate the operating regimes under which these systems are capable of producing entanglement and under which excess phase noise produced by the amplifier contaminates the output state. We show that Einstein-Podolsky-Rosen entanglement in two mode squeezed states can be generated by a four-wave-mixing source deriving both the pump field and the local oscillators from a tapered-amplifier diode-laser. In conclusion, this robust continuous variable entanglement source is highly scalable and amenable to miniaturization, making it a critical step toward the development of integrated quantum sensors and scalable quantum information processors, such as spatial comb cluster states.
Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter
Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Gu, Chengfan
2018-01-01
This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as local filters to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate globally optimal state estimation by fusion of local estimations. The proposed methodology effectively refrains from the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems. It also achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation. PMID:29415509
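The Mahalanobis-distance test that drives the adaptive fading step can be sketched on its own. The snippet below is a simplified stand-in for the paper's adaptation rule, not its exact formula: it compares the innovation's squared Mahalanobis distance under the predicted innovation covariance against a chi-square threshold and returns the factor by which the predicted covariance would be inflated.

```python
import numpy as np

def fading_factor(innovation, S, chi2_threshold):
    # Squared Mahalanobis distance of the filter innovation under its
    # predicted covariance S; values beyond the chi-square threshold
    # indicate process-modeling error, so the predicted covariance is
    # inflated by the returned factor (1.0 means no fading applied).
    m2 = float(innovation @ np.linalg.solve(S, innovation))
    return max(1.0, m2 / chi2_threshold)
```

In the filter loop, the predicted covariance is multiplied by this factor before the update, so consistent innovations leave the filter unchanged while outliers are progressively down-weighted.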
NASA Astrophysics Data System (ADS)
Xu, Guoqing; Liu, Ping; Ren, Yurong; Huang, Xiaobing; Peng, Zhiguang; Tang, Yougen; Wang, Haiyan
2017-09-01
The fabrication of an ideal electrode architecture consisting of robust three dimensional (3D) nanowire networks has gained special interest for energy storage applications owing to the integrated advantages of nanostructures and microstructures. In this work, 3D MoO2 nanotextiles assembled from highly interconnected elongated nanowires are successfully prepared by a facile stirring assisted hydrothermal method followed by an annealing process. In addition, a methylbenzene/water biphasic reaction system is involved in the hydrothermal process. When used as an anode material in Li ion batteries (LIBs), these robust MoO2 nanotextiles exhibit a high reversible capacity (860.4 mAh g-1 at 300 mA g-1), excellent cycling performance (89% capacity retention after 160 cycles) and rate capability (577 mAh g-1 at 2000 mA g-1). Various synthetic factors in the fabrication of the 3D nanotextile structure are discussed here, and this design of 3D network structures may be extended to the preparation of other functional nanomaterials.
Nanowire active-matrix circuitry for low-voltage macroscale artificial skin.
Takei, Kuniharu; Takahashi, Toshitake; Ho, Johnny C; Ko, Hyunhyub; Gillies, Andrew G; Leu, Paul W; Fearing, Ronald S; Javey, Ali
2010-10-01
Large-scale integration of high-performance electronic components on mechanically flexible substrates may enable new applications in electronics, sensing and energy. Over the past several years, tremendous progress in the printing and transfer of single-crystalline, inorganic micro- and nanostructures on plastic substrates has been achieved through various process schemes. For instance, contact printing of parallel arrays of semiconductor nanowires (NWs) has been explored as a versatile route to enable fabrication of high-performance, bendable transistors and sensors. However, truly macroscale integration of ordered NW circuitry has not yet been demonstrated, with the largest-scale active systems being of the order of 1 cm(2) (refs 11,15). This limitation is in part due to assembly- and processing-related obstacles, although larger-scale integration has been demonstrated for randomly oriented NWs (ref. 16). Driven by this challenge, here we demonstrate macroscale (7×7 cm(2)) integration of parallel NW arrays as the active-matrix backplane of a flexible pressure-sensor array (18×19 pixels). The integrated sensor array effectively functions as an artificial electronic skin, capable of monitoring applied pressure profiles with high spatial resolution. The active-matrix circuitry operates at a low operating voltage of less than 5 V and exhibits superb mechanical robustness and reliability, without performance degradation on bending to small radii of curvature (2.5 mm) for over 2,000 bending cycles. This work presents the largest integration of ordered NW-array active components, and demonstrates a model platform for future integration of nanomaterials for practical applications.
van der Weiden, Anouk; Aarts, Henk; Prikken, Merel; van Haren, Neeltje E M
2016-02-01
Successful social interaction requires the ability to integrate as well as distinguish own and others' actions. Normally, the integration and distinction of self and other are well balanced, occurring without much effort or conscious attention. However, not everyone is blessed with the ability to balance self-other distinction and integration, resulting in personal distress in reaction to other people's emotions or even a loss of self [e.g., in (subclinical) psychosis]. Previous research has demonstrated that the integration and distinction of others' actions cause interference with one's own action performance (commonly assessed with a social Simon task). The present study had two goals. First, as previous studies on the social Simon effect employed relatively small samples (N < 50 per test), we aimed for a sample size that allowed us to test the robustness of the action interference effect. Second, we tested to what extent action interference reflects individual differences in traits related to self-other distinction (i.e., personal distress in reaction to other people's emotions and subclinical psychotic symptoms). Based on a questionnaire study among a large sample (N = 745), we selected a subsample (N = 130) of participants scoring low, average, or high on subclinical psychotic symptoms, or on personal distress. The selected participants performed a social Simon task. Results showed a robust social Simon effect, regardless of individual differences in personal distress or subclinical psychotic symptoms. However, exploratory analyses revealed that the sex composition of interaction pairs modulated social Simon effects. Possible explanations for these findings are discussed.
Wei, Wenhui; Gao, Zhaohui; Gao, Shesheng; Jia, Ke
2018-01-01
In order to meet the requirements of autonomy and reliability for the navigation system, and drawing on the method of measuring velocity from the spectral redshift information of natural celestial bodies, a new scheme, consisting of Strapdown Inertial Navigation System (SINS)/Spectral Redshift (SRS)/Geomagnetic Navigation System (GNS), is designed for autonomous integrated navigation systems. The principle of this SINS/SRS/GNS autonomous integrated navigation system is explored, and the corresponding mathematical model is established. Furthermore, a robust adaptive central difference particle filtering algorithm is proposed for this autonomous integrated navigation system. Simulation experiments are conducted and the results show that the designed SINS/SRS/GNS autonomous integrated navigation system possesses good autonomy, strong robustness and high reliability, thus providing a new solution for autonomous navigation technology. PMID:29642549
NASA Astrophysics Data System (ADS)
Ma, Xu; Li, Yanqiu; Guo, Xuejia; Dong, Lisong
2012-03-01
Optical proximity correction (OPC) and phase shifting mask (PSM) are the most widely used resolution enhancement techniques (RET) in the semiconductor industry. Recently, a set of OPC and PSM optimization algorithms have been developed to solve for the inverse lithography problem, which are only designed for the nominal imaging parameters without giving sufficient attention to the process variations due to the aberrations, defocus and dose variation. However, the effects of process variations existing in the practical optical lithography systems become more pronounced as the critical dimension (CD) continuously shrinks. On the other hand, the lithography systems with larger NA (NA>0.6) are now extensively used, rendering the scalar imaging models inadequate to describe the vector nature of the electromagnetic field in the current optical lithography systems. In order to tackle the above problems, this paper focuses on developing robust gradient-based OPC and PSM optimization algorithms to the process variations under a vector imaging model. To achieve this goal, an integrative and analytic vector imaging model is applied to formulate the optimization problem, where the effects of process variations are explicitly incorporated in the optimization framework. The steepest descent algorithm is used to optimize the mask iteratively. In order to improve the efficiency of the proposed algorithms, a set of algorithm acceleration techniques (AAT) are exploited during the optimization procedure.
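The steepest descent iteration used to update the mask in such gradient-based OPC/PSM schemes has a simple generic form. The sketch below is the bare optimization loop only, with an assumed `grad` callable; the actual gradient of the vector-imaging cost, the mask parameterization, and the process-variation averaging from the paper are not reproduced here.

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, iters=100):
    # Generic steepest-descent loop: repeatedly move against the gradient
    # of the cost. In inverse lithography, x would be the (parameterized)
    # mask and grad the derivative of the pattern-fidelity cost, averaged
    # over process conditions for a robust solution.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x
```

On a toy quadratic cost (x - 3)^2, whose gradient is 2(x - 3), the loop converges geometrically to 3, which is a quick way to sanity-check step-size choices before attaching the full imaging model.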
Development of a Robust Identifier for NPPs Transients Combining ARIMA Model and EBP Algorithm
NASA Astrophysics Data System (ADS)
Moshkbar-Bakhshayesh, Khalil; Ghofrani, Mohammad B.
2014-08-01
This study introduces a novel identification method for recognition of nuclear power plant (NPP) transients by combining the autoregressive integrated moving-average (ARIMA) model and a neural network with the error backpropagation (EBP) learning algorithm. The proposed method consists of three steps. First, an EBP-based identifier is adopted to distinguish the plant normal states from the faulty ones. In the second step, ARIMA models use the integrated (I) process to convert non-stationary data of the selected variables into stationary ones. Subsequently, ARIMA processes, including autoregressive (AR), moving-average (MA), or autoregressive moving-average (ARMA), are used to forecast time series of the selected plant variables. In the third step, for identifying the type of transient, the forecasted time series are fed to the modular identifier, which has been developed using the latest advances of the EBP learning algorithm. Bushehr nuclear power plant (BNPP) transients are probed to analyze the ability of the proposed identifier. Recognition of a transient is based on the similarity of its statistical properties to the reference one, rather than the values of input patterns. Greater robustness against noisy data and an improved balance between memorization and generalization are salient advantages of the proposed identifier. Reduction of false identification, sole dependency of identification on the sign of each output signal, selection of the plant variables for transient training independently of each other, and extendibility for identification of more transients without unfavorable effects are other merits of the proposed identifier.
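The "integrated" step of the ARIMA stage, differencing to stationarity and then forecasting, can be sketched for the simplest case. This is an illustrative ARIMA(1,1,0) one-step forecast under assumed least-squares fitting, not the plant-scale models in the paper:

```python
import numpy as np

def arima_110_forecast(series):
    # Minimal ARIMA(1,1,0) one-step forecast: difference once (the "I"
    # step, turning a trending series into a stationary one), fit the
    # AR(1) coefficient by least squares, then undo the differencing.
    x = np.asarray(series, dtype=float)
    dx = np.diff(x)                                   # integrated step
    phi = (dx[:-1] @ dx[1:]) / (dx[:-1] @ dx[:-1])    # AR(1) coefficient
    return float(x[-1] + phi * dx[-1])                # last value + predicted change
```

On a pure linear trend the differenced series is constant, the fitted coefficient is 1, and the forecast simply extends the trend; real plant variables would of course need the MA terms and order selection the abstract mentions.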
NASA Astrophysics Data System (ADS)
Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd
2009-05-01
Infrastructure management (and its associated processes) is complex to understand and perform, which makes efficient, effective and informed decision making difficult. It is a multi-faceted operation that requires robust data fusion, visualization and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry, local and federal government agencies. IRSV is being designed to accommodate the essential needs of the following aspects: 1) Better understanding and enforcement of the complex inspection process, bridging the gap between evidence gathering and decision making through the implementation of an ontological knowledge engineering system; 2) Aggregation, representation and fusion of complex multi-layered heterogeneous data (e.g. infrared imaging, aerial photos and ground-mounted LIDAR) with domain application knowledge to support a machine-understandable recommendation system; 3) Robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) Integration of these needs through a flexible Service-Oriented Architecture (SOA) framework to compose and provide services on-demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring both periodically (annually, monthly, even daily if needed) as well as after extreme events.
Medrea, Ioana
2013-01-01
The mouse has become an important model system for studying the cellular basis of learning and coding of heading by the vestibular system. Here we recorded from single neurons in the vestibular nuclei to understand how vestibular pathways encode self-motion under natural conditions, during which proprioceptive and motor-related signals as well as vestibular inputs provide feedback about an animal's movement through the world. We recorded neuronal responses in alert behaving mice, focusing on a group of neurons, termed vestibular-only cells, that are known to control posture and project to higher-order centers. We found that the majority (70%, n = 21/30) of neurons were bimodal, in that they responded robustly to passive stimulation of proprioceptors as well as to passive stimulation of the vestibular system. Additionally, the linear summation of a given neuron's vestibular and neck sensitivities predicted well its responses when both stimuli were applied simultaneously. In contrast, neuronal responses were suppressed when the same motion was actively generated, with the one striking exception that the activity of bimodal neurons similarly and robustly encoded head-on-body position in all conditions. Our results show that proprioceptive and motor-related signals are combined with vestibular information at the first central stage of vestibular processing in mice. We suggest that these results have important implications for understanding the multisensory integration underlying accurate postural control and the neural representation of directional heading in the head direction cell network of mice. PMID:24089394
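The linear-summation result above can be made concrete with a toy calculation: if a bimodal neuron's firing rate is linear in its vestibular and neck-proprioceptive inputs, then its response to combined stimulation equals the sum of the two unimodal modulations about baseline. The gains, baseline and stimulus values below are invented for illustration only.

```python
def response(g_vest, g_neck, vest_stim, neck_stim, baseline=40.0):
    # linear rate model: baseline firing plus gain-weighted stimulus terms
    return baseline + g_vest * vest_stim + g_neck * neck_stim

g_vest, g_neck = 0.8, -0.5   # hypothetical sensitivities (spikes/s per deg/s)
vest, neck = 20.0, 10.0      # hypothetical stimulus velocities (deg/s)
baseline = 40.0

r_vest_only = response(g_vest, g_neck, vest, 0.0, baseline)
r_neck_only = response(g_vest, g_neck, 0.0, neck, baseline)
r_combined = response(g_vest, g_neck, vest, neck, baseline)

# linear prediction: sum of the two unimodal modulations about baseline
predicted = r_vest_only + r_neck_only - baseline
print(abs(predicted - r_combined) < 1e-9)  # True
```

The experimental finding is that measured combined responses of vestibular-only cells matched this kind of prediction well during passive stimulation.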
Integrated assessment of urban drainage system under the framework of uncertainty analysis.
Dong, X; Chen, J; Zeng, S; Zhao, D
2008-01-01
Due to rapid urbanization as well as the presence of a large number of aging urban infrastructures in China, urban drainage systems are facing the dual pressure of construction and renovation nationwide. This leads to the need for an integrated assessment when an urban drainage system is being planned or re-designed. In this paper, an integrated assessment methodology is proposed based upon the approaches of the analytic hierarchy process (AHP), uncertainty analysis, mathematical simulation of the urban drainage system and fuzzy assessment. To illustrate this methodology, a case study in Shenzhen City of south China has been implemented to evaluate and compare two different urban drainage system renovation plans, i.e., a distributed plan and a centralized plan. By comparing their water quality impacts, ecological impacts, technological feasibility and economic costs, the integrated performance of the distributed plan is found to be both better and more robust. The proposed methodology is also found to be both effective and practical. (c) IWA Publishing 2008.
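The AHP component of the methodology can be sketched briefly: criterion weights are derived from a reciprocal pairwise-comparison matrix as its principal eigenvector, here approximated by power iteration. The 3x3 matrix below (say, water quality vs. ecology vs. cost) is invented for illustration and is not the paper's actual judgement data.

```python
def ahp_weights(A, iters=100):
    # power iteration toward the principal eigenvector, renormalized each step
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# hypothetical reciprocal pairwise judgements on the Saaty 1-9 scale
A = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]

w = ahp_weights(A)
print([round(x, 3) for x in w])  # weights sum to 1; the first criterion dominates
```

In the paper, such weights feed a fuzzy assessment that scores the distributed and centralized renovation plans under uncertainty.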
Note: On-chip multifunctional fluorescent-magnetic Janus helical microswimmers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwang, G., E-mail: gilgueng.hwang@lpn.cnrs.fr; Decanini, D.; Leroy, L.
Microswimmers integrated into microfluidic devices that are capable of self-illumination through fluorescence could revolutionize many aspects of technology, especially for biological applications. Few illumination and propulsion techniques for helical microswimmers inside microfluidic channels have been demonstrated. This paper presents the fabrication, detachment, and magnetic propulsion of multifunctional fluorescent-magnetic helical microswimmers integrated inside microfluidics. The fabrication process is based on two-photon laser lithography to pattern 3-D nanostructures from fluorescent photoresist, coupled with conventional microfabrication techniques for magnetic thin film deposition by shadowing. After direct integration inside a microfluidic device, an injected gas bubble allows gentle detachment of the integrated helical microswimmers, whose magnetic propulsion can then be directly applied inside the microfluidic channel using an external electromagnetic coil setup. With their small scale, fluorescence, excellent resistance to liquid/gas surface tension, and robust propulsion capability inside the microfluidic channel, the microswimmers can be used as high-resolution and large-range mobile micromanipulators inside microfluidic channels.
Electromagnetic pulsed thermography for natural cracks inspection
NASA Astrophysics Data System (ADS)
Gao, Yunlai; Tian, Gui Yun; Wang, Ping; Wang, Haitao; Gao, Bin; Woo, Wai Lok; Li, Kongjing
2017-02-01
Emerging integrated sensing and monitoring of material degradation and cracks are increasingly required for characterizing the structural integrity and safety of infrastructure. However, most conventional nondestructive evaluation (NDE) methods are based on single-modality sensing, which is not adequate to evaluate structural integrity and natural cracks. This paper proposes electromagnetic pulsed thermography for fast and comprehensive defect characterization. It combines multiple physical phenomena, i.e. magnetic flux leakage, induced eddy currents and induction heating, linking the underlying physics with signal processing algorithms to provide rich information on material properties and defects. New features based on the first derivative are proposed that reflect multiphysics spatial and temporal behavior to enhance the detection of cracks with different orientations. Promising results, robust to lift-off changes and yielding invariant features for artificial and natural crack detection, demonstrate that the proposed method significantly improves defect detectability. It opens up multiphysics sensing and integrated NDE with potential impact for better understanding and quantitative evaluation of natural cracks, including stress corrosion cracking (SCC) and rolling contact fatigue (RCF).
Materials, Structures and Manufacturing: An Integrated Approach to Develop Expandable Structures
NASA Technical Reports Server (NTRS)
Belvin, W. Keith; Zander, Martin E.; Sleight, David W.; Connell, John; Holloway, Nancy; Palmieri, Frank
2012-01-01
Membrane dominated space structures are lightweight and package efficiently for launch; however, they must be expanded (deployed) in-orbit to achieve the desired geometry. These expandable structural systems include solar sails, solar power arrays, antennas, and numerous other large aperture devices that are used to collect, reflect and/or transmit electromagnetic radiation. In this work, an integrated approach to development of thin-film damage tolerant membranes is explored using advanced manufacturing. Bio-inspired hierarchical structures were printed on films using additive manufacturing to achieve improved tear resistance and to facilitate membrane deployment. High precision, robust expandable structures can be realized using materials that are both space durable and processable using additive manufacturing. Test results show this initial work produced higher tear resistance than neat film of equivalent mass. Future research and development opportunities for expandable structural systems designed using an integrated approach to structural design, manufacturing, and materials selection are discussed.
Design and Development of ChemInfoCloud: An Integrated Cloud Enabled Platform for Virtual Screening.
Karthikeyan, Muthukumarasamy; Pandit, Deepak; Bhavasar, Arvind; Vyas, Renu
2015-01-01
The power of cloud computing and distributed computing has been harnessed to handle the vast and heterogeneous data required to be processed in any virtual screening protocol. A cloud computing platform, ChemInfoCloud, was built and integrated with several chemoinformatics and bioinformatics tools. The robust engine performs the core chemoinformatics tasks of lead generation, lead optimisation and property prediction in a fast and efficient manner. It has also been provided with bioinformatics functionalities including sequence alignment, active site pose prediction and protein-ligand docking. Text mining, NMR chemical shift (1H, 13C) prediction and reaction fingerprint generation modules for efficient lead discovery are also implemented in this platform. We have developed an integrated problem-solving cloud environment for virtual screening studies that also provides workflow management, better usability and interaction with end users through container-based virtualization with OpenVZ.
Chip-based quantum key distribution
NASA Astrophysics Data System (ADS)
Sibson, P.; Erven, C.; Godfrey, M.; Miki, S.; Yamashita, T.; Fujiwara, M.; Sasaki, M.; Terai, H.; Tanner, M. G.; Natarajan, C. M.; Hadfield, R. H.; O'Brien, J. L.; Thompson, M. G.
2017-02-01
Improvement in secure transmission of information is an urgent need for governments, corporations and individuals. Quantum key distribution (QKD) promises security based on the laws of physics and has rapidly grown from proof-of-concept to robust demonstrations and deployment of commercial systems. Despite these advances, QKD has not been widely adopted, and large-scale deployment will likely require chip-based devices for improved performance, miniaturization and enhanced functionality. Here we report low error rate, GHz clocked QKD operation of an indium phosphide transmitter chip and a silicon oxynitride receiver chip--monolithically integrated devices using components and manufacturing processes from the telecommunications industry. We use the reconfigurability of these devices to demonstrate three prominent QKD protocols--BB84, Coherent One Way and Differential Phase Shift--with performance comparable to state-of-the-art. These devices, when combined with integrated single photon detectors, pave the way for successfully integrating QKD into future telecommunications networks.
MacNamara, Aine; Collins, Dave
2014-01-01
Gulbin and colleagues (Gulbin, J. P., Croser, M. J., Morley, E. J., & Weissensteiner, J. R. (2013). An integrated framework for the optimisation of sport and athlete development: A practitioner approach. Journal of Sports Sciences) present a new sport and athlete development framework that evolved from empirical observations from working with the Australian Institute of Sport. The FTEM (Foundations, Talent, Elite, Mastery) framework is proposed to integrate general and specialised phases of development for participants within the active lifestyle, sport participation and sport excellence pathways. A number of issues concerning the FTEM framework are presented. We also propose the need to move beyond prescriptive models of talent identification and development towards a consideration of features of best practice and process markers of development together with robust guidelines about the implementation of these in applied practice.
Terrell, Kimberly A; Wildt, David E; Anthony, Nicola M; Bavister, Barry D; Leibo, S P; Penfold, Linda M; Marker, Laurie L; Crosier, Adrienne E
2012-04-01
Felid spermatozoa are sensitive to cryopreservation-induced damage, but functional losses can be mitigated by post-thaw swim-up or density gradient processing methods that selectively recover motile or structurally-normal spermatozoa, respectively. Despite the importance of sperm energy production to achieving fertilization, there is little knowledge about the influence of cryopreservation or post-thaw processing on felid sperm metabolism. We conducted a comparative study of domestic cat and cheetah sperm metabolism after cryopreservation and post-thaw processing. We hypothesized that freezing/thawing impairs sperm metabolism and that swim-up, but not density gradient centrifugation, recovers metabolically-normal spermatozoa. Ejaculates were cryopreserved, thawed, and processed by swim-up, Accudenz gradient centrifugation, or conventional washing (representing the 'control'). Sperm glucose and pyruvate uptake, lactate production, motility, and acrosomal integrity were assessed. Mitochondrial membrane potential (MMP) was measured in cat spermatozoa. In both species, lactate production, motility, and acrosomal integrity were reduced in post-thaw, washed samples compared to freshly-collected ejaculates. Glucose uptake was minimal pre- and post-cryopreservation, whereas pyruvate uptake was similar between treatments due to high coefficients of variation. In the cat, swim-up, but not Accudenz processing, recovered spermatozoa with increased lactate production, pyruvate uptake, and motility compared to controls. Although confounded by differences in non-specific fluorescence among processing methods, MMP values within treatments were positively correlated to sperm motility and acrosomal integrity. Cheetah spermatozoa isolated by either selection method exhibited improved motility and/or acrosomal integrity, but remained metabolically compromised. 
Collectively, findings revealed a metabolically-robust subpopulation of cryopreserved cat, but not cheetah, spermatozoa, recovered by selecting for motility rather than morphology. Published by Elsevier Inc.
Jabłoński, Michał; Starčuková, Jana; Starčuk, Zenon
2017-01-23
Proton magnetic resonance spectroscopy is a non-invasive measurement technique which provides information about the concentrations of up to 20 metabolites participating in intracellular biochemical processes. In order to obtain any metabolic information from measured spectra, processing must be done in specialized software, such as jMRUI. The processing is interactive and complex and often requires many trials before a correct result is obtained. This paper proposes a jMRUI enhancement for efficient and unambiguous history tracking and file identification. A database storing all processing steps, parameters and files used in processing was developed for jMRUI. The solution was developed in Java; the authors used an SQL database for robust storage of parameters and SHA-256 hash codes for unambiguous file identification. The developed system is integrated directly in jMRUI and will be publicly available. A graphical user interface was implemented to make the user experience more comfortable. The database operation is invisible to the common user; all tracking operations are performed in the background. The implemented jMRUI database is a tool that can significantly help the user track the processing history performed on data in jMRUI. The created tool is designed to be user-friendly, robust and easy to use. The database GUI allows the user to browse the whole processing history of a selected file and learn, e.g., what processing led to the results and where the original data are stored, and to obtain the list of all processing actions performed on the spectra.
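The core mechanism described above, identifying data files by SHA-256 and logging every processing step in an SQL database, can be sketched with Python's standard library. The table and column names below are illustrative, not jMRUI's actual (Java) schema.

```python
import hashlib
import sqlite3

def sha256_of(data: bytes) -> str:
    # unambiguous content-based file identifier
    return hashlib.sha256(data).hexdigest()

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE history (
    file_hash TEXT,
    action    TEXT,
    params    TEXT,
    ts        DATETIME DEFAULT CURRENT_TIMESTAMP)""")

def log_step(data: bytes, action: str, params: str):
    # record one processing action against the file's hash
    conn.execute(
        "INSERT INTO history (file_hash, action, params) VALUES (?, ?, ?)",
        (sha256_of(data), action, params))

raw = b"FID raw signal bytes"  # stand-in for a measured spectrum file
log_step(raw, "apodization", "lb=5Hz")
log_step(raw, "fft", "points=2048")

rows = conn.execute(
    "SELECT action FROM history WHERE file_hash = ? ORDER BY rowid",
    (sha256_of(raw),)).fetchall()
print([r[0] for r in rows])  # ['apodization', 'fft']
```

Querying by hash rather than by file path is what makes the history robust to files being moved or renamed.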
A novel double loop control model design for chemical unstable processes.
Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He
2014-03-01
In this manuscript, based on the Smith predictor control scheme for unstable processes in industry, an improved double-loop control model is proposed for chemical unstable processes. The inner loop stabilizes the integrating or unstable process and transforms the original process into a stable first-order plus dead-time process. The outer loop enhances the performance of the set-point response. A disturbance controller is designed to enhance the performance of the disturbance response. The improved control system is simple, with exact physical meaning, and its characteristic equation is easy to stabilize. The three controllers in the improved scheme are designed separately; each controller is easy to design and gives good control performance for its respective closed-loop transfer function. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method can give better system performance than existing design methods. © 2013 ISA. Published by ISA. All rights reserved.
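The inner-loop idea, stabilizing an unstable process before shaping its response, can be illustrated with a toy simulation: an unstable first-order process dx/dt = a*x + u (a > 0) under proportional feedback u = -k*x + r has closed-loop dynamics dx/dt = (a - k)*x + r, which is stable whenever k > a. The numbers and the simple Euler integration below are illustrative and are not the paper's actual controller design.

```python
def simulate(a, k, r, x0=1.0, dt=0.01, steps=2000):
    # Euler simulation of dx/dt = a*x + u with inner-loop feedback u = -k*x + r
    x = x0
    for _ in range(steps):
        u = -k * x + r
        x += dt * (a * x + u)
    return x

a = 1.0  # unstable open-loop pole (a > 0)

open_loop = simulate(a, k=0.0, r=0.0, steps=500)   # no feedback: state grows
closed_loop = simulate(a, k=3.0, r=0.0)            # (a - k) = -2 < 0: state decays

print(open_loop > 100.0, abs(closed_loop) < 1e-3)  # True True
```

In the proposed scheme, the outer loop and disturbance controller are then designed against this stabilized (first-order plus dead-time) inner loop rather than against the raw unstable process.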
Killian, Tobias; Dickopf, Steffen; Haas, Alexander K; Kirstenpfad, Claudia; Mayer, Klaus; Brinkmann, Ulrich
2017-11-13
We have devised an effective and robust method for the characterization of gene-editing events. The efficacy of editing-mediated mono- and bi-allelic gene inactivation and integration events is quantified based on colony counts. The combination of diphtheria toxin (DT) and puromycin (PM) selection enables analyses of 10,000-100,000 individual cells, assessing hundreds of clones with inactivated genes per experiment. Mono- and bi-allelic gene inactivation is differentiated by DT resistance, which occurs only upon bi-allelic inactivation. PM resistance indicates integration. The robustness and generalizability of the method were demonstrated by quantifying the frequency of gene inactivation and cassette integration under different editing approaches: CRISPR/Cas9-mediated complete inactivation was ~30-50-fold more frequent than cassette integration. Mono-allelic inactivation without integration occurred >100-fold more frequently than integration. Assessment of gRNA length confirmed 20mers to be the most effective length for inactivation, while 16-18mers provided the highest overall integration efficacy. The overall efficacy was ~2-fold higher for CRISPR/Cas9 than for zinc-finger nuclease and was significantly increased upon modulation of non-homologous end joining or homology-directed repair. The frequencies and ratios of editing events were similar for two different DPH genes (independent of the target sequence or chromosomal location), which indicates that the optimization parameters identified with this method can be generalized.
Integrated modelling of crop production and nitrate leaching with the Daisy model.
Manevski, Kiril; Børgesen, Christen D; Li, Xiaoxin; Andersen, Mathias N; Abrahamsen, Per; Hu, Chunsheng; Hansen, Søren
2016-01-01
An integrated modelling strategy was designed and applied to the Soil-Vegetation-Atmosphere Transfer model Daisy for simulation of crop production and nitrate leaching under a pedo-climatic and agronomic environment different from that of the model's original parameterisation. The points of significance and caution in the strategy are:
•Model preparation should include detailed field data, due to the high complexity of the soil and crop processes simulated with a process-based model, and should reflect the study objectives. Including interactions between parameters in a sensitivity analysis better accounts for impacts on outputs of measured variables.
•Model evaluation on several independent data sets increases robustness, at least on coarser time scales such as month or year. It produces a valuable platform for adaptation of the model to new crops or for the improvement of the existing parameter set. On a daily time scale, validation for highly dynamic variables such as soil water transport remains challenging.
•Model application is demonstrated with relevance for scientists and regional managers.
The integrated modelling strategy is applicable to other process-based models similar to Daisy. It is envisaged that the strategy establishes model capability as a useful research/decision-making tool, and that it increases knowledge transferability, reproducibility and traceability.
Integrating hidden Markov model and PRAAT: a toolbox for robust automatic speech transcription
NASA Astrophysics Data System (ADS)
Kabir, A.; Barker, J.; Giurgiu, M.
2010-09-01
An automatic time-aligned phone transcription toolbox for English speech corpora has been developed. The toolbox is especially useful for generating robust automatic transcriptions and can produce phone-level transcriptions using speaker-independent as well as speaker-dependent models without manual intervention. The system is based on the standard Hidden Markov Model (HMM) approach and was successfully tested on a large audiovisual speech corpus, namely the GRID corpus. One of the most powerful features of the toolbox is its increased flexibility in speech processing: the speech community can import the automatic transcription generated by the HMM Toolkit (HTK) into popular transcription software, PRAAT, and vice versa. The toolbox has been evaluated through statistical analysis on GRID data, which shows that the automatic transcription deviates by an average of 20 ms with respect to manual transcription.
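The HTK-to-PRAAT bridge mentioned above amounts to a format conversion, which can be sketched as follows: an HTK-style label file (start/end times in 100 ns units, one phone per line) is rewritten as a minimal Praat TextGrid interval tier. The layout follows the common HTK .lab convention and the standard long TextGrid text format; this is an illustrative sketch, not the toolbox's actual converter.

```python
def htk_to_textgrid(lab_text, tier="phones"):
    # parse HTK label lines: "<start> <end> <label>", times in 100 ns units
    intervals = []
    for line in lab_text.strip().splitlines():
        start, end, label = line.split()[:3]
        intervals.append((int(start) / 1e7, int(end) / 1e7, label))  # -> seconds
    xmax = intervals[-1][1]
    out = ['File type = "ooTextFile"', 'Object class = "TextGrid"', "",
           "xmin = 0", f"xmax = {xmax}", "tiers? <exists>", "size = 1",
           "item []:", "    item [1]:", '        class = "IntervalTier"',
           f'        name = "{tier}"', "        xmin = 0",
           f"        xmax = {xmax}",
           f"        intervals: size = {len(intervals)}"]
    for i, (a, b, lab) in enumerate(intervals, 1):
        out += [f"        intervals [{i}]:",
                f"            xmin = {a}",
                f"            xmax = {b}",
                f'            text = "{lab}"']
    return "\n".join(out)

lab = "0 2500000 sil\n2500000 4200000 b\n4200000 6000000 ih"
tg = htk_to_textgrid(lab)
print(tg.splitlines()[0])  # File type = "ooTextFile"
```

The reverse direction (TextGrid to HTK labels) is a matter of parsing the interval tier back out and multiplying the times by 1e7.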
Participatory action research in corrections: The HITEC 2 program.
Cherniack, Martin; Dussetschleger, Jeffrey; Dugan, Alicia; Farr, Dana; Namazi, Sara; El Ghaziri, Mazen; Henning, Robert
2016-03-01
HITEC 2 (Health Improvement through Employee Control 2) is the follow-up to HITEC, a participatory action research (PAR) program that integrates health and work conditions interventions designed by the workforce. HITEC 2 compares intervention programs between two correctional sites, one using a pure workforce level design team and the other using a more structured and time delineated labor-management kaizen effectiveness team. HITEC 2 utilizes a seven step participatory Intervention Design and Analysis Scorecard (IDEAS) for planning interventions. Consistent with PAR, process and intervention efficacy measures are developed and administered through workforce representation. Participation levels, robustness of participatory structures and sophistication of interventions have increased at each measured interval. Health comparisons between 2008 and 2013 showed increased hypertension, static weight maintenance, and increased 'readiness to change'. The PAR approaches are robust and sustained. Their long-term effectiveness in this population is not yet clear. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Fluorescence lifetime assays: current advances and applications in drug discovery.
Pritz, Stephan; Doering, Klaus; Woelcke, Julian; Hassiepen, Ulrich
2011-06-01
Fluorescence lifetime assays complement the portfolio of established assay formats available in drug discovery, particularly with the recent advances in microplate readers and the commercial availability of novel fluorescent labels. Fluorescence lifetime assists in lowering complexity of compound screening assays, affording a modular, toolbox-like approach to assay development and yielding robust homogeneous assays. To date, materials and procedures have been reported for biochemical assays on proteases, as well as on protein kinases and phosphatases. This article gives an overview of two assay families, distinguished by the origin of the fluorescence signal modulation. The pharmaceutical industry demands techniques with a robust, integrated compound profiling process and short turnaround times. Fluorescence lifetime assays have already helped the drug discovery field, in this sense, by enhancing productivity during the hit-to-lead and lead optimization phases. Future work will focus on covering other biochemical molecular modifications by investigating the detailed photo-physical mechanisms underlying the fluorescence signal.
Wang, Sibo; Ren, Zheng; Guo, Yanbing; ...
2016-03-21
We report that the scalable three-dimensional (3-D) integration of functional nanostructures into applicable platforms represents a promising technology to meet the ever-increasing demands of fabricating high-performance devices featuring cost-effectiveness, structural sophistication and multi-functionality. Such an integration process generally involves a diverse array of nanostructural entities (nano-entities) consisting of dissimilar nanoscale building blocks such as nanoparticles, nanowires, and nanofilms made of metals, ceramics, or polymers. Various synthetic strategies and integration methods have enabled the successful assembly of both structurally and functionally tailored nano-arrays into a unique class of monolithic devices. The performance of nano-array based monolithic devices is dictated by a few important factors such as substrate material selection, nanostructure composition and nano-architecture geometry. Therefore, rational material selection and nano-entity manipulation during the nano-array integration process, aiming to exploit the advantageous characteristics of nanostructures and their ensembles, are critical steps towards bridging the design of nanostructure-integrated monolithic devices with various practical applications. In this article, we highlight the latest research progress in two-dimensional (2-D) and 3-D metal and metal oxide based nanostructural integration into prototype devices with ultrahigh efficiency, good robustness and improved functionality. Lastly, selective examples of nano-array integration, scalable nanomanufacturing and representative monolithic devices such as catalytic converters, sensors and batteries are used as connecting dots to display a roadmap from hierarchical nanostructural assembly to practical nanotechnology applications ranging from energy and environmental to chemical and biotechnology areas.
Improved determination of vector lithospheric magnetic anomalies from MAGSAT data
NASA Technical Reports Server (NTRS)
Ravat, Dhananjay
1993-01-01
Scientific contributions made in developing new methods to isolate and map vector magnetic anomalies from measurements made by Magsat are described. In addition to the objective of the proposal, the isolation and mapping of equatorial vector lithospheric Magsat anomalies, the isolation of polar ionospheric fields during the period was also studied. Significant progress was also made in the isolation of polar delta(Z) component and scalar anomalies, as well as in the integration and synthesis of various techniques for removing equatorial and polar ionospheric effects. The significant contributions of this research are: (1) development of empirical/analytical techniques for modeling ionospheric fields in Magsat data and their removal from uncorrected anomalies to obtain better estimates of lithospheric anomalies (this task was accomplished for equatorial delta(X), delta(Z), and delta(B) component and polar delta(Z) and delta(B) component measurements); (2) integration of important processing techniques developed during the last decade with the newly developed techniques of ionospheric field modeling into an optimum processing scheme; and (3) implementation of the above processing scheme to map the most robust magnetic anomalies of the lithosphere (components as well as scalar).
Delpiano, J; Pizarro, L; Peddie, C J; Jones, M L; Griffin, L D; Collinson, L M
2018-04-26
Integrated array tomography combines fluorescence and electron imaging of ultrathin sections in one microscope, and enables accurate high-resolution correlation of fluorescent proteins to cell organelles and membranes. Large numbers of serial sections can be imaged sequentially to produce aligned volumes from both imaging modalities, thus producing enormous amounts of data that must be handled and processed using novel techniques. Here, we present a scheme for automated detection of fluorescent cells within thin resin sections, which could then be used to drive automated electron image acquisition from target regions via 'smart tracking'. The aim of this work is to aid in optimization of the data acquisition process through automation, freeing the operator to work on other tasks and speeding up the process, while reducing data rates by only acquiring images from regions of interest. This new method is shown to be robust against noise and able to deal with regions of low fluorescence. © 2018 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of the Royal Microscopical Society.
Verkhivker, Gennady M
2016-01-01
The human protein kinome presents one of the largest protein families that orchestrate functional processes in complex cellular networks and, when perturbed, can cause various cancers. The abundance and diversity of genetic, structural, and biochemical data underlies the complexity of mechanisms by which targeted and personalized drugs can combat mutational profiles in protein kinases. Coupled with the evolution of systems biology approaches, genomic and proteomic technologies are rapidly identifying and characterizing novel resistance mechanisms with the goal of informing the rational design of personalized kinase drugs. Integration of experimental and computational approaches can help to bring these data into a unified conceptual framework and develop robust models for predicting clinical drug resistance. In the current study, we employ a battery of synergistic computational approaches that integrate genetic, evolutionary, biochemical, and structural data to characterize the effect of cancer mutations in protein kinases. We provide a detailed structural classification and analysis of genetic signatures associated with oncogenic mutations. By integrating genetic and structural data, we employ network modeling to dissect mechanisms of kinase drug sensitivities to oncogenic EGFR mutations. Using biophysical simulations and analysis of protein structure networks, we show that conformation-specific drug binding of Lapatinib may elicit resistant mutations in the EGFR kinase that are linked with ligand-mediated changes in the residue interaction networks and global network properties of key residues that are responsible for the structural stability of specific functional states. A strong network dependency on high-centrality residues in the conformation-specific Lapatinib-EGFR complex may explain the vulnerability of drug binding to a broad spectrum of mutations and the emergence of drug resistance.
Our study offers a systems-based perspective on drug design by unravelling complex relationships between robustness of targeted kinase genes and binding specificity of targeted kinase drugs. We discuss how these approaches can exploit advances in chemical biology and network science to develop novel strategies for rationally tailored and robust personalized drug therapies.
A natural language query system for Hubble Space Telescope proposal selection
NASA Technical Reports Server (NTRS)
Hornick, Thomas; Cohen, William; Miller, Glenn
1987-01-01
The proposal selection process for the Hubble Space Telescope is assisted by a robust and easy-to-use query program (TACOS). The system parses a sentence in an English subset language regardless of the order of the keyword phrases, allowing the user greater flexibility than a standard command query language. Capabilities for macro and procedure definition are also integrated. The system was designed for flexibility in both use and maintenance. In addition, TACOS can be applied to any knowledge domain that can be expressed in terms of a single relation. The system was implemented mostly in Common LISP. The TACOS design is described in detail, with particular attention given to the implementation methods of sentence processing.
Knaup, Petra; Schöpe, Lothar
2012-01-01
The authors see the major potential of systematically processing data from AAL technology in higher sustainability, technology acceptance, security, robustness, and flexibility, and in better integration into existing structures and processes. This potential is currently underachieved and not yet systematically promoted. The authors have written a position paper on the potential and necessity of substantial IT research enhancing Ambient Assisted Living (AAL) applications. This paper summarizes the most important challenges in the fields of health care, data protection, operation, and user interfaces. Research in medical informatics is necessary, among other areas, on flexible authorization concepts, medical information needs, algorithms to evaluate user profiles, and visualization of aggregated data.
Designed tools for analysis of lithography patterns and nanostructures
NASA Astrophysics Data System (ADS)
Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann
2017-03-01
We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time consuming, requiring manual tuning, and lacking robustness and user friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic, and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nanoscale processes at the research and development level by accelerating access to knowledge and hence speed up implementation in product lines.
High-sensitivity GMR with low coercivity in top-IrMn spin-valves
NASA Astrophysics Data System (ADS)
Liu, H. R.; Qu, B. J.; Ren, T. L.; Liu, L. T.; Xie, H. L.; Li, C. X.; Ku, W. J.
2003-12-01
Top-IrMn spin-valves with a structure of Ta/NiFe/CoFe/Cu/CoFe/IrMn/Ta have been investigated. The spin-valves were deposited by high-vacuum DC magnetron sputtering at room temperature. The magnetoresistance ratio reaches 9.12% at room temperature. The coercivity of the free layer and the exchange bias field are 1.04 Oe and 180 Oe, respectively. The maximum sensitivity of the spin-valves is 8.36%/Oe. A 33.2% reduction of the coercivity was obtained after a 2-min RIE process. Utilizing a standard integrated circuit (IC) process, mass production of robust giant magnetoresistance sensors can be achieved with these spin-valve thin films.
Crypto-Watermarking of Transmitted Medical Images.
Al-Haj, Ali; Mohammad, Ahmad; Amer, Alaa'
2017-02-01
Telemedicine is a booming healthcare practice that has facilitated the exchange of medical data and expertise between healthcare entities. However, the widespread use of telemedicine applications requires a secured scheme to guarantee confidentiality and verify authenticity and integrity of exchanged medical data. In this paper, we describe a region-based, crypto-watermarking algorithm capable of providing confidentiality, authenticity, and integrity for medical images of different modalities. The proposed algorithm provides authenticity by embedding robust watermarks in images' region of non-interest using SVD in the DWT domain. Integrity is provided in two levels: strict integrity implemented by a cryptographic hash watermark, and content-based integrity implemented by a symmetric encryption-based tamper localization scheme. Confidentiality is achieved as a byproduct of hiding patient's data in the image. Performance of the algorithm was evaluated with respect to imperceptibility, robustness, capacity, and tamper localization, using different medical images. The results showed the effectiveness of the algorithm in providing security for telemedicine applications.
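The strict-integrity level described above pairs a cryptographic hash watermark with embedding in a region of non-interest (RONI). A minimal stdlib-only sketch of that idea follows, with flat pixel lists standing in for image regions; the SHA-256 choice and plain LSB placement are illustrative assumptions, not the paper's exact DWT-SVD scheme:

```python
import hashlib

def embed_integrity_watermark(roi_pixels, roni_pixels):
    """Embed a SHA-256 hash of the region of interest (ROI) into the
    least significant bits of region-of-non-interest (RONI) pixels.

    roi_pixels / roni_pixels: lists of 0-255 ints.
    Returns the watermarked RONI pixel list.
    """
    digest = hashlib.sha256(bytes(roi_pixels)).digest()
    bits = [(byte >> i) & 1 for byte in digest for i in range(8)]  # 256 bits
    if len(roni_pixels) < len(bits):
        raise ValueError("RONI too small to hold the 256-bit hash")
    out = list(roni_pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b  # overwrite only the LSB
    return out

def verify_integrity(roi_pixels, watermarked_roni):
    """Recompute the ROI hash and compare with the LSB-embedded copy."""
    digest = hashlib.sha256(bytes(roi_pixels)).digest()
    bits = [(byte >> i) & 1 for byte in digest for i in range(8)]
    return all((watermarked_roni[i] & 1) == b for i, b in enumerate(bits))
```

Because the hash is avalanche-sensitive, any change to the ROI flips the verification result (strict integrity); the paper's content-based level additionally localizes where the tampering occurred, which this sketch does not attempt.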
Robust Stabilization of T-S Fuzzy Stochastic Descriptor Systems via Integral Sliding Modes.
Li, Jinghao; Zhang, Qingling; Yan, Xing-Gang; Spurgeon, Sarah K
2017-09-19
This paper addresses the robust stabilization problem for T-S fuzzy stochastic descriptor systems using an integral sliding mode control paradigm. A classical integral sliding mode control scheme and a nonparallel distributed compensation (Non-PDC) integral sliding mode control scheme are presented. It is shown that two restrictive assumptions previously adopted when developing sliding mode controllers for Takagi-Sugeno (T-S) fuzzy stochastic systems are not required with the proposed framework. A unified framework for sliding mode control of T-S fuzzy systems is formulated. The proposed Non-PDC integral sliding mode control scheme encompasses existing schemes when the previously imposed assumptions hold. Stability of the sliding motion is analyzed, and the sliding mode controller is parameterized in terms of the solutions of a set of linear matrix inequalities, which facilitates design. The methodology is applied to an inverted pendulum model to validate the effectiveness of the results presented.
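For orientation, a classical integral sliding surface for a T-S fuzzy system has the generic form below; the symbols ($G$, $A_i$, $B_i$, the blending functions $h_i$, and the nominal control $u_0$) are standard textbook notation, not necessarily the paper's:

```latex
s(t) \;=\; G\,x(t) \;-\; G\,x(0)
  \;-\; G \int_0^t \sum_{i=1}^{r} h_i\bigl(\theta(\tau)\bigr)
  \bigl( A_i\, x(\tau) + B_i\, u_0(\tau) \bigr)\, \mathrm{d}\tau
```

By construction $s(0) = 0$, so the system is on the sliding surface from the initial instant and the reaching phase is eliminated, which is the usual motivation for the integral (as opposed to linear) sliding variable.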
Using Business Process Specification and Agent to Integrate a Scenario Driven Supply Chain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, Hyunbo; Kulvatunyou, Boonserm; Jeong, Hanil
2004-07-01
In today's increasingly competitive global market, most enterprises place high priority on reducing order-fulfillment costs, minimizing time-to-market, and maximizing product quality. The desire of businesses to achieve these goals has seen a shift from a make-to-stock paradigm to a make-to-order paradigm. The success of this new paradigm requires robust and efficient supply chain integration and the ability to operate in the business-to-business (B2B) environment. Recent internet-based approaches have enabled instantaneous and secure information sharing among trading partners (i.e., customers, manufacturers, and suppliers). In this paper, we present a framework that enables both integration and B2B operations. This framework uses pre-defined business process specifications (BPS) and agent technologies. The BPS, which specifies a message choreography among the trading partners, is modeled using a modified Unified Modeling Language (UML). The behavior of the enterprise applications within each trading partner -- how they respond to external events specified in the BPS -- is modeled using Petri-nets and implemented as a collection of agents. The concepts and models proposed in this paper should provide the starting point for the formulation of a structured approach to B2B supply chain integration and implementation.
Robust multiperson detection and tracking for mobile service and social robots.
Li, Liyuan; Yan, Shuicheng; Yu, Xinguo; Tan, Yeow Kee; Li, Haizhou
2012-10-01
This paper proposes an efficient system which integrates multiple vision models for robust multiperson detection and tracking for mobile service and social robots in public environments. The core technique is a novel maximum likelihood (ML)-based algorithm which combines the multimodel detections in mean-shift tracking. First, a likelihood probability which integrates detections and similarity to local appearance is defined. Then, an expectation-maximization (EM)-like mean-shift algorithm is derived under the ML framework. In each iteration, the E-step estimates the associations to the detections, and the M-step locates the new position according to the ML criterion. To be robust to the complex crowded scenarios for multiperson tracking, an improved sequential strategy to perform the mean-shift tracking is proposed. Under this strategy, human objects are tracked sequentially according to their priority order. To balance the efficiency and robustness for real-time performance, at each stage, the first two objects from the list of the priority order are tested, and the one with the higher score is selected. The proposed method has been successfully implemented on real-world service and social robots. The vision system integrates stereo-based and histograms-of-oriented-gradients-based human detections, occlusion reasoning, and sequential mean-shift tracking. Various examples to show the advantages and robustness of the proposed system for multiperson tracking from mobile robots are presented. Quantitative evaluations on the performance of multiperson tracking are also performed. Experimental results indicate that significant improvements have been achieved by using the proposed method.
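The EM-like mean-shift step described above (E-step: weight each detection by its proximity to the current estimate; M-step: move to the weighted mean) can be sketched as follows. This is a generic Gaussian-kernel simplification, not the paper's full multimodel likelihood with appearance similarity:

```python
import math

def mean_shift_track(position, detections, bandwidth=1.0, iters=20):
    """Refine a track position against a set of candidate detections.

    E-step: weight each detection with a Gaussian kernel centered on the
    current estimate. M-step: move the estimate to the weighted mean.
    position: (x, y); detections: list of (x, y). Returns refined (x, y).
    """
    x, y = position
    for _ in range(iters):
        weights = [math.exp(-((dx - x) ** 2 + (dy - y) ** 2)
                            / (2 * bandwidth ** 2))
                   for dx, dy in detections]
        total = sum(weights)
        if total == 0:
            break  # no detection carries weight; keep current estimate
        x = sum(w * dx for w, (dx, _) in zip(weights, detections)) / total
        y = sum(w * dy for w, (_, dy) in zip(weights, detections)) / total
    return x, y
```

The kernel makes the update robust to distant spurious detections (they receive near-zero weight), which mirrors how the full system tolerates clutter; the sequential priority-ordered strategy in the paper then runs such an update once per tracked person.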
Fonoudi, Hananeh; Ansari, Hassan; Abbasalizadeh, Saeed; Larijani, Mehran Rezaei; Kiani, Sahar; Hashemizadeh, Shiva; Zarchi, Ali Sharifi; Bosman, Alexis; Blue, Gillian M; Pahlavan, Sara; Perry, Matthew; Orr, Yishay; Mayorchak, Yaroslav; Vandenberg, Jamie; Talkhabi, Mahmood; Winlaw, David S; Harvey, Richard P; Aghdami, Nasser; Baharvand, Hossein
2015-12-01
Recent advances in the generation of cardiomyocytes (CMs) from human pluripotent stem cells (hPSCs), in conjunction with the promising outcomes from preclinical and clinical studies, have raised new hopes for cardiac cell therapy. We report the development of a scalable, robust, and integrated differentiation platform for large-scale production of hPSC-CM aggregates in a stirred suspension bioreactor as a single-unit operation. Precise modulation of the differentiation process by small molecule activation of WNT signaling, followed by inactivation of transforming growth factor-β and WNT signaling and activation of sonic hedgehog signaling in hPSCs as size-controlled aggregates led to the generation of approximately 100% beating CM spheroids containing virtually pure (∼90%) CMs in 10 days. Moreover, the developed differentiation strategy was universal, as demonstrated by testing multiple hPSC lines (5 human embryonic stem cell and 4 human inducible PSC lines) without cell sorting or selection. The produced hPSC-CMs successfully expressed canonical lineage-specific markers and showed high functionality, as demonstrated by microelectrode array and electrophysiology tests. This robust and universal platform could become a valuable tool for the mass production of functional hPSC-CMs as a prerequisite for realizing their promising potential for therapeutic and industrial applications, including drug discovery and toxicity assays. Recent advances in the generation of cardiomyocytes (CMs) from human pluripotent stem cells (hPSCs) and the development of novel cell therapy strategies using hPSC-CMs (e.g., cardiac patches) in conjunction with promising preclinical and clinical studies, have raised new hopes for patients with end-stage cardiovascular disease, which remains the leading cause of morbidity and mortality globally. 
In this study, a simplified, scalable, robust, and integrated differentiation platform was developed to generate clinical grade hPSC-CMs as cell aggregates under chemically defined culture conditions. This approach resulted in approximately 100% beating CM spheroids with virtually pure (∼90%) functional cardiomyocytes in 10 days from multiple hPSC lines. This universal and robust bioprocessing platform can provide sufficient numbers of hPSC-CMs for companies developing regenerative medicine technologies to rescue, replace, and help repair damaged heart tissues and for pharmaceutical companies developing advanced biologics and drugs for regeneration of lost heart tissue using high-throughput technologies. It is believed that this technology can expedite clinical progress in these areas to achieve a meaningful impact on improving clinical outcomes, cost of care, and quality of life for those patients disabled and experiencing heart disease. ©AlphaMed Press.
Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region
NASA Astrophysics Data System (ADS)
Gilroy, Kristin; Mens, Marjolein; Haasnoot, Marjolijn; Jeuken, Ad
2016-04-01
More frequent and intense hydrologic events under climate change are expected to intensify water security and flood risk management challenges worldwide. Traditional planning approaches must be adapted to address climate change and to develop solutions with an appropriate level of robustness and flexibility. The Climate Risk Informed Decision Analysis (CRIDA) method is a novel planning approach embodying a suite of complementary methods, including decision scaling and adaptation pathways. Decision scaling offers a bottom-up approach to assess risk and tailors the complexity of the analysis to the problem at hand and the available capacity. Through adaptation pathways, an array of future strategies towards climate robustness is developed, ranging in flexibility and immediacy of investments. Flexible pathways include transfer points to other strategies to ensure that the system can be adapted if future conditions vary from those expected. CRIDA combines these two approaches in a stakeholder-driven process which guides decision makers through the planning and decision process, taking into account how the confidence in the available science, the consequences in the system, and the capacity of institutions should influence strategy selection. In this presentation, we will explain the CRIDA method and compare it to existing planning processes, such as the US Army Corps of Engineers Principles and Guidelines as well as Integrated Water Resources Management Planning. Then, we will apply the approach to a hypothetical case study for the Waas Region, a large downstream river basin facing rapid development threatened by increased flood risks. 
Through the case study, we will demonstrate how a stakeholder driven process can be used to evaluate system robustness to climate change; develop adaptation pathways for multiple objectives and criteria; and illustrate how varying levels of confidence, consequences, and capacity would play a role in the decision making process, specifically in regards to the level of robustness and flexibility in the selected strategy. This work will equip practitioners and decision makers with an example of a structured process for decision making under climate uncertainty that can be scaled as needed to the problem at hand. This presentation builds further on another submitted abstract "Climate Risk Informed Decision Analysis (CRIDA): A novel practical guidance for Climate Resilient Investments and Planning" by Jeuken et al.
Robust Fixed-Structure Controller Synthesis
NASA Technical Reports Server (NTRS)
Corrado, Joseph R.; Haddad, Wassim M.; Gupta, Kajal (Technical Monitor)
2000-01-01
The ability to develop an integrated control system design methodology for robust, high-performance controllers satisfying multiple design criteria and real-world hardware constraints constitutes a challenging task. The increasingly stringent performance specifications required for controlling such systems necessitate a trade-off between controller complexity and robustness. The principal challenge of minimal-complexity robust control design is to arrive at a tractable control design formulation in spite of the extreme complexity of such systems. Hence, the design of minimal-complexity robust controllers for systems in the face of modeling errors has been a major preoccupation of system and control theorists and practitioners for the past several decades.
Advanced engineering software for in-space assembly and manned planetary spacecraft
NASA Technical Reports Server (NTRS)
Delaquil, Donald; Mah, Robert
1990-01-01
Meeting the objectives of the Lunar/Mars initiative to establish safe and cost-effective extraterrestrial bases requires an integrated software/hardware approach to operational definitions and systems implementation. This paper begins this process by taking a 'software-first' approach to systems design, for implementing specific mission scenarios in the domains of in-space assembly and operations of the manned Mars spacecraft. The technological barriers facing implementation of robust operational systems within these two domains are discussed, and preliminary software requirements and architectures that resolve these barriers are provided.
CARDS: A blueprint and environment for domain-specific software reuse
NASA Technical Reports Server (NTRS)
Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine
1992-01-01
CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'
NASA Astrophysics Data System (ADS)
Shahzad, Muhammad A.
1999-02-01
With the emergence of data warehousing, decision support systems have evolved considerably. At the core of these warehousing systems lies a good database management system. The database server used for data warehousing is responsible for providing robust data management, scalability, high-performance query processing, and integration with other servers. Oracle, a pioneer among warehousing servers, provides a wide range of features for facilitating data warehousing. This paper reviews the features of data warehousing, conceptualizing the concept of data warehousing and, lastly, the features of Oracle servers for implementing a data warehouse.
Event and Apparent Horizon Finders for 3 + 1 Numerical Relativity.
Thornburg, Jonathan
2007-01-01
Event and apparent horizons are key diagnostics for the presence and properties of black holes. In this article I review numerical algorithms and codes for finding event and apparent horizons in numerically-computed spacetimes, focusing on calculations done using the 3 + 1 ADM formalism. The event horizon of an asymptotically-flat spacetime is the boundary between those events from which a future-pointing null geodesic can reach future null infinity and those events from which no such geodesic exists. The event horizon is a (continuous) null surface in spacetime. The event horizon is defined nonlocally in time: it is a global property of the entire spacetime and must be found in a separate post-processing phase after all (or at least the nonstationary part) of spacetime has been numerically computed. There are three basic algorithms for finding event horizons, based on integrating null geodesics forwards in time, integrating null geodesics backwards in time, and integrating null surfaces backwards in time. The last of these is generally the most efficient and accurate. In contrast to an event horizon, an apparent horizon is defined locally in time in a spacelike slice and depends only on data in that slice, so it can be (and usually is) found during the numerical computation of a spacetime. A marginally outer trapped surface (MOTS) in a slice is a smooth closed 2-surface whose future-pointing outgoing null geodesics have zero expansion Θ. An apparent horizon is then defined as a MOTS not contained in any other MOTS. The MOTS condition is a nonlinear elliptic partial differential equation (PDE) for the surface shape, containing the ADM 3-metric, its spatial derivatives, and the extrinsic curvature as coefficients. Most "apparent horizon" finders actually find MOTSs. There are a large number of apparent horizon finding algorithms, with differing trade-offs between speed, robustness, accuracy, and ease of programming. 
In axisymmetry, shooting algorithms work well and are fairly easy to program. In slices with no continuous symmetries, spectral integral-iteration algorithms and elliptic-PDE algorithms are fast and accurate, but require good initial guesses to converge. In many cases, Schnetter's "pretracking" algorithm can greatly improve an elliptic-PDE algorithm's robustness. Flow algorithms are generally quite slow but can be very robust in their convergence. Minimization methods are slow and relatively inaccurate in the context of a finite differencing simulation, but in a spectral code they can be relatively faster and more robust.
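The MOTS condition underlying all of these finders can be written explicitly in standard 3 + 1 notation (a sketch of the usual form; here $\gamma_{ij}$ is the ADM 3-metric, $K_{ij}$ the extrinsic curvature, and $s^i$ the outward unit normal of the trial surface within the slice):

```latex
\Theta \;=\; D_i s^i \;+\; K_{ij}\, s^i s^j \;-\; K \;=\; 0
```

where $D_i$ is the covariant derivative compatible with $\gamma_{ij}$ and $K = \gamma^{ij} K_{ij}$ is the trace of the extrinsic curvature. Since $s^i$ depends on the surface shape and its first derivatives, $\Theta = 0$ is the nonlinear elliptic PDE for the shape that elliptic-PDE finders solve directly and flow algorithms relax towards.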
Control and automation of multilayered integrated microfluidic device fabrication.
Kipper, Sarit; Frolov, Ludmila; Guy, Ortal; Pellach, Michal; Glick, Yair; Malichi, Asaf; Knisbacher, Binyamin A; Barbiro-Michaely, Efrat; Avrahami, Dorit; Yavets-Chen, Yehuda; Levanon, Erez Y; Gerber, Doron
2017-01-31
Integrated microfluidics is a sophisticated three-dimensional (multilayer) solution for high-complexity serial or parallel processes. Fabrication of integrated microfluidic devices requires soft lithography and the stacking of thin-patterned PDMS layers. Precise layer alignment and bonding is crucial. There are no previously reported standards for alignment of the layers, which is mostly performed using uncontrolled processes with very low alignment success. As a result, integrated microfluidics is mostly used in academia rather than in the many potential industrial applications. We have designed and manufactured a semiautomatic Microfluidic Device Assembly System (μDAS) for full device production. μDAS comprises an electrooptic mechanical system consisting of four main parts: an optical system, a smart media holder (for PDMS), a micropositioning xyzθ system, and a macropositioning XY mechanism. The use of the μDAS yielded valuable information regarding PDMS as the material for device fabrication, revealed previously unidentified errors, and enabled optimization of a robust fabrication process. In addition, we have demonstrated the utilization of the μDAS technology for fabrication of a complex three-layered device with over 12 000 micromechanical valves and an array of 64 × 64 DNA spots on a glass substrate with high yield and high accuracy. We increased fabrication yield from 25% to about 85% with an average layer alignment error of just ∼4 μm. It also increased our protein expression yields from 80% to over 90%, allowing us to investigate more proteins per experiment. The μDAS has great potential to become a valuable tool for both advancing integrated microfluidics in academia and producing and applying microfluidic devices in the industry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.
2016-05-03
Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging, and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical). IMPORTANCE: In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated broad applicability and robustness, which enabled comprehensive proteomics, metabolomics, and lipidomics analyses from the same sample.
Characterization of Geiger mode avalanche photodiodes for fluorescence decay measurements
NASA Astrophysics Data System (ADS)
Jackson, John C.; Phelan, Don; Morrison, Alan P.; Redfern, R. Michael; Mathewson, Alan
2002-05-01
Geiger mode avalanche photodiodes (APDs) can be biased above the breakdown voltage to allow detection of single photons. Because of their higher quantum efficiency, magnetic field immunity, robustness, longer operating lifetime, and lower cost, solid-state detectors capable of operating at non-cryogenic temperatures and providing single photon detection are attractive alternatives to the photomultiplier tube (PMT). Shallow-junction Geiger mode APD detectors make it possible to manufacture photon detectors and detector arrays with CMOS-compatible processing steps and allow the use of novel Silicon-on-Insulator (SoI) technology to provide future integrated sensing solutions. Previous work on Geiger mode APD detectors has focused on increasing the active area of the detector to make it more PMT-like, easing the integration of discrete reaction, detection, and signal processing into laboratory experimental systems. This discrete model for single photon detection works well for laboratory-sized test and measurement equipment; however, the move towards microfluidics and systems on a chip requires integrated sensing solutions. As functionality is integrated for increasingly nanoscopic emission volumes, sensitive small-area single-photon-counting detectors and detector arrays that can be easily integrated into marketable systems will be needed. This paper demonstrates the 2-dimensional and 3-dimensional simulation of optical coupling that occurs in Geiger mode APDs. Fabricated Geiger mode APD detectors optimized for fluorescence decay measurements were characterized, and preliminary results show excellent promise for their integration into fluorescence decay measurement systems.
A multimodal interface for real-time soldier-robot teaming
NASA Astrophysics Data System (ADS)
Barber, Daniel J.; Howard, Thomas M.; Walter, Matthew R.
2016-05-01
Recent research and advances in robotics have led to the development of novel platforms leveraging new sensing capabilities for semantic navigation. As these systems become increasingly robust, they support highly complex commands beyond direct teleoperation and waypoint finding, facilitating a transition away from robots as tools toward robots as teammates. Supporting future Soldier-Robot teaming requires communication capabilities on par with those of human-human teams for successful integration of robots. Therefore, as robots increase in functionality, it is equally important that the interface between the Soldier and robot advances as well. Multimodal communication (MMC) enables human-robot teaming through redundancy and levels of communication more robust than single-mode interaction. Commercial-off-the-shelf (COTS) technologies released in recent years for smart phones and gaming provide tools for the creation of portable interfaces incorporating MMC through the use of speech, gestures, and visual displays. However, for multimodal interfaces to be successfully used in the military domain, they must be able to classify speech and gestures and process natural language in real time with high accuracy. For the present study, a prototype multimodal interface supporting real-time interactions with an autonomous robot was developed. This device integrated COTS Automated Speech Recognition (ASR), a custom gesture recognition glove, and natural language understanding on a tablet. This paper presents performance results (e.g., response times, accuracy) of the integrated device when commanding an autonomous robot to perform reconnaissance and surveillance activities in an unknown outdoor environment.
Ion traps fabricated in a CMOS foundry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mehta, K. K.; Ram, R. J.; Eltony, A. M.
2014-07-28
We demonstrate trapping in a surface-electrode ion trap fabricated in a 90-nm CMOS (complementary metal-oxide-semiconductor) foundry process utilizing the top metal layer of the process for the trap electrodes. The process includes doped active regions and metal interconnect layers, allowing for co-fabrication of standard CMOS circuitry as well as devices for optical control and measurement. With one of the interconnect layers defining a ground plane between the trap electrode layer and the p-type doped silicon substrate, ion loading is robust and trapping is stable. We measure a motional heating rate comparable to those seen in surface-electrode traps of similar size. This demonstration of scalable quantum computing hardware utilizing a commercial CMOS process opens the door to integration and co-fabrication of electronics and photonics for large-scale quantum processing in trapped-ion arrays.
NASA Astrophysics Data System (ADS)
Sutrisno, Agung; Gunawan, Indra; Vanany, Iwan
2017-11-01
Despite being an integral part of risk-based quality improvement efforts, studies improving the quality of corrective action priority selection using the FMEA technique are still limited in the literature, and none considers robustness and risk in selecting among competing improvement initiatives. This study proposes a theoretical model for selecting among competing risk-based corrective actions by considering their robustness and risk. We incorporate the principle of robust design in computing the preference score among corrective action candidates. Along with the cost and benefit of competing corrective actions, we also incorporate their risk and robustness. An example demonstrates the applicability of the proposed model.
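The abstract does not give the scoring formula. As a rough illustration of how robust design can enter a preference score, the hypothetical sketch below applies Taguchi's larger-the-better signal-to-noise ratio to benefit samples observed under varying conditions (rewarding robustness to variation) and penalizes cost and risk. The function names and the combination rule are assumptions, not the authors' model.

```python
import math

def taguchi_sn_larger_is_better(samples):
    """Taguchi larger-the-better signal-to-noise ratio (dB): robust, high benefit scores high."""
    return -10 * math.log10(sum(1 / y ** 2 for y in samples) / len(samples))

def preference_score(benefit_samples, cost, risk):
    """Hypothetical preference score: robust benefit divided by a cost-risk penalty."""
    return taguchi_sn_larger_is_better(benefit_samples) / (cost * risk)

# Two candidate corrective actions (benefit samples, cost, risk of failure):
actions = {
    "retrain operators": ([4.0, 4.2, 3.9], 2.0, 0.3),   # cheap, consistent benefit
    "redesign fixture": ([4.5, 2.0, 5.0], 5.0, 0.2),    # costly, variable benefit
}
best = max(actions, key=lambda a: preference_score(*actions[a]))
```

Under this scoring, the consistent low-cost action wins despite the redesign's higher peak benefit, which is the kind of trade-off the robustness term is meant to capture.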
NASA Technical Reports Server (NTRS)
Solarna, David; Moser, Gabriele; Le Moigne-Stewart, Jacqueline; Serpico, Sebastiano B.
2017-01-01
Because of the large variety of sensors and spacecraft collecting data, planetary science needs to integrate various multi-sensor and multi-temporal images. These multiple data represent a precious asset, as they allow the study of targets' spectral responses and of changes in the surface structure; because of their variety, they also require accurate and robust registration. A new crater detection algorithm, used to extract features that will be integrated in an image registration framework, is presented. A marked point process-based method has been developed to model the spatial distribution of elliptical objects (i.e., the craters), and a birth-death Markov chain Monte Carlo method, coupled with a region-based scheme aiming at computational efficiency, is used to find the optimal configuration fitting the image. The extracted features are exploited, together with a newly defined fitness function based on a modified Hausdorff distance, by an image registration algorithm whose architecture has been designed to minimize the computational time.
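The modified Hausdorff distance underlying the fitness function has a standard baseline form (Dubuisson and Jain, 1994): the maximum of the two mean nearest-neighbor distances between point sets. A minimal sketch of that baseline measure, not the paper's modified fitness function, is:

```python
import numpy as np

def modified_hausdorff(A, B):
    """Modified Hausdorff distance between two point sets.
    A: (n, 2) array, B: (m, 2) array of feature coordinates (e.g. crater centers)."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)   # (n, m) pairwise distances
    return max(d.min(axis=1).mean(),   # mean distance from each point of A to its nearest in B
               d.min(axis=0).mean())   # and vice versa
```

Compared with the classical Hausdorff distance (a max of maxes), averaging the nearest-neighbor distances makes the measure far less sensitive to a single outlier feature, which matters when some detected craters have no true counterpart.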
BIM based virtual environment for fire emergency evacuation.
Wang, Bin; Li, Haijiang; Rezgui, Yacine; Bradley, Alex; Ong, Hoang N
2014-01-01
Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies in how to utilize BIM as a comprehensive building information provider to work with virtual reality technologies to build an adaptable immersive serious game environment providing real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious game based virtual reality (VR) environment aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management.
A simple analytical infiltration model for short-duration rainfall
NASA Astrophysics Data System (ADS)
Wang, Kaiwen; Yang, Xiaohua; Liu, Xiaomang; Liu, Changming
2017-12-01
Many infiltration models have been proposed to simulate the infiltration process. Different initial soil conditions and non-uniform initial water content can lead to infiltration simulation errors, especially for short-duration rainfall (SHR). Few infiltration models are specifically derived to eliminate the errors caused by complex initial soil conditions. We present a simple analytical infiltration model for SHR infiltration simulation, the Short-duration Infiltration Process (SHIP) model. The infiltration simulated by five models (SHIP (high), SHIP (middle), SHIP (low), Philip, and Parlange) was compared based on numerical experiments and soil column experiments. In the numerical experiments, the SHIP (middle) and Parlange models had robust solutions for SHR infiltration simulation of 12 typical soils under different initial soil conditions: the absolute values of percent bias were less than 12% and the Nash-Sutcliffe efficiency values were greater than 0.83. Additionally, in the soil column experiments, the infiltration rate fluctuated within a range because of non-uniform initial water content. The SHIP (high) and SHIP (low) models can simulate an infiltration range, which successfully covered the fluctuation range of the observed infiltration rate. Given the robustness of its solutions and its coverage of the fluctuation range of the infiltration rate, the SHIP model can be integrated into hydrologic models to simulate the SHR infiltration process and benefit flood forecasting.
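The SHIP model equations are not given in the abstract. For orientation, here is a minimal sketch of the classical Philip two-term model used above as a comparison baseline, with sorptivity S and a steady-state term A (parameter names assumed; units must be consistent, e.g. cm and h):

```python
import math

def philip_rate(t, S, A):
    """Philip two-term infiltration rate: f(t) = S / (2*sqrt(t)) + A, for t > 0."""
    return S / (2 * math.sqrt(t)) + A

def philip_cumulative(t, S, A):
    """Philip cumulative infiltration: I(t) = S*sqrt(t) + A*t."""
    return S * math.sqrt(t) + A * t
```

The sorptivity term dominates early in an event, which is why short-duration rainfall simulation is so sensitive to the initial soil moisture encoded in S.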
Continuous manufacturing of extended release tablets via powder mixing and direct compression.
Ervasti, Tuomas; Simonaho, Simo-Pekka; Ketolainen, Jarkko; Forsberg, Peter; Fransson, Magnus; Wikström, Håkan; Folestad, Staffan; Lakio, Satu; Tajarobi, Pirjo; Abrahmsén-Alami, Susanna
2015-11-10
The aim of the current work was to explore continuous dry powder mixing and direct compression for the manufacture of extended release (ER) matrix tablets. The study was carried out with a challenging formulation design comprising ibuprofen compositions with varying particle size and a relatively low amount of the matrix former hydroxypropyl methylcellulose (HPMC). Standard grade HPMC (CR) was compared to a recently developed direct compressible grade (DC2). The work demonstrates that ER tablets with the desired quality attributes could be manufactured via integrated continuous mixing and direct compression. The most robust tablet quality (weight, assay, tensile strength) was obtained using high mixer speed, large particle size ibuprofen, and HPMC DC2, owing to good powder flow. At low mixer speed it was more difficult to achieve high quality low dose tablets. Notably, with HPMC DC2 the processing conditions had a significant effect on drug release: longer processing time and/or faster mixer speed was needed to achieve robust release with compositions containing DC2 than with those containing CR. This work confirms the importance of balancing process parameters and material properties to achieve consistent product quality. It also shows that adaptive control is a pivotal means of controlling continuous manufacturing systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Imany, Poolad; Jaramillo-Villegas, Jose A.; Odele, Ogaga D.; ...
2018-01-18
Quantum frequency combs from chip-scale integrated sources are promising candidates for scalable and robust quantum information processing (QIP). However, to use these quantum combs for frequency domain QIP, demonstration of entanglement in the frequency basis, showing that the entangled photons are in a coherent superposition of multiple frequency bins, is required. We present a verification of qubit and qutrit frequency-bin entanglement using an on-chip quantum frequency comb with 40 mode pairs, through a two-photon interference measurement that is based on electro-optic phase modulation. Our demonstrations provide an important contribution in establishing integrated optical microresonators as a source for high-dimensional frequency-bin encoded quantum computing, as well as dense quantum key distribution.
Autonomous Operations System: Development and Application
NASA Technical Reports Server (NTRS)
Toro Medina, Jaime A.; Wilkins, Kim N.; Walker, Mark; Stahl, Gerald M.
2016-01-01
Autonomous control systems provide the ability of self-governance beyond conventional control systems. As the complexity of mechanical and electrical systems increases, there is a natural drive to develop robust control systems to manage complicated operations. By bridging conventional automated systems and knowledge-based self-aware systems, nominal control of operations can evolve to rely on safety-critical mitigation processes to handle off-nominal behavior. Current research and development efforts led by the Autonomous Propellant Loading (APL) group at NASA Kennedy Space Center aim to improve cryogenic propellant transfer operations by developing an automated control and health monitoring system. As an integrated system, the center aims to produce an Autonomous Operations System (AOS) capable of integrating health management operations with automated control to produce a fully autonomous system.
Parareal algorithms with local time-integrators for time fractional differential equations
NASA Astrophysics Data System (ADS)
Wu, Shu-Lin; Zhou, Tao
2018-04-01
It is challenging to design parareal algorithms for time-fractional differential equations due to the history effect of the fractional operator. A direct extension of the classical parareal method to such equations leads to unbalanced computational time in each process. In this work, we present an efficient parareal iteration scheme that overcomes this issue by adopting two recently developed local time-integrators for time fractional operators. In both approaches, auxiliary variables are introduced to localize the fractional operator. To this end, we propose a new strategy to perform the coarse grid correction so that the auxiliary variables and the solution variable are corrected separately in a mixed pattern. It is shown that the proposed parareal algorithm admits a robust rate of convergence. Numerical examples are presented to support our conclusions.
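As context, a minimal sketch of the classical parareal iteration (for a plain ODE u' = λu, not the fractional case with auxiliary variables treated in the paper) illustrates the correction U_{n+1}^{k+1} = G(U_n^{k+1}) + F(U_n^k) − G(U_n^k), with G a cheap coarse propagator and F an accurate fine one:

```python
def coarse(u, dt, lam=-1.0):
    """Coarse propagator G: one explicit Euler step for u' = lam * u."""
    return u + dt * lam * u

def fine(u, dt, lam=-1.0, m=100):
    """Fine propagator F: m small Euler steps (a stand-in for an accurate solver)."""
    h = dt / m
    for _ in range(m):
        u = u + h * lam * u
    return u

def parareal(u0, T, N, K):
    """Classical parareal: N time slices, K correction iterations."""
    dt = T / N
    U = [u0] * (N + 1)
    for n in range(N):                                # serial coarse sweep: initial guess
        U[n + 1] = coarse(U[n], dt)
    for _ in range(K):
        F = [fine(U[n], dt) for n in range(N)]        # embarrassingly parallel in practice
        G_old = [coarse(U[n], dt) for n in range(N)]
        for n in range(N):                            # serial correction sweep
            U[n + 1] = coarse(U[n], dt) + F[n] - G_old[n]
    return U
```

After K = N iterations the scheme reproduces the serial fine solution exactly; the point of the paper is that for fractional operators the history term spoils this cost balance, motivating the localized auxiliary-variable formulation.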
Iannuzzi, David; Grant, Andrew; Corriveau, Hélène; Boissy, Patrick; Michaud, Francois
2016-12-01
The objective of this study was to design an effectively integrated information architecture for a mobile teleoperated robot in remote assistance to the delivery of home health care. Three role classes were identified related to the deployment of a telerobot, namely, engineer, technology integrator, and health professional. Patients and natural caregivers were indirectly considered, this being a component of future field studies. Interviewing representatives of each class provided the functions, and the information content and flows for each function. Interview transcripts enabled the formulation of UML (Unified Modeling Language) diagrams for feedback from participants. The proposed information architecture was validated with a use-case scenario. The integrated information architecture incorporates progressive design, ergonomic integration, and the home care needs from medical specialist, nursing, physiotherapy, occupational therapy, and social worker care perspectives. The iterative design of the integrated architecture promoted insight among participants. The use-case scenario evaluation showed the design's robustness. Complex innovation such as a telerobot must coherently mesh with health-care service delivery needs. The deployment of an integrated information architecture bridging development with specialist and home care applications is necessary for home care technology innovation. It enables the continuing evolution of robot and novel health information design in the same integrated architecture, while accounting for patients' ecological needs.
Numerosity estimation benefits from transsaccadic information integration
Hübner, Carolin; Schütz, Alexander C.
2017-01-01
Humans achieve a stable and homogeneous representation of their visual environment, although visual processing varies across the visual field. Here we investigated the circumstances under which peripheral and foveal information is integrated for numerosity estimation across saccades. We asked our participants to judge the number of black and white dots on a screen. Information was presented either in the periphery before a saccade, in the fovea after a saccade, or in both areas consecutively to measure transsaccadic integration. In contrast to previous findings, we found an underestimation of numerosity for foveal presentation and an overestimation for peripheral presentation. We used a maximum-likelihood model to predict accuracy and reliability in the transsaccadic condition based on peripheral and foveal values. We found near-optimal integration of peripheral and foveal information, consistent with previous findings about orientation integration. In three consecutive experiments, we disrupted object continuity between the peripheral and foveal presentations to probe the limits of transsaccadic integration. Even for global changes to our numerosity stimuli, no influence of object discontinuity was observed. Overall, our results suggest that transsaccadic integration is a robust mechanism that also works for complex visual features such as numerosity and is operative despite internal or external mismatches between foveal and peripheral information. Transsaccadic integration facilitates an accurate and reliable perception of our environment. PMID:29149766
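The maximum-likelihood model referenced above has a standard closed form for two Gaussian cues: each estimate is weighted by its reliability (inverse variance), and the fused estimate is more reliable than either cue alone. A minimal sketch under that assumption:

```python
def integrate_estimates(mu_p, var_p, mu_f, var_f):
    """Maximum-likelihood fusion of a peripheral and a foveal Gaussian estimate.
    Each cue is weighted by its reliability r = 1/variance."""
    r_p, r_f = 1.0 / var_p, 1.0 / var_f
    mu = (r_p * mu_p + r_f * mu_f) / (r_p + r_f)   # reliability-weighted mean
    var = 1.0 / (r_p + r_f)                        # fused variance is always smaller
    return mu, var
```

For example, fusing a peripheral estimate of 10 dots and a foveal estimate of 14 dots with equal variance 4 yields a fused estimate of 12 dots with variance 2, i.e. halved uncertainty, which is the optimality benchmark the transsaccadic condition was tested against.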
Lin, Zhicheng; He, Sheng
2012-01-01
Object identities (“what”) and their spatial locations (“where”) are processed in distinct pathways in the visual system, raising the question of how the what and where information is integrated. Because of object motions and eye movements, the retina-based representations are unstable, necessitating nonretinotopic representation and integration. A potential mechanism is to code and update objects according to their reference frames (i.e., frame-centered representation and integration). To isolate frame-centered processes, in a frame-to-frame apparent motion configuration, we (a) presented two preceding or trailing objects on the same frame, equidistant from the target on the other frame, to control for object-based (frame-based) and space-based effects, and (b) manipulated the target's relative location within its frame to probe the frame-centered effect. We show that iconic memory, visual priming, and backward masking depend on objects' relative frame locations, orthogonal to the retinotopic coordinate. These findings not only reveal that iconic memory, visual priming, and backward masking can be nonretinotopic but also demonstrate that these processes are automatically constrained by contextual frames through a frame-centered mechanism. Thus, object representation is robustly and automatically coupled to its reference frame and continuously updated through a frame-centered, location-specific mechanism. These findings lead to an object cabinet framework, in which objects (“files”) within the reference frame (“cabinet”) are orderly coded relative to the frame. PMID:23104817
Automatic Image Registration of Multimodal Remotely Sensed Data with Global Shearlet Features
NASA Technical Reports Server (NTRS)
Murphy, James M.; Le Moigne, Jacqueline; Harding, David J.
2015-01-01
Automatic image registration is the process of aligning two or more images of approximately the same scene with minimal human assistance. Wavelet-based automatic registration methods are standard, but sometimes are not robust to the choice of initial conditions. That is, if the images to be registered are too far apart relative to the initial guess of the algorithm, the registration algorithm does not converge or has poor accuracy, and is thus not robust. These problems occur because wavelet techniques primarily identify isotropic textural features and are less effective at identifying linear and curvilinear edge features. We integrate the recently developed mathematical construction of shearlets, which is more effective at identifying sparse anisotropic edges, with an existing automatic wavelet-based registration algorithm. Our shearlet features algorithm produces more distinct features than wavelet features algorithms; the separation of edges from textures is even stronger than with wavelets. Our algorithm computes shearlet and wavelet features for the images to be registered, then performs least squares minimization on these features to compute a registration transformation. Our algorithm is two-stage and multiresolution in nature. First, a cascade of shearlet features is used to provide a robust, though approximate, registration. This is then refined by registering with a cascade of wavelet features. Experiments across a variety of image classes show an improved robustness to initial conditions, when compared to wavelet features alone.
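The least squares minimization step can be illustrated in simplified form: given matched feature coordinates in the two images, an affine registration transform is estimated by solving an overdetermined linear system. This is a generic sketch, not the paper's multiresolution implementation:

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares affine transform M (2x3) mapping src points onto dst points.
    src, dst: (n, 2) arrays of matched feature coordinates, n >= 3."""
    n = src.shape[0]
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src; A[0::2, 2] = 1   # rows for x' = a*x + b*y + c
    A[1::2, 3:5] = src; A[1::2, 5] = 1   # rows for y' = d*x + e*y + f
    b = dst.reshape(-1)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.array([[p[0], p[1], p[2]],
                     [p[3], p[4], p[5]]])

def apply_affine(M, pts):
    """Apply a 2x3 affine transform to (n, 2) points."""
    return pts @ M[:, :2].T + M[:, 2]
```

With more matches than unknowns, the least-squares fit averages out localization noise in the individual features; the quality of the matches (shearlet vs. wavelet features) is what determines whether this step converges from a poor initial guess.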
Automatic Image Registration of Multi-Modal Remotely Sensed Data with Global Shearlet Features
Murphy, James M.; Le Moigne, Jacqueline; Harding, David J.
2017-01-01
Automatic image registration is the process of aligning two or more images of approximately the same scene with minimal human assistance. Wavelet-based automatic registration methods are standard, but sometimes are not robust to the choice of initial conditions. That is, if the images to be registered are too far apart relative to the initial guess of the algorithm, the registration algorithm does not converge or has poor accuracy, and is thus not robust. These problems occur because wavelet techniques primarily identify isotropic textural features and are less effective at identifying linear and curvilinear edge features. We integrate the recently developed mathematical construction of shearlets, which is more effective at identifying sparse anisotropic edges, with an existing automatic wavelet-based registration algorithm. Our shearlet features algorithm produces more distinct features than wavelet features algorithms; the separation of edges from textures is even stronger than with wavelets. Our algorithm computes shearlet and wavelet features for the images to be registered, then performs least squares minimization on these features to compute a registration transformation. Our algorithm is two-stage and multiresolution in nature. First, a cascade of shearlet features is used to provide a robust, though approximate, registration. This is then refined by registering with a cascade of wavelet features. Experiments across a variety of image classes show an improved robustness to initial conditions, when compared to wavelet features alone. PMID:29123329
Robust Hybrid Finite Element Methods for Antennas and Microwave Circuits
NASA Technical Reports Server (NTRS)
Gong, J.; Volakis, John L.
1996-01-01
One of the primary goals of this dissertation is the development of robust hybrid finite element-boundary integral (FE-BI) techniques for the modeling and design of conformal antennas of arbitrary shape. Both the finite element and integral equation methods are first overviewed in this chapter, with an emphasis on recently developed hybrid FE-BI methodologies for antennas, microwave, and millimeter wave applications. The structure of the dissertation is then outlined. We conclude the chapter with discussions of certain fundamental concepts and methods in electromagnetics which are important to this study.
Integration Methodology For Oil-Free Shaft Support Systems: Four Steps to Success
NASA Technical Reports Server (NTRS)
Howard, Samuel A.; DellaCorte, Christopher; Bruckner, Robert J.
2010-01-01
Commercial applications for Oil-Free turbomachinery are slowly becoming a reality. Micro-turbine generators, high-speed electric motors, and electrically driven centrifugal blowers are a few examples of products available in today's commercial marketplace. Gas foil bearing technology makes most of these applications possible. A significant volume of component-level research has led to recent acceptance of gas foil bearings in several specialized applications, including those mentioned above. Component tests identifying such characteristics as load carrying capacity, power loss, thermal behavior, and rotordynamic coefficients all help the engineer design foil bearing machines, but the development process can be just as important. As the technology gains momentum and acceptance in a wider array of machinery, the complexity and variety of applications will grow beyond the current class of machines. Following a robust integration methodology will help improve the probability of successful development of future Oil-Free turbomachinery. This paper describes a previously successful four-step integration methodology used in the development of several Oil-Free turbomachines. Proper application of the methods put forward here enables successful design of Oil-Free turbomachinery. In addition, when significant design changes or unique machinery are developed, this four-step process must be considered.
Chen, Zhiru; Hong, Wenxue
2016-02-01
Considering the low prediction accuracy for positive samples and the poor overall classification caused by unbalanced sample data of MicroRNA (miRNA) targets, we propose a support vector machine-integration of under-sampling and weight (SVM-IUSM) algorithm, an under-sampling method based on ensemble learning. The algorithm adopts SVM as the learning algorithm and AdaBoost as the integration framework, and embeds clustering-based under-sampling into the iterative process, aiming to reduce the degree of unbalanced distribution between positive and negative samples. Meanwhile, in the process of adaptive weight adjustment of the samples, the SVM-IUSM algorithm eliminates abnormal negative samples with a robust sample-weight smoothing mechanism so as to avoid over-learning. Finally, the prediction of the miRNA target integrated classifier is achieved by combining multiple weak classifiers through a voting mechanism. Experiments revealed that SVM-IUSM, compared with other algorithms on unbalanced dataset collections, could not only improve the accuracy on positive targets and the overall classification effect, but also enhance the generalization ability of the miRNA target classifier.
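The clustering-based under-sampling step admits a simple illustration: cluster the majority (negative) class and keep one representative per cluster, shrinking the negative set while preserving its structure. This hypothetical sketch uses plain k-means centroids and dependency-free Python; the actual SVM-IUSM embedding of under-sampling into AdaBoost iterations is not reproduced here:

```python
import random

def kmeans_undersample(majority, k, iters=20, seed=0):
    """Cluster-based under-sampling: reduce the majority class (list of coordinate
    tuples) to k cluster centroids via a basic k-means loop."""
    rng = random.Random(seed)
    centroids = rng.sample(majority, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in majority:  # assign each sample to its nearest centroid
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(x, centroids[i])))
            clusters[j].append(x)
        for j, c in enumerate(clusters):  # recompute centroids (skip empty clusters)
            if c:
                centroids[j] = tuple(sum(col) / len(c) for col in zip(*c))
    return centroids
```

Using centroids (or the samples nearest to them) as the reduced negative set keeps each region of the negative class represented, rather than discarding samples at random.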
Rowland, Kevin C; Joy, Anita
2015-03-01
Reports on the status of dental education have concluded that there is a need for various types of curricular reform, making recommendations that include better integration of basic, behavioral, and clinical sciences, increased case-based teaching, emphasis on student-driven learning, and creation of lifelong learners. Dental schools faced with decreasing contact hours, increasing teaching material, and technological advancements have experimented with alternate curricular strategies. At Southern Illinois University School of Dental Medicine, curricular changes have begun with a series of integrated biomedical sciences courses. During the process of planning and implementing the integrated courses, a novel venue-the gross anatomy laboratory-was used to introduce all Year 1 students to critical thinking, self-directed learning, and the scientific method. The venture included student-driven documentation of anatomical variations encountered in the laboratory using robust scientific methods, thorough literature review, and subsequent presentation of findings in peer review settings. Students responded positively, with over 75% agreeing the experience intellectually challenged them. This article describes the process of re-envisioning the gross anatomy laboratory as an effective venue for small group-based, student-driven projects that focus on key pedagogical concepts to encourage the development of lifelong learners.
Robust adaptive multichannel SAR processing based on covariance matrix reconstruction
NASA Astrophysics Data System (ADS)
Tan, Zhen-ya; He, Feng
2018-04-01
With the combination of digital beamforming (DBF) processing, multichannel synthetic aperture radar (SAR) systems in azimuth promise high-resolution and wide-swath imaging, whereas conventional processing methods do not take the nonuniformity of the scattering coefficient into consideration. This paper presents a robust adaptive multichannel SAR processing method which first utilizes the Capon spatial spectrum estimator to obtain the spatial spectrum distribution over all ambiguous directions, and then reconstructs the interference-plus-noise covariance matrix from its definition to obtain the multichannel SAR processing filter. This novel method improves processing performance under nonuniform scattering coefficients and is robust against array errors. Experiments with real measured data demonstrate the effectiveness and robustness of the proposed method.
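The two stages named above, Capon spatial spectrum estimation followed by definition-based covariance reconstruction, have standard forms: P(θ) = 1 / (aᴴ R⁻¹ a), and R_rec = Σ_θ P(θ) a(θ) a(θ)ᴴ over the directions of interest. A minimal numerical sketch for a half-wavelength uniform linear array (the array geometry, source parameters, and function names are assumptions, not the paper's SAR-specific formulation):

```python
import numpy as np

def ula_steering(sin_thetas, n_elem):
    """Steering vectors a(theta) for a half-wavelength-spaced uniform linear array."""
    n = np.arange(n_elem)
    return [np.exp(1j * np.pi * n * s) for s in sin_thetas]

def capon_spectrum(R, steering):
    """Capon spatial spectrum: P(theta) = 1 / (a^H R^{-1} a) for each steering vector."""
    Rinv = np.linalg.inv(R)
    return np.array([1.0 / np.real(a.conj() @ Rinv @ a) for a in steering])

def reconstruct_covariance(R, steering):
    """Reconstruct a covariance matrix from its definition by summing the Capon
    spectrum over the sampled directions: R_rec = sum_theta P(theta) a a^H."""
    P = capon_spectrum(R, steering)
    return sum(p * np.outer(a, a.conj()) for p, a in zip(P, steering))
```

Because the reconstructed matrix is built only from the chosen directions (e.g. the ambiguous directions, excluding the signal of interest), the resulting filter degrades gracefully when the sample covariance is contaminated, which is the source of the claimed robustness.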
Membrane Resonance Enables Stable and Robust Gamma Oscillations
Moca, Vasile V.; Nikolić, Danko; Singer, Wolf; Mureşan, Raul C.
2014-01-01
Neuronal mechanisms underlying beta/gamma oscillations (20–80 Hz) are not completely understood. Here, we show that in vivo beta/gamma oscillations in the cat visual cortex sometimes exhibit remarkably stable frequency even when inputs fluctuate dramatically. Enhanced frequency stability is associated with stronger oscillations measured in individual units and larger power in the local field potential. Simulations of neuronal circuitry demonstrate that membrane properties of inhibitory interneurons strongly determine the characteristics of emergent oscillations. Exploration of networks containing either integrator or resonator inhibitory interneurons revealed that: (i) Resonance, as opposed to integration, promotes robust oscillations with large power and stable frequency via a mechanism called RING (Resonance INduced Gamma); resonance favors synchronization by reducing phase delays between interneurons and imposes bounds on oscillation cycle duration; (ii) Stability of frequency and robustness of the oscillation also depend on the relative timing of excitatory and inhibitory volleys within the oscillation cycle; (iii) RING can reproduce characteristics of both Pyramidal INterneuron Gamma (PING) and INterneuron Gamma (ING), transcending such classifications; (iv) In RING, robust gamma oscillations are promoted by slow but are impaired by fast inputs. Results suggest that interneuronal membrane resonance can be an important ingredient for generation of robust gamma oscillations having stable frequency. PMID:23042733
Cambiaghi, Alice; Ferrario, Manuela; Masseroli, Marco
2017-05-01
Metabolomics is a rapidly growing field consisting of the analysis of a large number of metabolites at a system scale. The two major goals of metabolomics are the identification of the metabolites characterizing each organism state and the measurement of their dynamics under different situations (e.g., pathological conditions, environmental factors). Knowledge about metabolites is crucial for the understanding of most cellular phenomena, but this information alone is not sufficient to gain a comprehensive view of all the biological processes involved. Integrated approaches combining metabolomics with transcriptomics and proteomics are thus required to obtain much deeper insights than any of these techniques alone. Although this information is available, multilevel integration of different 'omics' data is still a challenge. The handling, processing, analysis, and integration of these data require specialized mathematical, statistical, and bioinformatics tools, and several technical problems hampering rapid progress in the field remain. Here, we review four of the several tools available for metabolomic data analysis and integration with other 'omics' data, selected for their number of users or provided features (MetaCore™, MetaboAnalyst, InCroMAP, and 3Omics), highlighting their strong and weak aspects; a number of related issues affecting data analysis and integration are also identified and discussed. Overall, we provide an objective description of how some of the main currently available software packages work, which may help the experimental practitioner choose a robust pipeline for metabolomic data analysis and integration.
Aerial video mosaicking using binary feature tracking
NASA Astrophysics Data System (ADS)
Minnehan, Breton; Savakis, Andreas
2015-05-01
Unmanned Aerial Vehicles are becoming an increasingly attractive platform for many applications, as their cost decreases and their capabilities increase. Creating detailed maps from aerial data requires fast and accurate video mosaicking methods. Traditional mosaicking techniques rely on inter-frame homography estimations that are cascaded through the video sequence. Computationally expensive keypoint matching algorithms are often used to determine the correspondence of keypoints between frames. This paper presents a video mosaicking method that uses an object tracking approach for matching keypoints between frames to improve both efficiency and robustness. The proposed tracking method matches local binary descriptors between frames and leverages the spatial locality of the keypoints to simplify the matching process. Our method is robust to cascaded errors by determining the homography between each frame and the ground plane rather than the prior frame. The frame-to-ground homography is calculated based on the relationship of each point's image coordinates and its estimated location on the ground plane. Robustness to moving objects is integrated into the homography estimation step through detecting anomalies in the motion of keypoints and eliminating the influence of outliers. The resulting mosaics are of high accuracy and can be computed in real time.
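The benefit of registering each frame to the ground plane rather than to the prior frame can be sketched with a deliberately simplified 1-D translation model (all numbers hypothetical; real mosaicking estimates full 3x3 homographies from matched binary descriptors). Chained frame-to-frame estimates accumulate bias linearly, while a direct frame-to-ground estimate carries only a single estimation error:

```python
# Each frame is offset 1.0 unit from the previous; every per-frame
# estimate carries a small bias of 0.01 units (illustrative numbers).
true_shift, bias, n_frames = 1.0, 0.01, 100

# Cascaded: frame-to-frame estimates are chained, so bias accumulates.
cascaded = sum(true_shift + bias for _ in range(n_frames))

# Frame-to-ground: the last frame is registered to the ground plane
# directly, so only one biased estimate enters its final position.
frame_to_ground = n_frames * true_shift + bias

truth = n_frames * true_shift
print(abs(cascaded - truth), abs(frame_to_ground - truth))
```

The cascaded error grows with the number of frames; the frame-to-ground error does not, which is the robustness property the abstract claims.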
JIMM: the next step for mission-level models
NASA Astrophysics Data System (ADS)
Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.
2001-09-01
The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic nature of representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). A MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated, threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.
NASA Astrophysics Data System (ADS)
Wang, S.; Huang, G. H.; Baetz, B. W.; Ancell, B. C.
2017-05-01
Particle filtering techniques have been receiving increasing attention from the hydrologic community due to their ability to properly estimate model parameters and states of nonlinear and non-Gaussian systems. To facilitate a robust quantification of uncertainty in hydrologic predictions, it is necessary to explicitly examine the forward propagation and evolution of parameter uncertainties and their interactions that affect the predictive performance. This paper presents a unified probabilistic framework that merges the strengths of particle Markov chain Monte Carlo (PMCMC) and factorial polynomial chaos expansion (FPCE) algorithms to robustly quantify and reduce uncertainties in hydrologic predictions. A Gaussian anamorphosis technique is used to establish a seamless bridge between the data assimilation using the PMCMC and the uncertainty propagation using the FPCE through a straightforward transformation of posterior distributions of model parameters. The unified probabilistic framework is applied to the Xiangxi River watershed of the Three Gorges Reservoir (TGR) region in China to demonstrate its validity and applicability. Results reveal that the degree of spatial variability of soil moisture capacity is the most identifiable model parameter with the fastest convergence through the streamflow assimilation process. The potential interaction between the spatial variability in soil moisture conditions and the maximum soil moisture capacity has the most significant effect on the performance of streamflow predictions. In addition, parameter sensitivities and interactions vary in magnitude and direction over time due to temporal and spatial dynamics of hydrologic processes.
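The particle-filter backbone of PMCMC depends on a resampling step to combat weight degeneracy. A minimal sketch of the standard low-variance systematic resampler (the generic textbook algorithm, not the authors' specific implementation) is:

```python
import random

def systematic_resample(weights, rng=None):
    """Low-variance systematic resampling used in particle filters:
    one uniform draw positions n evenly spaced pointers on the CDF."""
    rng = rng or random.Random(0)
    n = len(weights)
    total = sum(weights)
    cumulative, acc = [], 0.0
    for w in weights:
        acc += w / total
        cumulative.append(acc)
    u0 = rng.random() / n                 # single uniform offset in [0, 1/n)
    indices, j = [], 0
    for i in range(n):
        u = u0 + i / n                    # evenly spaced pointers
        while cumulative[j] < u:
            j += 1
        indices.append(j)
    return indices

# A dominant-weight particle is duplicated; negligible ones tend to vanish.
idx = systematic_resample([0.1, 0.1, 0.7, 0.1])
print(idx)
```

Because all pointers share one random offset, the number of copies of each particle deviates from its expected value by at most one, which keeps resampling variance low.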
NASA Astrophysics Data System (ADS)
Zhao, Zhiguo; Lei, Dan; Chen, Jiayi; Li, Hangyu
2018-05-01
When the four-wheel-drive hybrid electric vehicle (HEV) equipped with a dry dual clutch transmission (DCT) is in the mode transition process from pure electrical rear wheel drive to front wheel drive with engine or hybrid drive, the problem of vehicle longitudinal jerk is prominent. A robust mode transition control algorithm that resists external disturbances and model parameter fluctuations has been developed by taking full advantage of the fast and accurate torque (or speed) response of the three electrical power sources and getting the DCT clutch fully involved in the mode transition process. Firstly, models of the key driveline components have been established, and a five-degrees-of-freedom model of vehicle longitudinal dynamics has been built using the UniTire model. Next, a multistage optimal control method has been developed to determine the engine torque and the clutch-transmitted torque. A sliding-mode control strategy for measurable disturbances is applied at the stage in which the engine speed is dragged up. Meanwhile, a double tracking control architecture that integrates model-based feedforward control with H∞ robust feedback control is employed at the speed synchronization stage. Finally, results from Matlab/Simulink simulation and hardware-in-the-loop tests both demonstrate that the proposed mode transition control strategy can not only coordinate the torque among the different power sources and the clutch while minimizing vehicle longitudinal jerk, but also provides strong robustness to model uncertainties and external disturbances.
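The disturbance-rejection idea behind the sliding-mode stage can be sketched on a toy first-order speed loop (inertia, gain, and disturbance values are invented for illustration, not taken from the paper's driveline model). The switching gain need only dominate the disturbance bound for the slip speed to converge:

```python
import math

# Toy speed-synchronization loop: drive slip speed "w" to the target
# despite a bounded unknown disturbance, using a sliding-mode law.
dt, J = 0.001, 0.1          # step [s], inertia [kg m^2] (illustrative)
w, w_ref = 50.0, 0.0        # initial and target slip speed [rad/s]
k = 200.0                   # switching gain; must dominate |d| / J

for step in range(2000):
    t = step * dt
    d = 5.0 * math.sin(20 * t)                         # bounded disturbance [N m]
    s = w - w_ref                                      # sliding surface
    u = -J * k * (1 if s > 0 else -1 if s < 0 else 0)  # switching control
    w += dt * (u + d) / J                              # plant: J * dw/dt = u + d
print(abs(w - w_ref))  # slip speed ends inside a small chattering band
```

In practice the pure sign function is smoothed (boundary layer or reaching law) to limit chattering; the sketch keeps it raw for brevity.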
Travers, Brittany G.; Bigler, Erin D.; Tromp, Do P. M.; Adluru, Nagesh; Froehlich, Alyson L.; Ennis, Chad; Lange, Nicholas; Nielsen, Jared A.; Prigge, Molly B. D.; Alexander, Andrew L.; Lainhart, Janet E.
2014-01-01
The present study used an accelerated longitudinal design to examine group differences and age-related changes in processing speed in 81 individuals with Autism Spectrum Disorder (ASD) compared to 56 age-matched individuals with typical development (ages 6–39 years). Processing speed was assessed using the Wechsler Intelligence Scale for Children-3rd edition (WISC-III) and the Wechsler Adult Intelligence Scale-3rd edition (WAIS-III). Follow-up analyses examined processing speed subtest performance and relations between processing speed and white matter microstructure (as measured with diffusion tensor imaging [DTI] in a subset of these participants). After controlling for full scale IQ, the present results show that processing speed index standard scores were on average 12 points lower in the group with ASD compared to the group with typical development. There were, however, no significant group differences in standard score age-related changes within this age range. For subtest raw scores, the group with ASD demonstrated robustly slower processing speeds in the adult versions of the IQ test (i.e., WAIS-III) but not in the child versions (WISC-III), even though age-related changes were similar in both the ASD and typically developing groups. This pattern of results may reflect difficulties that become increasingly evident in ASD on more complex measures of processing speed. Finally, DTI measures of whole-brain white matter microstructure suggested that fractional anisotropy (but not mean diffusivity, radial diffusivity, or axial diffusivity) made significant but small-sized contributions to processing speed standard scores across our entire sample. Taken together, the present findings suggest that robust decreases in processing speed may be present in ASD, more pronounced in adulthood, and partially attributable to white matter microstructural integrity. PMID:24269298
Badawy, Sherif I F; Narang, Ajit S; LaMarche, Keirnan R; Subramanian, Ganeshkumar A; Varia, Sailesh A; Lin, Judy; Stevens, Tim; Shah, Pankaj A
2016-01-01
Modern drug product development is expected to follow the quality-by-design (QbD) paradigm. At the same time, although there are several issue-specific examples in the literature that demonstrate the application of QbD principles, a holistic demonstration of the application of QbD principles to drug product development and control strategy is lacking. This article provides an integrated case study on the systematic application of QbD to product development and demonstrates the implementation of QbD concepts in the different aspects of product and process design for brivanib alaninate film-coated tablets. Using a risk-based approach, the strategy for development entailed identification of product critical quality attributes (CQAs), assessment of risks to the CQAs, and performing experiments to understand and mitigate identified risks. Quality risk assessments and design of experiments were performed to understand the quality of the input raw materials required for a robust formulation and the impact of manufacturing process parameters on CQAs. In addition to the material property and process parameter controls, the proposed control strategy includes use of process analytical technology and conventional analytical tests to control in-process material attributes and ensure quality of the final product. Copyright © 2016. Published by Elsevier Inc.
Time-Warp–Invariant Neuronal Processing
Gütig, Robert; Sompolinsky, Haim
2009-01-01
Fluctuations in the temporal durations of sensory signals constitute a major source of variability within natural stimulus ensembles. The neuronal mechanisms through which sensory systems can stabilize perception against such fluctuations are largely unknown. An intriguing instantiation of such robustness occurs in human speech perception, which relies critically on temporal acoustic cues that are embedded in signals with highly variable duration. Across different instances of natural speech, auditory cues can undergo temporal warping that ranges from 2-fold compression to 2-fold dilation without significant perceptual impairment. Here, we report that time-warp–invariant neuronal processing can be subserved by the shunting action of synaptic conductances that automatically rescales the effective integration time of postsynaptic neurons. We propose a novel spike-based learning rule for synaptic conductances that adjusts the degree of synaptic shunting to the temporal processing requirements of a given task. Applying this general biophysical mechanism to the example of speech processing, we propose a neuronal network model for time-warp–invariant word discrimination and demonstrate its excellent performance on a standard benchmark speech-recognition task. Our results demonstrate the important functional role of synaptic conductances in spike-based neuronal information processing and learning. The biophysics of temporal integration at neuronal membranes can endow sensory pathways with powerful time-warp–invariant computational capabilities. PMID:19582146
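The rescaling idea can be demonstrated with a deliberately reduced numeric sketch (a single leaky integrator with purely shunting input, not the paper's spiking network or learning rule). When a spike train is time-warped, its instantaneous rate, and hence the synaptic conductance, scales inversely with duration; the membrane's effective integration time then stretches by the same factor, leaving the voltage trajectory invariant:

```python
# Leaky integrator whose conductance is set by the instantaneous input
# rate: shunting rescales the effective integration time, so a 2-fold
# time warp of the input leaves the voltage trajectory unchanged.
# All parameters are illustrative.
def peak_voltage(duration, n_spikes=100, w=1.0, k=0.5, C=1.0, steps=10000):
    rate = n_spikes / duration          # warped train: same spike count
    g = k * rate                        # synaptic (shunting) conductance
    dt, v, peak = duration / steps, 0.0, 0.0
    for _ in range(steps):
        v += dt * (w * rate - g * v) / C
        peak = max(peak, v)
    return peak

fast, slow = peak_voltage(0.5), peak_voltage(1.0)   # 2-fold warp
print(fast, slow)
```

With a fixed (non-shunting) leak conductance added, the two peaks would differ, which is why the degree of shunting must be matched to the task, the role of the proposed learning rule.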
NASA Astrophysics Data System (ADS)
Alabdulkarem, Abdullah
Liquefied natural gas (LNG) plants are energy intensive. As a result, the power plants operating these LNG plants emit high amounts of CO2. To mitigate global warming that is caused by the increase in atmospheric CO2, CO2 capture and sequestration (CCS) using amine absorption is proposed. However, the major challenge of implementing this CCS system is the associated power requirement, which increases power consumption by about 15–25%. Therefore, the main scope of this work is to tackle this challenge by minimizing CCS power consumption as well as that of the entire LNG plant through system integration and rigorous optimization. The power consumption of the LNG plant was reduced through improving the liquefaction process itself. In this work, a genetic algorithm (GA) was used to optimize a propane pre-cooled mixed-refrigerant (C3-MR) LNG plant modeled using HYSYS software. An optimization platform coupling Matlab with HYSYS was developed. New refrigerant mixtures were found, with savings in power consumption as high as 13%. LNG plant optimization with variable natural gas feed compositions was addressed, and a solution was proposed through applying robust optimization techniques, resulting in a robust refrigerant which can liquefy a range of natural gas feeds. The second approach for reducing the power consumption is through process integration and waste heat utilization in the integrated CCS system. Four waste heat sources and six potential uses were identified and evaluated using HYSYS software. The developed models were verified against experimental data from the literature with good agreement. The net available power enhancement in one of the proposed CCS configurations is 16% more than the conventional CCS configuration. To reduce the power needed to pressurize CO2 into a well for enhanced oil recovery (EOR) applications, five CO2 pressurization methods were explored. New CO2 liquefaction cycles were developed and modeled using HYSYS software.
One of the developed liquefaction cycles using NH3 as a refrigerant resulted in 5% less power consumption than the conventional multi-stage compression cycle. Finally, a new concept of providing the CO2 regeneration heat is proposed. The proposed concept is using a heat pump to provide the regeneration heat as well as process heat and CO2 liquefaction heat. Seven configurations of heat pumps integrated with CCS were developed. One of the heat pumps consumes 24% less power than the conventional system or 59% less total equivalent power demand than the conventional system with steam extraction and CO2 compression.
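The GA-based optimization loop described above can be sketched in miniature. The toy objective below is a hypothetical stand-in for compressor power as a function of two mixture fractions (the real work coupled the optimizer to a HYSYS plant model); the selection/crossover/mutation structure is the generic algorithm, not the thesis implementation:

```python
import random

rng = random.Random(42)

def power(x):
    """Toy 'compressor power' surrogate with optimum at (0.3, 0.6)."""
    return (x[0] - 0.3) ** 2 + (x[1] - 0.6) ** 2

# Population of candidate mixture fractions in [0, 1].
pop = [[rng.random(), rng.random()] for _ in range(20)]
for _ in range(100):
    pop.sort(key=power)
    parents = pop[:10]                         # elitist selection: best half
    children = []
    for _ in range(10):
        a, b = rng.sample(parents, 2)
        child = [(ai + bi) / 2 for ai, bi in zip(a, b)]               # crossover
        child = [min(1.0, max(0.0, c + rng.gauss(0, 0.02))) for c in child]  # mutation
        children.append(child)
    pop = parents + children
best = min(pop, key=power)
print(best, power(best))
```

In the thesis workflow, evaluating `power` meant a full HYSYS flowsheet simulation per candidate, which is why the optimization platform and iteration count matter.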
NASA Astrophysics Data System (ADS)
Liu, Weiqiang; Chen, Rujun; Cai, Hongzhu; Luo, Weibin
2016-12-01
In this paper, we investigated the robust processing of noisy spread spectrum induced polarization (SSIP) data. SSIP is a new frequency domain induced polarization method that transmits a pseudo-random m-sequence as the source current, where the m-sequence is a broadband signal. Potential information at multiple frequencies can be obtained through measurement. Removing noise is a crucial problem in SSIP data processing. The ordinary mean stack and digital filters are not capable of reducing impulse noise effectively, so its impact remains in the complex resistivity spectrum and affects the interpretation of profile anomalies. We implemented a robust statistical method for SSIP data processing. Robust least-squares regression is used to fit and remove the linear trend from the original data before stacking. A robust M estimate is used to stack the data of all periods. A robust smooth filter is used to suppress the residual noise in the data after stacking. For the robust statistical scheme, the most appropriate influence function and iterative algorithm are chosen by testing on simulated data to suppress the influence of outliers. We tested the benefits of the robust SSIP data processing using examples of SSIP data recorded at a test site beside a mine in Gansu province, China.
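The robust M-estimate stacking step can be sketched as an iteratively reweighted mean with a Huber influence function (a standard choice; the threshold, data, and MAD scale below are illustrative, not the paper's tuned values):

```python
# Robust stacking across repeated periods: an iteratively reweighted
# mean that down-weights impulsive outliers instead of averaging them in.
def huber_stack(samples, c=1.345, iters=20):
    est = sorted(samples)[len(samples) // 2]       # start from the median
    for _ in range(iters):
        # Median absolute deviation as a robust scale estimate.
        scale = sorted(abs(x - est) for x in samples)[len(samples) // 2] or 1.0
        weights = []
        for x in samples:
            r = abs(x - est) / scale
            weights.append(1.0 if r <= c else c / r)   # Huber weights
        est = sum(w * x for w, x in zip(weights, samples)) / sum(weights)
    return est

clean = [1.0, 1.1, 0.9, 1.05, 0.95, 1.02, 0.98]
noisy = clean + [50.0]                  # one impulsive spike in the stack
print(huber_stack(noisy), sum(noisy) / len(noisy))
```

The plain mean is dragged far from the true level by a single spike, while the M-estimate stays close to it, which is exactly the failure mode of the ordinary mean stack that the paper addresses.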
Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.
Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N
2009-10-27
The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined rational algorithmic analysis workflows or batch standardized processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas in the case of GUI embedded solutions, they do not provide direct support of various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-color cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools plus customizable export data formats for seamless integration by other analysis tools or MATLAB, for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime.
Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.
In our previous research, we showed that robust Bayesian methods can be used in environmental modeling to define a set of probability distributions for key parameters that captures the effects of expert disagreement, ambiguity, or ignorance. This entire set can then be update...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upadhyaya, Belle R.; Hines, J. Wesley; Lu, Baofu
2005-06-03
The overall purpose of this Nuclear Engineering Education Research (NEER) project was to integrate new, innovative, and existing technologies to develop a fault diagnostics and characterization system for nuclear plant steam generators (SG) and heat exchangers (HX). Issues related to system level degradation of SG and HX tubing, including tube fouling, performance under reduced heat transfer area, and the damage caused by stress corrosion cracking, are the important factors that influence overall plant operation, maintenance, and economic viability of nuclear power systems. The research at The University of Tennessee focused on the development of techniques for monitoring process and structural integrity of steam generators and heat exchangers. The objectives of the project were accomplished by the completion of the following tasks, all within the project period. This report summarizes the research and development activities, results, and accomplishments from June 2001 through September 2004. Development and testing of a high-fidelity nodal model of a U-tube steam generator (UTSG) to simulate the effects of fouling and to generate a database representing normal and degraded process conditions. Application of the group method of data handling (GMDH) for process variable prediction. Development of a laboratory test module to simulate particulate fouling of HX tubes and its effect on overall thermal resistance. Application of the GMDH technique to predict HX fluid temperatures, and comparison with the calculated thermal resistance. Development of a hybrid modeling technique for process diagnosis and its evaluation using laboratory heat exchanger test data. Development and testing of a sensor suite using piezo-electric devices for monitoring structural integrity of both flat plates (beams) and tubing. Experiments were performed in air, and in water with and without bubbly flow.
Development of advanced signal processing methods using wavelet transforms and image processing techniques for isolating flaw types. Development and implementation of a new nonlinear and non-stationary signal processing method, called the Hilbert-Huang transform (HHT), for flaw detection and location; this is a more robust and adaptive approach than the wavelet transform. Implementation of a moving-window technique in the time domain for detecting and quantifying flaw types in tubular structures. A window zooming technique was also developed for flaw location in tubes. Theoretical study of elastic wave propagation (longitudinal and shear waves) in metallic flat plates and tubing with and without flaws. Simulation of Lamb wave propagation using the finite-element code ABAQUS, which enabled verification of the experimental results. The research tasks included both analytical research and experimental studies. The experimental results helped to enhance the robustness of fault monitoring methods and to provide a systematic verification of the analytical results. The results of this research were disseminated in scientific meetings. The journal manuscript titled, "Structural Integrity Monitoring of Steam Generator Tubing Using Transient Acoustic Signal Analysis," was published in IEEE Transactions on Nuclear Science, Vol. 52, No. 1, February 2005. The new findings of this research have potential applications in aerospace and civil structures. The report contains a complete bibliography that was developed during the course of the project.
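The moving-window idea for flaw detection can be sketched as a sliding energy measure over a synthetic trace (the burst position, window length, and noiseless signal below are invented for illustration; the project applied such windows to measured acoustic signals):

```python
import math

# Moving-window energy sketch: locate a short transient "flaw echo"
# in a synthetic trace by finding the window with peak energy.
n, flaw_start = 1000, 600
signal = [0.0] * n
for i in range(flaw_start, flaw_start + 50):
    signal[i] = math.sin(0.5 * (i - flaw_start) + 0.3)   # short burst

win = 50
energies = [sum(x * x for x in signal[i:i + win]) for i in range(n - win)]
loc = max(range(len(energies)), key=lambda i: energies[i])
print(loc)  # window with peak energy starts at the flaw
```

Window "zooming" then amounts to repeating the search with a shorter window around `loc` to refine the flaw position.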
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-11
...] Draft Programmatic Environmental Assessment for the Integrated Public Alert and Warning Program's... from construction- related actions taken under the Integrated Public Alert and Warning Program (IPAWS... Order 13407, Public Alert and Warning System, by providing robust and survivable power generation, fuel...
NASA Technical Reports Server (NTRS)
Garg, Sanjay
1993-01-01
Results are presented from an application of H-infinity control design methodology to a centralized integrated flight/propulsion control (IFPC) system design for a supersonic STOVL fighter aircraft in transition flight. The emphasis is on formulating the H-infinity optimal control synthesis problem such that the critical requirements for the flight and propulsion systems are adequately reflected within the linear, centralized control problem formulation and the resulting controller provides robustness to modeling uncertainties and model parameter variations with flight condition. Detailed evaluation results are presented for a reduced order controller obtained from the improved H-infinity control design showing that the control design meets the specified nominal performance objective as well as provides stability robustness for variations in plant system dynamics with changes in aircraft trim speed within the transition flight envelope.
Bischoff-Mattson, Zachary; Lynch, Amanda H
2017-07-01
Integration, a widely promoted response to the multi-scale complexities of social-environmental sustainability, is diversely and sometimes poorly conceptualized. In this paper we explore integrative governance, which we define as an iterative and contextual process for negotiating and advancing the common interest. We ground this definition in a discussion of institutional factors conditioning integrative governance of environmental water in Australia's Murray-Darling Basin. The Murray-Darling Basin is an iconic system of social-ecological complexity, evocative of large-scale conservation challenges in other developed arid river basins. Our critical assessment of integrative governance practices in that context emerges through analysis of interviews with policy participants and documents pertaining to environmental water management in the tri-state area of southwestern New South Wales, northwestern Victoria, and the South Australian Riverland. We identify four linked challenges: (i) decision support for developing socially robust environmental water management goals, (ii) resource constraints on adaptive practice, (iii) inter-state differences in participatory decision-making and devolution of authority, and (iv) representative inclusion in decision-making. Our appraisal demonstrates these as pivotal challenges for integrative governance in the common interest. We conclude by offering a perspective on the potential for supporting integrative governance through the bridging capacity of Australia's Commonwealth Environmental Water Holder.
Robust In-Flight Sensor Fault Diagnostics for Aircraft Engine Based on Sliding Mode Observers
Chang, Xiaodong; Huang, Jinquan; Lu, Feng
2017-01-01
For a sensor fault diagnostic system of aircraft engines, the health performance degradation is an inevitable interference that cannot be neglected. To address this issue, this paper investigates an integrated on-line sensor fault diagnostic scheme for a commercial aircraft engine based on a sliding mode observer (SMO). In this approach, one sliding mode observer is designed for engine health performance tracking, and another for sensor fault reconstruction. Both observers are employed in in-flight applications. The results of the former SMO are analyzed for post-flight updating the baseline model of the latter. This idea is practical and feasible since the updating process does not require the algorithm to be regulated or redesigned, so that ground-based intervention is avoided, and the update process is implemented in an economical and efficient way. With this setup, the robustness of the proposed scheme to the health degradation is much enhanced and the latter SMO is able to fulfill sensor fault reconstruction over the course of the engine life. The proposed sensor fault diagnostic system is applied to a nonlinear simulation of a commercial aircraft engine, and its effectiveness is evaluated in several fault scenarios. PMID:28398255
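The fault-reconstruction principle of an SMO can be sketched on a scalar toy system (first-order plant, illustrative gains; not the engine model). The switching injection that keeps the observer on the sliding surface, once low-pass filtered, recovers the sensor bias; for this plant the equivalent injection equals `a * fault`:

```python
# Scalar sliding-mode observer: reconstruct an additive sensor bias
# from the low-pass-filtered switching injection ("equivalent output
# injection"). All numbers are illustrative.
dt, a = 0.001, 2.0
x, xh, nu_avg = 1.0, 0.0, 0.0
L, tau = 5.0, 0.05            # switching gain and low-pass time constant
fault = 0.5                   # additive sensor bias to reconstruct
for _ in range(5000):
    u = 1.0                                        # known input
    y = x + fault                                  # faulty measurement
    e = y - xh
    nu = L * (1 if e > 0 else -1 if e < 0 else 0)  # switching injection
    x += dt * (-a * x + u)                         # true plant
    xh += dt * (-a * xh + u + nu)                  # observer
    nu_avg += dt * (nu - nu_avg) / tau             # equivalent injection
fault_hat = nu_avg / a
print(fault_hat)
```

In the paper's scheme the second SMO plays this reconstruction role, while the first tracks health degradation so that the baseline model fed to the reconstruction stays current.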
A robust color signal processing with wide dynamic range WRGB CMOS image sensor
NASA Astrophysics Data System (ADS)
Kawada, Shun; Kuroda, Rihito; Sugawa, Shigetoshi
2011-01-01
We have developed a robust color reproduction methodology by a simple calculation with a new color matrix using the formerly developed wide dynamic range WRGB lateral overflow integration capacitor (LOFIC) CMOS image sensor. The image sensor was fabricated through a 0.18 μm CMOS technology and has a 45-degree oblique pixel array, a 4.2 μm effective pixel pitch, and W pixels. A W pixel was formed by replacing one of the two G pixels in the Bayer RGB color filter. The W pixel has a high sensitivity through the visible light waveband. An emerald green and yellow (EGY) signal is generated from the difference between the W signal and the sum of the RGB signals. This EGY signal mainly includes emerald green and yellow lights. These colors are difficult to reproduce accurately by the conventional simple linear matrix because their wavelengths are in the valleys of the spectral sensitivity characteristics of the RGB pixels. A new linear matrix based on the EGY-RGB signal was developed. Using this simple matrix, highly accurate color processing with a large margin against sensitivity fluctuations and noise has been achieved.
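The EGY construction and the extended linear matrix can be sketched as follows. The W pixel integrates across the visible band, so W minus (R+G+B) exposes the emerald-green/yellow light that falls between the RGB sensitivity peaks; the 3x4 matrix coefficients below are made up for illustration, not the sensor's calibrated values:

```python
def egy_rgb(r, g, b, w):
    """Form the (EGY, R, G, B) signal vector from a WRGB pixel group."""
    return (w - (r + g + b), r, g, b)

# Hypothetical 3x4 linear color-correction matrix applied to (EGY, R, G, B).
M = [
    [0.3, 1.1, -0.1, 0.0],   # R' row
    [0.4, -0.1, 1.0, -0.1],  # G' row
    [0.1, 0.0, -0.1, 1.1],   # B' row
]

def correct(r, g, b, w):
    v = egy_rgb(r, g, b, w)
    return [sum(m * x for m, x in zip(row, v)) for row in M]

out = correct(0.2, 0.5, 0.1, 1.0)   # W exceeds R+G+B, so EGY is positive
print(out)
```

Compared with a conventional 3x3 RGB matrix, the extra EGY column gives the fit one more degree of freedom precisely in the spectral region where the RGB responses are weakest.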
Benefits to blood banks of a sales and operations planning process.
Keal, Donald A; Hebert, Phil
2010-12-01
A formal sales and operations planning (S&OP) process is a decision-making and communication process that balances supply and demand while integrating all business operational components with customer-focused business plans, linking high-level strategic plans to day-to-day operations. Furthermore, S&OP can assist in managing change across the organization, as it provides the opportunity to be proactive in the face of problems and opportunities while establishing a plan for everyone to follow. Some of the key outcomes from a robust S&OP process in blood banking would include: higher customer satisfaction (donors and health care providers), balanced inventory across product lines and customers, more stable production rates and higher productivity, more cooperation across the entire operation, and timely updates to the business plan resulting in better forecasting and fewer surprises that negatively impact the bottom line. © 2010 American Association of Blood Banks.
NASA Astrophysics Data System (ADS)
Neuhauser, D.; Dietz, L.; Lombard, P.; Klein, F.; Zuzlewski, S.; Kohler, W.; Hellweg, M.; Luetgert, J.; Oppenheimer, D.; Romanowicz, B.
2006-12-01
The longstanding cooperation between the USGS Menlo Park and UC Berkeley's Seismological Laboratory for monitoring earthquakes and providing data to the research community is achieving a new level of integration. While station support and data collection for each network (NC, BK, BP) remain the responsibilities of the host institution, picks, codas and amplitudes will be produced and shared between the data centers continuously. Thus, realtime earthquake processing from triggering and locating through magnitude and moment tensor calculation and Shakemap production will take place independently at both locations, improving the robustness of event reporting in the Northern California Earthquake Management Center. Parametric data will also be exchanged with the Southern California Earthquake Management System to allow statewide earthquake detection and processing for further redundancy within the California Integrated Seismic Network (CISN). The database plays an integral part in this system, providing the coordination for event processing as well as the repository for event, instrument (metadata) and waveform information. The same master database serves both realtime processing, data quality control and archival, and the data center which provides waveforms and earthquake data to users in the research community. Continuous waveforms from all BK, BP, and NC stations, event waveform gathers, and event information automatically become available at the Northern California Earthquake Data Center (NCEDC). Currently, the NCEDC collects and makes available over 4 TBytes of data per year from the NCEMC stations and other seismic networks, as well as from GPS and other geophysical instrumentation.
Tuning and Robustness Analysis for the Orion Absolute Navigation System
NASA Technical Reports Server (NTRS)
Holt, Greg N.; Zanetti, Renato; D'Souza, Christopher
2013-01-01
The Orion Multi-Purpose Crew Vehicle (MPCV) is currently under development as NASA's next-generation spacecraft for exploration missions beyond Low Earth Orbit. The MPCV is set to perform an orbital test flight, termed Exploration Flight Test 1 (EFT-1), some time in late 2014. The navigation system for the Orion spacecraft is being designed in a Multi-Organizational Design Environment (MODE) team including contractor and NASA personnel. The system uses an Extended Kalman Filter to process measurements and determine the state. The design of the navigation system has undergone several iterations and modifications since its inception, and continues as a work-in-progress. This paper seeks to show the efforts made to date in tuning the filter for the EFT-1 mission and instilling appropriate robustness into the system to meet the requirements of manned spaceflight. Filter performance is affected by many factors: data rates, sensor measurement errors, tuning, and others. This paper focuses mainly on the error characterization and tuning portion. Traditional efforts at tuning a navigation filter have centered around the observation/measurement noise and Gaussian process noise of the Extended Kalman Filter. While the Orion MODE team must certainly address those factors, the team is also looking at residual edit thresholds and measurement underweighting as tuning tools. Tuning analysis is presented with open loop Monte-Carlo simulation results showing statistical errors bounded by the 3-sigma filter uncertainty covariance. The Orion filter design uses 24 Exponentially Correlated Random Variable (ECRV) parameters to estimate the accel/gyro misalignment and nonorthogonality. By design, the time constant and noise terms of these ECRV parameters were set to manufacturer specifications and not used as tuning parameters. They are included in the filter as a more analytically correct method of modeling uncertainties than ad-hoc tuning of the process noise.
Tuning is explored for the powered-flight ascent phase, where measurements are scarce and unmodelled vehicle accelerations dominate. On orbit, there are important trade-off cases between process and measurement noise. On entry, there are considerations about trading performance accuracy for robustness. Process noise is divided into powered flight and coasting flight and can be adjusted for each phase and mode of the Orion EFT-1 mission. Measurement noise is used for the integrated velocity measurements during pad alignment. It is also used for Global Positioning System (GPS) pseudorange and delta-range measurements during the rest of the flight. The robustness effort has been focused on maintaining filter convergence and performance in the presence of unmodeled error sources. These include unmodeled forces on the vehicle and uncorrected errors on the sensor measurements. Orion uses a single-frequency, non-keyed GPS receiver, so the effects due to signal distortion in Earth's ionosphere and troposphere are present in the raw measurements. Results are presented showing the efforts to compensate for these errors as well as characterize the residual effect for measurement noise tuning. Another robustness tool in use is tuning the residual edit thresholds. The trade-off between noise tuning and edit thresholds is explored in the context of robustness to errors in dynamics models and sensor measurements. Measurement underweighting is also presented as a method of additional robustness when processing highly accurate measurements in the presence of large filter uncertainties.
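Measurement underweighting can be illustrated with a minimal sketch (a generic textbook-style formulation, not the Orion flight software; the factor name k_uw and the scalar example are assumptions). The idea is to inflate the predicted-measurement term of the innovation covariance so that a very accurate measurement does not collapse a large prior covariance in a single update:

```python
import numpy as np

def underweighted_update(x, P, z, H, R, k_uw=0.0):
    """Kalman measurement update with underweighting.

    The innovation covariance is inflated to (1 + k_uw) * H P H' + R,
    which shrinks the gain and slows covariance collapse when a highly
    accurate measurement meets a large prior uncertainty.
    k_uw = 0 recovers the standard update.
    """
    H = np.atleast_2d(H)
    S = (1.0 + k_uw) * H @ P @ H.T + R      # inflated innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # underweighted gain
    x_new = x + K @ (z - H @ x)
    P_new = P - K @ S @ K.T                 # covariance decreases more slowly
    return x_new, P_new

# toy example: large prior uncertainty, very accurate measurement
x = np.array([0.0]); P = np.array([[100.0]])
z = np.array([10.0]); H = np.array([[1.0]]); R = np.array([[0.01]])
x_std, P_std = underweighted_update(x, P, z, H, R, k_uw=0.0)
x_uw,  P_uw  = underweighted_update(x, P, z, H, R, k_uw=1.0)
```

In the toy case, the standard update jumps almost all the way to the measurement, while the underweighted update moves only part-way and retains a much larger posterior covariance, which is the desired conservative behavior.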
Electromagnetic pulsed thermography for natural cracks inspection
Gao, Yunlai; Tian, Gui Yun; Wang, Ping; Wang, Haitao; Gao, Bin; Woo, Wai Lok; Li, Kongjing
2017-01-01
Emerging integrated sensing and monitoring of material degradation and cracks are increasingly required for characterizing the structural integrity and safety of infrastructure. However, most conventional nondestructive evaluation (NDE) methods are based on single-modality sensing, which is not adequate to evaluate structural integrity and natural cracks. This paper proposes electromagnetic pulsed thermography for fast and comprehensive defect characterization. It combines multiple physical phenomena, i.e., magnetic flux leakage, induced eddy currents, and induction heating, linked to the underlying physics, together with signal processing algorithms to provide abundant information on material properties and defects. New features are proposed using the first derivative, which reflects multiphysics spatial and temporal behaviors, to enhance the detection of cracks with different orientations. Promising results, robust to lift-off changes and yielding invariant features for artificial and natural crack detection, demonstrate that the proposed method significantly improves defect detectability. It opens up multiphysics sensing and integrated NDE with potential impact on the understanding and quantitative evaluation of natural cracks, including stress corrosion cracking (SCC) and rolling contact fatigue (RCF). PMID:28169361
Translational Medicine Guide transforms drug development processes: the recent Merck experience.
Dolgos, Hugues; Trusheim, Mark; Gross, Dietmar; Halle, Joern-Peter; Ogden, Janet; Osterwalder, Bruno; Sedman, Ewen; Rossetti, Luciano
2016-03-01
Merck is implementing a question-based Translational Medicine Guide (TxM Guide) beginning as early as lead optimization into its stage-gate drug development process. Initial experiences with the TxM Guide, which is embedded into an integrated development plan tailored to each development program, demonstrated opportunities to improve target understanding, dose setting (i.e., therapeutic index), and patient subpopulation selection with more robust and relevant early human-based evidence, and increased use of biomarkers and simulations. The TxM Guide is also helping improve organizational learning, costs, and governance. It has also shown the need for stronger external resources for validating biomarkers, demonstrating clinical utility, tracking natural disease history, and biobanking. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Yu, Jue; Zhuang, Jian; Yu, Dehong
2015-01-01
This paper concerns a state feedback integral control using a Lyapunov function approach for a rotary direct drive servo valve (RDDV) while considering parameter uncertainties. Modeling of this RDDV servovalve reveals that its mechanical performance is deeply influenced by friction torques and flow torques; however, these torques are uncertain and mutable due to the nature of fluid flow. To overcome load disturbances and to achieve satisfactory position responses, this paper develops a state feedback control that integrates an integral action and a Lyapunov function. The integral action is introduced to address the nonzero steady-state error; in particular, the Lyapunov function is employed to improve control robustness by adjusting the varying parameters within their value ranges. This new controller also has the advantages of simple structure and ease of implementation. Simulation and experimental results demonstrate that the proposed controller can achieve higher control accuracy and stronger robustness. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
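The role of the integral action can be sketched with a toy second-order plant under constant load disturbance (illustrative dynamics and hand-picked gains, not the RDDV model or the Lyapunov-based gain adjustment from the paper). Plain state feedback would leave a steady-state offset under the disturbance; the added integrator drives the tracking error to zero:

```python
# toy plant: x1' = x2, x2' = -a0*x1 - a1*x2 + b*u + d  (d: load disturbance)
a0, a1, b = 2.0, 3.0, 1.0
d = 0.5                        # constant load disturbance (e.g. friction torque)
k1, k2, ki = 10.0, 5.0, 20.0   # state feedback + integral gains (hand-tuned)
r = 1.0                        # position reference

def simulate(T=20.0, dt=1e-3):
    x1 = x2 = q = 0.0
    for _ in range(int(T / dt)):
        u = -k1 * x1 - k2 * x2 + ki * q   # state feedback with integral action
        x1 += dt * x2                     # forward-Euler integration
        x2 += dt * (-a0 * x1 - a1 * x2 + b * u + d)
        q += dt * (r - x1)                # integrator state on tracking error
    return x1

final_position = simulate()
```

The closed-loop characteristic polynomial s^3 + 8s^2 + 12s + 20 is Hurwitz for these gains, so the position converges to the reference despite the unmodeled constant torque, which is the nonzero steady-state error mechanism the abstract describes.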
Evolving Maturation of the Series-Bosch System
NASA Technical Reports Server (NTRS)
Stanley, Christine; Abney, Morgan B.; Barnett, Bill
2017-01-01
Human exploration missions to Mars and other destinations beyond low Earth orbit require highly robust, reliable, and maintainable life support systems that maximize recycling of water and oxygen. In order to meet this requirement, NASA has continued the development of a Series-Bosch System, a two-stage reactor process that reduces carbon dioxide (CO2) with hydrogen (H2) to produce water and solid carbon. Theoretically, the Bosch process can recover 100% of the oxygen (O2) from CO2 in the form of water, making it an attractive option for long duration missions. The Series-Bosch system includes a reverse water gas shift (RWGS) reactor, a carbon formation reactor (CFR), an H2 extraction membrane, and a CO2 extraction membrane. In 2016, the results of integrated testing of the Series-Bosch system showed great promise and resulted in design modifications to the CFR to further improve performance. This year, integrated testing was conducted with the modified reactor to evaluate its performance and compare it with the performance of the previous configuration. Additionally, a CFR with the capability to load new catalyst and remove spent catalyst in-situ was built. Flow demonstrations were performed to evaluate both the catalyst loading and removal process and the hardware performance. The results of the integrated testing with the modified CFR as well as the flow demonstrations are discussed in this paper.
Ultra-thin solid oxide fuel cells: Materials and devices
NASA Astrophysics Data System (ADS)
Kerman, Kian
Solid oxide fuel cells are electrochemical energy conversion devices utilizing solid electrolytes transporting O2- that typically operate in the 800 -- 1000 °C temperature range due to the large activation barrier for ionic transport. Reducing electrolyte thickness or increasing ionic conductivity can enable lower temperature operation for both stationary and portable applications. This thesis is focused on the fabrication of free standing ultrathin (<100 nm) oxide membranes of prototypical O2- conducting electrolytes, namely Y2O3-doped ZrO2 and Gd2O3-doped CeO2. Fabrication of such membranes requires an understanding of thin plate mechanics coupled with controllable thin film deposition processes. Integration of free standing membranes into proof-of-concept fuel cell devices necessitates ideal electrode assemblies as well as creative processing schemes to experimentally test devices in a high temperature dual environment chamber. We present a simple elastic model to determine stable buckling configurations for free standing oxide membranes. This guides the experimental methodology for Y2O3-doped ZrO2 film processing, which enables tunable internal stress in the films. Using these criteria, we fabricate robust Y2O3-doped ZrO2 membranes on Si and composite polymeric substrates by semiconductor and micro-machining processes, respectively. Fuel cell devices integrating these membranes with metallic electrodes are demonstrated to operate in the 300 -- 500 °C range, exhibiting record performance at such temperatures. A model combining physical transport of electronic carriers in an insulating film and electrochemical aspects of transport is developed to determine the limits of performance enhancement expected via electrolyte thickness reduction. Free standing oxide heterostructures, i.e. electrolyte membrane and oxide electrodes, are demonstrated.
Lastly, using Y2O3-doped ZrO2 and Gd2O3-doped CeO2, novel electrolyte fabrication schemes are explored to develop oxide alloys and nanoscale compositionally graded membranes that are thermomechanically robust and provide added interfacial functionality. The work in this thesis advances experimental state-of-the-art with respect to solid oxide fuel cell operation temperature, provides fundamental boundaries expected for ultrathin electrolytes, develops the ability to integrate highly dissimilar material (such as oxide-polymer) heterostructures, and introduces nanoscale compositionally graded electrolyte membranes that can lead to monolithic materials having multiple functionalities.
Neural circuit mechanisms of short-term memory
NASA Astrophysics Data System (ADS)
Goldman, Mark
Memory over time scales of seconds to tens of seconds is thought to be maintained by neural activity that is triggered by a memorized stimulus and persists long after the stimulus is turned off. This presents a challenge to current models of memory-storing mechanisms, because the typical time scales associated with cellular and synaptic dynamics are two orders of magnitude smaller than this. While such long time scales can easily be achieved by bistable processes that toggle like a flip-flop between a baseline and elevated-activity state, many neuronal systems have been observed experimentally to be capable of maintaining a continuum of stable states. For example, in neural integrator networks involved in the accumulation of evidence for decision making and in motor control, individual neurons have been recorded whose activity reflects the mathematical integral of their inputs; in the absence of input, these neurons sustain activity at a level proportional to the running total of their inputs. This represents an analog form of memory whose dynamics can be conceptualized through an energy landscape with a continuum of lowest-energy states. Such continuous attractor landscapes are structurally non-robust, in seeming violation of the relative robustness of biological memory systems. In this talk, I will present and compare different biologically motivated circuit motifs for the accumulation and storage of signals in short-term memory. Challenges to generating robust memory maintenance will be highlighted and potential mechanisms for ameliorating the sensitivity of memory networks to perturbations will be discussed. Funding for this work was provided by NIH R01 MH065034, NSF IIS-1208218, Simons Foundation 324260, and a UC Davis Ophthalmology Research to Prevent Blindness Grant.
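The tuning sensitivity of continuous attractor integrators can be shown with a minimal rate-model sketch (a standard single-unit caricature of the network mechanism, not a model from the talk; the parameter values are illustrative). Recurrent feedback with weight w = 1 exactly cancels the intrinsic leak, turning a 100 ms cellular time constant into a perfect integrator; a 2% mistuning of the feedback makes the stored value decay on a tau/(1-w) = 5 s time scale:

```python
tau = 0.1  # intrinsic neuronal/synaptic time constant, 100 ms

def run(w, pulse=1.0, T=10.0, dt=1e-3):
    """Rate model tau * dr/dt = -r + w*r + I(t).
    A brief input pulse is integrated; what happens afterwards
    depends entirely on how precisely w is tuned to 1."""
    r = 0.0
    for i in range(int(T / dt)):
        I = pulse if i * dt < 0.1 else 0.0   # 100 ms input pulse to remember
        r += dt / tau * ((w - 1.0) * r + I)  # forward-Euler step
    return r

r_tuned = run(w=1.0)      # leak exactly cancelled: activity persists
r_mistuned = run(w=0.98)  # 2% mistuning: memory drifts back to baseline
```

The tuned network holds the integrated pulse for the full 10 s, while the mistuned one loses most of it, illustrating why continuous attractor landscapes are structurally non-robust and why additional robustness mechanisms are needed.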
Precise Aperture-Dependent Motion Compensation with Frequency Domain Fast Back-Projection Algorithm.
Zhang, Man; Wang, Guanyong; Zhang, Lei
2017-10-26
Precise azimuth-variant motion compensation (MOCO) is an essential and difficult task for high-resolution synthetic aperture radar (SAR) imagery. In conventional post-filtering approaches, residual azimuth-variant motion errors are generally compensated through a set of spatial post-filters, where the coarse-focused image is segmented into overlapped blocks according to the azimuth-dependent residual errors. However, image domain post-filtering approaches, such as the precise topography- and aperture-dependent motion compensation algorithm (PTA), suffer from declining robustness when strong motion errors are present in the coarse-focused image. In this case, in order to capture the complete motion blurring function within each image block, both the block size and the overlapped part must be extended, inevitably degrading efficiency and robustness. Herein, a frequency domain fast back-projection algorithm (FDFBPA) is introduced to deal with strong azimuth-variant motion errors. FDFBPA disposes of the azimuth-variant motion errors based on a precise azimuth spectrum expression in the azimuth wavenumber domain. First, a wavenumber domain sub-aperture processing strategy is introduced to accelerate computation. After that, the azimuth wavenumber spectrum is partitioned into a set of wavenumber blocks, and each block is formed into a sub-aperture coarse resolution image via the back-projection integral. Then, the sub-aperture images are straightforwardly fused together in the azimuth wavenumber domain to obtain a full resolution image. Moreover, the chirp-Z transform (CZT) is introduced to implement the sub-aperture back-projection integral, increasing the efficiency of the algorithm. By discarding the image domain post-filtering strategy, the robustness of the proposed algorithm is improved. Both simulation and real-measured data experiments demonstrate the effectiveness and superiority of the proposed method.
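The chirp-Z transform the abstract leans on can be sketched with the standard Bluestein formulation (a generic textbook implementation, not the paper's FDFBPA code). The CZT evaluates the z-transform on a spiral of M points, which lets a processing chain sample a chosen band of the spectrum more finely than a plain FFT, at FFT-like cost:

```python
import numpy as np

def czt(x, M, w, a):
    """Chirp-Z transform X[k] = sum_n x[n] * a^(-n) * w^(n*k), k = 0..M-1,
    via Bluestein's identity n*k = (n^2 + k^2 - (k - n)^2) / 2, which turns
    the sum into a linear convolution computed with three FFTs."""
    N = len(x)
    n = np.arange(N)
    k = np.arange(M)
    L = 1 << int(np.ceil(np.log2(N + M - 1)))   # FFT length for linear convolution
    A = np.fft.fft(x * a ** (-n) * w ** (n**2 / 2.0), L)
    m = np.arange(-(N - 1), M)                  # chirp filter support
    B = np.fft.fft(w ** (-(m**2) / 2.0), L)
    conv = np.fft.ifft(A * B)[N - 1 : N - 1 + M]
    return conv * w ** (k**2 / 2.0)

# sanity check: with a = 1 and w = exp(-2j*pi/N), the CZT reduces to the DFT
x = np.random.default_rng(0).standard_normal(32)
ref = np.fft.fft(x)
out = czt(x, 32, np.exp(-2j * np.pi / 32), 1.0)
```

Choosing w with a smaller phase step and a suitable starting point a zooms the evaluation onto a narrow wavenumber block, which is the property exploited when forming each sub-aperture image.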
NASA Astrophysics Data System (ADS)
Lu, Mark; Liang, Curtis; King, Dion; Melvin, Lawrence S., III
2005-11-01
Model-based optical proximity correction (OPC) has become an indispensable tool for achieving wafer pattern to design fidelity at current manufacturing process nodes. Most model-based OPC is performed considering the nominal process condition, with limited consideration of through-process manufacturing robustness. This study examines the use of off-target process models - models that represent non-nominal process states such as would occur with a dose or focus variation - to understand and manipulate the final pattern correction toward a more process-robust configuration. The study will first examine and validate the process of generating an off-target model, then examine the quality of the off-target model. Once the off-target model is proven, it will be used to demonstrate methods of generating process-robust corrections. The concepts are demonstrated using a 0.13 μm logic gate process. Preliminary indications show success in both off-target model production and process-robust corrections. With these off-target models as tools, mask production cycle times can be reduced.
Using explanatory crop models to develop simple tools for Advanced Life Support system studies
NASA Technical Reports Server (NTRS)
Cavazzoni, J.
2004-01-01
System-level analyses for Advanced Life Support require mathematical models for various processes, such as for biomass production and waste management, which would ideally be integrated into overall system models. Explanatory models (also referred to as mechanistic or process models) would provide the basis for a more robust system model, as these would be based on an understanding of specific processes. However, implementing such models at the system level may not always be practicable because of their complexity. For the area of biomass production, explanatory models were used to generate parameters and multivariable polynomial equations for basic models that are suitable for estimating the direction and magnitude of daily changes in canopy gas-exchange, harvest index, and production scheduling for both nominal and off-nominal growing conditions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.
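The surrogate-model idea can be sketched as follows (a minimal illustration; the stand-in "mechanistic" function, its parameters, and the variable ranges are invented for the example and are not the crop models from the paper). An explanatory model is sampled over the operating envelope and a multivariable polynomial is fitted by least squares, giving a cheap model suitable for system-level studies:

```python
import numpy as np

rng = np.random.default_rng(1)

def canopy_gas_exchange(ppf, co2):
    """Toy stand-in for an explanatory crop-model output (illustrative only):
    light-saturating response times a CO2 saturation term."""
    return 30.0 * (1.0 - np.exp(-0.002 * ppf)) * co2 / (co2 + 400.0)

# sample the 'mechanistic' model over a nominal operating envelope
ppf = rng.uniform(200.0, 1000.0, 200)   # photosynthetic photon flux
co2 = rng.uniform(330.0, 1300.0, 200)   # CO2 concentration
y = canopy_gas_exchange(ppf, co2)

# fit a multivariable quadratic surrogate by linear least squares
X = np.column_stack([np.ones_like(ppf), ppf, co2, ppf * co2, ppf**2, co2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
```

Because the underlying response is smooth over the sampled envelope, the quadratic surrogate reproduces it closely, which is why such polynomial reductions can stand in for the full explanatory model in system-level trade studies.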
Second International Conference on Accelerating Biopharmaceutical Development
2009-01-01
The Second International Conference on Accelerating Biopharmaceutical Development was held in Coronado, California. The meeting was organized by the Society for Biological Engineering (SBE) and the American Institute of Chemical Engineers (AIChE); SBE is a technological community of the AIChE. Bob Adamson (Wyeth) and Chuck Goochee (Centocor) were co-chairs of the event, which had the theme “Delivering cost-effective, robust processes and methods quickly and efficiently.” The first day focused on emerging disruptive technologies and cutting-edge analytical techniques. Day two featured presentations on accelerated cell culture process development, critical quality attributes, specifications and comparability, and high throughput protein formulation development. The final day was dedicated to discussion of technology options and new analysis methods provided by emerging disruptive technologies; functional interaction, integration and synergy in platform development; and rapid and economic purification process development. PMID:20065637
Borges, F S; Protachevicz, P R; Lameu, E L; Bonetti, R C; Iarosz, K C; Caldas, I L; Baptista, M S; Batista, A M
2017-06-01
We have studied neuronal synchronisation in a random network of adaptive exponential integrate-and-fire neurons. We study how spiking or bursting synchronous behaviour appears as a function of the coupling strength and the probability of connections, by constructing parameter spaces that identify these synchronous behaviours from measurements of the inter-spike interval and the calculation of the order parameter. Moreover, we verify the robustness of synchronisation by applying an external perturbation to each neuron. The simulations show that bursting synchronisation is more robust than spike synchronisation. Copyright © 2017 Elsevier Ltd. All rights reserved.
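A single adaptive exponential integrate-and-fire neuron can be sketched in a few lines (a minimal forward-Euler simulation with the standard Brette-Gerstner parameter set; the drive current and simulation settings are illustrative, and network coupling and the synchronisation measures from the paper are omitted):

```python
import numpy as np

# AdEx parameters, standard Brette-Gerstner values (units: pF, nS, mV, ms, pA)
C, gL, EL, VT, DT = 281.0, 30.0, -70.6, -50.4, 2.0
tau_w, a, b, Vr, Vpeak = 144.0, 4.0, 80.5, -70.6, 20.0

def adex_spike_count(I=1000.0, T=500.0, dt=0.05):
    """Integrate the AdEx equations with forward Euler and count spikes.
    The exponent is clipped to avoid overflow near threshold crossing."""
    V, w, spikes = EL, 0.0, 0
    for _ in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * np.exp(min((V - VT) / DT, 20.0))
              - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= Vpeak:          # spike: reset membrane, increment adaptation
            V = Vr
            w += b
            spikes += 1
    return spikes

n_spikes = adex_spike_count()
```

With a suprathreshold drive the neuron fires tonically while the adaptation current w slows the rate toward a steady value; in the paper's setting, many such units are coupled in a random network and the inter-spike intervals are used to classify spiking versus bursting synchronisation.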
On-Chip Waveguide Coupling of a Layered Semiconductor Single-Photon Source.
Tonndorf, Philipp; Del Pozo-Zamudio, Osvaldo; Gruhler, Nico; Kern, Johannes; Schmidt, Robert; Dmitriev, Alexander I; Bakhtinov, Anatoly P; Tartakovskii, Alexander I; Pernice, Wolfram; Michaelis de Vasconcellos, Steffen; Bratschitsch, Rudolf
2017-09-13
Fully integrated quantum technology based on photons is in the focus of current research, because of its immense potential concerning performance and scalability. Ideally, the single-photon sources, the processing units, and the photon detectors are all combined on a single chip. Impressive progress has been made for on-chip quantum circuits and on-chip single-photon detection. In contrast, nonclassical light is commonly coupled onto the photonic chip from the outside, because at present only a few integrated single-photon sources exist. Here, we present waveguide-coupled single-photon emitters in the layered semiconductor gallium selenide as promising on-chip sources. GaSe crystals with a thickness below 100 nm are placed on Si3N4 rib or slot waveguides, resulting in a modified mode structure efficient for light coupling. Using optical excitation from within the Si3N4 waveguide, we find nonclassicality of generated photons routed on the photonic chip. Thus, our work provides an easy-to-implement and robust light source for integrated quantum technology.
Kang, Dong-Ku; Ali, M. Monsur; Zhang, Kaixiang; Huang, Susan S.; Peterson, Ellena; Digman, Michelle A.; Gratton, Enrico; Zhao, Weian
2014-01-01
Blood stream infection or sepsis is a major health problem worldwide, with extremely high mortality, which is partly due to the inability to rapidly detect and identify bacteria in the early stages of infection. Here we present a new technology termed ‘Integrated Comprehensive Droplet Digital Detection’ (IC 3D) that can selectively detect bacteria directly from milliliters of diluted blood at single-cell sensitivity in a one-step, culture- and amplification-free process within 1.5–4 h. The IC 3D integrates real-time, DNAzyme-based sensors, droplet microencapsulation and a high-throughput 3D particle counter system. Using Escherichia coli as a target, we demonstrate that the IC 3D can provide absolute quantification of both stock and clinical isolates of E. coli in spiked blood within a broad range of extremely low concentration from 1 to 10,000 bacteria per ml with exceptional robustness and limit of detection in the single digit regime. PMID:25391809
Using a mixed-methods design to examine nurse practitioner integration in British Columbia.
Sangster-Gormley, Esther; Griffith, Janessa; Schreiber, Rita; Borycki, Elizabeth
2015-07-01
To discuss and provide examples of how mixed-methods research was used to evaluate the integration of nurse practitioners (NPs) into a Canadian province. Legislation enabling NPs to practise in British Columbia (BC) was enacted in 2005. This research evaluated the integration of NPs and their effect on the BC healthcare system. Data were collected using surveys, focus groups, participant interviews and case studies over three years. Data sources and methods were triangulated to determine how the findings addressed the research questions. The challenges and benefits of using the multiphase design are highlighted in the paper. The multiphase mixed-methods research design was selected because of its applicability to evaluation research. The design proved to be robust and flexible in answering research questions. As sub-studies within the multiphase design are often published separately, it can be difficult for researchers to find examples. This paper highlights ways that a multiphase mixed-methods design can be conducted for researchers unfamiliar with the process.
Silica Nanowires: Growth, Integration, and Sensing Applications
Kaushik, Ajeet; Kumar, Rajesh; Huey, Eric; Bhansali, Shekhar; Nair, Narayana; Nanir, Madhavan
2014-01-01
This review (with 129 refs.) gives an overview on how the integration of silica nanowires (NWs) into micro-scale devices has resulted, in recent years, in simple yet robust nano-instrumentation with improved performance in targeted application areas such as sensing. This has been achieved by the use of appropriate techniques such as di-electrophoresis and direct vapor-liquid-growth phenomena, to restrict the growth of NWs to site-specific locations. This also has eliminated the need for post-growth processing and enables nanostructures to be placed on pre-patterned substrates. Various kinds of NWs have been investigated to determine how their physical and chemical properties can be tuned for integration into sensing structures. NWs integrated onto interdigitated micro-electrodes have been applied to the determination of gases and biomarkers. The technique of directly growing NWs eliminates the need for their physical transfer and thus preserves their structure and performance, and further reduces the costs of fabrication. The biocompatibility of NWs also has been studied with respect to possible biological applications. This review addresses the challenges in the growth and integration of NWs to understand the related mechanisms of biological contact or gas exposure and the sensing performance for personalized health and environmental monitoring. PMID:25382871
Does Temporal Integration Occur for Unrecognizable Words in Visual Crowding?
Zhou, Jifan; Lee, Chia-Lin; Li, Kuei-An; Tien, Yung-Hsuan; Yeh, Su-Ling
2016-01-01
Visual crowding—the inability to see an object when it is surrounded by flankers in the periphery—does not block semantic activation: unrecognizable words due to visual crowding still generated robust semantic priming in subsequent lexical decision tasks. Based on the previous finding, the current study further explored whether unrecognizable crowded words can be temporally integrated into a phrase. By showing one word at a time, we presented Chinese four-word idioms with either a congruent or incongruent ending word in order to examine whether the three preceding crowded words can be temporally integrated to form a semantic context so as to affect the processing of the ending word. Results from both behavioral (Experiment 1) and Event-Related Potential (Experiment 2 and 3) measures showed congruency effect in only the non-crowded condition, which does not support the existence of unconscious multi-word integration. Aside from four-word idioms, we also found that two-word (modifier + adjective combination) integration—the simplest kind of temporal semantic integration—did not occur in visual crowding (Experiment 4). Our findings suggest that integration of temporally separated words might require conscious awareness, at least under the timing conditions tested in the current study. PMID:26890366
Liu, Xiaodong; Huang, Wanwei; Du, Lifu
2017-01-01
A new robust three-dimensional integrated guidance and control (3D-IGC) approach is investigated for a skid-to-turn (STT) hypersonic missile, which encounters high uncertainties and strict impact angle constraints. First, a nonlinear state-space model with more generality is established for the design of the 3D-IGC law. For this nonlinear system, a robust dynamic inversion control (RDIC) approach is proposed to overcome the robustness deficiency of traditional DIC, and it is then applied to construct the basic 3D-IGC law in combination with the backstepping method. In order to avoid the problems of "explosion of terms" and high-frequency chattering, an improved 3D-IGC law is further proposed by introducing dynamic surface control and continuous approximation approaches. In computer simulations of a hypersonic missile, the proposed 3D-IGC law not only guarantees stable flight, but also achieves precise control of terminal location and impact angle. Moreover, it possesses smooth control output and strong robustness. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
Active Fault Tolerant Control for Ultrasonic Piezoelectric Motor
NASA Astrophysics Data System (ADS)
Boukhnifer, Moussa
2012-07-01
Ultrasonic piezoelectric motor technology is an important system component in integrated mechatronics devices working under extreme operating conditions. Due to these constraints, robustness and performance of the control interfaces should be taken into account in the motor design. In this paper, we apply a new architecture for fault tolerant control using Youla parameterization for an ultrasonic piezoelectric motor. The distinguishing feature of the proposed controller architecture is that it shows structurally how the controller design for performance and robustness may be done separately, which has the potential to overcome the conflict between performance and robustness in the traditional feedback framework. A fault tolerant control architecture includes two parts: one part for performance and the other part for robustness. The controller design works in such a way that the feedback control system will be solely controlled by the proportional plus double-integral controller.
Conway, Sarah J; Himmelrich, Sarah; Feeser, Scott A; Flynn, John A; Kravet, Steven J; Bailey, Jennifer; Hebert, Lindsay C; Donovan, Susan H; Kachur, Sarah G; Brown, Patricia M C; Baumgartner, William A; Berkowitz, Scott A
2018-02-02
Accountable Care Organizations (ACOs), like other care entities, must be strategic about which initiatives they support in the quest for higher value. This article reviews the current strategic planning process for the Johns Hopkins Medicine Alliance for Patients (JMAP), a Medicare Shared Savings Program Track 1 ACO. It reviews the 3 focus areas for the 2017 strategic review process - (1) optimizing care coordination for complex, at-risk patients, (2) post-acute care, and (3) specialty care integration - reviewing cost savings and quality improvement opportunities, associated best practices from the literature, and opportunities to leverage and advance existing ACO and health system efforts in each area. It then reviews the ultimate selection of priorities for the coming year and early thoughts on implementation. After the robust review process, key stakeholders voted to select interventions targeted at care coordination, post-acute care, and specialty integration including Part B drug and imaging costs. The interventions selected incorporate a mixture of enhancing current ACO initiatives, working collaboratively and synergistically on other health system initiatives, and taking on new projects deemed targeted, cost-effective, and manageable in scope. The annual strategic review has been an essential and iterative process based on performance data and informed by the collective experience of other organizations. The process allows for an evidence-based strategic plan for the ACO in pursuit of the best care for patients.
NASA Astrophysics Data System (ADS)
Inam, Azhar; Adamowski, Jan; Prasher, Shiv; Halbe, Johannes; Malard, Julien; Albano, Raffaele
2017-08-01
Effective policies, leading to sustainable management solutions for land and water resources, require a full understanding of interactions between socio-economic and physical processes. However, the complex nature of these interactions, combined with limited stakeholder engagement, hinders the incorporation of socio-economic components into physical models. The present study addresses this challenge by integrating the physical Spatial Agro Hydro Salinity Model (SAHYSMOD) with a participatory group-built system dynamics model (GBSDM) that includes socio-economic factors. A stepwise process to quantify the GBSDM is presented, along with governing equations and model assumptions. Sub-modules of the GBSDM, describing agricultural, economic, water and farm management factors, are linked together with feedbacks and finally coupled with the physically based SAHYSMOD model through commonly used tools (i.e., MS Excel and a Python script). The overall integrated model (GBSDM-SAHYSMOD) can be used to help facilitate the role of stakeholders with limited expertise and resources in model and policy development and implementation. Following the development of the integrated model, a testing methodology was used to validate the structure and behavior of the integrated model. Model robustness under different operating conditions was also assessed. The model structure was able to produce anticipated real behaviours under the tested scenarios, from which it can be concluded that the formulated structures generate the right behaviour for the right reasons.
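The annual exchange between a socio-economic system dynamics model and a physical model can be sketched as a simple co-simulation loop (all function names, equations, and coefficients here are invented for illustration; they are not the SAHYSMOD equations or the actual GBSDM sub-modules, which in the paper are linked through MS Excel and a Python script):

```python
def physical_model(irrigation_mm, salinity):
    """Toy stand-in for the physical model: more irrigation leaches
    more salt from the root zone, against a constant salinization input."""
    leaching = 0.0008 * irrigation_mm
    return max(0.5, salinity * (1.0 - leaching) + 0.3)

def socio_economic_model(salinity):
    """Toy stand-in for the socio-economic sub-modules: yield and income
    fall with salinity, and income in turn sets irrigation demand."""
    yield_frac = min(1.0, max(0.0, 1.0 - 0.08 * (salinity - 2.0)))
    income = 1000.0 * yield_frac
    return 300.0 + 0.4 * income     # irrigation demand, mm/yr

salinity, history = 6.0, []
for year in range(30):              # annual exchange between the two models
    irrigation = socio_economic_model(salinity)
    salinity = physical_model(irrigation, salinity)
    history.append(salinity)
```

The feedback loop (salinity -> yield -> income -> irrigation -> salinity) settles to an equilibrium over the simulated years, which is the kind of coupled socio-physical behavior the integrated GBSDM-SAHYSMOD model is built to explore under policy scenarios.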
Analytical redundancy and the design of robust failure detection systems
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The Failure Detection and Identification (FDI) process is viewed as consisting of two stages: residual generation and decision making. It is argued that a robust FDI system can be achieved by designing a robust residual generation process. Analytical redundancy, the basis for residual generation, is characterized in terms of a parity space. Using the concept of parity relations, residuals can be generated in a number of ways and the design of a robust residual generation process can be formulated as a minimax optimization problem. An example is included to illustrate this design methodology. Previously announced in STAR as N83-20653
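The parity-relation idea can be illustrated with a minimal sketch (not from the paper; all names are hypothetical): with three sensors redundantly measuring the same quantity, each pairwise difference is a parity relation whose residual is zero when no sensor has failed, and the pattern of nonzero residuals isolates the faulty sensor.

```python
import numpy as np

def parity_residuals(measurements):
    """Residuals of simple parity relations y_i - y_j for redundant sensors.

    With no failures (and no noise) every residual is zero; a bias on one
    sensor appears only in the residuals involving that sensor, which is
    the basis for failure detection and identification.
    """
    y = np.asarray(measurements, dtype=float)
    n = len(y)
    return {(i, j): y[i] - y[j] for i in range(n) for j in range(i + 1, n)}

# Three redundant sensors; sensor 2 carries a +0.5 bias failure.
r = parity_residuals([1.0, 1.0, 1.5])
faulty = {pair for pair, v in r.items() if abs(v) > 0.1}
# only the residuals involving sensor 2 exceed the threshold
```

Choosing the threshold (0.1 here) against noise is where the robust, minimax design of the residual generation process enters.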
Advanced metal lift-off process using electron-beam flood exposure of single-layer photoresist
NASA Astrophysics Data System (ADS)
Minter, Jason P.; Ross, Matthew F.; Livesay, William R.; Wong, Selmer S.; Narcy, Mark E.; Marlowe, Trey
1999-06-01
In the manufacture of many types of integrated circuit and thin film devices, it is desirable to use a lift-off process for the metallization step to avoid manufacturing problems encountered when creating metal interconnect structures using plasma etch. These problems include both metal adhesion and plasma etch difficulties. Key to the success of the lift-off process is the creation of a retrograde or undercut profile in the photoresist before the metal deposition step. Until now, lift-off processing has relied on costly multi-layer photoresist schemes, image reversal, and non-repeatable photoresist processes to obtain the desired lift-off profiles in patterned photoresist. This paper presents a simple, repeatable process for creating robust, user-defined lift-off profiles in single-layer photoresist using a non-thermal electron beam flood exposure. For this investigation, lift-off profiles created using electron beam flood exposure of many popular photoresists were evaluated. Results of lift-off profiles created in positive tone AZ7209 and ip3250 are presented here.
Sticking together: building a biofilm the Bacillus subtilis way
Vlamakis, Hera; Chai, Yunrong; Beauregard, Pascale; Losick, Richard; Kolter, Roberto
2014-01-01
Biofilms are ubiquitous communities of tightly associated bacteria encased in an extracellular matrix. Bacillus subtilis has long served as a robust model organism to examine the molecular mechanisms of biofilm formation, and a number of studies have revealed that this process is subject to several integrated regulatory pathways. In this Review, we focus on the molecular mechanisms controlling biofilm assembly and briefly summarize the current state of knowledge regarding their disassembly. We also discuss recent progress that has expanded our understanding of biofilm formation on plant roots, which are a natural habitat for this soil bacterium. PMID:23353768
Sticking together: building a biofilm the Bacillus subtilis way.
Vlamakis, Hera; Chai, Yunrong; Beauregard, Pascale; Losick, Richard; Kolter, Roberto
2013-03-01
Biofilms are ubiquitous communities of tightly associated bacteria encased in an extracellular matrix. Bacillus subtilis has long served as a robust model organism to examine the molecular mechanisms of biofilm formation, and a number of studies have revealed that this process is regulated by several integrated pathways. In this Review, we focus on the molecular mechanisms that control B. subtilis biofilm assembly, and then briefly summarize the current state of knowledge regarding biofilm disassembly. We also discuss recent progress that has expanded our understanding of B. subtilis biofilm formation on plant roots, which are a natural habitat for this soil bacterium.
Treatment wetlands in decentralised approaches for linking sanitation to energy and food security.
Langergraber, Guenter; Masi, Fabio
2018-02-01
Treatment wetlands (TWs) are engineered systems that mimic the processes in natural wetlands with the purpose of treating contaminated water. Being a simple and robust technology, TWs are applied worldwide to treat various types of water. Besides treated water for reuse, TWs can be used in resources-oriented sanitation systems for recovering nutrients and carbon, as well as for growing biomass for energy production. Additionally, TWs provide a large number of ecosystem services. Integrating green infrastructure into urban developments can thus facilitate circular economy approaches and has positive impacts on environment, economy and health.
Intelligent flight control systems
NASA Technical Reports Server (NTRS)
Stengel, Robert F.
1993-01-01
The capabilities of flight control systems can be enhanced by designing them to emulate functions of natural intelligence. Intelligent control functions fall into three categories. Declarative actions involve decision-making, providing models for system monitoring, goal planning, and system/scenario identification. Procedural actions concern skilled behavior and have parallels in guidance, navigation, and adaptation. Reflexive actions are spontaneous, inner-loop responses for control and estimation. Intelligent flight control systems acquire knowledge of the aircraft and its mission and adapt to changes in the flight environment. Cognitive models form an efficient basis for integrating 'outer-loop/inner-loop' control functions and for developing robust parallel-processing algorithms.
Integrating security in a group oriented distributed system
NASA Technical Reports Server (NTRS)
Reiter, Michael; Birman, Kenneth; Gong, Li
1992-01-01
A distributed security architecture is proposed for incorporation into group oriented distributed systems, and in particular, into the Isis distributed programming toolkit. The primary goal of the architecture is to make common group oriented abstractions robust in hostile settings, in order to facilitate the construction of high performance distributed applications that can tolerate both component failures and malicious attacks. These abstractions include process groups and causal group multicast. Moreover, a delegation and access control scheme is proposed for use in group oriented systems. The focus is the security architecture; particular cryptosystems and key exchange protocols are not emphasized.
NASA Technical Reports Server (NTRS)
Hanson, Curt
2009-01-01
The NASA F/A-18 tail number (TN) 853 full-scale Integrated Resilient Aircraft Control (IRAC) testbed has been designed with a full array of capabilities in support of the Aviation Safety Program. Highlights of the system's capabilities include: 1) a quad-redundant research flight control system for safely interfacing controls experiments to the aircraft's control surfaces; 2) a dual-redundant airborne research test system for hosting multi-disciplinary state-of-the-art adaptive control experiments; 3) a robust reversionary configuration for recovery from unusual attitudes and configurations; 4) significant research instrumentation, particularly in the area of static loads; 5) extensive facilities for experiment simulation, data logging, real-time monitoring and post-flight analysis capabilities; and 6) significant growth capability in terms of interfaces and processing power.
Slow dynamics in translation-invariant quantum lattice models
NASA Astrophysics Data System (ADS)
Michailidis, Alexios A.; Žnidarič, Marko; Medvedyeva, Mariya; Abanin, Dmitry A.; Prosen, Tomaž; Papić, Z.
2018-03-01
Many-body quantum systems typically display fast dynamics and ballistic spreading of information. Here we address the open problem of how slow the dynamics can be after a generic breaking of integrability by local interactions. We develop a method based on degenerate perturbation theory that reveals slow dynamical regimes and delocalization processes in general translation invariant models, along with accurate estimates of their delocalization time scales. Our results shed light on the fundamental questions of the robustness of quantum integrable systems and the possibility of many-body localization without disorder. As an example, we construct a large class of one-dimensional lattice models where, despite the absence of asymptotic localization, the transient dynamics is exceptionally slow, i.e., the dynamics is indistinguishable from that of many-body localized systems for the system sizes and time scales accessible in experiments and numerical simulations.
Ultrastable assembly and integration technology for ground- and space-based optical systems.
Ressel, Simon; Gohlke, Martin; Rauen, Dominik; Schuldt, Thilo; Kronast, Wolfgang; Mescheder, Ulrich; Johann, Ulrich; Weise, Dennis; Braxmaier, Claus
2010-08-01
Optical metrology systems crucially rely on the dimensional stability of the optical path between their individual optical components. We present in this paper a novel adhesive bonding technology for setup of quasi-monolithic systems and compare selected characteristics to the well-established state-of-the-art technique of hydroxide-catalysis bonding. It is demonstrated that within the measurement resolution of our ultraprecise custom heterodyne interferometer, both techniques achieve an equivalent passive path length and tilt stability for time scales between 0.1 mHz and 1 Hz. Furthermore, the robustness of the adhesive bonds against mechanical and thermal inputs has been tested, making this new bonding technique in particular a potential option for interferometric applications in future space missions. The integration process itself is eased by long time scales for alignment, as well as short curing times.
640-Gbit/s fast physical random number generation using a broadband chaotic semiconductor laser
NASA Astrophysics Data System (ADS)
Zhang, Limeng; Pan, Biwei; Chen, Guangcan; Guo, Lu; Lu, Dan; Zhao, Lingjuan; Wang, Wei
2017-04-01
An ultra-fast physical random number generator is demonstrated utilizing a photonic integrated device based broadband chaotic source with a simple post data processing method. The compact chaotic source is implemented by using a monolithic integrated dual-mode amplified feedback laser (AFL) with self-injection, where a robust chaotic signal with RF frequency coverage of above 50 GHz and flatness of ±3.6 dB is generated. By using 4-least significant bits (LSBs) retaining from the 8-bit digitization of the chaotic waveform, random sequences with a bit-rate up to 640 Gbit/s (160 GS/s × 4 bits) are realized. The generated random bits have passed each of the fifteen NIST statistics tests (NIST SP800-22), indicating its randomness for practical applications.
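The 4-LSB post-processing step described above can be sketched as follows (a schematic of the idea, not the authors' hardware implementation): each 8-bit digitized sample of the chaotic waveform keeps only its four least significant bits, discarding the amplitude-correlated high bits, so 160 GS/s of 4-bit samples yields 640 Gbit/s.

```python
def lsb_extract(samples, n_bits=4):
    """Keep the n least significant bits of each 8-bit sample, most
    significant of the retained bits first, and concatenate them into
    a bit sequence."""
    bits = []
    for s in samples:
        for k in range(n_bits - 1, -1, -1):
            bits.append((s >> k) & 1)
    return bits

# two 8-bit chaotic samples -> eight random bits (low nibbles 0101, 1111)
bits = lsb_extract([0b10110101, 0b00001111])
```

Whether the retained bits are statistically random must still be verified against a suite such as NIST SP800-22, as the abstract reports.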
Casal-Campos, Arturo; Fu, Guangtao; Butler, David; Moore, Andrew
2015-07-21
The robustness of a range of watershed-scale "green" and "gray" drainage strategies in the future is explored through comprehensive modeling of a fully integrated urban wastewater system case. Four socio-economic future scenarios, defined by parameters affecting the environmental performance of the system, are proposed to account for the uncertain variability of conditions in the year 2050. A regret-based approach is applied to assess the relative performance of strategies in multiple impact categories (environmental, economic, and social) as well as to evaluate their robustness across future scenarios. The concept of regret proves useful in identifying performance trade-offs and recognizing states of the world most critical to decisions. The study highlights the robustness of green strategies (particularly rain gardens, resulting in half the regret of most options) over end-of-pipe gray alternatives (surface water separation or sewer and storage rehabilitation), which may be costly (on average, 25% of the total regret of these options) and tend to focus on sewer flooding and CSO alleviation while compromising on downstream system performance (this accounts for around 50% of their total regret). Trade-offs and scenario regrets observed in the analysis suggest that the combination of green and gray strategies may still offer further potential for robustness.
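The regret calculation underlying this comparison can be sketched with illustrative numbers (hypothetical, not the study's data): in each scenario, a strategy's regret is its shortfall relative to the best-performing strategy in that scenario, and robustness is judged by the worst or total regret across scenarios.

```python
def regrets(costs):
    """costs[strategy] -> list of costs per scenario.
    Regret = cost minus the minimum cost any strategy achieves in that
    scenario; a robust strategy keeps regret low in every scenario."""
    n_scen = len(next(iter(costs.values())))
    best = [min(c[s] for c in costs.values()) for s in range(n_scen)]
    return {name: [c[s] - best[s] for s in range(n_scen)]
            for name, c in costs.items()}

# hypothetical impact scores across four 2050 scenarios
tbl = regrets({
    "rain_gardens": [10, 12, 11, 13],
    "sewer_rehab":  [9, 20, 18, 25],
})
worst = {name: max(v) for name, v in tbl.items()}
# the green strategy is never far from optimal; the gray one is cheap in
# one scenario but incurs large regret elsewhere
```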
Aucamp, Jean P; Davies, Richard; Hallet, Damien; Weiss, Amanda; Titchener-Hooker, Nigel J
2014-01-01
An ultra scale-down primary recovery sequence was established for a platform E. coli Fab production process. It was used to evaluate the process robustness of various bioengineered strains. Centrifugal discharge in the initial dewatering stage was determined to be the major cause of cell breakage. The ability of cells to resist breakage was dependent on a combination of factors including host strain, vector, and fermentation strategy. Periplasmic extraction studies were conducted in shake flasks and it was demonstrated that key performance parameters such as Fab titre and nucleic acid concentrations were mimicked. The shake flask system also captured particle aggregation effects seen in a large scale stirred vessel, reproducing the fine particle size distribution that impacts the final centrifugal clarification stage. The use of scale-down primary recovery process sequences can be used to screen a larger number of engineered strains. This can lead to closer integration with and better feedback between strain development, fermentation development, and primary recovery studies. Biotechnol. Bioeng. 2014;111: 1971–1981. © 2014 Wiley Periodicals, Inc. PMID:24838387
Loren, Bradley P; Wleklinski, Michael; Koswara, Andy; Yammine, Kathryn; Hu, Yanyang; Nagy, Zoltan K; Thompson, David H; Cooks, R Graham
2017-06-01
A highly integrated approach to the development of a process for the continuous synthesis and purification of diphenhydramine is reported. Mass spectrometry (MS) is utilized throughout the system for on-line reaction monitoring, off-line yield quantitation, and as a reaction screening module that exploits reaction acceleration in charged microdroplets for high throughput route screening. This effort has enabled the discovery and optimization of multiple routes to diphenhydramine in glass microreactors using MS as a process analytical tool (PAT). The ability to rapidly screen conditions in charged microdroplets was used to guide optimization of the process in a microfluidic reactor. A quantitative MS method was developed and used to measure the reaction kinetics. Integration of the continuous-flow reactor/on-line MS methodology with a miniaturized crystallization platform for continuous reaction monitoring and controlled crystallization of diphenhydramine was also achieved. Our findings suggest a robust approach for the continuous manufacture of pharmaceutical drug products, exemplified in the particular case of diphenhydramine, optimized for efficiency and crystal size, and guided by real-time analytics to produce the agent in a form that is readily adapted to continuous synthesis.
Control of the exercise hyperpnoea in humans: a modeling perspective.
Ward, S A
2000-09-01
Models of the exercise hyperpnoea have classically incorporated elements of proportional feedback (carotid and medullary chemosensory) and feedforward (central and/or peripheral neurogenic) control. However, the precise details of the control process remain unresolved, reflecting in part both technical and interpretational limitations inherent in isolating putative control mechanisms in the intact human, and also the challenges to linear control theory presented by multiple-input integration, especially with regard to the ventilatory and gas-exchange complexities encountered at work rates which engender a metabolic acidosis. While some combination of neurogenic, chemoreflex and circulatory-coupled processes are likely to contribute to the control, the system appears to evidence considerable redundancy. This, coupled with the lack of appreciable error signals in the mean levels of arterial blood gas tensions and pH over a wide range of work rates, has motivated the formulation of innovative control models that reflect not only spatial interactions but also temporal interactions (i.e. memory). The challenge is to discriminate between robust competing control models that: (a) integrate such processes within plausible physiological equivalents; and (b) account for both the dynamic and steady-state system response over a range of exercise intensities. Such models are not yet available.
Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chiou, Jin-Chern
1990-01-01
Computational procedures for kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAE's) viewpoint. Constraint violations during the time integration process are minimized, and penalty constraint stabilization techniques and partitioning schemes are developed. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed. This algorithm uses a two-stage staggered central difference algorithm to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained by using an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, parallel implementation of the present constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm was efficiently carried out. The DAE's and the constraint treatment techniques were transformed into arrowhead matrices from which a Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient numerical algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers. This testbed was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speedup of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.
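The explicit central-difference building block of the staggered scheme can be sketched in isolation (a minimal leapfrog integrator for x'' = a(x); this is only the translational-coordinate stage, not the full DAE algorithm with constraints and Euler parameters):

```python
import numpy as np

def central_difference(accel, x0, v0, dt, steps):
    """Explicit central-difference (leapfrog) integration of x'' = accel(x).
    Velocities are staggered half a step from positions, which is the
    two-stage staggered structure referred to in the abstract."""
    x = np.array(x0, dtype=float)
    v = np.array(v0, dtype=float)
    v += 0.5 * dt * accel(x)            # half-kick to stagger the velocity
    xs = [x.copy()]
    for _ in range(steps):
        x = x + dt * v                  # drift: full position step
        v = v + dt * accel(x)           # kick: full velocity step
        xs.append(x.copy())
    return np.array(xs)

# harmonic oscillator x'' = -x: the staggered scheme keeps the amplitude
# bounded over many periods, unlike forward Euler
traj = central_difference(lambda x: -x, [1.0], [0.0], 0.01, 2000)
```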
Narasimhan, Seetharam; Chiel, Hillel J; Bhunia, Swarup
2009-01-01
For implantable neural interface applications, it is important to compress data and analyze spike patterns across multiple channels in real time. Such a computational task for online neural data processing requires an innovative circuit-architecture level design approach for low-power, robust and area-efficient hardware implementation. Conventional microprocessor or Digital Signal Processing (DSP) chips would dissipate too much power and are too large in size for an implantable system. In this paper, we propose a novel hardware design approach, referred to as "Preferential Design" that exploits the nature of the neural signal processing algorithm to achieve a low-voltage, robust and area-efficient implementation using nanoscale process technology. The basic idea is to isolate the critical components with respect to system performance and design them more conservatively compared to the noncritical ones. This allows aggressive voltage scaling for low power operation while ensuring robustness and area efficiency. We have applied the proposed approach to a neural signal processing algorithm using the Discrete Wavelet Transform (DWT) and observed significant improvement in power and robustness over conventional design.
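The DWT at the heart of the algorithm can be sketched with a one-level Haar transform (an illustration of the signal-processing task, not the paper's hardware architecture): averages carry the coarse spike shape, differences carry detail, and keeping only large coefficients compresses the data.

```python
import numpy as np

def haar_dwt(signal):
    """One-level Haar discrete wavelet transform: pairwise averages
    (approximation) and differences (detail), each scaled by 1/sqrt(2)."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt: perfect reconstruction from both bands."""
    s = np.empty(2 * len(approx))
    s[0::2] = (approx + detail) / np.sqrt(2)
    s[1::2] = (approx - detail) / np.sqrt(2)
    return s

a, d = haar_dwt([4.0, 4.0, 2.0, 0.0])
rec = haar_idwt(a, d)   # recovers the original samples
```

In the preferential-design view, the approximation path would be the conservatively designed critical component, while the detail path tolerates aggressive voltage scaling.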
Schlecht, Stephen H; Jepsen, Karl J
2013-09-01
Understanding the functional integration of skeletal traits and how they naturally vary within and across populations will benefit assessments of functional adaptation directed towards interpreting bone stiffness in contemporary and past humans. Moreover, investigating how these traits intraskeletally vary will guide us closer towards predicting fragility from a single skeletal site. Using an osteological collection of 115 young adult male and female African-Americans, we assessed the functional relationship between bone robustness (i.e. total area/length), cortical tissue mineral density (Ct.TMD), and cortical area (Ct.Ar) for the upper and lower limbs. All long bones demonstrated significant trait covariance (p < 0.005) independent of body size, with slender bones having 25-50% less Ct.Ar and 5-8% higher Ct.TMD compared to robust bones. Robustness statistically explained 10.2-28% of Ct.TMD and 26.6-64.6% of Ct.Ar within male and female skeletal elements. This covariance is systemic throughout the skeleton, with either the slender or robust phenotype consistently represented within all long bones for each individual. These findings suggest that each person attains a unique trait set by adulthood that is both predictable by robustness and partially independent of environmental influences. The variation in these functionally integrated traits allows for the maximization of tissue stiffness and minimization of mass so that regardless of which phenotype is present, a given bone is reasonably stiff and strong, and sufficiently adapted to perform routine, habitual loading activities. Covariation intrinsic to functional adaptation suggests that whole bone stiffness depends upon particular sets of traits acquired during growth, presumably through differing levels of cellular activity, resulting in differing tissue morphology and composition. 
The outcomes of this intraskeletal examination of robustness and its correlates may have significant value in our progression towards improved clinical assessments of bone strength and fragility. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.
2015-12-01
Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. 
Finally, we contend that creating believable soil carbon predictions requires a robust, transparent, and community-available benchmarking framework. I will present an ILAMB evaluation of several of the above-mentioned approaches in ACME, and attempt to motivate community adoption of this evaluation approach.
Design, simulation and characterisation of integrated optics for a microfabricated flow cytometer
NASA Astrophysics Data System (ADS)
Barat, David; Benazzi, Giuseppe; Mowlem, Matthew Charles; Ruano, Jesus Miguel; Morgan, Hywel
2010-05-01
Flow cytometry is widely used for analyzing micro-particles such as cells and bacteria. Microfabricated flow cytometers promise reduced instrument size and cost with increased robustness and have application in medicine, life sciences and environmental metrology. Further miniaturisation and robustness can be achieved if integrated optics are used instead of traditional free space optics. We present the design, simulation and experimental characterisation of integrated optics for a microfabricated cytometer made from SU-8 resin on a glass substrate. The optics, constructed from combinations of optical fibres (positioned with microgrooves), waveguides, and microlenses, enable analysis of scattered light and fluorescence from particles positioned near the centre of a microchannel using one dimensional sheath flow. Four different methods for directing the incident light onto the particles are examined and the optimum design discussed.
The performances of different overlay mark types at 65nm node on 300-mm wafers
NASA Astrophysics Data System (ADS)
Tseng, H. T.; Lin, Ling-Chieh; Huang, I. H.; Lin, Benjamin S.; Huang, Chin-Chou K.; Huang, Chien-Jen
2005-05-01
The integrated circuit (IC) manufacturing factories have measured overlay with conventional "box-in-box" (BiB) or "frame-in-frame" (FiF) structures for many years. As a world-class IC foundry service provider, UMC must achieve tighter and tighter alignment accuracy specs from generation to generation to meet every kind of customer requirement, especially according to the International Technology Roadmap for Semiconductors (ITRS) 2003 METROLOGY section1. The process noises resulting from dishing, overlay mark damage by chemical mechanical polishing (CMP), and the variation of film thickness during deposition are factors which can be very problematic in mark alignment. For example, the conventional "box-in-box" overlay marks can be damaged easily by CMP, because the low local pattern density and wide feature width of the box induce either dishing or asymmetric damage to the measurement targets, which makes overlay measurement variable and difficult. After the Advanced Imaging Metrology (AIM) overlay target was introduced by KLA-Tencor, past studies showed AIM to be more robust in overlay metrology than conventional FiF or BiB targets. In this study, the applications of AIM overlay marks under different process conditions are discussed and compared with the conventional overlay targets. To evaluate overlay mark performance against process variation on the 65nm technology node on 300-mm wafers, three critical layers were chosen in this study: Poly, Contact, and Cu-Metal. The overlay targets used for performance comparison were BiB and Non-Segmented AIM (NS AIM) marks. We compared overlay mark performance in two main areas. The first was total measurement uncertainty (TMU)3 related items, including Tool Induced Shift (TIS) variability, precision, and matching. The second was target robustness against process variations.
Based on the present study, the AIM mark demonstrated equal or better performance in the TMU-related items under our process conditions. Moreover, when non-optimized tungsten CMP was introduced in the tungsten contact process, we found that the AIM mark, owing to its dense grating line structure design, was much more robust than the BiB overlay target.
Understanding the source of multifractality in financial markets
NASA Astrophysics Data System (ADS)
Barunik, Jozef; Aste, Tomaso; Di Matteo, T.; Liu, Ruipeng
2012-09-01
In this paper, we use the generalized Hurst exponent approach to study the multi-scaling behavior of different financial time series. We show that this approach is robust and powerful in detecting different types of multi-scaling. We observe a puzzling phenomenon where an apparent increase in multifractality is measured in time series generated from shuffled returns, where all time-correlations are destroyed, while the return distributions are conserved. This effect is robust and it is reproduced in several real financial data including stock market indices, exchange rates and interest rates. In order to understand the origin of this effect we investigate different simulated time series by means of the Markov switching multifractal model, autoregressive fractionally integrated moving average processes with stable innovations, fractional Brownian motion and Levy flights. Overall we conclude that the multifractality observed in financial time series is mainly a consequence of the characteristic fat-tailed distribution of the returns and time-correlations have the effect to decrease the measured multifractality.
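The generalized Hurst exponent estimate can be sketched with the structure-function method (a simplified illustration, not the paper's exact estimator): H(q) is obtained from the scaling of the q-th order moments of increments, E|x(t+τ) − x(t)|^q ∝ τ^(qH(q)), with H(q) varying in q for multifractal series and constant for monofractal ones.

```python
import numpy as np

def generalized_hurst(x, q=2, taus=range(1, 20)):
    """Estimate H(q) from the scaling E|x(t+tau) - x(t)|^q ~ tau^(q*H(q))
    via a log-log regression of the q-th moment against the lag."""
    x = np.asarray(x, dtype=float)
    taus = np.array(list(taus))
    moments = np.array([np.mean(np.abs(x[t:] - x[:-t]) ** q) for t in taus])
    slope = np.polyfit(np.log(taus), np.log(moments), 1)[0]
    return slope / q

# an uncorrelated Gaussian random walk should give H(2) near 0.5
rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(20000))
h = generalized_hurst(walk)
```

Comparing H(q) across q on real returns versus their shuffled surrogates is the kind of diagnostic the paper uses to separate fat-tail effects from time-correlation effects.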
Wang, Min; Ma, Pengsha; Yin, Min; Lu, Linfeng; Lin, Yinyue; Chen, Xiaoyuan; Jia, Wei; Cao, Xinmin; Chang, Paichun; Li, Dongdong
2017-09-01
Antireflection (AR) at the interface between the air and incident window material is paramount to boost the performance of photovoltaic devices. 3D nanostructures have attracted tremendous interest to reduce reflection, while the structure is vulnerable to the harsh outdoor environment. Thus the AR film with improved mechanical property is desirable in an industrial application. Herein, a scalable production of flexible AR films is proposed with microsized structures by roll-to-roll imprinting process, which possesses hydrophobic property and much improved robustness. The AR films can be potentially used for a wide range of photovoltaic devices whether based on rigid or flexible substrates. As a demonstration, the AR films are integrated with commercial Si-based triple-junction thin film solar cells. The AR film works as an effective tool to control the light travel path and utilize the light inward more efficiently by exciting hybrid optical modes, which results in a broadband and omnidirectional enhanced performance.
Wang, Min; Ma, Pengsha; Lu, Linfeng; Lin, Yinyue; Chen, Xiaoyuan; Jia, Wei; Cao, Xinmin; Chang, Paichun
2017-01-01
Antireflection (AR) at the interface between the air and incident window material is paramount to boost the performance of photovoltaic devices. 3D nanostructures have attracted tremendous interest to reduce reflection, while the structure is vulnerable to the harsh outdoor environment. Thus the AR film with improved mechanical property is desirable in an industrial application. Herein, a scalable production of flexible AR films is proposed with microsized structures by roll‐to‐roll imprinting process, which possesses hydrophobic property and much improved robustness. The AR films can be potentially used for a wide range of photovoltaic devices whether based on rigid or flexible substrates. As a demonstration, the AR films are integrated with commercial Si‐based triple‐junction thin film solar cells. The AR film works as an effective tool to control the light travel path and utilize the light inward more efficiently by exciting hybrid optical modes, which results in a broadband and omnidirectional enhanced performance. PMID:28932667
Mapping population-based structural connectomes.
Zhang, Zhengwu; Descoteaux, Maxime; Zhang, Jingwen; Girard, Gabriel; Chamberland, Maxime; Dunson, David; Srivastava, Anuj; Zhu, Hongtu
2018-05-15
Advances in understanding the structural connectomes of human brain require improved approaches for the construction, comparison and integration of high-dimensional whole-brain tractography data from a large number of individuals. This article develops a population-based structural connectome (PSC) mapping framework to address these challenges. PSC simultaneously characterizes a large number of white matter bundles within and across different subjects by registering different subjects' brains based on coarse cortical parcellations, compressing the bundles of each connection, and extracting novel connection weights. A robust tractography algorithm and streamline post-processing techniques, including dilation of gray matter regions, streamline cutting, and outlier streamline removal are applied to improve the robustness of the extracted structural connectomes. The developed PSC framework can be used to reproducibly extract binary networks, weighted networks and streamline-based brain connectomes. We apply the PSC to Human Connectome Project data to illustrate its application in characterizing normal variations and heritability of structural connectomes in healthy subjects. Copyright © 2018 Elsevier Inc. All rights reserved.
Ye, Jingfei; Gao, Zhishan; Wang, Shuai; Cheng, Jinlong; Wang, Wei; Sun, Wenqing
2014-10-01
Four orthogonal polynomials for reconstructing a wavefront over a square aperture based on the modal method are currently available, namely, the 2D Chebyshev polynomials, 2D Legendre polynomials, Zernike square polynomials and Numerical polynomials. They are all orthogonal over the full unit square domain. 2D Chebyshev polynomials are defined by the product of Chebyshev polynomials in the x and y variables, as are 2D Legendre polynomials. Zernike square polynomials are derived by the Gram-Schmidt orthogonalization process, where the integration region is the full unit square circumscribed outside the unit circle. Numerical polynomials are obtained by numerical calculation. The present study compares these four orthogonal polynomials through theoretical analysis and numerical experiments from the aspects of reconstruction accuracy, remaining errors, and robustness. Results show that the Numerical orthogonal polynomials are superior to the other three because of their high accuracy and robustness, even in the case of a wavefront with incomplete data.
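The modal reconstruction described above reduces to a linear least-squares fit of basis coefficients. A minimal sketch using the 2D Chebyshev basis (one of the four compared above; the sampling grid and test wavefront are invented, and the Zernike-square and Numerical bases are not reproduced):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Least-squares modal fit over a square aperture using products of 1-D
# Chebyshev polynomials T_i(x) * T_j(y).
def cheb2d_basis(x, y, degree):
    """Design matrix with one column per term T_i(x)*T_j(y), i + j <= degree."""
    cols = []
    for i in range(degree + 1):
        for j in range(degree + 1 - i):
            ci = np.zeros(i + 1); ci[-1] = 1.0   # coefficient vector selecting T_i
            cj = np.zeros(j + 1); cj[-1] = 1.0   # coefficient vector selecting T_j
            cols.append(C.chebval(x, ci) * C.chebval(y, cj))
    return np.column_stack(cols)

def fit_wavefront(x, y, w, degree):
    """Fit a sampled wavefront w(x, y); return modal coefficients and the fit."""
    A = cheb2d_basis(x, y, degree)
    coeffs, *_ = np.linalg.lstsq(A, w, rcond=None)
    return coeffs, A @ coeffs
```

With complete, noise-free data all four bases span the same polynomial space and fit equally well; the robustness differences reported above arise with noisy or incomplete apertures.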
RoBuST: an integrated genomics resource for the root and bulb crop families Apiaceae and Alliaceae
2010-01-01
Background Root and bulb vegetables (RBV) include carrots, celeriac (root celery), parsnips (Apiaceae), onions, garlic, and leek (Alliaceae)—food crops grown and consumed worldwide. Few data analysis platforms are currently available where data collection, annotation and integration initiatives are focused on RBV plant groups. Scientists working on RBV include breeders, geneticists, taxonomists, plant pathologists, and plant physiologists who use genomic data for a wide range of activities including the development of molecular genetic maps, delineation of taxonomic relationships, and investigation of molecular aspects of gene expression in biochemical pathways and disease responses. With genomic data coming from such diverse areas of plant science, availability of a community resource focused on these RBV data types would be of great interest to this scientific community. Description The RoBuST database has been developed to initiate a platform for collecting and organizing genomic information useful for RBV researchers. The current release of RoBuST contains genomics data for 294 Alliaceae and 816 Apiaceae plant species and has the following features: (1) comprehensive sequence annotations of 3663 genes, 5959 RNAs, 22,723 ESTs and 11,438 regulatory sequence elements from the Apiaceae and Alliaceae plant families; (2) graphical tools for visualization and analysis of sequence data; (3) access to traits, biosynthetic pathways, genetic linkage maps and molecular taxonomy data associated with Alliaceae and Apiaceae plants; and (4) a comprehensive plant splice signal repository of 659,369 splice signals collected from 6015 plant species for comparative analysis of plant splicing patterns. Conclusions RoBuST, available at http://robust.genome.com, provides an integrated platform for researchers to effortlessly explore and analyze genomic data associated with root and bulb vegetables. PMID:20691054
TaN resistor process development and integration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Kathleen; Martinez, Marino John; Clevenger, Jascinda
This paper describes the development and implementation of an integrated resistor process based on reactively sputtered tantalum nitride. Image reversal lithography was shown to be a superior method for liftoff patterning of these films. The results of a response surface DOE for the sputter deposition of the films are discussed. Several approaches to stabilization baking were examined and the advantages of the hot plate method are shown. In support of a new capability to produce special-purpose HBT-based Small-Scale Integrated Circuits (SSICs), we developed our existing TaN resistor process, designed for research prototyping, into one with greater maturity and robustness. Included in this work was the migration of our TaN deposition process from a research-oriented tool to a tool more suitable for production. Also included was implementation and optimization of a liftoff process for the sputtered TaN to avoid the complicating effects of subtractive etching over potentially sensitive surfaces. Finally, the method and conditions for stabilization baking of the resistors were experimentally determined to complete the full implementation of the resistor module. Much of the work described involves the migration between sputter deposition tools, from a Kurt J. Lesker CMS-18 to a Denton Discovery 550. Though they use nominally the same deposition technique (reactive sputtering of Ta with N⁺ in an RF-excited Ar plasma), they differ substantially in their design and produce clearly different results in terms of resistivity, conformity of the film, and the difference between as-deposited and stabilized films. We describe the design of and results from the design of experiments (DOE)-based method of process optimization on the new tool and compare this to what had been used on the old tool.
Demonstration of Robustness and Integrated Operation of a Series-Bosch System
NASA Technical Reports Server (NTRS)
Abney, Morgan B.; Mansell, J. Matthew; Barnett, Bill; Stanley, Christine M.; Junaedi, Christian; Vilekar, Saurabh A.; Kent, Ryan
2016-01-01
Manned missions beyond low Earth orbit will require highly robust, reliable, and maintainable life support systems that maximize recycling of water and oxygen. Bosch technology is one option to maximize oxygen recovery, in the form of water, from metabolically-produced carbon dioxide (CO2). A two-stage approach to Bosch, called Series-Bosch, reduces metabolic CO2 with hydrogen (H2) to produce water and solid carbon using two reactors: a Reverse Water-Gas Shift (RWGS) reactor and a carbon formation (CF) reactor. Previous development efforts demonstrated the stand-alone performance of a RWGS reactor containing Incofoam(TradeMark) catalyst and designed for robustness against carbon formation, two membrane separators intended to maximize single-pass conversion of reactants, and a batch CF reactor with both transit and surface catalysts. In the past year, Precision Combustion, Inc. (PCI) developed and delivered a RWGS reactor for testing at NASA. The reactor design was based on their patented Microlith(TradeMark) technology and was first evaluated under a Phase I Small Business Innovative Research (SBIR) effort in 2010. The Microlith(TradeMark) RWGS reactor was recently evaluated at NASA to compare its performance and operating conditions with the Incofoam(TradeMark) RWGS reactor. Separately, in 2015, a fully integrated demonstration of an S-Bosch system was conducted. In an effort to mitigate risk, a second integrated test was conducted to evaluate the effect of membrane failure on a closed-loop Bosch system. Here, we report and discuss the performance and robustness to carbon formation of both RWGS reactors. We report the results of the integrated operation of a Series-Bosch system and discuss the technology readiness level.
NASA Astrophysics Data System (ADS)
Mendoza, G.; Tkach, M.; Kucharski, J.; Chaudhry, R.
2017-12-01
This discussion focuses on the application of a bottom-up vulnerability assessment procedure for climate-resilience planning at a water treatment plant serving the city of Iolanda, Zambia. The project is a Millennium Challenge Corporation (MCC) initiative with technical support from the UNESCO category II International Center for Integrated Water Resources Management (ICIWaRM), whose secretariat is at the US Army Corps of Engineers Institute for Water Resources. The MCC is an innovative and independent U.S. foreign aid agency that is helping lead the fight against global poverty. The bottom-up vulnerability assessment framework examines critical performance thresholds, examines the external drivers that would lead to failure, establishes the plausibility and analytical uncertainty of failure, and provides the economic justification for robustness or adaptability. This presentation showcases experiences in applying the bottom-up framework to a region that is very vulnerable to climate variability, has poor institutional capacities, and has very limited data. It illustrates the technical analysis and a decision process that led to a non-obvious climate-robust solution. Most importantly, it highlights the challenges of using discounted cash flow analysis (DCFA), such as net present value, in justifying robust or adaptive solutions, i.e., comparing solutions under different future risks. We highlight a solution to manage the potential biases these DCFA procedures can incur.
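The DCFA bias mentioned above can be made concrete with a toy net-present-value comparison: a cheap design can beat a robust one in the expected scenario yet fail badly in an adverse one. All cash flows, scenario names, and the 10% discount rate below are invented for illustration:

```python
def npv(cash_flows, rate):
    """Net present value of cash_flows[t] received at the end of year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Two hypothetical plant designs over three climate scenarios: the cheap
# design collapses in the dry scenario, the robust one degrades gracefully.
scenarios = {
    "wet":    {"cheap": [-100] + [20] * 20, "robust": [-150] + [18] * 20},
    "median": {"cheap": [-100] + [18] * 20, "robust": [-150] + [18] * 20},
    "dry":    {"cheap": [-100] + [2] * 20,  "robust": [-150] + [16] * 20},
}
# Worst-case NPV per option across scenarios, at a 10% discount rate.
worst = {opt: min(npv(scenarios[s][opt], 0.10) for s in scenarios)
         for opt in ("cheap", "robust")}
```

Ranking by NPV in the median scenario favours the cheap option, while ranking by worst case favours the robust one; this sensitivity of the decision to how futures are weighted is the bias the presentation discusses.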
Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.; Burnum-Johnson, Kristin E.; Kim, Young-Mo; Kyle, Jennifer E.; Matzke, Melissa M.; Shukla, Anil K.; Chu, Rosalie K.; Schepmoes, Athena A.; Jacobs, Jon M.; Baric, Ralph S.; Webb-Robertson, Bobbie-Jo; Smith, Richard D.
2016-01-01
ABSTRACT Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical). IMPORTANCE In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated broad applicability and robustness, which enabled comprehensive proteomics, metabolomics, and lipidomics analyses from the same sample. 
PMID:27822525
NASA Astrophysics Data System (ADS)
Bosse, Stefan
2013-05-01
Sensorial materials consisting of high-density, miniaturized, and embedded sensor networks require new robust and reliable data processing and communication approaches. Structural health monitoring is one major field of application for sensorial materials. Each sensor node provides some kind of sensor, electronics, data processing, and communication, with a strong focus on microchip-level implementation to meet the goals of miniaturization and operation in low-power energy environments, a prerequisite for autonomous behaviour and operation. Reliability requires robustness of the entire system in the presence of node, link, data processing, and communication failures. Interaction between nodes is required to manage and distribute information. One common interaction model is the mobile agent. An agent approach provides stronger autonomy than a traditional object or remote-procedure-call based approach. Agents can decide for themselves which actions are performed, and they are capable of flexible behaviour, reacting to the environment and other agents, providing some degree of robustness. Traditionally, multi-agent systems are abstract programming models which are implemented in software and executed on program-controlled computer architectures. This approach does not scale well to the microchip level: it requires fully equipped computers and communication structures, and the hardware architecture does not reflect the requirements of agent processing and interaction. We propose and demonstrate a novel design paradigm for reliable distributed data processing systems, together with a synthesis methodology and framework for multi-agent systems implementable entirely at the microchip level with resource- and power-constrained digital logic, supporting Agent-On-Chip (AoC) architectures. The agent behaviour and mobility are fully integrated on the microchip using pipelined communicating processes implemented with finite-state machines and register-transfer logic.
The agent behaviour, interaction (communication), and mobility features are modelled and specified at a machine-independent, abstract programming level using a state-based agent behaviour language (APL). With this APL, a high-level agent compiler is able to synthesize a hardware model (RTL, VHDL), a software model (C, ML), or a simulation model (XML) suitable for simulating a multi-agent system using the SeSAm simulator framework. Agent communication is provided by a simple tuple-space database implemented at node level, providing fault-tolerant access to global data. A novel synthesis development kit (SynDK) based on a graph-structured database approach is introduced to support the rapid development of compilers and synthesis tools, used for example for the design and implementation of the APL compiler.
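A node-level tuple space of the kind described can be sketched as follows. This is a minimal software analogue of the hardware implementation, with operation names following the classic Linda convention (`out`/`rd`/`inp`) rather than the paper's actual interface:

```python
class TupleSpace:
    """Toy tuple space: agents communicate by depositing and matching tuples.
    A None field in a pattern acts as a wildcard."""

    def __init__(self):
        self._store = []

    def out(self, tup):
        """Deposit a tuple into the space."""
        self._store.append(tuple(tup))

    def _match(self, pattern, tup):
        return len(pattern) == len(tup) and all(
            p is None or p == t for p, t in zip(pattern, tup))

    def rd(self, pattern):
        """Non-destructive read of the first matching tuple (None if absent)."""
        return next((t for t in self._store if self._match(pattern, t)), None)

    def inp(self, pattern):
        """Destructive take of the first matching tuple (None if absent)."""
        t = self.rd(pattern)
        if t is not None:
            self._store.remove(t)
        return t
```

On-chip, the same put/match/take semantics would be realized with register-transfer logic and replicated per node for fault tolerance, as the abstract describes.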
Audio-visual integration through the parallel visual pathways.
Kaposvári, Péter; Csete, Gergő; Bognár, Anna; Csibri, Péter; Tóth, Eszter; Szabó, Nikoletta; Vécsei, László; Sáry, Gyula; Tamás Kincses, Zsigmond
2015-10-22
Audio-visual integration has been shown to be present in a wide range of different conditions, some of which are processed through the dorsal, and others through the ventral visual pathway. Whereas neuroimaging studies have revealed integration-related activity in the brain, there has been no imaging study of the possible role of segregated visual streams in audio-visual integration. We set out to determine how the different visual pathways participate in this communication. We investigated how audio-visual integration can be supported through the dorsal and ventral visual pathways during the double flash illusion. Low-contrast and chromatic isoluminant stimuli were used to drive preferentially the dorsal and ventral pathways, respectively. In order to identify the anatomical substrates of the audio-visual interaction in the two conditions, the psychophysical results were correlated with white matter integrity as measured by diffusion tensor imaging. The psychophysical data revealed a robust double flash illusion in both conditions. A correlation between the psychophysical results and local fractional anisotropy was found in the occipito-parietal white matter in the low-contrast condition, while a similar correlation was found in the infero-temporal white matter in the chromatic isoluminant condition. Our results indicate that both of the parallel visual pathways may play a role in the audio-visual interaction. Copyright © 2015. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Li, G. Q.; Zhu, Z. H.
2015-12-01
Dynamic modeling of tethered spacecraft that accounts for the elasticity of the tether is prone to numerical instability and error accumulation over long-term numerical integration. This paper addresses these challenges by proposing a globally stable numerical approach combining the nodal position finite element method (NPFEM) with implicit, symplectic, 2-stage, 4th-order Gauss-Legendre Runge-Kutta time integration. The NPFEM eliminates numerical error accumulation by using the position instead of the displacement of the tether as the state variable, while the symplectic integration enforces the energy and momentum conservation of the discretized finite element model to ensure the global stability of the numerical solution. The effectiveness and robustness of the proposed approach are assessed on an elastic pendulum problem, whose dynamic response resembles that of tethered spacecraft, in comparison with commonly used time integrators such as the classical 4th-order Runge-Kutta scheme and other families of non-symplectic Runge-Kutta schemes. Numerical results show that the proposed approach is accurate and that the energy of the corresponding numerical model is conserved over long-term numerical integration. Finally, the proposed approach is applied to the dynamic modeling of the deorbiting of tethered spacecraft over a long period.
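The 2-stage, 4th-order Gauss-Legendre scheme can be sketched as follows. A simple harmonic oscillator stands in for the elastic pendulum, and solving the implicit stage equations by fixed-point iteration is an assumed implementation choice, not necessarily the paper's:

```python
import numpy as np

# Butcher tableau of the 2-stage Gauss-Legendre Runge-Kutta method (order 4,
# implicit, symplectic).
S3 = np.sqrt(3.0)
A = np.array([[0.25,            0.25 - S3 / 6.0],
              [0.25 + S3 / 6.0, 0.25           ]])
B = np.array([0.5, 0.5])

def gl4_step(f, y, h, iters=10):
    """One Gauss-Legendre step; the implicit stage slopes k_i satisfy
    k_i = f(y + h * sum_j A[i,j] * k_j), solved here by fixed-point iteration."""
    k = np.array([f(y), f(y)])                      # initial guess
    for _ in range(iters):
        k = np.array([f(y + h * (A[i, 0] * k[0] + A[i, 1] * k[1]))
                      for i in range(2)])
    return y + h * (B[0] * k[0] + B[1] * k[1])

def f_oscillator(y):
    """Hamiltonian test system: y = (q, p), H = (p^2 + q^2) / 2."""
    q, p = y
    return np.array([p, -q])
</```

Because the method is symplectic and conserves quadratic invariants, the oscillator's energy stays at its initial value up to the fixed-point tolerance even over many steps, which is exactly the long-term behavior the paper exploits.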
Manufacturing Execution Systems: Examples of Performance Indicator and Operational Robustness Tools.
Gendre, Yannick; Waridel, Gérard; Guyon, Myrtille; Demuth, Jean-François; Guelpa, Hervé; Humbert, Thierry
Manufacturing Execution Systems (MES) are computerized systems used to measure production performance in terms of productivity, yield, and quality. The first part describes performance indicators, including overall equipment effectiveness (OEE), together with process robustness tools and statistical process control. The second part details tools that help operators keep processes robust and under control by preventing deviations from target control charts. The MES was developed by Syngenta together with CIMO for automation.
NASA Astrophysics Data System (ADS)
Hsieh, Fu-Shiung
2011-03-01
Design of robust supervisory controllers for manufacturing systems with unreliable resources has received significant attention recently. Robustness analysis provides an alternative way to analyse a perturbed system so as to quickly respond to resource failures. Although we have analysed the robustness properties of several subclasses of ordinary Petri nets (PNs), analysis for non-ordinary PNs has not been done. Non-ordinary PNs have weighted arcs and have the advantage of compactly modelling operations requiring multiple parts or resources. In this article, we consider a class of flexible assembly/disassembly manufacturing systems and propose a non-ordinary flexible assembly/disassembly Petri net (NFADPN) model for this class of systems. As flexible assembly/disassembly manufacturing systems can be regarded as the integration and interaction of a set of assembly/disassembly subprocesses, a bottom-up approach is adopted in this article to construct the NFADPN models. Due to the routing flexibility in NFADPN, there may exist different ways to accomplish the tasks. To characterise them, we propose the concept of completely connected subprocesses. As long as there exists a set of completely connected subprocesses for a certain product type, production of that type can still be maintained without requiring the whole NFADPN to be live. To take advantage of the alternative routes without enforcing liveness for the whole system, we generalise the previously proposed concept of persistent production to NFADPN. We propose a condition for persistent production based on the concept of completely connected subprocesses. We extend robustness analysis to NFADPN by exploiting its structure. We identify several patterns of resource failures and characterise the conditions to maintain operation in the presence of resource failures.
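The weighted-arc firing rule that distinguishes non-ordinary PNs can be sketched as follows (the assembly example marking is hypothetical; a transition is enabled only when every input place holds at least the arc weight):

```python
def enabled(marking, pre):
    """A transition is enabled if each input place p holds >= the arc weight w."""
    return all(marking[p] >= w for p, w in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume pre-arc weights, produce post-arc weights."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m
```

For example, an assembly step consuming two parts and one robot in a single firing needs a weighted arc of 2 from the parts place; an ordinary PN would need extra places and transitions to express the same operation.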
Robust Informatics Infrastructure Required For ICME: Combining Virtual and Experimental Data
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Holland, Frederic A. Jr.; Bednarcyk, Brett A.
2014-01-01
With the increased emphasis on reducing the cost and time to market of new materials, the need for robust automated materials information management systems enabling sophisticated data mining tools is increasing, as evidenced by the emphasis on Integrated Computational Materials Engineering (ICME) and the recent establishment of the Materials Genome Initiative (MGI). This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic and/or multi-scale models requires both the processing of large volumes of test data and the complex materials data necessary to establish processing-microstructure-property-performance relationships. Fortunately, material information management systems have kept pace with the growing user demands and evolved to enable: (i) the capture of both point-wise data and full spectra of raw data curves; (ii) data management functions such as access, version, and quality controls; (iii) a wide range of data import, export and analysis capabilities; (iv) data pedigree traceability mechanisms; (v) data searching, reporting and viewing tools; and (vi) access to the information via a wide range of interfaces. This paper discusses key principles for the development of a robust materials information management system that enables the connections at various length scales to be made between experimental data and corresponding multiscale modeling toolsets for ICME. In particular, it describes NASA Glenn's efforts towards establishing such a database for capturing constitutive modeling behavior for both monolithic and composite materials.
Hybrid information privacy system: integration of chaotic neural network and RSA coding
NASA Astrophysics Data System (ADS)
Hsu, Ming-Kai; Willey, Jeff; Lee, Ting N.; Szu, Harold H.
2005-03-01
Electronic mail is used worldwide, and most messages are easily compromised by hackers. In this paper, we propose a free, fast, and convenient hybrid privacy system to protect email communication. The privacy system is implemented by combining the RSA public-key algorithm with a chaotic neural network encryption process. The receiver can decrypt a received email as long as it can reproduce the specified chaotic neural network series, the so-called spatial-temporal keys. The chaotic typing and initial seed value of the chaotic neural network series, encrypted by the RSA algorithm, allow the spatial-temporal keys to be reproduced. The encrypted chaotic typing and initial seed value are hidden in a watermark mixed nonlinearly with the message media and wrapped with convolutional error-correction codes for wireless 3rd-generation cellular phones. The message media can be an arbitrary image. Pattern noise has to be considered during transmission, as it could affect or change the spatial-temporal keys. Since any change or modification of the chaotic typing or initial seed value of the chaotic neural network series is unacceptable, the RSA codec system must be robust and fault-tolerant over the wireless channel. The robustness and fault tolerance of chaotic neural networks (CNN) were proved by a field theory of associative memory by Szu in 1997. The 1-D chaos-generating nodes from the logistic map having an arbitrarily negative slope a = p/q, generating the N-shaped sigmoid, were first given by Szu in 1992. In this paper, we simulate the robustness and fault-tolerance properties of CNN under additive noise and pattern noise. We also implement a private version of the RSA coding and chaos encryption process on messages.
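The chaotic-keystream idea can be illustrated with a toy logistic-map cipher. This is a drastic simplification of the paper's CNN scheme: it omits the RSA exchange of the seed, the watermarking, and the error-correction coding, and it is not cryptographically secure:

```python
def keystream(seed, n, a=3.99):
    """Byte stream from iterating the logistic map x <- a*x*(1-x), 0 < seed < 1.
    The seed plays the role of the secret initial value that would itself
    be sent to the receiver under RSA."""
    x, out = seed, []
    for _ in range(n):
        x = a * x * (1.0 - x)            # chaotic iteration
        out.append(int(x * 256) & 0xFF)  # quantize state to one byte
    return bytes(out)

def crypt(data, seed):
    """XOR with the chaotic keystream; the same call decrypts."""
    ks = keystream(seed, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))
```

Sensitivity to the seed is the point of the scheme: a receiver that cannot reproduce the exact initial value generates a completely different keystream and recovers nothing.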
76 FR 55456 - The Trade and Investment Partnership for the Middle East and North Africa
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-07
... facilitate more robust trade within the region and promote greater MENA integration with U.S. and other... integration with U.S. and European markets, and (3) open the door for those countries that adopt high... government actions that can enhance economic integration within the MENA region and increase trade and...
Ceramic Integration Technologies for Energy and Aerospace Applications
NASA Technical Reports Server (NTRS)
Singh, Mrityunjay; Asthana, Ralph N.
2007-01-01
Robust and affordable integration technologies for advanced ceramics are required to improve the performance, reliability, efficiency, and durability of components, devices, and systems based on them in a wide variety of energy, aerospace, and environmental applications. Many thermochemical and thermomechanical factors including joint design, analysis, and optimization must be considered in integration of similar and dissimilar material systems.
NASA Astrophysics Data System (ADS)
Innerkofler, Josef; Pock, Christian; Kirchengast, Gottfried; Schwaerz, Marc; Jaeggi, Adrian; Schwarz, Jakob
2016-04-01
The GNSS Radio Occultation (RO) measurement technique is highly valuable for climate monitoring of the atmosphere as it provides accurate and precise measurements in the troposphere and stratosphere regions with global coverage, long-term stability, and virtually all-weather capability. The novel Reference Occultation Processing System (rOPS), currently under development at the WEGC at the University of Graz, aims to process raw RO measurements into essential climate variables, such as temperature, pressure, and tropospheric water vapor, in a way which is SI-traceable to the universal time standard and which includes rigorous uncertainty propagation. As part of this rOPS climate-quality processing system, accurate atmospheric excess phase profiles with new approaches integrating uncertainty propagation are derived from the raw occultation tracking data and orbit data. Regarding the latter, highly accurate orbit positions and velocities of the GNSS transmitter satellites and the RO receiver satellites in low Earth orbit (LEO) need to be determined, in order to enable high accuracy of the excess phase profiles. Using several representative test days of GPS orbit data from the CODE and IGS archives, which are available at accuracies of about 3 cm (position) / 0.03 mm/s (velocity), and employing the Bernese 5.2 and Napeos 3.3.1 software packages for the LEO orbit determination of the CHAMP, GRACE, and MetOp RO satellites, we achieved robust SI-traced LEO orbit uncertainty estimates of about 5 cm (position) / 0.05 mm/s (velocity) for the daily orbits, including estimates of systematic uncertainty bounds and of propagated random uncertainties. For the COSMIC RO satellites, we found decreased accuracy estimates near 10-15 cm (position) / 0.1-0.15 mm/s (velocity), since the characteristics of the small COSMIC satellite platforms and antennas provide somewhat less favorable orbit determination conditions.
We present the setup in which we (I) used the Bernese and Napeos packages in mutual cross-check for this purpose, (II) integrated satellite laser-ranging validation of the estimated systematic uncertainty bounds, and (III) expanded the Bernese 5.2 software to propagate random uncertainties from the GPS orbit data and LEO navigation tracking data input to the LEO data output. Preliminary excess phase results, including propagated uncertainty estimates, will also be shown. Except for disturbed space weather conditions, we expect robust performance at the millimeter level for the derived excess phases, which, after large-scale processing of many years of RO data, can provide a new SI-traced fundamental climate data record.
Use of planar array electrophysiology for the development of robust ion channel cell lines.
Clare, Jeffrey J; Chen, Mao Xiang; Downie, David L; Trezise, Derek J; Powell, Andrew J
2009-01-01
The tractability of ion channels as drug targets has been significantly improved by the advent of planar array electrophysiology platforms, which have dramatically increased the capacity for electrophysiological profiling of lead series compounds. However, the data quality and throughput obtained with these platforms are critically dependent on the robustness of the expression reagent being used. The generation of high-quality recombinant cell lines is therefore a key step in the early phase of ion channel drug discovery, and this can present significant challenges due to the diversity and organisational complexity of many channel types. This article focuses on several complex and difficult-to-express ion channels and illustrates how improved stable cell lines can be obtained by integrating planar array electrophysiology systems into the cell line generation process per se. By embedding this approach at multiple stages (e.g., during development of the expression strategy, during screening and validation of clonal lines, and during characterisation of the final cell line), the cycle time and success rate in obtaining robust expression of complex multi-subunit channels can be significantly improved. We also review how recent advances in this technology (e.g., population patch clamp) have further widened the versatility and applicability of this approach.
Medical Imaging Lesion Detection Based on Unified Gravitational Fuzzy Clustering
Vianney Kinani, Jean Marie; Gallegos Funes, Francisco; Mújica Vargas, Dante; Ramos Díaz, Eduardo; Arellano, Alfonso
2017-01-01
We develop a swift, robust, and practical tool for detecting brain lesions with minimal user intervention to assist clinicians and researchers in the diagnosis process, radiosurgery planning, and assessment of the patient's response to therapy. We propose a unified gravitational fuzzy clustering-based segmentation algorithm, which integrates the Newtonian concept of gravity into fuzzy clustering. We first perform fuzzy rule-based image enhancement on our database, which comprises T1/T2-weighted magnetic resonance (MR) and fluid-attenuated inversion recovery (FLAIR) images, to facilitate a smoother segmentation. The scalar output obtained is fed into a gravitational fuzzy clustering algorithm, which separates healthy structures from unhealthy ones. Finally, the lesion contour is automatically outlined through the initialization-free level set evolution method. An advantage of this lesion detection algorithm is its precision and its simultaneous use of features computed from the intensity properties of the MR scan in a cascading pattern, which makes the computation fast, robust, and self-contained. Furthermore, we validate our algorithm with large-scale experiments using clinical and synthetic brain lesion datasets. As a result, an 84%–93% overlap performance is obtained, with an emphasis on robustness with respect to different and heterogeneous types of lesion and a swift computation time. PMID:29158887
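The fuzzy-clustering backbone of the algorithm can be sketched as standard fuzzy c-means on intensities. The paper's Newtonian gravity term is omitted here; this shows only the membership/center iteration it builds on, with invented 1-D data standing in for enhanced voxel intensities:

```python
import numpy as np

def fcm(x, c=2, m=2.0, iters=50):
    """Standard fuzzy c-means on a 1-D intensity array x.
    Returns cluster centers and the (c, n) fuzzy membership matrix."""
    centers = np.linspace(x.min(), x.max(), c)       # spread initial centers
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12   # point-center distances
        u = d ** (-2.0 / (m - 1.0))                  # standard FCM membership update
        u /= u.sum(axis=0)                           # memberships sum to 1 per point
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)          # fuzzily weighted centers
    return centers, u
```

In the unified gravitational variant, an attraction term depending on cluster "mass" and distance modifies this membership update, which is what lets the method separate healthy from unhealthy tissue more robustly.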
Software Framework for Advanced Power Plant Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Widmann; Sorin Munteanu; Aseem Jain
2010-08-01
This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
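The ROM idea (train a cheap surrogate on expensive CFD samples, then evaluate the surrogate inside the flowsheet) can be sketched as follows; the response function, sample points, and variable names are invented, and APECS's actual ROM machinery is not shown:

```python
import numpy as np

def train_rom(x_samples, y_samples, degree=3):
    """Fit a polynomial surrogate to sampled (input, output) pairs from the
    expensive simulation; the returned object evaluates in microseconds."""
    return np.polynomial.Polynomial.fit(x_samples, y_samples, degree)

# Stand-in for a dozen expensive CFD runs sweeping one operating variable,
# e.g. a temperature in kelvin, with an invented smooth yield response.
x = np.linspace(600.0, 1200.0, 12)
y = 0.9 - 2e-7 * (x - 1000.0) ** 2
rom = train_rom(x, y)          # the flowsheet would now call rom(T) directly
```

The trade-off is the usual surrogate one: the ROM is only trustworthy inside the sampled range and must be retrained when the CFD model or its boundary conditions change.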
FACE-IT. A Science Gateway for Food Security Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montella, Raffaele; Kelly, David; Xiong, Wei
Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.
Temperature-Robust Neural Function from Activity-Dependent Ion Channel Regulation.
O'Leary, Timothy; Marder, Eve
2016-11-07
Many species of cold-blooded animals experience substantial and rapid fluctuations in body temperature. Because biological processes are differentially temperature dependent, it is difficult to understand how physiological processes in such animals can be temperature robust [1-8]. Experiments have shown that core neural circuits, such as the pyloric circuit of the crab stomatogastric ganglion (STG), exhibit robust neural activity in spite of large (20°C) temperature fluctuations [3, 5, 7, 8]. This robustness is surprising because (1) each neuron has many different kinds of ion channels with different temperature dependencies (Q10s) that interact in a highly nonlinear way to produce firing patterns and (2) across animals there is substantial variability in conductance densities that nonetheless produce almost identical firing properties. The high variability in conductance densities in these neurons [9, 10] appears to contradict the possibility that robustness is achieved through precise tuning of key temperature-dependent processes. In this paper, we develop a theoretical explanation for how temperature robustness can emerge from a simple regulatory control mechanism that is compatible with highly variable conductance densities [11-13]. The resulting model suggests a general mechanism for how nervous systems and excitable tissues can exploit degenerate relationships among temperature-sensitive processes to achieve robust function. Copyright © 2016 Elsevier Ltd. All rights reserved.
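The Q10 temperature dependence at the heart of the problem is easy to state numerically. The sketch below (with assumed Q10 values, not taken from the paper) shows how two rates matched at a reference temperature drift apart by a factor of four over a 20°C range, illustrating why precise tuning alone cannot deliver robustness.

```python
# Standard Q10 scaling: r(T) = r(T_ref) * Q10 ** ((T - T_ref) / 10).
# The Q10 values below are illustrative assumptions, not the paper's.
def rate(r_ref, q10, T, T_ref=10.0):
    return r_ref * q10 ** ((T - T_ref) / 10.0)

rA = rate(1.0, q10=1.5, T=30.0)   # channel A, weakly temperature sensitive
rB = rate(1.0, q10=3.0, T=30.0)   # channel B, strongly temperature sensitive
ratio_warm = rB / rA              # equal at 10 °C, 4x mismatch at 30 °C
```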
Chopda, Viki R; Gomes, James; Rathore, Anurag S
2016-01-01
Bioreactor control significantly impacts both the amount and quality of the product being manufactured. The complexity of the control strategy that is implemented increases with reactor size, which may vary from thousands to tens of thousands of litres in commercial manufacturing. The Process Analytical Technology (PAT) initiative has highlighted the need for robust monitoring tools and effective control schemes that are capable of taking real-time information about the critical quality attributes (CQA) and the critical process parameters (CPP) and executing an immediate response as soon as a deviation occurs. However, the limited flexibility that present commercial software packages offer creates a hurdle. Visual programming environments have gradually emerged as potential alternatives to the available text-based languages. This paper showcases the development of an integrated programme using a visual programming environment for a Sartorius BIOSTAT® B Plus 5L bioreactor through which various peripheral devices are interfaced. The proposed programme facilitates real-time access to data and allows for execution of control actions to follow the desired trajectory. Major benefits of such an integrated software system include: (i) improved real-time monitoring and control; (ii) reduced variability; (iii) improved performance; (iv) reduced operator-training time; (v) enhanced knowledge management; and (vi) easier PAT implementation. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Collaborative environments for capability-based planning
NASA Astrophysics Data System (ADS)
McQuay, William K.
2005-05-01
Distributed collaboration is an emerging technology for the 21st century that will significantly change how business is conducted in the defense and commercial sectors. Collaboration involves two or more geographically dispersed entities working together to create a "product" by sharing and exchanging data, information, and knowledge. A product is defined broadly to include, for example, writing a report, creating software, designing hardware, or implementing robust systems engineering and capability planning processes in an organization. Collaborative environments provide the framework and integrate models, simulations, domain-specific tools, and virtual test beds to facilitate collaboration between the multiple disciplines needed in the enterprise. The Air Force Research Laboratory (AFRL) is conducting a leading-edge program in developing distributed collaborative technologies targeted to the Air Force's implementation of systems engineering for simulation-aided acquisition and capability-based planning. The research is focusing on the open systems agent-based framework, product and process modeling, structural architecture, and the integration technologies - the glue to integrate the software components. In the past four years, two live assessment events have been conducted to demonstrate the technology in support of research for the Air Force Agile Acquisition initiatives. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities conduct business.
Associative memory of phase-coded spatiotemporal patterns in leaky Integrate and Fire networks.
Scarpetta, Silvia; Giacco, Ferdinando
2013-04-01
We study the collective dynamics of a Leaky Integrate and Fire network in which precise relative phase relationships of spikes among neurons are stored as attractors of the dynamics and selectively replayed at different time scales. Using an STDP-based learning process, we store in the connectivity several phase-coded spike patterns, and we find that, depending on the excitability of the network, different working regimes are possible, with transient or persistent replay activity induced by a brief signal. We introduce an order parameter to evaluate the similarity between stored and recalled phase-coded patterns, and measure the storage capacity. Modulation of spiking thresholds during replay changes the frequency of the collective oscillation or the number of spikes per cycle while preserving the phase relationships. This allows a coding scheme in which phase, rate and frequency are dissociable. Robustness with respect to noise and heterogeneity of neuron parameters is studied, showing that, since the dynamics is a retrieval process, neurons preserve stable and precise phase relationships among units, keeping a unique frequency of oscillation, even in noisy conditions and with heterogeneity of the units' internal parameters.
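The building block of such a network is the leaky integrate-and-fire unit. The sketch below implements only that single-unit dynamics with illustrative constants; the STDP learning rule, phase-coded storage, and network coupling of the paper are omitted.

```python
# Minimal leaky integrate-and-fire (LIF) unit: leaky integration of an
# input current, spike on threshold crossing, then reset. All constants
# are illustrative assumptions.
def lif_spikes(I, dt=0.1, tau=10.0, v_rest=0.0, v_th=1.0, v_reset=0.0):
    v, spikes = v_rest, []
    for t, i_t in enumerate(I):
        v += dt * (-(v - v_rest) / tau + i_t)   # leaky integration step
        if v >= v_th:                           # threshold crossing
            spikes.append(t * dt)               # record spike time
            v = v_reset                         # reset after the spike
    return spikes

# a constant suprathreshold drive produces strictly periodic firing
spikes = lif_spikes([0.5] * 1000)
```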
ANN-PSO Integrated Optimization Methodology for Intelligent Control of MMC Machining
NASA Astrophysics Data System (ADS)
Chandrasekaran, Muthumari; Tamang, Santosh
2017-08-01
Metal Matrix Composites (MMC) show improved properties in comparison with non-reinforced alloys and have found increased application in the automotive and aerospace industries. The selection of optimum machining parameters to produce components of desired surface roughness is of great concern considering the quality and economy of the manufacturing process. In this study, a surface roughness prediction model for turning Al-SiCp MMC is developed using an Artificial Neural Network (ANN). Three turning parameters, viz., spindle speed (N), feed rate (f) and depth of cut (d), were considered as input neurons and surface roughness was the output neuron. A 3-5-1 ANN architecture is found to be optimum, and the model predicts with an average percentage error of 7.72%. The Particle Swarm Optimization (PSO) technique is used for optimizing the parameters to minimize machining time. The innovative aspect of this work is the development of an integrated ANN-PSO optimization method for intelligent control of the MMC machining process applicable to manufacturing industries. The robustness of the method lies in its ability to obtain optimum cutting parameters satisfying the desired surface roughness, and the method converges well within a minimum number of iterations.
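The PSO search over cutting parameters can be sketched as follows. Here a toy quadratic stands in for the real objective (machining time constrained by the ANN roughness model); the bounds, coefficients, and optimum location are assumptions.

```python
# Plain particle swarm optimization over (N, f, d) bounds; the objective
# below is an invented stand-in for the paper's machining-time criterion.
import random

def pso_minimize(f, bounds, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]                      # personal bests
    pbest_f = [f(p) for p in pos]
    g = pbest[min(range(n), key=lambda i: pbest_f[i])][:]  # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (g[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])        # clamp to bounds
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < f(g):
                    g = pos[i][:]
    return g, f(g)

# toy objective minimized at N=1500 rpm, f=0.2 mm/rev, d=1.0 mm (assumed)
obj = lambda p: (p[0] - 1500) ** 2 / 1e6 + (p[1] - 0.2) ** 2 + (p[2] - 1.0) ** 2
best, best_f = pso_minimize(obj, [(500, 3000), (0.05, 0.5), (0.5, 2.0)])
```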
BIM Based Virtual Environment for Fire Emergency Evacuation
Rezgui, Yacine; Ong, Hoang N.
2014-01-01
Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies in how to utilize BIM as a comprehensive building information provider to work with virtual reality technologies to build an adaptable immersive serious game environment to provide real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious game based virtual reality (VR) environment aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management. PMID:25197704
Top-down and bottom-up analysis of commercial enoxaparins.
Liu, Xinyue; St Ange, Kalib; Lin, Lei; Zhang, Fuming; Chi, Lianli; Linhardt, Robert J
2017-01-13
A strategy for the comprehensive analysis of low molecular weight (LMW) heparins is described that relies on using an integrated top-down and bottom-up approach. Liquid chromatography-mass spectrometry, an essential component of this approach, is rapid, robust, and amenable to automated processing and interpretation. Nuclear magnetic resonance spectroscopy provides complementary top-down information on the chirality of the uronic acid residues comprising a low molecular weight heparin. Using our integrated approach, four different low molecular weight heparins prepared from porcine heparin through chemical β-eliminative cleavage were comprehensively analyzed. Lovenox™ and Clexane™, the innovator versions of enoxaparin marketed in the US and Europe, respectively, and two generic enoxaparins, from Sandoz and Teva, were analyzed. The results, supported by analysis of variance (ANOVA), show remarkable similarities between different versions of the product and good lot-to-lot consistency of each product, while also detecting subtle differences that may result from differences in their manufacturing processes or in the source (or parent) porcine heparin from which each product is prepared. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Shen, C.; Fang, K.
2017-12-01
Deep Learning (DL) methods have made revolutionary strides in recent years. A core value proposition of DL is that abstract notions and patterns can be extracted purely from data, without the need for domain expertise. Process-based models (PBM), on the other hand, can be regarded as repositories of human knowledge or hypotheses about how systems function. Here, through computational examples, we argue that there is merit in integrating PBMs with DL due to the imbalance and lack of data in many situations, especially in hydrology. We trained a deep-in-time neural network, the Long Short-Term Memory (LSTM), to learn soil moisture dynamics from Soil Moisture Active Passive (SMAP) Level 3 product. We show that when PBM solutions are integrated into LSTM, the network is able to better generalize across regions. LSTM is able to better utilize PBM solutions than simpler statistical methods. Our results suggest PBMs have generalization value which should be carefully assessed and utilized. We also emphasize that when properly regularized, the deep network is robust and is of superior testing performance compared to simpler methods.
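A single scalar LSTM cell step (the recurrent unit the abstract refers to) can be written out directly. The scalar (rather than vector) form, the weights, and the inputs below are illustrative assumptions; the SMAP training pipeline and the PBM integration are omitted.

```python
# One step of a scalar LSTM cell: forget, input, and output gates modulate
# a memory cell that carries long-term state. Weights are arbitrary here.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, W):
    # W maps each gate to an (input weight, recurrent weight, bias) triple
    f = sigmoid(W['f'][0] * x + W['f'][1] * h + W['f'][2])   # forget gate
    i = sigmoid(W['i'][0] * x + W['i'][1] * h + W['i'][2])   # input gate
    o = sigmoid(W['o'][0] * x + W['o'][1] * h + W['o'][2])   # output gate
    g = math.tanh(W['g'][0] * x + W['g'][1] * h + W['g'][2])  # candidate
    c = f * c + i * g            # memory cell: keep some old, add some new
    h = o * math.tanh(c)         # hidden state / output
    return h, c

W = {k: (0.5, 0.5, 0.0) for k in 'fiog'}   # assumed toy weights
h = c = 0.0
for x in [1.0, 0.5, 0.0]:                  # e.g. a short forcing sequence
    h, c = lstm_step(x, h, c, W)
```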
Dimitrova, N; Nagaraj, A B; Razi, A; Singh, S; Kamalakaran, S; Banerjee, N; Joseph, P; Mankovich, A; Mittal, P; DiFeo, A; Varadan, V
2017-04-27
Characterizing the complex interplay of cellular processes in cancer would enable the discovery of key mechanisms underlying its development and progression. Published approaches to decipher driver mechanisms do not explicitly model tissue-specific changes in pathway networks and the regulatory disruptions related to genomic aberrations in cancers. We therefore developed InFlo, a novel systems biology approach for characterizing complex biological processes using a unique multidimensional framework integrating transcriptomic, genomic and/or epigenomic profiles for any given cancer sample. We show that InFlo robustly characterizes tissue-specific differences in activities of signalling networks on a genome scale using unique probabilistic models of molecular interactions on a per-sample basis. Using large-scale multi-omics cancer datasets, we show that InFlo exhibits higher sensitivity and specificity in detecting pathway networks associated with specific disease states when compared to published pathway network modelling approaches. Furthermore, InFlo's ability to infer the activity of unmeasured signalling network components was also validated using orthogonal gene expression signatures. We then evaluated multi-omics profiles of primary high-grade serous ovarian cancer tumours (N=357) to delineate mechanisms underlying resistance to frontline platinum-based chemotherapy. InFlo was the only algorithm to identify hyperactivation of the cAMP-CREB1 axis as a key mechanism associated with resistance to platinum-based therapy, a finding that we subsequently experimentally validated. We confirmed that inhibition of CREB1 phosphorylation potently sensitized resistant cells to platinum therapy and was effective in killing ovarian cancer stem cells that contribute to both platinum-resistance and tumour recurrence. 
Thus, we propose InFlo to be a scalable and widely applicable and robust integrative network modelling framework for the discovery of evidence-based biomarkers and therapeutic targets.
Modica, Maria Vittoria; Puillandre, Nicolas; Castelin, Magalie; Zhang, Yu; Holford, Mandë
2014-01-01
Devising a reproducible approach for species delimitation of hyperdiverse groups is an ongoing challenge in evolutionary biology. Speciation processes combine modes of passive and adaptive trait divergence requiring an integrative taxonomy approach to accurately generate robust species hypotheses. However, in light of the rapid decline of diversity on Earth, complete integrative approaches may not be practical in certain species-rich environments. As an alternative, we applied a two-step strategy combining ABGD (Automated Barcode Gap Discovery) and Klee diagrams, to balance speed and accuracy in producing primary species hypotheses (PSHs). Specifically, an ABGD/Klee approach was used for species delimitation in the Terebridae, a neurotoxin-producing marine snail family included in the Conoidea. Delimitation of species boundaries is problematic in the Conoidea, as traditional taxonomic approaches are hampered by the high levels of variation, convergence and morphological plasticity of shell characters. We used ABGD to analyze gaps in the distribution of pairwise distances of 454 COI sequences attributed to 87 morphospecies and obtained 98 to 125 PSHs. The PSH partitions were subsequently visualized as a Klee diagram color map, allowing easy detection of the incongruences that were further evaluated individually with two other species delimitation models, General Mixed Yule Coalescent (GMYC) and Poisson Tree Processes (PTP). GMYC and PTP results confirmed the presence of 17 putative cryptic terebrid species in our dataset. The consensus of GMYC, PTP, and ABGD/Klee findings suggest the combination of ABGD and Klee diagrams is an effective approach for rapidly proposing primary species proxies in hyperdiverse groups and a reliable first step for macroscopic biodiversity assessment.
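The core "barcode gap" heuristic behind ABGD can be sketched in a few lines: sort the pairwise distances and split at the widest gap, treating distances below the threshold as intraspecific. The real ABGD is recursive and uses a prior on intraspecific divergence; the COI distances below are invented.

```python
# Barcode-gap sketch: the widest gap in the sorted pairwise-distance
# distribution separates intraspecific from interspecific comparisons.
def barcode_gap_threshold(distances):
    d = sorted(distances)
    gaps = [(d[i + 1] - d[i], i) for i in range(len(d) - 1)]
    width, i = max(gaps)                  # widest gap in the distribution
    return (d[i] + d[i + 1]) / 2.0        # threshold inside that gap

# invented pairwise COI distances: small within species, large between
dists = [0.01, 0.012, 0.02, 0.015, 0.11, 0.13, 0.12, 0.14]
thr = barcode_gap_threshold(dists)        # falls between 0.02 and 0.11
```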
Mullen, Kathy T; Chang, Dorita H F; Hess, Robert F
2015-12-01
There is controversy as to how responses to colour in the human brain are organized within the visual pathways. A key issue is whether there are modular pathways that respond selectively to colour or whether there are common neural substrates for both colour and achromatic (Ach) contrast. We used functional magnetic resonance imaging (fMRI) adaptation to investigate the responses of early and extrastriate visual areas to colour and Ach contrast. High-contrast red-green (RG) and Ach sinewave rings (0.5 cycles/degree, 2 Hz) were used as both adapting stimuli and test stimuli in a block design. We found robust adaptation to RG or Ach contrast in all visual areas. Cross-adaptation between RG and Ach contrast occurred in all areas indicating the presence of integrated, colour and Ach responses. Notably, we revealed contrasting trends for the two test stimuli. For the RG test, unselective processing (robust adaptation to both RG and Ach contrast) was most evident in the early visual areas (V1 and V2), but selective responses, revealed as greater adaptation between the same stimuli than cross-adaptation between different stimuli, emerged in the ventral cortex, in V4 and VO in particular. For the Ach test, unselective responses were again most evident in early visual areas but Ach selectivity emerged in the dorsal cortex (V3a and hMT+). Our findings support a strong presence of integrated mechanisms for colour and Ach contrast across the visual hierarchy, with a progression towards selective processing in extrastriate visual areas. © 2015 The Authors. European Journal of Neuroscience published by Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
In-process fault detection for textile fabric production: onloom imaging
NASA Astrophysics Data System (ADS)
Neumann, Florian; Holtermann, Timm; Schneider, Dorian; Kulczycki, Ashley; Gries, Thomas; Aach, Til
2011-05-01
Constant and traceable high fabric quality is of great importance both for technical and for high-quality conventional fabrics. Usually, quality inspection is carried out by trained personnel, whose detection rate and maximum period of concentration are limited. Low-resolution automated fabric inspection machines using texture analysis have been developed, and since 2003 systems for in-process inspection on weaving machines ("onloom") have been commercially available. These can detect defects but cannot measure them quantitatively and precisely, and most systems are also prone to inevitable machine vibrations. Feedback loops for fault prevention are not established. Technology has evolved since 2003: camera and computer prices dropped, resolutions were enhanced, and recording speeds increased. These are the preconditions for real-time processing of high-resolution images, but so far these technological advances have not been used in textile fabric production. For efficient use, a measurement system must be integrated into the weaving process, and new algorithms for defect detection and measurement must be developed. The goal of the joint project is the development of a modern machine vision system for nondestructive onloom fabric inspection. The system consists of a vibration-resistant machine integration, a high-resolution machine vision system, and new, reliable, and robust algorithms with a quality database for defect documentation. The system is meant to detect, measure, and classify at least 80% of economically relevant defects. Concepts for feedback loops into the weaving process will also be pointed out.
Mechanical design in embryos: mechanical signalling, robustness and developmental defects.
Davidson, Lance A
2017-05-19
Embryos are shaped by the precise application of force against the resistant structures of multicellular tissues. Forces may be generated, guided and resisted by cells, extracellular matrix, interstitial fluids, and how they are organized and bound within the tissue's architecture. In this review, we summarize our current thoughts on the multiple roles of mechanics in direct shaping, mechanical signalling and robustness of development. Genetic programmes of development interact with environmental cues to direct the composition of the early embryo and endow cells with active force production. Biophysical advances now provide experimental tools to measure mechanical resistance and collective forces during morphogenesis and are allowing integration of this field with studies of signalling and patterning during development. We focus this review on concepts that highlight this integration, and how the unique contributions of mechanical cues and gradients might be tested side by side with conventional signalling systems. We conclude with speculation on the integration of large-scale programmes of development, and how mechanical responses may ensure robust development and serve as constraints on programmes of tissue self-assembly. This article is part of the themed issue 'Systems morphodynamics: understanding the development of tissue hardware'. © 2017 The Author(s).
Sainath, Kamalesh; Teixeira, Fernando L; Donderici, Burkay
2014-01-01
We develop a general-purpose formulation, based on two-dimensional spectral integrals, for computing electromagnetic fields produced by arbitrarily oriented dipoles in planar-stratified environments, where each layer may exhibit arbitrary and independent anisotropy in both its (complex) permittivity and permeability tensors. Among the salient features of our formulation are (i) computation of eigenmodes (characteristic plane waves) supported in arbitrarily anisotropic media in a numerically robust fashion, (ii) implementation of an hp-adaptive refinement for the numerical integration to evaluate the radiation and weakly evanescent spectra contributions, and (iii) development of an adaptive extension of an integral convergence acceleration technique to compute the strongly evanescent spectrum contribution. While other semianalytic techniques exist to solve this problem, none have full applicability to media exhibiting arbitrary double anisotropies in each layer, where one must account for the whole range of possible phenomena (e.g., mode coupling at interfaces and nonreciprocal mode propagation). Brute-force numerical methods can tackle this problem but only at a much higher computational cost. The present formulation provides an efficient and robust technique for field computation in arbitrary planar-stratified environments. We demonstrate the formulation for a number of problems related to geophysical exploration.
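The hp-adaptive numerical integration mentioned in (ii) refines the quadrature only where the integrand varies quickly. The authors' scheme is not spelled out in this abstract, so the sketch below uses a plain recursive adaptive Simpson rule on a smooth test integrand to illustrate the general idea.

```python
# Recursive adaptive Simpson quadrature: each interval is split until the
# local error estimate (difference between one- and two-panel Simpson
# sums) falls below the tolerance, refining only where needed.
import math

def adaptive_simpson(f, a, b, tol=1e-8):
    def simpson(lo, hi):
        mid = (lo + hi) / 2.0
        return (hi - lo) / 6.0 * (f(lo) + 4.0 * f(mid) + f(hi))
    def refine(lo, hi, whole, tol):
        mid = (lo + hi) / 2.0
        left, right = simpson(lo, mid), simpson(mid, hi)
        if abs(left + right - whole) < 15.0 * tol:   # standard error test
            return left + right
        return (refine(lo, mid, left, tol / 2.0)
                + refine(mid, hi, right, tol / 2.0))
    return refine(a, b, simpson(a, b), tol)

val = adaptive_simpson(math.sin, 0.0, math.pi)   # exact answer is 2
```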
Dutta, Abhijit; Dowe, Nancy; Ibsen, Kelly N; Schell, Daniel J; Aden, Andy
2010-01-01
Numerous routes are being explored to lower the cost of cellulosic ethanol production and enable large-scale production. One critical area is the development of robust cofermentative organisms to convert the multiple, mixed sugars found in biomass feedstocks to ethanol at high yields and titers without the need for processing to remove inhibitors. Until such microorganisms are commercialized, the challenge is to design processes that exploit the current microorganisms' strengths. This study explored various process configurations tailored to take advantage of the specific capabilities of three microorganisms, Z. mobilis 8b, S. cerevisiae, and S. pastorianus. A technoeconomic study, based on bench-scale experimental data generated by integrated process testing, was completed to understand the resulting costs of the different process configurations. The configurations included whole slurry fermentation with a coculture, and separate cellulose simultaneous saccharification and fermentation (SSF) and xylose fermentations with none, some or all of the water to the SSF replaced with the fermented liquor from the xylose fermentation. The difference between the highest and lowest ethanol cost for the different experimental process configurations studied was $0.27 per gallon ethanol. Separate fermentation of solid and liquor streams with recycle of fermented liquor to dilute the solids gave the lowest ethanol cost, primarily because this option achieved the highest concentrations of ethanol after fermentation. Further studies, using methods similar to ones employed here, can help understand and improve the performance and hence the economics of integrated processes involving enzymes and fermentative microorganisms.
Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems
NASA Astrophysics Data System (ADS)
Koch, Patrick Nathan
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration that facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis, (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method developed and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.
Towards a framework for agent-based image analysis of remote-sensing data
Hofmann, Peter; Lettmayer, Paul; Blaschke, Thomas; Belgiu, Mariana; Wegenkittl, Stefan; Graf, Roland; Lampoltshammer, Thomas Josef; Andrejchenko, Vera
2015-01-01
Object-based image analysis (OBIA) as a paradigm for analysing remotely sensed image data has in many cases led to spatially and thematically improved classification results in comparison to pixel-based approaches. Nevertheless, robust and transferable object-based solutions for automated image analysis capable of analysing sets of images or even large image archives without any human interaction are still rare. A major reason for this lack of robustness and transferability is the high complexity of image contents: Especially in very high resolution (VHR) remote-sensing data with varying imaging conditions or sensor characteristics, the variability of the objects’ properties in these varying images is hardly predictable. The work described in this article builds on so-called rule sets. While earlier work has demonstrated that OBIA rule sets bear a high potential of transferability, they need to be adapted manually, or classification results need to be adjusted manually in a post-processing step. In order to automate these adaptation and adjustment procedures, we investigate the coupling, extension and integration of OBIA with the agent-based paradigm, which is exhaustively investigated in software engineering. The aims of such integration are (a) autonomously adapting rule sets and (b) image objects that can adopt and adjust themselves according to different imaging conditions and sensor characteristics. This article focuses on self-adapting image objects and therefore introduces a framework for agent-based image analysis (ABIA). PMID:27721916
Bowtle, William; Kanyowa, Lionel; Mackenzie, Mark; Higgins, Paul
2011-06-01
The industrial take-up of liquid-fill hard capsule technology is limited in part by the lack of published long-term physical and chemical stability data demonstrating the robustness of the system. The aim was to assess the effects of extreme long-term storage on liquid-fill capsule product quality and integrity, with respect to both the capsules per se and a standard blister-pack type (foil-film blister). Fourteen sets of stored peroxidation-sensitive liquid-fill hard gelatin capsule product samples, originating ~20 years before the current study, were examined with respect to physical and selected chemical properties, together with microbiological evaluation. All sets retained the physical integrity of capsules and blister-packs. Capsules were free of leaks, gelatin cross-linking, and microbiological growth. Eight samples met a limit (anisidine value, 20) commonly used as an index of peroxidation for lipid-based products with shelf lives of 2-3 years. Foil-film blister-packs using PVC or PVC-PVdC as the thermoforming film were well-suited packaging components for the liquid-fill capsule format. The study confirms the long-term physical robustness of the liquid-fill hard capsule format, together with its manufacturing and banding processes. It also indicates that various peroxidation-sensitive products using the capsule format may be maintained satisfactorily over very prolonged storage periods.
Vistoli, D; Passerieux, C; El Zein, M; Clumeck, C; Braun, S; Brunet-Gouet, E
2015-08-01
Chronometric properties of theory of mind, and of intention understanding more specifically, are well documented. Notably, it was demonstrated using magnetoencephalography that the brain regions involved are recruited as soon as 200 ms post-stimulus. We used event-related potentials (ERPs) to characterize an electrophysiological marker of attribution of intentions. We also explored the robustness of this ERP signature under two conditions corresponding to either explicit instructions to focus on others' intentions or implicit instructions with no reference to mental states. Two matched groups of 16 healthy volunteers each received either explicit or no instructions about intentions and performed a nonverbal attribution of intentions task based on sequential four-image comic strips depicting either intentional or physical causality. A bilateral posterior positive component, ranging from 250 to 650 ms post-stimulus, showed greater amplitude in the intentional than in the physical condition (the intention ERP effect). This effect occurs during the third image only, suggesting that it reflects the integration of information depicted in the third image with the contextual cues given by the first two. The intention effect was similar in the two groups of subjects. Overall, our results identify a clear ERP marker of the first hundreds of milliseconds of intention processing, probably related to a contextual integrative mechanism, and suggest its robustness by showing its insensitivity to manipulation of task demands.
An Adaptive Complex Network Model for Brain Functional Networks
Gomez Portillo, Ignacio J.; Gleiser, Pablo M.
2009-01-01
Brain functional networks are graph representations of activity in the brain, where the vertices represent anatomical regions and the edges their functional connectivity. These networks present a robust small-world topological structure, characterized by highly integrated modules connected sparsely by long-range links. Recent studies showed that other topological properties, such as the degree distribution and the presence (or absence) of a hierarchical structure, are not robust and show different intriguing behaviors. In order to understand the basic ingredients necessary for the emergence of these complex network structures, we present an adaptive complex network model for human brain functional networks. The microscopic units of the model are dynamical nodes that represent active regions of the brain, whose interaction gives rise to complex network structures. The links between the nodes are chosen following an adaptive algorithm that establishes connections between dynamical elements with similar internal states. We show that the model is able to describe topological characteristics of human brain networks obtained from functional magnetic resonance imaging studies. In particular, when the dynamical rules of the model allow for integrated processing over the entire network, scale-free non-hierarchical networks with well-defined communities emerge. On the other hand, when the dynamical rules restrict the information to a local neighborhood, communities cluster together into larger ones, giving rise to a hierarchical structure with a truncated power-law degree distribution. PMID:19738902
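The adaptive wiring rule the abstract describes (links established between dynamical elements with similar internal states) can be sketched as a toy simulation. All parameters, the scalar "state" per node, and the exponential acceptance rule are illustrative assumptions, not the authors' actual model:

```python
import numpy as np

def grow_adaptive_network(n_nodes=60, n_links=240, seed=0):
    """Toy sketch: wire links preferentially between nodes whose
    internal states are similar (parameters are illustrative)."""
    rng = np.random.default_rng(seed)
    state = rng.random(n_nodes)          # scalar "activity" per node
    adj = np.zeros((n_nodes, n_nodes), dtype=bool)
    added = 0
    while added < n_links:
        i, j = rng.integers(0, n_nodes, size=2)
        if i == j or adj[i, j]:
            continue
        # accept the link with probability decaying in state distance
        if rng.random() < np.exp(-10.0 * abs(state[i] - state[j])):
            adj[i, j] = adj[j, i] = True
            added += 1
    return adj, state

adj, state = grow_adaptive_network()
degrees = adj.sum(axis=0)
print("links:", adj.sum() // 2, "mean degree:", degrees.mean())
# → links: 240 mean degree: 8.0
```

Varying the decay constant in the acceptance rule plays the role of the locality restriction discussed in the abstract: a sharper decay confines links to near-identical states.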
Towards a framework for agent-based image analysis of remote-sensing data.
Hofmann, Peter; Lettmayer, Paul; Blaschke, Thomas; Belgiu, Mariana; Wegenkittl, Stefan; Graf, Roland; Lampoltshammer, Thomas Josef; Andrejchenko, Vera
2015-04-03
Object-based image analysis (OBIA) as a paradigm for analysing remotely sensed image data has in many cases led to spatially and thematically improved classification results in comparison to pixel-based approaches. Nevertheless, robust and transferable object-based solutions for automated image analysis capable of analysing sets of images or even large image archives without any human interaction are still rare. A major reason for this lack of robustness and transferability is the high complexity of image contents: especially in very high resolution (VHR) remote-sensing data with varying imaging conditions or sensor characteristics, the variability of the objects' properties in these varying images is hardly predictable. The work described in this article builds on so-called rule sets. While earlier work has demonstrated that OBIA rule sets have a high potential for transferability, they need to be adapted manually, or classification results need to be adjusted manually in a post-processing step. In order to automate these adaptation and adjustment procedures, we investigate the coupling, extension and integration of OBIA with the agent-based paradigm, which has been investigated extensively in software engineering. The aims of such integration are (a) autonomously adapting rule sets and (b) image objects that can adopt and adjust themselves according to different imaging conditions and sensor characteristics. This article focuses on self-adapting image objects and therefore introduces a framework for agent-based image analysis (ABIA).
Robustness surfaces of complex networks
NASA Astrophysics Data System (ADS)
Manzano, Marc; Sahneh, Faryad; Scoglio, Caterina; Calle, Eusebi; Marzo, Jose Luis
2014-09-01
Although the robustness of complex networks has been extensively studied in the last decade, a unifying framework able to embrace all the proposed metrics is still lacking. In the literature there are two open issues related to this gap: (a) how to dimension several metrics to allow their summation and (b) how to weight each of the metrics. In this work we propose a solution to the two aforementioned problems by defining the R*-value and introducing the concept of the robustness surface (Ω). The rationale of our proposal is to make use of Principal Component Analysis (PCA). We first normalize the initial robustness of a network to 1. Second, we find the most informative robustness metric under a specific failure scenario. We then repeat the process for several percentages of failure and different realizations of the failure process. Finally, we join these values to form the robustness surface, which allows the visual assessment of network robustness variability. Results show that a network presents different robustness surfaces (i.e., dissimilar shapes) depending on the failure scenario and the set of metrics. In addition, the robustness surface allows the robustness of different networks to be compared.
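The surface construction outlined in the abstract (normalize, pick the most informative metric per failure level via PCA, stack the picks over failure levels) might be sketched as follows. The data layout and the "largest first-PC loading" selection rule are assumptions for illustration, not the paper's exact procedure:

```python
import numpy as np

def robustness_surface(metric_runs):
    """Sketch: metric_runs has shape (failure levels, realizations,
    metrics), values pre-normalized to [0, 1]. For each failure level,
    PCA (via SVD) selects the metric with the largest absolute loading
    on the first principal component; its values across realizations
    form one row of the surface."""
    levels, realizations, n_metrics = metric_runs.shape
    surface = np.empty((levels, realizations))
    for k in range(levels):
        X = metric_runs[k]                   # realizations x metrics
        Xc = X - X.mean(axis=0)              # center before PCA
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)
        best = np.argmax(np.abs(vt[0]))      # most informative metric
        surface[k] = X[:, best]
    return surface

rng = np.random.default_rng(1)
runs = rng.random((5, 10, 3))   # 5 failure levels, 10 realizations, 3 metrics
omega = robustness_surface(runs)
print(omega.shape)
```

Plotting `omega` as a heatmap over (failure level, realization) gives the kind of visual robustness-variability assessment the abstract describes.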
Vehicle active steering control research based on two-DOF robust internal model control
NASA Astrophysics Data System (ADS)
Wu, Jian; Liu, Yahui; Wang, Fengbo; Bao, Chunjiang; Sun, Qun; Zhao, Youqun
2016-07-01
Because of a vehicle's external disturbances and model uncertainties, robust control algorithms have gained popularity in vehicle stability control. Robust control usually sacrifices performance in order to guarantee the robustness of the control algorithm; therefore an improved robust internal model control (IMC) algorithm blending model tracking and internal model control is put forward for the active steering system, in order to achieve high yaw-rate tracking performance with guaranteed robustness. The proposed algorithm inherits the good model-tracking ability of IMC and guarantees robustness to model uncertainties. In order to separate the model-tracking design from the robustness design, the improved two-degree-of-freedom (2-DOF) robust internal model controller structure is derived from the standard Youla parameterization. Simulations of double-lane-change maneuvers and of crosswind disturbances are conducted to evaluate the robust control algorithm, on the basis of a nonlinear vehicle simulation model with a magic tyre model. Results show that the established 2-DOF robust IMC method has better model-tracking ability and a guaranteed level of robustness and robust performance, which can enhance vehicle stability and handling regardless of variations of the vehicle model parameters and external crosswind interference. The contradiction between performance and robustness of active steering control algorithms is thus resolved, and higher control performance with guaranteed robustness to model uncertainties is obtained.
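The core internal-model-control loop can be illustrated with a minimal first-order sketch: the controller inverts a (mismatched) internal model through a low-pass filter, and the plant/model output difference feeds back as a disturbance estimate. The plant numbers, filter gain, and single-state model are assumptions; the paper's 2-DOF Youla-parameterized design is considerably richer:

```python
# Minimal IMC sketch (all numbers are illustrative assumptions)
a, b = 0.90, 0.10        # "true" plant:  y[k+1] = a*y[k] + b*u[k]
am, bm = 0.85, 0.12      # deliberately mismatched internal model
f = 0.5                  # IMC filter gain: robustness/performance knob

y = ym = 0.0
r = 1.0                  # yaw-rate reference (unit step)
for _ in range(200):
    d = y - ym                                   # mismatch/disturbance estimate
    v = r - d                                    # corrected target
    u = ((1 - f) * ym + f * v - am * ym) / bm    # filtered model inverse
    y = a * y + b * u                            # plant update
    ym = am * ym + bm * u                        # internal model update

print(round(y, 4))   # → 1.0
```

Despite the model mismatch, the output settles exactly on the reference, which is the offset-free tracking property that makes IMC attractive; lowering `f` trades tracking speed for robustness.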
Costa, Carlos E; Romaní, Aloia; Cunha, Joana T; Johansson, Björn; Domingues, Lucília
2017-03-01
In this work, four robust yeast chassis isolated from industrial environments were engineered with the same xylose metabolic pathway. The recombinant strains were physiologically characterized in synthetic xylose and xylose-glucose medium, on non-detoxified hemicellulosic hydrolysates of fast-growing hardwoods (Eucalyptus and Paulownia) and agricultural residues (corn cob and wheat straw), and on Eucalyptus hydrolysate at different temperatures. Results show that the co-consumption of xylose and glucose was dependent on the yeast background. Moreover, heterogeneous results were obtained among different hydrolysates and temperatures for each individual strain, pointing to the importance of designing, from the very beginning, a tailor-made yeast for the specific raw material and process. Copyright © 2016 Elsevier Ltd. All rights reserved.
Rocketdyne PSAM: In-house enhancement/application
NASA Technical Reports Server (NTRS)
Newell, J. F.; Rajagopal, K. R.; Ohara, K.
1991-01-01
Development of the Probabilistic Design Analysis (PDA) process for rocket engines was initiated. This will give engineers a quantitative assessment of calculated reliability during the design process. The PDA will help choose better designs, make them more robust, and help decide on critical tests to demonstrate key reliability issues, improving confidence in the engine's capabilities. Rocketdyne's involvement with the Composite Loads Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort, and these are key elements in the ongoing developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.
Poole, Jennifer L; Donahue, Scott; Wilson, David; Li, Yuk Mun; Zhang, Qi; Gu, Yibei; Ferebee, Rachel; Lu, Zhao; Dorin, Rachel Mika; Hancock, Lawrence F; Takiff, Larry; Hakem, Ilhem F; Bockstaller, Michael R; Wiesner, Ulrich; Walker, Jeremy
2017-10-01
The functionalization with phosphotriesterase of poly(isoprene-b-styrene-b-4-vinylpyridine)-based nanoporous membranes fabricated by self-assembly and nonsolvent induced phase separation (SNIPS) is shown to enable dynamically responsive membranes capable of substrate-specific and localized gating response. Integration of the SNIPS process with macroporous nylon support layers yields mechanically robust textile-type films with high moisture vapor transport rates that display rapid and local order-of-magnitude modulation of permeability. The simplicity of the fabrication process that is compatible with large-area fabrication along with the versatility and efficacy of enzyme reactivity offers intriguing opportunities for engineered biomimetic materials that are tailored to respond to a complex range of external parameters, providing sensing, protection, and remediation capabilities. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Adalarasan, R.; Santhanakumar, M.
2015-01-01
In the present work, the yield strength, ultimate strength and micro-hardness of lap joints formed from Al 6061 alloy sheets by Tungsten Inert Gas (TIG) and Metal Inert Gas (MIG) welding were studied for various combinations of the welding parameters. The parameters taken for study include welding current, voltage, welding speed and inert gas flow rate. Taguchi's L9 orthogonal array was used to conduct the experiments, and an integrated technique of desirability grey relational analysis was applied to optimize the welding parameters. The robustness neglected in the desirability approach is compensated for by the grey relational approach, which predicts the optimal setting of input parameters for the TIG and MIG welding processes; the predictions were validated through confirmation experiments.
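The grey relational step of such an analysis can be outlined as below. The response data and the distinguishing coefficient are hypothetical, and the desirability weighting used in the actual study is omitted:

```python
import numpy as np

def grey_relational_grade(responses, larger_better=True, zeta=0.5):
    """Sketch of grey relational analysis for a Taguchi L9 design
    (zeta is the distinguishing coefficient; 0.5 is a common default)."""
    X = np.asarray(responses, dtype=float)
    # 1) normalize each response column to [0, 1]
    lo, hi = X.min(axis=0), X.max(axis=0)
    norm = (X - lo) / (hi - lo) if larger_better else (hi - X) / (hi - lo)
    # 2) grey relational coefficient vs. the ideal sequence (all ones)
    delta = 1.0 - norm
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    # 3) grade = mean coefficient over responses; best run scores highest
    return coeff.mean(axis=1)

# hypothetical L9 responses: rows = runs, cols = yield strength, hardness
data = [[210, 62], [225, 66], [218, 64],
        [240, 70], [232, 69], [228, 65],
        [236, 71], [244, 73], [230, 68]]
grades = grey_relational_grade(data)
print("best run:", int(np.argmax(grades)) + 1)   # → best run: 8
```

The run with the highest grade identifies the preferred parameter combination, which a confirmation experiment would then validate, as in the abstract.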
Robust flow of light in three-dimensional dielectric photonic crystals.
Chen, Wen-Jie; Jiang, Shao-Ji; Dong, Jian-Wen
2013-09-01
Chiral defect waveguides and waveguide bend geometry were designed in diamond photonic crystal to mold the flow of light in three dimensions. Propagations of electromagnetic waves in chiral waveguides are robust against isotropic obstacles, which would suppress backscattering in waveguides or integrated devices. Finite-difference time-domain simulations demonstrate that high coupling efficiency through the bend corner is preserved in the polarization gap, as it provides an additional constraint on the polarization state of the backscattered wave. Transport robustness is also demonstrated by inserting two metallic slabs into the waveguide bend.
NASA Astrophysics Data System (ADS)
Ferriere, Alain; Volut, Mikael; Perez, Antoine; Volut, Yann
2016-05-01
A flux mapping system has been designed, implemented and tested at the top of the Themis solar tower in France. The system features a moving bar associated with a CCD video camera and a flux gauge mounted on the bar, used as the reference measurement for calibration purposes. Images and the flux signal are acquired separately. The paper describes the equipment and focuses on the data processing used to produce the distribution of flux density and concentration at the aperture of the solar receiver. Finally, the solar power entering the receiver is estimated by integration of the flux density. The processing is largely automated in the form of dedicated software with fast execution. Special attention is paid to the accuracy of the results, the robustness of the algorithm and the speed of the processing.
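The final step above, estimating receiver power by integrating the flux-density map over the aperture, reduces to a 2-D numerical quadrature. The Gaussian test map and grid spacing below are assumptions for illustration:

```python
import numpy as np

# Aperture grid (m) and a synthetic flux-density map (kW/m^2);
# a real map would come from the calibrated CCD images.
x = np.linspace(-0.5, 0.5, 101)
y = np.linspace(-0.5, 0.5, 101)
X, Y = np.meshgrid(x, y)
flux = 1000.0 * np.exp(-(X**2 + Y**2) / (2 * 0.15**2))   # kW/m^2

# Power = integral of flux density over the aperture (simple Riemann sum)
dx, dy = x[1] - x[0], y[1] - y[0]
power = flux.sum() * dx * dy                              # kW
print(round(float(power), 1))
```

For this synthetic Gaussian the result is close to the analytic value 2πσ² × peak ≈ 141 kW, a quick sanity check on the quadrature.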
Reichert, Janice M; Jacob, Nitya; Amanullah, Ashraf
2009-01-01
The Second International Conference on Accelerating Biopharmaceutical Development was held in Coronado, California. The meeting was organized by the Society for Biological Engineering (SBE) and the American Institute of Chemical Engineers (AIChE); SBE is a technological community of the AIChE. Bob Adamson (Wyeth) and Chuck Goochee (Centocor) were co-chairs of the event, which had the theme "Delivering cost-effective, robust processes and methods quickly and efficiently." The first day focused on emerging disruptive technologies and cutting-edge analytical techniques. Day two featured presentations on accelerated cell culture process development, critical quality attributes, specifications and comparability, and high throughput protein formulation development. The final day was dedicated to discussion of technology options and new analysis methods provided by emerging disruptive technologies; functional interaction, integration and synergy in platform development; and rapid and economic purification process development.
The Contribution of Network Organization and Integration to the Development of Cognitive Control
Marek, Scott; Hwang, Kai; Foran, William; Hallquist, Michael N.; Luna, Beatriz
2015-01-01
Cognitive control, which continues to mature throughout adolescence, is supported by the ability for well-defined organized brain networks to flexibly integrate information. However, the development of intrinsic brain network organization and its relationship to observed improvements in cognitive control are not well understood. In the present study, we used resting state functional magnetic resonance imaging (RS-fMRI), graph theory, the antisaccade task, and rigorous head motion control to characterize and relate developmental changes in network organization, connectivity strength, and integration to inhibitory control development. Subjects were 192 10–26-y-olds who were imaged during 5 min of rest. In contrast to initial studies, our results indicate that network organization is stable throughout adolescence. However, cross-network integration, predominantly of the cingulo-opercular/salience network, increased with age. Importantly, this increased integration of the cingulo-opercular/salience network significantly moderated the robust effect of age on the latency to initiate a correct inhibitory control response. These results provide compelling evidence that the transition to adult-level inhibitory control is dependent upon the refinement and strengthening of integration between specialized networks. Our findings support a novel, two-stage model of neural development, in which networks stabilize prior to adolescence and subsequently increase their integration to support the cross-domain incorporation of information processing critical for mature cognitive control. PMID:26713863
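Cross-network integration of the kind measured in such graph-theoretic studies is commonly quantified with the participation coefficient. The toy graph and module labels below are hypothetical, and this is one standard formulation rather than necessarily the study's exact pipeline:

```python
import numpy as np

def participation_coefficient(adj, communities):
    """Participation coefficient P_i = 1 - sum_s (k_is / k_i)^2, where
    k_is is node i's degree into module s: P_i near 0 means links stay
    within one module; higher P_i means cross-network integration."""
    adj = np.asarray(adj, dtype=float)
    k = adj.sum(axis=1)
    p = np.ones(len(adj))
    for s in set(communities):
        mask = np.array(communities) == s
        k_s = adj[:, mask].sum(axis=1)
        p -= np.divide(k_s, k, out=np.zeros_like(k_s), where=k > 0) ** 2
    return p

# toy graph: two triangles {0,1,2} and {3,4,5} bridged by edge 0-3
adj = np.array([[0, 1, 1, 1, 0, 0],
                [1, 0, 1, 0, 0, 0],
                [1, 1, 0, 0, 0, 0],
                [1, 0, 0, 0, 1, 1],
                [0, 0, 0, 1, 0, 1],
                [0, 0, 0, 1, 1, 0]])
labels = [0, 0, 0, 1, 1, 1]
p = participation_coefficient(adj, labels)
print(p.round(2))
```

Only the two bridging nodes (0 and 3) score above zero, illustrating how the metric isolates nodes that integrate across modules.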
Scale-adaptive compressive tracking with feature integration
NASA Astrophysics Data System (ADS)
Liu, Wei; Li, Jicheng; Chen, Xiao; Li, Shuxin
2016-05-01
Numerous tracking-by-detection methods have been proposed for robust visual tracking, among which compressive tracking (CT) has obtained promising results. A scale-adaptive CT method based on multi-feature integration is presented to improve the robustness and accuracy of CT. We introduce a keypoint-based model to achieve accurate scale estimation, which additionally gives a prior location of the target. Furthermore, exploiting the high efficiency of a data-independent random projection matrix, multiple features are integrated into an effective appearance model to construct a naïve Bayes classifier. Finally, an adaptive update scheme is proposed to update the classifier conservatively. Experiments on various challenging sequences demonstrate substantial improvements by our proposed tracker over CT and other state-of-the-art trackers in terms of dealing with scale variation, abrupt motion, deformation, and illumination changes.
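The data-independent random projection underlying compressive tracking can be sketched with a sparse Achlioptas-style matrix that compresses a high-dimensional feature vector while roughly preserving its geometry. The dimensions and sparsity parameter are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_high, n_low, s = 10_000, 50, 3          # s controls the matrix sparsity

# Sparse projection entries: +sqrt(s) w.p. 1/(2s), -sqrt(s) w.p. 1/(2s),
# and 0 otherwise; the matrix is fixed once and never learned from data.
vals = rng.choice([np.sqrt(s), 0.0, -np.sqrt(s)],
                  size=(n_low, n_high),
                  p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)])
R = vals / np.sqrt(n_low)                 # scale so norms are preserved

x = rng.standard_normal(n_high)           # stand-in high-dim feature vector
z = R @ x                                 # compressed representation
print(z.shape, round(float(np.linalg.norm(z) / np.linalg.norm(x)), 2))
```

The printed norm ratio stays near 1, which is the distance-preservation property that lets a classifier work in the low-dimensional compressed space.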
NASA Astrophysics Data System (ADS)
Costantini, Mario; Malvarosa, Fabio; Minati, Federico
2010-03-01
Phase unwrapping and the integration of finite differences are key problems in several technical fields. In SAR interferometry and in differential and persistent scatterer interferometry, digital elevation models and displacement measurements can be obtained after unambiguously determining the phase values and reconstructing the mean velocities and elevations of the observed targets, which can be performed by integrating differential estimates of these quantities (finite differences between neighboring points). In this paper we propose a general formulation for robust and efficient integration of finite differences and phase unwrapping, which includes standard methods as sub-cases. The proposed approach allows more reliable and accurate solutions to be obtained by exploiting redundant differential estimates (not only between nearest neighboring points), multi-dimensional information (e.g. multi-temporal, multi-frequency, multi-baseline observations), or external data (e.g. GPS measurements). The proposed approach requires the solution of linear or quadratic programming problems, for which computationally efficient algorithms exist. Validation tests on real SAR data confirm the validity of the method, which has been integrated into our production chain and successfully used in massive production.
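Integrating redundant finite differences can be posed as an overdetermined linear system. The unweighted least-squares toy below (hypothetical node values and difference pairs) stands in for the paper's linear/quadratic programming formulation:

```python
import numpy as np

# Hypothetical node values and a redundant set of difference measurements
true = np.array([0.0, 1.0, 3.0, 2.5, 4.0])
pairs = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 2), (1, 3), (2, 4)]

rng = np.random.default_rng(2)
d = np.array([true[j] - true[i] for i, j in pairs])
d += 0.005 * rng.standard_normal(len(pairs))      # measurement noise

# Overdetermined system A x ≈ b, plus a gauge constraint x_0 = 0
# (differences determine values only up to an additive constant)
A = np.zeros((len(pairs) + 1, len(true)))
for row, (i, j) in enumerate(pairs):
    A[row, i], A[row, j] = -1.0, 1.0
A[-1, 0] = 1.0
b = np.append(d, 0.0)

x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(round(float(np.abs(x - true).max()), 3))    # small reconstruction error
```

The redundant pairs (beyond a spanning tree) are what average out the noise, which is the robustness benefit the abstract attributes to exploiting non-nearest-neighbor estimates.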
Sustainable infrastructure system modeling under uncertainties and dynamics
NASA Astrophysics Data System (ADS)
Huang, Yongxi
Infrastructure systems support human activities in transportation, communication, water use, and energy supply. The dissertation research focuses on critical transportation infrastructure and renewable energy infrastructure systems. The goal of the research efforts is to improve the sustainability of the infrastructure systems, with an emphasis on economic viability, system reliability and robustness, and environmental impacts. The research efforts in critical transportation infrastructure concern the development of strategic robust resource allocation strategies in an uncertain decision-making environment, considering both uncertain service availability and accessibility. The study explores the performances of different modeling approaches (i.e., deterministic, stochastic programming, and robust optimization) to reflect various risk preferences. The models are evaluated in a case study of Singapore, and results demonstrate that stochastic modeling methods in general offer more robust allocation strategies than deterministic approaches in achieving high coverage of critical infrastructures under risk. This general modeling framework can be applied to other emergency service applications, such as locating medical emergency services. The research on renewable energy infrastructure systems aims to answer the following key research questions: (1) is renewable energy an economically viable solution? (2) what are the energy distribution and infrastructure system requirements to support such energy supply systems while hedging against potential risks? (3) how does the energy system adapt to the dynamics of evolving technology and societal needs in the transition to a renewable-energy-based society? The study of Renewable Energy System Planning with Risk Management incorporates risk management into its strategic planning of the supply chains.
The physical design and operational management are integrated as a whole in seeking mitigations against the potential risks caused by feedstock seasonality and demand uncertainty. Facility spatiality, time variation of feedstock yields, and demand uncertainty are integrated into a two-stage stochastic programming (SP) framework. In the study of Transitional Energy System Modeling under Uncertainty, a multistage stochastic dynamic programming model is established to optimize the process of building and operating fuel production facilities during the transition. Dynamics due to evolving technologies and societal changes, and uncertainty due to demand fluctuations, are the major issues to be addressed.
A Practical Approach to Programmatic Assessment Design
ERIC Educational Resources Information Center
Timmerman, A. A.; Dijkstra, J.
2017-01-01
Assessment of complex tasks integrating several competencies calls for a programmatic design approach. As single instruments do not provide the information required to reach a robust judgment of integral performance, 73 guidelines for programmatic assessment design were developed. When simultaneously applying these interrelated guidelines, it is…
NASA Astrophysics Data System (ADS)
Sablik, Thomas; Velten, Jörg; Kummert, Anton
2015-03-01
A novel system for automatic privacy protection in digital media, based on spectral-domain watermarking and JPEG compression, is described in the present paper. In a first step, private areas are detected; a detection method is presented for this purpose. The implemented method uses Haar cascades to detect faces. Integral images are used to speed up the calculations and the detection. Multiple detections of one face are combined. Succeeding steps comprise embedding the data into the image as part of JPEG compression using spectral-domain methods and protecting the area of privacy. The embedding process is integrated into and adapted to JPEG compression. A spread-spectrum watermarking method is used to embed the size and position of the private areas into the cover image. Different embedding methods are compared with regard to their robustness. Moreover, the performance of the method on tampered images is presented.
Roca, Elisabet; Gamboa, Gonzalo; Tàbara, J David
2008-04-01
The complex and multidimensional nature of coastal erosion risks makes it necessary to move away from single-perspective assessment and management methods that have conventionally predominated in coastal management. This article explores the suitability of participatory multicriteria analysis (MCA) for improving the integration of diverse expertises and values and enhancing the social-ecological robustness of the processes that lead to the definition of relevant policy options to deal with those risks. We test this approach in the Mediterranean coastal locality of Lido de Sète in France. Results show that the more adaptive alternatives such as "retreating the shoreline" were preferred by our selected stakeholders to those corresponding to "protecting the shoreline" and the business as usual proposals traditionally put forward by experts and policymakers on these matters. Participative MCA contributed to represent coastal multidimensionality, elicit and integrate different views and preferences, facilitated knowledge exchange, and allowed highlighting existing uncertainties.
Biosensors-on-chip: a topical review
NASA Astrophysics Data System (ADS)
Chen, Sensen; Shamsi, Mohtashim H.
2017-08-01
This review will examine the integration of two fields that are currently at the forefront of science, i.e. biosensors and microfluidics. As a lab-on-a-chip (LOC) technology, microfluidics has been enriched by the integration of various detection tools for analyte detection and quantitation. The application of such microfluidic platforms is greatly increased in the area of biosensors geared towards point-of-care diagnostics. Together, the merger of microfluidics and biosensors has generated miniaturized devices for sample processing and sensitive detection with quantitation. We believe that microfluidic biosensors (biosensors-on-chip) are essential for developing robust and cost effective point-of-care diagnostics. This review is relevant to a variety of disciplines, such as medical science, clinical diagnostics, LOC technologies including MEMs/NEMs, and analytical science. Specifically, this review will appeal to scientists working in the two overlapping fields of biosensors and microfluidics, and will also help new scientists to find their directions in developing point-of-care devices.
Distributed wireless sensing for methane leak detection technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Levente; van Kesse, Theodor
Large-scale environmental monitoring requires dynamic optimization of data transmission, power management, and distribution of the computational load. In this work, we demonstrate the use of a wireless sensor network for the detection of chemical leaks on gas and oil well pads. The sensor network consists of chemi-resistive and wind sensors; it aggregates all the data and transmits them to the cloud for further analytics processing. The sensor network data are integrated with an inversion model to identify the leak location and quantify leak rates. We characterize the sensitivity and accuracy of the system under multiple well-controlled methane release experiments. It is demonstrated that even a 1-hour measurement with 10 sensors localizes leaks to within 1 m and determines the leak rate with an accuracy of 40%. This integrated sensing and analytics solution is currently being refined into a robust system for long-term remote monitoring of methane leaks, generation of alarms, and tracking of regulatory compliance.
A comprehensive strategy for designing a Web-based medical curriculum.
Zucker, J.; Chase, H.; Molholt, P.; Bean, C.; Kahn, R. M.
1996-01-01
In preparing for a full-featured online curriculum, it is necessary to develop scalable strategies for software design that will support the pedagogical goals of the curriculum and which will address the issues of acquisition and updating of materials, of robust content-based linking, and of integration of the online materials into other methods of learning. A complete online curriculum, as distinct from an individual computerized module, must provide dynamic updating of both content and structure and an easy pathway from the professor's notes to the finished online product. At the College of Physicians and Surgeons, we are developing such strategies, including a scripted text conversion process that uses the Hypertext Markup Language (HTML) as structural markup rather than as display markup, automated linking by the use of relational databases and the Unified Medical Language System (UMLS), and integration of text, images, and multimedia, along with interface designs which promote multiple contexts and collaborative study. PMID:8947624
Distributed wireless sensing for fugitive methane leak detection
Klein, Levente J.; van Kessel, Theodore; Nair, Dhruv; ...
2017-12-11
Large scale environmental monitoring requires dynamic optimization of data transmission, power management, and distribution of the computational load. In this work, we demonstrate the use of a wireless sensor network for detection of chemical leaks on gas and oil well pads. The sensor network consists of chemi-resistive and wind sensors; it aggregates all the data and transmits it to the cloud for further analytics processing. The sensor network data is integrated with an inversion model to identify leak locations and quantify leak rates. We characterize the sensitivity and accuracy of such a system under multiple well-controlled methane release experiments. It is demonstrated that even a 1-hour measurement with 10 sensors localizes leaks to within 1 m and determines leak rates with an accuracy of 40%. This integrated sensing and analytics solution is currently being refined into a robust system for long-term remote monitoring of methane leaks, generation of alarms, and tracking of regulatory compliance.
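The inversion step described above can be sketched as a grid search over candidate source locations combined with a one-dimensional least-squares fit of the leak rate. The forward model below is a hypothetical stand-in for a real atmospheric dispersion kernel, and all positions and rates are illustrative, not the experiment's values:

```python
import numpy as np

# Hypothetical forward model: sensor concentration ~ rate / distance^2
# (a stand-in for a real wind-driven dispersion kernel).
def forward(src, sensors):
    d2 = np.sum((sensors - src) ** 2, axis=1)
    return 1.0 / np.maximum(d2, 1e-6)

rng = np.random.default_rng(0)
sensors = rng.uniform(0, 20, size=(10, 2))    # 10 sensor positions (m)
true_src, true_rate = np.array([12.0, 7.0]), 3.0
obs = true_rate * forward(true_src, sensors)  # noiseless observations

# Grid search over candidate source locations; for each candidate, the
# best-fit rate is a 1-D least-squares solution q = <g, obs> / <g, g>.
best = None
for x in np.linspace(0, 20, 81):
    for y in np.linspace(0, 20, 81):
        g = forward(np.array([x, y]), sensors)
        q = g @ obs / (g @ g)
        resid = np.sum((obs - q * g) ** 2)
        if best is None or resid < best[0]:
            best = (resid, x, y, q)

_, xh, yh, qh = best
print(round(xh, 2), round(yh, 2), round(qh, 2))  # recovers source and rate
```

With noisy observations and a real dispersion model, the same structure applies; only the forward function and the residual weighting change.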
III-V quantum light source and cavity-QED on silicon.
Luxmoore, I J; Toro, R; Del Pozo-Zamudio, O; Wasley, N A; Chekhovich, E A; Sanchez, A M; Beanland, R; Fox, A M; Skolnick, M S; Liu, H Y; Tartakovskii, A I
2013-01-01
Non-classical light sources offer a myriad of possibilities in both fundamental science and commercial applications. Single photons are the most robust carriers of quantum information and can be exploited for linear optics quantum information processing. Scale-up requires miniaturisation of the waveguide circuit and multiple single-photon sources. Silicon photonics, driven by the incentive of optical interconnects, is a highly promising platform for the passive optical components, but integrated light sources are limited by silicon's indirect band-gap. III-V semiconductor quantum dots, on the other hand, are proven quantum emitters. Here we demonstrate single-photon emission from quantum dots coupled to photonic crystal nanocavities fabricated from III-V material grown directly on silicon substrates. The high quality of the III-V material and photonic structures is emphasized by observation of the strong-coupling regime. This work opens up the advantages of silicon photonics to the integration and scale-up of solid-state quantum optical systems.
Bürger, Raimund; Diehl, Stefan; Mejías, Camilo
2016-01-01
The main purpose of the recently introduced Bürger-Diehl simulation model for secondary settling tanks was to resolve spatial discretization problems when both hindered settling and the phenomena of compression and dispersion are included. Straightforward time integration unfortunately means long computational times. The next step in the development is to introduce and investigate time-integration methods for more efficient simulations, in which other aspects such as implementation complexity and robustness are equally considered. This is done for batch settling simulations. The key contributions are a new time-discretization method and its comparison with other specially tailored and standard methods. Several advantages and disadvantages of each method are given. One conclusion is that the new linearly implicit method is easier to implement than the semi-implicit method, but less efficient in two types of batch sedimentation tests.
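The stability motivation for (linearly) implicit time stepping can be illustrated on the scalar stiff test equation y' = -ky rather than the Bürger-Diehl PDE itself; the rate constant and step size below are illustrative:

```python
# Explicit Euler diverges on a stiff problem when k*dt > 2, while the
# linearly implicit (backward Euler) step stays stable at the same dt.
k, dt, steps = 50.0, 0.05, 40
y_exp = y_imp = 1.0
for _ in range(steps):
    y_exp = y_exp + dt * (-k * y_exp)   # explicit Euler: factor (1 - k*dt)
    y_imp = y_imp / (1.0 + dt * k)      # linearly implicit: factor 1/(1 + k*dt)
print(abs(y_exp) > 1e3, 0.0 < y_imp < 1e-3)  # diverged vs. decayed
```

For the settling model, the same trade-off appears between cheap-but-restricted explicit steps and implicit steps whose linear solves cost more per step but allow larger dt.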
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zheng; Ukida, H.; Ramuhalli, Pradeep
2010-06-05
Imaging- and vision-based techniques play an important role in industrial inspection. The sophistication of these techniques assures high-quality performance of the manufacturing process through precise positioning, online monitoring, and real-time classification. Advanced systems incorporating multiple imaging and/or vision modalities provide robust solutions to complex situations and problems in industrial applications. A diverse range of industries, including aerospace, automotive, electronics, pharmaceutical, biomedical, semiconductor, and food/beverage, have benefited from recent advances in multi-modal imaging, data fusion, and computer vision technologies. Many of the open problems in this context are in the general area of image analysis methodologies (preferably in an automated fashion). This editorial article introduces a special issue of this journal highlighting recent advances and demonstrating the successful applications of integrated imaging and vision technologies in industrial inspection.
Robust nano-fabrication of an integrated platform for spin control in a tunable microcavity
NASA Astrophysics Data System (ADS)
Bogdanović, Stefan; Liddy, Madelaine S. Z.; van Dam, Suzanne B.; Coenen, Lisanne C.; Fink, Thomas; Lončar, Marko; Hanson, Ronald
2017-12-01
Coupling nitrogen-vacancy (NV) centers in diamonds to optical cavities is a promising way to enhance the efficiency of diamond-based quantum networks. An essential aspect of the full toolbox required for the operation of these networks is the ability to achieve microwave control of the electron spin associated with this defect within the cavity framework. Here, we report on the fabrication of an integrated platform for the microwave control of an NV center electron spin in an open, tunable Fabry-Pérot microcavity. Measurements of the cavity's finesse reveal that the presented fabrication process does not compromise its optical properties. We provide a method to incorporate a thin diamond slab into the cavity architecture and demonstrate control of the NV center spin. These results show the promise of this design for future cavity-enhanced NV center spin-photon entanglement experiments.
Sharpening of Hierarchical Visual Feature Representations of Blurred Images.
Abdelhack, Mohamed; Kamitani, Yukiyasu
2018-01-01
The robustness of the visual system lies in its ability to perceive degraded images. This is achieved through interacting bottom-up, recurrent, and top-down pathways that process the visual input in concordance with stored prior information. The interaction mechanism by which they integrate visual input and prior information is still enigmatic. We present a new approach using deep neural network (DNN) representation to reveal the effects of such integration on degraded visual inputs. We transformed measured human brain activity resulting from viewing blurred images to the hierarchical representation space derived from a feedforward DNN. Transformed representations were found to veer toward the original nonblurred image and away from the blurred stimulus image. This indicated deblurring or sharpening in the neural representation, and possibly in our perception. We anticipate these results will help unravel the interplay mechanism between bottom-up, recurrent, and top-down pathways, leading to more comprehensive models of vision.
Robust authentication through stochastic femtosecond laser filament induced scattering surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Haisu; Tzortzakis, Stelios, E-mail: stzortz@iesl.forth.gr; Materials Science and Technology Department, University of Crete, 71003 Heraklion
2016-05-23
We demonstrate a reliable authentication method based on femtosecond laser filament induced scattering surfaces. The stochastic nature of the nonlinear laser fabrication results in unique, robust authentication properties. This work provides a simple and viable solution for practical applications in product authentication, while also opening the way for incorporating such elements in transparent media and coupling them into integrated optical circuits.
Deep Coupled Integration of CSAC and GNSS for Robust PNT.
Ma, Lin; You, Zheng; Li, Bin; Zhou, Bin; Han, Runqi
2015-09-11
Global navigation satellite systems (GNSS) are the most widely used positioning, navigation, and timing (PNT) technology. However, GNSS cannot provide effective PNT services under physical blockage, such as in natural canyons, urban canyons, underground, underwater, and indoors. With the development of micro-electromechanical system (MEMS) technology, the chip scale atomic clock (CSAC) is gradually maturing, and its performance is constantly improving. A deep coupled integration of CSAC and GNSS is explored in this paper to enhance PNT robustness. "Clock coasting" of the CSAC provides time synchronized with GNSS and optimizes the navigation equations. However, the error of clock coasting increases over time and can be corrected by GNSS time, which is stable but noisy. In this paper, a weighted linear optimal estimation algorithm is used for the CSAC-aided GNSS, while a Kalman filter is used for the GNSS-corrected CSAC. Simulations of the model are conducted, and field tests are carried out. The dilution of precision can be improved by the integration, which is also more accurate than traditional GNSS. When only three satellites are visible, the integration still works, whereas the traditional method fails. The deep coupled integration of CSAC and GNSS can improve the accuracy, reliability, and availability of PNT.
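The "GNSS-corrected CSAC" idea can be sketched with a scalar Kalman filter: the CSAC clock bias drifts slowly (stable but accumulating), while GNSS time is unbiased but noisy, and the filter fuses the two. All noise parameters below are illustrative, not the paper's values:

```python
import numpy as np

# Scalar Kalman filter fusing a slowly drifting clock with noisy GNSS time.
rng = np.random.default_rng(1)
n, q, r = 500, 1e-4, 1.0          # steps, drift variance, GNSS noise variance
truth = np.cumsum(rng.normal(0, np.sqrt(q), n))  # true clock bias (random walk)
gnss = truth + rng.normal(0, np.sqrt(r), n)      # noisy GNSS time measurements

x, p, est = 0.0, 1.0, []
for z in gnss:
    p += q                         # predict: drift adds uncertainty
    kgain = p / (p + r)            # Kalman gain
    x += kgain * (z - x)           # update with the GNSS measurement
    p *= (1 - kgain)
    est.append(x)

est = np.array(est)
rms_raw = np.sqrt(np.mean((gnss[100:] - truth[100:]) ** 2))
rms_kf = np.sqrt(np.mean((est[100:] - truth[100:]) ** 2))
print(rms_kf < rms_raw)   # filtered clock error beats raw GNSS noise
```

The steady-state gain is small because the clock drift variance q is far below the GNSS noise variance r, which is exactly the regime the abstract describes.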
Deep Coupled Integration of CSAC and GNSS for Robust PNT
Ma, Lin; You, Zheng; Li, Bin; Zhou, Bin; Han, Runqi
2015-01-01
Global navigation satellite systems (GNSS) are the most widely used positioning, navigation, and timing (PNT) technology. However, GNSS cannot provide effective PNT services under physical blockage, such as in natural canyons, urban canyons, underground, underwater, and indoors. With the development of micro-electromechanical system (MEMS) technology, the chip scale atomic clock (CSAC) is gradually maturing, and its performance is constantly improving. A deep coupled integration of CSAC and GNSS is explored in this paper to enhance PNT robustness. "Clock coasting" of the CSAC provides time synchronized with GNSS and optimizes the navigation equations. However, the error of clock coasting increases over time and can be corrected by GNSS time, which is stable but noisy. In this paper, a weighted linear optimal estimation algorithm is used for the CSAC-aided GNSS, while a Kalman filter is used for the GNSS-corrected CSAC. Simulations of the model are conducted, and field tests are carried out. The dilution of precision can be improved by the integration, which is also more accurate than traditional GNSS. When only three satellites are visible, the integration still works, whereas the traditional method fails. The deep coupled integration of CSAC and GNSS can improve the accuracy, reliability, and availability of PNT. PMID:26378542
Integration of solid-state nanopores in a 0.5 μm cmos foundry process
Uddin, A; Yemenicioglu, S; Chen, C-H; Corigliano, E; Milaninia, K; Theogarajan, L
2013-01-01
High-bandwidth and low-noise nanopore sensor and detection electronics are crucial in achieving single-DNA base resolution. A potential way to accomplish this goal is to integrate solid-state nanopores within a CMOS platform, in close proximity to the biasing electrodes and custom-designed amplifier electronics. Here we report the integration of solid-state nanopore devices in a commercial complementary metal-oxide semiconductor (CMOS) potentiostat chip implemented in On-Semiconductor’s 0.5 μm technology. Nanopore membranes incorporating electrodes are fabricated by post-CMOS micromachining utilizing the N+ polysilicon/SiO2/N+ polysilicon capacitor structure available in the aforementioned process. Nanopores are created in the CMOS process by drilling in a transmission electron microscope and shrinking by atomic layer deposition. We also describe a batch fabrication method to process a large of number of electrode-embedded nanopores with sub-10 nm diameter across CMOS-compatible wafers by electron beam lithography and atomic layer deposition. The CMOS-compatibility of our fabrication process is verified by testing the electrical functionality of on-chip circuitry. We observe high current leakage with the CMOS nanopore devices due to the ionic diffusion through the SiO2 membrane. To prevent this leakage, we coat the membrane with Al2O3 which acts as an efficient diffusion barrier against alkali ions. The resulting nanopore devices also exhibit higher robustness and lower 1/f noise as compared to SiO2 and SiNx. Furthermore, we propose a theoretical model for our low-capacitance CMOS nanopore devices, showing good agreement with the experimental value. In addition, experiments and theoretical models of translocation studies are presented using 48.5 kbp λ-DNA in order to prove the functionality of on-chip pores coated with Al2O3. PMID:23519330
The SeaView EarthCube project: Lessons Learned from Integrating Across Repositories
NASA Astrophysics Data System (ADS)
Diggs, S. C.; Stocks, K. I.; Arko, R. A.; Kinkade, D.; Shepherd, A.; Olson, C. J.; Pham, A.
2017-12-01
SeaView is an NSF-funded EarthCube Integrative Activity Project working with 5 existing data repositories* to provide oceanographers with highly integrated thematic data collections in user-requested formats. The project has three complementary goals: Supporting Scientists: SeaView targets scientists' need for easy access to data of interest that are ready to import into their preferred tool. Strengthening Repositories: By integrating data from multiple repositories for science use, SeaView is helping the ocean data repositories align their data and processes and make ocean data more accessible and easily integrated. Informing EarthCube (earthcube.org): SeaView's experience as an integration demonstration can inform the larger NSF EarthCube architecture and design effort. The challenges faced in this small-scale effort are informative to geosciences cyberinfrastructure more generally. Here we focus on the lessons learned that may inform other data facilities and integrative architecture projects. (The SeaView data collections will be presented at the Ocean Sciences 2018 meeting.) One example is the importance of shared semantics, with persistent identifiers, for key integration elements across the data sets (e.g., cruise, parameter, and project/program). These must allow for revision through time and should have an agreed authority or process for resolving conflicts: aligning identifiers and correcting errors were time-consuming and often required both deep domain knowledge and "back end" knowledge of the data facilities. Another example is the need for robust provenance, and for tools that support automated or semi-automated data transform pipelines that capture provenance. Multiple copies and versions of data are now flowing into repositories, and onward to long-term archives such as NOAA NCEI and umbrella portals such as DataONE.
Exact copies can be identified with hashes (for those with the skills), but it can be painfully difficult to understand the processing or format changes that differentiate versions. As more sensors are deployed, and data re-use increases, this will only become more challenging. We will discuss these, and additional lessons learned, as well as invite discussion and solutions from others doing similar work. * BCO-DMO, CCHDO, OBIS, OOI, R2R
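The "exact copies can be identified with hashes" point reduces to comparing digests: byte-identical files share a hash, while any processing or format change breaks the match. A minimal sketch with synthetic file contents:

```python
import hashlib

# Two byte-identical data records share a SHA-256 digest; a one-character
# processing/format change produces a completely different digest.
def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

a = b"CTD cast 42, station P16N\n"       # hypothetical data record
b_copy = b"CTD cast 42, station P16N\n"  # exact copy from another repository
c_edit = b"CTD cast 42, station P16S\n"  # one-character change

print(digest(a) == digest(b_copy), digest(a) == digest(c_edit))
```

This is why hashes catch exact duplicates but say nothing about *how* two near-identical versions differ; that still requires provenance records.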
Integrating macromolecular X-ray diffraction data with the graphical user interface iMosflm.
Powell, Harold R; Battye, T Geoff G; Kontogiannis, Luke; Johnson, Owen; Leslie, Andrew G W
2017-07-01
X-ray crystallography is the predominant source of structural information for biological macromolecules, providing fundamental insights into biological function. The availability of robust and user-friendly software to process the collected X-ray diffraction images makes the technique accessible to a wider range of scientists. iMosflm/MOSFLM (http://www.mrc-lmb.cam.ac.uk/harry/imosflm) is a software package designed to achieve this goal. The graphical user interface (GUI) version of MOSFLM (called iMosflm) is designed to guide inexperienced users through the steps of data integration, while retaining powerful features for more experienced users. Images from almost all commercially available X-ray detectors can be handled using this software. Although the program uses only 2D profile fitting, it can readily integrate data collected in the 'fine phi-slicing' mode (in which the rotation angle per image is less than the crystal mosaic spread by a factor of at least 2), which is commonly used with modern very fast readout detectors. The GUI provides real-time feedback on the success of the indexing step and the progress of data processing. This feedback includes the ability to monitor detector and crystal parameter refinement and to display the average spot shape in different regions of the detector. Data scaling and merging tasks can be initiated directly from the interface. Using this protocol, a data set of 360 images with ∼2,000 reflections per image can be processed in ∼4 min.
Integrating macromolecular X-ray diffraction data with the graphical user interface iMOSFLM
Powell, Harold R; Battye, T Geoff G; Kontogiannis, Luke; Johnson, Owen; Leslie, Andrew GW
2017-01-01
X-ray crystallography is the overwhelmingly dominant source of structural information for biological macromolecules, providing fundamental insights into biological function. Collection of X-ray diffraction data underlies the technique, and robust and user-friendly software to process the diffraction images makes the technique accessible to a wider range of scientists. iMosflm/MOSFLM (www.mrc-lmb.cam.ac.uk/harry/imosflm) is a software package designed to achieve this goal. The graphical user interface (GUI) version of MOSFLM (called iMosflm) is designed to guide inexperienced users through the steps of data integration, while retaining powerful features for more experienced users. Images from almost all commercially available X-ray detectors can be handled. Although the program only utilizes two-dimensional profile fitting, it can readily integrate data collected in “fine phi-slicing” mode (where the rotation angle per image is less than the crystal mosaic spread by a factor of at least 2) that is commonly employed with modern very fast readout detectors. The graphical user interface provides real-time feedback on the success of the indexing step and the progress of data processing. This feedback includes the ability to monitor detector and crystal parameter refinement and to display the average spot shape in different regions of the detector. Data scaling and merging tasks can be initiated directly from the interface. Using this protocol, a dataset of 360 images with ~2000 reflections per image can be processed in approximately four minutes. PMID:28569763
Kovalenko, Lyudmyla Y; Chaumon, Maximilien; Busch, Niko A
2012-07-01
Semantic processing of verbal and visual stimuli has been investigated in semantic violation or semantic priming paradigms in which a stimulus is either related or unrelated to a previously established semantic context. A hallmark of semantic priming is the N400 event-related potential (ERP), a deflection of the ERP that is more negative for semantically unrelated target stimuli. The majority of studies investigating the N400 and semantic integration have used verbal material (words or sentences), and standardized stimulus sets with norms for semantic relatedness have been published for verbal but not for visual material. However, semantic processing of visual objects (as opposed to words) is an important issue in research on visual cognition. In this study, we present a set of 800 pairs of semantically related and unrelated visual objects. The images were rated for semantic relatedness by a sample of 132 participants. Furthermore, we analyzed low-level image properties and matched the two semantic categories according to these features. An ERP study confirmed the suitability of this image set for evoking a robust N400 effect of semantic integration. Additionally, using a general linear modeling approach on single-trial data, we also demonstrate that low-level visual image properties and semantic relatedness are in fact only minimally overlapping. The image set is available for download from the authors' website. We expect that the image set will facilitate studies investigating mechanisms of semantic and contextual processing of visual stimuli.
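The single-trial general linear modeling idea can be sketched as an ordinary least-squares regression of a trial-wise amplitude on semantic relatedness while a low-level covariate is included in the design matrix. All data below are synthetic and the effect sizes are illustrative:

```python
import numpy as np

# Simulated N400-like single-trial amplitudes: a semantic-relatedness effect
# of 2.0 plus a small low-level covariate effect (e.g., mean luminance).
rng = np.random.default_rng(2)
n = 400
related = rng.integers(0, 2, n)        # 0 = unrelated pair, 1 = related pair
luminance = rng.normal(0, 1, n)        # matched low-level image property
amp = 2.0 * related + 0.1 * luminance + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), related, luminance])  # design matrix
beta, *_ = np.linalg.lstsq(X, amp, rcond=None)
print(beta[1])   # semantic effect, close to the simulated 2.0
```

Because both regressors sit in one model, the fitted semantic coefficient is estimated with the low-level property partialled out, which is the sense in which the two contributions can be shown to be "only minimally overlapping".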
On adaptive robustness approach to Anti-Jam signal processing
NASA Astrophysics Data System (ADS)
Poberezhskiy, Y. S.; Poberezhskiy, G. Y.
An effective approach to exploiting statistical differences between desired and jamming signals, named adaptive robustness, is proposed and analyzed in this paper. It combines conventional Bayesian, adaptive, and robust approaches that are complementary to each other. This combination strengthens the advantages and mitigates the drawbacks of the conventional approaches. Adaptive robustness is equally applicable to both jammers and their victim systems. The capabilities required for realization of adaptive robustness in jammers and victim systems are determined. The employment of a specific nonlinear robust algorithm for anti-jam (AJ) processing is described and analyzed. Its effectiveness in practical situations has been proven analytically and confirmed by simulation. Since adaptive robustness can be used by both sides in electronic warfare, it is more advantageous for the fastest and most intelligent side. Many results obtained and discussed in this paper are also applicable to commercial applications such as communications in unregulated or poorly regulated frequency ranges and systems with cognitive capabilities.
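As an illustration of a nonlinear robust element in AJ processing (not the paper's specific algorithm), a hard-limiting (clipping) nonlinearity applied before correlation makes estimation of a known reference waveform far less sensitive to strong impulsive jamming:

```python
import numpy as np

# Averaged over trials: a linear correlator's estimate of the signal gain is
# wrecked by sparse high-power impulses, while a clipped (robust) correlator
# stays near the true value of 1. All waveform parameters are illustrative.
rng = np.random.default_rng(3)
n, trials = 1000, 200
lin_err = rob_err = 0.0
for _ in range(trials):
    s = np.sign(rng.normal(size=n))                 # known +/-1 reference
    jam = np.zeros(n)
    jam[rng.choice(n, 50, replace=False)] = rng.normal(0, 100, 50)  # impulses
    r = s + rng.normal(0, 1, n) + jam               # received waveform
    lin_err += abs(r @ s / n - 1.0)                 # linear correlator error
    rob_err += abs(np.clip(r, -2, 2) @ s / n - 1.0) # clipped correlator error
print(rob_err < lin_err)
```

The clipping bound trades a small bias under clean conditions for bounded influence of any single jammed sample, which is the standard robustness argument.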
Bonilauri Ferreira, Ana Paula Ribeiro; Ferreira, Rodrigo Fernando; Rajgor, Dimple; Shah, Jatin; Menezes, Andrea; Pietrobon, Ricardo
2010-04-20
Little is known about the reasoning mechanisms used by physicians in decision-making and how this compares to diagnostic clinical practice guidelines. We explored the clinical reasoning process in a real life environment. This is a qualitative study evaluating transcriptions of sixteen physicians' reasoning during appointments with patients, clinical discussions between specialists, and personal interviews with physicians affiliated to a hospital in Brazil. Four main themes were identified: simple and robust heuristics, extensive use of social environment rationality, attempts to prove diagnostic and therapeutic hypotheses while refuting potential contradictions using a positive test strategy, and reaching the saturation point. Physicians constantly attempted to prove their initial hypothesis while trying to refute any contradictions. While social environment rationality was the main factor in the determination of all steps of the clinical reasoning process, factors such as referral letters and the number of contradictions associated with the initial hypothesis influenced physicians' confidence and the threshold to reach a final decision. Physicians rely on simple heuristics associated with environmental factors. This model allows for robustness, simplicity, and cognitive energy saving. Since this model does not fit into current diagnostic clinical practice guidelines, we make some propositions to help its integration.
NASA Astrophysics Data System (ADS)
Zhao, Z.-G.; Chen, H.-J.; Yang, Y.-Y.; He, L.
2015-09-01
For a hybrid car equipped with dual clutch transmission (DCT), the coordination control problems of clutches and power sources are investigated while taking full advantage of the integrated starter generator motor's fast response speed and high accuracy (speed and torque). First, a dynamic model of the shifting process is established, the vehicle acceleration is quantified according to the intentions of the driver, and the torque transmitted by clutches is calculated based on the designed disengaging principle during the torque phase. Next, a robust H∞ controller is designed to ensure speed synchronisation despite the existence of model uncertainties, measurement noise, and engine torque lag. The engine torque lag and measurement noise are used as external disturbances to initially modify the output torque of the power source. Additionally, during the torque switch phase, the torque of the power sources is smoothly transitioned to the driver's demanded torque. Finally, the torque of the power sources is further distributed based on the optimisation of system efficiency, and the throttle opening of the engine is constrained to avoid sharp torque variations. The simulation results verify that the proposed control strategies effectively address the problem of coordinating control of clutches and power sources, establishing a foundation for the application of DCT in hybrid cars.
Bonilauri Ferreira, Ana Paula Ribeiro; Ferreira, Rodrigo Fernando; Rajgor, Dimple; Shah, Jatin; Menezes, Andrea; Pietrobon, Ricardo
2010-01-01
Background Little is known about the reasoning mechanisms used by physicians in decision-making and how this compares to diagnostic clinical practice guidelines. We explored the clinical reasoning process in a real life environment. Method This is a qualitative study evaluating transcriptions of sixteen physicians' reasoning during appointments with patients, clinical discussions between specialists, and personal interviews with physicians affiliated to a hospital in Brazil. Results Four main themes were identified: simple and robust heuristics, extensive use of social environment rationality, attempts to prove diagnostic and therapeutic hypothesis while refuting potential contradictions using positive test strategy, and reaching the saturation point. Physicians constantly attempted to prove their initial hypothesis while trying to refute any contradictions. While social environment rationality was the main factor in the determination of all steps of the clinical reasoning process, factors such as referral letters and number of contradictions associated with the initial hypothesis had influence on physicians' confidence and determination of the threshold to reach a final decision. Discussion Physicians rely on simple heuristics associated with environmental factors. This model allows for robustness, simplicity, and cognitive energy saving. Since this model does not fit into current diagnostic clinical practice guidelines, we make some propositions to help its integration. PMID:20421920
A Robust Self-Alignment Method for Ship's Strapdown INS Under Mooring Conditions
Sun, Feng; Lan, Haiyu; Yu, Chunyang; El-Sheimy, Naser; Zhou, Guangtao; Cao, Tong; Liu, Hang
2013-01-01
Strapdown inertial navigation systems (INS) need an alignment process to determine the initial attitude matrix between the body frame and the navigation frame. The conventional alignment process is to compute the initial attitude matrix using the gravity and Earth rotation rate measurements. However, under mooring conditions, the inertial measurement unit (IMU) employed in a ship's strapdown INS often suffers from both intrinsic sensor noise components and external disturbance components caused by the motions of sea waves and wind waves, so a rapid and precise alignment of a ship's strapdown INS without any auxiliary information is hard to achieve. A robust solution to this problem is given in this paper. The inertial-frame-based alignment method is utilized to accommodate the mooring condition; most of the periodic low-frequency external disturbance components can be removed by the mathematical integration and averaging characteristic of this method. A novel prefilter, the hidden Markov model based Kalman filter (HMM-KF), is proposed to remove the relatively high-frequency error components. Unlike digital filters, the HMM-KF barely causes any time delay. The turntable, mooring, and sea experiments favorably validate the rapidity and accuracy of the proposed self-alignment method and the good de-noising performance of the HMM-KF. PMID:23799492
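The reason the inertial-frame method's integration/averaging suppresses periodic wave-induced disturbances can be shown numerically: averaging a sinusoid over whole periods tends to zero, while the constant gravity component survives. The sampling rate, wave frequency, and amplitudes below are illustrative:

```python
import numpy as np

# Time-average a specific-force measurement contaminated by a periodic
# sea-wave disturbance: over an integer number of wave periods the sinusoid
# averages out and the constant gravity component remains.
t = np.linspace(0, 60, 60001)             # 60 s at 1 kHz = 12 wave periods
g = 9.81                                   # constant gravity component (m/s^2)
wave = 0.5 * np.sin(2 * np.pi * 0.2 * t)   # 0.2 Hz wave-induced disturbance
meas = g + wave

avg = meas.mean()                          # averaging over whole periods
print(round(avg, 2))                       # recovers g, the wave cancels
```

Disturbances that are not periodic over the averaging window are what remain, which is why the HMM-KF prefilter is still needed for the higher-frequency error components.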
Panaceas, uncertainty, and the robust control framework in sustainability science
Anderies, John M.; Rodriguez, Armando A.; Janssen, Marco A.; Cifdaloz, Oguzhan
2007-01-01
A critical challenge faced by sustainability science is to develop strategies to cope with highly uncertain social and ecological dynamics. This article explores the use of the robust control framework toward this end. After briefly outlining the robust control framework, we apply it to the traditional Gordon–Schaefer fishery model to explore fundamental performance–robustness and robustness–vulnerability trade-offs in natural resource management. We find that the classic optimal control policy can be very sensitive to parametric uncertainty. By exploring a large class of alternative strategies, we show that there are no panaceas: even mild robustness properties are difficult to achieve, and increasing robustness to some parameters (e.g., biological parameters) results in decreased robustness with respect to others (e.g., economic parameters). On the basis of this example, we extract some broader themes for better management of resources under uncertainty and for sustainability science in general. Specifically, we focus attention on the importance of a continual learning process and the use of robust control to inform this process. PMID:17881574
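The sensitivity of the classic optimal policy to parametric uncertainty can be sketched with the Gordon-Schaefer dynamics x' = r x (1 - x/K) - q E x: fix the effort E at the level that is optimal for an assumed growth rate, then let the true growth rate be lower. Parameter values below are illustrative, not the paper's:

```python
# Equilibrium stock under constant effort chosen for a nominal growth rate
# r_nom, when the true growth rate r_true may differ (normalized K = q = 1).
def equilibrium_stock(r_true, r_nom, K=1.0, q=1.0):
    E = r_nom / (2 * q)            # "optimal" effort for the nominal model
    x, dt = K, 0.01
    for _ in range(20000):         # integrate the ODE to steady state
        x += dt * (r_true * x * (1 - x / K) - q * E * x)
    return x

x_nominal = equilibrium_stock(1.0, 1.0)  # model correct: stock settles at K/2
x_actual = equilibrium_stock(0.6, 1.0)   # true growth 40% lower than assumed
print(round(x_nominal, 2), round(x_actual, 2))
```

A 40% error in the biological parameter drives the equilibrium stock from K/2 down to K/6, illustrating why a policy tuned to the nominal model can perform poorly, and why the robustness-vulnerability trade-offs in the abstract matter.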
Compensating Unknown Time-Varying Delay in Opto-Electronic Platform Tracking Servo System.
Xie, Ruihong; Zhang, Tao; Li, Jiaquan; Dai, Ming
2017-05-09
This paper investigates the problem of compensating miss-distance delay in an opto-electronic platform tracking servo system. According to the characteristics of LOS (line-of-sight) motion, we set up a Markovian process model and compensate this unknown time-varying delay with a feed-forward forecasting controller based on robust H∞ control. Finally, simulation of a double closed-loop PI (proportion integration) control system indicates that the proposed method is effective for compensating unknown time-varying delay. Tracking experiments on the opto-electronic platform show an RMS (root-mean-square) error of 1.253 mrad when tracking a 10°, 0.2 Hz signal.
Telescope Multi-Field Wavefront Control with a Kalman Filter
NASA Technical Reports Server (NTRS)
Lou, John Z.; Redding, David; Sigrist, Norbert; Basinger, Scott
2008-01-01
An effective multi-field wavefront control (WFC) approach is demonstrated for an actuated, segmented space telescope using wavefront measurements at the exit pupil, and the optical and computational implications of this approach are discussed. The integration of a Kalman Filter as an optical state estimator into the wavefront control process to further improve the robustness of the optical alignment of the telescope will also be discussed. Through a comparison of WFC performances between on-orbit and ground-test optical system configurations, the connection (and a possible disconnection) between WFC and optical system alignment under these circumstances are analyzed. Our MACOS-based computer simulation results will be presented and discussed.
RFID identity theft and countermeasures
NASA Astrophysics Data System (ADS)
Herrigel, Alexander; Zhao, Jian
2006-02-01
This paper reviews the ICAO security architecture for biometric passports. An attack enabling RFID identity theft for later misuse is presented. Specific countermeasures against this attack are described. Furthermore, it is shown that robust high-capacity digital watermarking for the embedding and retrieving of binary digital signature data can be applied as an effective means against RFID identity theft. This approach requires only minimal modifications of the passport manufacturing process and is an enhancement of already proposed solutions. It may also be applied in combination with an RFID chip as a backup solution (e.g., for a damaged RFID chip) to verify, with asymmetric cryptographic techniques, the authenticity and integrity of the passport data.
An Automated Mouse Tail Vascular Access System by Vision and Pressure Feedback.
Chang, Yen-Chi; Berry-Pusey, Brittany; Yasin, Rashid; Vu, Nam; Maraglia, Brandon; Chatziioannou, Arion X; Tsao, Tsu-Chin
2015-08-01
This paper develops an automated vascular access system (A-VAS) with novel vision-based vein and needle detection methods and real-time pressure feedback for murine drug delivery. Mouse tail vein injection is a routine but critical step for preclinical imaging applications. Due to the small vein diameter and external disturbances such as tail hair, pigmentation, and scales, identifying vein location is difficult, and manual injections usually result in poor repeatability. To improve injection accuracy, consistency, safety, and processing time, A-VAS was developed to overcome difficulties in noise rejection for vein detection, robust needle tracking, and integration of visual servoing with the mechatronics system.
Audiovisual Asynchrony Detection in Human Speech
ERIC Educational Resources Information Center
Maier, Joost X.; Di Luca, Massimiliano; Noppeney, Uta
2011-01-01
Combining information from the visual and auditory senses can greatly enhance intelligibility of natural speech. Integration of audiovisual speech signals is robust even when temporal offsets are present between the component signals. In the present study, we characterized the temporal integration window for speech and nonspeech stimuli with…
NASA Technical Reports Server (NTRS)
Williams-Byrd, Julie; Arney, Dale C.; Hay, Jason; Reeves, John D.; Craig, Douglas
2016-01-01
NASA is transforming human spaceflight. The Agency is shifting from an exploration-based program with human activities in low Earth orbit (LEO) and targeted robotic missions in deep space to a more sustainable and integrated pioneering approach. Through pioneering, NASA seeks to address national goals to develop the capacity for people to work, learn, operate, live, and thrive safely beyond Earth for extended periods of time. However, pioneering space involves daunting technical challenges of transportation, maintaining health, and enabling crew productivity for long durations in remote, hostile, and alien environments. Prudent investments in capability and technology developments, based on mission need, are critical for enabling a campaign of human exploration missions. There are a wide variety of capabilities and technologies that could enable these missions, so it is a major challenge for NASA's Human Exploration and Operations Mission Directorate (HEOMD) to make knowledgeable portfolio decisions. It is critical for this pioneering initiative that these investment decisions are informed by a prioritization process that is robust and defensible. It is NASA's role to invest in targeted technologies and capabilities that would enable exploration missions even though specific requirements have not been identified. To inform these investment decisions, NASA's HEOMD has supported a variety of analysis activities that prioritize capabilities and technologies. These activities are often based on input from subject matter experts within the NASA community who understand the technical challenges of enabling human exploration missions. This paper will review a variety of processes and methods that NASA has used to prioritize and rank capabilities and technologies applicable to human space exploration. The paper will show the similarities in the various processes and showcase instances where customer-specified priorities forced modifications to the process.
Specifically, this paper will describe the processes that the NASA Langley Research Center (LaRC) Technology Assessment and Integration Team (TAIT) has used for several years and how those processes have been customized to meet customer needs while staying robust and defensible. This paper will show how HEOMD uses these analysis results to assist with making informed portfolio investment decisions. The paper will also highlight which human exploration capabilities and technologies typically rank high regardless of the specific design reference mission. The paper will conclude by describing future capability and technology ranking activities that will continue to leverage subject matter expert (SME) input while also incorporating more model-based analysis.
Waters, Katrina M.; Liu, Tao; Quesenberry, Ryan D.; Willse, Alan R.; Bandyopadhyay, Somnath; Kathmann, Loel E.; Weber, Thomas J.; Smith, Richard D.; Wiley, H. Steven; Thrall, Brian D.
2012-01-01
To understand how integration of multiple data types can help decipher cellular responses at the systems level, we analyzed the mitogenic response of human mammary epithelial cells to epidermal growth factor (EGF) using whole genome microarrays, mass spectrometry-based proteomics and large-scale western blots with over 1000 antibodies. A time course analysis revealed significant differences in the expression of 3172 genes and 596 proteins, including protein phosphorylation changes measured by western blot. Integration of these disparate data types showed that each contributed qualitatively different components to the observed cell response to EGF and that varying degrees of concordance in gene expression and protein abundance measurements could be linked to specific biological processes. Networks inferred from individual data types were relatively limited, whereas networks derived from the integrated data recapitulated the known major cellular responses to EGF and exhibited more highly connected signaling nodes than networks derived from any individual dataset. While cell cycle regulatory pathways were altered as anticipated, we found the most robust response to mitogenic concentrations of EGF was induction of matrix metalloprotease cascades, highlighting the importance of the EGFR system as a regulator of the extracellular environment. These results demonstrate the value of integrating multiple levels of biological information to more accurately reconstruct networks of cellular response. PMID:22479638
NOx Sensor for Direct Injection Emission Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Betteridge, William J
2006-02-28
The Electricore/Delphi team continues to leverage the electrochemical planar sensor technology that has produced stoichiometric planar and wide range oxygen sensors as the basis for development of a NOx sensor. Zirconia cell technology with an integrated heater will provide the foundation for the sensor structure. Proven materials and packaging technology will help to ensure a cost-effective approach to the manufacture of this sensor. The electronics technique and interface are considered to be an area where new strategies need to be employed to produce higher S/N ratios of the NOx signal, with emphasis on signal stability over time for robustness and durability. Both continuous-mode and pulse-mode control techniques are being evaluated. Packaging the electronics requires careful design and circuit partitioning so that only the necessary signal conditioning electronics are coupled directly in the wiring harness, while the remainder is situated within the ECM for durability and cost reasons. This task continues to be on hold because the definition of the interface electronics was unavailable until very late in the project. The sense element is based on the amperometric method utilizing integrated alumina and zirconia ceramics. Precious metal electrodes are used to form the integrated heater, the cell electrodes, and leads. Inside the actual sense cell structure, it is first necessary to separate NOx from the remaining oxygen constituents of the exhaust, without reducing the NOx. Once separated, the NOx will be measured using a measurement cell. Development and test coupons have been used to facilitate material selection and refinement, as well as cell, diffusion barrier, and chamber development. The sense element currently requires elaborate interconnections. To facilitate a robust, durable connection, mechanical and metallurgical connections are under investigation.
Materials and process refinements continue to play an important role in the development of the sensor.
On Integration and Validation of a Very Low Complexity ATC UWB System for Muscle Force Transmission.
Sapienza, Stefano; Crepaldi, Marco; Motto Ros, Paolo; Bonanno, Alberto; Demarchi, Danilo
2016-04-01
The thresholding of Surface ElectroMyoGraphic (sEMG) signals, i.e., the Average Threshold Crossing (ATC) technique, reduces the amount of data to be processed, enabling circuit complexity reduction and low power consumption. This paper investigates the lowest level of complexity reachable by an ATC system through measurements and in-vivo experiments with an embedded prototype for wireless force transmission, based on asynchronous Impulse-Radio Ultra Wide Band (IR-UWB). The prototype is composed of the acquisition unit, a wearable 23 × 34 mm PCB, which includes a full custom IC integrating a UWB transmitter (chip active silicon area 0.016 mm(2), 1 mW power consumption), and the receiver. The system is completely asynchronous: it acquires a differential sEMG signal, generates the ATC events, and triggers a 3.3 GHz IR-UWB transmission. ATC robustness relaxes filter constraints: two passive first-order filters have been implemented, with bandwidth from 10 Hz up to 1 kHz. The energy needed for a single pulse generation is 30 pJ, while the whole PCB consumes 5.65 mW. The pulses radiated by the acquisition unit TX are received by a short-range, low-complexity, threshold-based 130 nm CMOS IR-UWB receiver with an Ultra Low Power (ULP) baseband unit capable of robustly receiving generic quasi-digital pulse sequences. The acquisition unit has been tested with 10 series of in vivo isometric and isotonic contractions, while the transmission channel was characterized with over-the-air and cable measurements obtained with a pair of planar monopole antennas and an integrated 0.004 mm(2) transmitter, the same used for the acquisition unit, under realistic channel conditions. The entire system, acquisition unit and receiver, consumes 15.49 mW.
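The core ATC data reduction, counting threshold crossings of the sEMG signal instead of transmitting raw samples, can be sketched as follows. The signal and threshold below are synthetic, not the authors' firmware:

```python
import numpy as np

def atc_events(semg, threshold):
    """Count Average Threshold Crossing (ATC) events: the number of
    upward crossings of the threshold. In an ATC system each event
    would trigger a single IR-UWB pulse, so the data rate collapses
    from raw samples to a sparse event stream."""
    above = semg >= threshold
    # A rising edge is a sample below the threshold followed by one at/above it
    return int(np.count_nonzero(~above[:-1] & above[1:]))
```

For example, a 5 Hz sinusoid sampled over one second produces five upward crossings of a mid-amplitude threshold.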
NASA Astrophysics Data System (ADS)
Ardanuy, Philip; Bensman, Ed; Bergen, Bill; Chen, Bob; Griffith, Frank; Sutton, Cary; Hood, Carroll; Ritchie, Adrian; Tarro, Andre
2006-08-01
This paper considers an evolved technique for significantly enhanced enterprise-level data processing, reprocessing, archival, dissemination, and utilization. There is today a robust working paradigm established with the Advanced Weather Interactive Processing System (AWIPS), NOAA/NWS's information integration and fusion capability. This process model extends vertically, and seamlessly, from environmental sensing through the direct delivery of societal benefit. NWS, via AWIPS, is the primary source of weather forecast and warning information in the nation. AWIPS is the tested and proven "nerve center of operations" at all 122 NWS Weather Forecast Offices (WFOs) and 13 River Forecast Centers (RFCs). However, there are additional line organizations whose roles in satisfying NOAA's five mission goals (ecosystems, climate, weather & water, commerce & transportation, and mission support) in multiple program areas might be facilitated through AWIPS-like functionalities, including the National Marine Fisheries Service (NMFS); National Environmental Satellite, Data, and Information Service (NESDIS); Office of Oceanic & Atmospheric Research (OAR); and the National Ocean Service (NOS). In addition to NOAA's mission goals, there are nine diverse, recommended, and important societal benefit areas in the US Integrated Earth Observation System (IEOS). This paper shows how the satisfaction of this suite of goals and benefit areas can be optimized by leveraging several key ingredients: (1) the evolution of AWIPS towards a net-centric system of services concept of operations; (2) infusion of technologies and concepts from pathfinder systems; (3) the development of new observing systems targeted at deliberate, and not just serendipitous, societal benefit; and (4) the diverse, nested local, regional, national, and international scales of the different benefits and goal areas, and their interoperability and interplay across the system of systems.
Robustness surfaces of complex networks
Manzano, Marc; Sahneh, Faryad; Scoglio, Caterina; Calle, Eusebi; Marzo, Jose Luis
2014-01-01
Although the robustness of complex networks has been extensively studied in the last decade, a unifying framework able to embrace all the proposed metrics is still lacking. In the literature there are two open issues related to this gap: (a) how to dimension several metrics to allow their summation and (b) how to weight each of the metrics. In this work we propose a solution to both problems by defining the R*-value and introducing the concept of robustness surface (Ω). The rationale of our proposal is to make use of Principal Component Analysis (PCA). We firstly adjust the initial robustness of a network to 1. Secondly, we find the most informative robustness metric under a specific failure scenario. Then, we repeat the process for several percentages of failures and different realizations of the failure process. Lastly, we join these values to form the robustness surface, which allows the visual assessment of network robustness variability. Results show that a network presents different robustness surfaces (i.e., dissimilar shapes) depending on the failure scenario and the set of metrics. In addition, the robustness surface allows the robustness of different networks to be compared. PMID:25178402
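A minimal sketch of the surface construction, using synthetic metric data and plain SVD-based PCA in place of the authors' exact R*-value computation:

```python
import numpy as np

def robustness_surface(realizations):
    """Sketch of a robustness surface: each element of `realizations`
    is a (failure_levels x metrics) array of robustness metrics for one
    realization of the failure process, normalized so the intact network
    scores 1. The first principal component (via SVD) plays the role of
    the most informative metric combination; projecting each failure
    level onto it gives one row of the surface."""
    surface = []
    for X in realizations:
        Xc = X - X.mean(axis=0)                    # center the metrics
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        pc1 = Vt[0]                                # dominant direction
        surface.append(X @ pc1)                    # one value per failure level
    return np.vstack(surface)                      # realizations x failure levels
```

Plotting the resulting matrix as a surface over (realization, failure percentage) gives the visual variability assessment the abstract describes.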
Liu, Wan-Ting; Wang, Yang; Zhang, Jing; Ye, Fei; Huang, Xiao-Hui; Li, Bin; He, Qing-Yu
2018-07-01
Lung adenocarcinoma (LAC) is the most lethal cancer and the leading cause of cancer-related death worldwide. The identification of meaningful clusters of co-expressed genes or representative biomarkers may help improve the accuracy of LAC diagnoses. Public databases, such as the Gene Expression Omnibus (GEO), provide rich resources of valuable information for clinics; however, integrating multiple microarray datasets from various platforms and institutes remains a challenge. To determine potential indicators of LAC, we performed genome-wide relative significance (GWRS), genome-wide global significance (GWGS) and support vector machine (SVM) analyses progressively to identify robust gene biomarker signatures from 5 different microarray datasets that included 330 samples. The top 200 genes with robust signatures were selected for integrative analysis according to "guilt-by-association" methods, including protein-protein interaction (PPI) analysis and gene co-expression analysis. Of these 200 genes, only 10 genes showed both an intensive PPI network and high gene co-expression correlation (r > 0.8). IPA analysis of these regulatory networks suggested that the cell cycle process is a crucial determinant of LAC. CENPA, as well as two linked hub genes, CDK1 and CDC20, are determined to be potential indicators of LAC. Immunohistochemical staining showed that CENPA, CDK1 and CDC20 were highly expressed in LAC cancer tissue with co-expression patterns. A Cox regression model indicated that LAC patients with CENPA + /CDK1 + and CENPA + /CDC20 + were high-risk groups in terms of overall survival. In conclusion, our integrated microarray analysis demonstrated that CENPA, CDK1 and CDC20 might serve as a novel cluster of prognostic biomarkers for LAC, and the cooperative unit of three genes provides a technically simple approach for identification of LAC patients. Copyright © 2018 Elsevier B.V. All rights reserved.
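The co-expression criterion above (r > 0.8) can be illustrated with a simple Pearson-correlation filter; the expression matrix below is synthetic, and the gene names are used only as labels:

```python
import numpy as np

def coexpressed_pairs(expr, gene_names, r_min=0.8):
    """Return gene pairs whose expression profiles correlate above r_min.
    expr: (genes x samples) matrix, one row per gene; np.corrcoef treats
    rows as variables, giving the pairwise Pearson correlation matrix."""
    r = np.corrcoef(expr)
    pairs = []
    for i in range(len(gene_names)):
        for j in range(i + 1, len(gene_names)):
            if r[i, j] > r_min:
                pairs.append((gene_names[i], gene_names[j]))
    return pairs
```

In the real analysis this filter would be combined with the PPI network evidence before nominating hub genes.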
Performance Optimization Control of ECH using Fuzzy Inference Application
NASA Astrophysics Data System (ADS)
Dubey, Abhay Kumar
Electro-chemical honing (ECH) is a hybrid electrolytic precision micro-finishing technology that, by combining the physico-chemical actions of electro-chemical machining and conventional honing processes, provides controlled functional surface generation and fast material removal capabilities in a single operation. Process multi-performance optimization has become vital for utilizing the full potential of manufacturing processes to meet the challenging requirements being placed on the surface quality, size, tolerances and production rate of engineering components in this globally competitive scenario. This paper presents a strategy that integrates Taguchi matrix experimental design, analysis of variance and a fuzzy inference system (FIS) to formulate a robust, practical multi-performance optimization methodology for complex manufacturing processes like ECH, which involve several control variables. Two methodologies, one using a genetic algorithm tuning of FIS (GA-tuned FIS) and another using an adaptive network based fuzzy inference system (ANFIS), have been evaluated for a multi-performance optimization case study of ECH. The actual experimental results confirm their potential for a wide range of machining conditions employed in ECH.
Guo, Xin-E; Zhao, Yu-Bin; Xie, Yan-Ming; Zhao, Li-Cai; Li, Yan-Feng; Hao, Zhe
2013-09-01
To establish a nurse-based post-marketing safety surveillance model for traditional Chinese medicine injections (TCMIs), a TCMI safety monitoring team and a research hospital team engaged in the research, monitoring processes, and quality control processes were established, in order to achieve comprehensive, timely, accurate and real-time access to research data and to eliminate errors in data collection. A triage system involving a study nurse as the first point of contact, clinicians and clinical pharmacists was set up in a TCM hospital. Following the specified workflow, which involved labeling TCM injections and using improved monitoring forms, it was found that there were no missing reports and the error ratio was zero. A research nurse serving as the first and main point of contact in post-marketing safety monitoring of TCM, as part of a triage model, ensures that the research data collected are authentic, accurate, timely, and complete, and eliminates errors during the process of data collection. Hospital-based monitoring is a robust and operable process.
Individual Differences in Base Rate Neglect: A Fuzzy Processing Preference Index
Wolfe, Christopher R.; Fisher, Christopher R.
2013-01-01
Little is known about individual differences in integrating numeric base-rates and qualitative text in making probability judgments. Fuzzy-Trace Theory predicts a preference for fuzzy processing. We conducted six studies to develop the FPPI, a reliable and valid instrument assessing individual differences in this fuzzy processing preference. It consists of 19 probability estimation items plus 4 "M-Scale" items that distinguish simple pattern matching from "base rate respect." Cronbach's Alpha was consistently above 0.90. Validity is suggested by significant correlations between FPPI scores and three other measures: "Rule Based" Process Dissociation Procedure scores; the number of conjunction fallacies in joint probability estimation; and logic index scores on syllogistic reasoning. Replicating norms collected in a university study with a web-based study produced negligible differences in FPPI scores, indicating robustness. The predicted relationships between individual differences in base rate respect and both conjunction fallacies and syllogistic reasoning were partially replicated in two web-based studies. PMID:23935255
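Cronbach's alpha, the reliability statistic reported above, is straightforward to compute from an item-score matrix; the scores below are synthetic, not FPPI data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score),
    where k is the number of items. Values above 0.9 indicate high
    internal consistency."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)      # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

A sanity check: perfectly redundant items yield alpha = 1, the theoretical maximum.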
Integrated Low-Rank-Based Discriminative Feature Learning for Recognition.
Zhou, Pan; Lin, Zhouchen; Zhang, Chao
2016-05-01
Feature learning plays a central role in pattern recognition. In recent years, many representation-based feature learning methods have been proposed and have achieved great success in many applications. However, these methods perform feature learning and subsequent classification in two separate steps, which may not be optimal for recognition tasks. In this paper, we present a supervised low-rank-based approach for learning discriminative features. By integrating latent low-rank representation (LatLRR) with a ridge regression-based classifier, our approach combines feature learning with classification, so that the regulated classification error is minimized. In this way, the extracted features are more discriminative for the recognition tasks. Our approach benefits from a recent discovery on the closed-form solutions to noiseless LatLRR. When there is noise, a robust Principal Component Analysis (PCA)-based denoising step can be added as preprocessing. When the scale of a problem is large, we utilize a fast randomized algorithm to speed up the computation of robust PCA. Extensive experimental results demonstrate the effectiveness and robustness of our method.
Variable Structure PID Control to Prevent Integrator Windup
NASA Technical Reports Server (NTRS)
Hall, C. E.; Hodel, A. S.; Hung, J. Y.
1999-01-01
PID controllers are frequently used to control systems requiring zero steady-state error while maintaining requirements for settling time and robustness (gain/phase margins). PID controllers suffer significant loss of performance due to short-term integrator wind-up when used in systems with actuator saturation. We examine several existing and proposed methods for the prevention of integrator wind-up in both continuous and discrete time implementations.
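One common anti-windup scheme, conditional integration, can be sketched as follows. The paper examines several existing and proposed methods; this is only one illustrative variant, with arbitrary gains and limits:

```python
def make_pid(kp, ki, kd, dt, u_min, u_max):
    """Discrete PID controller with conditional-integration anti-windup:
    the integrator is frozen while the actuator is saturated, unless the
    error would drive the output back toward the linear range."""
    state = {"i": 0.0, "e_prev": 0.0}

    def step(e):
        d = (e - state["e_prev"]) / dt            # derivative of error
        state["e_prev"] = e
        u = kp * e + ki * state["i"] + kd * d     # unsaturated output
        u_sat = min(max(u, u_min), u_max)         # actuator saturation
        # Integrate only when unsaturated, or when the error opposes the
        # saturated output (i.e. it would unwind the controller).
        if u == u_sat or e * u < 0:
            state["i"] += e * dt
        return u_sat

    return step
```

Without the conditional check, a sustained large error would keep charging the integrator during saturation, causing the short-term overshoot ("wind-up") the abstract describes.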
Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M
2016-09-01
Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness. Up-scaling can be challenging, as minor changes in parameters can lead to varying quality results. The objective was to select critical process parameters (CPP) using retrospective data for a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. A retrospective analysis was performed on data from 36 commercial batches. Batches were selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine critical process parameters in order to propose new working ranges. This study confirms that it is possible to determine the critical process parameters and create design spaces based on retrospective data from commercial batches. This type of analysis thus becomes a tool to optimize the robustness of existing processes. Our results show that a design space can be established with minimum investment in experiments, since current commercial batch data are processed statistically.
USDA-ARS?s Scientific Manuscript database
In agricultural settings, examples of effective control strategies using repellent chemicals in integrated pest management (IPM) are relatively scarce compared to those using attractants. This may be partly due to a poor understanding of how repellents affect insect behavior once they are deployed. ...
Millimeter And Submillimeter-Wave Integrated Circuits On Quartz
NASA Technical Reports Server (NTRS)
Mehdi, Imran; Mazed, Mohammad; Siegel, Peter; Smith, R. Peter
1995-01-01
Proposed Quartz substrate Upside-down Integrated Device (QUID) relies on UV-curable adhesive to bond semiconductor with quartz. Integrated circuits including planar GaAs Schottky diodes and passive circuit elements (such as bandpass filters) fabricated on quartz substrates. Circuits designed to operate as mixers in waveguide circuit at millimeter and submillimeter wavelengths. Integrated circuits mechanically more robust, larger, and easier to handle than planar Schottky diode chips. Quartz substrate more suitable for waveguide circuits than GaAs substrate.
Satellites, tweets, forecasts: the future of flood disaster management?
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Kalas, Milan; Lorini, Valerio; Wania, Annett; Pappenberger, Florian; Salamon, Peter; Ramos, Maria Helena; Cloke, Hannah; Castillo, Carlos
2017-04-01
Floods have devastating effects on lives and livelihoods around the world. Structural flood defence measures such as dikes and dams can help protect people. However, it is the emerging science and technologies for flood disaster management and preparedness, such as increasingly accurate flood forecasting systems, high-resolution satellite monitoring, rapid risk mapping, and the unique strength of social media information and crowdsourcing, that are most promising for reducing the impacts of flooding. Here, we describe an innovative framework which integrates in real-time two components of the Copernicus Emergency mapping services, namely the European Flood Awareness System and the satellite-based Rapid Mapping, with new procedures for rapid risk assessment and social media and news monitoring. The integrated framework enables improved flood impact forecast, thanks to the real-time integration of forecasting and monitoring components, and increases the timeliness and efficiency of satellite mapping, with the aim of capturing flood peaks and following the evolution of flooding processes. Thanks to the proposed framework, emergency responders will have access to a broad range of timely and accurate information for more effective and robust planning, decision-making, and resource allocation.
Du, Xuemin; Wang, Juan; Cui, Huanqing; Zhao, Qilong; Chen, Hongxu; He, Le; Wang, Yunlong
2017-11-01
Surfaces patterned with hydrophilic and hydrophobic regions provide robust and versatile means for investigating the wetting behaviors of liquids, surface properties analysis, and producing patterned arrays. However, the fabrication of integral and uniform arrays onto these open systems remains a challenge, thus restricting them from being used in practical applications. Here, we present a simple yet powerful approach for the fabrication of water droplet arrays and the assembly of photonic crystal bead arrays based on hydrophilic-hydrophobic patterned substrates. Various integral arrays are simply prepared in a high-quality output with a low cost, large scale, and uniform size control. By simply taking a breath, which brings moisture to the substrate surface, complex hydrophilic-hydrophobic outlined images can be revisualized in the discontinuous hydrophilic regions. Integration of hydrogel photonic crystal bead arrays into the "breath-taking" process results in breath-responsive photonic crystal beads, which can change their colors upon a mild exhalation. This state-of-the-art technology not only provides an effective methodology for the preparation of patterned arrays but also demonstrates intriguing applications in information storage and biochemical sensors.
NASA Technical Reports Server (NTRS)
Havens, Glen G.
2007-01-01
The MRO project is a system of systems, requiring the systems engineering team to architect, design, integrate, test, and operate these systems at each level of the project. The challenge was to engineer the mission objectives into a single mission architecture that could be integrated, tested, launched, and operated. Systems engineering must translate high-level requirements into an integrated mission design. Systems engineering challenges were overcome through a combination of creative designs built into MRO's flight and ground systems: a) design of sophisticated spacecraft targeting and data management capabilities; b) establishment of a strong operations team organization; c) implementation of robust operational processes; and d) development of strategic ground tools. The MRO system has met the challenge of its driving requirements: a) MRO began its two-year primary science phase on November 7, 2006, and by July 2007, after only eight months of operations, met its minimum requirement to collect 15 Tbits of data. Currently we have collected 22 Tbits. b) Based on current performance, the mission could return 70 Tbits of data by the end of the primary science phase in 2008.
An integrated semiconductor device enabling non-optical genome sequencing.
Rothberg, Jonathan M; Hinz, Wolfgang; Rearick, Todd M; Schultz, Jonathan; Mileski, William; Davey, Mel; Leamon, John H; Johnson, Kim; Milgrew, Mark J; Edwards, Matthew; Hoon, Jeremy; Simons, Jan F; Marran, David; Myers, Jason W; Davidson, John F; Branting, Annika; Nobile, John R; Puc, Bernard P; Light, David; Clark, Travis A; Huber, Martin; Branciforte, Jeffrey T; Stoner, Isaac B; Cawley, Simon E; Lyons, Michael; Fu, Yutao; Homer, Nils; Sedova, Marina; Miao, Xin; Reed, Brian; Sabina, Jeffrey; Feierstein, Erika; Schorn, Michelle; Alanjary, Mohammad; Dimalanta, Eileen; Dressman, Devin; Kasinskas, Rachel; Sokolsky, Tanya; Fidanza, Jacqueline A; Namsaraev, Eugeni; McKernan, Kevin J; Williams, Alan; Roth, G Thomas; Bustillo, James
2011-07-20
The seminal importance of DNA sequencing to the life sciences, biotechnology and medicine has driven the search for more scalable and lower-cost solutions. Here we describe a DNA sequencing technology in which scalable, low-cost semiconductor manufacturing techniques are used to make an integrated circuit able to directly perform non-optical DNA sequencing of genomes. Sequence data are obtained by directly sensing the ions produced by template-directed DNA polymerase synthesis using all-natural nucleotides on this massively parallel semiconductor-sensing device or ion chip. The ion chip contains ion-sensitive, field-effect transistor-based sensors in perfect register with 1.2 million wells, which provide confinement and allow parallel, simultaneous detection of independent sequencing reactions. Use of the most widely used technology for constructing integrated circuits, the complementary metal-oxide semiconductor (CMOS) process, allows for low-cost, large-scale production and scaling of the device to higher densities and larger array sizes. We show the performance of the system by sequencing three bacterial genomes, its robustness and scalability by producing ion chips with up to 10 times as many sensors and sequencing a human genome.
Kawakami, Eiryo; Singh, Vivek K; Matsubara, Kazuko; Ishii, Takashi; Matsuoka, Yukiko; Hase, Takeshi; Kulkarni, Priya; Siddiqui, Kenaz; Kodilkar, Janhavi; Danve, Nitisha; Subramanian, Indhupriya; Katoh, Manami; Shimizu-Yoshida, Yuki; Ghosh, Samik; Jere, Abhay; Kitano, Hiroaki
2016-01-01
Cellular stress responses require exquisite coordination between intracellular signaling molecules to integrate multiple stimuli and actuate specific cellular behaviors. Deciphering the web of complex interactions underlying stress responses is a key challenge in understanding robust biological systems and has the potential to lead to the discovery of targeted therapeutics for diseases triggered by dysregulation of stress response pathways. We constructed large-scale molecular interaction maps of six major stress response pathways in Saccharomyces cerevisiae (baker’s or budding yeast). Biological findings from over 900 publications were converted into standardized graphical formats and integrated into a common framework. The maps are posted at http://www.yeast-maps.org/yeast-stress-response/ for browse and curation by the research community. On the basis of these maps, we undertook systematic analyses to unravel the underlying architecture of the networks. A series of network analyses revealed that yeast stress response pathways are organized in bow–tie structures, which have been proposed as universal sub-systems for robust biological regulation. Furthermore, we demonstrated a potential role for complexes in stabilizing the conserved core molecules of bow–tie structures. Specifically, complex-mediated reversible reactions, identified by network motif analyses, appeared to have an important role in buffering the concentration and activity of these core molecules. We propose complex-mediated reactions as a key mechanism mediating robust regulation of the yeast stress response. Thus, our comprehensive molecular interaction maps provide not only an integrated knowledge base, but also a platform for systematic network analyses to elucidate the underlying architecture in complex biological systems. PMID:28725465
NASA Astrophysics Data System (ADS)
Cheng, Yung-Chang; Lee, Cheng-Kang
2017-10-01
This paper proposes a systematic method, integrating the uniform design (UD) of experiments and quantum-behaved particle swarm optimization (QPSO), to solve the problem of a robust design for a railway vehicle suspension system. Based on the new nonlinear creep model derived from combining Hertz contact theory, Kalker's linear theory and a heuristic nonlinear creep model, the modeling and dynamic analysis of a 24 degree-of-freedom railway vehicle system were investigated. The Lyapunov indirect method was used to examine the effects of suspension parameters, wheel conicities and wheel rolling radii on critical hunting speeds. Generally, the critical hunting speeds of a vehicle system with worn wheels of differing rolling radii are lower than those of a vehicle system with original wheels. Because of worn wheels, the critical hunting speed of a running railway vehicle substantially declines over the long term. For safety reasons, it is necessary to design the suspension system parameters to increase the robustness of the system and decrease its sensitivity to wheel noise. By applying UD and QPSO, the nominal-the-best signal-to-noise ratio of the system was increased from -48.17 to -34.05 dB, an improvement rate of 29.31%. This study has demonstrated that the integration of UD and QPSO can successfully identify the optimal suspension parameters for the robust design of a railway vehicle suspension system.
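The nominal-the-best signal-to-noise ratio quoted above is the standard Taguchi measure, and the reported improvement rate follows from the two dB magnitudes. A short sketch (the sample values are illustrative, not the paper's data):

```python
import math

def nominal_the_best_sn(values):
    """Taguchi nominal-the-best S/N ratio: 10 * log10(mean^2 / variance)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return 10 * math.log10(mean ** 2 / var)

def improvement_rate(sn_before, sn_after):
    """Relative improvement between two negative S/N ratios (dB magnitudes)."""
    return (abs(sn_before) - abs(sn_after)) / abs(sn_before)

# Reproduces the abstract's 29.31% improvement figure.
rate = improvement_rate(-48.17, -34.05)
print(f"{rate:.2%}")  # → 29.31%
```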
NASA Astrophysics Data System (ADS)
Vandergoes, Marcus J.; Howarth, Jamie D.; Dunbar, Gavin B.; Turnbull, Jocelyn C.; Roop, Heidi A.; Levy, Richard H.; Li, Xun; Prior, Christine; Norris, Margaret; Keller, Liz D.; Baisden, W. Troy; Ditchburn, Robert; Fitzsimons, Sean J.; Bronk Ramsey, Christopher
2018-05-01
Annually resolved (varved) lake sequences are important palaeoenvironmental archives as they offer a direct incremental dating technique for high-frequency reconstruction of environmental and climate change. Despite the importance of these records, establishing a robust chronology and quantifying its precision and accuracy (estimations of error) remains an essential but challenging component of their development. We outline an approach for building reliable independent chronologies, testing the accuracy of layer counts and integrating all chronological uncertainties to provide quantitative age and error estimates for varved lake sequences. The approach incorporates (1) layer counts and estimates of counting precision; (2) radiometric and biostratigraphic dating techniques to derive an independent chronology; and (3) the application of Bayesian age modelling to produce an integrated age model. This approach is applied to a case study of an annually resolved sediment record from Lake Ohau, New Zealand. The most robust age model provides an average error of 72 years across the whole depth range. This represents a fractional uncertainty of ∼5%, higher than the <3% quoted for most published varve records. However, the age model and reported uncertainty represent the best fit between layer counts and independent chronology, and the uncertainties account for both layer counting precision and the chronological accuracy of the layer counts. This integrated approach provides a more representative estimate of age uncertainty and therefore represents a statistically more robust chronology.
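The core idea of reconciling a layer count with independent dates can be sketched with the simplest possible combination rule, an inverse-variance weighted mean; the full method in the abstract uses Bayesian age modelling, so this is only a minimal stand-in, and the ages below are invented.

```python
import math

def combine_ages(estimates):
    """Inverse-variance weighted combination of independent age estimates.

    `estimates` is a list of (age_yr, one_sigma_error_yr) tuples for the
    same depth horizon. Returns (combined_age, combined_one_sigma).
    """
    weights = [1.0 / (s ** 2) for _, s in estimates]
    total = sum(weights)
    age = sum(w * a for w, (a, _) in zip(weights, estimates)) / total
    sigma = math.sqrt(1.0 / total)  # tighter than either input error
    return age, sigma

# Hypothetical horizon: a varve count and a radiocarbon date agree within
# error; combining them reduces the age uncertainty.
age, sigma = combine_ages([(1500.0, 100.0), (1450.0, 80.0)])
```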
NASA Astrophysics Data System (ADS)
Zarindast, Atousa; Seyed Hosseini, Seyed Mohamad; Pishvaee, Mir Saman
2017-06-01
A robust supplier selection problem is proposed in a scenario-based approach, where demand and exchange rates are subject to uncertainty. First, a deterministic multi-objective mixed integer linear programming model is developed; then, the robust counterpart of the proposed model is presented using recent extensions in robust optimization theory. We discuss the decision variables, respectively, via a two-stage stochastic planning model, a robust stochastic optimization planning model that integrates the worst-case scenario into the modeling approach, and finally an equivalent deterministic planning model. An experimental study is carried out to compare the performance of the three models. The robust model yielded remarkable cost savings, illustrating that to cope with such uncertainties we should account for them in advance in our planning. In our case study, different suppliers were selected because of these uncertainties; since supplier selection is a strategic decision, it is crucial to consider these uncertainties in the planning approach.
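The contrast between the expected-cost (two-stage stochastic) view and the worst-case (robust) view can be shown on a toy instance. This is not the paper's MILP formulation; supplier names, scenario costs, and probabilities are all invented for illustration.

```python
# Hypothetical per-unit costs of two suppliers under three scenarios
# (the scenarios capture uncertain demand and exchange rates).
costs = {        # scenarios: s1    s2    s3
    "A": (8.0, 10.0, 30.0),
    "B": (14.0, 14.0, 16.0),
}

def expected_choice(costs, probs):
    """Two-stage stochastic view: minimise probability-weighted cost."""
    return min(costs, key=lambda s: sum(p * c for p, c in zip(probs, costs[s])))

def robust_choice(costs):
    """Robust (worst-case) view: minimise the maximum cost over scenarios."""
    return min(costs, key=lambda s: max(costs[s]))

print(expected_choice(costs, (0.5, 0.3, 0.2)))  # → A
print(robust_choice(costs))                     # → B
```

The two criteria disagree here: supplier A is cheaper on average, but supplier B is chosen once the worst-case scenario is integrated, mirroring how the robust model in the study selected different suppliers.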
Nonlinear dynamics of emotion-cognition interaction: when emotion does not destroy cognition?
Afraimovich, Valentin; Young, Todd; Muezzinoglu, Mehmet K; Rabinovich, Mikhail I
2011-02-01
Emotion (i.e., spontaneous motivation and subsequent implementation of a behavior) and cognition (i.e., problem solving by information processing) are essential to how we, as humans, respond to changes in our environment. Recent studies in cognitive science suggest that emotion and cognition are subserved by different, although heavily integrated, neural systems. Understanding the time-varying relationship of emotion and cognition is a challenging goal with important implications for neuroscience. We formulate here a dynamical model of emotion-cognition interaction that is based on the following principles: (1) the temporal evolution of the cognitive and emotion modes is driven by incoming stimuli and by competition within and among the modes (competition principle); (2) metastable states exist in the unified emotion-cognition phase space; and (3) the brain processes information with robust and reproducible transients through the sequence of metastable states. Such a model can take advantage of the often ignored temporal structure of the emotion-cognition interaction to provide a robust and generalizable method for understanding the relationship between brain activation and complex human behavior. The mathematical images of the robust and reproducible transient dynamics are the Stable Heteroclinic Sequence (SHS) and the Stable Heteroclinic Channel (SHC). These have been hypothesized to be possible mechanisms underlying the sequential transient behavior observed in networks. We investigate the modularity of SHCs: given an SHS and an SHC supported in one part of a network (cognition), we study the conditions under which that SHC continues to function in the presence of interfering activity from other parts of the network (emotion).
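The sequential transients behind the SHS picture are commonly illustrated with generalised Lotka-Volterra competition between modes. The sketch below uses explicit Euler integration and illustrative parameters (the asymmetric inhibition matrix is a textbook choice, not fitted to the model in the abstract): the dominant mode switches sequentially rather than settling into a mixed fixed point.

```python
def glv_step(x, sigma, rho, dt=0.01):
    """One explicit-Euler step of the generalised Lotka-Volterra system
    dx_i/dt = x_i * (sigma_i - sum_j rho[i][j] * x_j), clipped at zero."""
    return [
        max(0.0, xi + dt * xi * (sigma[i] - sum(rho[i][j] * xj
                                                for j, xj in enumerate(x))))
        for i, xi in enumerate(x)
    ]

# Illustrative parameters: asymmetric inhibition among three competing
# "modes" produces sequential winner-switching transients.
sigma = [1.0, 1.0, 1.0]
rho = [[1.0, 0.5, 2.0],
       [2.0, 1.0, 0.5],
       [0.5, 2.0, 1.0]]

trajectory = [[0.9, 0.05, 0.05]]  # mode 0 initially dominant
for _ in range(3000):
    trajectory.append(glv_step(trajectory[-1], sigma, rho))
```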
Haebig, Eileen; Leonard, Laurence; Usler, Evan; Deevy, Patricia; Weber, Christine
2018-03-15
Previous behavioral studies have found deficits in lexical-semantic abilities in children with specific language impairment (SLI), including reduced depth and breadth of word knowledge. This study explored the neural correlates of early emerging familiar word processing in preschoolers with SLI and typical development. Fifteen preschoolers with typical development and 15 preschoolers with SLI were presented with pictures followed after a brief delay by an auditory label that did or did not match. Event-related brain potentials were time locked to the onset of the auditory labels. Children provided verbal judgments of whether the label matched the picture. There were no group differences in the accuracy of identifying when pictures and labels matched or mismatched. Event-related brain potential data revealed that mismatch trials elicited a robust N400 in both groups, with no group differences in mean amplitude or peak latency. However, the typically developing group demonstrated a more robust late positive component, elicited by mismatch trials. These initial findings indicate that lexical-semantic access of early acquired words, indexed by the N400, does not differ between preschoolers with SLI and typical development when highly familiar words are presented in isolation. However, the typically developing group demonstrated a more mature profile of postlexical reanalysis and integration, indexed by an emerging late positive component. The findings lay the necessary groundwork for better understanding processing of newly learned words in children with SLI.
Barik, Anwesha; Banerjee, Satarupa; Dhara, Santanu; Chakravorty, Nishant
2017-04-01
Complexities in the full genome expression studies hinder the extraction of tracker genes to analyze the course of biological events. In this study, we demonstrate the applications of supervised machine learning methods to reduce the irrelevance in microarray data series and thereby extract robust molecular markers to track biological processes. The methodology has been illustrated by analyzing whole genome expression studies on bone-implant integration (osseointegration). Being a biological process, osseointegration is known to leave a trail of genetic footprint during the course. In spite of the existence of an enormous amount of raw data in public repositories, researchers still do not have access to a panel of genes that can definitively track osseointegration. The results from our study revealed that panels comprising matrix metalloproteinase and collagen genes were able to track osseointegration on implant surfaces (MMP9 and COL1A2 on micro-textured; MMP12 and COL6A3 on superimposed nano-textured surfaces) with 100% classification accuracy, specificity and sensitivity. Further, our analysis showed the importance of healing duration in the establishment of the mechanical connection at the bone-implant surface. The findings from this study are expected to be useful to researchers investigating osseointegration of novel implant materials especially at the early stage. The methodology demonstrated can be easily adapted by scientists in different fields to analyze large databases for other biological processes. Copyright © 2017 Elsevier Inc. All rights reserved.
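A minimal version of marker extraction is ranking genes by a class-separation score and keeping the top of the list. The sketch below uses the classic Golub signal-to-noise score as a stand-in for the supervised methods in the study; the expression values are fabricated toy numbers, not the study's microarray data.

```python
import math

def sn_score(class_a, class_b):
    """Golub-style signal-to-noise marker score: (mu_a - mu_b) / (sd_a + sd_b).
    A large |score| means the gene separates the two classes well."""
    def stats(xs):
        m = sum(xs) / len(xs)
        sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
        return m, sd
    ma, sa = stats(class_a)
    mb, sb = stats(class_b)
    return (ma - mb) / (sa + sb)

# Toy expression values for (class A samples, class B samples);
# gene labels are placeholders for illustration only.
expression = {
    "MMP9":  ([5.1, 5.3, 5.2], [1.0, 1.2, 1.1]),  # discriminative
    "GAPDH": ([7.0, 7.1, 6.9], [7.0, 7.2, 6.8]),  # housekeeping-like
}
ranked = sorted(expression, key=lambda g: abs(sn_score(*expression[g])),
                reverse=True)
```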
On-Line Robust Modal Stability Prediction using Wavelet Processing
NASA Technical Reports Server (NTRS)
Brenner, Martin J.; Lind, Rick
1998-01-01
Wavelet analysis for filtering and system identification has been used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins is reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data is used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability are also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. The F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrates improved robust stability prediction by extension of the stability boundary beyond the flight regime. Guidelines and computation times are presented to show the efficiency and practical aspects of these procedures for on-line implementation. Feasibility of the method is shown for processing flight data from time-varying nonstationary test points.
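The nonparametric filtering step above amounts to transforming the signal, thresholding small wavelet coefficients attributed to disturbances, and reconstructing. A self-contained single-level Haar sketch (the simplest wavelet; the paper's processing is more sophisticated, so treat this only as a shape of the idea):

```python
import math

def haar_decompose(x):
    """Single-level Haar transform of an even-length signal:
    returns (approximation, detail) coefficient lists."""
    a = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def haar_reconstruct(a, d):
    """Inverse of haar_decompose (perfect reconstruction)."""
    out = []
    for ai, di in zip(a, d):
        out += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return out

def denoise(x, thresh):
    """Hard-threshold the detail coefficients, then reconstruct."""
    a, d = haar_decompose(x)
    d = [0.0 if abs(di) < thresh else di for di in d]
    return haar_reconstruct(a, d)
```

With a threshold of zero the round trip is exact; a large threshold keeps only the pairwise local means, which is the smoothing effect used to suppress unmodeled high-frequency content.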
Chen, Bor-Sen; Lin, Ying-Po
2011-01-01
In the evolutionary process, the random transmission and mutation of genes provide biological diversities for natural selection. In order to preserve functional phenotypes between generations, gene networks need to evolve robustly under the influence of random perturbations. Therefore, the robustness of the phenotype, in the evolutionary process, exerts a selection force on gene networks to maintain network functions. However, gene networks need to adjust, by variations in genetic content, to generate phenotypes for new challenges in the network’s evolution, i.e., the evolvability. Hence, there should be some interplay between the evolvability and network robustness in evolutionary gene networks. In this study, the interplay between the evolvability and network robustness of a gene network and a biochemical network is discussed from a nonlinear stochastic system point of view. It was found that if the genetic robustness plus environmental robustness is less than the network robustness, the phenotype of the biological network is robust in evolution. The tradeoff between the genetic robustness and environmental robustness in evolution is discussed from the stochastic stability robustness and sensitivity of the nonlinear stochastic biological network, which may be relevant to the statistical tradeoff between bias and variance, the so-called bias/variance dilemma. Further, the tradeoff could be considered as an antagonistic pleiotropic action of a gene network and discussed from the systems biology perspective. PMID:22084563
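The robustness criterion stated in the abstract can be written compactly; the symbols below are our shorthand for exposition, not the authors' notation:

```latex
\underbrace{r_g}_{\text{genetic}} \;+\; \underbrace{r_e}_{\text{environmental}}
\;<\; \underbrace{r_n}_{\text{network}}
\quad\Longrightarrow\quad
\text{the phenotype is robust in evolution.}
```

Read this way, the genetic/environmental tradeoff is a budget: for a fixed network robustness $r_n$, tolerating more genetic variation ($r_g$) leaves less slack for environmental disturbance ($r_e$), which is the bias/variance-like dilemma the abstract describes.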
Crombach, Anton; Cicin-Sain, Damjan; Wotton, Karl R; Jaeger, Johannes
2012-01-01
Understanding the function and evolution of developmental regulatory networks requires the characterisation and quantification of spatio-temporal gene expression patterns across a range of systems and species. However, most high-throughput methods to measure the dynamics of gene expression do not preserve the detailed spatial information needed in this context. For this reason, quantification methods based on image bioinformatics have become increasingly important over the past few years. Most available approaches in this field either focus on the detailed and accurate quantification of a small set of gene expression patterns, or attempt high-throughput analysis of spatial expression through binary pattern extraction and large-scale analysis of the resulting datasets. Here we present a robust, "medium-throughput" pipeline to process in situ hybridisation patterns from embryos of different species of flies. It bridges the gap between high-resolution and high-throughput image processing methods, enabling us to quantify graded expression patterns along the antero-posterior axis of the embryo in an efficient and straightforward manner. Our method is based on a robust enzymatic (colorimetric) in situ hybridisation protocol and rapid data acquisition through wide-field microscopy. Data processing consists of image segmentation, profile extraction, and determination of expression domain boundary positions using a spline approximation. It results in sets of measured boundaries sorted by gene and developmental time point, which are analysed in terms of expression variability or spatio-temporal dynamics. Our method yields integrated time series of spatial gene expression, which can be used to reverse-engineer developmental gene regulatory networks across species. It is easily adaptable to other processes and species, enabling the in silico reconstitution of gene regulatory networks in a wide range of developmental contexts.
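The boundary-determination step can be reduced to its essence: find where an extracted expression profile crosses a fraction of its maximum along the antero-posterior axis. The pipeline in the abstract fits a spline first; the sketch below substitutes plain linear interpolation between samples, so it is an assumption-laden simplification, not the published method.

```python
def boundary_position(positions, profile, level=0.5):
    """Return the axis position where `profile` first crosses
    `level` * max(profile), using linear interpolation between samples.
    Returns None if no crossing is found."""
    t = level * max(profile)
    for i in range(1, len(profile)):
        lo, hi = profile[i - 1], profile[i]
        if (lo - t) * (hi - t) <= 0 and lo != hi:  # sign change => crossing
            frac = (t - lo) / (hi - lo)
            return positions[i - 1] + frac * (positions[i] - positions[i - 1])
    return None

# Toy graded profile along a normalised A-P axis (invented values).
b = boundary_position([0.0, 1.0, 2.0, 3.0], [0.0, 0.2, 0.8, 1.0])
```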
Defining robustness protocols: a method to include and evaluate robustness in clinical plans
NASA Astrophysics Data System (ADS)
McGowan, S. E.; Albertini, F.; Thomas, S. J.; Lomax, A. J.
2015-04-01
We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The plan robustness of 16 skull-base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all the plans and by defining metrics to support plan assessment. Additionally, an example of how to use the defined robustness database clinically is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve plan robustness was analysed. Using the ebDD, it was found that range errors had a smaller effect on the dose distribution than the corresponding set-up error in a single fraction, and that organs at risk were most robust to range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in setting plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed the identification of a specific patient who may have benefited from a more individualised treatment. A new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. This process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to identify plans that, although delivering a dosimetrically adequate dose distribution, have sub-optimal robustness to these uncertainties. For these cases, the use of different beam start conditions may improve plan robustness to set-up and range uncertainties.
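The error-bar dose distribution can be sketched as a per-voxel spread of dose across error scenarios, with a robustness metric counting how many voxels stay within a tolerance. This is a schematic reading of the ebDD concept, not the authors' exact definition; all dose values below are invented.

```python
def error_bar_dose(scenario_doses):
    """Per-voxel error bar: spread (max - min) of dose over error scenarios.
    `scenario_doses` is a list of per-scenario dose lists, one value per voxel."""
    return [max(v) - min(v) for v in zip(*scenario_doses)]

def robust_fraction(scenario_doses, tolerance):
    """Fraction of voxels whose dose spread stays within `tolerance`."""
    ebdd = error_bar_dose(scenario_doses)
    return sum(1 for e in ebdd if e <= tolerance) / len(ebdd)

# Toy 3-voxel structure evaluated under a nominal plan plus a set-up-shift
# and a range-error scenario (hypothetical doses in Gy).
scenarios = [
    [60.0, 60.0, 60.0],  # nominal
    [59.0, 58.0, 60.0],  # set-up error scenario
    [60.0, 59.0, 61.0],  # range error scenario
]
spread = error_bar_dose(scenarios)
```

A histogram of these per-voxel spreads (the error-bar volume histogram mentioned in the abstract) then summarises robustness for a whole structure.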
Tumusiime, David Katuruba; Agaba, Gad; Kyomuhangi, Teddy; Finch, Jan; Kabakyenga, Jerome; MacLeod, Stuart
2014-01-01
A substantial literature suggests that mobile phones have great potential to improve management and survival of acutely ill children in rural Africa. The national strategy of the Ugandan Ministry of Health calls for employment of volunteer community health workers (CHWs) in implementation of Integrated Community Case Management (iCCM) of common illnesses (diarrhea, acute respiratory infection, pneumonia, fever/malaria) affecting children under five years of age. A mobile phone enabled system was developed within iCCM aiming to improve access by CHWs to medical advice and to strengthen reporting of data on danger signs and symptoms for acutely ill children under five years of age. Herein critical steps in development, implementation, and integration of mobile phone technology within iCCM are described. Mechanisms to improve diagnosis, treatment and referral of sick children under five were defined. Treatment algorithms were developed by the project technical team and mounted and piloted on the mobile phones, using an iterative process involving technical support personnel, health care providers, and academic support. Using a purposefully developed mobile phone training manual, CHWs were trained over an intensive five-day course to make timely diagnoses, recognize clinical danger signs, communicate about referrals and initiate treatment with appropriate essential drugs. Performance by CHWs and the accuracy and completeness of their submitted data were closely monitored during the post-training test period and the subsequent nine-month community trial. In the full trial, the number of referrals and correctly treated children, based on the agreed treatment algorithms, was recorded. Births, deaths, and medication stocks were also tracked. Seven distinct phases were required to develop a robust mobile phone enabled system in support of the iCCM program.
Over a nine-month period, 96 CHWs were trained to use mobile phones and their competence to initiate a community trial was established through performance monitoring. Local information/communication consultants, working in concert with a university based department of pediatrics, can design and implement a robust mobile phone based system that may be anticipated to contribute to efficient delivery of iCCM by trained volunteer CHWs in rural settings in Uganda.
Plasma jet printing for flexible substrates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gandhiraman, Ram P.; Singh, Eric; Diaz-Cartagena, Diana C.
2016-03-21
Recent interest in flexible electronics and wearable devices has created a demand for fast and highly repeatable printing processes suitable for device manufacturing. Robust printing technology is critical for the integration of sensors and other devices on flexible substrates such as paper and textile. An atmospheric pressure plasma-based printing process has been developed to deposit different types of nanomaterials on flexible substrates. Multiwalled carbon nanotubes were deposited on paper to demonstrate site-selective deposition as well as direct printing without any type of patterning. Plasma-printed nanotubes were compared with non-plasma-printed samples under similar gas flow and other experimental conditions and found to be denser with higher conductivity. The utility of the nanotubes on the paper substrate as a biosensor and chemical sensor was demonstrated by the detection of dopamine, a neurotransmitter, and ammonia, respectively.
Active Multimodal Sensor System for Target Recognition and Tracking
Zhang, Guirong; Zou, Zhaofan; Liu, Ziyue; Mao, Jiansen
2017-01-01
High accuracy target recognition and tracking systems using a single sensor or a passive multisensor set are susceptible to external interferences and exhibit environmental dependencies. These difficulties stem mainly from limitations to the available imaging frequency bands, and a general lack of coherent diversity of the available target-related data. This paper proposes an active multimodal sensor system for target recognition and tracking, consisting of a visible, an infrared, and a hyperspectral sensor. The system makes full use of its multisensor information collection abilities; furthermore, it can actively control different sensors to collect additional data, according to the needs of the real-time target recognition and tracking processes. This level of integration between hardware collection control and data processing is experimentally shown to effectively improve the accuracy and robustness of the target recognition and tracking system. PMID:28657609