Sample records for reduce system complexity

  1. Lasercom system architecture with reduced complexity

    NASA Technical Reports Server (NTRS)

    Lesh, James R. (Inventor); Chen, Chien-Chung (Inventor); Ansari, Homayoon (Inventor)

    1994-01-01

    Spatial acquisition and precision beam pointing functions are critical to spaceborne laser communication systems. In the present invention, a single high bandwidth CCD detector is used to perform both spatial acquisition and tracking functions. Compared to previous lasercom hardware design, the array tracking concept offers reduced system complexity by reducing the number of optical elements in the design. Specifically, the design requires only one detector and one beam steering mechanism. It also provides the means to optically close the point-ahead control loop. The technology required for high bandwidth array tracking was examined and shown to be consistent with current state of the art. The single detector design can lead to a significantly reduced system complexity and a lower system cost.

  2. LaserCom System Architecture With Reduced Complexity

    NASA Technical Reports Server (NTRS)

    Lesh, James R. (Inventor); Chen, Chien-Chung (Inventor); Ansari, Homa-Yoon (Inventor)

    1996-01-01

    Spatial acquisition and precision beam pointing functions are critical to spaceborne laser communication systems. In the present invention a single high bandwidth CCD detector is used to perform both spatial acquisition and tracking functions. Compared to previous lasercom hardware design, the array tracking concept offers reduced system complexity by reducing the number of optical elements in the design. Specifically, the design requires only one detector and one beam steering mechanism. It also provides means to optically close the point-ahead control loop. The technology required for high bandwidth array tracking was examined and shown to be consistent with current state of the art. The single detector design can lead to a significantly reduced system complexity and a lower system cost.

  3. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems

    PubMed Central

    Takecian, Pedro L.; Oikawa, Marcio K.; Braghetto, Kelly R.; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S.; Acker, Susan; Carneiro-Proietti, Anna B. F.; Sabino, Ester C.; Custer, Brian; Busch, Michael P.; Ferreira, João E.

    2013-01-01

    Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development. PMID:23729945

  4. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems.

    PubMed

    Takecian, Pedro L; Oikawa, Marcio K; Braghetto, Kelly R; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S; Acker, Susan; Carneiro-Proietti, Anna B F; Sabino, Ester C; Custer, Brian; Busch, Michael P; Ferreira, João E

    2013-06-01

    Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development.

  5. Adaptive simplification of complex multiscale systems.

    PubMed

    Chiavazzo, Eliodoro; Karlin, Ilya

    2011-03-01

    A fully adaptive methodology is developed for reducing the complexity of large dissipative systems. This represents a significant step toward extracting essential physical knowledge from complex systems by addressing the challenging problem of the minimal number of variables needed to exactly capture the system dynamics. An accurate reduced description is achieved by construction of a hierarchy of slow invariant manifolds, with an embarrassingly simple implementation in any dimension. The method is validated with the autoignition of a hydrogen-air mixture, where a reduction to a cascade of slow invariant manifolds is observed.

  6. Low complexity Reed-Solomon-based low-density parity-check design for software defined optical transmission system based on adaptive puncturing decoding algorithm

    NASA Astrophysics Data System (ADS)

    Pan, Xiaolong; Liu, Bo; Zheng, Jianglong; Tian, Qinghua

    2016-08-01

    We propose and demonstrate a low-complexity Reed-Solomon-based low-density parity-check (RS-LDPC) code with an adaptive puncturing decoding algorithm for an elastic optical transmission system. Partially received codes and the corresponding columns of the parity-check matrix can be punctured to reduce calculation complexity via an adaptive parity-check matrix during the decoding process. The results show that the complexity of the proposed decoding algorithm is reduced by 30% compared with the regular RS-LDPC system. The optimized code rate of the RS-LDPC code can be obtained after five iterations.
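
    The puncturing idea can be illustrated with a toy sketch (a hypothetical binary matrix, not the paper's actual RS-LDPC construction or decoder): columns corresponding to reliably received symbols are removed from the parity-check matrix, and rows left with no constraints are dropped, shrinking the system the decoder must iterate over.

```python
import numpy as np

def puncture_parity_check(H, reliable_cols):
    """Drop (puncture) columns of a parity-check matrix H for symbols
    already received reliably; rows left with no 1-entries impose no
    constraint and are dropped too, shrinking the decoder's workload."""
    H = np.asarray(H)
    keep = [j for j in range(H.shape[1]) if j not in set(reliable_cols)]
    Hp = H[:, keep]
    return Hp[[i for i in range(Hp.shape[0]) if Hp[i].any()], :]

# Toy 3x6 binary parity-check matrix (hypothetical, not the paper's code).
H = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 0, 0, 1, 1]]

# Suppose symbols 0 and 3 were received with high confidence.
Hp = puncture_parity_check(H, reliable_cols=[0, 3])
print(Hp.shape)  # → (3, 4)
```

    The decoder then runs its message-passing iterations on the smaller matrix, which is where the complexity saving comes from.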

  7. Design considerations to improve cognitive ergonomic issues of unmanned vehicle interfaces utilizing video game controllers.

    PubMed

    Oppold, P; Rupp, M; Mouloua, M; Hancock, P A; Martin, J

    2012-01-01

    Unmanned systems (UAVs, UCAVs, and UGVs) still present major human factors and ergonomic challenges related to the effective design of their control interfaces, which are crucial to their efficient operation, maintenance, and safety. Unmanned system interfaces designed with a human-centered approach promote intuitive interfaces that are easier to learn and reduce human errors and other cognitive ergonomic issues. Automation has shifted workload from physical to cognitive, so control interfaces for unmanned systems need to reduce the mental workload on operators and facilitate interaction between vehicle and operator. Two-handed video game controllers provide wide usability within the overall population, prior exposure for new operators, and a variety of interface complexity levels to match the complexity of the task and reduce cognitive load. This paper categorizes and provides a taxonomy of 121 haptic interfaces from the entertainment industry that can be utilized as control interfaces for unmanned systems. Five categories of controllers were defined based on the complexity of the buttons, control pads, joysticks, and switches on the controller. This allows selection of the level of complexity needed for a specific task without creating an entirely new design or utilizing an overly complex one.

  8. Complexity, Systems, and Software

    DTIC Science & Technology

    2014-08-14

    Complexity, Systems, and Software. Presentation, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, 2014.

  9. Decreasing the temporal complexity for nonlinear, implicit reduced-order models by forecasting

    DOE PAGES

    Carlberg, Kevin; Ray, Jaideep; van Bloemen Waanders, Bart

    2015-02-14

    Implicit numerical integration of nonlinear ODEs requires solving a system of nonlinear algebraic equations at each time step. Each of these systems is often solved by a Newton-like method, which incurs a sequence of linear-system solves. Most model-reduction techniques for nonlinear ODEs exploit knowledge of the system's spatial behavior to reduce the computational complexity of each linear-system solve. However, the number of linear-system solves for the reduced-order simulation often remains roughly the same as that for the full-order simulation. We propose exploiting knowledge of the model's temporal behavior to (1) forecast the unknown variable of the reduced-order system of nonlinear equations at future time steps, and (2) use this forecast as an initial guess for the Newton-like solver during the reduced-order-model simulation. To compute the forecast, we propose using the Gappy POD technique. The goal is to generate an accurate initial guess so that the Newton solver requires far fewer iterations to converge, thereby decreasing the number of linear-system solves in the reduced-order-model simulation.
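
    The forecasting idea can be sketched on a scalar model problem (a stand-in for the paper's reduced-order systems; the Gappy POD forecast is replaced here by simple linear extrapolation of the solution history). Each implicit Euler step solves a nonlinear equation by Newton's method, and a better initial guess means fewer iterations.

```python
def newton(f, df, x0, tol=1e-10, max_it=50):
    """Newton's method for a scalar equation; returns (root, iterations)."""
    x = x0
    for k in range(1, max_it + 1):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x, k
    return x, max_it

# Implicit Euler for y' = -y**3: each time step solves
#   g(x) = x - y_prev + dt*x**3 = 0  for the new state x.
dt, y, history = 0.1, 1.0, []
iters_plain = iters_forecast = 0
for n in range(20):
    g = lambda x, yp=y: x - yp + dt * x**3
    dg = lambda x: 1.0 + 3.0 * dt * x**2
    _, k_plain = newton(g, dg, y)               # guess = previous state
    guess = 2 * history[-1] - history[-2] if len(history) >= 2 else y
    y, k_fc = newton(g, dg, guess)              # guess = extrapolated state
    iters_plain += k_plain
    iters_forecast += k_fc
    history.append(y)

print(iters_plain, iters_forecast)
```

    For this smooth decaying trajectory the extrapolated guess is closer to the root than the previous state, so the total Newton iteration count does not increase and typically drops, which is the effect the paper exploits at much larger scale.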

  10. Complexity, flow, and antifragile healthcare systems: implications for nurse executives.

    PubMed

    Clancy, Thomas R

    2015-04-01

    As systems evolve over time, their natural tendency is to become increasingly more complex. Studies in the field of complex systems have generated new perspectives on the application of management strategies in health systems. Much of this research appears as a natural extension of the cross-disciplinary field of systems theory. In this article, I further discuss the concept of fragility, its impact on system behavior, and ways to reduce it.

  11. Social networks as embedded complex adaptive systems.

    PubMed

    Benham-Hutchins, Marge; Clancy, Thomas R

    2010-09-01

    As systems evolve over time, their natural tendency is to become increasingly more complex. Studies in the field of complex systems have generated new perspectives on management in social organizations such as hospitals. Much of this research appears as a natural extension of the cross-disciplinary field of systems theory. This is the 15th in a series of articles applying complex systems science to the traditional management concepts of planning, organizing, directing, coordinating, and controlling. In this article, the authors discuss healthcare social networks as a hierarchy of embedded complex adaptive systems. The authors further examine the use of social network analysis tools as a means to understand complex communication patterns and reduce medical errors.
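
    A minimal sketch of one of the simplest social-network-analysis measures, degree centrality, on a hypothetical communication network (the roles and edges below are invented for illustration, not data from the article):

```python
from collections import defaultdict

def degree_centrality(edges):
    """Normalized degree centrality for an undirected network:
    a node's edge count divided by the maximum possible (n - 1)."""
    deg = defaultdict(int)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(deg)
    return {node: d / (n - 1) for node, d in deg.items()}

# Hypothetical handoff network on a hospital unit.
edges = [("RN1", "MD"), ("RN2", "MD"), ("RN3", "MD"),
         ("RN1", "Pharm"), ("MD", "Pharm")]
c = degree_centrality(edges)
print(max(c, key=c.get))  # → MD
```

    A highly central node like this is a candidate communication bottleneck, the kind of pattern the authors suggest such analysis can surface.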

  12. System Complexity Reduction via Feature Selection

    ERIC Educational Resources Information Center

    Deng, Houtao

    2011-01-01

    This dissertation transforms a set of system complexity reduction problems to feature selection problems. Three systems are considered: classification based on association rules, network structure learning, and time series classification. Furthermore, two variable importance measures are proposed to reduce the feature selection bias in tree…
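
    A crude, filter-style stand-in for feature selection (not the dissertation's proposed importance measures) can be sketched by ranking features by their absolute correlation with the target and keeping the top k:

```python
import numpy as np

def select_features(X, y, k):
    """Keep the k features most correlated (in absolute value) with the
    target -- a simple filter-style feature-selection baseline."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    order = np.argsort(scores)[::-1]
    return sorted(int(j) for j in order[:k])

# Synthetic data: only features 1 and 4 actually drive the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3 * X[:, 1] - 2 * X[:, 4] + rng.normal(scale=0.1, size=200)
print(select_features(X, y, 2))  # → [1, 4]
```

    Discarding the uninformative features shrinks the downstream model, which is the sense in which feature selection reduces system complexity.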

  13. Direct-to-digital holography reduction of reference hologram noise and fourier space smearing

    DOEpatents

    Voelkl, Edgar

    2006-06-27

    Systems and methods are described for reduction of reference hologram noise and reduction of Fourier space smearing, especially in the context of direct-to-digital holography (off-axis interferometry). A method of reducing reference hologram noise includes: recording a plurality of reference holograms; processing the plurality of reference holograms into a corresponding plurality of reference image waves; and transforming the corresponding plurality of reference image waves into a reduced noise reference image wave. A method of reducing smearing in Fourier space includes: recording a plurality of reference holograms; processing the plurality of reference holograms into a corresponding plurality of reference complex image waves; transforming the corresponding plurality of reference image waves into a reduced noise reference complex image wave; recording a hologram of an object; processing the hologram of the object into an object complex image wave; and dividing the complex image wave of the object by the reduced noise reference complex image wave to obtain a reduced smearing object complex image wave.

  14. Reduction of Subjective and Objective System Complexity

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.

    2015-01-01

    Occam's razor is often used in science to define the minimum criteria needed to establish a physical or philosophical idea or relationship. Albert Einstein is attributed the saying "everything should be made as simple as possible, but not simpler". These heuristic ideas are based on a belief that there is a minimum state or set of states for a given system or phenomenon. In looking at system complexity, these heuristics point us to the idea that complexity can be reduced to a minimum. How, then, do we approach a reduction in complexity? Complexity has been described as both a subjective concept and an objective measure of a system. Subjective complexity is based on human cognitive comprehension of the functions and interrelationships of a system, and is defined by the ability to fully comprehend the system. Simplifying complexity, in a subjective sense, is thus gaining a deeper understanding of the system. As Apple's Jonathan Ive has stated, "It's not just minimalism or the absence of clutter. It involves digging through the depth of complexity. To be truly simple, you have to go really deep". Simplicity is not the absence of complexity but a deeper understanding of complexity. Subjective complexity, based on this human comprehension, cannot then be distinguished from the sociological concept of ignorance. The inability to comprehend a system can be either a lack of knowledge, an inability to understand the intricacies of the system, or both. Reduction in this sense is based purely on a cognitive ability to understand the system, and no system then may be truly complex. From this view, education and experience seem to be the keys to reducing or eliminating complexity. Objective complexity is the measure of the system's functions and interrelationships, which exist independent of human comprehension. Jonathan Ive's statement does not say that complexity is removed, only that the complexity is understood. From this standpoint, reduction of complexity can be approached by finding the optimal or 'best balance' of the system's functions and interrelationships. This is achievable following von Bertalanffy's approach of describing systems as a set of equations representing both the system functions and the system interrelationships. Reduction is found based on an objective function defining the system output given variations in the system inputs and the system operating environment. By minimizing the objective function with respect to these inputs and environments, a reduced system can be found. Thus, a reduction of the system complexity is feasible.

  15. Temporal Decomposition of a Distribution System Quasi-Static Time-Series Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, Barry A; Hunsberger, Randolph J

    This paper documents the first phase of an investigation into reducing runtimes of complex OpenDSS models through parallelization. As the method seems promising, future work will quantify, and further mitigate, errors arising from this process. In this initial report, we demonstrate how, through the use of temporal decomposition, the run times of a complex distribution-system-level quasi-static time-series simulation can be reduced roughly in proportion to the level of parallelization. Using this method, the monolithic model runtime of 51 hours was reduced to a minimum of about 90 minutes. As expected, this comes at the expense of control and voltage errors at the time-slice boundaries. All evaluations were performed using a real distribution circuit model with the addition of 50 PV systems, representing a mock complex PV impact study. We are able to reduce induced transition errors through the addition of controls initialization, though small errors persist. The time savings with parallelization are so significant that we feel additional investigation to reduce control errors is warranted.
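
    The decomposition can be sketched on a toy relaxation model (a stand-in for one quasi-static power-flow solve per time step; the window sizes are illustrative, not the study's). Each window depends only on its boundary state, so once boundary states are estimated the windows are independent and could be dispatched to parallel workers.

```python
def simulate(v0, steps, dt=0.01):
    """Toy per-step state update (exponential relaxation toward 1.0),
    standing in for one quasi-static power-flow solve per time step."""
    v, out = v0, []
    for _ in range(steps):
        v += dt * (1.0 - v)
        out.append(v)
    return out

# Monolithic run: 400 time steps from a cold start.
mono = simulate(0.0, 400)

# Temporal decomposition into four 100-step windows, each seeded with
# the state at its boundary.
windows = []
for w in range(4):
    v0 = 0.0 if w == 0 else mono[100 * w - 1]
    windows.extend(simulate(v0, 100))

# With exact boundary states the stitched result matches the monolithic
# run; approximate initialization is what introduces boundary errors.
err = max(abs(a - b) for a, b in zip(mono, windows))
print(err)  # → 0.0
```

    In practice the boundary states are not known exactly (they come from initialization heuristics), which is the source of the control and voltage errors the paper reports at time-slice boundaries.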

  16. Complexity, Diversity and Ambiguity in Teaching and Teacher Education: Practical Wisdom, Pedagogical Fitness and Tact of Teaching

    ERIC Educational Resources Information Center

    Riedler, Martina; Eryaman, Mustafa Yunus

    2016-01-01

    There is consensus in the literature that teacher education programs exhibit the characteristics of complex systems. These characteristics challenge the conventional, teacher-directed/textbook-based positivist approaches in the teacher education literature, which have tried to reduce the complexities…

  17. Energy conservation and analysis and evaluation. [specifically at Slidell Computer Complex

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The survey assembled and made recommendations directed at conserving utilities and reducing the use of energy at the Slidell Computer Complex. Specific items included were: (1) scheduling and controlling the use of gas and electricity, (2) building modifications to reduce energy, (3) replacement of old, inefficient equipment, (4) modifications to control systems, (5) evaluations of economizer cycles in HVAC systems, and (6) corrective settings for thermostats, ductstats, and other temperature and pressure control devices.

  18. Experimentally modeling stochastic processes with less memory by the use of a quantum processor

    PubMed Central

    Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.

    2017-01-01

    Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218

  19. A haptics-assisted cranio-maxillofacial surgery planning system for restoring skeletal anatomy in complex trauma cases.

    PubMed

    Olsson, Pontus; Nysjö, Fredrik; Hirsch, Jan-Michaél; Carlbom, Ingrid B

    2013-11-01

    Cranio-maxillofacial (CMF) surgery to restore normal skeletal anatomy in patients with serious facial trauma can be both complex and time-consuming, but it is generally accepted that careful preoperative planning leads to a better outcome, with a higher degree of function and reduced morbidity in addition to reduced time in the operating room. However, today's surgery planning systems are primitive, relying mostly on the user's ability to plan complex tasks with a two-dimensional graphical interface. We describe a system for planning the restoration of skeletal anatomy in facial trauma patients using a virtual model derived from patient-specific CT data. The system combines stereo visualization with six-degrees-of-freedom, high-fidelity haptic feedback that enables analysis, planning, and preoperative testing of alternative solutions for restoring bone fragments to their proper positions. The stereo display provides accurate visual spatial perception, and the haptics system provides intuitive haptic feedback when bone fragments are in contact, as well as six-degrees-of-freedom attraction forces for precise bone fragment alignment. A senior surgeon without prior experience of the system received 45 min of system training. Following the training session, he completed a virtual reconstruction of a complex mandibular fracture in 22 min, with an adequately reduced result. Preliminary testing with one surgeon indicates that our surgery planning system, which combines stereo visualization with sophisticated haptics, has the potential to become a powerful tool for CMF surgery planning. With little training, it allows a surgeon to complete a complex plan in a short amount of time.

  20. Improvement in Titanium Complexes Bearing Schiff Base Ligands in the Ring-Opening Polymerization of L-Lactide: A Dinuclear System with Hydrazine-Bridging Schiff Base Ligands.

    PubMed

    Tseng, Hsi-Ching; Chen, Hsing-Yin; Huang, Yen-Tzu; Lu, Wei-Yi; Chang, Yu-Lun; Chiang, Michael Y; Lai, Yi-Chun; Chen, Hsuan-Ying

    2016-02-15

    A series of titanium (Ti) complexes bearing hydrazine-bridging Schiff base ligands were synthesized and investigated as catalysts for the ring-opening polymerization (ROP) of L-lactide (LA). Complexes with electron-withdrawing or sterically bulky groups showed reduced catalytic activity. In addition, sterically bulky substituents on the imine groups reduced the space around the Ti atom and hindered LA coordination with the Ti atom, thereby reducing catalytic activity. All the dinuclear Ti complexes exhibited higher catalytic activity (approximately 10-60-fold) than mononuclear L(Cl-H)-TiOPr2 did. The strategy of bridging dinuclear Ti complexes with isopropoxide groups for the ROP of LA was successful, and adjustment of the crowded heptacoordinated transition state by the bridging isopropoxide groups may be the key to its success.

  1. Reducing beam shaper alignment complexity: diagnostic techniques for alignment and tuning

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd E.

    2011-10-01

    Safe and efficient optical alignment is a critical requirement for industrial laser systems used in a high-volume manufacturing environment. Of specific interest is the development of techniques to align beam shaping optics within a beam line, with the ability to instantly verify by qualitative means that each element is in its proper position as the beam shaper module is being aligned. There is a need to reduce these types of alignment techniques to a level where even a newcomer to optical alignment can complete the task. Couple this alignment need with the fact that most laser system manufacturers ship their products worldwide, which introduces a new set of variables including cultural and language barriers, and this becomes a top priority for manufacturers. Tools and methodologies for alignment of complex optical systems need to be able to cross these barriers to ensure the highest degree of uptime and reduce the cost of maintenance on the production floor. Customers worldwide who purchase production laser equipment understand that the majority of costs to a manufacturing facility are spent on system maintenance, typically the largest single controllable expenditure in a production plant. This desire to reduce costs is driving the current trend toward predictive and proactive, rather than reactive, maintenance of laser-based optical beam delivery systems [10]. With proper diagnostic tools, laser system developers can develop proactive approaches to reduce system downtime, safeguard operational performance, and reduce premature or catastrophic optics failures. Obviously, analytical data provide quantifiable performance standards that are more precise than qualitative standards, but each has a role in determining overall optical system performance [10]. This paper will discuss the use of film and fluorescent mirror devices as diagnostic tools for beam shaper module alignment offline or in situ. The paper will also provide an overview of a methodology showing how complex alignment directions can be reduced to a simplified set of instructions for layman service engineers.

  2. Application of simplified Complexity Theory concepts for healthcare social systems to explain the implementation of evidence into practice.

    PubMed

    Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane

    2016-02-01

    To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients, and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework for the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted, and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to explain themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. Sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management, and philosophy. The five core Complexity Theory concepts extracted were 'self-organization', 'interaction', 'emergence', 'system history', and 'temporality'. Application of these concepts suggests that routine surgical fasting practice is habituated in the social healthcare system and therefore cannot easily be reversed. A reduction in fasting times requires an incentivised new approach to emerge within the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to changing fasting practice. Its utility in implementation research warrants further attention and evaluation. © 2015 John Wiley & Sons Ltd.

  3. 40-Gb/s PAM4 with low-complexity equalizers for next-generation PON systems

    NASA Astrophysics Data System (ADS)

    Tang, Xizi; Zhou, Ji; Guo, Mengqi; Qi, Jia; Hu, Fan; Qiao, Yaojun; Lu, Yueming

    2018-01-01

    In this paper, we demonstrate 40-Gb/s four-level pulse amplitude modulation (PAM4) transmission with 10 GHz devices and low-complexity equalizers for next-generation passive optical network (PON) systems. A simple feed-forward equalizer (FFE) and decision feedback equalizer (DFE) enable 20 km fiber transmission, while a high-complexity Volterra algorithm in combination with FFE and DFE can extend the transmission distance to 40 km. A simplified Volterra algorithm is proposed to reduce computational complexity. Simulation results show that the simplified Volterra algorithm reduces computational complexity by up to ∼75% at a cost of only 0.4 dB in power budget. At a forward error correction (FEC) threshold of 10^-3, we achieve 31.2 dB and 30.8 dB power budget over 40 km fiber transmission using the traditional FFE-DFE-Volterra and our simplified FFE-DFE-Volterra, respectively.
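
    The simplest of the equalizers mentioned, a feed-forward equalizer, can be sketched as an FIR filter adapted by the LMS rule (a toy one-tap-ISI channel with illustrative parameters, not the paper's 40-Gb/s simulation setup):

```python
import numpy as np

def lms_ffe(received, desired, taps=7, mu=0.01):
    """Feed-forward equalizer: an FIR filter whose taps are adapted by
    the LMS rule to undo linear channel distortion."""
    w = np.zeros(taps)
    x = np.zeros(taps)
    out = np.empty(len(received))
    for n, (r, d) in enumerate(zip(received, desired)):
        x = np.roll(x, 1)
        x[0] = r                   # newest received sample
        y = w @ x                  # equalizer output
        w += mu * (d - y) * x      # LMS tap update toward the known symbol
        out[n] = y
    return out, w

rng = np.random.default_rng(1)
sym = rng.choice([-3.0, -1.0, 1.0, 3.0], size=5000)   # PAM4 symbols
# Toy channel: one postcursor ISI tap plus additive noise.
rx = sym + 0.4 * np.concatenate(([0.0], sym[:-1])) \
         + rng.normal(scale=0.05, size=5000)

eq, w = lms_ffe(rx, sym)
mse_tail = float(np.mean((eq[-500:] - sym[-500:]) ** 2))
print(mse_tail)  # small once the taps have converged
```

    The Volterra equalizers the paper studies extend this linear filter with nonlinear (product) terms, which is where their extra complexity, and the motivation for simplifying them, comes from.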

  4. Levels of control exerted by the Isc iron-sulfur cluster system on biosynthesis of the formate hydrogenlyase complex.

    PubMed

    Pinske, Constanze; Jaroschinsky, Monique; Sawers, R Gary

    2013-06-01

    The membrane-associated formate hydrogenlyase (FHL) complex of bacteria like Escherichia coli is responsible for the disproportionation of formic acid into the gaseous products carbon dioxide and dihydrogen. It comprises minimally seven proteins including FdhF and HycE, the catalytic subunits of formate dehydrogenase H and hydrogenase 3, respectively. Four proteins of the FHL complex have iron-sulphur cluster ([Fe-S]) cofactors. Biosynthesis of [Fe-S] is principally catalysed by the Isc or Suf systems and each comprises proteins for assembly and for delivery of [Fe-S]. This study demonstrates that the Isc system is essential for biosynthesis of an active FHL complex. In the absence of the IscU assembly protein no hydrogen production or activity of FHL subcomponents was detected. A deletion of the iscU gene also resulted in reduced intracellular formate levels partially due to impaired synthesis of pyruvate formate-lyase, which is dependent on the [Fe-S]-containing regulator FNR. This caused reduced expression of the formate-inducible fdhF gene. The A-type carrier (ATC) proteins IscA and ErpA probably deliver [Fe-S] to specific apoprotein components of the FHL complex because mutants lacking either protein exhibited strongly reduced hydrogen production. Neither ATC protein could compensate for the lack of the other, suggesting that they had independent roles in [Fe-S] delivery to complex components. Together, the data indicate that the Isc system modulates FHL complex biosynthesis directly by provision of [Fe-S] as well as indirectly by influencing gene expression through the delivery of [Fe-S] to key regulators and enzymes that ultimately control the generation and oxidation of formate.

  5. Deconstructing the core dynamics from a complex time-lagged regulatory biological circuit.

    PubMed

    Eriksson, O; Brinne, B; Zhou, Y; Björkegren, J; Tegnér, J

    2009-03-01

    Complex regulatory dynamics is ubiquitous in molecular networks composed of genes and proteins. Recent progress in computational biology and its application to molecular data has generated a growing number of complex networks. Yet it has been difficult to understand the governing principles of these networks beyond graphical analysis or extensive numerical simulations. Here the authors exploit several simplifying biological circumstances that enable direct detection of the underlying dynamical regularities driving periodic oscillations in a dynamical nonlinear computational model of a protein-protein network. System analysis is performed using the cell cycle, a mathematically well-described complex regulatory circuit driven by external signals. By introducing an explicit time delay and using a 'tearing-and-zooming' approach, the authors reduce the system to a piecewise linear system with two variables that captures the dynamics of this complex network. A key step in the analysis is the identification of functional subsystems by identifying the relations between state variables within the model. These functional subsystems are referred to as dynamical modules operating as sensitive switches in the original complex model. Using reduced mathematical representations of the subsystems, the authors derive explicit conditions on how the cell cycle dynamics depends on system parameters and can, for the first time, analyse and prove global conditions for system stability. The approach, which includes utilising biological simplifying conditions, identification of dynamical modules, and mathematical reduction of model complexity, may be applicable to other well-characterised biological regulatory circuits. [Includes supplementary material].

  6. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.

  7. Computer simulations of dendrimer-polyelectrolyte complexes.

    PubMed

    Pandav, Gunja; Ganesan, Venkat

    2014-08-28

    We carry out a systematic analysis of static properties of the clusters formed by complexation between charged dendrimers and linear polyelectrolyte (LPE) chains in a dilute solution under good solvent conditions. We use single chain in mean-field simulations and analyze the structure of the clusters through radial distribution functions of the dendrimer, cluster size, and charge distributions. The effects of LPE length, charge ratio between LPE and dendrimer, the influence of salt concentration, and the dendrimer generation number are examined. Systems with short LPEs showed a reduced propensity for aggregation with dendrimers, leading to formation of smaller clusters. In contrast, larger dendrimers and longer LPEs lead to larger clusters with significant bridging. Increasing salt concentration was seen to reduce aggregation between dendrimers as a result of screening of electrostatic interactions. Generally, maximum complexation was observed in systems with an equal amount of net dendrimer and LPE charges, whereas either excess LPE or dendrimer concentrations resulted in reduced clustering between dendrimers.

  8. Reduced complexity of multi-track joint 2-D Viterbi detectors for bit-patterned media recording channel

    NASA Astrophysics Data System (ADS)

    Myint, L. M. M.; Warisarn, C.

    2017-05-01

    Two-dimensional (2-D) interference is one of the prominent challenges in ultra-high-density recording systems such as bit-patterned media recording (BPMR). The multi-track joint 2-D detection technique, aided by array-head reading, can tackle this problem effectively by jointly processing the multiple readback signals from adjacent tracks. Moreover, it can robustly alleviate the impairments due to track mis-registration (TMR) and media noise. However, the computational complexity of such detectors is normally too high to implement in practice, even for a few tracks. Therefore, in this paper, we focus on reducing the complexity of the multi-track joint 2-D Viterbi detector without paying a large performance penalty. We propose a simplified multi-track joint 2-D Viterbi detector with a manageable complexity level for the BPMR multi-track multi-head (MTMH) system. In the proposed method, the complexity of the detector's trellis is reduced with the help of a joint-track equalization method that employs 1-D equalizers and a 2-D generalized partial response (GPR) target. We also examine the performance of a full-fledged multi-track joint 2-D detector and conventional 2-D detection. The results show that the simplified detector performs close to the full-fledged detector, especially when the system faces high media noise, at significantly lower complexity.
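    The complexity issue is easy to make concrete: a single-track (1-D) Viterbi detector for a two-tap partial-response channel needs only two trellis states, whereas a joint detector over N tracks needs exponentially many per trellis column. The sketch below is a generic textbook 1-D Viterbi, not the simplified detector proposed in the paper; the channel taps and bit sequence are invented.

```python
# Minimal 1-D Viterbi detector for a two-tap ISI channel
# y[k] = b[k] + 0.5*b[k-1], bits b in {+1, -1}. The state is the previous
# bit, so the trellis has 2 states; a joint N-track 2-D detector would need
# 2**N states per column, which is the complexity problem the paper
# addresses. Channel taps and data are illustrative.

def viterbi_detect(received, taps=(1.0, 0.5)):
    states = (+1, -1)                       # candidate previous bit
    cost = {s: 0.0 for s in states}
    paths = {s: [] for s in states}
    for r in received:
        new_cost, new_paths = {}, {}
        for s in states:                    # s = current bit (new state)
            best = None
            for p in states:                # p = old previous bit
                expected = taps[0] * s + taps[1] * p
                c = cost[p] + (r - expected) ** 2
                if best is None or c < best[0]:
                    best = (c, paths[p] + [s])
            new_cost[s], new_paths[s] = best
        cost, paths = new_cost, new_paths
    return paths[min(cost, key=cost.get)]

bits = [+1, -1, -1, +1, +1, -1, +1]
# Noiseless readback, assuming an initial bit of +1 before the sequence.
rx = [bits[k] + 0.5 * (bits[k - 1] if k else +1) for k in range(len(bits))]
detected = viterbi_detect(rx)
```

    On a noiseless channel the zero-cost path is unique, so the detector recovers the written bits exactly; the joint 2-D versions discussed in the record enlarge this same recursion across tracks.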

  9. Describing the complexity of systems: multivariable "set complexity" and the information basis of systems biology.

    PubMed

    Galas, David J; Sakhanenko, Nikita A; Skupin, Alexander; Ignac, Tomasz

    2014-02-01

    Context dependence is central to the description of complexity. Keying on the pairwise definition of "set complexity," we use an information theory approach to formulate general measures of systems complexity. We examine the properties of multivariable dependency starting with the concept of interaction information. We then present a new measure for unbiased detection of multivariable dependency, "differential interaction information." This quantity for two variables reduces to the pairwise "set complexity" previously proposed as a context-dependent measure of information in biological systems. We generalize it here to an arbitrary number of variables. Critical limiting properties of the "differential interaction information" are key to the generalization. This measure extends previous ideas about biological information and provides a more sophisticated basis for the study of complexity. The properties of "differential interaction information" also suggest new approaches to data analysis. Given a data set of system measurements, differential interaction information can provide a measure of collective dependence, which can be represented in hypergraphs describing complex system interaction patterns. We investigate this kind of analysis using simulated data sets. The conjoining of a generalized set complexity measure, multivariable dependency analysis, and hypergraphs is our central result. While our focus is on complex biological systems, our results are applicable to any complex system.
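    The multivariable measures in this record build on interaction information, which for three variables is I(X;Y;Z) = I(X;Y) − I(X;Y|Z) and can be expanded entirely into joint entropies. A small self-contained sketch, using the classic XOR triple as a toy example (the data are invented for illustration, not the simulated sets analysed in the paper):

```python
# Interaction information I(X;Y;Z) = I(X;Y) - I(X;Y|Z), expanded into joint
# entropies. The XOR triple below is the classic synergy example: every
# pair of variables looks independent, yet any two jointly determine the
# third, so the interaction information is -1 bit.
from math import log2
from collections import Counter

def entropy(samples):
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

def interaction_information(xs, ys, zs):
    # Entropy of the joint distribution of the given columns.
    h = lambda *cols: entropy(list(zip(*cols)))
    return (h(xs) + h(ys) + h(zs)
            - h(xs, ys) - h(xs, zs) - h(ys, zs)
            + h(xs, ys, zs))

xs = [0, 0, 1, 1]
ys = [0, 1, 0, 1]
zs = [x ^ y for x, y in zip(xs, ys)]       # Z = X xor Y
ii = interaction_information(xs, ys, zs)   # -> -1.0
```

    The negative value flags a purely collective (synergetic) dependency that no pairwise measure detects, which is the kind of structure the "differential interaction information" generalizes to arbitrarily many variables.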

  10. Development of Radio Frequency Diesel Particulate Filter Sensor and Controls for Advanced Low Pressure Drop Systems to Reduce Engine Fuel Consumption (06B)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sappok, Alexander; Ragaller, Paul; Bromberg, Leslie

    This project developed a radio frequency-based sensor for accurate measurement of diesel particulate filter (DPF) loading with advanced low pressure-drop aftertreatment systems. The resulting technology demonstrated engine efficiency improvements through optimization of the combined engine-aftertreatment system while reducing emissions, system cost, and complexity to meet the DOE program objectives.

  11. Degradable polymeric carrier for the delivery of IL-10 plasmid DNA to prevent autoimmune insulitis of NOD mice.

    PubMed

    Koh, J J; Ko, K S; Lee, M; Han, S; Park, J S; Kim, S W

    2000-12-01

    Recently, we have reported that biodegradable poly [alpha-(4-aminobutyl)-L-glycolic acid] (PAGA) can condense and protect plasmid DNA from DNase I. In this study, we investigated whether the systemic administration of pCAGGS mouse IL-10 (mIL-10) expression plasmid complexed with PAGA can reduce the development of insulitis in non-obese diabetic (NOD) mice. PAGA/mIL-10 plasmid complexes were stable for more than 60 min, but the naked DNA was destroyed within 10 min by DNase I. The PAGA/DNA complexes were injected into the tail vein of 3-week-old NOD mice. Serum mIL-10 level peaked at 5 days after injection, and could be detected for more than 9 weeks. The prevalence of severe insulitis on 12-week-old NOD mice was markedly reduced by the intravenous injection of PAGA/DNA complex (15.7%) compared with that of naked DNA injection (34.5%) and non-treated controls (90.9%). In conclusion, systemic administration of pCAGGS mIL-10 plasmid/PAGA complexes can reduce the severity of insulitis in NOD mice. This study shows that the PAGA/DNA complex has the potential for the prevention of autoimmune diabetes mellitus. Gene Therapy (2000) 7, 2099-2104.

  12. Fluid Intelligence Predicts Novel Rule Implementation in a Distributed Frontoparietal Control Network.

    PubMed

    Tschentscher, Nadja; Mitchell, Daniel; Duncan, John

    2017-05-03

    Fluid intelligence has been associated with a distributed cognitive control or multiple-demand (MD) network, comprising regions of lateral frontal, insular, dorsomedial frontal, and parietal cortex. Human fluid intelligence is also intimately linked to task complexity, and the process of solving complex problems in a sequence of simpler, more focused parts. Here, a complex target detection task included multiple independent rules, applied one at a time in successive task epochs. Although only one rule was applied at a time, increasing task complexity (i.e., the number of rules) impaired performance in participants of lower fluid intelligence. Accompanying this loss of performance was reduced response to rule-critical events across the distributed MD network. The results link fluid intelligence and MD function to a process of attentional focus on the successive parts of complex behavior. SIGNIFICANCE STATEMENT Fluid intelligence is intimately linked to the ability to structure complex problems in a sequence of simpler, more focused parts. We examine the basis for this link in the functions of a distributed frontoparietal or multiple-demand (MD) network. With increased task complexity, participants of lower fluid intelligence showed reduced responses to task-critical events. Reduced responses in the MD system were accompanied by impaired behavioral performance. Low fluid intelligence is linked to poor foregrounding of task-critical information across a distributed MD system. Copyright © 2017 Tschentscher et al.

  13. Sensitivity based coupling strengths in complex engineering systems

    NASA Technical Reports Server (NTRS)

    Bloebaum, C. L.; Sobieszczanski-Sobieski, J.

    1993-01-01

    The iterative design scheme necessary for complex engineering systems is generally time consuming and difficult to implement. Although a decomposition approach results in a more tractable problem, the inherent couplings make establishing the interdependencies of the various subsystems difficult. Another difficulty lies in identifying the most efficient order of execution for the subsystem analyses. The paper describes an approach for determining the dependencies that could be suspended during the system analysis with minimal accuracy losses, thereby reducing the system complexity. A new multidisciplinary testbed is presented, involving the interaction of structures, aerodynamics, and performance disciplines. Results are presented to demonstrate the effectiveness of the system reduction scheme.

  14. Characterizing complexity in socio-technical systems: a case study of a SAMU Medical Regulation Center.

    PubMed

    Righi, Angela Weber; Wachs, Priscila; Saurin, Tarcísio Abreu

    2012-01-01

    Complexity theory has been adopted by a number of studies as a benchmark to investigate the performance of socio-technical systems, especially those characterized by relevant cognitive work. However, there is little guidance on how to assess systematically the extent to which a system is complex. The main objective of this study is to carry out a systematic analysis of a SAMU (Mobile Emergency Medical Service) Medical Regulation Center in Brazil, based on the core characteristics of complex systems presented in previous studies. The assessment was based on direct observations and nine interviews: three with medical doctors who regulate emergencies, three with radio operators and three with telephone attendants. The results indicated that, to a great extent, the core characteristics of complexity are magnified due to basic shortcomings in the design of the work system. Thus, some recommendations are put forward with a view to reducing unnecessary complexity that hinders the performance of the socio-technical system.

  15. A New Low Complexity Angle of Arrival Algorithm for 1D and 2D Direction Estimation in MIMO Smart Antenna Systems

    PubMed Central

    Al-Sadoon, Mohammed A. G.; Zuid, Abdulkareim; Jones, Stephen M. R.; Noras, James M.

    2017-01-01

    This paper proposes a new low complexity angle of arrival (AOA) method for signal direction estimation in multi-element smart wireless communication systems. The new method estimates the AOAs of the received signals directly from the received signals with significantly reduced complexity since it does not need to construct the correlation matrix, invert the matrix or apply eigen-decomposition, which are computationally expensive. A mathematical model of the proposed method is illustrated and then verified using extensive computer simulations. Both linear and circular sensor arrays are studied using various numerical examples. The method is systematically compared with other common and recently introduced AOA methods over a wide range of scenarios. The simulated results show that the new method has several advantages in terms of reduced complexity and improved accuracy under the assumptions of correlated signals and limited numbers of snapshots. PMID:29140313

  16. A New Low Complexity Angle of Arrival Algorithm for 1D and 2D Direction Estimation in MIMO Smart Antenna Systems.

    PubMed

    Al-Sadoon, Mohammed A G; Ali, Nazar T; Dama, Yousf; Zuid, Abdulkareim; Jones, Stephen M R; Abd-Alhameed, Raed A; Noras, James M

    2017-11-15

    This paper proposes a new low complexity angle of arrival (AOA) method for signal direction estimation in multi-element smart wireless communication systems. The new method estimates the AOAs of the received signals directly from the received signals with significantly reduced complexity since it does not need to construct the correlation matrix, invert the matrix or apply eigen-decomposition, which are computationally expensive. A mathematical model of the proposed method is illustrated and then verified using extensive computer simulations. Both linear and circular sensor arrays are studied using various numerical examples. The method is systematically compared with other common and recently introduced AOA methods over a wide range of scenarios. The simulated results show that the new method has several advantages in terms of reduced complexity and improved accuracy under the assumptions of correlated signals and limited numbers of snapshots.
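    The proposed estimator's details are in the paper itself; as a point of reference, the classical delay-and-sum (Bartlett) scan below illustrates the steering-vector model such AOA methods operate on. This is the textbook baseline with invented parameters, not the authors' reduced-complexity algorithm.

```python
# Classical delay-and-sum (Bartlett) AOA scan for an 8-element uniform
# linear array with half-wavelength spacing, pure stdlib. One noiseless
# snapshot from a source at 30 degrees; the scan peaks at the true angle.
# All parameters are invented for illustration.
import cmath
import math

N = 8
TRUE_DEG = 30.0

def steering(deg):
    """Array response for a plane wave arriving from `deg` degrees."""
    phase = math.pi * math.sin(math.radians(deg))
    return [cmath.exp(-1j * n * phase) for n in range(N)]

snapshot = steering(TRUE_DEG)   # noiseless single snapshot

def scan_power(deg):
    a = steering(deg)
    corr = sum(ak.conjugate() * xk for ak, xk in zip(a, snapshot))
    return abs(corr) ** 2

angles = [d / 10 for d in range(-900, 901)]   # -90..90 deg, 0.1-deg grid
estimate = max(angles, key=scan_power)
```

    Even this simple scan requires correlating the snapshot against every candidate steering vector; subspace methods add correlation-matrix construction and eigen-decomposition on top, which is exactly the cost the paper's method avoids.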

  17. Managing Complex Interoperability Solutions using Model-Driven Architecture

    DTIC Science & Technology

    2011-06-01

    such as Oracle or MySQL. Each data model for a specific RDBMS is a distinct PSM. Or the system may want to exchange information with other C2...reduced number of transformations, e.g., from an RDBMS physical schema to the corresponding SQL script needed to instantiate the tables in a relational...importance of models. In engineering, a model serves several purposes: 1. It presents an abstract view of a complex system or of a complex information

  18. Moving alcohol prevention research forward-Part I: introducing a complex systems paradigm.

    PubMed

    Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller

    2018-02-01

    The drinking environment is a complex system consisting of a number of heterogeneous, evolving and interacting components, which exhibit circular causality and emergent properties. These characteristics reduce the efficacy of commonly used research approaches, which typically do not account for the underlying dynamic complexity of alcohol consumption and the interdependent nature of diverse factors influencing misuse over time. We use alcohol misuse among college students in the United States as an example for framing our argument for a complex systems paradigm. A complex systems paradigm, grounded in socio-ecological and complex systems theories and computational modeling and simulation, is introduced. Theoretical, conceptual, methodological and analytical underpinnings of this paradigm are described in the context of college drinking prevention research. The proposed complex systems paradigm can transcend limitations of traditional approaches, thereby fostering new directions in alcohol prevention research. By conceptualizing student alcohol misuse as a complex adaptive system, computational modeling and simulation methodologies and analytical techniques can be used. Moreover, use of participatory model-building approaches to generate simulation models can further increase stakeholder buy-in, understanding and policymaking. A complex systems paradigm for research into alcohol misuse can provide a holistic understanding of the underlying drinking environment and its long-term trajectory, which can elucidate high-leverage preventive interventions. © 2017 Society for the Study of Addiction.

  19. When physics is not "just physics": complexity science invites new measurement frames for exploring the physics of cognitive and biological development.

    PubMed

    Kelty-Stephen, Damian; Dixon, James A

    2012-01-01

    The neurobiological sciences have struggled to resolve the physical foundations for biological and cognitive phenomena with a suspicion that biological and cognitive systems, capable of exhibiting and contributing to structure within themselves and through their contexts, are fundamentally distinct or autonomous from purely physical systems. Complexity science offers new physics-based approaches to explaining biological and cognitive phenomena. In response to controversy over whether complexity science might seek to "explain away" biology and cognition as "just physics," we propose that complexity science serves as an application of recent advances in physics to phenomena in biology and cognition without reducing or undermining the integrity of the phenomena to be explained. We highlight that physics is, like the neurobiological sciences, an evolving field and that the threat of reduction is overstated. We propose that distinctions between biological and cognitive systems from physical systems are pretheoretical and thus optional. We review our own work applying insights from post-classical physics regarding turbulence and fractal fluctuations to the problems of developing cognitive structure. Far from hoping to reduce biology and cognition to "nothing but" physics, we present our view that complexity science offers new explanatory frameworks for considering physical foundations of biological and cognitive phenomena.

  20. Effect of Work Complexity & Individual Differences on Nursing IT Utilization

    ERIC Educational Resources Information Center

    Tian, Renran

    2013-01-01

    Various healthcare IT systems have been developed to reduce medication errors. Although these systems can help to improve patient safety and reduce adverse medical events, new problems also arise with their use. One key problem during IT implementation is the change in work processes. Although many of these changes are recorded…

  1. Structural reducibility of multilayer networks

    NASA Astrophysics Data System (ADS)

    de Domenico, Manlio; Nicosia, Vincenzo; Arenas, Alexandre; Latora, Vito

    2015-04-01

    Many complex systems can be represented as networks consisting of distinct types of interactions, which can be categorized as links belonging to different layers. For example, a good description of the full protein-protein interactome requires, for some organisms, up to seven distinct network layers, accounting for different genetic and physical interactions, each containing thousands of protein-protein relationships. A fundamental open question is then how many layers are indeed necessary to accurately represent the structure of a multilayered complex system. Here we introduce a method based on quantum theory to reduce the number of layers to a minimum while maximizing the distinguishability between the multilayer network and the corresponding aggregated graph. We validate our approach on synthetic benchmarks and we show that the number of informative layers in some real multilayer networks of protein-genetic interactions, social, economic and transportation systems can be reduced by up to 75%.
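    The quantum-theoretic ingredient here is the Von Neumann entropy of a network layer, computed from the trace-normalized Laplacian; identical layers add no information, so merging them costs no distinguishability. A pure-Python toy sketch of that one ingredient (with a small Jacobi eigenvalue solver; a simplified illustration, not the paper's full quantum Jensen-Shannon machinery):

```python
# Von Neumann entropy of a network layer: h = -sum_i lam_i * log2(lam_i),
# where lam_i are the eigenvalues of the trace-normalized Laplacian
# rho = L / tr(L). Aggregating two identical layers doubles L but leaves
# rho, and hence h, unchanged -- the intuition behind merging redundant
# layers. The triangle graph below is a toy example.
from math import log2, cos, sin, atan2

def jacobi_eigenvalues(m, rotations=100):
    a = [row[:] for row in m]
    n = len(a)
    for _ in range(rotations):
        # Zero out the largest off-diagonal element with a plane rotation.
        p, q = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                   key=lambda ij: abs(a[ij[0]][ij[1]]))
        if abs(a[p][q]) < 1e-12:
            break
        theta = 0.5 * atan2(2 * a[p][q], a[q][q] - a[p][p])
        c, s = cos(theta), sin(theta)
        for k in range(n):                    # A <- A J
            akp, akq = a[k][p], a[k][q]
            a[k][p] = c * akp - s * akq
            a[k][q] = s * akp + c * akq
        for k in range(n):                    # A <- J^T A
            apk, aqk = a[p][k], a[q][k]
            a[p][k] = c * apk - s * aqk
            a[q][k] = s * apk + c * aqk
    return [a[i][i] for i in range(n)]

def vn_entropy(adj):
    n = len(adj)
    lap = [[(sum(adj[i]) if i == j else 0.0) - adj[i][j] for j in range(n)]
           for i in range(n)]
    tr = sum(lap[i][i] for i in range(n))
    lam = jacobi_eigenvalues([[v / tr for v in row] for row in lap])
    return -sum(l * log2(l) for l in lam if l > 1e-9)

tri = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]       # triangle layer
agg = [[2 * v for v in row] for row in tri]   # two identical layers merged
h1, h2 = vn_entropy(tri), vn_entropy(agg)     # equal: nothing was lost
```

    The paper's method greedily aggregates layers while this kind of entropy-based distinguishability from the aggregate stays high, which is how up to 75% of layers can turn out to be redundant.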

  2. Terpyridine complexes of first row transition metals and electrochemical reduction of CO₂ to CO.

    PubMed

    Elgrishi, Noémie; Chambers, Matthew B; Artero, Vincent; Fontecave, Marc

    2014-07-21

    Homoleptic terpyridine complexes of first row transition metals are evaluated as catalysts for the electrocatalytic reduction of CO2. Ni and Co-based catalytic systems are shown to reduce CO2 to CO under the conditions tested. The Ni complex was found to exhibit selectivity for CO2 over proton reduction while the Co-based system generates mixtures of CO and H2 with CO : H2 ratios being tuneable through variation of the applied potential.

  3. Anti-aliasing filter design on spaceborne digital receiver

    NASA Astrophysics Data System (ADS)

    Yu, Danru; Zhao, Chonghui

    2009-12-01

    In recent years, with the development of satellite observation technologies, more and more active remote sensing technologies have been adopted in spaceborne systems. A spaceborne precipitation radar depends heavily on high-performance digital processing to collect meaningful rain echo data. This increases the complexity of the spaceborne system and requires a high-performance, reliable digital receiver. This paper analyzes the frequency aliasing in the intermediate-frequency signal sampling of digital down conversion (DDC) in spaceborne radar, and presents an effective digital filter. By analysis and calculation, we choose reasonable parameters for the half-band filters to suppress frequency aliasing in the DDC. Compared with a traditional filter, the FPGA resource cost in our system is reduced by over 50%. This can effectively reduce the complexity of the spaceborne digital receiver and improve system reliability.
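    Half-band filters earn their FPGA savings because every other coefficient, apart from the center tap, is zero and so needs no multiplier. A minimal windowed-sinc half-band design in pure Python illustrates the tap pattern (the length and window are invented for illustration, not the flight design):

```python
# Windowed-sinc half-band lowpass (cutoff at fs/4): the impulse response is
# proportional to sinc(n/2), which vanishes at every even offset n != 0.
# Those taps cost no multipliers, which is the source of the FPGA resource
# savings cited for half-band DDC stages.
import math

def halfband_taps(num_taps=31):
    assert num_taps % 4 == 3          # odd length with nonzero end taps
    mid = num_taps // 2
    taps = []
    for n in range(num_taps):
        k = n - mid
        ideal = 0.5 if k == 0 else math.sin(math.pi * k / 2) / (math.pi * k)
        window = 0.54 + 0.46 * math.cos(math.pi * k / mid)   # Hamming
        taps.append(ideal * window)
    return taps

taps = halfband_taps()
mid = len(taps) // 2
# Every even offset except the center is zero (up to float rounding).
even_offsets = [abs(taps[mid + k]) for k in range(-mid, mid + 1)
                if k != 0 and k % 2 == 0]
```

    For a 31-tap filter, 14 of the coefficients vanish, so a cascade of such stages roughly halves the multiplier count relative to a general FIR of the same length.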

  4. Method of reduction of the number of driving system channels for phased-array transducers using isolation transformers.

    PubMed

    Fjield, T; Hynynen, K

    2000-01-01

    Phased-array technology offers an incredible advantage to therapeutic ultrasound due to the ability to electronically steer foci, create multiple foci, or to create an enlarged focal region by using phase cancellation. However, to take advantage of this flexibility, the phased-arrays generally consist of many elements. Each of these elements requires its own radio-frequency generator with independent amplitude and phase control, resulting in a large, complex, and expensive driving system. A method is presented here where in certain cases the number of amplifier channels can be reduced to a fraction of the number of transducer elements, thereby simplifying the driving system and reducing the overall system complexity and cost, by using isolation transformers to produce 180 degrees phase shifts.

  5. Managing Programmatic Risk for Complex Space System Developments

    NASA Technical Reports Server (NTRS)

    Panetta, Peter V.; Hastings, Daniel; Brumfield, Mark (Technical Monitor)

    2001-01-01

    Risk management strategies have become an important recent research topic for many aerospace organizations as they prepare to develop the revolutionary complex space systems of the future. Future multi-disciplinary complex space systems will make it absolutely essential for organizations to practice a rigorous, comprehensive risk management process, emphasizing thorough systems engineering principles, in order to succeed. Project managers must possess strong leadership skills to direct high-quality, cross-disciplinary teams in successfully developing revolutionary space systems of ever-increasing complexity. Proactive efforts to reduce or eliminate risk throughout a project's lifecycle ideally must be practiced by all technical members of the organization. This paper discusses risk management perspectives collected, through interviews and surveys, from senior managers and project managers of aerospace and aeronautical organizations. Some of the programmatic risks that drive the success or failure of projects are revealed. Key findings lead to a number of insights for organizations to consider in proactively approaching the risks facing current and future complex space system projects.

  6. Model predictive control based on reduced order models applied to belt conveyor system.

    PubMed

    Chen, Wei; Li, Xin

    2016-11-01

    In this paper, a model predictive controller based on a reduced-order model is proposed to control a belt conveyor system, a complex electromechanical system with a long visco-elastic body. Firstly, in order to design a low-order controller, the balanced truncation method is used for belt conveyor model reduction. Secondly, an MPC algorithm based on the reduced-order model of the belt conveyor system is presented. Because of the error bound between the full-order model and the reduced-order model, two Kalman state estimators are applied in the control scheme to achieve better system performance. Finally, simulation experiments show that the balanced truncation method can significantly reduce the model order with high accuracy and that model predictive control based on the reduced model performs well in controlling the belt conveyor system. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
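    Balanced truncation ranks states by their Hankel singular values, computed from the controllability and observability gramians, and discards the weakly coupled ones. A small self-contained sketch of those ingredients for a toy discrete-time system (the matrices are invented, not the belt conveyor model; the Lyapunov equations are solved by simple fixed-point iteration, which is valid for stable A):

```python
# Balanced-truncation ingredients for a toy 2-state discrete-time system:
# gramians P = A P A^T + B B^T and Q = A^T Q A + C^T C via fixed-point
# iteration, then Hankel singular values sqrt(eig(P Q)). A large gap
# between the HSVs means one state barely couples input to output and can
# be truncated with little error. All matrices are illustrative.
import math

A = [[0.9, 0.0], [0.0, 0.2]]   # stable: spectral radius 0.9 < 1
B = [[1.0], [0.1]]
C = [[1.0, 0.1]]

def matmul(x, y):
    return [[sum(x[i][k] * y[k][j] for k in range(len(y)))
             for j in range(len(y[0]))] for i in range(len(x))]

def transpose(x):
    return [list(row) for row in zip(*x)]

def lyapunov(a, rhs, iters=500):
    """Solve P = a P a^T + rhs by fixed-point iteration (2x2 toy)."""
    p = [[0.0, 0.0], [0.0, 0.0]]
    for _ in range(iters):
        apat = matmul(matmul(a, p), transpose(a))
        p = [[apat[i][j] + rhs[i][j] for j in range(2)] for i in range(2)]
    return p

P = lyapunov(A, matmul(B, transpose(B)))                 # controllability
Q = lyapunov(transpose(A), matmul(transpose(C), C))      # observability

PQ = matmul(P, Q)
tr = PQ[0][0] + PQ[1][1]
det = PQ[0][0] * PQ[1][1] - PQ[0][1] * PQ[1][0]
disc = math.sqrt(tr * tr - 4.0 * det)
hsv = sorted([math.sqrt((tr + disc) / 2), math.sqrt((tr - disc) / 2)],
             reverse=True)   # Hankel singular values, largest first
```

    For a long visco-elastic conveyor model with many states, the same computation (done with proper numerical linear algebra) exposes how few balanced states carry most of the input-output behaviour, which is what makes a low-order MPC design feasible.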

  7. Data-assisted reduced-order modeling of extreme events in complex dynamical systems

    PubMed Central

    Koumoutsakos, Petros

    2018-01-01

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of equations projected into a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably, the improvement is more significant in regions associated with extreme events, where data is sparse. PMID:29795631

  8. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    PubMed

    Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis

    2018-01-01

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of equations projected into a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected to the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably, the improvement is more significant in regions associated with extreme events, where data is sparse.

  9. RF Jitter Modulation Alignment Sensing

    NASA Astrophysics Data System (ADS)

    Ortega, L. F.; Fulda, P.; Diaz-Ortiz, M.; Perez Sanchez, G.; Ciani, G.; Voss, D.; Mueller, G.; Tanner, D. B.

    2017-01-01

    We will present the numerical and experimental results of a new alignment sensing scheme which can reduce the complexity of alignment sensing systems currently used, while maintaining the same shot noise limited sensitivity. This scheme relies on the ability of electro-optic beam deflectors to create angular modulation sidebands in radio frequency, and needs only a single-element photodiode and IQ demodulation to generate error signals for tilt and translation degrees of freedom in one dimension. It distances itself from current techniques by eliminating the need for beam centering servo systems, quadrant photodetectors and Gouy phase telescopes. RF Jitter alignment sensing can be used to reduce the complexity in the alignment systems of many laser optical experiments, including LIGO and the ALPS experiment.

  10. Iconic hyperlinks on e-commerce websites.

    PubMed

    Cheng, Hong-In; Patterson, Patrick E

    2007-01-01

    The proper use of iconic interfaces reduces system complexity and helps users interact with systems more easily. However, due to carelessness, inadequate research, and the web's relatively short history, the icons used on web sites often are ambiguous. Because non-identifiable icons may convey meanings other than those intended, designers must consider whether icons are easily identifiable when creating web sites. In this study, visual icons used on e-business web sites were examined by population stereotypy and categorized into three groups: identifiable, medium, and vague. Representative icons from each group were tested by comparing selection performance in groups of student volunteers, with identifiable and medium icons improving performance. We found that only easily identifiable icons can reduce complexity and increase system usability.

  11. A Reduced-Complexity Investigation of Blunt Leading-Edge Separation Motivated by UCAV Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, James M.; Boelens, Okko J.

    2015-01-01

    A reduced-complexity investigation of blunt-leading-edge vortical separation has been undertaken. The overall approach is to design the fundamental work in such a way that it relates to the aerodynamics of a more complex Uninhabited Combat Air Vehicle (UCAV) concept known as SACCON. Some of the challenges associated with both the vehicle-class aerodynamics and the fundamental vortical flows are reviewed, and principles from a hierarchical-complexity approach are used to relate flow fundamentals to system-level interests. The work is part of a roughly six-year research program on blunt-leading-edge separation pertinent to UCAVs, and was conducted under the NATO Science and Technology Organization, Applied Vehicle Technology Panel.

  12. Implementation of Complexity Analyzing Based on Additional Effect

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang

    According to Complexity Theory, complexity exists in a system when a functional requirement is not satisfied. Several studies have applied Complexity Theory within Axiomatic Design; however, they focus on reducing complexity, and none addresses a method for analyzing the complexity present in a system. This paper therefore puts forth a method of analyzing complexity that seeks to make up for this deficiency. To develop the method, which is based on additional effect, the paper introduces two concepts: ideal effect and additional effect. The method combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ) and helps designers analyze complexity through additional effects. A case study shows the application of the process.

  13. Enhancing Integrated Pest Management in GM Cotton Systems Using Host Plant Resistance

    PubMed Central

    Trapero, Carlos; Wilson, Iain W.; Stiller, Warwick N.; Wilson, Lewis J.

    2016-01-01

    Cotton has lost many ancestral defensive traits against key invertebrate pests. This is suggested by the levels of resistance to some pests found in wild cotton genotypes as well as in cultivated landraces and is a result of domestication and a long history of targeted breeding for yield and fiber quality, along with the capacity to control pests with pesticides. Genetic modification (GM) allowed integration of toxins from a bacterium into cotton to control key Lepidopteran pests. Since the mid-1990s, use of GM cotton cultivars has greatly reduced the amount of pesticides used in many cotton systems. However, pests not controlled by the GM traits have usually emerged as problems, especially the sucking bug complex. Control of this complex with pesticides often causes a reduction in beneficial invertebrate populations, allowing other secondary pests to increase rapidly and require control. Control of both the sucking bug complex and secondary pests is problematic due to the cost of pesticides and/or the high risk of selecting for pesticide resistance. Deployment of host plant resistance (HPR) provides an opportunity to manage these issues in GM cotton systems. Cotton cultivars resistant to the sucking bug complex and/or secondary pests would require fewer pesticide applications, reducing costs and risks to beneficial invertebrate populations and of pesticide resistance. Incorporation of HPR traits into elite cotton cultivars with high yield and fiber quality offers the potential to further reduce pesticide use and increase the durability of pest management in GM cotton systems. We review the challenges that the identification and use of HPR against invertebrate pests brings to cotton breeding. We explore sources of resistance to the sucking bug complex and secondary pests, the mechanisms that control them, and the approaches to incorporate these defense traits into commercial cultivars. PMID:27148323

  14. The Design, Development and Testing of a Multi-process Real-time Software System

    DTIC Science & Technology

    2007-03-01

    programming large systems stems from the complexity of dealing with many different details at one time. A sound engineering approach is to break...controls and 3) is portable to other OS platforms such as Microsoft Windows. Next, to reduce the complexity of the programming tasks, the system...processes depending on how often the process has to check to see if common data was modified. A good method for one process to quickly notify another

  15. The Difference between Uncertainty and Information, and Why This Matters

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.

    2016-12-01

    Earth science investigation and arbitration (for decision making) is very often organized around a concept of uncertainty. It seems relatively straightforward that the purpose of our science is to reduce uncertainty about how environmental systems will react and evolve under different conditions. I propose here that approaching a science of complex systems as a process of quantifying and reducing uncertainty is a mistake, and specifically a mistake that is rooted in certain historic logical errors. Instead, I propose that we should be asking questions about information. I argue that an information-based perspective facilitates almost trivial answers to environmental science questions that are either difficult or theoretically impossible to answer when posed as questions about uncertainty. In particular, I propose that an information-centric perspective leads to: (i) coherent and non-subjective hypothesis tests for complex systems models; (ii) process-level diagnostics for complex systems models; (iii) methods for building complex systems models that allow for inductive inference without the need for a priori specification of likelihood functions or ad hoc error metrics; and (iv) asymptotically correct quantification of epistemic uncertainty. To put this in slightly more basic terms, I propose that an information-theoretic philosophy of science has the potential to resolve certain important aspects of the Demarcation Problem and the Duhem-Quine Problem, and that Hydrology and the other Earth systems sciences can immediately capitalize on this to address some of our most difficult and persistent problems.
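
    The information-centric framing can be made concrete with a toy calculation: mutual information measures how much one variable tells us about another directly from samples, with no likelihood function specified. A minimal sketch with purely illustrative binary data:

    ```python
    import math
    from collections import Counter

    def mutual_information(xs, ys):
        """I(X;Y) in bits, estimated from paired discrete samples."""
        n = len(xs)
        px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
        # Sum over observed joint outcomes of p(x,y) * log2(p(x,y) / (p(x)p(y)))
        return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
                   for (x, y), c in pxy.items())

    xs = [0, 1, 0, 1] * 100
    mi_dep = mutual_information(xs, list(xs))            # y fully determined by x
    mi_ind = mutual_information(xs, [0, 0, 1, 1] * 100)  # empirically independent pairing
    ```

    A deterministic binary relationship yields one full bit of shared information, while the independent pairing yields none; the same estimator applies unchanged to model output versus observations.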

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Boston Christian Science Center recently purchased its own packaged boiler system to provide heating and cooling steam for its building complex. The system is expected to reduce the center's energy costs by $450,000 in the first year.

  17. Exploring Complex Systems Aspects of Blackout Risk and Mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, David E; Carreras, Benjamin A; Lynch, Vickie E

    2011-01-01

    Electric power transmission systems are a key infrastructure, and blackouts of these systems have major consequences for the economy and national security. Analyses of blackout data suggest that blackout size distributions have a power-law form over much of their range. This result is an indication that blackouts behave as a complex dynamical system. We use a simulation of an upgrading power transmission system to investigate how these complex system dynamics impact the assessment and mitigation of blackout risk. The mitigation of failures in complex systems needs to be approached with care. Mitigation efforts can move the system to a new dynamic equilibrium while remaining near criticality and preserving the power-law region. Thus, while the absolute frequency of blackouts of all sizes may be reduced, the underlying forces can still cause the relative frequency of large blackouts to small blackouts to remain the same. Moreover, in some cases, efforts to mitigate small blackouts can even increase the frequency of large blackouts. This occurs because large and small blackouts are not mutually independent, but are strongly coupled by the complex dynamics.
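
    The power-law tail referred to above can be diagnosed from event-size data with a maximum-likelihood exponent estimate. A minimal sketch on synthetic samples; the exponent, cutoff, and sample size are illustrative, not taken from the study:

    ```python
    import math, random

    def powerlaw_alpha_mle(xs, xmin):
        """Continuous power-law MLE (Hill estimator): alpha = 1 + n / sum(ln(x/xmin))."""
        tail = [x for x in xs if x >= xmin]
        return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

    random.seed(0)
    alpha_true, xmin = 2.5, 1.0
    # Inverse-CDF sampling for p(x) ~ x^(-alpha), x >= xmin
    xs = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(200_000)]
    alpha_hat = powerlaw_alpha_mle(xs, xmin)
    ```

    The recovered exponent converges to the generating one as the sample grows; on real blackout data the choice of xmin itself must be validated, which this sketch omits.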

  18. Equation Discovery for Model Identification in Respiratory Mechanics of the Mechanically Ventilated Human Lung

    NASA Astrophysics Data System (ADS)

    Ganzert, Steven; Guttmann, Josef; Steinmann, Daniel; Kramer, Stefan

    Lung-protective ventilation strategies reduce the risk of ventilator-associated lung injury. To develop such strategies, knowledge about the mechanical properties of the mechanically ventilated human lung is essential. This study was designed to develop an equation discovery system to identify mathematical models of the respiratory system in time-series data obtained from mechanically ventilated patients. Two techniques were combined: (i) the use of declarative bias to reduce the complexity of the search space, which inherently provides for the processing of background knowledge; and (ii) a newly developed heuristic for traversing the hypothesis space with a greedy, randomized strategy analogous to the GSAT algorithm. In 96.8% of all runs the equation discovery system was capable of detecting the well-established equation-of-motion model of the respiratory system in the provided data. We see the potential of this semi-automatic approach to detect more complex mathematical descriptions of the respiratory system from respiratory data.
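
    The equation-of-motion model referred to is commonly written Paw = E*V + R*V' + P0 (elastance, resistance, offset pressure). Once the model form is known, its coefficients follow from ordinary least squares; the sketch below fits synthetic, illustrative ventilation data, not patient data from the study:

    ```python
    import math

    def solve3(A, b):
        """Gaussian elimination with partial pivoting for a 3x3 linear system."""
        m = [row[:] + [bi] for row, bi in zip(A, b)]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
            m[col], m[piv] = m[piv], m[col]
            for r in range(col + 1, 3):
                f = m[r][col] / m[col][col]
                for c in range(col, 4):
                    m[r][c] -= f * m[col][c]
        x = [0.0, 0.0, 0.0]
        for r in (2, 1, 0):
            x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
        return x

    # Synthetic one-breath data from the assumed model Paw = E*V + R*V' + P0
    E_true, R_true, P0_true = 25.0, 10.0, 5.0      # illustrative parameter values
    ts = [k * 0.01 for k in range(300)]            # one 3 s breath cycle
    V = [0.25 * (1 - math.cos(2 * math.pi * t / 3)) for t in ts]           # volume
    flow = [0.25 * (2 * math.pi / 3) * math.sin(2 * math.pi * t / 3) for t in ts]
    Paw = [E_true * v + R_true * f + P0_true for v, f in zip(V, flow)]

    # Least squares via the normal equations with regressors [V, flow, 1]
    Svv = sum(v * v for v in V); Svf = sum(v * f for v, f in zip(V, flow))
    Sff = sum(f * f for f in flow); Sv = sum(V); Sf = sum(flow)
    Svp = sum(v * p for v, p in zip(V, Paw))
    Sfp = sum(f * p for f, p in zip(flow, Paw)); Sp = sum(Paw)
    A = [[Svv, Svf, Sv], [Svf, Sff, Sf], [Sv, Sf, float(len(ts))]]
    E_hat, R_hat, P0_hat = solve3(A, [Svp, Sfp, Sp])
    ```

    The discovery system's harder task is finding the model structure itself; the fit above is only the final parameter-estimation step once a candidate equation is on the table.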

  19. A development framework for artificial intelligence based distributed operations support systems

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Cottman, Bruce H.

    1990-01-01

    Advanced automation is required to reduce costly human operations support requirements for complex space-based and ground control systems. Existing knowledge-based technologies have been used successfully to automate individual operations tasks. Considerably less progress has been made in integrating and coordinating multiple operations applications into unified intelligent support systems. To fill this gap, SOCIAL, a tool set for developing Distributed Artificial Intelligence (DAI) systems, is being constructed. SOCIAL consists of three primary language-based components defining: models of interprocess communication across heterogeneous platforms; models for interprocess coordination, concurrency control, and fault management; and models for accessing heterogeneous information resources. DAI application subsystems, either new or existing, will access these distributed services non-intrusively via high-level message-based protocols. SOCIAL will reduce the complexity of distributed communications, control, and integration, enabling developers to concentrate on the design and functionality of the target DAI system itself.

  20. A program-level management system for the life cycle environmental and economic assessment of complex building projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Chan-Joong; Kim, Jimin; Hong, Taehoon

    Climate change has become one of the most significant environmental issues, and about 40% of emissions come from the building sector. In particular, complex building projects with various functions have increased, and these should be managed from a program-level perspective. Therefore, this study aimed to develop a program-level management system for the life-cycle environmental and economic assessment of complex building projects. The developed system consists of three parts: (i) input part: database server and input data; (ii) analysis part: life cycle assessment and life cycle cost; and (iii) result part: microscopic analysis and macroscopic analysis. To analyze the applicability of the developed system, this study selected 'U' University, a complex building project consisting of a research facility and a residential facility. Through value engineering with experts, a total of 137 design alternatives were established. Based on these alternatives, the macroscopic analysis results were as follows: (i) at the program level, the life-cycle environmental and economic costs of 'U' University were reduced by 6.22% and 2.11%, respectively; (ii) at the project level, the life-cycle environmental and economic costs of the research facility were reduced by 6.01% and 1.87%, respectively, and those of the residential facility by 12.01% and 3.83%, respectively; and (iii) for the mechanical work at the work-type level, the initial cost was increased by 2.9%, but the operation and maintenance cost was reduced by 20.0%. As a result, the developed system can allow facility managers to establish operation and maintenance strategies for the environmental and economic aspects from a program-level perspective. - Highlights: • A program-level management system for complex building projects was developed. • Life-cycle environmental and economic assessment can be conducted using the system. • The design alternatives can be analyzed from the microscopic perspective. • The system can be used to establish the optimal O&M strategy at the program level. • It can be applied to any other country or sector in the global environment.

  1. Computationally efficient algorithm for high sampling-frequency operation of active noise control

    NASA Astrophysics Data System (ADS)

    Rout, Nirmal Kumar; Das, Debi Prasad; Panda, Ganapati

    2015-05-01

    In high-sampling-frequency operation of an active noise control (ANC) system, the secondary path estimate and the ANC filter are very long. This increases the computational complexity of the conventional filtered-x least mean square (FXLMS) algorithm. To reduce the computational complexity of long-order ANC systems using the FXLMS algorithm, frequency-domain block ANC algorithms have been proposed in the past. These full-block frequency-domain ANC algorithms suffer from disadvantages such as large block delay, quantization error due to the computation of large-size transforms, and implementation difficulties on existing low-end DSP hardware. To overcome these shortcomings, a partitioned block ANC algorithm is newly proposed in which the long filters in the ANC system are divided into a number of equal partitions and suitably assembled to perform the FXLMS algorithm in the frequency domain. The complexity of this proposed frequency-domain partitioned block FXLMS (FPBFXLMS) algorithm is considerably reduced compared with the conventional FXLMS algorithm. It is further reduced by merging one fast Fourier transform (FFT)-inverse fast Fourier transform (IFFT) combination, yielding the reduced-structure FPBFXLMS (RFPBFXLMS) algorithm. Computational complexity analyses for different filter orders and partition sizes are presented. Systematic computer simulations are carried out for both proposed partitioned block ANC algorithms to show their accuracy compared with the time-domain FXLMS algorithm.
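
    The time-domain FXLMS baseline against which the block algorithms are compared can be sketched as follows. The primary and secondary paths, filter length, and step size here are illustrative, and the secondary-path estimate is assumed perfect:

    ```python
    import random

    random.seed(1)
    N, L, mu = 20000, 8, 0.01
    p = [0.8, -0.4, 0.2, 0.1]     # primary acoustic path (illustrative FIR)
    s = [0.5, 0.25]               # secondary path; estimate assumed perfect here
    s_hat = s

    x = [random.uniform(-1, 1) for _ in range(N)]   # reference noise
    w = [0.0] * L                 # adaptive control filter
    ybuf = [0.0] * len(s)         # recent control outputs fed through secondary path
    fx = [0.0] * N                # reference filtered by the secondary-path estimate
    err = []
    for n in range(N):
        fx[n] = sum(s_hat[k] * x[n - k] for k in range(len(s_hat)) if n - k >= 0)
        d = sum(p[k] * x[n - k] for k in range(len(p)) if n - k >= 0)   # disturbance
        y = sum(w[i] * x[n - i] for i in range(L) if n - i >= 0)        # control out
        ybuf = [y] + ybuf[:-1]
        e = d - sum(s[k] * ybuf[k] for k in range(len(s)))              # residual
        err.append(e)
        for i in range(L):        # LMS update on the *filtered* reference
            if n - i >= 0:
                w[i] += mu * e * fx[n - i]

    early = sum(v * v for v in err[:1000]) / 1000
    late = sum(v * v for v in err[-1000:]) / 1000
    ```

    The per-sample cost of the two convolutions and the weight update is what grows with filter length at high sampling rates, which is exactly what the partitioned frequency-domain variants amortize.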

  2. Complex Mobile Independent Power Station for Urban Areas

    NASA Astrophysics Data System (ADS)

    Tunik, A. A.; Tolstoy, M. Y.

    2017-11-01

    A new type of complex mobile independent power station, developed in the Department of Engineering Communications and Life-Support Systems of Irkutsk National Research Technical University, is presented in this article. The station combines a solar panel, a wind turbine, a storage battery, a diesel generator, and a microbial fuel cell to produce electric energy; a heat pump and a solar collector to generate heat; and also a wastewater treatment plant and a new integrated control system. The complex mobile independent power station is intended to fully supply power to various kinds of consumers, even in remote areas, thus reducing their dependence on centralized energy supply systems, decreasing fossil fuel consumption, improving the environment of urban areas, and helping solve the problems of purifying industrial and municipal wastewater.

  3. Air traffic control : role of FAA's modernization program in reducing delays and congestion

    DOT National Transportation Integrated Search

    2001-05-10

    The National Airspace System (NAS) is a complex collection of systems, procedures, facilities, aircraft, and people. Because these components are interconnected and interdependent, they must work together as one system to ensure safe operations. The ...

  4. Uncertainty Reduction for Stochastic Processes on Complex Networks

    NASA Astrophysics Data System (ADS)

    Radicchi, Filippo; Castellano, Claudio

    2018-05-01

    Many real-world systems are characterized by stochastic dynamical rules where a complex network of interactions among individual elements probabilistically determines their state. Even with full knowledge of the network structure and of the stochastic rules, the ability to predict system configurations is generally characterized by a large uncertainty. Selecting a fraction of the nodes and observing their state may help to reduce the uncertainty about the unobserved nodes. However, choosing these points of observation in an optimal way is a highly nontrivial task, depending on the nature of the stochastic process and on the structure of the underlying interaction pattern. In this paper, we introduce a computationally efficient algorithm to determine quasioptimal solutions to the problem. The method leverages network sparsity to reduce computational complexity from exponential to almost quadratic, thus allowing the straightforward application of the method to mid-to-large-size systems. Although the method is exact only for equilibrium stochastic processes defined on trees, it turns out to be effective also for out-of-equilibrium processes on sparse loopy networks.

  5. A Biomimetic Nickel Complex with a Reduced CO2 Ligand Generated by Formate Deprotonation and Its Behaviour towards CO2.

    PubMed

    Zimmermann, Philipp; Hoof, Santina; Braun-Cula, Beatrice; Herwig, Christian; Limberg, Christian

    2018-04-10

    Reduced CO2 species are key intermediates in a variety of natural and synthetic processes. In the majority of systems, however, they elude isolation or characterisation owing to high reactivity or limited accessibility (heterogeneous systems), and their formulations thus often remain uncertain or are based on calculations only. We herein report on a Ni-CO2^2- complex that is unique in many ways. While its structural and electronic features help understand the CO2-bound state in Ni,Fe carbon monoxide dehydrogenases, its reactivity sheds light on how CO2 can be converted into CO/CO3^2- by nickel complexes. In addition, the complex was generated by a rare example of formate β-deprotonation, a mechanistic step relevant to the nickel-catalysed conversion of HxCOy^z- at electrodes and formate oxidation in formate dehydrogenases. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Habitat complexity and sex-dependent predation of mosquito larvae in containers

    PubMed Central

    Griswold, Marcus W.; Lounibos, L. Philip

    2012-01-01

    Studies in aquatic systems have shown that habitat complexity may provide refuge or reduce the number of encounters prey have with actively searching predators. For ambush predators, habitat complexity may enhance or have no effect on predation rates because it conceals predators, reduces prey detection by predators, or visually impairs both predators and prey. We investigated the effects of habitat complexity and predation by the ambush predators Toxorhynchites rutilus and Corethrella appendiculata on their mosquito prey Aedes albopictus and Ochlerotatus triseriatus in container analogs of treeholes. As in other ambush predator-prey systems, habitat complexity did not alter the effects of T. rutilus or C. appendiculata, whose presence decreased prey survivorship, shortened development time, and increased adult size compared with treatments in which predators were absent. Faster growth and larger size were due to predator-mediated release from competition among surviving prey. Male and female prey survivorship were similar in the absence of predators; however, when predators were present, survivorship of both prey species was skewed in favor of males. We conclude that habitat complexity is relatively unimportant in shaping predator-prey interactions in this treehole community, where predation risk differs between prey sexes. PMID:16041612

  7. High-resolution imaging using a wideband MIMO radar system with two distributed arrays.

    PubMed

    Wang, Dang-wei; Ma, Xiao-yan; Chen, A-Lei; Su, Yi

    2010-05-01

    Imaging a fast maneuvering target has been an active research area in past decades. Usually, an array antenna with multiple elements is implemented to avoid the motion compensation involved in inverse synthetic aperture radar (ISAR) imaging. Nevertheless, this carries a price in hardware complexity, compared with the complex algorithms implemented in an ISAR imaging system with only one antenna. In this paper, a wideband multiple-input multiple-output (MIMO) radar system with two distributed arrays is proposed to reduce the hardware complexity of the system. Furthermore, the system model, the equivalent array production method, and the imaging procedure are presented. Compared with a classical real-aperture radar (RAR) imaging system, an important contribution of our method is that lower hardware complexity can be achieved, since many additional virtual array elements can be obtained. Numerical simulations are provided to test our system and imaging method.
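
    The virtual-array idea behind the reduced hardware complexity can be sketched in a few lines: each transmit-receive pair behaves like an element located at the sum of the two physical positions (the effective phase center sits at half that sum), so M transmitters and N receivers emulate M*N elements. The element positions below are illustrative:

    ```python
    # Transmit and receive element positions, in wavelengths (illustrative)
    tx = [0.0, 0.5]
    rx = [0.0, 1.0, 2.0, 3.0]

    # Virtual array = "convolution" of the two apertures: one virtual element
    # per (tx, rx) pair at position t + r (phase center at (t + r) / 2).
    virtual = sorted(t + r for t in tx for r in rx)
    ```

    Here 2 + 4 physical elements yield 8 uniformly spaced virtual elements, which is where the hardware savings over a filled real aperture come from.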

  8. A supramolecular photosensitizer system based on the host-guest complexation between water-soluble pillar[6]arene and methylene blue for durable photodynamic therapy.

    PubMed

    Yang, Kui; Wen, Jia; Chao, Shuang; Liu, Jing; Yang, Ke; Pei, Yuxin; Pei, Zhichao

    2018-06-05

    A supramolecular photosensitizer system, WP6-MB, was synthesized from water-soluble pillar[6]arene (WP6) and the photosensitizer methylene blue (MB) via host-guest interaction. MB complexes directly with WP6, with a high binding constant and without further modification. In particular, WP6-MB remarkably reduces the dark toxicity of MB. Furthermore, it efficiently overcomes photobleaching and extends the time over which MB produces singlet oxygen upon light irradiation, which is significant for durable photodynamic therapy.

  9. Application of advanced multidisciplinary analysis and optimization methods to vehicle design synthesis

    NASA Technical Reports Server (NTRS)

    Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw

    1990-01-01

    Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.

  10. Analysis and design of algorithm-based fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. Sukumaran

    1990-01-01

    An important consideration in the design of high performance multiprocessor systems is to ensure the correctness of the results computed in the presence of transient and intermittent failures. Concurrent error detection and correction have been applied to such systems in order to achieve reliability. Algorithm Based Fault Tolerance (ABFT) was suggested as a cost-effective concurrent error detection scheme. The research was motivated by the complexity involved in the analysis and design of ABFT systems. To that end, a matrix-based model was developed and, based on that, algorithms for both the design and analysis of ABFT systems are formulated. These algorithms are less complex than the existing ones. In order to reduce the complexity further, a hierarchical approach is developed for the analysis of large systems.

  11. Queueing Network Models for Parallel Processing of Task Systems: an Operational Approach

    NASA Technical Reports Server (NTRS)

    Mak, Victor W. K.

    1986-01-01

    Computer performance modeling of possibly complex computations running on highly concurrent systems is considered. Earlier works in this area either dealt with a very simple program structure or resulted in methods with exponential complexity. An efficient procedure is developed to compute the performance measures for series-parallel-reducible task systems using queueing network models. The procedure is based on the concept of hierarchical decomposition and a new operational approach. Numerical results for three test cases are presented and compared to those of simulations.
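
    The series-parallel reduction at the heart of such procedures can be illustrated for the deterministic case, where series composition sums completion times and parallel composition takes their maximum. The paper's queueing-network treatment replaces these leaf values with measures from the decomposed queueing model; the task tree below is purely illustrative:

    ```python
    def completion_time(task):
        """Recursively reduce a series-parallel task system.
        ('s', [...]) runs children in sequence; ('p', [...]) runs them
        concurrently; a plain number is a leaf task's service demand."""
        if isinstance(task, (int, float)):
            return task
        kind, children = task
        times = [completion_time(c) for c in children]
        return sum(times) if kind == 's' else max(times)

    # A job: 2.0, then three concurrent branches, then 1.0 to finish
    job = ('s', [2.0, ('p', [3.0, ('s', [1.0, 1.5]), 4.0]), 1.0])
    total = completion_time(job)
    ```

    Because each reduction step collapses one composite node, the cost is linear in the size of the task tree, in contrast to the exponential state-space methods the abstract contrasts against.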

  12. Human Error In Complex Systems

    NASA Technical Reports Server (NTRS)

    Morris, Nancy M.; Rouse, William B.

    1991-01-01

    This report presents results of research aimed at understanding the causes of human error in such complex systems as aircraft, nuclear power plants, and chemical processing plants. The research considered both slips (errors of action) and mistakes (errors of intention), and the influence of workload on them. Results indicated that humans respond to conditions in which errors are expected by attempting to reduce the incidence of errors, and that adaptation to conditions is a potent influence on human behavior in discretionary situations.

  13. SeaFrame: Sustaining Today’s Fleet Efficiently and Effectively. Volume 5, Issue 1, 2009

    DTIC Science & Technology

    2009-01-01

    Maneuvering 11 Shipboard Launch and Recovery Systems 13 Integrated Logistics System 15 Special Hull Treatment Tile Manufacturing 17 Navy Shipboard Oil ...Developing advanced blade section design technology for propulsors that reduces cavitation damage and required repair cost and time. - Conducting...complex we have ever written.” Ammeen adds that steering and diving algorithms are also very complex, because hydrodynamic effects of a submarine

  14. Persistent model order reduction for complex dynamical systems using smooth orthogonal decomposition

    NASA Astrophysics Data System (ADS)

    Ilbeigi, Shahab; Chelidze, David

    2017-11-01

    Full-scale complex dynamic models are not effective for parametric studies due to the inherent constraints on available computational power and storage resources. A persistent reduced order model (ROM) that is robust, stable, and provides high-fidelity simulations for a relatively wide range of parameters and operating conditions can provide a solution to this problem. The fidelity of a new framework for persistent model order reduction of large and complex dynamical systems is investigated. The framework is validated using several numerical examples including a large linear system and two complex nonlinear systems with material and geometrical nonlinearities. While the framework is used for identifying the robust subspaces obtained from both proper and smooth orthogonal decompositions (POD and SOD, respectively), the results show that SOD outperforms POD in terms of stability, accuracy, and robustness.
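
    POD, one of the two decompositions compared above, amounts to finding the leading singular directions of a snapshot matrix. A minimal power-iteration sketch on synthetic snapshots; the mode shape, noise level, and dimensions are illustrative:

    ```python
    import math, random

    def dominant_pod_mode(snapshots, iters=200):
        """Leading POD mode via power iteration on the snapshot covariance X X^T."""
        dim = len(snapshots[0])
        v = [1.0 / math.sqrt(dim)] * dim
        for _ in range(iters):
            w = [0.0] * dim
            for x in snapshots:                  # w = X X^T v, snapshot by snapshot
                c = sum(xi * vi for xi, vi in zip(x, v))
                for i in range(dim):
                    w[i] += c * x[i]
            norm = math.sqrt(sum(wi * wi for wi in w))
            v = [wi / norm for wi in w]
        return v

    random.seed(2)
    mode = [1.0, 2.0, -1.0, 0.5]                 # hidden dominant structure
    snaps = []
    for _ in range(100):
        a = random.gauss(0, 1)                   # modal amplitude per snapshot
        snaps.append([a * m + random.gauss(0, 0.05) for m in mode])

    v = dominant_pod_mode(snaps)
    nm = math.sqrt(sum(m * m for m in mode))
    alignment = abs(sum(vi * m / nm for vi, m in zip(v, mode)))
    ```

    Projecting the full state onto a few such modes yields the reduced-order model; SOD differs in weighting the decomposition by the smoothness (time derivatives) of the coordinates rather than their energy alone.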

  15. Thrust distribution for attitude control in a variable thrust propulsion system with four ACS nozzles

    NASA Astrophysics Data System (ADS)

    Lim, Yeerang; Lee, Wonsuk; Bang, Hyochoong; Lee, Hosung

    2017-04-01

    A thrust distribution approach is proposed in this paper for a variable thrust solid propulsion system with an attitude control system (ACS) that uses a reduced number of nozzles for three-axis attitude maneuvers. Although a conventional variable thrust solid propulsion system needs six ACS nozzles, this paper proposes a thrust system with four ACS nozzles to reduce the complexity and mass of the system. The performance of the new system was analyzed with numerical simulations, and the results show that the performance of the system with four ACS nozzles was similar to that of the original system while the mass of the whole system was reduced. Moreover, a feasibility analysis was performed to determine whether a thrust system with three ACS nozzles is possible.
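
    A common way to distribute a commanded three-axis torque over four actuators is a minimum-norm (pseudo-inverse) allocation. The sketch below uses an illustrative allocation matrix and ignores thruster on-off behavior and non-negativity constraints, so it is not the paper's specific distribution law:

    ```python
    def solve3(A, b):
        """Gaussian elimination with partial pivoting for a 3x3 linear system."""
        m = [row[:] + [bi] for row, bi in zip(A, b)]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
            m[col], m[piv] = m[piv], m[col]
            for r in range(col + 1, 3):
                f = m[r][col] / m[col][col]
                for c in range(col, 4):
                    m[r][c] -= f * m[col][c]
        x = [0.0, 0.0, 0.0]
        for r in (2, 1, 0):
            x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
        return x

    # Illustrative 3x4 allocation matrix: column j = torque per unit thrust of nozzle j
    B = [[1.0, -1.0,  0.0,  0.0],
         [0.0,  0.0,  1.0, -1.0],
         [0.5,  0.5, -0.5, -0.5]]
    tau = [0.2, -0.1, 0.05]                      # commanded roll/pitch/yaw torque

    # Minimum-norm allocation u = B^T (B B^T)^{-1} tau
    BBt = [[sum(B[i][k] * B[j][k] for k in range(4)) for j in range(3)] for i in range(3)]
    lam = solve3(BBt, tau)
    u = [sum(B[i][j] * lam[i] for i in range(3)) for j in range(4)]
    achieved = [sum(B[i][j] * u[j] for j in range(4)) for i in range(3)]
    ```

    In a real ACS, negative entries of u would have to be realized by the opposing nozzle or by on-off pulse modulation, which is where the four-versus-six nozzle trade studied in the paper actually bites.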

  16. Antenna complexes protect Photosystem I from Photoinhibition

    PubMed Central

    Alboresi, Alessandro; Ballottari, Matteo; Hienerwadel, Rainer; Giacometti, Giorgio M; Morosinotto, Tomas

    2009-01-01

    Background: Photosystems are composed of two moieties, a reaction center and a peripheral antenna system. In photosynthetic eukaryotes the latter is composed of proteins belonging to the Lhc family. A growing body of evidence demonstrates that these polypeptides play a relevant physiological role in both light harvesting and photoprotection. Despite the sequence similarity between antenna proteins associated with the two Photosystems, present knowledge of their physiological role is mostly limited to complexes associated with Photosystem II. Results: In this work we analyzed the physiological role of the Photosystem I antenna system in Arabidopsis thaliana both in vivo and in vitro. Plants depleted of individual antenna polypeptides showed a reduced capacity for photoprotection and an increased production of reactive oxygen species upon high-light exposure. In vitro experiments on isolated complexes confirmed that depletion of antenna proteins reduced the resistance of isolated Photosystem I particles to high light, and that the antenna is effective in photoprotection only upon interaction with the core complex. Conclusion: Antenna proteins play a dual role in Arabidopsis thaliana Photosystem I photoprotection: first, a Photosystem I with an intact antenna system is more resistant to high light because of a reduced production of reactive oxygen species; and second, antenna chlorophyll-proteins are the first target of high-light damage. When photoprotection mechanisms become insufficient, the antenna chlorophyll proteins act as fuses: LHCI chlorophylls are degraded while the reaction center's photochemical activity is maintained. Differences with respect to the photoprotection strategy of Photosystem II, where the reaction center is the first target of photoinhibition, are discussed. PMID:19508723

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholbrock, A. K.; Fleming, P. A.; Fingersh, L. J.

    Wind turbines are complex, nonlinear, dynamic systems driven by aerodynamic, gravitational, centrifugal, and gyroscopic forces. The aerodynamics of wind turbines are nonlinear, unsteady, and complex. Turbine rotors are subjected to a chaotic three-dimensional (3-D) turbulent wind inflow field with embedded coherent vortices that drive fatigue loads and reduce lifetime. In order to reduce the cost of energy, future large multimegawatt turbines must be designed with lighter-weight structures, using active controls to mitigate fatigue loads, maximize energy capture, and add active damping to maintain stability for these dynamically active structures operating in a complex environment. Researchers at the National Renewable Energy Laboratory (NREL) and the University of Stuttgart are designing, implementing, and testing advanced feedback and feed-forward controls in order to reduce the cost of energy for wind turbines.

  18. Connectivity in the human brain dissociates entropy and complexity of auditory inputs☆

    PubMed Central

    Nastase, Samuel A.; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-01-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators. PMID:25536493

  19. A low-power and high-quality implementation of the discrete cosine transformation

    NASA Astrophysics Data System (ADS)

    Heyne, B.; Götze, J.

    2007-06-01

    In this paper, a computationally efficient, quality-preserving DCT architecture is presented. It is obtained by optimizing the Loeffler DCT based on the Cordic algorithm. The computational complexity is reduced from 11 multiply and 29 add operations (Loeffler DCT) to 38 add and 16 shift operations (similar to the complexity of the binDCT). The experimental results show that the proposed DCT algorithm not only reduces the computational complexity significantly, but also retains the good transformation quality of the Loeffler DCT. The proposed Cordic-based Loeffler DCT is therefore especially suited for low-power, high-quality CODECs in battery-powered systems.
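
    The core trick in Cordic-based DCTs is replacing each plane rotation (a multiply-heavy butterfly) with a sequence of shift-and-add micro-rotations. The sketch below illustrates that primitive; it is a generic Cordic rotation in Python, not the paper's specific optimized Loeffler datapath:

```python
import math

def cordic_rotate(x, y, angle, n=16):
    """Rotate (x, y) by `angle` radians using only shifts and adds --
    the multiply-free primitive underlying Cordic-based DCTs."""
    # Precomputed elementary angles atan(2^-i) and the Cordic gain.
    atans = [math.atan(2.0 ** -i) for i in range(n)]
    gain = 1.0
    for i in range(n):
        gain *= math.sqrt(1.0 + 2.0 ** (-2 * i))
    z = angle
    for i in range(n):
        d = 1.0 if z >= 0 else -1.0
        # Each step only shifts (multiply by 2^-i) and adds.
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atans[i]
    # Compensate the accumulated gain; in hardware this constant can be
    # folded into a later scaling stage, keeping the datapath multiplier-free.
    return x / gain, y / gain
```

    For example, rotating (1, 0) by pi/6 yields approximately (cos 30°, sin 30°) to within the precision set by the iteration count `n`.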

  20. Knowledge base rule partitioning design for CLIPS

    NASA Technical Reports Server (NTRS)

    Mainardi, Joseph D.; Szatkowski, G. P.

    1990-01-01

    This paper describes a knowledge base (KB) partitioning approach to the problem of achieving real-time performance with the CLIPS AI shell when it contains large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The main objective of the Expert System advanced development project (ADP-2302) is to provide robust systems responding to new data frames at 0.1- to 1.0-second intervals. The intelligent system control must be performed within the specified real-time window in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time. This reduced complexity improves performance without undue impact during load and unload cycles. The second objective is to produce highly reliable intelligent systems. This requires simple and automated approaches to the KB verification and validation (V&V) task. Partitioning the KB reduces overall rule interaction complexity. Reduced interaction simplifies the necessary V&V testing by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading the rules that directly apply to a given condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple approach has been designed to control the addition and deletion of 'blocks' of rules and facts while allowing real-time operations to run freely. Timing tests of real-time performance for specific machines under real-time operating systems have not been completed but are planned as part of the analysis process to validate the design.
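
    The 'blocks' idea can be sketched in a few lines. The class below is a hypothetical illustration of the load/unload bookkeeping only, not the actual CLIPS mechanism: the point is that the matcher only ever sees the rules of the currently loaded phases, so the Rete net stays small.

```python
class PartitionedKB:
    """Toy sketch of KB partitioning: rules are grouped into per-phase
    blocks, and only loaded blocks are visible to the matcher."""

    def __init__(self):
        self.blocks = {}      # phase name -> list of rules
        self.active = set()   # currently loaded phases

    def define_block(self, phase, rules):
        self.blocks[phase] = list(rules)

    def load(self, phase):
        self.active.add(phase)

    def unload(self, phase):
        self.active.discard(phase)

    def active_rules(self):
        # The inference engine would match against only this reduced set.
        return [r for p in sorted(self.active) for r in self.blocks[p]]
```

    Switching mission phases then amounts to one `unload` and one `load`, rather than rebuilding the whole rule base.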

  1. RT-Syn: A real-time software system generator

    NASA Technical Reports Server (NTRS)

    Setliff, Dorothy E.

    1992-01-01

    This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case, real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Examples of these external constraints are communication protocols, precision, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics, which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs, not only by reducing development time but, more importantly, by facilitating the maintenance, modification, and extension of complex mission-critical software systems, which currently dominate life-cycle costs.

  2. Development of Boolean calculus and its applications. [digital systems design

    NASA Technical Reports Server (NTRS)

    Tapia, M. A.

    1980-01-01

    The development of Boolean calculus for application to digital system design methodologies that would reduce system complexity, size, cost, speed, power requirements, etc., is discussed. Synthesis procedures for logic circuits are examined, particularly for asynchronous circuits using clock-triggered flip-flops.

  3. System Theoretic Frameworks for Mitigating Risk Complexity in the Nuclear Fuel Cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Adam David; Mohagheghi, Amir H.; Cohn, Brian

    In response to the expansion of nuclear fuel cycle (NFC) activities -- and the associated suite of risks -- around the world, this project evaluated systems-based solutions for managing such risk complexity in multimodal, multi-jurisdictional international spent nuclear fuel (SNF) transportation. By better understanding systemic risks in SNF transportation, developing SNF transportation risk assessment frameworks, and evaluating these systems-based risk assessment frameworks, this research illustrated that interdependency between safety, security, and safeguards risks is inherent in NFC activities and can go unidentified when each "S" is evaluated independently. Two novel system-theoretic analysis techniques -- dynamic probabilistic risk assessment (DPRA) and system-theoretic process analysis (STPA) -- provide integrated "3S" analysis to address these interdependencies, and the research results suggest a need -- and provide a way -- to reprioritize United States engagement efforts to reduce global nuclear risks. Lastly, this research identifies areas where Sandia National Laboratories can spearhead technical advances to reduce global nuclear dangers.

  4. GESA--a two-dimensional processing system using knowledge base techniques.

    PubMed

    Rowlands, D G; Flook, A; Payne, P I; van Hoff, A; Niblett, T; McKee, S

    1988-12-01

    The successful analysis of two-dimensional (2-D) polyacrylamide electrophoresis gels demands considerable experience and understanding of the protein system under investigation as well as knowledge of the separation technique itself. The present work concerns the development of a computer system for analysing 2-D electrophoretic separations which incorporates concepts derived from artificial intelligence research such that non-experts can use the technique as a diagnostic or identification tool. Automatic analysis of 2-D gel separations has proved to be extremely difficult using statistical methods. Non-reproducibility of gel separations is also difficult to overcome using automatic systems. However, the human eye is extremely good at recognising patterns in images, and human intervention in semi-automatic computer systems can reduce the computational complexities of fully automatic systems. Moreover, the expertise and understanding of an "expert" is invaluable in reducing system complexity if it can be encapsulated satisfactorily in an expert system. The combination of user intervention in the computer system together with the encapsulation of expert knowledge characterises the present system. The domain within which the system has been developed is that of wheat grain storage proteins (gliadins), which exhibit polymorphism to such an extent that cultivars can be uniquely identified by their gliadin patterns. The system can be adapted to other domains where a range of polymorphic protein sub-units exists. In its generalised form, the system can also be used for comparing more complex 2-D gel electrophoretic separations.

  5. On the Origin of Complex Adaptive Traits: Progress Since the Darwin Versus Mivart Debate.

    PubMed

    Suzuki, Takao K

    2017-06-01

    The evolutionary origin of complex adaptive traits has been a controversial topic in the history of evolutionary biology. Although Darwin argued for the gradual origins of complex adaptive traits within the theory of natural selection, Mivart insisted that natural selection could not account for the incipient stages of complex traits. The debate starting from Darwin and Mivart eventually engendered two opposite views: gradualism and saltationism. Although this has been a long-standing debate, the issue remains unresolved. However, recent studies have interrogated classic examples of complex traits, such as the asymmetrical eyes of flatfishes and leaf mimicry of butterfly wings, whose origins were debated by Darwin and Mivart. Here, I review recent findings as a starting point to provide a modern picture of the evolution of complex adaptive traits. First, I summarize the empirical evidence that unveils the evolutionary steps toward complex traits. I then argue that the evolution of complex traits could be understood within the concept of "reducible complexity." Through these discussions, I propose a conceptual framework for the formation of complex traits, termed reducible-composable multicomponent systems, which satisfies two major characteristics: reducibility into a sum of subcomponents and composability to construct traits from various additional and combinatorial arrangements of the subcomponents. This conceptual framework provides an analytical foundation for exploring evolutionary pathways to build up complex traits. This review provides certain essential avenues for deciphering the origin of complex adaptive traits. © 2017 Wiley Periodicals, Inc.

  6. Adaptive identifier for uncertain complex nonlinear systems based on continuous neural networks.

    PubMed

    Alfaro-Ponce, Mariel; Cruz, Amadeo Argüelles; Chairez, Isaac

    2014-03-01

    This paper presents the design of a complex-valued differential neural network identifier for uncertain nonlinear systems defined in the complex domain. This design includes the construction of an adaptive algorithm to adjust the parameters included in the identifier. The algorithm is obtained based on a special class of controlled Lyapunov functions. The quality of the identification process is characterized using the practical stability framework. Indeed, the region where the identification error converges is derived by the same Lyapunov method. This zone is defined by the power of uncertainties and perturbations affecting the complex-valued uncertain dynamics. Moreover, this convergence zone is reduced to its lowest possible value using ideas related to the so-called ellipsoid methodology. Two simple but informative numerical examples are developed to show how the identifier proposed in this paper can be used to approximate uncertain nonlinear systems valued in the complex domain.

  7. Receiver bandwidth effects on complex modulation and detection using directly modulated lasers.

    PubMed

    Yuan, Feng; Che, Di; Shieh, William

    2016-05-01

    Directly modulated lasers (DMLs) have long been employed for short- and medium-reach optical communications due to their low cost. Recently, a new modulation scheme called complex-modulated DMLs has been demonstrated, showing a significant optical signal-to-noise ratio sensitivity enhancement compared with the traditional intensity-only detection scheme. However, chirp-induced optical spectrum broadening is inevitable in complex-modulated systems, which may imply a need for high-bandwidth receivers. In this Letter, we study the impact of receiver bandwidth effects on the performance of complex modulation and coherent detection systems based on DMLs. We experimentally demonstrate that such systems exhibit a reasonable tolerance for reduced receiver bandwidth. For 10 Gbaud 4-level pulse amplitude modulation signals, the required electrical bandwidth is as low as 8.5 and 7.5 GHz for 7% and 20% forward error correction, respectively. Therefore, it is feasible to realize DML-based complex-modulated systems using cost-effective receivers with narrow bandwidth.

  8. Assessing health system interventions: key points when considering the value of randomization

    PubMed Central

    Schellenberg, Joanna; Todd, Jim

    2011-01-01

    Research is needed to help identify interventions that will improve the capacity or functioning of health systems and thereby contribute to achieving global health goals. Well conducted, randomized controlled trials (RCTs), insofar as they reduce bias and confounding, provide the strongest evidence for identifying which interventions delivered directly to individuals are safe and effective. When ethically feasible, they can also help reduce bias and confounding when assessing interventions targeting entire health systems. However, additional challenges emerge when research focuses on interventions that target the multiple units of organization found within health systems. Hence, one cannot complacently assume that randomization can reduce or eliminate bias and confounding to the same degree in every instance. While others have articulated arguments in favour of alternative designs, this paper is intended to help people understand why the potential value afforded by RCTs may be threatened. Specifically, it suggests six points to be borne in mind when exploring the challenges entailed in designing or evaluating RCTs on health system interventions: (i) the number of units available for randomization; (ii) the complexity of the organizational unit under study; (iii) the complexity of the intervention; (iv) the complexity of the cause–effect pathway; (v) contamination; and (vi) outcome heterogeneity. The authors suggest that the latter may be informative and that the reasons behind it should be explored and not ignored. Based on improved understanding of the value and possible limitations of RCTs on health system interventions, the authors show why we need broader platforms of research to complement RCTs. PMID:22271948

  9. Design optimization of transmitting antennas for weakly coupled magnetic induction communication systems

    PubMed Central

    2017-01-01

    This work focuses on the design of transmitting coils in weakly coupled magnetic induction communication systems. We propose several optimization methods that reduce the active, reactive and apparent power consumption of the coil. These problems are formulated as minimization problems, in which the power consumed by the transmitting coil is minimized, under the constraint of providing a required magnetic field at the receiver location. We develop efficient numeric and analytic methods to solve the resulting problems, which are of high dimension, and in certain cases non-convex. For the objective of minimal reactive power, an analytic solution for the optimal current distribution in flat disc transmitting coils is provided. This problem is extended to general three-dimensional coils, for which we develop an expression for the optimal current distribution. Considering the objective of minimal apparent power, a method is developed to reduce the computational complexity of the problem by transforming it to an equivalent problem of lower dimension, allowing a quick and accurate numeric solution. These results are verified experimentally by testing a number of coil geometries. The results obtained allow reduced power consumption and increased performance in magnetic induction communication systems. Specifically, for wideband systems, an optimal design of the transmitter coil reduces the peak instantaneous power provided by the transmitter circuitry, and thus reduces its size, complexity and cost. PMID:28192463

  10. Collaborative Systems Thinking: A Response to the Problems Faced by Systems Engineering's 'Middle Tier'

    NASA Technical Reports Server (NTRS)

    Phfarr, Barbara B.; So, Maria M.; Lamb, Caroline Twomey; Rhodes, Donna H.

    2009-01-01

    Experienced systems engineers are adept at more than implementing systems engineering processes: they utilize systems thinking to solve complex engineering problems. Within the space industry, demographics and economic pressures are reducing the number of experienced systems engineers that will be available in the future. Collaborative systems thinking within systems engineering teams is proposed as a way to integrate systems engineers of various experience levels to handle complex systems engineering challenges. This paper uses the GOES-R Program Systems Engineering team to illustrate the enablers of and barriers to team-level systems thinking and to identify ways in which performance could be improved. Ways NASA could expand its engineering training to promote team-level systems thinking are proposed.

  11. Control of complex physically simulated robot groups

    NASA Astrophysics Data System (ADS)

    Brogan, David C.

    2001-10-01

    Actuated systems such as robots take many forms and sizes but each requires solving the difficult task of utilizing available control inputs to accomplish desired system performance. Coordinated groups of robots provide the opportunity to accomplish more complex tasks, to adapt to changing environmental conditions, and to survive individual failures. Similarly, groups of simulated robots, represented as graphical characters, can test the design of experimental scenarios and provide autonomous interactive counterparts for video games. The complexity of writing control algorithms for these groups currently hinders their use. A combination of biologically inspired heuristics, search strategies, and optimization techniques serve to reduce the complexity of controlling these real and simulated characters and to provide computationally feasible solutions.

  12. A channel estimation scheme for MIMO-OFDM systems

    NASA Astrophysics Data System (ADS)

    He, Chunlong; Tian, Chu; Li, Xingquan; Zhang, Ce; Zhang, Shiqi; Liu, Chaowen

    2017-08-01

    To address the trade-off between the performance of time-domain least squares (LS) channel estimation and its practical implementation complexity, a reduced-complexity, pilot-based channel estimation method for multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) is proposed. This approach transforms the MIMO-OFDM channel estimation problem into a set of simple single input single output-orthogonal frequency division multiplexing (SISO-OFDM) channel estimation problems, so no large matrix pseudo-inverse is needed, which greatly reduces the complexity of the algorithm. Simulation results show that the bit error rate (BER) performance of the proposed method with time-orthogonal training sequences and the linear minimum mean square error (LMMSE) criterion is better than that of the time-domain LS estimator and is nearly optimal.
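
    The decoupling described above can be sketched as follows. With time-orthogonal training (each transmit antenna sends its pilot in a separate OFDM symbol), every receive/transmit link reduces to a per-subcarrier scalar division; the antenna and subcarrier counts below are illustrative, not taken from the paper, and noise is omitted for clarity:

```python
import math, random

random.seed(0)
n_tx, n_rx, n_sc = 2, 2, 8   # antennas and subcarriers (illustrative sizes)

# Random per-subcarrier channel gains H[rx][tx][k].
H = [[[complex(random.gauss(0, 1), random.gauss(0, 1))
       for _ in range(n_sc)] for _ in range(n_tx)] for _ in range(n_rx)]

# Time-orthogonal, unit-modulus pilots: antenna t transmits alone during
# OFDM symbol t, so each (rx, tx) pair decouples into a SISO estimate.
import cmath
pilots = [[cmath.exp(1j * random.uniform(0, 2 * math.pi))
           for _ in range(n_sc)] for _ in range(n_tx)]

H_est = [[[0j] * n_sc for _ in range(n_tx)] for _ in range(n_rx)]
for t in range(n_tx):                # training symbol for antenna t
    for r in range(n_rx):
        for k in range(n_sc):
            y = H[r][t][k] * pilots[t][k]      # noise-free received sample
            H_est[r][t][k] = y / pilots[t][k]  # scalar LS, no matrix inverse
```

    A joint time-domain LS estimate over all antennas would instead require a pseudo-inverse of a large training matrix; the per-link scalar division is what removes that cost.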

  13. Determination of effective loss factors in reduced SEA models

    NASA Astrophysics Data System (ADS)

    Chimeno Manguán, M.; Fernández de las Heras, M. J.; Roibás Millán, E.; Simón Hidalgo, F.

    2017-01-01

    The definition of Statistical Energy Analysis (SEA) models for large complex structures is highly conditioned by the classification of the structure elements into a set of coupled subsystems and the subsequent determination of the loss factors representing both the internal damping and the coupling between subsystems. The accurate definition of the complete system can lead to excessively large models as the size and complexity increase. This fact can also raise practical issues for the experimental determination of the loss factors. This work presents a formulation of reduced SEA models for incomplete systems defined by a set of effective loss factors. This reduced SEA model provides a feasible number of subsystems for the application of the Power Injection Method (PIM). For structures of high complexity, access to some components can be restricted, for instance internal equipment or panels. In these cases the use of PIM to carry out an experimental SEA analysis is not possible. New methods are presented for this case in combination with the reduced SEA models. These methods allow defining some of the model loss factors that could not be obtained through PIM. The methods are validated with a numerical analysis case, and they are also applied to an actual spacecraft structure with accessibility restrictions: a solar wing in folded configuration.

  14. Managing Complexity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Posse, Christian; Malard, Joel M.

    2004-08-01

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today’s most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This paper explores the state of the art in using physical analogs to understand the behavior of some econophysical systems and to derive stable and robust control strategies for them. In particular, we review and discuss applications of some analytic methods based on the thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive, governing global properties of complex systems that cannot be otherwise understood.

  15. Carbohydrate-Based Host-Guest Complexation of Hydrophobic Antibiotics for the Enhancement of Antibacterial Activity.

    PubMed

    Jeong, Daham; Joo, Sang-Woo; Shinde, Vijay Vilas; Cho, Eunae; Jung, Seunho

    2017-08-08

    Host-guest complexation with various hydrophobic drugs has been used to enhance the solubility, permeability, and stability of guest drugs. Physical changes in hydrophobic drugs by complexation have been related to corresponding increases in the bioavailability of these drugs. Carbohydrates, including various derivatives of cyclodextrins, cyclosophoraoses, and some linear oligosaccharides, are generally used as host complexation agents in drug delivery systems. Many antibiotics with low bioavailability have some limitations to their clinical use due to their intrinsically poor aqueous solubility. Bioavailability enhancement is therefore an important step to achieve the desired concentration of antibiotics in the treatment of bacterial infections. Antibiotics encapsulated in a complexation-based drug delivery system will display improved antibacterial activity making it possible to reduce dosages and overcome the serious global problem of antibiotic resistance. Here, we review the present research trends in carbohydrate-based host-guest complexation of various hydrophobic antibiotics as an efficient delivery system to improve solubility, permeability, stability, and controlled release.

  16. An Efficient Model-based Diagnosis Engine for Hybrid Systems Using Structural Model Decomposition

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Narasimhan, Sriram; Roychoudhury, Indranil; Daigle, Matthew; Pulido, Belarmino

    2013-01-01

    Complex hybrid systems are present in a large range of engineering applications, like mechanical systems, electrical circuits, or embedded computation systems. The behavior of these systems is made up of continuous and discrete event dynamics that increase the difficulties for accurate and timely online fault diagnosis. The Hybrid Diagnosis Engine (HyDE) offers flexibility to the diagnosis application designer to choose the modeling paradigm and the reasoning algorithms. The HyDE architecture supports the use of multiple modeling paradigms at the component and system level. However, HyDE faces some problems regarding performance in terms of complexity and time. Our focus in this paper is on developing efficient model-based methodologies for online fault diagnosis in complex hybrid systems. To do this, we propose a diagnosis framework where structural model decomposition is integrated within the HyDE diagnosis framework to reduce the computational complexity associated with the fault diagnosis of hybrid systems. As a case study, we apply our approach to a diagnostic testbed, the Advanced Diagnostics and Prognostics Testbed (ADAPT), using real data.

  17. Spatiotemporal control to eliminate cardiac alternans using isostable reduction

    NASA Astrophysics Data System (ADS)

    Wilson, Dan; Moehlis, Jeff

    2017-03-01

    Cardiac alternans, an arrhythmia characterized by a beat-to-beat alternation of cardiac action potential durations, is widely believed to facilitate the transition from normal cardiac function to ventricular fibrillation and sudden cardiac death. Alternans arises due to an instability of a healthy period-1 rhythm, and most dynamical control strategies either require extensive knowledge of the cardiac system, making experimental validation difficult, or are model independent and sacrifice important information about the specific system under study. Isostable reduction provides an alternative approach, in which the response of a system to external perturbations can be used to reduce the complexity of a cardiac system, making it easier to work with from an analytical perspective while retaining many of its important features. Here, we use isostable reduction strategies to reduce the complexity of partial differential equation models of cardiac systems in order to develop energy optimal strategies for the elimination of alternans. Resulting control strategies require significantly less energy to terminate alternans than comparable strategies and do not require continuous state feedback.

  18. Communication Network Integration and Group Uniformity in a Complex Organization.

    ERIC Educational Resources Information Center

    Danowski, James A.; Farace, Richard V.

    This paper contains a discussion of the limitations of research on group processes in complex organizations and the manner in which a procedure for network analysis in on-going systems can reduce problems. The research literature on group uniformity processes and on theoretical models of these processes from an information processing perspective…

  19. Physiological complexity and system adaptability: evidence from postural control dynamics of older adults.

    PubMed

    Manor, Brad; Costa, Madalena D; Hu, Kun; Newton, Elizabeth; Starobinets, Olga; Kang, Hyun Gu; Peng, C K; Novak, Vera; Lipsitz, Lewis A

    2010-12-01

    The degree of multiscale complexity in human behavioral regulation, such as that required for postural control, appears to decrease with advanced aging or disease. To help delineate causes and functional consequences of complexity loss, we examined the effects of visual and somatosensory impairment on the complexity of postural sway during quiet standing and its relationship to postural adaptation to cognitive dual tasking. Participants of the MOBILIZE Boston Study were classified into mutually exclusive groups: controls [intact vision and foot somatosensation, n = 299, 76 ± 5 (SD) yr old], visual impairment only (<20/40 vision, n = 81, 77 ± 4 yr old), somatosensory impairment only (inability to perceive 5.07 monofilament on plantar halluxes, n = 48, 80 ± 5 yr old), and combined impairments (n = 25, 80 ± 4 yr old). Postural sway (i.e., center-of-pressure) dynamics were assessed during quiet standing and cognitive dual tasking, and a complexity index was quantified using multiscale entropy analysis. Postural sway speed and area, which did not correlate with complexity, were also computed. During quiet standing, the complexity index (mean ± SD) was highest in controls (9.5 ± 1.2) and successively lower in the visual (9.1 ± 1.1), somatosensory (8.6 ± 1.6), and combined (7.8 ± 1.3) impairment groups (P = 0.001). Dual tasking resulted in increased sway speed and area but reduced complexity (P < 0.01). Lower complexity during quiet standing correlated with greater absolute (R = -0.34, P = 0.002) and percent (R = -0.45, P < 0.001) increases in postural sway speed from quiet standing to dual-tasking conditions. Sensory impairments contributed to decreased postural sway complexity, which reflected reduced adaptive capacity of the postural control system. Relatively low baseline complexity may, therefore, indicate control systems that are more vulnerable to cognitive and other stressors.
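
    The complexity index used here is the sum of sample entropy values across coarse-graining scales (multiscale entropy analysis). The following is a minimal, unoptimized sketch of that computation; the parameter choices (m = 2, r = 0.2 SD) are the conventional defaults, not necessarily those of the study:

```python
import math, random

def coarse_grain(x, scale):
    """Step 1 of MSE: average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r) = -ln(A/B), with Chebyshev tolerance r = r_frac * SD."""
    n = len(x)
    mu = sum(x) / n
    r = r_frac * math.sqrt(sum((v - mu) ** 2 for v in x) / n)

    def matches(length):
        # Count template pairs of the given length within tolerance r.
        c = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    c += 1
        return c

    b, a = matches(m), matches(m + 1)
    return math.inf if a == 0 or b == 0 else -math.log(a / b)

def complexity_index(x, max_scale=4):
    """Sum of SampEn over scales 1..max_scale (higher = more complex)."""
    return sum(sample_entropy(coarse_grain(x, s))
               for s in range(1, max_scale + 1))
```

    Summing entropy across scales is what distinguishes complexity from mere irregularity: uncorrelated noise scores high at scale 1 but its entropy collapses under coarse-graining, whereas signals with long-range structure retain entropy across scales.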

  20. Physiological complexity and system adaptability: evidence from postural control dynamics of older adults

    PubMed Central

    Costa, Madalena D.; Hu, Kun; Newton, Elizabeth; Starobinets, Olga; Kang, Hyun Gu; Peng, C. K.; Novak, Vera; Lipsitz, Lewis A.

    2010-01-01

    The degree of multiscale complexity in human behavioral regulation, such as that required for postural control, appears to decrease with advanced aging or disease. To help delineate causes and functional consequences of complexity loss, we examined the effects of visual and somatosensory impairment on the complexity of postural sway during quiet standing and its relationship to postural adaptation to cognitive dual tasking. Participants of the MOBILIZE Boston Study were classified into mutually exclusive groups: controls [intact vision and foot somatosensation, n = 299, 76 ± 5 (SD) yr old], visual impairment only (<20/40 vision, n = 81, 77 ± 4 yr old), somatosensory impairment only (inability to perceive 5.07 monofilament on plantar halluxes, n = 48, 80 ± 5 yr old), and combined impairments (n = 25, 80 ± 4 yr old). Postural sway (i.e., center-of-pressure) dynamics were assessed during quiet standing and cognitive dual tasking, and a complexity index was quantified using multiscale entropy analysis. Postural sway speed and area, which did not correlate with complexity, were also computed. During quiet standing, the complexity index (mean ± SD) was highest in controls (9.5 ± 1.2) and successively lower in the visual (9.1 ± 1.1), somatosensory (8.6 ± 1.6), and combined (7.8 ± 1.3) impairment groups (P = 0.001). Dual tasking resulted in increased sway speed and area but reduced complexity (P < 0.01). Lower complexity during quiet standing correlated with greater absolute (R = −0.34, P = 0.002) and percent (R = −0.45, P < 0.001) increases in postural sway speed from quiet standing to dual-tasking conditions. Sensory impairments contributed to decreased postural sway complexity, which reflected reduced adaptive capacity of the postural control system. Relatively low baseline complexity may, therefore, indicate control systems that are more vulnerable to cognitive and other stressors. PMID:20947715

  1. Connectivity in the human brain dissociates entropy and complexity of auditory inputs.

    PubMed

    Nastase, Samuel A; Iacovella, Vittorio; Davis, Ben; Hasson, Uri

    2015-03-01

    Complex systems are described according to two central dimensions: (a) the randomness of their output, quantified via entropy; and (b) their complexity, which reflects the organization of a system's generators. Whereas some approaches hold that complexity can be reduced to uncertainty or entropy, an axiom of complexity science is that signals with very high or very low entropy are generated by relatively non-complex systems, while complex systems typically generate outputs with entropy peaking between these two extremes. In understanding their environment, individuals would benefit from coding for both input entropy and complexity; entropy indexes uncertainty and can inform probabilistic coding strategies, whereas complexity reflects a concise and abstract representation of the underlying environmental configuration, which can serve independent purposes, e.g., as a template for generalization and rapid comparisons between environments. Using functional neuroimaging, we demonstrate that, in response to passively processed auditory inputs, functional integration patterns in the human brain track both the entropy and complexity of the auditory signal. Connectivity between several brain regions scaled monotonically with input entropy, suggesting sensitivity to uncertainty, whereas connectivity between other regions tracked entropy in a convex manner consistent with sensitivity to input complexity. These findings suggest that the human brain simultaneously tracks the uncertainty of sensory data and effectively models their environmental generators. Copyright © 2014. Published by Elsevier Inc.

  2. Analyzing SystemC Designs: SystemC Analysis Approaches for Varying Applications

    PubMed Central

    Stoppe, Jannis; Drechsler, Rolf

    2015-01-01

    The complexity of hardware designs is still increasing according to Moore's law. With embedded systems being more and more intertwined and working together not only with each other, but also with their environments as cyber physical systems (CPSs), more streamlined development workflows are employed to handle the increasing complexity during a system's design phase. SystemC is a C++ library for the design of hardware/software systems, enabling the designer to quickly prototype, e.g., a distributed CPS without having to decide about particular implementation details (such as whether to implement a feature in hardware or in software) early in the design process. Thereby, this approach reduces the initial implementation's complexity by offering an abstract layer with which to build a working prototype. However, as SystemC is based on C++, analyzing designs becomes a difficult task due to the complex language features that are available to the designer. Several fundamentally different approaches for analyzing SystemC designs have been suggested. This work illustrates several different SystemC analysis approaches, including their specific advantages and shortcomings, allowing designers to pick the right tools to assist them with a specific problem during the design of a system using SystemC. PMID:25946632

  3. Analyzing SystemC Designs: SystemC Analysis Approaches for Varying Applications.

    PubMed

    Stoppe, Jannis; Drechsler, Rolf

    2015-05-04

    The complexity of hardware designs is still increasing according to Moore's law. With embedded systems being more and more intertwined and working together not only with each other, but also with their environments as cyber physical systems (CPSs), more streamlined development workflows are employed to handle the increasing complexity during a system's design phase. SystemC is a C++ library for the design of hardware/software systems, enabling the designer to quickly prototype, e.g., a distributed CPS without having to decide about particular implementation details (such as whether to implement a feature in hardware or in software) early in the design process. Thereby, this approach reduces the initial implementation's complexity by offering an abstract layer with which to build a working prototype. However, as SystemC is based on C++, analyzing designs becomes a difficult task due to the complex language features that are available to the designer. Several fundamentally different approaches for analyzing SystemC designs have been suggested. This work illustrates several different SystemC analysis approaches, including their specific advantages and shortcomings, allowing designers to pick the right tools to assist them with a specific problem during the design of a system using SystemC.

  4. Reduction of a linear complex model for respiratory system during Airflow Interruption.

    PubMed

    Jablonski, Ireneusz; Mroczka, Janusz

    2010-01-01

    The paper presents a methodology for reducing a complex model to a simpler, identifiable inverse model. Its main tool is a numerical procedure of sensitivity analysis (structural and parametric) applied to a forward linear equivalent designed for the conditions of the interrupter experiment. The final result, a reduced analog for the interrupter technique, is especially noteworthy because it fills a major gap in occlusional measurements, which typically use simple one- or two-element physical representations. The proposed reduced electrical circuit, a structural combination of resistive, inertial and elastic properties, can be regarded as a candidate for reliable reconstruction and quantification (in the time and frequency domains) of the dynamic behavior of the respiratory system in response to a quasi-step excitation by valve closure.
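The parametric side of such a sensitivity analysis can be illustrated on a toy forward model. The Python sketch below (not the paper's model: a series resistance-inertance-compliance analog driven by a unit pressure step, with parameter values chosen only for demonstration) ranks parameters by a normalized finite-difference sensitivity; parameters with negligible sensitivity are candidates for removal when building the reduced, identifiable model:

```python
def simulate(R, I, C, dt=1e-3, T=0.5):
    """Volume response q(t) of a series R-I-C analog to a unit pressure step:
    I*q'' + R*q' + q/C = 1, integrated by forward Euler."""
    q, dq, out = 0.0, 0.0, []
    for _ in range(int(T / dt)):
        ddq = (1.0 - R * dq - q / C) / I
        dq += ddq * dt
        q += dq * dt
        out.append(q)
    return out

def normalized_sensitivity(base, name, rel=0.01):
    """RMS central-difference response to a +/-1% parameter perturbation,
    scaled by the RMS of the nominal output so parameters are comparable."""
    hi, lo = dict(base), dict(base)
    hi[name] *= 1 + rel
    lo[name] *= 1 - rel
    y_hi, y_lo, y0 = simulate(**hi), simulate(**lo), simulate(**base)
    num = sum((a - b) ** 2 for a, b in zip(y_hi, y_lo)) ** 0.5 / (2 * rel)
    den = sum(v ** 2 for v in y0) ** 0.5
    return num / den

# Illustrative values (resistance, inertance, compliance), not fitted data
base = {"R": 2.0, "I": 0.01, "C": 0.02}
S = {p: normalized_sensitivity(base, p) for p in base}
# Compliance dominates this step response while inertance barely matters,
# the kind of ranking that justifies dropping or merging low-impact elements.
```

The structural half of the analysis (which topologies remain distinguishable from occlusion data) is a separate question that this sketch does not address.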

  5. Spatial Characterization of Riparian Buffer Effects on Sediment Loads from Watershed Systems

    EPA Science Inventory

    Understanding all watershed systems and their interactions is a complex, but critical, undertaking when developing practices designed to reduce topsoil loss and chemical/nutrient transport from agricultural fields. The presence of riparian buffer vegetation in agricultural lands...

  6. Spatial Distributions of Guest Molecule and Hydration Level in Dendrimer-Based Guest–Host Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chih-Ying; Chen, Hsin-Lung; Do, Changwoo

    2016-08-09

    Using the electrostatic complex of G4 poly(amidoamine) (PAMAM) dendrimer with an amphiphilic surfactant as a model system, contrast variation small angle neutron scattering (SANS) is implemented to resolve the key structural characteristics of a dendrimer-based guest–host system. Quantifications of the radial distributions of the scattering length density and the hydration level within the complex molecule reveal that the surfactant is embedded in the peripheral region of the dendrimer and that the steric crowding in this region increases the backfolding of the dendritic segments, thereby reducing the hydration level throughout the complex molecule. The insights into the spatial location of the guest molecules, as well as the perturbations of dendrimer conformation and hydration level deduced here, are crucial for the delicate design of dendrimer-based guest–host systems for biomedical applications.

  7. Resource Management for Distributed Parallel Systems

    NASA Technical Reports Server (NTRS)

    Neuman, B. Clifford; Rao, Santosh

    1993-01-01

    Multiprocessor systems should exist in the larger context of distributed systems, allowing multiprocessor resources to be shared by those that need them. Unfortunately, typical multiprocessor resource management techniques do not scale to large networks. The Prospero Resource Manager (PRM) is a scalable resource allocation system that supports the allocation of processing resources in large networks and multiprocessor systems. To manage resources in such distributed parallel systems, PRM employs three types of managers: system managers, job managers, and node managers. There exist multiple independent instances of each type of manager, reducing bottlenecks. The complexity of each manager is further reduced because each is designed to utilize information at an appropriate level of abstraction.

  8. Building Enterprise Transition Plans Through the Development of Collapsing Design Structure Matrices

    DTIC Science & Technology

    2015-09-17

    processes from the earliest input to the final output to evaluate where change is needed to reduce costs, reduce waste, and improve the flow of information...from) integrating a large complex enterprise? • How should firms/enterprises evaluate systems prior to integration? What are some valid taxonomies

  9. Complexity in electronic negotiation support systems.

    PubMed

    Griessmair, Michele; Strunk, Guido; Vetschera, Rudolf; Koeszegi, Sabine T

    2011-10-01

    It is generally acknowledged that the medium influences the way we communicate, and negotiation research directs considerable attention to the impact of different electronic communication modes on the negotiation process and outcomes. Complexity theories offer models and methods that allow the investigation of how patterns and temporal sequences unfold over time in negotiation interactions. By focusing on the dynamic and interactive quality of negotiations as well as the information, choice, and uncertainty contained in the negotiation process, the complexity perspective addresses several issues of central interest in classical negotiation research. In the present study we compare the complexity of the negotiation communication process between synchronous and asynchronous negotiations (IM vs. e-mail) as well as an electronic negotiation support system including a decision support system (DSS). For this purpose, transcripts of 145 negotiations have been coded and analyzed with the Shannon entropy and the grammar complexity. Our results show that negotiating asynchronously via e-mail, as well as including a DSS, significantly reduces the complexity of the negotiation process. Furthermore, a reduction of the complexity increases the probability of reaching an agreement.
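The Shannon entropy half of the measurement can be sketched directly: for a sequence of coded negotiation acts it measures, in bits, how evenly communication is spread over the coding categories. The category labels below are hypothetical, not the study's actual coding scheme:

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """H = -sum p_i * log2(p_i) over the relative frequencies of codes."""
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

# Hypothetical coded utterances from two transcripts
varied = ["offer", "question", "argument", "offer",
          "concession", "question", "argument", "concession"]
repetitive = ["offer"] * 8

print(shannon_entropy(varied))      # evenly spread over 4 codes -> 2.0 bits
print(shannon_entropy(repetitive))  # a single repeated code -> 0.0 bits
```

Grammar complexity, the study's second measure, additionally captures sequential structure (which code follows which), something this frequency-only entropy ignores.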

  10. Making sense in a complex landscape: how the Cynefin Framework from Complex Adaptive Systems Theory can inform health promotion practice.

    PubMed

    Van Beurden, Eric K; Kia, Annie M; Zask, Avigdor; Dietrich, Uta; Rose, Lauren

    2013-03-01

    Health promotion addresses issues from the simple (with well-known cause/effect links) to the highly complex (webs and loops of cause/effect with unpredictable, emergent properties). Yet there is no conceptual framework within its theory base to help identify approaches appropriate to the level of complexity. The default approach favours reductionism--the assumption that reducing a system to its parts will inform whole system behaviour. Such an approach can yield useful knowledge, yet is inadequate where issues have multiple interacting causes, such as social determinants of health. To address complex issues, there is a need for a conceptual framework that helps choose action that is appropriate to context. This paper presents the Cynefin Framework, informed by complexity science--the study of Complex Adaptive Systems (CAS). It introduces key CAS concepts and reviews the emergence and implications of 'complex' approaches within health promotion. It explains the framework and its use with examples from contemporary practice, and sets it within the context of related bodies of health promotion theory. The Cynefin Framework, especially when used as a sense-making tool, can help practitioners understand the complexity of issues, identify appropriate strategies and avoid the pitfalls of applying reductionist approaches to complex situations. The urgency to address critical issues such as climate change and the social determinants of health calls for us to engage with complexity science. The Cynefin Framework helps practitioners make the shift, and enables those already engaged in complex approaches to communicate the value and meaning of their work in a system that privileges reductionist approaches.

  11. The role of geochemistry and energetics in the evolution of modern respiratory complexes from a proton-reducing ancestor.

    PubMed

    Schut, Gerrit J; Zadvornyy, Oleg; Wu, Chang-Hao; Peters, John W; Boyd, Eric S; Adams, Michael W W

    2016-07-01

    Complex I or NADH quinone oxidoreductase (NUO) is an integral component of modern day respiratory chains and has a close evolutionary relationship with energy-conserving [NiFe]-hydrogenases of anaerobic microorganisms. Specifically, in all of biology, the quinone-binding subunit of Complex I, NuoD, is most closely related to the proton-reducing, H2-evolving [NiFe]-containing catalytic subunit, MbhL, of membrane-bound hydrogenase (MBH), to the methanophenazine-reducing subunit of a methanogenic respiratory complex (FPO) and to the catalytic subunit of an archaeal respiratory complex (MBX) involved in reducing elemental sulfur (S°). These complexes also pump ions and have at least 10 homologous subunits in common. As electron donors, MBH and MBX use ferredoxin (Fd), FPO uses either Fd or cofactor F420, and NUO uses either Fd or NADH. In this review, we examine the evolutionary trajectory of these oxidoreductases from a proton-reducing ancestral respiratory complex (ARC). We hypothesize that the diversification of ARC to MBH, MBX, FPO and eventually NUO was driven by the larger energy yields associated with coupling Fd oxidation to the reduction of oxidants with increasing electrochemical potential, including protons, S° and membrane soluble organic compounds such as phenazines and quinone derivatives. Importantly, throughout Earth's history, the availability of these oxidants increased as the redox state of the atmosphere and oceans became progressively more oxidized as a result of the origin and ecological expansion of oxygenic photosynthesis. ARC-derived complexes are therefore remarkably stable respiratory systems with little diversity in core structure but whose general function appears to have co-evolved with the redox state of the biosphere. This article is part of a Special Issue entitled Respiratory Complex I, edited by Volker Zickermann and Ulrich Brandt. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. GaAs VLSI technology and circuit elements for DSP

    NASA Astrophysics Data System (ADS)

    Mikkelson, James M.

    1990-10-01

    Recent progress in digital GaAs circuit performance and complexity is presented to demonstrate the current capabilities of GaAs components. High density GaAs process technology and circuit design techniques are described, and critical issues for achieving favorable complexity, speed, power and cost tradeoffs are reviewed. Some DSP building blocks are described to provide examples of what types of DSP systems could be implemented with present GaAs technology. DIGITAL GaAs CIRCUIT CAPABILITIES: In the past few years the capabilities of digital GaAs circuits have dramatically increased to the VLSI level. Major gains in circuit complexity and power-delay products have been achieved by the use of silicon-like process technologies and simple circuit topologies. The very high speed and low power consumption of digital GaAs VLSI circuits have made GaAs a desirable alternative to high performance silicon in hardware-intensive high speed system applications. An example of the performance and integration complexity available with GaAs VLSI circuits is the 64x64 crosspoint switch shown in figure 1. This switch, which is the most complex GaAs circuit currently available, is designed on a 30 gate GaAs gate array. It operates at 200 MHz and dissipates only 8 watts of power. The reasons for increasing the level of integration of GaAs circuits are similar to the reasons for the continued increase of silicon circuit complexity. The market factors driving GaAs VLSI are system design methodology, system cost, power and reliability. System designers are hesitant or unwilling to go backwards to previous design techniques and lower levels of integration. A more highly integrated system in a lower performance technology can often approach the performance of a system in a higher performance technology at a lower level of integration.
Higher levels of integration also lower the system component count, which reduces the system cost, size and power consumption while improving the system reliability. For large gate count circuits, the power per gate must be minimized to prevent reliability and cooling problems. The technical factors which favor increasing GaAs circuit complexity are primarily related to reducing the speed and power penalties incurred when crossing chip boundaries. Because the internal GaAs chip logic levels are not compatible with standard silicon I/O levels, input receivers and output drivers are needed to convert levels. These I/O circuits add significant delay to logic paths, consume large amounts of power and use an appreciable portion of the die area. The effects of these I/O penalties can be reduced by increasing the ratio of core logic to I/O on a chip. DSP operations, which have a large number of logic stages between the input and the output, are ideal candidates to take advantage of the performance of GaAs digital circuits. Figure 2 is a schematic representation of the I/O penalties encountered when converting from ECL levels to GaAs

  13. Modeling Power Systems as Complex Adaptive Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Malard, Joel M.; Posse, Christian

    2004-12-30

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that the price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This report explores the state-of-the-art physical analogs for understanding the behavior of some econophysical systems and deriving stable and robust control strategies for using them. We review and discuss applications of some analytic methods based on a thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot be otherwise understood. We apply these methods to the question of how power markets can be expected to behave under a variety of conditions.

  14. Toward Implementing Patient Flow in a Cancer Treatment Center to Reduce Patient Waiting Time and Improve Efficiency.

    PubMed

    Suss, Samuel; Bhuiyan, Nadia; Demirli, Kudret; Batist, Gerald

    2017-06-01

    Outpatient cancer treatment centers can be considered as complex systems in which several types of medical professionals and administrative staff must coordinate their work to achieve the overall goals of providing quality patient care within budgetary constraints. In this article, we use analytical methods that have been successfully employed for other complex systems to show how a clinic can simultaneously reduce patient waiting times and non-value added staff work in a process that has a series of steps, more than one of which involves a scarce resource. The article describes the system model and the key elements in the operation that lead to staff rework and patient queuing. We propose solutions to the problems and provide a framework to evaluate clinic performance. At the time of this report, the proposals are in the process of implementation at a cancer treatment clinic in a major metropolitan hospital in Montreal, Canada.

  15. CAD-Based Aerodynamic Design of Complex Configurations using a Cartesian Method

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.; Pulliam, Thomas H.

    2003-01-01

    A modular framework for aerodynamic optimization of complex geometries is developed. By working directly with a parametric CAD system, complex-geometry models are modified and tessellated in an automatic fashion. The use of a component-based Cartesian method significantly reduces the demands on the CAD system, and also provides for robust and efficient flowfield analysis. The optimization is controlled using either a genetic or quasi-Newton algorithm. Parallel efficiency of the framework is maintained even when subject to limited CAD resources by dynamically re-allocating the processors of the flow solver. Overall, the resulting framework can explore designs incorporating large shape modifications and changes in topology.

  16. 12-mode OFDM transmission using reduced-complexity maximum likelihood detection.

    PubMed

    Lobato, Adriana; Chen, Yingkan; Jung, Yongmin; Chen, Haoshuo; Inan, Beril; Kuschnerov, Maxim; Fontaine, Nicolas K; Ryf, Roland; Spinnler, Bernhard; Lankl, Berthold

    2015-02-01

    We report the transmission of 163-Gb/s MDM-QPSK-OFDM and 245-Gb/s MDM-8QAM-OFDM signals over 74 km of few-mode fiber supporting 12 spatial and polarization modes. A low-complexity maximum likelihood detector is employed to enhance the performance of a system impaired by mode-dependent loss.
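Maximum likelihood detection for a mode-multiplexed system searches over every combination of constellation points across the modes for the candidate vector closest to the received vector. The exhaustive baseline is sketched below in Python (a toy noiseless 2-mode QPSK example with a made-up channel matrix, not the 12-mode experiment); its 4**n_modes cost is exactly what reduced-complexity variants prune:

```python
import itertools

# Unit-energy QPSK constellation
QPSK = [complex(a, b) / (2 ** 0.5) for a in (1, -1) for b in (1, -1)]

def ml_detect(y, H):
    """Exhaustive ML detection: argmin_x ||y - H x||^2 over all symbol
    vectors x. The candidate count grows as 4**n_modes, which is the
    motivation for reduced-complexity (pruned) search strategies."""
    n_modes = len(H[0])
    best, best_metric = None, float("inf")
    for cand in itertools.product(QPSK, repeat=n_modes):
        metric = sum(abs(y[i] - sum(H[i][j] * cand[j]
                                    for j in range(n_modes))) ** 2
                     for i in range(len(y)))
        if metric < best_metric:
            best, best_metric = cand, metric
    return best

# Hypothetical 2x2 channel with mild mode coupling and mode-dependent loss
H = [[1.0, 0.2], [0.1, 0.8]]
tx = (QPSK[0], QPSK[3])
rx = [sum(H[i][j] * tx[j] for j in range(2)) for i in range(2)]  # noiseless
assert ml_detect(rx, H) == tx
```

For the paper's 12 modes the full search space is 4**12 candidates per symbol slot, which is why a reduced-complexity detector matters in practice.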

  17. The Equity of New York State's System of Financing Schools: An Update.

    ERIC Educational Resources Information Center

    Scheuer, Joan

    1983-01-01

    This statistical analysis of the equity and efficiency of New York's complex school finance system concludes that legislation since 1975 has neither significantly reduced wide disparities in local spending nor weakened the link between wealth and expenditure because the system cannot be improved without a substantial funding increase. (MJL)

  18. Methods for evaluating information in managing the enterprise on the basis of a hybrid three-tier system

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-01-01

    The article presents data on the influence of information on the functioning of complex systems in the process of ensuring their effective management. Ways and methods are proposed for evaluating multidimensional information that reduce the time and resources required and improve the validity of management decisions for the system under study.

  19. The effect of Beta-cyclodextrin on percutaneous absorption of commonly used Eusolex® sunscreens.

    PubMed

    Shokri, J; Hasanzadeh, D; Ghanbarzadeh, S; Dizadji-Ilkhchi, M; Adibkia, K

    2013-11-01

    There is a serious concern about the topical and systemic absorption of organic ultraviolet filters in sunscreen formulations and the subsequent phototoxic and photoallergic reactions. Ideally, a sunscreen should localize in the surface of the stratum corneum and create a barrier against UV radiation, but not penetrate into the underlying viable tissues and systemic circulation. The objective of the present study was to determine the effects of β-cyclodextrin (β-CDX) complexation on the transdermal penetration of 3 commonly used sun-blocking agents, Eusolex® 4360 (avobenzone), Eusolex® 9020 (Oxybenzone) and Eusolex® 232 (Ensulizole). The complexation of the sunscreen agents with β-CDX was performed by 3 methods and confirmed by differential scanning calorimetry (DSC). Sunscreens, and their physical mixtures and complexes with β-CDX, were introduced into a model cream base (o/w emulsion). To find out the influence of β-CDX, sunscreen creams were applied to the rat skin in vitro in standard Franz diffusion cells and the amount of sunscreen permeated after 6 h was assessed by HPLC. The skin penetration flux of the UV filters was significantly reduced (4–15-fold) by complexation with β-CDX. Complexation also prolonged the absorption lag time of the sun-blocking agents to more than 150 min. Considering the ability of β-CDX complexation to reduce flux and enhancement ratio as well as to prolong absorption lag time, this technique could be very helpful for reducing systemic absorption of the UV filters and the subsequent toxicity and allergic reactions.

  20. Drug Delivery Systems For Anti-Cancer Active Complexes of Some Coinage Metals.

    PubMed

    Zhang, Ming; Saint-Germain, Camille; He, Guiling; Sun, Raymond Wai-Yin

    2018-02-12

    Although cisplatin and a number of platinum complexes have widely been used for the treatment of neoplasia, patients receiving these treatments have frequently suffered from their severe toxic side effects and from the development of resistance with consequent relapse. In recent decades, numerous complexes of coinage metals, including those of gold, copper and silver, have been reported to display promising in vitro and/or in vivo anti-cancer activities, as well as potent activities towards cisplatin-resistant tumors. Nevertheless, the medical development of these metal complexes has been hampered by their instability in aqueous solutions and their nonspecific binding in biological systems. One of the approaches to overcome these problems is to design and develop adequate drug delivery systems (DDSs) for the transport of these complexes. By functionalization, encapsulation or formulation of the metal complexes, several types of DDSs have been reported to improve the desired pharmacological profile of the metal complexes, improving their overall stability, bioavailability and anti-cancer activity and reducing their toxicity towards normal cells. In this review, we summarize the recent findings on different DDSs for various anti-cancer active complexes of some coinage metals. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  1. Acoustic Levitation With One Driver

    NASA Technical Reports Server (NTRS)

    Wang, T. G.; Rudnick, I.; Elleman, D. D.; Stoneburner, J. D.

    1985-01-01

    Report discusses acoustic levitation in a rectangular chamber using one driver mounted at a corner. Placement of the driver at a corner enables it to couple effectively to acoustic modes along all three axes. Use of a single driver reduces the cost, complexity and weight of the levitation system below those of a three-driver system.

  2. Using a commercial CAD system for simultaneous input to theoretical aerodynamic programs and wind-tunnel model construction

    NASA Technical Reports Server (NTRS)

    Enomoto, F.; Keller, P.

    1984-01-01

    The Computer Aided Design (CAD) system's common geometry database was used to generate input for theoretical programs and numerically controlled (NC) tool paths for wind tunnel part fabrication. This eliminates the duplication of work in generating separate geometry databases for each type of analysis. Another advantage is that it reduces the uncertainty due to geometric differences when comparing theoretical aerodynamic data with wind tunnel data. The system was adapted to aerodynamic research by developing programs written in Design Analysis Language (DAL). These programs reduced the amount of time required to construct complex geometries and to generate input for theoretical programs. Certain shortcomings of the Design, Drafting, and Manufacturing (DDM) software limited the effectiveness of these programs and some of the Calma NC software. The complexity of aircraft configurations suggests that more types of surface and curve geometry should be added to the system. Some of these shortcomings may be eliminated as improved versions of DDM are made available.

  3. Formation of W(3)A(1) electron-transferring flavoprotein (ETF) hydroquinone in the trimethylamine dehydrogenase x ETF protein complex.

    PubMed

    Jang, M H; Scrutton, N S; Hille, R

    2000-04-28

    The electron-transferring flavoprotein (ETF) from Methylophilus methylotrophus (sp. W(3)A(1)) exhibits unusual oxidation-reduction properties and can only be reduced to the level of the semiquinone under most circumstances (including turnover with its physiological reductant, trimethylamine dehydrogenase (TMADH), or reaction with strong reducing reagents such as sodium dithionite). In the present study, we demonstrate that ETF can be reduced fully to its hydroquinone form both enzymatically and chemically when it is in complex with TMADH. Quantitative titration of the TMADH x ETF protein complex with sodium dithionite shows that a total of five electrons are taken up by the system, indicating that full reduction of ETF occurs within the complex. The results indicate that the oxidation-reduction properties of ETF are perturbed upon binding to TMADH, a conclusion further supported by the observation of a spectral change upon formation of the TMADH x ETF complex that is due to a change in the environment of the FAD of ETF. The results are discussed in the context of ETF undergoing a conformational change during formation of the TMADH x ETF electron transfer complex, which modulates the spectral and oxidation-reduction properties of ETF such that full reduction of the protein can take place.

  4. Photo-driven electron transfer from the highly reducing excited state of naphthalene diimide radical anion to a CO2 reduction catalyst within a molecular triad

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinez, Jose F.; La Porte, Nathan T.; Mauck, Catherine M.

    2017-01-01

    The naphthalene-1,4:5,8-bis(dicarboximide) radical anion (NDI•−), which is easily produced by mild chemical or electrochemical reduction (−0.5 V vs. SCE), can be photoexcited at wavelengths as long as 785 nm, and has an excited state (NDI•−*) oxidation potential of −2.1 V vs. SCE, making it a very attractive choice for artificial photosynthetic systems that require powerful photoreductants, such as CO2 reduction catalysts. However, once an electron is transferred from NDI•−* to an acceptor directly bound to it, a combination of strong electronic coupling and favorable free energy change frequently make the back electron transfer rapid. To mitigate this effect, we have designed a molecular triad system comprising an NDI•− chromophoric donor, a 9,10-diphenylanthracene (DPA) intermediate acceptor, and a Re(dmb)(CO)3 carbon dioxide reduction catalyst, where dmb is 4,4'-dimethyl-2,2'-bipyridine, as the terminal acceptor. Photoexcitation of NDI•− to NDI•−* is followed by ultrafast reduction of DPA to DPA•−, which then rapidly reduces the metal complex. The overall time constant for the forward electron transfer to reduce the metal complex is τ = 20.8 ps, while the time constant for back-electron transfer is six orders of magnitude longer, τ = 43.4 μs. Achieving long-lived, highly reduced states of these metal complexes is a necessary condition for their use as catalysts. The extremely long lifetime of the reduced metal complex is attributed to careful tuning of the redox potentials of the chromophore and intermediate acceptor. The NDI•−–DPA fragment presents many attractive features for incorporation into other photoinduced electron transfer assemblies directed at the long-lived photosensitization of difficult-to-reduce catalytic centers.
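The quoted six-orders-of-magnitude separation follows directly from the two time constants reported in the abstract; a quick arithmetic check:

```python
import math

tau_forward = 20.8e-12  # forward electron transfer, 20.8 ps
tau_back = 43.4e-6      # back electron transfer, 43.4 microseconds

ratio = tau_back / tau_forward
print(f"{ratio:.2e}")      # ~2.09e+06
print(math.log10(ratio))   # ~6.3, i.e., six orders of magnitude
```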

  5. Increased Fire and Toxic Contaminant Detection Responsivity by Use of Distributed, Aspirating Sensors

    NASA Technical Reports Server (NTRS)

    Youngblood, Wallace W.

    1990-01-01

    Viewgraphs of increased fire and toxic contaminant detection responsivity by use of distributed, aspirating sensors for space station are presented. Objectives of the concept described are (1) to enhance fire and toxic contaminant detection responsivity in habitable regions of space station; (2) to reduce system weight and complexity through centralized detector/monitor systems; (3) to increase fire signature information from selected locations in a space station module; and (4) to reduce false alarms.

  6. Experimental study of combustion in hydrogen peroxide hybrid rockets

    NASA Astrophysics Data System (ADS)

    Wernimont, Eric John

    Combustion behavior in a hydrogen peroxide oxidized hybrid rocket motor is investigated with a series of experiments. Hybrid chemical rocket propulsion is presently of interest due to reduced system complexity compared to classical chemical propulsion systems. Reduced system complexity, by use of a storable oxidizer and a hybrid configuration, is expected to reduce propulsive costs. The fuel in this study is polyethylene, which has the potential of continuous manufacture leading to further reduced system costs. The study investigated parameters of interest for nominal design of a full scale hydrogen peroxide oxidized hybrid rocket. Among these parameters is the influence of chamber pressure, mass flux, fuel molecular weight and fuel density on fuel regression rate. Effects of chamber pressure and aft combustion length on combustion efficiency and non-acoustic combustion oscillations are also examined. The fuel regression behavior is found to be strongly influenced by both chamber pressure and mass flux. Combustion efficiencies in the upper 90% range are attained by simple changes to the aft combustion chamber length as well as increased combustion pressure. Fuel burning surface is found to be influenced by the density of the polyethylene polymer as well as molecular weight. The combustion is observed to be exceptionally smooth (oscillations less than 5% zero-to-peak of mean) in all motors tested in this program. Tests using both a single port fuel grain and a novel radial flow hybrid are also performed.

  7. The antagonistic modulation of Arp2/3 activity by N-WASP, WAVE2 and PICK1 defines dynamic changes in astrocyte morphology.

    PubMed

    Murk, Kai; Blanco Suarez, Elena M; Cockbill, Louisa M R; Banks, Paul; Hanley, Jonathan G

    2013-09-01

    Astrocytes exhibit a complex, branched morphology, allowing them to functionally interact with numerous blood vessels, neighboring glial processes and neuronal elements, including synapses. They also respond to central nervous system (CNS) injury by a process known as astrogliosis, which involves morphological changes, including cell body hypertrophy and thickening of major processes. Following severe injury, astrocytes exhibit drastically reduced morphological complexity and collectively form a glial scar. The mechanistic details behind these morphological changes are unknown. Here, we investigate the regulation of the actin-nucleating Arp2/3 complex in controlling dynamic changes in astrocyte morphology. In contrast to other cell types, Arp2/3 inhibition drives the rapid expansion of astrocyte cell bodies and major processes. This intervention results in a reduced morphological complexity of astrocytes in both dissociated culture and in brain slices. We show that this expansion requires functional myosin II downstream of ROCK and RhoA. Knockdown of the Arp2/3 subunit Arp3 or the Arp2/3 activator N-WASP by siRNA also results in cell body expansion and reduced morphological complexity, whereas depleting WAVE2 specifically reduces the branching complexity of astrocyte processes. By contrast, knockdown of the Arp2/3 inhibitor PICK1 increases astrocyte branching complexity. Furthermore, astrocyte expansion induced by ischemic conditions is delayed by PICK1 knockdown or N-WASP overexpression. Our findings identify a new morphological outcome for Arp2/3 activation in restricting rather than promoting outward movement of the plasma membrane in astrocytes. The Arp2/3 regulators PICK1, N-WASP and WAVE2 function antagonistically to control the complexity of astrocyte branched morphology, and this mechanism underlies the morphological changes seen in astrocytes during their response to pathological insult.

  8. Complex Instruction Set Quantum Computing

    NASA Astrophysics Data System (ADS)

    Sanders, G. D.; Kim, K. W.; Holton, W. C.

    1998-03-01

    In proposed quantum computers, electromagnetic pulses are used to implement logic gates on quantum bits (qubits). Gates are unitary transformations applied to coherent qubit wavefunctions and a universal computer can be created using a minimal set of gates. By applying many elementary gates in sequence, desired quantum computations can be performed. This reduced instruction set approach to quantum computing (RISC QC) is characterized by serial application of a few basic pulse shapes and a long coherence time. However, the unitary matrix of the overall computation is ultimately a unitary matrix of the same size as any of the elementary matrices. This suggests that we might replace a sequence of reduced instructions with a single complex instruction using an optimally tailored pulse. We refer to this approach as complex instruction set quantum computing (CISC QC). One trades the requirement for long coherence times for the ability to design and generate potentially more complex pulses. We consider a model system of coupled qubits interacting through nearest neighbor coupling and show that CISC QC can reduce the time required to perform quantum computations.
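    The observation that a gate sequence collapses into a single unitary of the same size, the premise behind CISC QC, can be checked numerically. A minimal sketch (our own illustration, not from the paper; the two-qubit Bell-state circuit is an assumed example):

```python
import numpy as np

# Elementary gates (RISC-style instructions) on a 2-qubit register.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# A short "program": H on qubit 0, then CNOT (prepares a Bell state from |00>).
U1 = np.kron(H, I)
U_total = CNOT @ U1   # the whole sequence is itself one 4x4 unitary

# CISC QC observation: a single tailored pulse realizing U_total would
# replace the serial gate sequence; U_total is no larger than its factors.
assert np.allclose(U_total.conj().T @ U_total, np.eye(4))
bell = U_total @ np.array([1, 0, 0, 0])
print(bell)   # amplitudes 1/sqrt(2) on |00> and |11>
```

However many gates the sequence contains, the product stays a 4x4 unitary, so a single optimally shaped pulse implementing it directly trades pulse complexity for coherence time.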

  9. A model for Entropy Production, Entropy Decrease and Action Minimization in Self-Organization

    NASA Astrophysics Data System (ADS)

    Georgiev, Georgi; Chatterjee, Atanu; Vu, Thanh; Iannacchione, Germano

    In self-organization, energy gradients across complex systems lead to changes in the structure of those systems, decreasing their internal entropy to ensure the most efficient energy transport and therefore maximum entropy production in the surroundings. This approach stems from fundamental variational principles in physics, such as the principle of least action. It is coupled to the total energy flowing through a system, which leads to increased action efficiency. We compare energy transport through a fluid cell which has random motion of its molecules, and a cell which can form convection cells. We examine the signs of change of entropy, and the action needed for the motion inside those systems. The system in which convective motion occurs reduces the time for energy transmission, compared to random motion. For more complex systems, those convection cells form a network of transport channels consistent with the equations of motion in this geometry. Those transport networks are an essential feature of complex systems in biology, ecology, economy and society.

  10. Simplified Ion Thruster Xenon Feed System for NASA Science Missions

    NASA Technical Reports Server (NTRS)

    Snyder, John Steven; Randolph, Thomas M.; Hofer, Richard R.; Goebel, Dan M.

    2009-01-01

    The successful implementation of ion thruster technology on the Deep Space 1 technology demonstration mission paved the way for its first use on the Dawn science mission, which launched in September 2007. Both Deep Space 1 and Dawn used a "bang-bang" xenon feed system which has proven to be highly successful. This type of feed system, however, is complex with many parts and requires a significant amount of engineering work for architecture changes. A simplified feed system, with fewer parts and less engineering work for architecture changes, is desirable to reduce the feed system cost to future missions. An attractive new path for ion thruster feed systems is based on new components developed by industry in support of commercial applications of electric propulsion systems. For example, since the launch of Deep Space 1 tens of mechanical xenon pressure regulators have successfully flown on commercial spacecraft using electric propulsion. In addition, active proportional flow controllers have flown on the Hall-thruster-equipped Tacsat-2, are flying on the ion thruster GOCE mission, and will fly next year on the Advanced EHF spacecraft. This present paper briefly reviews the Dawn xenon feed system and those implemented on other xenon electric propulsion flight missions. A simplified feed system architecture is presented that is based on assembling flight-qualified components in a manner that will reduce non-recurring engineering associated with propulsion system architecture changes, and is compared to the NASA Dawn standard. The simplified feed system includes, compared to Dawn, passive high-pressure regulation, a reduced part count, reduced complexity due to cross-strapping, and reduced non-recurring engineering work required for feed system changes. A demonstration feed system was assembled using flight-like components and used to operate a laboratory NSTAR-class ion engine. 
Feed system components integrated into a single-string architecture successfully operated the engine over the entire NSTAR throttle range over a series of tests. Flow rates were very stable with variations of at most 0.2%, and transition times between throttle levels were typically 90 seconds or less with a maximum of 200 seconds, both significant improvements over the Dawn bang-bang feed system.

  11. Lactate oxidation coupled to energy production in mitochondria like particles from Setaria digitata, a filarial parasite.

    PubMed

    Sivan, V M; Raj, R K

    1994-10-14

    In the filarial parasite, Setaria digitata, the mitochondria like particles (MLP) show NAD reduction with sodium lactate. The MLP also reduces dye and ferricyanide with lactate. The ferricyanide reduction by lactate is found to be sensitive to the cytochrome o inhibitor orthohydroxy diphenyl (OHD) and complex I inhibitor rotenone, modulated by ADP (+) and ATP (-) and inhibited by pyruvate and oxaloacetate. MLP shows lactate oxidation sensitive to OHD, rotenone and sodium malonate. Thus, the lactate utilizing complex system, consisting of an NADH generating MLP bound lactate dehydrogenase and a lactate flavocytochrome reductase tightly linked to complex I and cytochrome o, produces ATP in functional association with fumarate reductase complex and other enzyme systems. Hence, this study provides new dimensions to the study of metabolism in filarial parasites.

  12. Contingency Software in Autonomous Systems: Technical Level Briefing

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Patterson-Hines, Ann

    2006-01-01

    Contingency management is essential to the robust operation of complex systems such as spacecraft and Unpiloted Aerial Vehicles (UAVs). Automatic contingency handling allows a faster response to unsafe scenarios with reduced human intervention on low-cost and extended missions. Results, applied to the Autonomous Rotorcraft Project and Mars Science Lab, pave the way to more resilient autonomous systems.

  13. Packaged Waste Treatment

    NASA Technical Reports Server (NTRS)

    1977-01-01

    This Jacksonville, Florida, apartment complex has a wastewater treatment system which clears the water, removes harmful microorganisms and reduces solid residue to ash. It is a spinoff from spacecraft waste management and environmental control technology.

  14. Recent advances in QM/MM free energy calculations using reference potentials.

    PubMed

    Duarte, Fernanda; Amrein, Beat A; Blaha-Nelson, David; Kamerlin, Shina C L

    2015-05-01

    Recent years have seen enormous progress in the development of methods for modeling (bio)molecular systems. This has allowed for the simulation of ever larger and more complex systems. However, as such complexity increases, the requirements needed for these models to be accurate and physically meaningful become more and more difficult to fulfill. The use of simplified models to describe complex biological systems has long been shown to be an effective way to overcome some of the limitations associated with this computational cost in a rational way. Hybrid QM/MM approaches have rapidly become one of the most popular computational tools for studying chemical reactivity in biomolecular systems. However, the high cost involved in performing high-level QM calculations has limited the applicability of these approaches when calculating free energies of chemical processes. In this review, we present some of the advances in using reference potentials and mean field approximations to accelerate high-level QM/MM calculations. We present illustrative applications of these approaches and discuss challenges and future perspectives for the field. The use of physically-based simplifications has been shown to effectively reduce the cost of high-level QM/MM calculations. In particular, lower-level reference potentials enable one to reduce the cost of expensive free energy calculations, thus expanding the scope of problems that can be addressed. As was already demonstrated 40 years ago, the usage of simplified models still allows one to obtain cutting-edge results with substantially reduced computational cost. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014. Published by Elsevier B.V.
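    The reference-potential idea, sampling at a cheap level and then correcting the free energy to the expensive level by free energy perturbation, can be sketched in one dimension. This is our own toy model, not the authors' code: the "low-level" and "high-level" surfaces are harmonic wells chosen so the exact answer is known analytically.

```python
import numpy as np

rng = np.random.default_rng(0)
kT = 1.0

# "Low-level" reference potential and "high-level" target (toy 1-D surfaces).
E_low  = lambda x: 0.5 * x**2          # reference: k = 1
E_high = lambda x: 0.5 * 2.0 * x**2    # target:    k = 2

# Sample the cheap reference ensemble (x ~ N(0, kT/k) for a harmonic well).
x = rng.normal(0.0, np.sqrt(kT), size=200_000)

# Zwanzig (free energy perturbation) correction from low to high level:
#   dF = -kT ln < exp(-(E_high - E_low)/kT) >_low
dF_est = -kT * np.log(np.mean(np.exp(-(E_high(x) - E_low(x)) / kT)))

# Analytic answer for harmonic wells: dF = (kT/2) ln(k_high/k_low)
dF_exact = 0.5 * kT * np.log(2.0)
print(dF_est, dF_exact)
```

All sampling happens on the cheap surface; the expensive potential is only evaluated on the collected configurations, which is what makes the reference-potential strategy economical when E_high is a high-level QM calculation.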

  15. Operationally Efficient Propulsion System Study (OEPSS) data book. Volume 4: OEPSS design concepts

    NASA Technical Reports Server (NTRS)

    Wong, George S.; Ziese, James M.; Farhangi, Shahram

    1990-01-01

    This study was initiated to identify operations problems and cost drivers for current propulsion systems and to identify technology and design approaches to increase the operational efficiency and reduce operations costs for future propulsion systems. To provide readily usable data for the Advanced Launch System (ALS) program, the results of the OEPSS study have been organized into a series of OEPSS Data Books. This volume describes three propulsion concepts that will simplify the propulsion system design and significantly reduce operational requirements. The concepts include: (1) a fully integrated, booster propulsion module concept for the ALS that avoids the complex system created by using autonomous engines with numerous artificial interfaces; (2) an LOX tank aft concept which avoids potentially dangerous geysering in long LOX propellant lines; and (3) an air augmented, rocket engine nozzle afterburning propulsion concept that will significantly reduce LOX propellant requirements, reduce vehicle size and simplify ground operations and ground support equipment and facilities.

  16. Combining a reactive potential with a harmonic approximation for molecular dynamics simulation of failure: construction of a reduced potential

    NASA Astrophysics Data System (ADS)

    Tejada, I. G.; Brochard, L.; Stoltz, G.; Legoll, F.; Lelièvre, T.; Cancès, E.

    2015-01-01

    Molecular dynamics is a simulation technique that can be used to study failure in solids, provided the inter-atomic potential energy is able to account for the complex mechanisms at failure. Reactive potentials fitted on ab initio results or on experimental values have the ability to adapt to any complex atomic arrangement and, therefore, are suited to simulate failure. But the complexity of these potentials, together with the size of the systems considered, make simulations computationally expensive. In order to improve the efficiency of numerical simulations, simpler harmonic potentials can be used instead of complex reactive potentials in the regions where the system is close to its ground state and a harmonic approximation reasonably fits the actual reactive potential. However, the validity and precision of such an approach have not yet been investigated in detail. We present here a methodology for constructing a reduced potential and combining it with the reactive one. We also report some important features of crack propagation that may be affected by the coupling of reactive and reduced potentials. As an illustrative case, we model a crystalline two-dimensional material (graphene) with a reactive empirical bond-order potential (REBO) or with harmonic potentials made of bond and angle springs that are designed to reproduce the second order approximation of REBO in the ground state. We analyze the consistency of this approximation by comparing the mechanical behavior and the phonon spectra of systems modeled with these potentials. These tests reveal when the anharmonicity effects appear. As anharmonic effects originate from strain, stress or temperature, the latter quantities are the basis for establishing coupling criteria for on-the-fly substitution in large simulations.
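    The idea of substituting a cheap harmonic potential near the ground state can be illustrated with a stand-in reactive bond potential. A minimal sketch, assuming a Morse bond (not REBO itself) and matching the spring constant to its curvature at equilibrium:

```python
import numpy as np

# Stand-in "reactive" bond potential: Morse (not REBO; chosen for simplicity).
D, a, r0 = 1.0, 2.0, 1.0                     # depth, stiffness, equilibrium length
morse = lambda r: D * (1.0 - np.exp(-a * (r - r0)))**2

# Reduced potential: harmonic spring matching the second derivative at r0.
# For Morse, d2E/dr2 at r0 equals 2*D*a**2.
k_harm = 2.0 * D * a**2
harm = lambda r: 0.5 * k_harm * (r - r0)**2

# Near the ground state the two agree; under large strain they diverge,
# which is where the reactive potential must take over.
for strain in (0.01, 0.05, 0.20):
    r = r0 * (1.0 + strain)
    rel_err = abs(harm(r) - morse(r)) / morse(r)
    print(f"strain {strain:.2f}: relative error {rel_err:.3f}")
```

The error grows rapidly with strain, which is why the paper bases its coupling criteria for on-the-fly substitution on strain, stress and temperature.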

  17. Increased tumor localization and reduced immune response to adenoviral vector formulated with the liposome DDAB/DOPE.

    PubMed

    Steel, Jason C; Cavanagh, Heather M A; Burton, Mark A; Abu-Asab, Mones S; Tsokos, Maria; Morris, John C; Kalle, Wouter H J

    2007-04-01

    We aimed to increase the efficiency of adenoviral vectors by limiting adenoviral spread from the target site and reducing unwanted host immune responses to the vector. We complexed adenoviral vectors with DDAB-DOPE liposomes to form adenovirus-liposomal (AL) complexes. AL complexes were delivered by intratumoral injection in an immunocompetent subcutaneous rat tumor model and the immunogenicity of the AL complexes and the expression efficiency in the tumor and other organs was examined. Animals treated with the AL complexes had significantly lower levels of beta-galactosidase expression in systemic tissues compared to animals treated with the naked adenovirus (NA) (P<0.05). The tumor to non-tumor ratio of beta-galactosidase marker expression was significantly higher for the AL complex treated animals. NA induced significantly higher titers of adenoviral-specific antibodies compared to the AL complexes (P<0.05). The AL complexes provided protection (immunoshielding) to the adenovirus from neutralizing antibody. Forty-seven percent more beta-galactosidase expression was detected following intratumoral injection with AL complexes compared to the NA in animals pre-immunized with adenovirus. Complexing of adenovirus with liposomes provides a simple method to enhance tumor localization of the vector, decrease the immunogenicity of adenovirus, and provide protection of the virus from pre-existing neutralizing antibodies.

  18. Crosstalk cancellation on linearly and circularly polarized communications satellite links

    NASA Technical Reports Server (NTRS)

    Overstreet, W. P.; Bostian, C. W.

    1979-01-01

    The paper discusses the cancellation network approach for reducing crosstalk caused by depolarization on a dual-polarized communications satellite link. If the characteristics of rain depolarization are sufficiently well known, the cancellation network can be designed in a way that reduces system complexity, the most important parameter being the phase of the cross-polarized signal. Relevant theoretical calculations and experimental data are presented. The simplicity of the cancellation system proposed makes it ideal for use with small domestic or private earth terminals.
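    The cancellation-network principle can be sketched at complex baseband. This is our own toy model with an assumed coupling coefficient; a real design would estimate the coupling from the rain-depolarization statistics the paper discusses, and the phase of that estimate is the critical parameter.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent data streams on orthogonal polarizations (complex baseband).
s_v = rng.choice([-1.0, 1.0], 1000) + 0j
s_h = rng.choice([-1.0, 1.0], 1000) + 0j

# Rain depolarization: a small complex coupling c leaks each stream into
# the opposite polarization (toy value; the real c depends on the rain medium).
c = 0.2 * np.exp(1j * np.deg2rad(40.0))
r_v = s_v + c * s_h
r_h = s_h + c * s_v

# Cancellation network: subtract a weighted copy of the other channel.
# r_v - w*r_h = s_v*(1 - w*c) when w matches c, so dividing by (1 - c**2)
# recovers the transmitted stream exactly.
w = c                     # assume the coupling has been estimated correctly
y_v = (r_v - w * r_h) / (1.0 - w**2)
y_h = (r_h - w * r_v) / (1.0 - w**2)

print(np.max(np.abs(y_v - s_v)))   # residual crosstalk ~ 0
```

With only one complex weight per channel the network stays simple, consistent with the paper's point that a well-characterized depolarization channel permits a low-complexity canceller suitable for small earth terminals.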

  19. Using eDNA to estimate distribution of fish species in a complex river system (presentation)

    EPA Science Inventory

    Environmental DNA (eDNA) analysis of biological material shed by aquatic organisms is a noninvasive genetic tool that can improve efficiency and reduce costs associated with species detection in aquatic systems. eDNA methods are widely used to assess presence/absence of a target ...

  20. Air traffic control : good progress on interim replacement for outage-plagued system, but risks can be further reduced

    DOT National Transportation Integrated Search

    1996-10-01

    Certain air traffic control (ATC) centers experienced a series of major outages, some of which were caused by the Display Channel Complex or DCC, a mainframe computer system that processes radar and other data into displayable images on controlle...

  1. Computer program determines chemical composition of physical system at equilibrium

    NASA Technical Reports Server (NTRS)

    Kwong, S. S.

    1966-01-01

    FORTRAN 4 digital computer program calculates equilibrium composition of complex, multiphase chemical systems. This is a free energy minimization method with solution of the problem reduced to mathematical operations, without concern for the chemistry involved. Also certain thermodynamic properties are determined as byproducts of the main calculations.
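    The free-energy-minimization approach can be illustrated on the simplest possible case, a two-species ideal mixture (our own toy example with made-up standard chemical potentials; the actual program handles multiphase systems with element-balance constraints):

```python
import numpy as np

# Toy isomerization A <-> B in an ideal mixture: minimize the dimensionless
# Gibbs energy G/RT over the mole fraction of A. The standard chemical
# potentials mu_A, mu_B are illustrative values, not from the report.
mu_A, mu_B = 0.0, -1.0

xA = np.linspace(1e-6, 1 - 1e-6, 200_001)
xB = 1.0 - xA
G = xA * (mu_A + np.log(xA)) + xB * (mu_B + np.log(xB))

xA_eq = xA[np.argmin(G)]            # free-energy-minimizing composition

# Cross-check against the equilibrium condition ln(xB/xA) = mu_A - mu_B.
xA_exact = 1.0 / (1.0 + np.exp(mu_A - mu_B))
print(xA_eq, xA_exact)
```

As the abstract notes, the chemistry enters only through the free-energy function; once G is written down, finding the equilibrium composition is a purely mathematical minimization.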

  2. Comparing and improving proper orthogonal decomposition (POD) to reduce the complexity of groundwater models

    NASA Astrophysics Data System (ADS)

    Gosses, Moritz; Nowak, Wolfgang; Wöhling, Thomas

    2017-04-01

    Physically-based modeling is a wide-spread tool in understanding and management of natural systems. With the high complexity of many such models and the huge amount of model runs necessary for parameter estimation and uncertainty analysis, overall run times can be prohibitively long even on modern computer systems. An encouraging strategy to tackle this problem is model reduction. In this contribution, we compare different proper orthogonal decomposition (POD, Siade et al. (2010)) methods and their potential applications to groundwater models. The POD method performs a singular value decomposition on system states as simulated by the complex (e.g., PDE-based) groundwater model taken at several time-steps, so-called snapshots. The singular vectors with the highest information content resulting from this decomposition are then used as a basis for projection of the system of model equations onto a subspace of much lower dimensionality than the original complex model, thereby greatly reducing complexity and accelerating run times. In its original form, this method is only applicable to linear problems. Many real-world groundwater models are non-linear, though. These non-linearities are introduced either through model structure (unconfined aquifers) or boundary conditions (certain Cauchy boundaries, like rivers with variable connection to the groundwater table). To date, applications of POD focused on groundwater models simulating pumping tests in confined aquifers with constant head boundaries. In contrast, POD model reduction either greatly loses accuracy or does not significantly reduce model run time if the above-mentioned non-linearities are introduced. We have also found that variable Dirichlet boundaries are problematic for POD model reduction. An extension to the POD method, called POD-DEIM, has been developed for non-linear groundwater models by Stanko et al. (2016). 
This method uses spatial interpolation points to build the equation system in the reduced model space, thereby allowing the recalculation of system matrices at every time-step necessary for non-linear models while retaining the speed of the reduced model. This makes POD-DEIM applicable for groundwater models simulating unconfined aquifers. However, in our analysis, the method struggled to reproduce variable river boundaries accurately and gave no advantage for variable Dirichlet boundaries compared to the original POD method. We have developed another extension for POD that addresses these remaining problems by performing a second POD operation on the model matrix on the left-hand side of the equation. The method aims to at least reproduce the accuracy of the other methods where they are applicable while outperforming them for setups with changing river boundaries or variable Dirichlet boundaries. We compared the new extension with original POD and POD-DEIM for different combinations of model structures and boundary conditions. The new method shows the potential of POD extensions for applications to non-linear groundwater systems and complex boundary conditions that go beyond the current, relatively limited range of applications. References: Siade, A. J., Putti, M., and Yeh, W. W.-G. (2010). Snapshot selection for groundwater model reduction using proper orthogonal decomposition. Water Resour. Res., 46(8):W08539. Stanko, Z. P., Boyce, S. E., and Yeh, W. W.-G. (2016). Nonlinear model reduction of unconfined groundwater flow using pod and deim. Advances in Water Resources, 97:130 - 143.
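    The basic POD pipeline described here (snapshots, SVD, Galerkin projection onto the dominant modes) can be sketched for a linear toy model. This is our own illustration on a 1-D heat equation, not the groundwater models of the study:

```python
import numpy as np

# Toy full-order model: 1-D heat equation, explicit Euler on n interior nodes.
n, dt, steps = 50, 1e-4, 400
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) * (n + 1)**2       # Laplacian, Dirichlet BCs

x = np.linspace(0, 1, n + 2)[1:-1]
u = np.sin(np.pi * x) + 0.5 * np.sin(3 * np.pi * x)    # initial state

# Run the full model and collect snapshots of the system state.
snaps = []
for _ in range(steps):
    u = u + dt * (A @ u)
    snaps.append(u.copy())
S = np.column_stack(snaps)

# POD: SVD of the snapshot matrix; keep the r most energetic singular vectors.
modes, sing, _ = np.linalg.svd(S, full_matrices=False)
r = 2
Phi = modes[:, :r]

# Galerkin projection: evolve an r x r system instead of n x n.
A_r = Phi.T @ A @ Phi
a = Phi.T @ (np.sin(np.pi * x) + 0.5 * np.sin(3 * np.pi * x))
for _ in range(steps):
    a = a + dt * (A_r @ a)

err = np.linalg.norm(Phi @ a - u) / np.linalg.norm(u)
print(f"relative error of r={r} POD model: {err:.2e}")
```

Because this toy problem is linear with a two-mode solution, two POD modes reproduce the full model almost exactly; the non-linearities and variable boundaries discussed in the abstract are precisely what breaks this clean picture and motivates POD-DEIM and the authors' extension.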

  3. A study of compositional verification based IMA integration method

    NASA Astrophysics Data System (ADS)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics integration, it also increases the complexity of system test, so the IMA test method needs to be simplified. An IMA system provides a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, failures in an IMA system are difficult to isolate. IMA system verification therefore faces the critical problem of how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily exercise the whole system, but for a large, highly integrated avionics system, complete testing is hard to achieve. This paper therefore applies compositional-verification theory to IMA system test, reducing the number of test processes and improving efficiency, and consequently economizing the cost of IMA system integration.

  4. Simplified radio-over-fiber transport systems with a low-cost multiband light source.

    PubMed

    Chang, Ching-Hung; Peng, Peng-Chun; Lu, Hai-Han; Shih, Chine-Liang; Chen, Hwan-Wen

    2010-12-01

    In this Letter, low-cost radio-over-fiber (ROF) transport systems are proposed and experimentally demonstrated. By utilizing a laser diode (LD) and a local oscillator (LO) to generate coherent multiband optical carriers, as well as a self-composed wavelength selector to separate every two carriers for different ROF transport systems, no other dedicated LD or electrical frequency-upconverting circuit/process is needed in the central station (CS). Compared with current ROF systems, the required numbers of LDs, LOs, and mixers in a CS are significantly reduced. Reducing the number of components not only simplifies the network structure but also reduces the volume and complexity of the associated logistics. To demonstrate the practicality of the proposed ROF transport systems, clear eye diagrams and error-free transmission performance are experimentally presented.

  5. Transition Manifolds of Complex Metastable Systems: Theory and Data-Driven Computation of Effective Dynamics.

    PubMed

    Bittracher, Andreas; Koltai, Péter; Klus, Stefan; Banisch, Ralf; Dellnitz, Michael; Schütte, Christof

    2018-01-01

    We consider complex dynamical systems showing metastable behavior, but no local separation of fast and slow time scales. The article raises the question of whether such systems exhibit a low-dimensional manifold supporting its effective dynamics. For answering this question, we aim at finding nonlinear coordinates, called reaction coordinates, such that the projection of the dynamics onto these coordinates preserves the dominant time scales of the dynamics. We show that, based on a specific reducibility property, the existence of good low-dimensional reaction coordinates preserving the dominant time scales is guaranteed. Based on this theoretical framework, we develop and test a novel numerical approach for computing good reaction coordinates. The proposed algorithmic approach is fully local and thus not prone to the curse of dimension with respect to the state space of the dynamics. Hence, it is a promising method for data-based model reduction of complex dynamical systems such as molecular dynamics.

  6. Transition Manifolds of Complex Metastable Systems

    NASA Astrophysics Data System (ADS)

    Bittracher, Andreas; Koltai, Péter; Klus, Stefan; Banisch, Ralf; Dellnitz, Michael; Schütte, Christof

    2018-04-01

    We consider complex dynamical systems showing metastable behavior, but no local separation of fast and slow time scales. The article raises the question of whether such systems exhibit a low-dimensional manifold supporting its effective dynamics. For answering this question, we aim at finding nonlinear coordinates, called reaction coordinates, such that the projection of the dynamics onto these coordinates preserves the dominant time scales of the dynamics. We show that, based on a specific reducibility property, the existence of good low-dimensional reaction coordinates preserving the dominant time scales is guaranteed. Based on this theoretical framework, we develop and test a novel numerical approach for computing good reaction coordinates. The proposed algorithmic approach is fully local and thus not prone to the curse of dimension with respect to the state space of the dynamics. Hence, it is a promising method for data-based model reduction of complex dynamical systems such as molecular dynamics.

  7. Exploring Machine Learning Techniques For Dynamic Modeling on Future Exascale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Shuaiwen; Tallent, Nathan R.; Vishnu, Abhinav

    2013-09-23

    Future exascale systems must be optimized for both power and performance at scale in order to achieve DOE’s goal of a sustained petaflop within 20 Megawatts by 2022 [1]. Massive parallelism of the future systems combined with complex memory hierarchies will form a barrier to efficient application and architecture design. These challenges are exacerbated with emerging complex architectures such as GPGPUs and Intel Xeon Phi as parallelism increases orders of magnitude and system power consumption can easily triple or quadruple. Therefore, we need techniques that can reduce the search space for optimization, isolate power-performance bottlenecks, identify root causes for software/hardware inefficiency, and effectively direct runtime scheduling.

  8. Optically controlled phased-array antenna technology for space communication systems

    NASA Technical Reports Server (NTRS)

    Kunath, Richard R.; Bhasin, Kul B.

    1988-01-01

    Using MMICs in phased-array applications above 20 GHz requires complex RF and control signal distribution systems. Conventional waveguide, coaxial cable, and microstrip methods are undesirable due to their high weight, high loss, limited mechanical flexibility and large volume. An attractive alternative to these transmission media, for RF and control signal distribution in MMIC phased-array antennas, is optical fiber. Presented are potential system architectures and their associated characteristics. The status of high frequency opto-electronic components needed to realize the potential system architectures is also discussed. It is concluded that an optical fiber network will reduce weight and complexity, and increase reliability and performance, but may require higher power.

  9. Sites of superoxide and hydrogen peroxide production during fatty acid oxidation in rat skeletal muscle mitochondria

    PubMed Central

    Perevoshchikova, Irina V.; Quinlan, Casey L.; Orr, Adam L.; Gerencser, Akos A.; Brand, Martin D.

    2013-01-01

    H2O2 production by skeletal muscle mitochondria oxidizing palmitoylcarnitine was examined under two conditions: the absence of respiratory chain inhibitors and the presence of myxothiazol to inhibit complex III. Without inhibitors, respiration and H2O2 production were low unless carnitine or malate was added to limit acetyl-CoA accumulation. With palmitoylcarnitine alone, H2O2 production was dominated by complex II (44% from site IIF in the forward reaction); the remainder was mostly from complex I (34%, superoxide from site IF). With added carnitine, H2O2 production was about equally shared between complexes I, II, and III. With added malate, it was 75% from complex III (superoxide from site IIIQo) and 25% from site IF. Thus complex II (site IIF in the forward reaction) is a major source of H2O2 production during oxidation of palmitoylcarnitine ± carnitine. Under the second condition (myxothiazol present to keep ubiquinone reduced), the rates of H2O2 production were highest in the presence of palmitoylcarnitine ± carnitine and were dominated by complex II (site IIF in the reverse reaction). About half the rest was from site IF, but a significant portion, ~40 pmol H2O2 · min−1 · mg protein−1, was not from complex I, II, or III and was attributed to the proteins of β-oxidation (electron-transferring flavoprotein (ETF) and ETF-ubiquinone oxidoreductase). The maximum rate from the ETF system was ~200 pmol H2O2 · min−1 · mg protein−1 under conditions of compromised antioxidant defense and reduced ubiquinone pool. Thus complex II and the ETF system both contribute to H2O2 production during fatty acid oxidation under appropriate conditions. PMID:23583329

  10. Analytical Micromechanics Modeling Technique Developed for Ceramic Matrix Composites Analysis

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    Ceramic matrix composites (CMCs) promise many advantages for next-generation aerospace propulsion systems. Specifically, carbon-reinforced silicon carbide (C/SiC) CMCs enable higher operational temperatures and provide potential component weight savings by virtue of their high specific strength. These attributes may provide systemwide benefits. Higher operating temperatures lessen or eliminate the need for cooling, thereby reducing both fuel consumption and the complex hardware and plumbing required for heat management. This, in turn, lowers system weight, size, and complexity, while improving efficiency, reliability, and service life, resulting in overall lower operating costs.

  11. Modern cockpit complexity challenges pilot interfaces.

    PubMed

    Dornheim, M A

    1995-01-30

    Advances in the use of automated cockpits are examined. Crashes at Nagoya and Toulouse in 1994 and incidents at Manchester, England, and Paris Orly are used as examples of cockpit automation versus manual operation of aircraft. Human factors researchers conclude that flight management systems (FMS) should have fewer modes and less authority. Reducing complexity and authority override systems of FMS can provide pilots with greater flexibility during crises. Aircraft discussed include Boeing 737-300 and 757-200, Airbus A300-600 and A310, McDonnell Douglas MD-11, and Tarom A310-300.

  12. Photoinduced electron transfer from rylenediimide radical anions and dianions to Re(bpy)(CO)3 using red and near-infrared light

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    La Porte, Nathan T.; Martinez, Jose F.; Hedström, Svante

    A major goal of artificial photosynthesis research is photosensitizing highly reducing metal centers using as much as possible of the solar spectrum reaching Earth's surface. The radical anions and dianions of rylenediimide (RDI) dyes, which absorb at wavelengths as long as 950 nm, are powerful photoreductants with excited state oxidation potentials that rival or exceed those of organometallic chromophores. These dyes have been previously incorporated into all-organic donor–acceptor systems, but have not yet been shown to reduce organometallic centers. This study describes a set of dyads in which perylenediimide (PDI) or naphthalenediimide (NDI) chromophores are attached to Re(bpy)(CO)3 through either the bipyridine ligand or more directly to the Re center via a pyridine ligand. The chromophores are reduced with a mild reducing agent, after which excitation with long-wavelength red or near-infrared light leads to reduction of the Re complex. The kinetics of electron transfer from the photoexcited anions to the Re complex are monitored using transient visible/near-IR and mid-IR spectroscopy, complemented by theoretical spectroscopic assignments. The photo-driven charge shift from the reduced PDI or NDI to the complex occurs in picoseconds regardless of whether PDI or NDI is attached to the bipyridine or to the Re center, but back electron transfer is found to be three orders of magnitude slower with the chromophore attached to the Re center. These results will inform the design of future catalytic systems that incorporate RDI anions as chromophores.

  13. Photoinduced electron transfer from rylenediimide radical anions and dianions to Re(bpy)(CO)3 using red and near-infrared light

    DOE PAGES

    La Porte, Nathan T.; Martinez, Jose F.; Hedström, Svante; ...

    2017-02-28

    A major goal of artificial photosynthesis research is photosensitizing highly reducing metal centers using as much as possible of the solar spectrum reaching Earth's surface. The radical anions and dianions of rylenediimide (RDI) dyes, which absorb at wavelengths as long as 950 nm, are powerful photoreductants with excited state oxidation potentials that rival or exceed those of organometallic chromophores. These dyes have been previously incorporated into all-organic donor–acceptor systems, but have not yet been shown to reduce organometallic centers. This study describes a set of dyads in which perylenediimide (PDI) or naphthalenediimide (NDI) chromophores are attached to Re(bpy)(CO)3 through either the bipyridine ligand or more directly to the Re center via a pyridine ligand. The chromophores are reduced with a mild reducing agent, after which excitation with long-wavelength red or near-infrared light leads to reduction of the Re complex. The kinetics of electron transfer from the photoexcited anions to the Re complex are monitored using transient visible/near-IR and mid-IR spectroscopy, complemented by theoretical spectroscopic assignments. The photo-driven charge shift from the reduced PDI or NDI to the complex occurs in picoseconds regardless of whether PDI or NDI is attached to the bipyridine or to the Re center, but back electron transfer is found to be three orders of magnitude slower with the chromophore attached to the Re center. These results will inform the design of future catalytic systems that incorporate RDI anions as chromophores.

  14. Ground state atoms confined in a real Rydberg and complex Rydberg-Scarf II potential

    NASA Astrophysics Data System (ADS)

    Mansoori Kermani, Maryam

    2017-12-01

    In this work, a system of two ground state atoms confined in a one-dimensional real Rydberg potential was modeled. The atom-atom interaction was considered as a nonlocal separable potential (NLSP) of rank one. This form was chosen because it leads to an analytical solution of the Lippmann-Schwinger equation. NLSPs are useful in few-body problems in which the many-body potential at each point is replaced by a projective two-body nonlocal potential operator. Analytical expressions for the confined-particle resolvent were calculated as a key function in this study. The contributions of the bound and virtual states in the complex energy plane were obtained via the derived transition matrix. Since the scattering length is an important quantity in low-energy quantum scattering problems, its behavior was described as a function of the reduced energy for various values of the potential parameters. In a one-dimensional model the total cross section, in units of area, is not a meaningful property; however, the reflectance coefficient plays a similar role. Therefore, the reflectance probability and its behavior were investigated. Then a new confining potential, called the Rydberg-Scarf II potential, was introduced by combining the complex absorbing Scarf II potential with the real Rydberg potential to construct a non-Hermitian Hamiltonian. To investigate the effect of the complex potential, the scattering length and reflectance coefficient were calculated. It was concluded that, in addition to the competition between the repulsive and attractive parts of both potentials, the imaginary part of the complex potential has an important effect on the properties of the system. The complex potential also reduces the reflectance probability by increasing the absorption probability. For all numerical computations, the parameters of a system of argon gas confined in graphite were used.
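
    The analytical tractability of a rank-one nonlocal separable potential comes from the closed-form solution of the Lippmann-Schwinger equation (a standard textbook result, in generic notation, not the paper's):

```latex
% For a rank-one separable potential V = \lambda\,|g\rangle\langle g|,
% the Lippmann-Schwinger equation T = V + V G_0(E) T closes exactly:
T(E) = \frac{\lambda\,|g\rangle\langle g|}{1 - \lambda\,\langle g|G_0(E)|g\rangle},
\qquad G_0(E) = (E - H_0)^{-1}.
% Bound and virtual states appear as poles of T(E) in the complex energy
% plane, i.e. solutions of \lambda\,\langle g|G_0(E)|g\rangle = 1.
```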

  15. Access to Formally Ni(I) States in a Heterobimetallic NiZn System

    PubMed Central

    Uyeda, Christopher

    2014-01-01

    Heterobimetallic NiZn complexes featuring metal centers in distinct coordination environments have been synthesized using diimine-dioxime ligands as binucleating scaffolds. A tetramethylfuran-containing ligand derivative enables a stable one-electron-reduced S = 1/2 species to be accessed using Cp2Co as a chemical reductant. The resulting pseudo-square planar complex exhibits spectroscopic and crystallographic characteristics of a ligand-centered radical bound to a Ni(II) center. Upon coordination of a π-acidic ligand such as PPh3, however, a five-coordinate Ni(I) metalloradical is formed. The electronic structures of these reduced species provide insight into the subtle effects of ligand structure on the potential and reversibility of the NiII/I couple for complexes of redox-active tetraazamacrocycles. PMID:25614786

  16. ALC: automated reduction of rule-based models

    PubMed Central

    Koschorreck, Markus; Gilles, Ernst Dieter

    2008-01-01

    Background Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction since the association of a few proteins can give rise to an enormous number of feasible protein complexes. The layer-based approach is an approximate but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results ALC (Automated Layer Construction) is a computer program that highly simplifies the building of reduced modular models, according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files. PMID:18973705
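
    The combinatorial blow-up that motivates the layer-based reduction can be shown with a toy count (an illustration only, not ALC's algorithm): a scaffold protein with n independent binding sites generates 2^n distinct complexes, while a site-oriented ("layer") description needs only n occupancy variables.

```python
# Toy illustration of combinatorial complexity in rule-based models:
# each of n independent binding sites is either empty or occupied.
def microstate_count(n_sites: int) -> int:
    # number of distinct protein complexes (full combinatorial model)
    return 2 ** n_sites

def layer_variable_count(n_sites: int) -> int:
    # number of site-occupancy variables in a layer-style description
    return n_sites

for n in (4, 8, 16):
    print(n, microstate_count(n), layer_variable_count(n))
```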

  17. Wigner distribution functions for complex dynamical systems: the emergence of the Wigner-Boltzmann equation.

    PubMed

    Sels, Dries; Brosens, Fons

    2013-10-01

    The equation of motion for the reduced Wigner function of a system coupled to an external quantum system is presented for the specific case when the external quantum system can be modeled as a set of harmonic oscillators. The result is derived from the Wigner function formulation of the Feynman-Vernon influence functional theory. It is shown how the true self-energy for the equation of motion is connected with the influence functional for the path integral. Explicit expressions are derived in terms of the bare Wigner propagator. Finally, we show under which approximations the resulting equation of motion reduces to the Wigner-Boltzmann equation.
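
    For reference, the objects named in this abstract have standard textbook forms (generic notation, not necessarily the authors' conventions):

```latex
% Wigner function of a state with density matrix \rho:
W(x,p,t) = \frac{1}{2\pi\hbar}\int \mathrm{d}y\;
  \Bigl\langle x+\tfrac{y}{2}\Bigr|\,\rho(t)\,\Bigl|x-\tfrac{y}{2}\Bigr\rangle\,
  e^{-ipy/\hbar}
% Wigner-Boltzmann form: free streaming and force terms on the left,
% with a collision term generated by the harmonic-oscillator environment:
\frac{\partial W}{\partial t} + \frac{p}{m}\frac{\partial W}{\partial x}
  - \frac{\partial V}{\partial x}\frac{\partial W}{\partial p}
  = \left(\frac{\partial W}{\partial t}\right)_{\mathrm{coll}}
```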

  18. Cybersecurity in Hospitals: A Systematic, Organizational Perspective

    PubMed Central

    Kaiser, Jessica P

    2018-01-01

    Background Cybersecurity incidents are a growing threat to the health care industry in general and hospitals in particular. The health care industry has lagged behind other industries in protecting its main stakeholder (ie, patients), and now hospitals must invest considerable capital and effort in protecting their systems. However, this is easier said than done because hospitals are extraordinarily technology-saturated, complex organizations with high end point complexity, internal politics, and regulatory pressures. Objective The purpose of this study was to develop a systematic and organizational perspective for studying (1) the dynamics of cybersecurity capability development at hospitals and (2) how these internal organizational dynamics interact to form a system of hospital cybersecurity in the United States. Methods We conducted interviews with hospital chief information officers, chief information security officers, and health care cybersecurity experts; analyzed the interview data; and developed a system dynamics model that unravels the mechanisms by which hospitals build cybersecurity capabilities. We then use simulation analysis to examine how changes to variables within the model affect the likelihood of cyberattacks across both individual hospitals and a system of hospitals. Results We discuss several key mechanisms that hospitals use to reduce the likelihood of cybercriminal activity. The variable that most influences the risk of cyberattack in a hospital is end point complexity, followed by internal stakeholder alignment. Although resource availability is important in fueling efforts to close cybersecurity capability gaps, low levels of resources could be compensated for by setting a high target level of cybersecurity. 
Conclusions To enhance cybersecurity capabilities at hospitals, the main focus of chief information officers and chief information security officers should be on reducing end point complexity and improving internal stakeholder alignment. These strategies can solve cybersecurity problems more effectively than blindly pursuing more resources. On a macro level, the cyber vulnerability of a country’s hospital infrastructure is affected by the vulnerabilities of all individual hospitals. In this large system, reducing variation in resource availability makes the whole system less vulnerable—a few hospitals with low resources for cybersecurity threaten the entire infrastructure of health care. In other words, hospitals need to move forward together to make the industry less attractive to cybercriminals. Moreover, although compliance is essential, it does not equal security. Hospitals should set their target level of cybersecurity beyond the requirements of current regulations and policies. As of today, policies mostly address data privacy, not data security. Thus, policy makers need to introduce policies that not only raise the target level of cybersecurity capabilities but also reduce the variability in resource availability across the entire health care system. PMID:29807882
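
    A minimal system-dynamics sketch in the spirit of the mechanism described above (the stock-flow structure and all parameters here are invented for illustration and are not the paper's model): capability is a stock built toward a target at a resource-limited rate, and attack likelihood falls as the capability gap closes.

```python
# Hypothetical stock-flow model: cybersecurity capability accumulates
# toward a target level; the build rate is limited by resources.
def simulate(target=0.9, resources=0.3, steps=100, dt=1.0):
    capability = 0.1
    for _ in range(steps):
        gap = target - capability
        capability += resources * gap * dt   # resource-limited gap closing
    attack_likelihood = 1.0 - capability     # crude proxy for vulnerability
    return capability, attack_likelihood

high_target = simulate(target=0.9, resources=0.2)
low_target = simulate(target=0.6, resources=0.2)
# with equal (limited) resources, a higher target yields lower risk
print(round(high_target[1], 3), round(low_target[1], 3))  # → 0.1 0.4
```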

  19. Where systems biology meets postharvest

    USDA-ARS?s Scientific Manuscript database

    Interpreting fruit metabolism, particularly tree fruit metabolism, presents unique challenges. Long periods from tree establishment to fruiting render techniques directed towards reducing the complexity of metabolic mechanisms, such as genomic modification, relatively difficult. Consequently, holi...

  20. Atomistic Simulations of Complex DNA DSBs and the Interactions with Ku70/80 Heterodimer

    NASA Technical Reports Server (NTRS)

    Hu, Shaowen; Cucinotta, Francis A.

    2011-01-01

    Compared to DNA with simple DSBs, the complex lesions can enhance the hydrogen bonds opening rate at the DNA terminus, and increase the mobility of the whole duplex. Binding of Ku drastically reduces the structural disruption and flexibility caused by the complex lesions. In all complex DSBs systems, the binding of DSB terminus with Ku70 is softened while the binding of the middle duplex with Ku80 is tightened. Binding of Ku promotes the rigidity of DNA duplexes, due to the clamp structure of the inner surface of the rings of Ku70/80.

  1. Improved Dye Stability in Single-Molecule Fluorescence Experiments

    NASA Astrophysics Data System (ADS)

    Echeverría Aitken, Colin; Marshall, R. Andrew; Puglisi, Joseph D.

    Complex biological systems challenge existing single-molecule methods. In particular, dye stability limits observation time in single-molecule fluorescence applications. Current approaches to improving dye performance involve the addition of enzymatic oxygen scavenging systems and small-molecule additives. We present an enzymatic oxygen scavenging system that improves dye stability in single-molecule experiments. Compared to the currently employed glucose oxidase/catalase system, the protocatechuate-3,4-dioxygenase system achieves lower dissolved oxygen concentration and stabilizes single Cy3, Cy5, and Alexa488 fluorophores. Moreover, this system possesses none of the limitations associated with the glucose oxidase/catalase system. We also tested the effects of small-molecule additives in this system. Biological reducing agents significantly destabilize the Cy5 fluorophore as a function of reducing potential. In contrast, antioxidants stabilize the Cy3 and Alexa488 fluorophores. We recommend use of the protocatechuate-3,4-dioxygenase system with antioxidant additives, and in the absence of biological reducing agents. This system should have wide application to single-molecule fluorescence experiments.

  2. Complexity and the Limits of Revolution: What Will Happen to the Arab Spring?

    NASA Astrophysics Data System (ADS)

    Gard-Murray, Alexander S.; Bar-Yam, Yaneer

    The recent social unrest across the Middle East and North Africa has deposed dictators who had ruled for decades. While the events have been hailed as an "Arab Spring" by those who hope that repressive autocracies will be replaced by democracies, what sort of regimes will eventually emerge from the crisis remains far from certain. Here we provide a complex systems framework, validated by historical precedent, to help answer this question. We describe the dynamics of governmental change as an evolutionary process similar to biological evolution, in which complex organizations gradually arise by replication, variation, and competitive selection. Different kinds of governments, however, have differing levels of complexity. Democracies must be more systemically complex than autocracies because of their need to incorporate large numbers of people in decision-making. This difference has important implications for the relative robustness of democratic and autocratic governments after revolutions. Revolutions may disrupt existing evolved complexity, limiting the potential for building more complex structures quickly. Insofar as systemic complexity is reduced by revolution, democracy is harder to create in the wake of unrest than autocracy. Applying this analysis to the Middle East and North Africa, we infer that in the absence of stable institutions or external assistance, new governments are in danger of facing increasingly insurmountable challenges and reverting to autocracy.

  3. Computational and experimental study of airflow around a fan powered UVGI lamp

    NASA Astrophysics Data System (ADS)

    Kaligotla, Srikar; Tavakoli, Behtash; Glauser, Mark; Ahmadi, Goodarz

    2011-11-01

    The quality of the indoor air environment is very important for improving the health of occupants and reducing personal exposure to hazardous pollutants. An effective way of controlling air quality is by eliminating airborne bacteria and viruses or by reducing their emissions. Ultraviolet Germicidal Irradiation (UVGI) lamps can effectively reduce these bio-contaminants in an indoor environment, but the efficiency of these systems depends on airflow in and around the device. UVGI lamps would not be as effective in stagnant environments as they would be when moving air brings the bio-contaminants into their irradiation region. Introducing a fan into the UVGI system would augment the system's kill rate. Airflows in ventilated spaces are quite complex due to the vast range of length and velocity scales. The purpose of this research is to study these complex airflows using CFD techniques and to validate the computational model against airflow measurements around the device obtained with Particle Image Velocimetry. The experimental results, including mean velocities, length scales and RMS values of fluctuating velocities, are used in the CFD validation. Comparisons of these data at different locations around the device with the CFD model predictions were performed, and good agreement was observed.

  4. Biochar amendment immobilizes lead in rice paddy soils and reduces its phytoavailability

    NASA Astrophysics Data System (ADS)

    Li, Honghong; Liu, Yuting; Chen, Yanhui; Wang, Shanli; Wang, Mingkuang; Xie, Tuanhui; Wang, Guo

    2016-08-01

    This study aimed to determine effects of rice straw biochar on Pb sequestration in a soil-rice system. Pot experiments were conducted with rice plants in Pb-contaminated paddy soils that had been amended with 0, 2.5, and 5% (w/w) biochar. Compared to the control treatment, amendment with 5% biochar resulted in 54 and 94% decreases in the acid soluble and CaCl2-extractable Pb, respectively, in soils containing rice plants at the maturity stage. The amount of Fe-plaque on root surfaces and the Pb concentrations of the Fe-plaque were also reduced in biochar amended soils. Furthermore, lead species in rice roots were determined using Pb L3-edge X-ray absorption near edge structure (XANES), and although Pb-ferrihydrite complexes dominated Pb inventories, increasing amounts of organic complexes like Pb-pectins and Pb-cysteine were found in roots from the 5% biochar treatments. Such organic complexes might impede Pb translocation from root to shoot and subsequently reduce Pb accumulation in rice with biochar amendment.

  5. Complexity, fractal dynamics and determinism in treadmill ambulation: Implications for clinical biomechanists.

    PubMed

    Hollman, John H; Watkins, Molly K; Imhoff, Angela C; Braun, Carly E; Akervik, Kristen A; Ness, Debra K

    2016-08-01

    Reduced inter-stride complexity during ambulation may represent a pathologic state. Evidence is emerging that treadmill training for rehabilitative purposes may constrain the locomotor system and alter gait dynamics in a way that mimics pathological states. The purpose of this study was to examine the dynamical system components of gait complexity, fractal dynamics and determinism during treadmill ambulation. Twenty healthy participants aged 23.8 (1.2) years walked at preferred walking speeds for 6 min on a motorized treadmill and overground while wearing APDM 6 Opal inertial monitors. Stride times, stride lengths and peak sagittal plane trunk velocities were measured. Mean values and estimates of complexity, fractal dynamics and determinism were calculated for each parameter. Data were compared between overground and treadmill walking conditions. Mean values for each gait parameter were statistically equivalent between overground and treadmill ambulation (P>0.05). Through nonlinear analyses, however, we found that complexity in stride time signals (P<0.001), and long-range correlations in stride time and stride length signals (P=0.005 and P=0.024, respectively), were reduced on the treadmill. Treadmill ambulation induces more predictable inter-stride time dynamics and constrains fluctuations in stride times and stride lengths, which may alter feedback from destabilizing perturbations normally experienced by the locomotor control system during overground ambulation. Treadmill ambulation, therefore, may provide less opportunity for experiencing the adaptability necessary to successfully ambulate overground. Investigators and clinicians should be aware that treadmill ambulation will alter dynamic gait characteristics. Copyright © 2016 Elsevier Ltd. All rights reserved.
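
    Long-range correlations of the kind reported above are commonly quantified with detrended fluctuation analysis (DFA); below is a self-contained sketch of the standard DFA-1 procedure (the generic method, not the authors' exact pipeline), where the scaling exponent alpha is ~0.5 for uncorrelated noise and ~1.0 for 1/f-like stride-time series.

```python
import math
import random

def _linear_detrend_rms(seg):
    # least-squares line fit over the window, then RMS of the residuals
    n = len(seg)
    t_mean = (n - 1) / 2
    y_mean = sum(seg) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(seg))
    den = sum((t - t_mean) ** 2 for t in range(n))
    slope = num / den
    resid = [y - (y_mean + slope * (t - t_mean)) for t, y in enumerate(seg)]
    return math.sqrt(sum(r * r for r in resid) / n)

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    # 1) integrate the mean-subtracted series into a profile
    mean = sum(x) / len(x)
    profile, acc = [], 0.0
    for v in x:
        acc += v - mean
        profile.append(acc)
    # 2) average detrended RMS fluctuation F(s) at each window size s
    log_s, log_f = [], []
    for s in scales:
        rms = [_linear_detrend_rms(profile[i * s:(i + 1) * s])
               for i in range(len(profile) // s)]
        log_s.append(math.log(s))
        log_f.append(math.log(sum(rms) / len(rms)))
    # 3) alpha is the slope of log F(s) vs log s
    ls_mean = sum(log_s) / len(log_s)
    lf_mean = sum(log_f) / len(log_f)
    return (sum((a - ls_mean) * (b - lf_mean) for a, b in zip(log_s, log_f))
            / sum((a - ls_mean) ** 2 for a in log_s))

random.seed(1)
white = [random.gauss(0, 1) for _ in range(2048)]
print(round(dfa_alpha(white), 2))  # close to 0.5 for uncorrelated noise
```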

  6. Master-slave system with force feedback based on dynamics of virtual model

    NASA Technical Reports Server (NTRS)

    Nojima, Shuji; Hashimoto, Hideki

    1994-01-01

    A master-slave system can extend manipulating and sensing capabilities of a human operator to a remote environment. But the master-slave system has two serious problems: one is the mechanically large impedance of the system; the other is the mechanical complexity of the slave for complex remote tasks. These two problems reduce the efficiency of the system. If the slave has local intelligence, it can help the human operator by using its good points like fast calculation and large memory. The authors suggest that the slave is a dextrous hand with many degrees of freedom able to manipulate an object of known shape. It is further suggested that the dimensions of the remote work space be shared by the human operator and the slave. The effect of the large impedance of the system can be reduced in a virtual model, a physical model constructed in a computer with physical parameters as if it were in the real world. A method to determine the damping parameter dynamically for the virtual model is proposed. Experimental results show that this virtual model is better than the virtual model with fixed damping.
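
    One generic way to realize a virtual model with a dynamically chosen damping parameter (a sketch of the general idea, not the authors' specific method) is to simulate a mass-spring-damper in software and set the damping to its critical value, b = 2*sqrt(m*k), so the virtual object settles without oscillation even if the stiffness k changes.

```python
import math

# Virtual model as a mass-spring-damper; damping is computed from the
# current mass and stiffness rather than fixed in advance.
def simulate(m=1.0, k=50.0, x0=0.1, steps=5000, dt=0.001):
    b = 2.0 * math.sqrt(m * k)       # critical damping (recompute if k changes)
    x, v = x0, 0.0
    for _ in range(steps):
        a = (-k * x - b * v) / m     # virtual dynamics: F = -k x - b v
        v += a * dt                  # semi-implicit Euler integration
        x += v * dt
    return x

print(abs(simulate()) < 1e-3)  # → True (settled near equilibrium)
```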

  7. Strategies for Reduced-Order Models in Uncertainty Quantification of Complex Turbulent Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Qi, Di

    Turbulent dynamical systems are ubiquitous in science and engineering. Uncertainty quantification (UQ) in turbulent dynamical systems is a grand challenge where the goal is to obtain statistical estimates for key physical quantities. In the development of a proper UQ scheme for systems characterized by both a high-dimensional phase space and a large number of instabilities, significant model errors compared with the true natural signal are always unavoidable due to both the imperfect understanding of the underlying physical processes and the limited computational resources available. One central issue in contemporary research is the development of a systematic methodology for reduced order models that can recover the crucial features both with model fidelity in statistical equilibrium and with model sensitivity in response to perturbations. In the first part, we discuss a general mathematical framework to construct statistically accurate reduced-order models that have skill in capturing the statistical variability in the principal directions of a general class of complex systems with quadratic nonlinearity. A systematic hierarchy of simple statistical closure schemes, which are built through new global statistical energy conservation principles combined with statistical equilibrium fidelity, are designed and tested for UQ of these problems. Second, the capacity of imperfect low-order stochastic approximations to model extreme events in a passive scalar field advected by turbulent flows is investigated. The effects in complicated flow systems are considered including strong nonlinear and non-Gaussian interactions, and much simpler and cheaper imperfect models with model error are constructed to capture the crucial statistical features in the stationary tracer field. Several mathematical ideas are introduced to improve the prediction skill of the imperfect reduced-order models. 
Most importantly, empirical information theory and statistical linear response theory are applied in the training phase for calibrating model errors to achieve optimal imperfect model parameters; and total statistical energy dynamics are introduced to improve the model sensitivity in the prediction phase especially when strong external perturbations are exerted. The validity of reduced-order models for predicting statistical responses and intermittency is demonstrated on a series of instructive models with increasing complexity, including the stochastic triad model, the Lorenz '96 model, and models for barotropic and baroclinic turbulence. The skillful low-order modeling methods developed here should also be useful for other applications such as efficient algorithms for data assimilation.
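
    Of the test models listed above, the Lorenz '96 model is the simplest to reproduce; a minimal integrator (a generic sketch, not the paper's code) with a crude statistical-energy diagnostic:

```python
# Lorenz '96 model on a ring of n sites:
#   dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
def l96_rhs(x, forcing=8.0):
    n = len(x)
    return [(x[(i + 1) % n] - x[(i - 2) % n]) * x[(i - 1) % n] - x[i] + forcing
            for i in range(n)]

def rk4_step(x, dt=0.01):
    def add(a, b, h):
        return [ai + h * bi for ai, bi in zip(a, b)]
    k1 = l96_rhs(x)
    k2 = l96_rhs(add(x, k1, dt / 2))
    k3 = l96_rhs(add(x, k2, dt / 2))
    k4 = l96_rhs(add(x, k3, dt))
    return [xi + dt / 6 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

x = [8.0] * 40
x[19] += 0.01                 # small perturbation triggers the chaos
for _ in range(2000):         # integrate to t = 20 to pass transients
    x = rk4_step(x)
# simple "statistical energy" diagnostic: mean of x_i^2 / 2 over the ring
energy = sum(v * v for v in x) / (2 * len(x))
print(round(energy, 1))
```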

  8. An integrated cell-free metabolic platform for protein production and synthetic biology

    PubMed Central

    Jewett, Michael C; Calhoun, Kara A; Voloshin, Alexei; Wuu, Jessica J; Swartz, James R

    2008-01-01

    Cell-free systems offer a unique platform for expanding the capabilities of natural biological systems for useful purposes, i.e. synthetic biology. They reduce complexity, remove structural barriers, and do not require the maintenance of cell viability. Cell-free systems, however, have been limited by their inability to co-activate multiple biochemical networks in a single integrated platform. Here, we report the assessment of biochemical reactions in an Escherichia coli cell-free platform designed to activate natural metabolism, the Cytomim system. We reveal that central catabolism, oxidative phosphorylation, and protein synthesis can be co-activated in a single reaction system. Never before have these complex systems been shown to be simultaneously activated without living cells. The Cytomim system therefore promises to provide the metabolic foundation for diverse ab initio cell-free synthetic biology projects. In addition, we describe an improved Cytomim system with enhanced protein synthesis yields (up to 1200 mg/l in 2 h) and lower costs to facilitate production of protein therapeutics and biochemicals that are difficult to make in vivo because of their toxicity, complexity, or unusual cofactor requirements. PMID:18854819

  9. Analyzing system safety in lithium-ion grid energy storage

    NASA Astrophysics Data System (ADS)

    Rosewater, David; Williams, Adam

    2015-12-01

    As grid energy storage systems become more complex, it grows more difficult to design them for safe operation. This paper first reviews the properties of lithium-ion batteries that can produce hazards in grid scale systems. Then the conventional safety engineering technique Probabilistic Risk Assessment (PRA) is reviewed to identify its limitations in complex systems. To address this gap, new research is presented on the application of Systems-Theoretic Process Analysis (STPA) to a lithium-ion battery based grid energy storage system. STPA is anticipated to fill the gaps recognized in PRA for designing complex systems and hence be more effective or less costly to use during safety engineering. It was observed that STPA is able to capture causal scenarios for accidents not identified using PRA. Additionally, STPA enabled a more rational assessment of uncertainty (all that is not known) thereby promoting a healthy skepticism of design assumptions. We conclude that STPA may indeed be more cost effective than PRA for safety engineering in lithium-ion battery systems. However, further research is needed to determine if this approach actually reduces safety engineering costs in development, or improves industry safety standards.

  10. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    NASA Astrophysics Data System (ADS)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER use, including fuel consumption, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient alternative of the fuel system using mathematical models, and the set of performance criteria have been developed in the main stages of the study. Results from the introduction of specific engineering solutions for developing on-site energy supply sources at RH processing facilities are provided.

  11. Designing for Cost

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Unal, Resit

    1991-01-01

    Designing for cost is a state of mind. Of course, a lot of technical knowledge is required and the use of appropriate tools will improve the process. Unfortunately, the extensive use of weight-based cost estimating relationships has generated a perception in the aerospace community that the primary way to reduce cost is to reduce weight. Wrong! Based upon an approximation of an industry-accepted formula, the PRICE H™ production equation, Dean demonstrated theoretically that the optimal trajectory for cost reduction is predominantly in the direction of system complexity reduction, not system weight reduction. Thus the phrase "keep it simple" is a primary state of mind required for reducing cost throughout the design process.
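
    Dean's point can be made concrete with a hypothetical power-law cost model (the exponents and values below are purely illustrative and are not the proprietary PRICE H equation): when cost is more sensitive to complexity than to weight, the steepest cost-reduction direction points mostly along the complexity axis.

```python
# Illustrative cost model: cost = k * weight^a * complexity^b, with b > a
# so that complexity dominates cost sensitivity (hypothetical exponents).
def cost(weight, complexity, a=0.5, b=2.0, k=1.0):
    return k * weight ** a * complexity ** b

def gradient(weight, complexity, a=0.5, b=2.0, k=1.0):
    c = cost(weight, complexity, a, b, k)
    return (a * c / weight, b * c / complexity)   # (dCost/dW, dCost/dX)

g_w, g_x = gradient(weight=100.0, complexity=5.0)
print(g_x / g_w)  # → 80.0: the complexity direction dominates the gradient
```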

  12. Adaptable dialog architecture and runtime engine (AdaRTE): a framework for rapid prototyping of health dialog systems.

    PubMed

    Rojas-Barahona, L M; Giorgino, T

    2009-04-01

    Spoken dialog systems have been increasingly employed to provide ubiquitous access via telephone to information and services for the non-Internet-connected public. They have been successfully applied in the health care context; however, speech technology requires a considerable development investment. The advent of VoiceXML reduced the proliferation of incompatible dialog formalisms, at the expense of adding even more complexity. This paper introduces a novel architecture for dialogue representation and interpretation, AdaRTE, which allows developers to lay out dialog interactions through a high-level formalism, offering both declarative and procedural features. AdaRTE's aim is to provide a ground for deploying complex and adaptable dialogs whilst allowing experimentation and incremental adoption of innovative speech technologies. It enhances augmented transition networks with dynamic behavior, and drives multiple back-end realizers, including VoiceXML. It has been especially targeted to the health care context, because of the great scale and the need for reducing the barrier to a widespread adoption of dialog systems.
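
    The augmented transition networks that AdaRTE builds on can be sketched as a guarded state machine with register-setting actions attached to transitions (a toy illustration; this is not AdaRTE's actual formalism or syntax):

```python
# Toy augmented transition network: each state lists transitions as
# (condition on the user's answer, action on the registers, next state).
network = {
    "ask_symptom": [
        (lambda ans: ans in ("fever", "cough"),
         lambda regs, ans: regs.__setitem__("symptom", ans), "ask_days"),
        (lambda ans: True, lambda regs, ans: None, "ask_symptom"),  # reprompt
    ],
    "ask_days": [
        (lambda ans: ans.isdigit(),
         lambda regs, ans: regs.__setitem__("days", int(ans)), "done"),
        (lambda ans: True, lambda regs, ans: None, "ask_days"),     # reprompt
    ],
}

def run_dialog(answers):
    state, regs = "ask_symptom", {}
    for ans in answers:
        if state == "done":
            break
        for cond, action, nxt in network[state]:
            if cond(ans):               # first matching guard wins
                action(regs, ans)       # the "augmentation": set registers
                state = nxt
                break
    return state, regs

print(run_dialog(["cough", "3"]))  # → ('done', {'symptom': 'cough', 'days': 3})
```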

  13. Learning to manage complexity through simulation: students' challenges and possible strategies.

    PubMed

    Gormley, Gerard J; Fenwick, Tara

    2016-06-01

    Many have called for medical students to learn how to manage complexity in healthcare. This study examines the nuances of students' challenges in coping with a complex simulation learning activity, using concepts from complexity theory, and suggests strategies to help them better understand and manage complexity. Wearing video glasses, participants took part in a simulation ward-based exercise that incorporated characteristics of complexity. Video footage was used to elicit interviews, which were transcribed. Using complexity theory as a theoretical lens, an iterative approach was taken to identify the challenges that participants faced and possible coping strategies, using both interview transcripts and video footage. Students' challenges in coping with clinical complexity included being: a) unprepared for 'diving in', b) caught in an escalating system, c) captured by the patient, and d) unable to assert boundaries of acceptable practice. Many characteristics of complexity can be recreated in a ward-based simulation learning activity, affording learners an embodied and immersive experience of these complexity challenges. Possible strategies for managing complexity include: a) taking time to size up the system, b) attuning to what emerges, c) reducing complexity, d) boundary practices, and e) working with uncertainty. This study signals pedagogical opportunities for recognizing and dealing with complexity.

  14. The effect of pH and triethanolamine on sulfisoxazole complexation with hydroxypropyl-beta-cyclodextrin.

    PubMed

    Gladys, Granero; Claudia, Garnero; Marcela, Longhi

    2003-11-01

    A novel complexation of sulfisoxazole with hydroxypropyl-beta-cyclodextrin (HP-beta-CD) was studied. Two systems were used: binary complexes prepared with HP-beta-CD, and a multicomponent system combining HP-beta-CD with the basic compound triethanolamine (TEA). Inclusion complex formation in aqueous solutions and in the solid state was investigated by the solubility method, thermal analysis (differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA)), Fourier-transform infrared spectroscopy (FT-IR) and dissolution studies. The solid complexes of sulfisoxazole were prepared by freeze-drying the homogeneous concentrated aqueous solutions in molar ratios of sulfisoxazole:HP-beta-CD 1:1 and 1:2, and sulfisoxazole:TEA:HP-beta-CD 1:1:2. FT-IR and thermal analysis showed differences among sulfisoxazole:HP-beta-CD and sulfisoxazole:TEA:HP-beta-CD and their corresponding physical mixtures and individual components. The HP-beta-CD solubilization of sulfisoxazole could be improved by ionization of the drug molecule through pH adjustments. However, larger improvements in HP-beta-CD solubilization are obtained when multicomponent systems are used, making it possible to reduce the amount of CD needed to prepare the target formulation.

  15. Principal process analysis of biological models.

    PubMed

    Casagranda, Stefano; Touzeau, Suzanne; Ropers, Delphine; Gouzé, Jean-Luc

    2018-06-14

    Understanding the dynamical behaviour of biological systems is challenged by their large number of components and interactions. While efforts have been made in this direction to reduce model complexity, they often prove insufficient to grasp which and when model processes play a crucial role. Answering these questions is fundamental to unravel the functioning of living organisms. We design a method for dealing with model complexity, based on the analysis of dynamical models by means of Principal Process Analysis. We apply the method to a well-known model of circadian rhythms in mammals. The knowledge of the system trajectories allows us to decompose the system dynamics into processes that are active or inactive with respect to a certain threshold value. Process activities are graphically represented by Boolean and Dynamical Process Maps. We detect model processes that are always inactive, or inactive on some time interval. Eliminating these processes reduces the complex dynamics of the original model to the much simpler dynamics of the core processes, in a succession of sub-models that are easier to analyse. We quantify by means of global relative errors the extent to which the simplified models reproduce the main features of the original system dynamics and apply global sensitivity analysis to test the influence of model parameters on the errors. The results obtained prove the robustness of the method. The analysis of the sub-model dynamics allows us to identify the source of circadian oscillations. We find that the negative feedback loop involving proteins PER, CRY, CLOCK-BMAL1 is the main oscillator, in agreement with previous modelling and experimental studies. In conclusion, Principal Process Analysis is a simple-to-use method, which constitutes an additional and useful tool for analysing the complex dynamical behaviour of biological systems.
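
    The core decomposition step of Principal Process Analysis can be sketched in a few lines: given a simulated trajectory and the individual process terms whose sum gives the model's right-hand side, each process is flagged active at each time point when its relative weight exceeds a threshold, yielding the Boolean process map the abstract mentions. The two-process toy model below is invented for illustration and is not the circadian model from the paper.

```python
import numpy as np

def process_activity(processes, trajectory, times, delta=0.1):
    """For each time point, compute the relative weight of each process
    term f_i(x, t) and flag it active if its share exceeds delta."""
    weights = np.array([[abs(f(x, t)) for f in processes]
                        for x, t in zip(trajectory, times)])
    totals = weights.sum(axis=1, keepdims=True)
    totals[totals == 0] = 1.0          # avoid division by zero
    return weights / totals > delta    # Boolean process map (time x process)

# Toy model: dx/dt = synthesis + degradation, with a known trajectory.
synthesis   = lambda x, t: 2.0
degradation = lambda x, t: -0.5 * x
times = np.linspace(0, 10, 5)
traj  = 4.0 * (1 - np.exp(-0.5 * times))   # analytic solution with x(0) = 0
active = process_activity([synthesis, degradation], traj, times)
```

    At t = 0 only synthesis is active; near steady state both processes are, which is exactly the kind of time-windowed activity pattern used to build reduced sub-models.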

  16. On the dimension of complex responses in nonlinear structural vibrations

    NASA Astrophysics Data System (ADS)

    Wiebe, R.; Spottswood, S. M.

    2016-07-01

    The ability to accurately model engineering systems under extreme dynamic loads would prove a major breakthrough in many aspects of aerospace, mechanical, and civil engineering. Extreme loads frequently induce both nonlinearities and coupling which increase the complexity of the response and the computational cost of finite element models. Dimension reduction has recently gained traction and promises the ability to distill dynamic responses down to a minimal dimension without sacrificing accuracy. In this context, the dimensionality of a response is related to the number of modes needed in a reduced order model to accurately simulate the response. Thus, an important step is characterizing the dimensionality of complex nonlinear responses of structures. In this work, the dimensionality of the nonlinear response of a post-buckled beam is investigated. Significant detail is dedicated to carefully introducing the experiment, the verification of a finite element model, and the dimensionality estimation algorithm as it is hoped that this system may help serve as a benchmark test case. It is shown that with minor modifications, the method of false nearest neighbors can quantitatively distinguish between the response dimension of various snap-through, non-snap-through, random, and deterministic loads. The state-space dimension of the nonlinear system in question increased from 2 to 10 as the system response moved from simple, low-level harmonic to chaotic snap-through. Beyond the problem studied herein, the techniques developed will serve as a prescriptive guide in developing fast and accurate dimensionally reduced models of nonlinear systems, and eventually as a tool for adaptive dimension-reduction in numerical modeling.
The results are especially relevant in the aerospace industry for the design of thin structures such as beams, panels, and shells, which are all capable of spatio-temporally complex dynamic responses that are difficult and computationally expensive to model.
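
    A minimal version of the false-nearest-neighbors test reads as follows, under standard assumptions (uniform delay embedding, Euclidean distances). The tolerance `rtol` and the sine test signal are illustrative choices, not the modified method of the paper.

```python
import numpy as np

def false_nearest_neighbors(x, dim, tau=1, rtol=15.0):
    """Fraction of false nearest neighbors at embedding dimension `dim`.
    A neighbor is 'false' if adding the (dim+1)-th delay coordinate
    stretches its distance by more than `rtol` times."""
    n = len(x) - (dim + 1) * tau
    # delay vectors in dimension dim, plus the (dim+1)-th coordinate
    emb = np.array([x[i:i + dim * tau:tau] for i in range(n)])
    extra = x[np.arange(n) + dim * tau]
    false = 0
    for i in range(n):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                      # exclude the point itself
        j = np.argmin(d)                   # nearest neighbor in dim
        if abs(extra[i] - extra[j]) > rtol * max(d[j], 1e-12):
            false += 1
    return false / n

# A clean sine wave is fully unfolded by a low-dimensional embedding,
# so the false-neighbor fraction should already be near zero at dim = 2.
t = np.linspace(0, 20 * np.pi, 400)
fnn2 = false_nearest_neighbors(np.sin(t), dim=2, tau=5)
```

    In practice the fraction is computed for increasing `dim`; the dimension where it drops to (near) zero is taken as the response dimension, which is how the 2-to-10 range quoted above would be read off.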

  17. Using emergent order to shape a space society

    NASA Technical Reports Server (NTRS)

    Graps, Amara L.

    1993-01-01

    A fast-growing movement in the scientific community is reshaping the way that we view the world around us. The short-hand name for this movement is 'chaos'. Chaos is a science of the global, nonlinear nature of systems. The center of this set of ideas is that simple, deterministic systems can breed complexity: systems as complex as the human body, ecology, the mind or a human society. While it is true that simple laws can breed complexity, the other side is that complex systems can breed order. It is the latter that I will focus on in this paper. In the past, nonlinear was nearly synonymous with unsolvable because no general analytic solutions exist. Mathematically, an essential difference exists between linear and nonlinear systems. For linear systems, you just break up the complicated system into many simple pieces and patch together the separate solutions for each piece to form a solution to the full problem. In contrast, solutions to a nonlinear system cannot be added to form a new solution. The system must be treated in its full complexity. While it is true that no general analytical approach exists for reducing a complex system such as a society, it can be modeled. The technique involves a mathematical construct called phase space. In this space, stable structures can appear which I use as analogies for the stable structures that appear in a complex system such as an ecology, the mind or a society. The common denominator in all of these systems is that they rely on a process called feedback loops. Feedback loops link the microscopic (individual) parts to the macroscopic (global) parts. The key, then, in shaping a space society, is in effectively using feedback loops. This paper will illustrate how one can model a space society by using methods that chaoticists have developed over the last hundred years. And I will show that common threads exist in the modeling of biological, economical, philosophical, and sociological systems.

  18. Improving multi-objective reservoir operation optimization with sensitivity-informed dimension reduction

    NASA Astrophysics Data System (ADS)

    Chu, J.; Zhang, C.; Fu, G.; Li, Y.; Zhou, H.

    2015-08-01

    This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed method dramatically reduces the computational demands required for attaining high-quality approximations of optimal trade-off relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed dimension reduction and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform dimension reduction of optimization problems when solving complex multi-objective reservoir operation problems.
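
    The screening step can be illustrated with a Jansen-style estimate of total-order Sobol indices: decision variables whose index is near zero are fixed at nominal values, shrinking the optimization problem before the full search. The toy "performance" function below is invented for the sketch; the study applies Sobol's method to actual reservoir operation models.

```python
import numpy as np

def total_sobol_indices(f, n_vars, n_samples=2048, seed=0):
    """Jansen-style Monte Carlo estimate of total-order Sobol indices,
    used here purely as a screening tool for insensitive variables."""
    rng = np.random.default_rng(seed)
    A = rng.random((n_samples, n_vars))
    B = rng.random((n_samples, n_vars))
    fA = f(A)
    var = fA.var()
    st = np.empty(n_vars)
    for i in range(n_vars):
        AB = A.copy()
        AB[:, i] = B[:, i]                 # resample only variable i
        st[i] = np.mean((fA - f(AB)) ** 2) / (2 * var)
    return st

# Toy "reservoir performance": dominated by x0 and x1, x2 is insensitive.
f = lambda X: 5 * X[:, 0] + 3 * X[:, 1] + 0.01 * X[:, 2]
st = total_sobol_indices(f, 3)
keep = np.where(st > 0.05)[0]              # screened set of decision variables
```

    Here `keep` retains only the two sensitive variables, so the pre-conditioning search would optimize two decision variables instead of three; on real reservoir problems the abstract reports that only a small proportion of variables survive the screen.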

  19. Improving multi-objective reservoir operation optimization with sensitivity-informed problem decomposition

    NASA Astrophysics Data System (ADS)

    Chu, J. G.; Zhang, C.; Fu, G. T.; Li, Y.; Zhou, H. C.

    2015-04-01

    This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce the computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed problem decomposition dramatically reduces the computational demands required for attaining high quality approximations of optimal tradeoff relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed problem decomposition and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform problem decomposition when solving the complex multi-objective reservoir operation problems.

  20. 75 FR 77961 - Endangered and Threatened Wildlife and Plants; Revised Critical Habitat for Santa Ana Sucker

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-14

    ... integrated water system that contains and provides the appropriate quantity of coarse substrates such as... reduces water temperature during summer and fall months. Therefore, a complex and integrated stream system... water velocities to support successful spawning. Swift (2001, p. 26) considered that only the Rialto...

  1. Incorporating Flexibility in the Design of Repairable Systems - Design of Microgrids

    DTIC Science & Technology

    2014-01-01

    Pandey, Vijitashwa; Skowronska, Annette ...optimization of complex systems such as a microgrid is, however, computationally intensive. The problem is exacerbated if we must incorporate...flexibility in terms of allowing the microgrid architecture and its running protocol to change with time. To reduce the computational effort, this paper

  2. Emergence Processes up to Consciousness Using the Multiplicity Principle and Quantum Physics

    NASA Astrophysics Data System (ADS)

    Ehresmann, Andrée C.; Vanbremeersch, Jean-Paul

    2002-09-01

    Evolution is marked by the emergence of new objects and interactions. Pursuing our preceding work on Memory Evolutive Systems (MES; cf. our Internet site), we propose a general mathematical model for this process, based on Category Theory. Its main characteristic is the Multiplicity Principle (MP), which asserts the existence of complex objects with several possible configurations. The MP entails the emergence of non-reducible, more and more complex objects (emergentist reductionism). From the laws of Quantum Physics, it follows that the MP is valid for the category of particles and atoms, hence, by complexification, for any natural autonomous anticipatory complex system, such as biological systems up to neural systems, or social systems. Applying the model to the MES of neurons, we describe the emergence of higher and higher cognitive processes and of a semantic memory. Consciousness is characterized by the development of a permanent `personal' memory, the archetypal core, which allows the formation of extended landscapes with an integration of the temporal dimensions.

  3. System for decision analysis support on complex waste management issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shropshire, D.E.

    1997-10-01

    A software system called the Waste Flow Analysis has been developed and applied to complex environmental management processes for the United States Department of Energy (US DOE). The system can evaluate proposed methods of waste retrieval, treatment, storage, transportation, and disposal. Analysts can evaluate various scenarios to see the impacts to waste flows and schedules, costs, and health and safety risks. Decision analysis capabilities have been integrated into the system to help identify preferred alternatives based on specific objectives; objectives may be to maximize the waste moved to final disposition during a given time period, minimize health risks, minimize costs, or combinations of objectives. The decision analysis capabilities can support evaluation of large and complex problems rapidly, and under conditions of variable uncertainty. The system is being used to evaluate environmental management strategies to safely disposition wastes in the next ten years and reduce the environmental legacy resulting from nuclear material production over the past forty years.

  4. NASA Information Technology Implementation Plan

    NASA Technical Reports Server (NTRS)

    2000-01-01

    NASA's Information Technology (IT) resources and IT support continue to be a growing and integral part of all NASA missions. Furthermore, the growing IT support requirements are becoming more complex and diverse. The following are a few examples of the growing complexity and diversity of NASA's IT environment. NASA is conducting basic IT research in the Intelligent Synthesis Environment (ISE) and Intelligent Systems (IS) Initiatives. IT security, infrastructure protection, and privacy of data are requiring more and more management attention and an increasing share of the NASA IT budget. Outsourcing of IT support is becoming a key element of NASA's IT strategy as exemplified by Outsourcing Desktop Initiative for NASA (ODIN) and the outsourcing of NASA Integrated Services Network (NISN) support. Finally, technology refresh is helping to provide improved support at lower cost. Recently the NASA Automated Data Processing (ADP) Consolidation Center (NACC) upgraded its bipolar technology computer systems with Complementary Metal Oxide Semiconductor (CMOS) technology systems. This NACC upgrade substantially reduced the hardware maintenance and software licensing costs, significantly increased system speed and capacity, and reduced customer processing costs by 11 percent.

  5. AdaRTE: adaptable dialogue architecture and runtime engine. A new architecture for health-care dialogue systems.

    PubMed

    Rojas-Barahona, L M; Giorgino, T

    2007-01-01

    Spoken dialogue systems have been increasingly employed to provide ubiquitous automated access via telephone to information and services for the non-Internet-connected public. In the health care context, dialogue systems have been successfully applied. Nevertheless, speech-based technology is not easy to implement because it requires a considerable development investment. The advent of VoiceXML for voice applications contributed to reducing the proliferation of incompatible dialogue interpreters, but introduced new complexity. As a response to these issues, we designed an architecture for dialogue representation and interpretation, AdaRTE, which allows developers to lay out dialogue interactions through a high-level formalism that offers both declarative and procedural features. AdaRTE's aim is to provide a ground for deploying complex and adaptable dialogues whilst allowing the experimentation and incremental adoption of innovative speech technologies. It provides the dynamic behavior of Augmented Transition Networks and enables the generation of different back-end formats such as VoiceXML. It is especially targeted to the health care context, where a framework for easy dialogue deployment could reduce the barrier to a more widespread adoption of dialogue systems.

  6. A case study of quality improvement methods for complex adaptive systems applied to an academic hepatology program.

    PubMed

    Fontanesi, John; Martinez, Anthony; Boyo, Toritsesan O; Gish, Robert

    2015-01-01

    Although demands for greater access to hepatology services that are less costly and achieve better outcomes have led to numerous quality improvement initiatives, traditional quality management methods may be inappropriate for hepatology. We empirically tested a model for conducting quality improvement in an academic hepatology program using methods developed to analyze and improve complex adaptive systems. We achieved a 25% increase in volume using 15% more clinical sessions with no change in staff or faculty FTEs, generating a positive margin of 50%. Wait times for next available appointments were reduced from five months to two weeks; unscheduled appointment slots dropped from 7% to less than 1%; "no-show" rates dropped to less than 10%; Press-Ganey scores increased to the 100th percentile. We conclude that framing hepatology as a complex adaptive system may improve our understanding of the complex, interdependent actions required to improve quality of care, patient satisfaction, and cost-effectiveness.

  7. Research and application of embedded real-time operating system

    NASA Astrophysics Data System (ADS)

    Zhang, Bo

    2013-03-01

    In this paper, based on an analysis of existing embedded real-time operating systems, the architecture of an operating system is designed and implemented. The experimental results show that the design fully complies with the requirements of an embedded real-time operating system, and can achieve the purposes of reducing the complexity of embedded software design while improving maintainability, reliability, and flexibility. Therefore, this design has high practical value.

  8. Robustness of self-organised systems to changes in behaviour: an example from real and simulated self-organised snail aggregations.

    PubMed

    Stafford, Richard; Williams, Gray A; Davies, Mark S

    2011-01-01

    Group or population level self-organised systems comprise many individuals displaying group-level emergent properties. Current theory indicates that individual-level behaviours have an effect on the final group-level behaviour; that is, self-organised systems are sensitive to small changes in individual behaviour. Here we examine a self-organised behaviour in relation to environmentally-driven individual-level changes in behaviour, using both natural systems and computer simulations. We demonstrate that aggregations of intertidal snails slightly decrease in size when, owing to hotter and more desiccating conditions, individuals forage for shorter periods--a seemingly non-adaptive behaviour for the snails since aggregation reduces desiccation stress. This decrease, however, only occurs in simple experimental systems (and simulations of these systems). When studied in their natural and more complex environment, and simulations of such an environment, using the same reduced foraging time, no difference in aggregation behaviour was found between hot and cool days. These results give an indication of how robust self-organised systems are to changes in individual-level behaviour. The complexity of the natural environment and the interactions of individuals with this environment, therefore, can result in self-organised systems being more resilient to individual-level changes than previously assumed.

  9. Preprocessing Inconsistent Linear System for a Meaningful Least Squares Solution

    NASA Technical Reports Server (NTRS)

    Sen, Syamal K.; Shaykhian, Gholam Ali

    2011-01-01

    Mathematical models of many physical/statistical problems are systems of linear equations. Due to measurement and possible human errors/mistakes in modeling/data, as well as due to certain assumptions to reduce complexity, inconsistency (contradiction) is injected into the model, viz. the linear system. While any inconsistent system, irrespective of the degree of inconsistency, always has a least-squares solution, one needs to check whether an equation is too inconsistent or, equivalently, too contradictory. Such an equation will affect/distort the least-squares solution to such an extent that renders it unacceptable/unfit to be used in a real-world application. We propose an algorithm which (i) prunes numerically redundant linear equations from the system as these do not add any new information to the model, (ii) detects contradictory linear equations along with their degree of contradiction (inconsistency index), (iii) removes those equations presumed to be too contradictory, and then (iv) obtains the minimum norm least-squares solution of the acceptably inconsistent reduced linear system. The algorithm, presented in Matlab, reduces the computational and storage complexities and also improves the accuracy of the solution. It also provides the necessary warning about the existence of too much contradiction in the model. In addition, we suggest a thorough relook into the mathematical modeling to determine the reason why unacceptable contradiction has occurred, thus prompting us to make necessary corrections/modifications to the models - both mathematical and, if necessary, physical.
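
    A hedged sketch of the four steps might look as follows, in Python rather than Matlab. The median-scaled residual used here as the "inconsistency index" and the cutoff `z_cut` are stand-ins, since the paper's exact index and thresholds are not given in the abstract.

```python
import numpy as np

def preprocess_lstsq(A, b, z_cut=3.0):
    """(i) prune duplicate (redundant) equations, (ii) score each remaining
    equation's residual against a first-pass fit, (iii) drop equations whose
    score is a gross outlier, (iv) return the minimum-norm least-squares
    solution of the reduced system via the pseudoinverse."""
    aug = np.unique(np.column_stack([A, b]), axis=0)   # (i) drop exact duplicates
    A2, b2 = aug[:, :-1], aug[:, -1]
    r = np.abs(A2 @ (np.linalg.pinv(A2) @ b2) - b2)    # (ii) residual per equation
    mask = r / (np.median(r) + 1e-12) <= z_cut         # (iii) flag contradictions
    x = np.linalg.pinv(A2[mask]) @ b2[mask]            # (iv) minimum-norm solution
    return x, mask

# One unknown, seven equations: a duplicate copy of x = 1, five mutually
# consistent measurements near 1, and one gross contradiction x = 10.
A = np.ones((7, 1))
b = np.array([1.0, 1.0, 0.98, 0.99, 1.01, 1.02, 10.0])
x, mask = preprocess_lstsq(A, b)
```

    The duplicate equation is pruned, the x = 10 equation is rejected as too contradictory, and the remaining acceptably inconsistent system is solved, recovering x close to 1 instead of a solution dragged toward the outlier.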

  10. Preprocessing in Matlab Inconsistent Linear System for a Meaningful Least Squares Solution

    NASA Technical Reports Server (NTRS)

    Sen, Symal K.; Shaykhian, Gholam Ali

    2011-01-01

    Mathematical models of many physical/statistical problems are systems of linear equations. Due to measurement and possible human errors/mistakes in modeling/data, as well as due to certain assumptions to reduce complexity, inconsistency (contradiction) is injected into the model, viz. the linear system. While any inconsistent system, irrespective of the degree of inconsistency, always has a least-squares solution, one needs to check whether an equation is too inconsistent or, equivalently, too contradictory. Such an equation will affect/distort the least-squares solution to such an extent that renders it unacceptable/unfit to be used in a real-world application. We propose an algorithm which (i) prunes numerically redundant linear equations from the system as these do not add any new information to the model, (ii) detects contradictory linear equations along with their degree of contradiction (inconsistency index), (iii) removes those equations presumed to be too contradictory, and then (iv) obtains the minimum norm least-squares solution of the acceptably inconsistent reduced linear system. The algorithm, presented in Matlab, reduces the computational and storage complexities and also improves the accuracy of the solution. It also provides the necessary warning about the existence of too much contradiction in the model. In addition, we suggest a thorough relook into the mathematical modeling to determine the reason why unacceptable contradiction has occurred, thus prompting us to make necessary corrections/modifications to the models - both mathematical and, if necessary, physical.

  11. A Framework of Working Across Disciplines in Early Design and R&D of Large Complex Engineered Systems

    NASA Technical Reports Server (NTRS)

    McGowan, Anna-Maria Rivas; Papalambros, Panos Y.; Baker, Wayne E.

    2015-01-01

    This paper examines four primary methods of working across disciplines during R&D and early design of large-scale complex engineered systems such as aerospace systems. A conceptualized framework, called the Combining System Elements framework, is presented to delineate several aspects of cross-discipline and system integration practice. The framework is derived from a theoretical and empirical analysis of current work practices in actual operational settings and is informed by theories from organization science and engineering. The explanatory framework may be used by teams to clarify assumptions and associated work practices, which may reduce ambiguity in understanding diverse approaches to early systems research, development and design. The framework also highlights that very different engineering results may be obtained depending on work practices, even when the goals for the engineered system are the same.

  12. Recent advances in QM/MM free energy calculations using reference potentials☆

    PubMed Central

    Duarte, Fernanda; Amrein, Beat A.; Blaha-Nelson, David; Kamerlin, Shina C.L.

    2015-01-01

    Background Recent years have seen enormous progress in the development of methods for modeling (bio)molecular systems. This has allowed for the simulation of ever larger and more complex systems. However, as such complexity increases, the requirements needed for these models to be accurate and physically meaningful become more and more difficult to fulfill. The use of simplified models to describe complex biological systems has long been shown to be an effective way to overcome some of the limitations associated with this computational cost in a rational way. Scope of review Hybrid QM/MM approaches have rapidly become one of the most popular computational tools for studying chemical reactivity in biomolecular systems. However, the high cost involved in performing high-level QM calculations has limited the applicability of these approaches when calculating free energies of chemical processes. In this review, we present some of the advances in using reference potentials and mean field approximations to accelerate high-level QM/MM calculations. We present illustrative applications of these approaches and discuss challenges and future perspectives for the field. Major conclusions The use of physically-based simplifications has been shown to effectively reduce the cost of high-level QM/MM calculations. In particular, lower-level reference potentials enable one to reduce the cost of expensive free energy calculations, thus expanding the scope of problems that can be addressed. General significance As was already demonstrated 40 years ago, the usage of simplified models still allows one to obtain cutting-edge results with substantially reduced computational cost. This article is part of a Special Issue entitled Recent developments of molecular dynamics. PMID:25038480

  13. Reducing aberration effect of Fourier transform lens by modifying Fourier spectrum of diffractive optical element in beam shaping optical system.

    PubMed

    Zhang, Fang; Zhu, Jing; Song, Qiang; Yue, Weirui; Liu, Jingdan; Wang, Jian; Situ, Guohai; Huang, Huijie

    2015-10-20

    In general, Fourier transform lenses are considered ideal in the design algorithms of diffractive optical elements (DOEs). However, the inherent aberrations of a real Fourier transform lens disturb the far-field pattern. The difference between the generated pattern and the expected design will impact the system performance. Therefore, a method is proposed for modifying the Fourier spectrum of DOEs, without introducing other optical elements, to reduce the aberration effect of the Fourier transform lens. By applying this method, beam shaping performance is improved markedly for an optical system with a real Fourier transform lens. Experiments carried out with a commercial Fourier transform lens give evidence for this method. The method is capable of reducing system complexity as well as improving its performance.

  14. Absorption spectroscopic studies of Np(IV) complexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, D. T.

    2004-01-01

    The complexation of neptunium (IV) with selected inorganic and organic ligands was studied as part of an investigation to establish key subsurface interactions between neptunium and biological systems. The prevalence of reducing environments in most subsurface migration scenarios, which are in many cases induced by biological activity, has increased the role and importance of Np(IV) as a key subsurface neptunium oxidation state. The biodegradation of larger organics that often coexist with actinides in the subsurface leads to the formation of many organic acids as transient products that, by complexation, play a key role in defining the fate and speciation of neptunium in biologically active systems. These often compete with inorganic complexes, e.g. hydrolysis and phosphate. Herein we report the results of a series of complexation studies based on new band formation relative to the characteristic 960 nm band of Np(IV). Formation constants for Np(IV) complexes with phosphate, hydrolysis, succinate, acetohydroxamic acid, and acetate were determined. These results show the 960 nm absorption band to be very amenable to these types of complexation studies.

  15. Verification of Space Station Secondary Power System Stability Using Design of Experiment

    NASA Technical Reports Server (NTRS)

    Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce

    1998-01-01

    This paper describes analytical methods used in the verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative-resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been used extensively to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, as well as about various operating scenarios and the identification of those with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples applying DoE to the analysis and verification of the ISS power system are provided.
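
The run-count savings DoE provides can be illustrated with a two-level fractional factorial design. The sketch below is an assumption for illustration (`half_fraction` is a hypothetical helper, not the ISS analysis tooling): it builds a half-fraction design in which the last factor is aliased with the product of the others, halving the number of simulation runs.

```python
import math
from itertools import product

def half_fraction(k):
    """Two-level half-fraction factorial design for k factors.

    Uses 2**(k-1) runs instead of 2**k: the last factor is set to the
    product of the others (defining relation I = AB...K), a standard
    way DoE cuts the number of runs needed to screen a system."""
    runs = []
    for base in product((-1, 1), repeat=k - 1):
        runs.append(base + (math.prod(base),))
    return runs

design = half_fraction(4)  # 8 runs instead of 2**4 = 16
```

For a power system with dozens of converter parameters, the same idea scales to much larger factorial screens, which is where the bulk of the computer-run savings comes from.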

  16. Protein glycosylation in diverse cell systems: implications for modification and analysis of recombinant proteins.

    PubMed

    Brooks, Susan A

    2006-06-01

    A major challenge for the biotechnology industry is to engineer the glycosylation pathways of expression systems to synthesize recombinant proteins with human glycosylation. Inappropriate glycosylation can result in reduced activity, limited half-life in circulation and unwanted immunogenicity. In this review, the complexities of glycosylation in human cells are explained and compared with glycosylation in bacteria, yeasts, fungi, insects, plants and nonhuman mammalian species. Key advances in the engineering of the glycosylation of expression systems are highlighted. Advances in the challenging and technically complex field of glycan analysis are also described. The emergence of a new generation of expression systems with sophisticated engineering for humanized glycosylation of glycoproteins appears to be on the horizon.

  17. Environmental Assessment for Vandenberg Gate Complex Construction, Dorm Construction and Demolition at Hanscom Air Force Base, Massachusetts

    DTIC Science & Technology

    2014-12-01

    MA January 2015 3 There are no existing underground stormwater drains in the area of the new Vandenberg Gate Complex. The addition of ... stormwater management systems that utilize the pervious landscape, vegetative filtration, sediment removal, infiltration via bioswales, deep sump ... Airmen Dormitory construction, a base-wide stormwater standard requires redevelopment projects to reduce stormwater rate and volume by 10% over the ...

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The four-dimensional scattering function S(Q,ω) obtained by inelastic neutron scattering measurements provides unique "dynamical fingerprints" of the spin states and interactions present in complex magnetic materials. Extracting this information, however, is currently a slow and complex process that may take an expert, depending on the complexity of the system, up to several weeks of painstaking work to complete. Spin Wave Genie was created to abstract and automate this process. It strives both to reduce the time to complete this analysis and to make these calculations more accessible to a broader group of scientists and engineers.

  19. Reduced-rank technique for joint channel estimation in TD-SCDMA systems

    NASA Astrophysics Data System (ADS)

    Kamil Marzook, Ali; Ismail, Alyani; Mohd Ali, Borhanuddin; Sali, Adawati; Khatun, Sabira

    2013-02-01

    In time division-synchronous code division multiple access (TD-SCDMA) systems, increasing the system capacity by inserting the largest possible number of users in one time slot (TS) requires additional estimation processes to estimate the joint channel matrix for the whole system. The increase in the number of channel parameters due to the increase in the number of users in one TS directly affects the precision of the estimator's performance. This article presents a novel low-complexity channel estimation method that relies on reducing the rank order of the total channel matrix H. The proposed method exploits the rank deficiency of H to reduce the number of parameters that characterise this matrix. The adopted reduced-rank technique is based on the truncated singular value decomposition algorithm. The algorithms for reduced-rank joint channel estimation (JCE) are derived and compared against traditional full-rank JCEs: least squares (LS, or Steiner) and enhanced (LS or MMSE) algorithms. Simulation results for the normalised mean square error showed the superiority of reduced-rank estimators. In addition, the channel impulse responses found by the reduced-rank estimator for all active users offer considerable performance improvement over the conventional estimator along the channel window length.
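
The core of the reduced-rank technique is truncated SVD: keeping only the dominant singular values of the joint channel matrix H, so far fewer parameters need to be estimated. A minimal NumPy sketch on synthetic data (not the TD-SCDMA estimator itself):

```python
import numpy as np

def truncated_svd(H, r):
    """Rank-r approximation of a channel matrix H via truncated SVD.
    Keeping only the r largest singular values cuts the parameters
    characterising H from H.size down to r * (rows + cols) values."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

rng = np.random.default_rng(0)
# Synthetic rank-deficient "joint channel" matrix (true rank 3):
H = rng.standard_normal((16, 3)) @ rng.standard_normal((3, 16))
H_r = truncated_svd(H, 3)
err = np.linalg.norm(H - H_r) / np.linalg.norm(H)  # ~0 for an exact-rank-3 matrix
```

When H is only approximately rank-deficient, as for a real channel, the truncation trades a small modeling error for a large reduction in estimation complexity.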

  20. Representing Operational Modes for Situation Awareness

    NASA Astrophysics Data System (ADS)

    Kirchhübel, Denis; Lind, Morten; Ravn, Ole

    2017-01-01

    Operating complex plants is an increasingly demanding task for human operators. Diagnosis of and reaction to on-line events requires the interpretation of real-time data. Vast amounts of sensor data, as well as operational knowledge about the state and design of the plant, are necessary to deduce reasonable reactions to abnormal situations. Intelligent computational support tools can make the operator's task easier, but they require knowledge about the overall system in the form of some model. While tools for fault-tolerant control design based on physical principles and relations are valuable for designing robust systems, the models become too complex when considering interactions at a plant-wide level. The alarm systems meant to support human operators in diagnosing the plant-wide situation, on the other hand, fail regularly in situations where these system interactions lead to many related alarms, overloading the operator with alarm floods. Functional modelling can provide a middle way to reduce the complexity of plant-wide models by abstracting from physical details to more general functions and behaviours. Based on functional models, the propagation of failures through the interconnected systems can be inferred, and alarm floods can potentially be reduced to their root cause. However, the desired behaviour of a complex system changes due to operating procedures that require more than one physical and functional configuration. In this paper, a consistent representation of possible configurations is deduced from the analysis of an exemplary start-up procedure using functional models. The proposed interpretation of the modelling concepts simplifies the functional modelling of distinct modes. The analysis further reveals relevant links between the quantitative sensor data and the qualitative perspective of the diagnostics tool based on functional models. This will form the basis for the ongoing development of a novel real-time diagnostics system based on the on-line adaptation of the underlying MFM model.

  1. Health technology assessment review: Computerized glucose regulation in the intensive care unit - how to create artificial control

    PubMed Central

    2009-01-01

    Current care guidelines recommend glucose control (GC) in critically ill patients. To achieve GC, many ICUs have implemented a (nurse-based) protocol on paper. However, such protocols are often complex and time-consuming, and can cause iatrogenic hypoglycemia. Computerized glucose regulation protocols may improve patient safety, efficiency, and nurse compliance. Such computerized clinical decision support systems (CDSSs) use more complex logic to provide an insulin infusion rate based on previous blood glucose levels and other parameters. A computerized CDSS for glucose control has the potential to reduce overall workload, reduce the chance of human cognitive failure, and improve glucose control. Several computer-assisted glucose regulation programs have been published recently. In order of increasing complexity, the three main types of algorithms used are computerized flowcharts, Proportional-Integral-Derivative (PID) control, and Model Predictive Control (MPC). PID is essentially a closed-loop feedback system, whereas MPC models the behavior of glucose and insulin in ICU patients. Although the best approach has not yet been determined, it should be noted that PID controllers are generally thought to be more robust than MPC systems. The computerized CDSSs that are most likely to emerge are those that are fully part of the routine workflow, use patient-specific characteristics, and apply variable sampling intervals. PMID:19849827
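
Of the three algorithm families mentioned, PID is the simplest to sketch. The toy implementation below is illustrative only: the gains, setpoint, and units are hypothetical placeholders with no clinical validity, and a real CDSS adds safety limits and patient-specific tuning.

```python
class PID:
    """Minimal discrete PID controller mapping a glucose reading to a
    non-negative insulin infusion rate. All numbers are illustrative."""

    def __init__(self, kp, ki, kd, setpoint, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint, self.dt = setpoint, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, glucose):
        error = glucose - self.setpoint  # above target -> more insulin
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        rate = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(0.0, rate)  # infusion rate cannot be negative

pid = PID(kp=0.05, ki=0.001, kd=0.1, setpoint=100.0, dt=5.0)
rate = pid.update(140.0)  # hyperglycemic reading -> positive infusion rate
```

MPC, by contrast, would replace the fixed feedback law with a patient model and an optimization over future insulin doses.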

  2. On the complexity of search for keys in quantum cryptography

    NASA Astrophysics Data System (ADS)

    Molotkov, S. N.

    2016-03-01

    The trace distance is used as a security criterion in proofs of security of keys in quantum cryptography. Some authors have doubted that this criterion can be reduced to criteria used in classical cryptography. The following question is answered in this work. Let a quantum cryptography system provide an ε-secure key such that ½‖ρ_XE − ρ_U ⊗ ρ_E‖_1 < ε, which will be repeatedly used in classical encryption algorithms. To what extent does the ε-secure key reduce the number of search steps (guesswork) as compared to the use of ideal keys? A direct relation has been demonstrated between the complexity of exhaustive key search, which is one of the main security criteria in classical systems, and the trace distance used in quantum cryptography. Bounds for the minimum and maximum numbers of search steps for the determination of the actual key are presented.
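
The security criterion above is the trace distance ½‖ρ_XE − ρ_U ⊗ ρ_E‖_1. For small density matrices it can be computed directly from the eigenvalues of the Hermitian difference; a NumPy sketch on a generic two-state example (not the key-distribution states themselves):

```python
import numpy as np

def trace_distance(rho, sigma):
    """Trace distance (1/2)||rho - sigma||_1 between density matrices.
    For a Hermitian difference, the 1-norm is the sum of the absolute
    values of its eigenvalues."""
    eigs = np.linalg.eigvalsh(rho - sigma)
    return 0.5 * np.abs(eigs).sum()

# A maximally mixed qubit vs. the pure state |0><0|:
rho = np.eye(2) / 2
sigma = np.array([[1.0, 0.0], [0.0, 0.0]])
d = trace_distance(rho, sigma)  # 0.5
```

An ε-secure key is one whose joint state with the eavesdropper is within trace distance ε of an ideal uniform key decoupled from the eavesdropper.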

  3. Developing an eLearning tool formalizing in YAWL the guidelines used in a transfusion medicine service.

    PubMed

    Russo, Paola; Piazza, Miriam; Leonardi, Giorgio; Roncoroni, Layla; Russo, Carlo; Spadaro, Salvatore; Quaglini, Silvana

    2012-01-01

    Blood transfusion is a complex activity subject to a high risk of potentially fatal errors. The development and application of computer-based systems could help reduce the error rate, playing a fundamental role in improving the quality of care. This poster presents an eLearning tool, currently under development, that formalizes the guidelines of the transfusion process. This system, implemented in YAWL (Yet Another Workflow Language), will be used to train personnel in order to improve the efficiency of care and to reduce errors.

  4. On the existence of mosaic-skeleton approximations for discrete analogues of integral operators

    NASA Astrophysics Data System (ADS)

    Kashirin, A. A.; Taltykina, M. Yu.

    2017-09-01

    Exterior three-dimensional Dirichlet problems for the Laplace and Helmholtz equations are considered. By applying methods of potential theory, they are reduced to equivalent Fredholm boundary integral equations of the first kind, for which discrete analogues, i.e., systems of linear algebraic equations (SLAEs) are constructed. The existence of mosaic-skeleton approximations for the matrices of the indicated systems is proved. These approximations make it possible to reduce the computational complexity of an iterative solution of the SLAEs. Numerical experiments estimating the capabilities of the proposed approach are described.

  5. Digital adaptive optics line-scanning confocal imaging system.

    PubMed

    Liu, Changgeng; Kim, Myung K

    2015-01-01

    A digital adaptive optics line-scanning confocal imaging (DAOLCI) system is proposed by applying digital holographic adaptive optics to a digital form of the line-scanning confocal imaging system. In DAOLCI, each line scan is recorded by a digital hologram, which allows access to the complex optical field from one slice of the sample through digital holography. This complex optical field contains both the information of one slice of the sample and the optical aberration of the system, which can be sensed by a complex guide star hologram, thus allowing us to compensate for the effect of the optical aberration. After numerical aberration compensation, the corrected optical fields of a sequence of line scans are stitched into the final corrected confocal image. In DAOLCI, a numerical slit is applied to realize confocality at the sensor end. The width of this slit can be adjusted to control the image contrast and speckle noise for scattering samples. DAOLCI dispenses with hardware such as the Shack-Hartmann wavefront sensor and deformable mirror, and with the closed-loop feedback adopted in conventional adaptive optics confocal imaging systems, thus reducing optomechanical complexity and cost. Numerical simulations and proof-of-principle experiments are presented that demonstrate the feasibility of this idea.

  6. Resources for Systems Genetics.

    PubMed

    Williams, Robert W; Williams, Evan G

    2017-01-01

    A key characteristic of systems genetics is its reliance on populations that vary to a greater or lesser degree in genetic complexity, from highly admixed populations such as the Collaborative Cross and Diversity Outbred to relatively simple crosses such as sets of consomic strains and reduced complexity crosses. This protocol is intended to help investigators make more informed decisions about choices of resources given different types of questions. We consider factors such as cost, availability, and ease of breeding for common scenarios. In general, we recommend using complementary resources and minimizing the depth of resampling of any given genome or strain.

  7. Scalable quantum computation scheme based on quantum-actuated nuclear-spin decoherence-free qubits

    NASA Astrophysics Data System (ADS)

    Dong, Lihong; Rong, Xing; Geng, Jianpei; Shi, Fazhan; Li, Zhaokai; Duan, Changkui; Du, Jiangfeng

    2017-11-01

    We propose a novel theoretical scheme of quantum computation. Nuclear spin pairs are utilized to encode decoherence-free (DF) qubits. A nitrogen-vacancy center serves as a quantum actuator to initialize, read out, and coherently control the DF qubits. The realization of CNOT gates between two DF qubits is also presented. Numerical simulations show high fidelities for all these processes. Additionally, we discuss the potential for scalability. Our scheme reduces the challenge of classical interfaces from controlling and observing complex quantum systems down to a simple quantum actuator. It also provides a novel way to handle complex quantum systems.

  8. The antagonistic modulation of Arp2/3 activity by N-WASP, WAVE2 and PICK1 defines dynamic changes in astrocyte morphology

    PubMed Central

    Murk, Kai; Blanco Suarez, Elena M.; Cockbill, Louisa M. R.; Banks, Paul; Hanley, Jonathan G.

    2013-01-01

    Astrocytes exhibit a complex, branched morphology, allowing them to functionally interact with numerous blood vessels, neighboring glial processes and neuronal elements, including synapses. They also respond to central nervous system (CNS) injury by a process known as astrogliosis, which involves morphological changes, including cell body hypertrophy and thickening of major processes. Following severe injury, astrocytes exhibit drastically reduced morphological complexity and collectively form a glial scar. The mechanistic details behind these morphological changes are unknown. Here, we investigate the regulation of the actin-nucleating Arp2/3 complex in controlling dynamic changes in astrocyte morphology. In contrast to other cell types, Arp2/3 inhibition drives the rapid expansion of astrocyte cell bodies and major processes. This intervention results in a reduced morphological complexity of astrocytes both in dissociated culture and in brain slices. We show that this expansion requires functional myosin II downstream of ROCK and RhoA. Knockdown of the Arp2/3 subunit Arp3 or the Arp2/3 activator N-WASP by siRNA also results in cell body expansion and reduced morphological complexity, whereas depleting WAVE2 specifically reduces the branching complexity of astrocyte processes. By contrast, knockdown of the Arp2/3 inhibitor PICK1 increases astrocyte branching complexity. Furthermore, astrocyte expansion induced by ischemic conditions is delayed by PICK1 knockdown or N-WASP overexpression. Our findings identify a new morphological outcome for Arp2/3 activation in restricting rather than promoting outward movement of the plasma membrane in astrocytes. The Arp2/3 regulators PICK1, N-WASP and WAVE2 function antagonistically to control the complexity of astrocyte branched morphology, and this mechanism underlies the morphological changes seen in astrocytes during their response to pathological insult. PMID:23843614

  9. Dynamic Resectorization and Coordination Technology: An Evaluation of Air Traffic Control Complexity

    NASA Technical Reports Server (NTRS)

    Brinton, Christopher R.

    1996-01-01

    The work described in this report was done under contract with the National Aeronautics and Space Administration (NASA) to support the Advanced Air Transportation Technologies (AATT) program. The goal of this program is to contribute to and accelerate progress in advanced air transportation technologies. Wyndemere Incorporated is supporting this goal by studying the complexity of the Air Traffic Specialist's role in maintaining the safety of the air transportation system. It is envisioned that the implementation of Free Flight may significantly increase the complexity and difficulty of maintaining this safety. Wyndemere Incorporated is researching potential methods to reduce this complexity. This is the final report for the contract.

  10. Coherent optical monolithic phased-array antenna steering system

    DOEpatents

    Hietala, Vincent M.; Kravitz, Stanley H.; Vawter, Gregory A.

    1994-01-01

    An optical-based RF beam steering system for phased-array antennas comprising a photonic integrated circuit (PIC). The system is based on optical heterodyning employed to produce microwave phase shifting by a monolithic PIC constructed entirely of passive components. Microwave power and control signal distribution to the antenna is accomplished by optical fiber, permitting physical separation of the PIC and its control functions from the antenna. The system reduces size, weight, complexity, and cost of phased-array antenna systems.

  11. Performance study of large area encoding readout MRPC

    NASA Astrophysics Data System (ADS)

    Chen, X. L.; Wang, Y.; Chen, G.; Han, D.; Wang, X.; Zeng, M.; Zeng, Z.; Zhao, Z.; Guo, B.

    2018-02-01

    The muon tomography system built from 2-D readout, high-spatial-resolution Multi-gap Resistive Plate Chamber (MRPC) detectors is a project of Tsinghua University. An encoding readout method based on the fine-fine configuration has been used to minimize the number of readout electronic channels, thereby reducing the complexity and cost of the system. In this paper, we provide a systematic comparison of MRPC detector performance with and without fine-fine encoding readout. Our results suggest that applying the fine-fine encoding readout yields a detecting system with slightly worse spatial resolution but a dramatically reduced number of electronic channels.

  12. Development of a Comprehensive Digital Avionics Curriculum for the Aeronautical Engineer

    DTIC Science & Technology

    2006-03-01

    able to analyze and design aircraft and missile guidance and control systems, including feedback stabilization schemes and stochastic processes, using ... Uncertainty modeling for robust control; Robust closed-loop stability and performance; Robust H-infinity control; Robustness check using mu-analysis ... Controlled feedback (reduces noise) 3. Statistical group response (reduces pressure toward conformity) When used as a tool to study a complex problem ...

  13. Qualitative Fault Isolation of Hybrid Systems: A Structural Model Decomposition-Based Approach

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Daigle, Matthew; Roychoudhury, Indranil

    2016-01-01

    Quick and robust fault diagnosis is critical to ensuring safe operation of complex engineering systems. A large number of techniques are available to provide fault diagnosis in systems with continuous dynamics. However, many systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete behavioral modes, each with its own continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task computationally more complex due to the large number of possible system modes and the existence of autonomous mode transitions. This paper presents a qualitative fault isolation framework for hybrid systems based on structural model decomposition. The fault isolation is performed by analyzing the qualitative information of the residual deviations. However, in hybrid systems this process becomes complex due to the possible existence of observation delays, which can cause observed deviations to be inconsistent with the expected deviations for the current mode of the system. The great advantage of structural model decomposition is that (i) it allows the design of residuals that respond to only a subset of the faults, and (ii) every time a mode change occurs, only a subset of the residuals needs to be reconfigured, thus reducing the complexity of the reasoning process for isolation purposes. To demonstrate and test the validity of our approach, we use an electric circuit simulation as the case study.

  14. Coding response to a case-mix measurement system based on multiple diagnoses.

    PubMed

    Preyra, Colin

    2004-08-01

    To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post.

  15. Adaptive tracking for complex systems using reduced-order models

    NASA Technical Reports Server (NTRS)

    Carignan, Craig R.

    1990-01-01

    Reduced-order models are considered in the context of parameter adaptive controllers for tracking workspace trajectories. A dual-arm manipulation task is used to illustrate the methodology and provide simulation results. A parameter adaptive controller is designed to track a payload trajectory using a four-parameter model instead of the full-order, nine-parameter model. Several simulations with different payload-to-arm mass ratios are used to illustrate the capabilities of the reduced-order model in tracking the desired trajectory.

  16. Adaptive tracking for complex systems using reduced-order models

    NASA Technical Reports Server (NTRS)

    Carignan, Craig R.

    1990-01-01

    Reduced-order models are considered in the context of parameter adaptive controllers for tracking workspace trajectories. A dual-arm manipulation task is used to illustrate the methodology and provide simulation results. A parameter adaptive controller is designed to track the desired position trajectory of a payload using a four-parameter model instead of a full-order, nine-parameter model. Several simulations with different payload-to-arm mass ratios are used to illustrate the capabilities of the reduced-order model in tracking the desired trajectory.

  17. Multistate Memristive Tantalum Oxide Devices for Ternary Arithmetic

    PubMed Central

    Kim, Wonjoo; Chattopadhyay, Anupam; Siemon, Anne; Linn, Eike; Waser, Rainer; Rana, Vikas

    2016-01-01

    Redox-based resistive switching random access memory (ReRAM) offers excellent properties for implementing future non-volatile memory arrays. Recently, the capability of two-state ReRAMs to implement Boolean logic functionality has gained wide interest. Here, we report on seven-state tantalum oxide devices, which enable the realization of intrinsic modular arithmetic using a ternary number system. Modular arithmetic, a fundamental system for operating on numbers within the limit of a modulus, has been known to mathematicians since the days of Euclid and finds applications in diverse areas ranging from e-commerce to musical notation. We demonstrate that multistate devices not only reduce storage area consumption drastically, but also enable novel in-memory operations, such as computing with high-radix number systems, which could not be implemented using two-state devices. The use of a high-radix number system reduces computational complexity by reducing the number of digits needed; thus the number of calculation operations in an addition, and the number of logic devices, can be reduced. PMID:27834352

  18. Multistate Memristive Tantalum Oxide Devices for Ternary Arithmetic.

    PubMed

    Kim, Wonjoo; Chattopadhyay, Anupam; Siemon, Anne; Linn, Eike; Waser, Rainer; Rana, Vikas

    2016-11-11

    Redox-based resistive switching random access memory (ReRAM) offers excellent properties for implementing future non-volatile memory arrays. Recently, the capability of two-state ReRAMs to implement Boolean logic functionality has gained wide interest. Here, we report on seven-state tantalum oxide devices, which enable the realization of intrinsic modular arithmetic using a ternary number system. Modular arithmetic, a fundamental system for operating on numbers within the limit of a modulus, has been known to mathematicians since the days of Euclid and finds applications in diverse areas ranging from e-commerce to musical notation. We demonstrate that multistate devices not only reduce storage area consumption drastically, but also enable novel in-memory operations, such as computing with high-radix number systems, which could not be implemented using two-state devices. The use of a high-radix number system reduces computational complexity by reducing the number of digits needed; thus the number of calculation operations in an addition, and the number of logic devices, can be reduced.

  19. Multistate Memristive Tantalum Oxide Devices for Ternary Arithmetic

    NASA Astrophysics Data System (ADS)

    Kim, Wonjoo; Chattopadhyay, Anupam; Siemon, Anne; Linn, Eike; Waser, Rainer; Rana, Vikas

    2016-11-01

    Redox-based resistive switching random access memory (ReRAM) offers excellent properties for implementing future non-volatile memory arrays. Recently, the capability of two-state ReRAMs to implement Boolean logic functionality has gained wide interest. Here, we report on seven-state tantalum oxide devices, which enable the realization of intrinsic modular arithmetic using a ternary number system. Modular arithmetic, a fundamental system for operating on numbers within the limit of a modulus, has been known to mathematicians since the days of Euclid and finds applications in diverse areas ranging from e-commerce to musical notation. We demonstrate that multistate devices not only reduce storage area consumption drastically, but also enable novel in-memory operations, such as computing with high-radix number systems, which could not be implemented using two-state devices. The use of a high-radix number system reduces computational complexity by reducing the number of digits needed; thus the number of calculation operations in an addition, and the number of logic devices, can be reduced.
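
The digit-count reduction these abstracts claim for high-radix number systems is easy to quantify: writing the same value in base 3 instead of base 2 needs roughly log₃(2) ≈ 63% as many digits. A small sketch (`digits` is an illustrative helper, not part of the devices' logic):

```python
def digits(n, base):
    """Number of digits needed to write the non-negative integer n in `base`."""
    if n == 0:
        return 1
    count = 0
    while n:
        n //= base
        count += 1
    return count

n = 3 ** 9                  # 19683
binary_len = digits(n, 2)   # 15 binary digits
ternary_len = digits(n, 3)  # 10 ternary digits
```

Fewer digits per operand means fewer per-digit addition steps and fewer logic devices, which is the complexity argument the abstracts make.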

  20. Convergence analysis of the alternating RGLS algorithm for the identification of the reduced complexity Volterra model.

    PubMed

    Laamiri, Imen; Khouaja, Anis; Messaoud, Hassani

    2015-03-01

    In this paper we provide a convergence analysis of the alternating RGLS (Recursive Generalized Least Squares) algorithm used for the identification of the reduced-complexity Volterra model describing stochastic non-linear systems. The reduced Volterra model used is the 3rd-order SVD-PARAFAC-Volterra model, obtained using the Singular Value Decomposition (SVD) and the Parallel Factor (PARAFAC) tensor decomposition of the quadratic and cubic kernels, respectively, of the classical Volterra model. The alternating RGLS (ARGLS) algorithm consists of executing the classical RGLS algorithm in an alternating way. The ARGLS convergence was proved using the Ordinary Differential Equation (ODE) method. It is noted that the algorithm's convergence cannot be ensured when the disturbance acting on the system to be identified has specific features. The ARGLS algorithm is tested in simulations on a numerical example satisfying the determined convergence conditions. To assess the merits of the proposed algorithm, we compare it with the classical Alternating Recursive Least Squares (ARLS) algorithm presented in the literature. The comparison is carried out on a non-linear satellite channel and a benchmark CSTR (Continuous Stirred Tank Reactor) system. Moreover, the efficiency of the proposed identification approach is demonstrated on an experimental Communicating Two Tank System (CTTS). Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
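
The RGLS family builds on recursive least squares. The sketch below shows plain RLS, a simpler relative of the paper's alternating RGLS, shown only to illustrate the sample-by-sample recursive update; the Volterra/PARAFAC structure and the generalized noise model are omitted.

```python
import numpy as np

def rls_identify(phi_rows, y, lam=1.0):
    """Ordinary recursive least squares: updates the parameter estimate
    theta and covariance P one regressor/output sample at a time."""
    n = phi_rows.shape[1]
    theta = np.zeros(n)
    P = np.eye(n) * 1e6  # large initial covariance -> weak prior
    for phi, yk in zip(phi_rows, y):
        k = P @ phi / (lam + phi @ P @ phi)      # gain vector
        theta = theta + k * (yk - phi @ theta)   # correct by prediction error
        P = (P - np.outer(k, phi @ P)) / lam     # covariance update
    return theta

rng = np.random.default_rng(1)
true_theta = np.array([0.5, -0.3])
Phi = rng.standard_normal((200, 2))   # regressor rows
y = Phi @ true_theta                  # noise-free outputs
theta_hat = rls_identify(Phi, y)
```

ARGLS alternates runs of such recursive updates over the separate factor blocks of the reduced Volterra parameterization.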

  1. Sparsity enabled cluster reduced-order models for control

    NASA Astrophysics Data System (ADS)

    Kaiser, Eurika; Morzyński, Marek; Daviller, Guillaume; Kutz, J. Nathan; Brunton, Bingni W.; Brunton, Steven L.

    2018-01-01

    Characterizing and controlling nonlinear, multi-scale phenomena are central goals in science and engineering. Cluster-based reduced-order modeling (CROM) was introduced to exploit the underlying low-dimensional dynamics of complex systems. CROM builds a data-driven discretization of the Perron-Frobenius operator, resulting in a probabilistic model for ensembles of trajectories. A key advantage of CROM is that it embeds nonlinear dynamics in a linear framework, which enables the application of standard linear techniques to the nonlinear system. CROM is typically computed on high-dimensional data; however, access to and computations on this full-state data limit the online implementation of CROM for prediction and control. Here, we address this key challenge by identifying a small subset of critical measurements to learn an efficient CROM, referred to as sparsity-enabled CROM. In particular, we leverage compressive measurements to faithfully embed the cluster geometry and preserve the probabilistic dynamics. Further, we show how to identify fewer optimized sensor locations tailored to a specific problem that outperform random measurements. Both of these sparsity-enabled sensing strategies significantly reduce the burden of data acquisition and processing for low-latency in-time estimation and control. We illustrate this unsupervised learning approach on three different high-dimensional nonlinear dynamical systems from fluids with increasing complexity, with one application in flow control. Sparsity-enabled CROM is a critical facilitator for real-time implementation on high-dimensional systems where full-state information may be inaccessible.
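
The Perron-Frobenius discretization at the heart of CROM reduces, in practice, to counting transitions between cluster labels along a time-ordered trajectory. A minimal sketch with toy labels standing in for clustered flow snapshots:

```python
import numpy as np

def cluster_transition_matrix(labels, n_clusters):
    """Row-stochastic transition matrix between cluster labels of a
    time-ordered trajectory: the data-driven Perron-Frobenius
    discretization underlying CROM."""
    T = np.zeros((n_clusters, n_clusters))
    for a, b in zip(labels[:-1], labels[1:]):
        T[a, b] += 1  # count observed one-step transitions
    rows = T.sum(axis=1, keepdims=True)
    return np.divide(T, rows, out=np.zeros_like(T), where=rows > 0)

# Toy trajectory cycling deterministically through 3 clusters: 0 -> 1 -> 2 -> 0 ...
labels = np.array([0, 1, 2] * 50)
T = cluster_transition_matrix(labels, 3)
```

The sparsity-enabled variant in the abstract would assign labels from a few compressive measurements rather than from full-state snapshots, but the probabilistic model built from the labels is the same.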

  2. Complex agro-ecosystems for food security in a changing climate

    PubMed Central

    Khumairoh, Uma; Groot, Jeroen CJ; Lantinga, Egbert A

    2012-01-01

    Attempts to increase food crop yields by intensifying agricultural systems using high inputs of nonrenewable resources and chemicals frequently lead to degradation of natural resources, whereas most technological innovations are not accessible to smallholders, who represent the majority of farmers worldwide. Alternatively, cocultures consisting of assemblages of plant and animal species can support ecological processes of nutrient cycling and pest control, which may lead to increasing yields and declining susceptibility to extreme weather conditions as the complexity of the systems increases. Here we show that enhancing the complexity of a rice production system by adding combinations of compost, azolla, ducks, and fish resulted in strongly increased grain yields and revenues in a season with extremely adverse weather conditions in East Java, Indonesia. We found that azolla, ducks, and fish increased plant nutrient content, tillering, and leaf area expansion, and strongly reduced the density of six different pests. The most complex system, comprising all components, gave the highest grain yield. The net revenues of this system from sales of rice grain, fish, and ducks, after correction for extra costs, were 114% higher than those of rice cultivation with only compost as fertilizer. These results provide more insight into the agro-ecological processes and demonstrate how complex agricultural systems can contribute to food security in a changing climate. If smallholders can be trained to manage these systems and are supported for initial investments by credits, their livelihoods can be improved while producing in an ecologically benign way. PMID:22957173

  3. Using Agent-Based Modeling to Enhance System-Level Real-time Control of Urban Stormwater Systems

    NASA Astrophysics Data System (ADS)

    Rimer, S.; Mullapudi, A. M.; Kerkez, B.

    2017-12-01

    The ability to reduce combined-sewer overflow (CSO) events is an issue that challenges over 800 U.S. municipalities. When the volume of a combined sewer system or wastewater treatment plant is exceeded, untreated wastewater overflows (a CSO event) into nearby streams, rivers, or other water bodies, causing localized urban flooding and pollution. The likelihood and impact of CSO events have only been exacerbated by urbanization, population growth, climate change, aging infrastructure, and system complexity. Thus, there is an urgent need for urban areas to manage CSO events. Traditionally, mitigating CSO events has been carried out via time-intensive and expensive structural interventions such as retention basins or sewer separation, which reduce CSO events but are costly, arduous, and only provide a fixed solution to a dynamic problem. Real-time control (RTC) of urban drainage systems using sensor and actuator networks has served as an inexpensive and versatile alternative to traditional CSO intervention. In particular, retrofitting individual stormwater elements for sensing and automated active distributed control has been shown to significantly reduce the volume of discharge during CSO events, with some RTC models demonstrating a reduction upwards of 90% when compared to traditional passive systems. As more stormwater elements become retrofitted for RTC, system-level RTC across complete watersheds is an attainable possibility. However, when considering the diverse set of control needs of each of these individual stormwater elements, such system-level RTC becomes a far more complex problem. To address such diverse control needs, agent-based modeling is employed such that each individual stormwater element is treated as an autonomous agent with diverse decision-making capabilities. We present preliminary results and limitations of utilizing the agent-based modeling computational framework for the system-level control of diverse, interacting stormwater elements.

  4. Theoretical and software considerations for general dynamic analysis using multilevel substructured models

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1985-01-01

    The dynamic analysis of complex structural systems using the finite element method and multilevel substructured models is presented. The fixed-interface method is selected for substructure reduction because of its efficiency, accuracy, and adaptability to restart and reanalysis. This method is extended to the reduction of substructures which are themselves composed of reduced substructures. The implementation and performance of the method in a general-purpose software system is emphasized. Solution algorithms consistent with the chosen data structures are presented. It is demonstrated that successful finite element software requires the use of software executives to supplement the algorithmic language. The complexity of the implementation of restart and reanalysis procedures illustrates the need for executive systems to support the noncomputational aspects of the software. It is shown that significant computational efficiencies can be achieved through proper use of substructuring and reduction techniques without sacrificing solution accuracy. The restart and reanalysis capabilities and the flexible procedures for multilevel substructured modeling give economical yet accurate analyses of complex structural systems.

  5. Reducing Uncertainty in Transpiration Estimation in Wet Tropical Forests and Upscaling Sap Flux Measurements in Complex Heterogeneous Systems

    NASA Astrophysics Data System (ADS)

    Moore, G. W.; Aparecido, L. M. T.; Jaimes, A.

    2017-12-01

    High tree species and functional diversity, complex age and stand structure, deeper active sapwood, and potential factors that reduce transpiration, such as frequent cloud cover and wet leaves, are inherent in wet tropical forests. In the face of these unique challenges, advancements are needed for optimizing in situ measurement strategies to reduce uncertainties, in particular, within-tree and among-tree variation. Over a five-year period, we instrumented 44 trees with heat dissipation sap flow sensors within a premontane wet tropical rainforest in Costa Rica (5000 mm MAP). Sensors were systematically apportioned among overstory, midstory, and suppressed trees. In a subset of dominant trees, radial profiles across the full range of active xylem were fitted as deep as 16 cm. Given high diversity, few instrumented trees belonged to the same species, genus, or even family. Leaf surfaces were wet 20-80% of daylight hours from the top to the bottom of the canopy, respectively. As a result, transpiration was suppressed, even after accounting for lower vapor pressure deficit (<0.5 kPa) and reduced solar radiation (<500 W m-2). To the contrary, the driest month on record resulted in higher, not lower, transpiration. We identified multiple functional types according to patterns in dry season water use for the period February to April 2016, using Random Forest analysis to discriminate groups with unique temporal responses. These efforts are critical for improving global land surface models that increasingly partition canopy components within complex heterogeneous systems, and for improved accuracy of transpiration estimates in tropical forests.

  6. From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0

    PubMed Central

    Tononi, Giulio

    2014-01-01

    This paper presents Integrated Information Theory (IIT) of consciousness 3.0, which incorporates several advances over previous formulations. IIT starts from phenomenological axioms: information says that each experience is specific – it is what it is by how it differs from alternative experiences; integration says that it is unified – irreducible to non-interdependent components; exclusion says that it has unique borders and a particular spatio-temporal grain. These axioms are formalized into postulates that prescribe how physical mechanisms, such as neurons or logic gates, must be configured to generate experience (phenomenology). The postulates are used to define intrinsic information as “differences that make a difference” within a system, and integrated information as information specified by a whole that cannot be reduced to that specified by its parts. By applying the postulates both at the level of individual mechanisms and at the level of systems of mechanisms, IIT arrives at an identity: an experience is a maximally irreducible conceptual structure (MICS, a constellation of concepts in qualia space), and the set of elements that generates it constitutes a complex. According to IIT, a MICS specifies the quality of an experience and integrated information Φ^Max its quantity. From the theory follow several results, including: a system of mechanisms may condense into a major complex and non-overlapping minor complexes; the concepts that specify the quality of an experience are always about the complex itself and relate only indirectly to the external environment; anatomical connectivity influences complexes and associated MICS; a complex can generate a MICS even if its elements are inactive; simple systems can be minimally conscious; complicated systems can be unconscious; there can be true “zombies” – unconscious feed-forward systems that are functionally equivalent to conscious complexes. PMID:24811198

  7. Understanding scaling through history-dependent processes with collapsing sample space.

    PubMed

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2015-04-28

    History-dependent processes are ubiquitous in natural and social systems. Many such stochastic processes, especially those that are associated with complex systems, become more constrained as they unfold, meaning that their sample space, or their set of possible outcomes, reduces as they age. We demonstrate that these sample-space-reducing (SSR) processes necessarily lead to Zipf's law in the rank distributions of their outcomes. We show that by adding noise to SSR processes the corresponding rank distributions remain exact power laws, p(x) ~ x(-λ), where the exponent directly corresponds to the mixing ratio of the SSR process and noise. This allows us to give a precise meaning to the scaling exponent in terms of the degree to which a given process reduces its sample space as it unfolds. Noisy SSR processes further allow us to explain a wide range of scaling exponents in frequency distributions ranging from α = 2 to ∞. We discuss several applications showing how SSR processes can be used to understand Zipf's law in word frequencies, and how they are related to diffusion processes in directed networks, or aging processes such as in fragmentation processes. SSR processes provide a new alternative to understand the origin of scaling in complex systems without the recourse to multiplicative, preferential, or self-organized critical processes.
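
A minimal simulation of an SSR process makes the Zipf scaling concrete (the starting state N and the number of runs are illustrative): starting from state N and jumping uniformly to ever-lower states until reaching 1, the visit frequency of state x approaches p(x) ~ 1/x.

```python
import random

# Sample-space-reducing (SSR) process: from state x, jump uniformly to any
# state in {1, ..., x-1}; the set of possible outcomes shrinks as the
# process unfolds. Visit statistics over many runs follow Zipf's law.
random.seed(42)
N = 100
runs = 20000
visits = [0] * (N + 1)
for _ in range(runs):
    x = N
    while x > 1:
        x = random.randint(1, x - 1)  # sample space reduces each step
        visits[x] += 1

print(visits[1] / visits[2])  # ~2.0, as p(x) ~ 1/x predicts
print(visits[1] / visits[4])  # ~4.0
```

Mixing this process with uniform (noisy) jumps, as described in the paper, tilts the exponent away from 1 in proportion to the mixing ratio.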

  8. Cybersecurity in Hospitals: A Systematic, Organizational Perspective.

    PubMed

    Jalali, Mohammad S; Kaiser, Jessica P

    2018-05-28

    Cybersecurity incidents are a growing threat to the health care industry in general and hospitals in particular. The health care industry has lagged behind other industries in protecting its main stakeholder (ie, patients), and now hospitals must invest considerable capital and effort in protecting their systems. However, this is easier said than done because hospitals are extraordinarily technology-saturated, complex organizations with high end point complexity, internal politics, and regulatory pressures. The purpose of this study was to develop a systematic and organizational perspective for studying (1) the dynamics of cybersecurity capability development at hospitals and (2) how these internal organizational dynamics interact to form a system of hospital cybersecurity in the United States. We conducted interviews with hospital chief information officers, chief information security officers, and health care cybersecurity experts; analyzed the interview data; and developed a system dynamics model that unravels the mechanisms by which hospitals build cybersecurity capabilities. We then use simulation analysis to examine how changes to variables within the model affect the likelihood of cyberattacks across both individual hospitals and a system of hospitals. We discuss several key mechanisms that hospitals use to reduce the likelihood of cybercriminal activity. The variable that most influences the risk of cyberattack in a hospital is end point complexity, followed by internal stakeholder alignment. Although resource availability is important in fueling efforts to close cybersecurity capability gaps, low levels of resources could be compensated for by setting a high target level of cybersecurity. To enhance cybersecurity capabilities at hospitals, the main focus of chief information officers and chief information security officers should be on reducing end point complexity and improving internal stakeholder alignment. 
These strategies can solve cybersecurity problems more effectively than blindly pursuing more resources. On a macro level, the cyber vulnerability of a country's hospital infrastructure is affected by the vulnerabilities of all individual hospitals. In this large system, reducing variation in resource availability makes the whole system less vulnerable: a few hospitals with low resources for cybersecurity threaten the entire infrastructure of health care. In other words, hospitals need to move forward together to make the industry less attractive to cybercriminals. Moreover, although compliance is essential, it does not equal security. Hospitals should set their target level of cybersecurity beyond the requirements of current regulations and policies. As of today, policies mostly address data privacy, not data security. Thus, policy makers need to introduce policies that not only raise the target level of cybersecurity capabilities but also reduce the variability in resource availability across the entire health care system. ©Mohammad S Jalali, Jessica P Kaiser. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 28.05.2018.

  9. Free Energy and Virtual Reality in Neuroscience and Psychoanalysis: A Complexity Theory of Dreaming and Mental Disorder.

    PubMed

    Hopkins, Jim

    2016-01-01

    The main concepts of the free energy (FE) neuroscience developed by Karl Friston and colleagues parallel those of Freud's Project for a Scientific Psychology. In Hobson et al. (2014) these include an innate virtual reality generator that produces the fictive prior beliefs that Freud described as the primary process. This enables Friston's account to encompass a unified treatment-a complexity theory-of the role of virtual reality in both dreaming and mental disorder. In both accounts the brain operates to minimize FE aroused by sensory impingements-including interoceptive impingements that report compliance with biological imperatives-and constructs a representation/model of the causes of impingement that enables this minimization. In Friston's account (variational) FE equals complexity minus accuracy, and is minimized by increasing accuracy and decreasing complexity. Roughly the brain (or model) increases accuracy together with complexity in waking. This is mediated by consciousness-creating active inference-by which it explains sensory impingements in terms of perceptual experiences of their causes. In sleep it reduces complexity by processes that include both synaptic pruning and consciousness/virtual reality/dreaming in REM. The consciousness-creating active inference that effects complexity-reduction in REM dreaming must operate on FE-arousing data distinct from sensory impingement. The most relevant source is remembered arousals of emotion, both recent and remote, as processed in SWS and REM on "active systems" accounts of memory consolidation/reconsolidation. Freud describes these remembered arousals as condensed in the dreamwork for use in the conscious contents of dreams, and similar condensation can be seen in symptoms. Complexity partly reflects emotional conflict and trauma. 
This indicates that dreams and symptoms are both produced to reduce complexity in the form of potentially adverse (traumatic or conflicting) arousals of amygdala-related emotions. Mental disorder is thus caused by computational complexity together with mechanisms like synaptic pruning that have evolved for complexity-reduction; and important features of disorder can be understood in these terms. Details of the consilience among Freudian, systems consolidation, and complexity-reduction accounts appear clearly in the analysis of a single fragment of a dream, indicating also how complexity reduction proceeds by a process resembling Bayesian model selection.

  10. A power-efficient communication system between brain-implantable devices and external computers.

    PubMed

    Yao, Ning; Lee, Heung-No; Chang, Cheng-Chun; Sclabassi, Robert J; Sun, Mingui

    2007-01-01

    In this paper, we propose a power-efficient communication system for linking a brain-implantable device to an external system. For battery-powered implantable devices, the processor and transmitter power should be reduced in order both to conserve battery power and to reduce the health risks associated with transmission. To accomplish this, a joint source-channel coding/decoding system is devised. Low-density generator matrix (LDGM) codes are used in our system due to their low encoding complexity. The power cost for signal processing within the implantable device is greatly reduced by avoiding explicit source encoding. Raw data, which is highly correlated, is transmitted. At the receiver, a Markov chain source correlation model is utilized to approximate and capture the correlation of the raw data. A turbo iterative receiver algorithm is designed which connects the Markov chain source model to the LDGM decoder in a turbo-iterative way. Simulation results show that the proposed system can save 1 to 2.5 dB in transmission power.

  11. Is the destabilization of the Cournot equilibrium a good business strategy in Cournot-Puu duopoly?

    PubMed

    Canovas, Jose S

    2011-10-01

    It is generally acknowledged that the medium influences the way we communicate, and negotiation research directs considerable attention to the impact of different electronic communication modes on the negotiation process and outcomes. Complexity theories offer models and methods that allow the investigation of how patterns and temporal sequences unfold over time in negotiation interactions. By focusing on the dynamic and interactive quality of negotiations as well as the information, choice, and uncertainty contained in the negotiation process, the complexity perspective addresses several issues of central interest in classical negotiation research. In the present study we compare the complexity of the negotiation communication process among synchronous and asynchronous negotiations (IM vs. e-mail) as well as an electronic negotiation support system including a decision support system (DSS). For this purpose, transcripts of 145 negotiations have been coded and analyzed with the Shannon entropy and the grammar complexity. Our results show that negotiating asynchronously via e-mail, as well as including a DSS, significantly reduces the complexity of the negotiation process. Furthermore, a reduction of the complexity increases the probability of reaching an agreement.

  12. The effect of orthostasis on recurrence quantification analysis of heart rate and blood pressure dynamics.

    PubMed

    Javorka, M; Turianikova, Z; Tonhajzerova, I; Javorka, K; Baumert, M

    2009-01-01

    The purpose of this paper is to investigate the effect of orthostatic challenge on recurrence plot based complexity measures of heart rate and blood pressure variability (HRV and BPV). HRV and BPV complexities were assessed in 28 healthy subjects over 15 min in the supine and standing positions. The complexity of HRV and BPV was assessed based on recurrence quantification analysis. HRV complexity was reduced along with the HRV magnitude after changing from the supine to the standing position. In contrast, the BPV magnitude increased and BPV complexity decreased upon standing. Recurrence quantification analysis (RQA) of HRV and BPV is sensitive to orthostatic challenge and might therefore be suited to assess changes in autonomic neural outflow to the cardiovascular system.
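
The basic recurrence-quantification step behind such complexity measures can be sketched as follows (a minimal illustration with an assumed threshold eps and no phase-space embedding, not the authors' full RQA pipeline):

```python
import math

# Recurrence rate: build the recurrence matrix R[i][j] = 1 when points i
# and j of the series lie within a threshold eps of each other, then
# report the fraction of recurrent pairs. Full RQA derives further
# measures (e.g. determinism) from diagonal structures in this matrix.
def recurrence_rate(series, eps):
    n = len(series)
    recurrent = sum(1 for i in range(n) for j in range(n)
                    if abs(series[i] - series[j]) <= eps)
    return recurrent / (n * n)

periodic = [math.sin(0.5 * t) for t in range(200)]  # stand-in for an RR-interval series
print(round(recurrence_rate(periodic, 0.1), 3))
```

With eps chosen larger than the full signal range, every pair is recurrent and the rate is exactly 1; in practice eps is chosen as a small fraction of the signal's standard deviation.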

  13. Structural resolution of inorganic nanotubes with complex stoichiometry.

    PubMed

    Monet, Geoffrey; Amara, Mohamed S; Rouzière, Stéphan; Paineau, Erwan; Chai, Ziwei; Elliott, Joshua D; Poli, Emiliano; Liu, Li-Min; Teobaldi, Gilberto; Launois, Pascale

    2018-05-23

    Determination of the atomic structure of inorganic single-walled nanotubes with complex stoichiometry remains elusive because too many atomic coordinates must be fitted against X-ray diffractograms that inherently exhibit rather broad features. Here we introduce a methodology to reduce the number of fitted variables and enable resolution of the atomic structure of inorganic nanotubes with complex stoichiometry. We apply it to recently synthesized methylated aluminosilicate and aluminogermanate imogolite nanotubes of nominal composition (OH)3Al2O3Si(Ge)CH3. Fitting of X-ray scattering diagrams, supported by Density Functional Theory simulations, reveals an unexpected rolling mode for these systems. The transferability of the approach opens the way to an improved understanding of structure-property relationships in inorganic nanotubes, to the benefit of fundamental and applied research on these systems.

  14. Reducing the cognitive workload: Trouble managing power systems

    NASA Technical Reports Server (NTRS)

    Manner, David B.; Liberman, Eugene M.; Dolce, James L.; Mellor, Pamela A.

    1993-01-01

    The complexity of space-based systems makes monitoring them and diagnosing their faults taxing for human beings. Mission control operators are well-trained experts, but they cannot afford to have their attention diverted by extraneous information. During normal operating conditions, monitoring the status of the components of a complex system is alone a big task. When a problem arises, immediate attention and quick resolution are mandatory. To aid humans in these endeavors we have developed an automated advisory system. Our advisory expert system, Trouble, incorporates the knowledge of the power system designers for Space Station Freedom. Trouble is designed to be a ground-based advisor for the mission controllers in the Control Center Complex at Johnson Space Center (JSC). It has been developed at NASA Lewis Research Center (LeRC) and tested in conjunction with prototype flight hardware contained in the Power Management and Distribution testbed and the Engineering Support Center (ESC) at LeRC. Our work will culminate with the adoption of these techniques by the mission controllers at JSC. This paper elucidates how we have captured power system failure knowledge, how we have built and tested our expert system, and what we believe are its potential uses.

  15. Analyzing system safety in lithium-ion grid energy storage

    DOE PAGES

    Rosewater, David; Williams, Adam

    2015-10-08

    As grid energy storage systems become more complex, it grows more difficult to design them for safe operation. This paper first reviews the properties of lithium-ion batteries that can produce hazards in grid-scale systems. Then the conventional safety engineering technique Probabilistic Risk Assessment (PRA) is reviewed to identify its limitations in complex systems. To address this gap, new research is presented on the application of Systems-Theoretic Process Analysis (STPA) to a lithium-ion battery based grid energy storage system. STPA is anticipated to fill the gaps recognized in PRA for designing complex systems and hence be more effective or less costly to use during safety engineering. It was observed that STPA is able to capture causal scenarios for accidents not identified using PRA. Additionally, STPA enabled a more rational assessment of uncertainty (all that is not known), thereby promoting a healthy skepticism of design assumptions. Lastly, we conclude that STPA may indeed be more cost effective than PRA for safety engineering in lithium-ion battery systems. However, further research is needed to determine if this approach actually reduces safety engineering costs in development, or improves industry safety standards.

  16. Biochar amendment immobilizes lead in rice paddy soils and reduces its phytoavailability

    PubMed Central

    Li, Honghong; Liu, Yuting; Chen, Yanhui; Wang, Shanli; Wang, Mingkuang; Xie, Tuanhui; Wang, Guo

    2016-01-01

    This study aimed to determine effects of rice straw biochar on Pb sequestration in a soil-rice system. Pot experiments were conducted with rice plants in Pb-contaminated paddy soils that had been amended with 0, 2.5, and 5% (w/w) biochar. Compared to the control treatment, amendment with 5% biochar resulted in 54 and 94% decreases in the acid soluble and CaCl2-extractable Pb, respectively, in soils containing rice plants at the maturity stage. The amount of Fe-plaque on root surfaces and the Pb concentrations of the Fe-plaque were also reduced in biochar amended soils. Furthermore, lead species in rice roots were determined using Pb L3-edge X-ray absorption near edge structure (XANES), and although Pb-ferrihydrite complexes dominated Pb inventories, increasing amounts of organic complexes like Pb-pectins and Pb-cysteine were found in roots from the 5% biochar treatments. Such organic complexes might impede Pb translocation from root to shoot and subsequently reduce Pb accumulation in rice with biochar amendment. PMID:27530495

  17. Parallel algorithms for mapping pipelined and parallel computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Many computational problems in image processing, signal processing, and scientific computing are naturally structured for either pipelined or parallel computation. When mapping such problems onto a parallel architecture it is often necessary to aggregate an obvious problem decomposition. Even in this context the general mapping problem is known to be computationally intractable, but recent advances have been made in identifying classes of problems and architectures for which optimal solutions can be found in polynomial time. Among these, the mapping of pipelined or parallel computations onto linear array, shared memory, and host-satellite systems figures prominently. This paper extends that work first by showing how to improve existing serial mapping algorithms. These improvements have significantly lower time and space complexities: in one case a published O(nm^3)-time algorithm for mapping m modules onto n processors is reduced to an O(nm log m) time complexity, and its space requirements are reduced from O(nm^2) to O(m). Run time complexity is further reduced with parallel mapping algorithms based on these improvements, which run on the architecture for which they create the mappings.
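
The underlying mapping problem, assigning m pipelined modules with given workloads to n processors in contiguous blocks so that the bottleneck (maximum per-processor load) is minimized, can be sketched with a simple O(nm^2) dynamic program. This is a baseline illustration; the paper's improved algorithms achieve the asymptotically faster bounds quoted above, and the workloads here are illustrative:

```python
# Contiguity-constrained mapping: split `loads` into n contiguous blocks
# minimizing the maximum block sum (the pipeline bottleneck).
def min_bottleneck(loads, n):
    m = len(loads)
    prefix = [0]
    for w in loads:
        prefix.append(prefix[-1] + w)
    INF = float("inf")
    # dp[p][i]: best bottleneck mapping the first i modules onto p processors
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for p in range(1, n + 1):
        for i in range(1, m + 1):
            for j in range(p - 1, i):  # last block covers modules j..i-1
                block = prefix[i] - prefix[j]
                dp[p][i] = min(dp[p][i], max(dp[p - 1][j], block))
    return dp[n][m]

print(min_bottleneck([2, 3, 5, 1, 4], 2))  # -> 10, e.g. split [2,3,5] | [1,4]
```

The published speedups come from exploiting monotonicity of the bottleneck in the split point, replacing the inner scan with binary search and related techniques.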

  18. Minimum Control Requirements for Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Boulange, Richard; Jones, Harry; Jones, Harry

    2002-01-01

    Advanced control technologies are not necessary for the safe, reliable, and continuous operation of Advanced Life Support (ALS) systems. ALS systems can be, and are, adequately controlled by simple, reliable, low-level methodologies and algorithms. The automation provided by advanced control technologies is claimed to decrease system mass and necessary crew time by reducing buffer size and minimizing crew involvement. In truth, these approaches increase control system complexity without clearly demonstrating an increase in reliability across the ALS system. Unless these systems are as reliable as the hardware they control, there are no savings to be had. A baseline ALS system is presented with the minimal control system required for its continuous safe and reliable operation. This baseline control system uses simple algorithms and scheduling methodologies and relies on human intervention only in the event of failure of the redundant backup equipment. This ALS system architecture is designed for reliable operation, with minimal components and minimal control system complexity. The fundamental design precept followed is "If it isn't there, it can't fail".

  19. The "Biologically-Inspired Computing" Column

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike

    2007-01-01

    Self-managing systems, whether viewed from the perspective of Autonomic Computing or from that of another initiative, offer a holistic vision for the development and evolution of biologically-inspired computer-based systems. They aim to bring new levels of automation and dependability to systems, while simultaneously hiding their complexity and reducing costs. A case can certainly be made that all computer-based systems should exhibit autonomic properties [6], and we envisage greater interest in, and uptake of, autonomic principles in future system development.

  20. Systemic risk: the dynamics of model banking systems

    PubMed Central

    May, Robert M.; Arinaminpathy, Nimalan

    2010-01-01

    The recent banking crises have made it clear that increasingly complex strategies for managing risk in individual banks have not been matched by corresponding attention to overall systemic risks. We explore some simple mathematical caricatures for ‘banking ecosystems’, with emphasis on the interplay between the characteristics of individual banks (capital reserves in relation to total assets, etc.) and the overall dynamical behaviour of the system. The results are discussed in relation to potential regulations aimed at reducing systemic risk. PMID:19864264

  1. Management of high-risk perioperative systems.

    PubMed

    Dain, Steven

    2006-06-01

    The perioperative system is a complex system that requires people, materials, and processes to come together in a highly ordered and timely manner. However, when working in this high-risk system, even well-organized, knowledgeable, vigilant, and well-intentioned individuals will eventually make errors. All systems need to be evaluated on a continual basis to reduce the risk of errors, make errors more easily recognizable, and provide methods for error mitigation. A simple approach to risk management that may be applied in clinical medicine is discussed.

  2. PyGirl: Generating Whole-System VMs from High-Level Prototypes Using PyPy

    NASA Astrophysics Data System (ADS)

    Bruni, Camillo; Verwaest, Toon

    Virtual machines (VMs) emulating hardware devices are generally implemented in low-level languages for performance reasons. This results in unmaintainable systems that are difficult to understand. In this paper we report on our experience using the PyPy toolchain to improve the portability and reduce the complexity of whole-system VM implementations. As a case study we implement a VM prototype for a Nintendo Game Boy, called PyGirl, in which the high-level model is separated from low-level VM implementation issues. We shed light on the process of refactoring from a low-level VM implementation in Java to a high-level model in RPython. We show that our whole-system VM written with PyPy is significantly less complex than standard implementations, without substantial loss in performance.

  3. Application of color to reduce complexity in air traffic control.

    DOT National Transportation Integrated Search

    2002-11-01

The United States Air Traffic Control (ATC) system is designed to provide for the safe and efficient flow of air traffic from origin to destination. The Federal Aviation Administration predicts that traffic levels will continue increasing over th...

  4. A State-Space Approach to Optimal Level-Crossing Prediction for Linear Gaussian Processes

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2009-01-01

In many complex engineered systems, the ability to give an alarm prior to impending critical events is of great importance. These critical events may have varying degrees of severity, and in fact they may occur during normal system operation. In this article, we investigate approximations to theoretically optimal methods of designing alarm systems for the prediction of level-crossings by a zero-mean stationary linear dynamic system driven by Gaussian noise. An optimal alarm system is designed to elicit the fewest false alarms for a fixed detection probability. This work introduces the use of Kalman filtering in tandem with the optimal level-crossing problem. It is shown that there is a negligible loss in overall accuracy when using approximations to the theoretically optimal predictor, at the advantage of greatly reduced computational complexity.
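The alarm design described above can be illustrated with a short sketch: given a Gaussian one-step prediction (e.g., from a Kalman filter), the alarm fires when the predicted probability of exceeding the critical level crosses a design threshold. The function names and default threshold here are illustrative, not taken from the paper.

```python
import math

def crossing_probability(pred_mean, pred_var, level):
    """P(y > level) when the predicted output y is Gaussian."""
    z = (level - pred_mean) / math.sqrt(pred_var)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def alarm(pred_mean, pred_var, level, p_threshold=0.5):
    """Fire an alarm when the predicted crossing probability is high.

    Raising p_threshold trades missed detections for fewer false alarms,
    which is the trade-off the optimal alarm literature formalizes.
    """
    return crossing_probability(pred_mean, pred_var, level) >= p_threshold
```

A forecast centered well below the level leaves the alarm silent; one centered above it fires.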

  5. Gallium-based anti-infectives: targeting microbial iron-uptake mechanisms.

    PubMed

    Kelson, Andrew B; Carnevali, Maia; Truong-Le, Vu

    2013-10-01

Microbes have evolved elaborate iron-acquisition systems to sequester iron from the host environment using siderophores and heme uptake systems. Gallium(III) is structurally similar to iron(III), except that it cannot be reduced under physiological conditions; gallium therefore has the potential to serve as an iron analog, and thus an anti-microbial. Because Ga(III) can bind to virtually any complex that binds Fe(III), simple gallium salts as well as more complex siderophores and hemes are potential carriers to deliver Ga(III) to the microbes. These gallium complexes represent a new class of anti-infectives that differs in mechanism of action from conventional antibiotics. Simple gallium salts such as gallium nitrate and maltolate, and simple gallium siderophore complexes such as gallium citrate, have shown good antibacterial activities. The most studied complex has been gallium citrate, which exhibits broad activity against many Gram-negative bacteria at ∼1-5μg/ml MICs, strong anti-biofilm activity, low drug resistance, and efficacy in vivo. Using the structural features of specific siderophores and hemes made by pathogenic bacteria and fungi, researchers have begun to evaluate new gallium complexes to target key pathogens. This review will summarize potential iron-acquisition system targets and recent research on gallium-based anti-infectives. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Sites of superoxide and hydrogen peroxide production during fatty acid oxidation in rat skeletal muscle mitochondria.

    PubMed

    Perevoshchikova, Irina V; Quinlan, Casey L; Orr, Adam L; Gerencser, Akos A; Brand, Martin D

    2013-08-01

H2O2 production by skeletal muscle mitochondria oxidizing palmitoylcarnitine was examined under two conditions: the absence of respiratory chain inhibitors and the presence of myxothiazol to inhibit complex III. Without inhibitors, respiration and H2O2 production were low unless carnitine or malate was added to limit acetyl-CoA accumulation. With palmitoylcarnitine alone, H2O2 production was dominated by complex II (44% from site IIF in the forward reaction); the remainder was mostly from complex I (34%, superoxide from site IF). With added carnitine, H2O2 production was about equally shared between complexes I, II, and III. With added malate, it was 75% from complex III (superoxide from site IIIQo) and 25% from site IF. Thus complex II (site IIF in the forward reaction) is a major source of H2O2 production during oxidation of palmitoylcarnitine ± carnitine. Under the second condition (myxothiazol present to keep ubiquinone reduced), the rates of H2O2 production were highest in the presence of palmitoylcarnitine ± carnitine and were dominated by complex II (site IIF in the reverse reaction). About half the rest was from site IF, but a significant portion, ∼40pmol H2O2·min(-1)·mg protein(-1), was not from complex I, II, or III and was attributed to the proteins of β-oxidation (electron-transferring flavoprotein (ETF) and ETF-ubiquinone oxidoreductase). The maximum rate from the ETF system was ∼200pmol H2O2·min(-1)·mg protein(-1) under conditions of compromised antioxidant defense and reduced ubiquinone pool. Thus complex II and the ETF system both contribute to H2O2 production during fatty acid oxidation under appropriate conditions. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. A Review of Advanced Vehicular Diesel Research and Development Programs Which Have Potential Application to Stationary Diesel Power Plants.

    DTIC Science & Technology

    1980-03-01

throttle torque capability. Various schemes are under development to reduce this disadvantage. These schemes include reducing compressor and turbine rotor...inertia, using a Pelton wheel or burners, electronic feedback systems, and variable-area turbocharging. Other turbocharging disadvantages include...around the turbine) and using exhaust augmenters or combustors (wasteful of fuel, costly, and complex), and the variable-area turbocharger (VAT). An

  8. Maximum Entropy/Optimal Projection (MEOP) control design synthesis: Optimal quantification of the major design tradeoffs

    NASA Technical Reports Server (NTRS)

    Hyland, D. C.; Bernstein, D. S.

    1987-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.

  9. The Marketability of Integrated Energy/Utility Systems: A Guide to the Dollar Savings Potential in Integrated Energy/Utility Systems; for Campuses, Medical Complexes, and Communities; Architect/Engineers, Industrial and Power Plant Owners; Suppliers; and Constructors.

    ERIC Educational Resources Information Center

    Coxe, Edwin F.; Hill, David E.

    This publication acquaints the prospective marketplace with the potential and underlying logic of the Integrated Utility System (IUS) concept. This system holds promise for educational and medical institutions seeking to reduce their energy costs. The generic IUS concept is described and how it can be incorporated into existing heating and…

  10. Reduced-order modeling of piezoelectric energy harvesters with nonlinear circuits under complex conditions

    NASA Astrophysics Data System (ADS)

    Xiang, Hong-Jun; Zhang, Zhi-Wei; Shi, Zhi-Fei; Li, Hong

    2018-04-01

A fully coupled modeling approach is developed for piezoelectric energy harvesters in this work based on the use of available robust finite element packages and efficient reduced-order modeling techniques. At first, the harvester is modeled using finite element packages. The dynamic equilibrium equations of harvesters are rebuilt by extracting system matrices from the finite element model using built-in commands without any additional tools. A Krylov subspace-based scheme is then applied to obtain a reduced-order model that improves simulation efficiency while preserving the key features of harvesters. Co-simulation of the reduced-order model with nonlinear energy harvesting circuits is achieved at the system level. Several examples in both harmonic response and transient response analysis are conducted to validate the present approach. The proposed approach improves the simulation efficiency by several orders of magnitude. Moreover, the parameters used in the equivalent circuit model can be conveniently obtained by the proposed eigenvector-based model order reduction technique. More importantly, this work establishes a methodology for modeling piezoelectric energy harvesters with arbitrarily complicated mechanical geometries and nonlinear circuits; the input load may also be complex. The method can be employed by harvester designers to optimize mechanical structures or by circuit designers to develop novel energy harvesting circuits.
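The Krylov-subspace projection step described above can be sketched minimally, assuming a first-order state-space form ẋ = Ax + Bu, y = Cx extracted from the FE package. An Arnoldi process is one common way to build the projection basis; the function names here are illustrative, not the paper's.

```python
import numpy as np

def arnoldi_basis(A, b, m):
    """Orthonormal basis of the Krylov subspace span{b, Ab, ..., A^(m-1) b}."""
    V = np.zeros((len(b), m))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(1, m):
        w = A @ V[:, j - 1]
        for i in range(j):                      # modified Gram-Schmidt
            w -= (V[:, i] @ w) * V[:, i]
        V[:, j] = w / np.linalg.norm(w)
    return V

def reduce_system(A, B, C, m):
    """Project (A, B, C) onto an m-dimensional Krylov subspace."""
    V = arnoldi_basis(A, B, m)
    return V.T @ A @ V, V.T @ B, C @ V
```

By construction the reduced triple reproduces the first m Markov parameters C A^k B of the full model, which is why key input-output features survive the reduction.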

  11. Third Conference on Fibrous Composites in Flight Vehicle Design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The use of fibrous composite materials in the design of aircraft and space vehicle structures and their impact on future vehicle systems are discussed. The topics covered include: flight test work on composite components, design concepts and hardware, specialized applications, operational experience, certification and design criteria. Contributions to the design technology base include data concerning material properties, design procedures, environmental exposure effects, manufacturing procedures, and flight service reliability. By including composites as baseline design materials, significant payoffs are expected in terms of reduced structural weight fractions, longer structural life, reduced fuel consumption, reduced structural complexity, and reduced manufacturing cost.

  12. Fatigue reduces the complexity of knee extensor torque fluctuations during maximal and submaximal intermittent isometric contractions in man

    PubMed Central

    Pethick, Jamie; Winter, Samantha L; Burnley, Mark

    2015-01-01

    Neuromuscular fatigue increases the amplitude of fluctuations in torque output during isometric contractions, but the effect of fatigue on the temporal structure, or complexity, of these fluctuations is not known. We hypothesised that fatigue would result in a loss of temporal complexity and a change in fractal scaling of the torque signal during isometric knee extensor exercise. Eleven healthy participants performed a maximal test (5 min of intermittent maximal voluntary contractions, MVCs), and a submaximal test (contractions at a target of 40% MVC performed until task failure), each with a 60% duty factor (6 s contraction, 4 s rest). Torque and surface EMG signals were sampled continuously. Complexity and fractal scaling of torque were quantified by calculating approximate entropy (ApEn), sample entropy (SampEn) and the detrended fluctuation analysis (DFA) scaling exponent α. Fresh submaximal contractions were more complex than maximal contractions (mean ± SEM, submaximal vs. maximal: ApEn 0.65 ± 0.09 vs. 0.15 ± 0.02; SampEn 0.62 ± 0.09 vs. 0.14 ± 0.02; DFA α 1.35 ± 0.04 vs. 1.55 ± 0.03; all P < 0.005). Fatigue reduced the complexity of submaximal contractions (ApEn to 0.24 ± 0.05; SampEn to 0.22 ± 0.04; DFA α to 1.55 ± 0.03; all P < 0.005) and maximal contractions (ApEn to 0.10 ± 0.02; SampEn to 0.10 ± 0.02; DFA α to 1.63 ± 0.02; all P < 0.01). This loss of complexity and shift towards Brownian-like noise suggests that as well as reducing the capacity to produce torque, fatigue reduces the neuromuscular system's adaptability to external perturbations. PMID:25664928
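Sample entropy, one of the complexity measures used above, can be sketched with a direct O(n²) implementation. The tolerance r is conventionally around 0.2 of the signal SD, self-matches are excluded, and lower values indicate a more regular (less complex) signal; this is a generic sketch, not the authors' analysis code.

```python
import math

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A/B): B counts pairs of length-m templates within
    Chebyshev distance r, A the same for length m+1 (self-matches excluded)."""
    n = len(x)
    if r is None:
        mean = sum(x) / n
        r = 0.2 * (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    def pairs(mm):
        count = 0
        for i in range(n - m):              # same template count for both lengths
            for j in range(i + 1, n - m):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    count += 1
        return count
    b, a = pairs(m), pairs(m + 1)
    return -math.log(a / b)
```

A perfectly periodic torque trace gives SampEn near zero, while the fresh submaximal contractions above sat around 0.6 and fell toward 0.2 with fatigue.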

  13. A multichannel implantable telemetry system for flow, pressure, and ECG measurements

    NASA Technical Reports Server (NTRS)

    Fryer, T. B.; Sandler, H.; Freund, W.; Mccutcheon, E. P.; Carlson, E. L.

    1975-01-01

    The design, principles of operation, and performance of an implantable miniaturized (48 cu cm in volume) multiplex telemetry system for simultaneous measurement of up to eight physiological parameters (including cardiovascular data) are described. Integrated circuits are used to reduce the size, complexity, and cost of fabrication. Power consumption is reduced using recently developed complementary MOS devices. PWM technique is selected as it is relatively easy to implement, lends itself to ICs, and provides an accurate means of transmitting data. The system is totally implantable within the chest of a test animal, with no wire penetrating through the skin. It is shown that the described system permits repeated measurement of the physiological effects of a variety of interventions in awake unanesthetized animals.

  14. Rheological and tribological study of complex soft gels containing polymer, phospholipids, oil, and water

    NASA Astrophysics Data System (ADS)

    Farias, Barbara; Hsiao, Lilian; Khan, Saad

Oil-in-water emulsions with polymers are widely used for personal care products. Since the accumulation of traditional surfactants on the skin can promote irritation, an alternative is the use of hydrogenated phosphatidylcholine (HPC), a phospholipid that can form a lamellar structure similar to the skin barrier. This research aims to investigate the effect of composition on the rheological and tribological characteristics of complex systems containing HPC. For tribology experiments we used soft model contacts made of polydimethylsiloxane (PDMS), while for bulk rheology studies we used dynamic and steady shear experiments. We examine how the addition of polymer, HPC, and oil affects friction coefficients, lubrication regimes, viscoelasticity, yield stress, and gel formation. The bulk rheology shows that the studied systems are shear thinning and have gel-like behavior. The effect of each component was investigated by going from simple to more complex systems. The Stribeck curves obtained are related to the bulk rheology results to obtain physical insights into these complex systems. The results suggest that the polymer and phospholipids are adsorbed onto the PDMS surface, reducing the friction coefficient at lower entrainment speeds.

  15. Long term results of diode laser cycloablation in complex glaucoma using the Zeiss Visulas II system

    PubMed Central

    Ataullah, S; Biswas, S; Artes, P H; O'Donoghue, E; Ridgway, A E A; Spencer, A F

    2002-01-01

    Aim: To investigate the safety and efficacy of the Zeiss Visulas II diode laser system in the reduction of intraocular pressure (IOP) in patients with complex glaucoma. Methods: The authors analysed the medical records of patients who underwent trans-scleral diode laser cycloablation (TDC) at the Manchester Royal Eye Hospital during a 34 month period. 55 eyes of 53 patients with complex glaucoma were followed up for a period of 12–52 months (mean 23.1 months) after initial treatment with the Zeiss Visulas II diode laser system. Results: Mean pretreatment IOP was 35.8 mm Hg (range 22–64 mm Hg). At the last examination, mean IOP was 17.3 mm Hg (range 0–40 mm Hg). After treatment, 45 eyes (82%) had an IOP between 5 and 22 mm Hg; in 46 eyes (84%) the preoperative IOP had been reduced by 30% or more. The mean number of treatment sessions was 1.7 (range 1–6). At the last follow up appointment, the mean number of glaucoma medications was reduced from 2.1 to 1.6 (p<0.05). In 10 eyes (18%), post-treatment visual acuity (VA) was worse than pretreatment VA by 2 or more lines. Conclusions: Treatment with the Zeiss Visulas II diode laser system can be safely repeated in order to achieve the target IOP. Treatment outcomes in this study were similar to those from previously published work using the Iris Medical Oculight SLx laser. PMID:11801501

  16. GaAs VLSI for aerospace electronics

    NASA Technical Reports Server (NTRS)

    Larue, G.; Chan, P.

    1990-01-01

    Advanced aerospace electronics systems require high-speed, low-power, radiation-hard, digital components for signal processing, control, and communication applications. GaAs VLSI devices provide a number of advantages over silicon devices including higher carrier velocities, ability to integrate with high performance optical devices, and high-resistivity substrates that provide very short gate delays, good isolation, and tolerance to many forms of radiation. However, III-V technologies also have disadvantages, such as lower yield compared to silicon MOS technology. Achieving very large scale integration (VLSI) is particularly important for fast complex systems. At very short gate delays (less than 100 ps), chip-to-chip interconnects severely degrade circuit clock rates. Complex systems, therefore, benefit greatly when as many gates as possible are placed on a single chip. To fully exploit the advantages of GaAs circuits, attention must be focused on achieving high integration levels by reducing power dissipation, reducing the number of devices per logic function, and providing circuit designs that are more tolerant to process and environmental variations. In addition, adequate noise margin must be maintained to ensure a practical yield.

  17. Hypothetical Modeling of Redox Conditions Within a Complex Ground-Water Flow Field in a Glacial Setting

    USGS Publications Warehouse

    Feinstein, Daniel T.; Thomas, Mary Ann

    2009-01-01

    This report describes a modeling approach for studying how redox conditions evolve under the influence of a complex ground-water flow field. The distribution of redox conditions within a flow system is of interest because of the intrinsic susceptibility of an aquifer to redox-sensitive, naturally occurring contaminants - such as arsenic - as well as anthropogenic contaminants - such as chlorinated solvents. The MODFLOW-MT3D-RT3D suite of code was applied to a glacial valley-fill aquifer to demonstrate a method for testing the interaction of flow patterns, sources of reactive organic carbon, and availability of electron acceptors in controlling redox conditions. Modeling results show how three hypothetical distributions of organic carbon influence the development of redox conditions in a water-supply aquifer. The distribution of strongly reduced water depends on the balance between the rate of redox reactions and the capability of different parts of the flow system to transmit oxygenated water. The method can take account of changes in the flow system induced by pumping that result in a new distribution of reduced water.

  18. Coupled Riccati equations for complex plane constraint

    NASA Technical Reports Server (NTRS)

    Strong, Kristin M.; Sesak, John R.

    1991-01-01

A new Linear Quadratic Gaussian design method is presented which provides prescribed imaginary axis pole placement for optimal control and estimation systems. This procedure contributes another degree of design freedom to flexible spacecraft control. Current design methods which interject modal damping into the system tend to have little effect on modal frequencies, i.e., they predictably shift open-loop plant poles horizontally in the complex plane to form the closed loop controller or estimator pole constellation, but make little provision for vertical (imaginary axis) pole shifts. Imaginary axis shifts which reduce the closed loop modal frequencies (the bandwidths) are desirable since they reduce the sensitivity of the system to noise disturbances. The new method drives the closed loop modal frequencies to predictable (specified) levels; frequencies as low as zero rad/sec (real axis pole placement) can be achieved. The design procedure works through rotational and translational destabilizations of the plant, and a coupling of two independently solved algebraic Riccati equations through a structured state weighting matrix. Two new concepts, gain transference and Q equivalency, are introduced and their use shown.
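As background for the Riccati machinery referenced above, a single continuous-time algebraic Riccati equation can be solved numerically from the stable invariant subspace of the associated Hamiltonian matrix. This is a generic sketch of that standard step, not the paper's coupled two-equation formulation.

```python
import numpy as np

def solve_care(A, B, Q, R):
    """Solve A'P + PA - P B R^-1 B' P + Q = 0 via the stable invariant
    subspace of the Hamiltonian matrix H = [[A, -G], [-Q, -A']]."""
    n = A.shape[0]
    G = B @ np.linalg.solve(R, B.T)
    H = np.block([[A, -G], [-Q, -A.T]])
    w, V = np.linalg.eig(H)
    stable = V[:, w.real < 0]               # the n eigenvectors with Re(lambda) < 0
    X1, X2 = stable[:n, :], stable[n:, :]
    return np.real(X2 @ np.linalg.inv(X1))
```

The LQG gain then follows as K = R⁻¹B'P, and the closed-loop poles A − BK are the stable Hamiltonian eigenvalues.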

  19. Rotordynamic analysis using the Complex Transfer Matrix: An application to elastomer supports using the viscoelastic correspondence principle

    NASA Astrophysics Data System (ADS)

    Varney, Philip; Green, Itzhak

    2014-11-01

Numerous methods are available to calculate rotordynamic whirl frequencies, including analytic methods, finite element analysis, and the transfer matrix method. The typical real-valued transfer matrix (RTM) suffers from several deficiencies, including lengthy computation times and the inability to distinguish forward and backward whirl. Though application of complex coordinates in rotordynamic analysis is not novel per se, specific advantages gained from using such coordinates in a transfer matrix analysis have yet to be elucidated. The present work employs a complex coordinate redefinition of the transfer matrix to obtain reduced forms of the elemental transfer matrices in inertial and rotating reference frames, including external stiffness and damping. Application of the complex-valued state variable redefinition results in a reduction of the 8×8 RTM to the 4×4 Complex Transfer Matrix (CTM). The CTM is advantageous in that it intrinsically separates forward and backward whirl, eases symbolic manipulation by halving the transfer matrices' dimension, and provides significant improvement in computation time. A symbolic analysis is performed on a simple overhung rotor to demonstrate the mathematical motivation for whirl frequency separation. The CTM's utility is further shown by analyzing a rotordynamic system supported by viscoelastic elastomer rings. Viscoelastic elastomer ring supports can provide significant damping while reducing the cost and complexity associated with conventional components such as squeeze film dampers. The stiffness and damping of a viscoelastic damper ring are determined herein as a function of whirl frequency using the viscoelastic correspondence principle and a constitutive fractional calculus viscoelasticity model. The CTM is then employed to obtain the characteristic equation, where the whirl frequency dependent stiffness and damping of the elastomer supports are included. The Campbell diagram is shown, demonstrating the CTM's ability to intrinsically separate synchronous whirl direction for a non-trivial rotordynamic system. Good agreement is found between the CTM results and previously obtained analytic and experimental results for the elastomer ring supported rotordynamic system.
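The key property the abstract exploits, that complex coordinates intrinsically separate forward and backward whirl, can be seen in a small sketch (my own illustration, not the paper's code): with q = x + iy, forward whirl appears at positive frequencies of q's spectrum and backward whirl at negative frequencies.

```python
import numpy as np

def whirl_spectra(x, y, fs):
    """Split a planar rotor orbit into forward/backward whirl spectra
    via the complex coordinate q = x + i*y (sampled at fs Hz)."""
    q = np.asarray(x) + 1j * np.asarray(y)
    Q = np.fft.fft(q) / len(q)
    f = np.fft.fftfreq(len(q), d=1.0 / fs)
    fwd = {fi: abs(Qi) for fi, Qi in zip(f, Q) if fi > 0}   # forward whirl
    bwd = {-fi: abs(Qi) for fi, Qi in zip(f, Q) if fi < 0}  # backward whirl
    return fwd, bwd
```

A purely forward circular orbit (x = cos ωt, y = sin ωt) puts all of its energy at +ω, so the two whirl senses never mix, which is what lets the CTM halve the state dimension.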

  20. The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    1999-01-01

Advances in computational technology and in physics-based modeling are making large scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  1. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  2. Organized All the Way Down

    NASA Astrophysics Data System (ADS)

    Sylvan, David

    At least since Adam Smith's The Wealth of Nations, it has been understood that social systems can be considered as having emergent properties not reducible to the actions of individuals. The appeal of this idea is obvious, no different now than in Smith's time: that aggregates of persons can be ordered without such order being intended or enforced by any particular person or persons. A search for such an "invisible hand" is what brings many of us to the study of complexity and the construction of various types of computational models aimed at capturing it. However, in proceeding along these lines, we have tended to focus on particular types of social systems — what I will in this paper call "thin" systems, such as markets and populations — and ignored other types, such as groups, whose base interactions are "thick," i.e., constructed as one of many possibilities, by the participants, at the moment in which they take place. These latter systems are not only ubiquitous but pose particular modeling problems for students of complexity: the local interactions are themselves complex and the systems display no strongly emergent features.

  3. Agile Integration of Complex Systems

    DTIC Science & Technology

    2010-04-28

intervention in using SOA can be reduced. SOA in DoD: DoD has mandated that all systems support the Network-Centric Environment, and SOA is fundamental to...it and dropping it on an orchestrate icon (slide 22). Discovery simplified and made visual. SOAF Messaging Service Transport

  4. Simultaneous multimodal ophthalmic imaging using swept-source spectrally encoded scanning laser ophthalmoscopy and optical coherence tomography

    PubMed Central

    Malone, Joseph D.; El-Haddad, Mohamed T.; Bozic, Ivan; Tye, Logan A.; Majeau, Lucas; Godbout, Nicolas; Rollins, Andrew M.; Boudoux, Caroline; Joos, Karen M.; Patel, Shriji N.; Tao, Yuankai K.

    2016-01-01

Scanning laser ophthalmoscopy (SLO) benefits diagnostic imaging and therapeutic guidance by allowing for high-speed en face imaging of retinal structures. When combined with optical coherence tomography (OCT), SLO enables real-time aiming and retinal tracking and provides complementary information for post-acquisition volumetric co-registration, bulk motion compensation, and averaging. However, multimodality SLO-OCT systems generally require dedicated light sources, scanners, relay optics, detectors, and additional digitization and synchronization electronics, which increase system complexity. Here, we present a multimodal ophthalmic imaging system using swept-source spectrally encoded scanning laser ophthalmoscopy and optical coherence tomography (SS-SESLO-OCT) for in vivo human retinal imaging. SESLO reduces the complexity of en face imaging systems by multiplexing spatial positions as a function of wavelength. SESLO image quality benefited from single-mode illumination and multimode collection through a prototype double-clad fiber coupler, which optimized scattered light throughput and reduced speckle contrast while maintaining lateral resolution. Using a shared 1060 nm swept-source, shared scanner and imaging optics, and a shared dual-channel high-speed digitizer, we acquired inherently co-registered en face retinal images and OCT cross-sections simultaneously at 200 frames-per-second. PMID:28101411

  5. The Next Frontier in Communication and the ECLIPPSE Study: Bridging the Linguistic Divide in Secure Messaging.

    PubMed

    Schillinger, Dean; McNamara, Danielle; Crossley, Scott; Lyles, Courtney; Moffet, Howard H; Sarkar, Urmimala; Duran, Nicholas; Allen, Jill; Liu, Jennifer; Oryn, Danielle; Ratanawongsa, Neda; Karter, Andrew J

    2017-01-01

    Health systems are heavily promoting patient portals. However, limited health literacy (HL) can restrict online communication via secure messaging (SM) because patients' literacy skills must be sufficient to convey and comprehend content while clinicians must encourage and elicit communication from patients and match patients' literacy level. This paper describes the Employing Computational Linguistics to Improve Patient-Provider Secure Email (ECLIPPSE) study, an interdisciplinary effort bringing together scientists in communication, computational linguistics, and health services to employ computational linguistic methods to (1) create a novel Linguistic Complexity Profile (LCP) to characterize communications of patients and clinicians and demonstrate its validity and (2) examine whether providers accommodate communication needs of patients with limited HL by tailoring their SM responses. We will study >5 million SMs generated by >150,000 ethnically diverse type 2 diabetes patients and >9000 clinicians from two settings: an integrated delivery system and a public (safety net) system. Finally, we will then create an LCP-based automated aid that delivers real-time feedback to clinicians to reduce the linguistic complexity of their SMs. This research will support health systems' journeys to become health literate healthcare organizations and reduce HL-related disparities in diabetes care.

  6. Fabrication of reduced graphene oxide/macrocyclic cobalt complex nanocomposites as counter electrodes for Pt-free dye-sensitized solar cells

    NASA Astrophysics Data System (ADS)

    Tsai, Chih-Hung; Shih, Chun-Jyun; Wang, Wun-Shiuan; Chi, Wen-Feng; Huang, Wei-Chih; Hu, Yu-Chung; Yu, Yuan-Hsiang

    2018-03-01

    In this study, macrocyclic Co complexes were successfully grafted onto graphene oxide (GO) to produce GO/Co nanocomposites with a large surface area, high electrical conductivity, and excellent catalytic properties. The novel GO/Co nanocomposites were applied as counter electrodes for Pt-free dye-sensitized solar cells (DSSCs). Various ratios of macrocyclic Co complexes were used as the reductant to react with the GO, with which the surface functional groups of the GO were reduced and the macrocyclic ligand of the Co complexes underwent oxidative dehydrogenation, after which the conjugated macrocyclic Co systems were grafted onto the surface of the reduced GO to form GO/Co nanocomposites. The surface morphology, material structure, and composition of the GO/Co composites and their influences on the power-conversion efficiency of DSSC devices were comprehensively investigated. The results showed that the GO/Co (1:10) counter electrode (CE) exhibited an optimal power conversion efficiency of 7.48%, which was higher than that of the Pt CE. The GO/Co (1:10) CE exhibited superior electric conductivity, catalytic capacity, and redox capacity. Because GO/Co (1:10) CEs are more efficient and cheaper than Pt CEs, they could potentially be used as a replacement for Pt electrodes.

  7. Positronium formation studies in solid molecular complexes: Triphenylphosphine oxide-triphenylmethanol

    NASA Astrophysics Data System (ADS)

    Oliveira, F. C.; Denadai, A. M. L.; Fulgêncio, F. H.; Magalhães, W. F.; Alcântara, A. F. C.; Windmöller, D.; Machado, J. C.

    2012-06-01

Positronium formation in triphenylphosphine oxide (TPPO), triphenylmethanol (TPM), and systems [TPPO(1-X)·TPM(X)] has been studied. The low probability of positronium formation in complex [TPPO(0.5)·TPM(0.5)] was attributed to strong hydrogen bond and sixfold phenyl embrace interactions. These strong interactions in the complex reduce the possibility of the n- and π-electrons interacting with positrons on the spur, and consequently the probability of positronium formation is lower. The τ3 parameter and free volume (correlated to τ3) were also sensitive to the formation of hydrogen bonds and sixfold phenyl embrace interactions within the complex. For the physical mixture the positron annihilation parameters remained unchanged throughout the composition range.

  8. The application of sensitivity analysis to models of large scale physiological systems

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1974-01-01

    A survey of the literature of sensitivity analysis as it applies to biological systems is reported, along with a brief development of sensitivity theory. A simple population model and a more complex thermoregulatory model illustrate the investigatory techniques and the interpretation of parameter sensitivity analysis. The role of sensitivity analysis in validating and verifying models, in identifying relative parameter influence, and in estimating errors in model behavior due to uncertainty in input data is presented. This analysis is valuable to the simulationist and the experimentalist in allocating resources for data collection. A method is presented for reducing highly complex, nonlinear models to simple linear algebraic models that can be useful for making rapid, first-order calculations of system behavior.
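    The parameter sensitivity analysis described above can be illustrated with a minimal sketch: normalized sensitivity coefficients S_p = (p/y)(dy/dp) estimated by central finite differences for a logistic population model. The model, parameter values, and step size below are our illustrative assumptions, not taken from the report.

```python
import math

def logistic(t, r, K, N0):
    """Logistic population growth: N(t) = K / (1 + (K/N0 - 1) * exp(-r t))."""
    return K / (1.0 + (K / N0 - 1.0) * math.exp(-r * t))

def sensitivity(model, params, name, t, h=1e-6):
    """Normalized sensitivity S = (p / y) * dy/dp, using a central
    finite difference with relative step h on the parameter `name`."""
    y = model(t, **params)
    hi = dict(params); hi[name] = params[name] * (1.0 + h)
    lo = dict(params); lo[name] = params[name] * (1.0 - h)
    dy_dp = (model(t, **hi) - model(t, **lo)) / (2.0 * h * params[name])
    return params[name] * dy_dp / y

params = {"r": 0.5, "K": 100.0, "N0": 10.0}
S_r = sensitivity(logistic, params, "r", t=5.0)  # growth-rate influence
S_K = sensitivity(logistic, params, "K", t=5.0)  # carrying-capacity influence
```

    Ranking such coefficients across parameters is what identifies where data-collection resources matter most; parameters with uniformly small sensitivities can often be frozen, which is one route to the simplified linear models the abstract mentions.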

  9. Electrogenerated chemiluminescence. 58. Ligand-sensitized electrogenerated chemiluminescence in europium labels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richter, M.M.; Bard, A.J.

    The electrochemistry and electrogenerated chemiluminescence (ECL) of a series of europium chelates, cryptates, and mixed-ligand chelate/cryptand complexes were studied. The complexes were of the following general forms: EuL4^-, where L = a β-diketonate, a bis-chelating ligand (such as dibenzoylmethide), added as the salts (A)EuL4, where A = tetrabutylammonium ion or piperidinium ion (pipH^+); Eu(crypt)^3+, where crypt = a cryptand ligand, e.g., 4,7,13,16,21-pentaoxa-1,10-diazabicyclo[8.8.5]tricosane; and Eu(crypt)(L)^2+ for the mixed-ligand systems. ECL was obtained for the chelates and mixed-ligand systems by reducing the complexes at a Pt electrode in the presence of peroxydisulfate in acetonitrile solutions and was attributed to the electron-transfer reaction between the reduced bound ligands and SO4^•-, followed by intramolecular excitation transfer from the excited ligand orbitals to the metal-centered 4f states. No ECL was observed under the same conditions for the europium complexes incorporating only the cryptand ligands in aqueous solution. The ECL spectra matched the photoluminescence spectra, with a narrow emission band observed at 612 nm corresponding to a metal-centered 4f-4f transition. The ECL efficiencies of the ECL-active species were low, about 10^-1 to 10^-4% of that of the Ru(bpy)3^2+/S2O8^2- system under similar conditions. 38 refs., 6 figs., 2 tabs.

  10. Autonomous Performance Monitoring System: Monitoring and Self-Tuning (MAST)

    NASA Technical Reports Server (NTRS)

    Peterson, Chariya; Ziyad, Nigel A.

    2000-01-01

    Maintaining the long-term performance of software onboard a spacecraft can be a major factor in the cost of operations. In particular, the task of controlling and maintaining a future mission of distributed spacecraft will undoubtedly pose a great challenge, since the complexity of multiple spacecraft flying in formation grows rapidly as the number of spacecraft in the formation increases. Eventually, new approaches will be required to develop viable control systems that can handle the complexity of the data and that are flexible, reliable, and efficient. In this paper we propose a methodology that aims to maintain the accuracy of flight software while reducing the computational complexity of software tuning tasks. The proposed Monitoring and Self-Tuning (MAST) method consists of two parts: a flight software monitoring algorithm and a tuning algorithm. The dependency on the software being monitored is mostly contained in the monitoring process, while the tuning process is a generic algorithm independent of detailed knowledge of the software. This architecture will enable MAST to be applied to different onboard software controlling various dynamics of the spacecraft, such as attitude self-calibration and formation control. An advantage of MAST over conventional techniques such as filtering or batch least squares is that the tuning algorithm uses a machine learning approach to handle uncertainty in the problem domain, reducing the overall computational complexity. The underlying concept of this technique is a reinforcement learning scheme based on cumulative probability generated from the historical performance of the system. The success of MAST will depend heavily on the reinforcement scheme used in the tuning algorithm, which must guarantee that tuning solutions exist.

  11. Individualized Levels System and Systematic Stimulus Pairing to Reduce Multiply Controlled Aggression of a Child With Autism Spectrum Disorder.

    PubMed

    Randall, Kayla R; Lambert, Joseph M; Matthews, Mary P; Houchins-Juarez, Nealetta J

    2018-05-01

    Research has shown that physical aggression is common in individuals with autism spectrum disorder (ASD). Interventions for multiply controlled aggression may be complex and difficult to implement with fidelity. As a result, the probability of treatment efficacy for this class of behavior may suffer. We designed an individualized levels system to reduce the physical aggression of an 11-year-old female with ASD. We then employed a systematic stimulus pairing procedure to facilitate generalization. Results suggest individualized levels systems can suppress multiply controlled aggression and that systematic stimulus pairing is an effective way to transfer treatment effects from trained therapists to caregivers.

  12. US EPA CSO CAPSTONE REPORT: THE CSO PROBLEM

    EPA Science Inventory

    The history of combined sewer systems (CSS) and combined sewer overflows (CSOs) in the US provides unique insights into the complex challenge faced in reducing and eliminating their adverse environmental effects. The evolution of the "modern" CSS shows how early urban drainage sys...

  13. Clear as glass: transparent financial reporting.

    PubMed

    Valletta, Robert M

    2005-08-01

    To be transparent, financial information needs to be easily accessible, timely, content-rich, and narrative. Not-for-profit hospitals and health systems should report detailed financial information quarterly. They need internal controls to reduce the level of complexity throughout the organization by creating standardized processes.

  14. A reduced complexity highly power/bandwidth efficient coded FQPSK system with iterative decoding

    NASA Technical Reports Server (NTRS)

    Simon, M. K.; Divsalar, D.

    2001-01-01

    Based on a representation of FQPSK as a trellis-coded modulation, this paper investigates the potential improvement in power efficiency obtained from the application of simple outer codes to form a concatenated coding arrangement with iterative decoding.

  15. Protocol for a mixed methods study of hospital readmissions: sensemaking in Veterans Health Administration healthcare system in the USA

    PubMed Central

    Leykum, Luci K; Noël, Polly; Finley, Erin P; Lanham, Holly Jordan; Pugh, Jacqueline

    2018-01-01

    Introduction Effective delivery of healthcare in complex systems requires managing interdependencies between professions and organisational units. Reducing 30-day hospital readmissions may be one of the most complex tasks that a healthcare system can undertake. We propose that these less than optimal outcomes are related to difficulties managing the complex interdependencies among organisational units and to a lack of effective sensemaking among individuals and organisational units regarding how best to coordinate patient needs. Methods and analysis This is a mixed method, multistepped study. We will conduct in-depth qualitative organisational case studies in 10 Veterans Health Administration facilities (6 with improving and 4 with worsening readmission rates), focusing on relationships, sensemaking and improvisation around care transition processes intended to reduce early readmissions. Data will be gathered through multiple methods (eg, chart reviews, surveys, interviews, observations) and analysed using analytic memos, qualitative coding and statistical analyses. We will construct an agent-based model based on those results to explore the influence of sensemaking and specific care transition processes on early readmissions. Ethics and dissemination Ethical approval has been obtained through the Institutional Review Board of the University of Texas Health Science Center at San Antonio (approval number: 14–258 hour). We will disseminate our findings in manuscripts in peer-reviewed journals, professional conferences and through short reports back to participating entities and stakeholders. PMID:29627815

  16. Mutualism supports biodiversity when the direct competition is weak

    PubMed Central

    Pascual-García, Alberto; Bastolla, Ugo

    2017-01-01

    A key question of theoretical ecology is which properties of ecosystems favour their stability and help maintain biodiversity. This question has recently been revisited for mutualistic systems, generating intense controversy about the role of mutualistic interactions and their network architecture. Here we show analytically and verify with simulations that reducing the effective interspecific competition and the propagation of perturbations positively influences structural stability against environmental perturbations, enhancing persistence. Notably, mutualism reduces the effective interspecific competition only when the direct interspecific competition is weaker than a critical value. This critical competition is in almost all cases larger in pollinator networks than in random networks with the same connectance. Highly connected mutualistic networks reduce the propagation of environmental perturbations, a mechanism reminiscent of MacArthur's proposal that ecosystem complexity enhances stability. Our analytic framework rationalizes previous contradictory results, and it gives valuable insight into the complex relationship between mutualism and biodiversity. PMID:28232740

  17. Study of the global positioning system for maritime concepts/applications: Study of the feasibility of replacing maritime shipborne navigation systems with NAVSTAR

    NASA Technical Reports Server (NTRS)

    Winn, C. B.; Huston, W.

    1981-01-01

    A geostationary reference satellite (REFSAT) that broadcasts updated GPS satellite coordinates every four seconds was developed. This procedure reduces the complexity of the GPS receiver. The economic and performance payoffs associated with replacing maritime shipborne navigation systems with NAVSTAR were quantified, and the use of NAVSTAR for measurements of ocean currents in the broad ocean areas of the world was evaluated.

  18. Bacterial community changes in an industrial algae production system.

    PubMed

    Fulbright, Scott P; Robbins-Pianka, Adam; Berg-Lyons, Donna; Knight, Rob; Reardon, Kenneth F; Chisholm, Stephen T

    2018-04-01

    While microalgae are a promising feedstock for production of fuels and other chemicals, a challenge for the algal bioproducts industry is obtaining consistent, robust algae growth. Algal cultures include complex bacterial communities and can be difficult to manage because specific bacteria can promote or reduce algae growth. To overcome bacterial contamination, algae growers may use closed photobioreactors designed to reduce the number of contaminant organisms. Even with closed systems, bacteria are known to enter and cohabitate, but little is known about these communities. Therefore, the richness, structure, and composition of bacterial communities were characterized in closed photobioreactor cultivations of Nannochloropsis salina in F/2 medium at different scales, across nine months spanning late summer-early spring, and during a sequence of serially inoculated cultivations. Using 16S rRNA sequence data from 275 samples, bacterial communities in small, medium, and large cultures were shown to be significantly different. Larger systems contained richer bacterial communities compared to smaller systems. Relationships between bacterial communities and algae growth were complex. On one hand, blooms of a specific bacterial type were observed in three abnormal, poorly performing replicate cultivations, while on the other, notable changes in the bacterial community structures were observed in a series of serial large-scale batch cultivations that had similar growth rates. Bacteria common to the majority of samples were identified, including a single OTU within the class Saprospirae that was found in all samples. This study contributes important information for crop protection in algae systems, and demonstrates the complex ecosystems that need to be understood for consistent, successful industrial algae cultivation. This is the first study to profile bacterial communities during the scale-up process of industrial algae systems.

  19. Coding Response to a Case-Mix Measurement System Based on Multiple Diagnoses

    PubMed Central

    Preyra, Colin

    2004-01-01

    Objective To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Data Sources Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Study Design Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Principal Findings Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Conclusions Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post. PMID:15230940

  20. Aging and efficiency in living systems: Complexity, adaptation and self-organization.

    PubMed

    Chatterjee, Atanu; Georgiev, Georgi; Iannacchione, Germano

    2017-04-01

    Living systems are open, out-of-equilibrium thermodynamic entities that maintain order by locally reducing their entropy. Aging is a process by which these systems gradually lose their ability to maintain their out-of-equilibrium state, as measured by their free-energy rate density, and hence their order. Thus, the process of aging reduces the efficiency of these systems, making them fragile and less adaptive to environmental fluctuations, gradually driving them towards the state of thermodynamic equilibrium. In this paper, we discuss the various metrics that can be used to understand the process of aging from a complexity science perspective. Among all the metrics that we propose, action efficiency is observed to be of key interest, as it can be used to quantify order and self-organization in any physical system. Based upon our arguments, we present the dependency of the other metrics on the action efficiency of a system, and also argue how each of the metrics influences all the other system variables. In order to support our claims, we draw parallels between technological progress and biological growth. Such parallels are used to support the universal applicability of the metrics and the methodology presented in this paper. Therefore, the results and the arguments presented in this paper throw light on the finer nuances of the science of aging. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Drag reduction in hydrocarbon-aluminum soap polymer systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakin, J.L.; Lee, K.C.

    1972-01-01

    While the drag-reducing capability of solutions of aluminum soap in hydrocarbons in turbulent flow has been known for over 20 yr, investigations of the effects of concentration, soap type, and aging on drag reduction have only recently begun. The effects of aging, shear stress, and the presence of peptizers on drag reduction of hydrocarbon dispersions of aluminum soaps at relatively low concentrations were studied. These systems showed an apparent upper critical shear stress above which drag reduction was gradually lost. Degradation of the soap micelle structure occurred relatively rapidly above this point and recovery was slow. The effect of peptizers is complex: in some situations it enhanced, and in others reduced, the drag-reducing ability of the soap polymers. (13 refs.)

  2. Recombination Catalysts for Hypersonic Fuels

    NASA Technical Reports Server (NTRS)

    Chinitz, W.

    1998-01-01

    The goal of commercially-viable access to space will require technologies that reduce propulsion system weight and complexity, while extracting maximum energy from the products of combustion. This work is directed toward developing effective nozzle recombination catalysts for the supersonic and hypersonic aeropropulsion engines used to provide such access to space. Effective nozzle recombination will significantly reduce nozzle length (hence, propulsion system weight) and reduce fuel requirements, further decreasing the vehicle's gross lift-off weight. Two such catalysts have been identified in this work, barium and antimony compounds, by developing chemical kinetic reaction mechanisms for these materials and determining the engine performance enhancement for a typical flight trajectory. Significant performance improvements are indicated, using only 2% (mole or mass) of these compounds in the combustor product gas.

  3. Effectiveness of biplane angiography compared to monoplane angiography for vascular neuro-interventions: a systematic review of the literature.

    PubMed

    Bellemare, C A; Poder, T G

    2017-07-01

    To compare biplane technology to monoplane technology for vascular neuro-intervention. A systematic review of the literature in MEDLINE (via PubMed), Scopus, and ScienceDirect was conducted without date or language restrictions. The Downs and Black quality-assessment checklist was used. The findings of this systematic review were combined with local and Canadian data. The nine articles selected for analysis had a very low level of evidence. The studies report that the biplane system appears to reduce ionising radiation and medical complications as well as shorten procedure time. Most major hospitals in Canada use the biplane system. The biplane system could improve the operator's confidence, which could translate into reduced risk, especially for more complex procedures. The superiority of the biplane system cannot be scientifically proven based on the data in the literature. Nevertheless, given the advantages that a biplane system can provide in terms of safety, quality of care, support to university teaching programmes based on best practices, enhanced capability in performing complex procedures, this technology should be implemented with a responsibility to collect outcome data to optimise the clinical protocol regarding the dose of ionising radiation delivered. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  4. Simplified stereo-optical ultrasound plane calibration

    NASA Astrophysics Data System (ADS)

    Hoßbach, Martin; Noll, Matthias; Wesarg, Stefan

    2013-03-01

    Image guided therapy is a natural concept and commonly used in medicine. In anesthesia, a common task is the injection of an anesthetic close to a nerve under freehand ultrasound guidance. Several guidance systems exist using electromagnetic tracking of the ultrasound probe as well as the needle, providing the physician with a precise projection of the needle into the ultrasound image. This, however, requires additional expensive devices. We suggest using optical tracking with miniature cameras attached to a 2D ultrasound probe to achieve a higher acceptance among physicians. The purpose of this paper is to present an intuitive method to calibrate freehand ultrasound needle guidance systems employing a rigid stereo camera system. State of the art methods are based on a complex series of error-prone coordinate system transformations, which makes them susceptible to error accumulation. By reducing the calibration to a single procedure we provide a calibration method that is equivalent, yet not prone to error accumulation. It requires a linear calibration object and is validated on three datasets utilizing different calibration objects: a 6 mm metal bar and a 1.25 mm biopsy needle were used for the experiments. Compared to existing calibration methods for freehand ultrasound needle guidance systems, we are able to achieve higher accuracy while additionally reducing the overall calibration complexity.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanders, L.K.; Xian, W.; Guaqueta, C.

    The aim for deterministic control of the interactions between macroions in aqueous media has motivated widespread experimental and theoretical work. Although it has been well established that like-charged macromolecules can aggregate under the influence of oppositely charged condensing agents, the specific conditions for the stability of such aggregates can only be determined empirically. We examine these conditions, which involve an interplay of electrostatic and osmotic effects, by using a well defined model system composed of F-actin, an anionic rod-like polyelectrolyte, and lysozyme, a cationic globular protein with a charge that can be genetically modified. The structure and stability of actin-lysozyme complexes for different lysozyme charge mutants and salt concentrations are examined by using synchrotron x-ray scattering and molecular dynamics simulations. We provide evidence that supports a structural transition from columnar arrangements of F-actin held together by arrays of lysozyme at the threefold interstitial sites of the actin sublattice to marginally stable complexes in which lysozyme resides at twofold bridging sites between actin. The reduced stability arises from strongly reduced partitioning of salt between the complex and the surrounding solution. Changes in the stability of actin-lysozyme complexes are of biomedical interest because their formation has been reported to contribute to the persistence of airway infections in cystic fibrosis by sequestering antimicrobials such as lysozyme. We present x-ray microscopy results that argue for the existence of actin-lysozyme complexes in cystic fibrosis sputum and demonstrate that, for a wide range of salt conditions, charge-reduced lysozyme is not sequestered in ordered complexes while retaining its bacterial killing activity.

  6. Extension of optical lithography by mask-litho integration with computational lithography

    NASA Astrophysics Data System (ADS)

    Takigawa, T.; Gronlund, K.; Wiley, J.

    2010-05-01

    Wafer lithography process windows can be enlarged by using source mask co-optimization (SMO). Recently, SMO including freeform wafer scanner illumination sources has been developed. Freeform sources are generated by a programmable illumination system using a micro-mirror array or by custom Diffractive Optical Elements (DOE). The combination of freeform sources and complex masks generated by SMO show increased wafer lithography process window and reduced MEEF. Full-chip mask optimization using source optimized by SMO can generate complex masks with small variable feature size sub-resolution assist features (SRAF). These complex masks create challenges for accurate mask pattern writing and low false-defect inspection. The accuracy of the small variable-sized mask SRAF patterns is degraded by short range mask process proximity effects. To address the accuracy needed for these complex masks, we developed a highly accurate mask process correction (MPC) capability. It is also difficult to achieve low false-defect inspections of complex masks with conventional mask defect inspection systems. A printability check system, Mask Lithography Manufacturability Check (M-LMC), is developed and integrated with 199-nm high NA inspection system, NPI. M-LMC successfully identifies printable defects from all of the masses of raw defect images collected during the inspection of a complex mask. Long range mask CD uniformity errors are compensated by scanner dose control. A mask CD uniformity error map obtained by mask metrology system is used as input data to the scanner. Using this method, wafer CD uniformity is improved. As reviewed above, mask-litho integration technology with computational lithography is becoming increasingly important.

  7. Functional Assembly of Soluble and Membrane Recombinant Proteins of Mammalian NADPH Oxidase Complex.

    PubMed

    Souabni, Hajer; Ezzine, Aymen; Bizouarn, Tania; Baciou, Laura

    2017-01-01

    Activation of phagocyte cells of the innate immune system is associated with a massive consumption of molecular oxygen to generate highly reactive oxygen species (ROS) as microbial weapons. This is achieved by a multiprotein complex, the so-called NADPH oxidase. The activity of the phagocyte NADPH oxidase relies on an assembly of more than five proteins, among them the membrane heterodimer named flavocytochrome b558 (Cytb558), constituted by the tight association of the gp91phox (also named Nox2) and p22phox proteins. The Cytb558 is the membrane catalytic core of the NADPH oxidase complex, through which the reducing equivalents provided by NADPH are transferred via the associated prosthetic groups (one flavin and two hemes) to reduce dioxygen to the superoxide anion. The other major proteins required for the complex's activity (p47phox, p67phox, p40phox, Rac) are cytosolic. NADPH oxidase function thus relies on a synergic multi-partner assembly that can hardly be studied in vivo at the molecular level because of cellular complexity. A cell-free assay method has therefore been developed to study NADPH oxidase activity; it allows ROS generation to be measured, and eventually quantified, by optical techniques following the reduction of cytochrome c. This setup is a valuable tool for the identification of protein interactions and of components and additives crucial for a functional enzyme. Recently, this method was improved by the engineering and production of a complete recombinant NADPH oxidase complex using a combination of purified proteins expressed in bacterial and yeast host cells. Reconstitution into an artificial membrane leads to a fully controllable system that permits fine functional studies.
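    The optical readout described above (reduction of cytochrome c) translates into a superoxide production rate through the Beer-Lambert law, commonly using a reduced-minus-oxidized extinction coefficient of about 21.1 mM⁻¹ cm⁻¹ at 550 nm. The helper below is an illustrative sketch of that arithmetic (the function name and example numbers are ours, not part of the published protocol):

```python
def superoxide_rate_uM_per_min(dA550_per_min, path_cm=1.0, eps_mM_cm=21.1):
    """Convert the cytochrome c absorbance slope at 550 nm into a
    superoxide production rate via Beer-Lambert: c = A / (eps * l).
    eps_mM_cm: reduced-minus-oxidized extinction coefficient (mM^-1 cm^-1).
    Returns micromolar superoxide reduced-equivalents per minute."""
    return dA550_per_min / (eps_mM_cm * path_cm) * 1000.0  # mM -> uM

# Example: a slope of 0.0211 absorbance units/min in a 1 cm cuvette
rate = superoxide_rate_uM_per_min(0.0211)  # -> 1.0 uM/min
```

    In practice the assay is run with and without superoxide dismutase, and only the dismutase-inhibitable part of the slope is attributed to superoxide.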

  8. Advanced Spectroscopic and Thermal Imaging Instrumentation for Shock Tube and Ballistic Range Facilities

    DTIC Science & Technology

    2010-04-01

    the development process, increase its quality and reduce development time through automation of synthesis, analysis or verification. For this purpose...made of time-non-deterministic systems, improving efficiency and reducing complexity of formal analysis . We also show how our theory relates to, and...of the most recent investigations for Earth and Mars atmospheres will be discussed in the following sections. 2.4.1 Earth: lunar return NASA’s

  9. Adaptive decoding of convolutional codes

    NASA Astrophysics Data System (ADS)

    Hueske, K.; Geldmacher, J.; Götze, J.

    2007-06-01

    Convolutional codes, which are frequently used as error-correction codes in digital transmission systems, are generally decoded using the Viterbi decoder. On the one hand, the Viterbi decoder is an optimum maximum-likelihood decoder, i.e., the most probable transmitted code sequence is obtained. On the other hand, the mathematical complexity of the algorithm depends only on the code used, not on the number of transmission errors. To reduce the complexity of the decoding process under good transmission conditions, an alternative syndrome-based decoder is presented. The reduction of complexity is realized by two different approaches, syndrome zero-sequence deactivation and path metric equalization. The two approaches enable an easy adaptation of the decoding complexity to different transmission conditions, resulting in a trade-off between decoding complexity and error-correction performance.
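    For context, the maximum-likelihood baseline discussed above can be sketched as a minimal hard-decision Viterbi decoder for the common rate-1/2, constraint-length-3 code (generator polynomials 7 and 5 octal); the specific code and the flush-bit termination are illustrative choices, not details from the paper.

```python
def conv_encode(bits):
    """Rate-1/2, constraint-length-3 convolutional encoder
    (generators 7 and 5 octal); two zero flush bits terminate
    the trellis in state 0."""
    s, out = 0, []
    for b in bits + [0, 0]:
        out.append(b ^ (s >> 1) ^ (s & 1))  # G1 = 111 (octal 7)
        out.append(b ^ (s & 1))             # G2 = 101 (octal 5)
        s = (b << 1) | (s >> 1)             # shift the new bit in
    return out

def viterbi_decode(received, n_bits):
    """Hard-decision Viterbi decoding with Hamming branch metrics."""
    INF = float("inf")
    pm = [0, INF, INF, INF]          # path metric per state; start in state 0
    paths = [[], [], [], []]
    for t in range(n_bits + 2):      # data bits plus two flush bits
        r1, r2 = received[2 * t], received[2 * t + 1]
        new_pm, new_paths = [INF] * 4, [[]] * 4
        for s in range(4):
            if pm[s] == INF:
                continue
            for b in (0, 1):
                o1 = b ^ (s >> 1) ^ (s & 1)   # expected channel symbols
                o2 = b ^ (s & 1)
                ns = (b << 1) | (s >> 1)      # next state
                m = pm[s] + (o1 != r1) + (o2 != r2)
                if m < new_pm[ns]:            # keep only the survivor path
                    new_pm[ns] = m
                    new_paths[ns] = paths[s] + [b]
        pm, paths = new_pm, new_paths
    return paths[0][:n_bits]         # terminated trellis ends in state 0
```

    Note that the per-step work is the same regardless of channel quality: every state and branch is examined at every step. That fixed cost is exactly what a syndrome-based decoder can avoid when few transmission errors occur.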

  10. Ascorbate Efflux as a New Strategy for Iron Reduction and Transport in Plants*

    PubMed Central

    Grillet, Louis; Ouerdane, Laurent; Flis, Paulina; Hoang, Minh Thi Thanh; Isaure, Marie-Pierre; Lobinski, Ryszard; Curie, Catherine; Mari, Stéphane

    2014-01-01

    Iron (Fe) is essential for virtually all living organisms. The identification of the chemical forms of iron (the speciation) circulating in and between cells is crucial to further understand the mechanisms of iron delivery to its final targets. Here we analyzed how iron is transported to the seeds by the chemical identification of iron complexes that are delivered to embryos, followed by the biochemical characterization of the transport of these complexes by the embryo, using the pea (Pisum sativum) as a model species. We have found that iron circulates as ferric complexes with citrate and malate (Fe(III)3Cit2Mal2, Fe(III)3Cit3Mal1, Fe(III)Cit2). Because dicotyledonous plants only transport ferrous iron, we checked whether embryos were capable of reducing iron of these complexes. Indeed, embryos did express a constitutively high ferric reduction activity. Surprisingly, iron(III) reduction is not catalyzed by the expected membrane-bound ferric reductase. Instead, embryos efflux high amounts of ascorbate that chemically reduce iron(III) from citrate-malate complexes. In vitro transport experiments on isolated embryos using radiolabeled 55Fe demonstrated that this ascorbate-mediated reduction is an obligatory step for the uptake of iron(II). Moreover, the ascorbate efflux activity was also measured in Arabidopsis embryos, suggesting that this new iron transport system may be generic to dicotyledonous plants. Finally, in embryos of the ascorbate-deficient mutants vtc2-4, vtc5-1, and vtc5-2, the reducing activity and the iron concentration were reduced significantly. Taken together, our results identified a new iron transport mechanism in plants that could play a major role to control iron loading in seeds. PMID:24347170

  11. Ascorbate efflux as a new strategy for iron reduction and transport in plants.

    PubMed

    Grillet, Louis; Ouerdane, Laurent; Flis, Paulina; Hoang, Minh Thi Thanh; Isaure, Marie-Pierre; Lobinski, Ryszard; Curie, Catherine; Mari, Stéphane

    2014-01-31

    Iron (Fe) is essential for virtually all living organisms. The identification of the chemical forms of iron (the speciation) circulating in and between cells is crucial to further understand the mechanisms of iron delivery to its final targets. Here we analyzed how iron is transported to the seeds by the chemical identification of iron complexes that are delivered to embryos, followed by the biochemical characterization of the transport of these complexes by the embryo, using the pea (Pisum sativum) as a model species. We have found that iron circulates as ferric complexes with citrate and malate (Fe(III)3Cit2Mal2, Fe(III)3Cit3Mal1, Fe(III)Cit2). Because dicotyledonous plants only transport ferrous iron, we checked whether embryos were capable of reducing iron of these complexes. Indeed, embryos did express a constitutively high ferric reduction activity. Surprisingly, iron(III) reduction is not catalyzed by the expected membrane-bound ferric reductase. Instead, embryos efflux high amounts of ascorbate that chemically reduce iron(III) from citrate-malate complexes. In vitro transport experiments on isolated embryos using radiolabeled (55)Fe demonstrated that this ascorbate-mediated reduction is an obligatory step for the uptake of iron(II). Moreover, the ascorbate efflux activity was also measured in Arabidopsis embryos, suggesting that this new iron transport system may be generic to dicotyledonous plants. Finally, in embryos of the ascorbate-deficient mutants vtc2-4, vtc5-1, and vtc5-2, the reducing activity and the iron concentration were reduced significantly. Taken together, our results identified a new iron transport mechanism in plants that could play a major role to control iron loading in seeds.

  12. Complexity in neuronal noise depends on network interconnectivity.

    PubMed

    Serletis, Demitre; Zalay, Osbert C; Valiante, Taufik A; Bardakjian, Berj L; Carlen, Peter L

    2011-06-01

    "Noise," or noise-like activity (NLA), defines background electrical membrane potential fluctuations at the cellular level of the nervous system, comprising an important aspect of brain dynamics. Using whole-cell voltage recordings from fast-spiking stratum oriens interneurons and stratum pyramidale neurons located in the CA3 region of the intact mouse hippocampus, we applied complexity measures from dynamical systems theory (i.e., 1/f(γ) noise and correlation dimension) and found evidence for complexity in neuronal NLA, ranging from high- to low-complexity dynamics. Importantly, these high- and low-complexity signal features were largely dependent on gap junction and chemical synaptic transmission. Progressive neuronal isolation from the surrounding local network via gap junction blockade (abolishing gap junction-dependent spikelets) and then chemical synaptic blockade (abolishing excitatory and inhibitory post-synaptic potentials), or the reverse order of these treatments, resulted in emergence of high-complexity NLA dynamics. Restoring local network interconnectivity via blockade washout resulted in resolution to low-complexity behavior. These results suggest that the observed increase in background NLA complexity is the result of reduced network interconnectivity, thereby highlighting the potential importance of the NLA signal to the study of network state transitions arising in normal and abnormal brain dynamics (such as in epilepsy, for example).

  13. Cyclic Peptide-Polymer Nanotubes as Efficient and Highly Potent Drug Delivery Systems for Organometallic Anticancer Complexes.

    PubMed

    Larnaudie, Sophie C; Brendel, Johannes C; Romero-Canelón, Isolda; Sanchez-Cano, Carlos; Catrouillet, Sylvain; Sanchis, Joaquin; Coverdale, James P C; Song, Ji-Inn; Habtemariam, Abraha; Sadler, Peter J; Jolliffe, Katrina A; Perrier, Sébastien

    2018-01-08

    Functional drug carrier systems have potential for increasing solubility and potency of drugs while reducing side effects. Complex polymeric materials, particularly anisotropic structures, are especially attractive due to their long circulation times. Here, we have conjugated cyclic peptides to the biocompatible polymer poly(2-hydroxypropyl methacrylamide) (pHPMA). The resulting conjugates were functionalized with organoiridium anticancer complexes. Small angle neutron scattering and static light scattering confirmed their self-assembly and elongated cylindrical shape. Drug-loaded nanotubes exhibited more potent antiproliferative activity toward human cancer cells than either free drug or the drug-loaded polymers, while the nanotubes themselves were nontoxic. Cellular accumulation studies revealed that the increased potency of the conjugate appears to be related to a more efficient mode of action rather than a higher cellular accumulation of iridium.

  14. Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
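The "reduce then sample" idea can be shown in miniature: fit a cheap surrogate to an expensive forward model once, then run MCMC against the surrogate. This toy sketch is my own construction (SAGUARO's reduced-order models and adjoint machinery are far more sophisticated); it uses a polynomial surrogate and a random-walk Metropolis sampler:

```python
import numpy as np

rng = np.random.default_rng(4)

def forward(theta):                    # stand-in for a costly forward simulation
    return np.tanh(theta) + 0.2 * theta

# Offline "model reduction": a polynomial surrogate fitted on the prior range.
grid = np.linspace(-3, 3, 50)
coeffs = np.polyfit(grid, forward(grid), 9)
def surrogate(theta):
    return np.polyval(coeffs, theta)

y_obs, sigma = forward(0.8), 0.05      # synthetic observation at theta = 0.8

def log_post(theta):                   # flat prior on [-3, 3]
    if abs(theta) > 3:
        return -np.inf
    return -0.5 * ((y_obs - surrogate(theta)) / sigma) ** 2

# Random-walk Metropolis, sampling against the cheap surrogate likelihood.
theta, samples = 0.0, []
for _ in range(20000):
    prop = theta + 0.3 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
post_mean = float(np.mean(samples[5000:]))
print(f"posterior mean: {post_mean:.2f} (true parameter 0.8)")
```

Every likelihood evaluation inside the loop costs a polynomial evaluation rather than a forward solve, which is the source of the computational savings the abstract describes.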

  15. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghattas, Omar

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  16. Modeling Reduced Human Performance as a Complex Adaptive System

    DTIC Science & Technology

    2003-09-01

    …successfully used this design strategy in the domain of military simulation. See (Arntzen 1998; Bohmann 1999; Le 1999; Schrepf 1999).

  17. A joint precoding scheme for indoor downlink multi-user MIMO VLC systems

    NASA Astrophysics Data System (ADS)

    Zhao, Qiong; Fan, Yangyu; Kang, Bochao

    2017-11-01

    In this study, we aim to improve the system performance and reduce the implementation complexity of precoding schemes for visible light communication (VLC) systems. By combining the power-method algorithm and the block diagonalization (BD) algorithm, we propose a joint precoding scheme for indoor downlink multi-user multiple-input-multiple-output (MU-MIMO) VLC systems. In this scheme, we first apply the BD algorithm to eliminate the co-channel interference (CCI) among users. Then, the power-method algorithm is used to search for the precoding weights of each user based on the criterion of signal-to-interference-plus-noise ratio (SINR) maximization. Finally, the optical power restrictions of VLC systems are taken into account to constrain the precoding weight matrix. Comprehensive computer simulations in two scenarios indicate that the proposed scheme always achieves better bit error rate (BER) performance and lower computational complexity than the traditional scheme.
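The block-diagonalization step can be sketched numerically: each user's precoder is restricted to the null space of the other users' stacked channels, which zeroes the CCI by construction. A minimal sketch with hypothetical dimensions (the paper's power-method weight search and optical power constraints are omitted):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy MU-MIMO setting: 8 transmit elements, 3 users with 2 receivers each.
n_tx, n_rx, n_users = 8, 2, 3
H = [rng.standard_normal((n_rx, n_tx)) for _ in range(n_users)]

precoders = []
for k in range(n_users):
    # Stack every channel except user k's.
    H_others = np.vstack([H[j] for j in range(n_users) if j != k])
    # Right singular vectors beyond the rank span the null space of H_others.
    _, _, Vt = np.linalg.svd(H_others)
    null_basis = Vt[np.linalg.matrix_rank(H_others):].T   # n_tx x null_dim
    precoders.append(null_basis)

# CCI check: user j's channel applied to user k's precoder is ~0 for j != k.
leak = max(np.abs(H[j] @ precoders[k]).max()
           for j in range(n_users) for k in range(n_users) if j != k)
print(f"worst-case inter-user leakage: {leak:.2e}")
```

With the interference nulled this way, each user's weights can then be optimized independently, which is where the paper's power-method SINR search would take over.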

  18. Understanding Water-Stress Responses in Soybean Using Hydroponics System-A Systems Biology Perspective.

    PubMed

    Tripathi, Prateek; Rabara, Roel C; Shulaev, Vladimir; Shen, Qingxi J; Rushton, Paul J

    2015-01-01

    Deleterious changes in environmental conditions such as water stress bring physiological and biochemical changes in plants, which result in crop loss. Thus, combating water stress is important for crop improvement to meet the needs of a growing population. The use of hydroponics systems for growing plants is questionable to some researchers, as it does not represent actual field conditions. However, to address a complex problem like water stress, we have to use a simpler growing condition such as a hydroponics system, in which every input given to the plants can be controlled. Even with the advent of high-throughput technologies, it remains challenging to address all levels of the genetic machinery, whether a gene, protein, metabolite, or promoter. Thus, using a system of reduced complexity like hydroponics can certainly direct us toward the right candidates, if not completely resolve the issue.

  19. A microcontroller-based microwave free-space measurement system for permittivity determination of lossy liquid materials.

    PubMed

    Hasar, U C

    2009-05-01

    A microcontroller-based noncontact and nondestructive microwave free-space measurement system for real-time and dynamic determination of complex permittivity of lossy liquid materials has been proposed. The system is comprised of two main sections--microwave and electronic. While the microwave section provides for measuring only the amplitudes of reflection coefficients, the electronic section processes these data and determines the complex permittivity using a general purpose microcontroller. The proposed method eliminates elaborate liquid sample holder preparation and only requires microwave components to perform reflection measurements from one side of the holder. In addition, it explicitly determines the permittivity of lossy liquid samples from reflection measurements at different frequencies without any knowledge on sample thickness. In order to reduce systematic errors in the system, we propose a simple calibration technique, which employs simple and readily available standards. The measurement system can be a good candidate for industrial-based applications.

  20. Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-06-01

    The article describes a method for evaluating information to solve problems of control, monitoring, and diagnostics. The method is needed to reduce the dimensionality of informational indicators of situations, bring them to relative units, calculate generalized information indicators on their basis, rank them by characteristic levels, and calculate the efficiency criterion of a system functioning in real time. On its basis, the design of an information evaluation system has been developed that allows analyzing, processing, and assessing information about an object; such an object can be a complex technical, economic, or social system. The method, and systems based on it, can find wide application in the analysis, processing, and evaluation of information on the functioning of systems, regardless of their purpose, goals, tasks, and complexity. For example, they can be used to assess the innovation capacities of industrial enterprises and management decisions.

  1. Application of Water Evaluation and Planning Model for Integrated Water Resources Management: Case Study of Langat River Basin, Malaysia

    NASA Astrophysics Data System (ADS)

    Leong, W. K.; Lai, S. H.

    2017-06-01

    Due to the effects of climate change and the increasing demand for water, sustainable water resources management has become a major challenge. In this context, the application of simulation models is useful for dealing with the uncertainty and complexity of water systems by providing stakeholders with the best solution. This paper outlines an integrated management planning network developed with the Water Evaluation and Planning (WEAP) model to evaluate the current and future water management system of the Langat River Basin, Malaysia, under various scenarios. The WEAP model is an integrated decision support system that investigates major stresses on demand and supply in terms of water availability at catchment scale. In fact, WEAP can simulate complex systems including various sectors within a single catchment or a transboundary river system. To construct the model, taking account of the Langat catchment and the corresponding demand points, we divided the hydrological model into 10 sub-catchments and 17 demand points, including the export of treated water to major cities outside the catchment. The model is calibrated and verified with several quantitative statistics (coefficient of determination, R2; Nash-Sutcliffe efficiency, NSE; and percent bias, PBIAS). The trend of supply and demand in the catchment is evaluated under three scenarios to 2050: (1) population growth rate, (2) demand side management (DSM), and (3) a combination of DSM and reduced non-revenue water (NRW). Results show that by reducing NRW and applying proper DSM, unmet demand can be reduced significantly.
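The calibration statistics named above have standard definitions; a small sketch of how they might be computed (with hypothetical observed/simulated flows, not data from the Langat study):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; <0 is worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias; positive means underestimation under the sum(obs-sim)/sum(obs) convention."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def r2(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Hypothetical monthly flows, purely for illustration.
obs = np.array([10.0, 12.0, 15.0, 11.0, 9.0])
sim = np.array([10.5, 11.5, 14.0, 11.0, 9.5])
print(nse(obs, sim), pbias(obs, sim), r2(obs, sim))
```

Reporting all three together is common practice because NSE is dominated by peak errors, PBIAS captures systematic volume error, and R2 ignores bias entirely.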

  2. Nonlinear Focal Modulation Microscopy.

    PubMed

    Zhao, Guangyuan; Zheng, Cheng; Kuang, Cuifang; Zhou, Renjie; Kabir, Mohammad M; Toussaint, Kimani C; Wang, Wensheng; Xu, Liang; Li, Haifeng; Xiu, Peng; Liu, Xu

    2018-05-11

    We demonstrate nonlinear focal modulation microscopy (NFOMM) to achieve superresolution imaging. Traditional approaches to superresolution that utilize point scanning often rely on spatially reducing the size of the emission pattern by directly narrowing (e.g., through minimizing the detection pinhole in Airyscan, Zeiss) or indirectly peeling its outer profiles [e.g., through depleting the outer emission region in stimulated emission depletion (STED) microscopy]. We show that an alternative conceptualization that focuses on maximizing the optical system's frequency shifting ability offers advantages in further improving resolution while reducing system complexity. In NFOMM, a spatial light modulator and a suitably intense laser illumination are used to implement nonlinear focal-field modulation to achieve a transverse spatial resolution of ∼60  nm (∼λ/10). We show that NFOMM is comparable with STED microscopy and suitable for fundamental biology studies, as evidenced in imaging nuclear pore complexes, tubulin and vimentin in Vero cells. Since NFOMM is readily implemented as an add-on module to a laser-scanning microscope, we anticipate wide utility of this new imaging technique.

  3. [Assessment of prophylaxis and treatment of blood loss in patients with pre-eclampsia].

    PubMed

    Timokhova, S Iu; Golubtsov, V V; Zabolotskikh, I B

    2014-01-01

    Objective: to improve treatment results in women with massive obstetric blood loss. Subjects and methods: 96 female patients with moderate and severe preeclampsia complicated by developing massive blood loss were involved in the investigation. The women were divided into two groups: the main (basic) group (n=55), whose patients were treated with the proposed complex of measures, and the control group (n=41), whose patients were evaluated retrospectively. During the investigation, hemostasis system parameters and peripheral blood values were evaluated dynamically, and acid-base state, water-electrolyte balance parameters, and medical history were monitored. The investigation found that application of the proposed complex of measures for reducing massive obstetric blood loss accelerates the restoration of clinical and biochemical parameters during the early postoperative period. Application of the proposed methods reduced intraoperative blood loss in women with preeclampsia, the use of blood components, and the time spent on hemostasis system correction for all the women of the main group.

  4. Nonlinear Focal Modulation Microscopy

    NASA Astrophysics Data System (ADS)

    Zhao, Guangyuan; Zheng, Cheng; Kuang, Cuifang; Zhou, Renjie; Kabir, Mohammad M.; Toussaint, Kimani C.; Wang, Wensheng; Xu, Liang; Li, Haifeng; Xiu, Peng; Liu, Xu

    2018-05-01

    We demonstrate nonlinear focal modulation microscopy (NFOMM) to achieve superresolution imaging. Traditional approaches to superresolution that utilize point scanning often rely on spatially reducing the size of the emission pattern by directly narrowing (e.g., through minimizing the detection pinhole in Airyscan, Zeiss) or indirectly peeling its outer profiles [e.g., through depleting the outer emission region in stimulated emission depletion (STED) microscopy]. We show that an alternative conceptualization that focuses on maximizing the optical system's frequency shifting ability offers advantages in further improving resolution while reducing system complexity. In NFOMM, a spatial light modulator and a suitably intense laser illumination are used to implement nonlinear focal-field modulation to achieve a transverse spatial resolution of ˜60 nm (˜λ /10 ). We show that NFOMM is comparable with STED microscopy and suitable for fundamental biology studies, as evidenced in imaging nuclear pore complexes, tubulin and vimentin in Vero cells. Since NFOMM is readily implemented as an add-on module to a laser-scanning microscope, we anticipate wide utility of this new imaging technique.

  5. Low-Complexity Polynomial Channel Estimation in Large-Scale MIMO With Arbitrary Statistics

    NASA Astrophysics Data System (ADS)

    Shariati, Nafiseh; Bjornson, Emil; Bengtsson, Mats; Debbah, Merouane

    2014-10-01

    This paper considers pilot-based channel estimation in large-scale multiple-input multiple-output (MIMO) communication systems, also known as massive MIMO, where there are hundreds of antennas at one side of the link. Motivated by the fact that computational complexity is one of the main challenges in such systems, a set of low-complexity Bayesian channel estimators, coined Polynomial ExpAnsion CHannel (PEACH) estimators, are introduced for arbitrary channel and interference statistics. While the conventional minimum mean square error (MMSE) estimator has cubic complexity in the dimension of the covariance matrices, due to an inversion operation, our proposed estimators significantly reduce this to square complexity by approximating the inverse by an L-degree matrix polynomial. The coefficients of the polynomial are optimized to minimize the mean square error (MSE) of the estimate. We show numerically that near-optimal MSEs are achieved with low polynomial degrees. We also derive the exact computational complexity of the proposed estimators, in terms of floating-point operations (FLOPs), by which we prove that the proposed estimators outperform the conventional estimators in large-scale MIMO systems of practical dimensions while providing reasonable MSEs. Moreover, we show that L need not scale with the system dimensions to maintain a certain normalized MSE. By analyzing different interference scenarios, we observe that the relative MSE loss of using the low-complexity PEACH estimators is smaller in realistic scenarios with pilot contamination. On the other hand, PEACH estimators are not well suited for noise-limited scenarios with high pilot power; therefore, we also introduce the low-complexity diagonalized estimator that performs well in this regime. Finally, we ...
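The core trick, replacing a matrix inverse with a low-degree matrix polynomial applied through repeated matrix-vector products, can be illustrated with a plain truncated Neumann series. This is a sketch only: the actual PEACH estimators optimize the polynomial coefficients for minimum MSE rather than using Neumann weights.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy SPD matrix standing in for a channel covariance.
n = 50
A = rng.standard_normal((n, n))
C = A @ A.T / n + np.eye(n)           # symmetric positive definite
b = rng.standard_normal(n)

# Scale so the eigenvalues of alpha*C lie in (0, 2); then
# inv(C) b = alpha * sum_k (I - alpha*C)^k b converges, using only
# matrix-vector products (square, not cubic, cost per term).
alpha = 1.0 / np.linalg.norm(C, 2)
x, term = np.zeros(n), alpha * b
for _ in range(200):                   # degree-L polynomial in C applied to b
    x += term
    term = term - alpha * (C @ term)   # term <- (I - alpha*C) @ term

exact = np.linalg.solve(C, b)
err = np.linalg.norm(x - exact) / np.linalg.norm(exact)
print(f"relative error of polynomial approximation: {err:.2e}")
```

In practice a far lower degree suffices when the coefficients are optimized, which is exactly the point of the paper's MSE-optimal polynomial design.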

  6. Fundamental concepts of structural loading and load relief techniques for the space shuttle

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.; Mowery, D. K.; Winder, S. W.

    1972-01-01

    The prediction of flight loads and their potential reduction, using various control system logics for the space shuttle vehicles, is discussed. Some factors not found on previous launch vehicles that increase the complexity are large lifting surfaces, unsymmetrical structure, unsymmetrical aerodynamics, trajectory control system coupling, and large aeroelastic effects. These load-producing factors and load-reducing techniques are analyzed.

  7. Cyber Hygiene for Control System Security

    DOE PAGES

    Oliver, David

    2015-10-08

    There are many resources from government and private industry available to assist organizations in reducing their attack surface and enhancing their security posture. Furthermore, standards are being written and improved upon to make the practice of securing a network more manageable. And while the specifics of network security are complex, most system vulnerabilities can be mitigated using fairly simple cyber hygiene techniques like those offered above.

  8. Are Models Easier to Understand than Code? An Empirical Study on Comprehension of Entity-Relationship (ER) Models vs. Structured Query Language (SQL) Code

    ERIC Educational Resources Information Center

    Sanchez, Pablo; Zorrilla, Marta; Duque, Rafael; Nieto-Reyes, Alicia

    2011-01-01

    Models in Software Engineering are considered as abstract representations of software systems. Models highlight relevant details for a certain purpose, whereas irrelevant ones are hidden. Models are supposed to make system comprehension easier by reducing complexity. Therefore, models should play a key role in education, since they would ease the…

  9. Metal-air cell with performance enhancing additive

    DOEpatents

    Friesen, Cody A; Buttry, Daniel

    2015-11-10

    Systems and methods drawn to an electrochemical cell comprising a low temperature ionic liquid comprising positive ions and negative ions and a performance enhancing additive added to the low temperature ionic liquid. The additive dissolves in the ionic liquid to form cations, which are coordinated with one or more negative ions forming ion complexes. The electrochemical cell also includes an air electrode configured to absorb and reduce oxygen. The ion complexes improve oxygen reduction thermodynamics and/or kinetics relative to the ionic liquid without the additive.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cresap, D.A.; Halverson, D.S.

    In the Fluorinel Dissolution Process (FDP) upgrade, excess hydrofluoric acid in the dissolver product must be complexed with aluminum nitrate (ANN) to eliminate corrosion concerns, adjusted with nitrate to facilitate extraction, and diluted with water to ensure solution stability. This is currently accomplished via batch processing in large vessels. However, to accommodate increases in projected throughput and reduce water production in a cost-effective manner, a semi-continuous system (In-line Complexing (ILC)) has been developed. The major conclusions drawn from tests demonstrating the feasibility of this concept are given in this report.

  11. Experimental demonstration of non-iterative interpolation-based partial ICI compensation in 100G RGI-DP-CO-OFDM transport systems.

    PubMed

    Mousa-Pasandi, Mohammad E; Zhuge, Qunbi; Xu, Xian; Osman, Mohamed M; El-Sahn, Ziad A; Chagnon, Mathieu; Plant, David V

    2012-07-02

    We experimentally investigate the performance of a low-complexity non-iterative phase noise induced inter-carrier interference (ICI) compensation algorithm in reduced-guard-interval dual-polarization coherent-optical orthogonal-frequency-division-multiplexing (RGI-DP-CO-OFDM) transport systems. This interpolation-based ICI compensator estimates the time-domain phase noise samples by linear interpolation between the CPE estimates of consecutive OFDM symbols. We experimentally study the performance of this scheme for a 28 Gbaud QPSK RGI-DP-CO-OFDM system employing a low-cost distributed feedback (DFB) laser. Experimental results using a DFB laser with a linewidth of 2.6 MHz demonstrate 24% and 13% improvement in transmission reach with respect to the conventional equalizer (CE) in the presence of weak and strong dispersion-enhanced phase noise (DEPN), respectively. A brief analysis of the computational complexity of this scheme in terms of the number of required complex multiplications is provided. This practical approach does not suffer from error propagation while enjoying low computational complexity.
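The compensator's central idea, turning one common-phase-error (CPE) estimate per OFDM symbol into a per-sample phase trajectory by linear interpolation, can be sketched as follows. This is a toy carrier-only illustration with idealized per-symbol estimates, not the paper's receiver DSP chain:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical sizes: 8 OFDM symbols of 64 samples each.
n_sym, n_fft = 8, 64
t = np.arange(n_sym * n_fft)
phase = np.cumsum(0.002 * rng.standard_normal(t.size))  # random-walk phase noise
rx = np.exp(1j * phase)                                 # carrier-only toy signal

# One CPE estimate per symbol (idealized here as the mean phase over the symbol).
cpe = phase.reshape(n_sym, n_fft).mean(axis=1)
centers = (np.arange(n_sym) + 0.5) * n_fft
interp = np.interp(t, centers, cpe)                     # per-sample phase estimate
rx_comp = rx * np.exp(-1j * interp)                     # derotate the signal

resid = np.angle(rx_comp)
print(f"residual phase std: {np.std(resid):.4f} rad "
      f"(uncompensated: {np.std(phase):.4f} rad)")
```

Because the interpolation reuses CPE estimates that a conventional receiver already computes, the extra cost is essentially one `np.interp`-style pass per block, which is why the scheme is described as non-iterative and low complexity.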

  12. Program test objectives milestone 3. [Integrated Propulsion Technology Demonstrator

    NASA Technical Reports Server (NTRS)

    Gaynor, T. L.

    1994-01-01

    The following conclusions have been developed relative to propulsion system technology adequacy for efficient development and operation of recoverable and expendable launch vehicles (RLV and ELV) and the benefits which the integrated propulsion technology demonstrator will provide for enhancing technology: (1) Technology improvements relative to propulsion system design and operation can reduce program cost. Many features or improvement needs to enhance operability, reduce cost, and improve payload are identified. (2) The Integrated Propulsion Technology Demonstrator (IPTD) Program provides a means of resolving the majority of issues associated with improvement needs. (3) The IPTD will evaluate complex integration of vehicle and facility functions in fluid management and propulsion control systems, and provides an environment for validating improved mechanical and electrical components. (4) The IPTD provides a mechanism for investigating operational issues focusing on reducing manpower and time to perform various functions at the launch site. These efforts include model development, collection of data to validate subject models, and ultimate development of complex time line models. (5) The IPTD provides an engine test bed for tri/bi-propellant engine development firings which is representative of the actual vehicle environment. (6) The IPTD provides for only a limited multiengine configuration integration environment for RLV. Multiengine efforts may be simulated for a number of subsystems and a number of subsystems are relatively independent of the multiengine influences.

  13. Reduced circuit implementation of encoder and syndrome generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trager, Barry M; Winograd, Shmuel

    An error correction method and system includes an Encoder and Syndrome-generator that operate in parallel to reduce the amount of circuitry used to compute check symbols and syndromes for error correcting codes. The system and method computes the contributions to the syndromes and check symbols 1 bit at a time instead of 1 symbol at a time. As a result, the even syndromes can be computed as powers of the odd syndromes. Further, the system assigns symbol addresses so that there are, for an example GF(2^8) code which has 72 symbols, three (3) blocks of addresses which differ by a cube root of unity to allow the data symbols to be combined for reducing size and complexity of odd syndrome circuits. Further, the implementation circuit for generating check symbols is derived from the syndrome circuit using the inverse of the part of the syndrome matrix for check locations.
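The claim that even syndromes can be computed as powers of the odd syndromes follows, for binary data, from the Frobenius identity in characteristic 2: S_2j = S_j^2, since squaring distributes over sums and each bit satisfies r_i^2 = r_i. A minimal sketch over GF(2^8) (my own arithmetic using the common primitive polynomial 0x11D, not the patent's circuit):

```python
def gf_mul(a, b, poly=0x11D):
    """Multiply in GF(2^8): shift-and-add with reduction by poly."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= poly
        b >>= 1
    return r

def gf_pow(a, n):
    r = 1
    for _ in range(n):
        r = gf_mul(r, a)
    return r

def syndrome(bits, j, alpha=0x02):
    """S_j = sum_i r_i * alpha^(i*j) for a binary received word r."""
    s = 0
    for i, bit in enumerate(bits):
        if bit:
            s ^= gf_pow(alpha, i * j)
    return s

bits = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]   # arbitrary binary word
s1, s2 = syndrome(bits, 1), syndrome(bits, 2)
print(s2 == gf_mul(s1, s1))   # even syndrome equals the square of the odd one
```

The same identity chains upward (S_4 = S_2^2 = S_1^4), so a bit-serial syndrome generator only needs dedicated circuits for the odd-index syndromes.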

  14. Classified one-step high-radix signed-digit arithmetic units

    NASA Astrophysics Data System (ADS)

    Cherri, Abdallah K.

    1998-08-01

    High-radix number systems enable higher information storage density, less complexity, fewer system components, and fewer cascaded gates and operations. A simple one-step fully parallel high-radix signed-digit arithmetic is proposed for parallel optical computing based on new joint spatial encodings. This reduces hardware requirements and improves throughput by reducing the space-bandwidth product needed. The high-radix signed-digit arithmetic operations are based on classifying the neighboring input digit pairs into various groups to reduce the computation rules. A new joint spatial encoding technique is developed to represent both the operands and the computation rules. This technique increases the spatial bandwidth product of the spatial light modulators of the system. An optical implementation of the proposed high-radix signed-digit arithmetic operations is also presented. It is shown that our one-step trinary signed-digit and quaternary signed-digit arithmetic units are much simpler and better than all previously reported high-radix signed-digit techniques.

  15. A LWIR hyperspectral imager using a Sagnac interferometer and cooled HgCdTe detector array

    NASA Astrophysics Data System (ADS)

    Lucey, Paul G.; Wood, Mark; Crites, Sarah T.; Akagi, Jason

    2012-06-01

    LWIR hyperspectral imaging has a wide range of civil and military applications with its ability to sense chemical compositions at standoff ranges. Most recent implementations of this technology use spectrographs employing varying degrees of cryogenic cooling to reduce sensor self-emission that can severely limit sensitivity. We have taken an interferometric approach that promises to reduce the need for cooling while preserving high resolution. Reduced cooling has multiple benefits including faster system readiness from a power off state, lower mass, and potentially lower cost owing to lower system complexity. We coupled an uncooled Sagnac interferometer with a 256x320 mercury cadmium telluride array with an 11 micron cutoff to produce a spatial interferometric LWIR hyperspectral imaging system operating from 7.5 to 11 microns. The sensor was tested in ground-ground applications, and from a small aircraft producing spectral imagery including detection of gas emission from high vapor pressure liquids.

  16. Interpreter composition issues in the formal verification of a processor-memory module

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Cohen, Gerald C.

    1994-01-01

    This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.

  17. The Quasimonotonicity of Linear Differential Systems -The Complex Spectrum

    DTIC Science & Technology

    2001-09-12

    proper, simplicial cone determined by the columns of B (see [10]) and that C is essentially nonnegative (see [11]). In [6], Heikkilä used Perron ...a B ≥ 0 such that Ae = B−1AB is essentially nonnegative and irreducible, then Perron-Frobenius theory tells us that Ae has a real eigenvalue λ1 with...systems requires that the comparison system be quasimonotone nondecreasing with respect to a cone contained in the nonnegative orthant. For linear

  18. Optimizing Automatic Deployment Using Non-functional Requirement Annotations

    NASA Astrophysics Data System (ADS)

    Kugele, Stefan; Haberl, Wolfgang; Tautschnig, Michael; Wechs, Martin

    Model-driven development has become common practice in design of safety-critical real-time systems. High-level modeling constructs help to reduce the overall system complexity apparent to developers. This abstraction leads to fewer implementation errors in the resulting systems. In order to retain correctness of the model down to the software executed on a concrete platform, human faults during implementation must be avoided. This calls for an automatic, unattended deployment process including allocation, scheduling, and platform configuration.

  19. Global services systems - Space communication

    NASA Technical Reports Server (NTRS)

    Shepphird, F. H.; Wolbers, H. L.

    1979-01-01

    The requirements projected to the year 2000 for space-based global service systems, including both personal communications and innovative services, are developed based on historic trends and anticipated worldwide demographic and economic growth patterns. The growing demands appear to be best satisfied by developing larger, more sophisticated space systems in order to reduce the size, complexity, and expense of ground terminals. The availability of low-cost ground terminals will, in turn, further stimulate the generation of new services and new customers.

  20. Safety and Suitability for Service Assessment Testing for Aircraft Launched Munitions

    DTIC Science & Technology

    2013-07-01

    benefits in terms of cost and test efficiency that tend to associate the Analytical S3 Test Approach with complex missile systems and the... systems containing expensive, non-safety related components. c. When using the Analytical S3 Test Approach for aircraft launched bombs, full BTCA is...establish safety margin of the system. Details of the Empirical Test Flow with full and reduced BTCA options are provided in Appendix B, Annexes 3 and

  1. Magnetic bearing momentum wheels with magnetic gimballing capability for 3-axis active attitude control and energy storage

    NASA Technical Reports Server (NTRS)

    Sindlinger, R. S.

    1977-01-01

    A 3-axis active attitude control system with only one rotating part was developed using a momentum wheel with magnetic gimballing capability as a torque actuator for all three body axes. A brief description of magnetic bearing technology is given. It is concluded that based on this technology, an integrated energy storage/attitude control system with one pair of counterrotating rings could reduce the complexity and weight of conventional systems.

  2. Sequential Test Strategies for Multiple Fault Isolation

    NASA Technical Reports Server (NTRS)

    Shakeri, M.; Pattipati, Krishna R.; Raghavan, V.; Patterson-Hine, Ann; Kell, T.

    1997-01-01

    In this paper, we consider the problem of constructing near optimal test sequencing algorithms for diagnosing multiple faults in redundant (fault-tolerant) systems. The computational complexity of solving the optimal multiple-fault isolation problem is super-exponential, that is, it is much more difficult than the single-fault isolation problem, which, by itself, is NP-hard. By employing concepts from information theory and Lagrangian relaxation, we present several static and dynamic (on-line or interactive) test sequencing algorithms for the multiple fault isolation problem that provide a trade-off between the degree of suboptimality and computational complexity. Furthermore, we present novel diagnostic strategies that generate a static diagnostic directed graph (digraph), instead of a static diagnostic tree, for multiple fault diagnosis. Using this approach, the storage complexity of the overall diagnostic strategy is reduced substantially. Computational results based on real-world systems indicate that the size of a static multiple fault strategy is strictly related to the structure of the system, and that the use of an on-line multiple fault strategy can diagnose faults in systems with as many as 10,000 failure sources.
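
    The information-theoretic ingredient of such test sequencing can be illustrated with a one-step greedy rule: choose the test whose outcome is expected to reduce the entropy of the fault distribution the most. The sketch below assumes a single-fault scenario with made-up fault probabilities and pass/fail tests; the paper's algorithms for the multiple-fault case are substantially more involved.

```python
import math

def entropy(probs):
    """Shannon entropy (bits) of a discrete distribution; zero terms skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def best_next_test(prior, tests):
    """Greedy one-step lookahead: return the test with the largest expected
    reduction in fault-distribution entropy (information gain).

    prior: dict fault -> probability (single-fault assumption)
    tests: dict test name -> set of faults the test detects; the test fails
           iff the present fault is in that set
    """
    h0 = entropy(prior.values())
    best_name, best_gain = None, -1.0
    for name, detects in tests.items():
        p_fail = sum(p for f, p in prior.items() if f in detects)
        expected_h = 0.0
        for p_branch, faults in ((p_fail, detects & set(prior)),
                                 (1.0 - p_fail, set(prior) - detects)):
            if p_branch > 0:
                expected_h += p_branch * entropy(prior[f] / p_branch
                                                 for f in faults)
        gain = h0 - expected_h
        if gain > best_gain:
            best_name, best_gain = name, gain
    return best_name, best_gain
```

    A test that splits the probability mass in half yields exactly one bit of expected information, which is why such greedy rules tend to prefer balanced tests.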

  3. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
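
    The idea of searching orderings with a genetic algorithm can be conveyed with a toy permutation search that minimizes the number of feedback couplings (outputs feeding earlier processes, which force iteration). This is an illustrative evolutionary loop with swap mutation, not DeMAID's actual implementation; the process names, couplings, and cost function are hypothetical.

```python
import random

def feedback_cost(order, couplings):
    """Count couplings whose destination precedes its source in the ordering;
    each such 'backward' coupling forces iteration in the subcycle."""
    pos = {p: i for i, p in enumerate(order)}
    return sum(1 for src, dst in couplings if pos[dst] < pos[src])

def optimize_order(processes, couplings, generations=200, pop_size=20, seed=0):
    """Evolve process orderings: keep the best half of the population and
    refill it with swap-mutated copies of the survivors."""
    rng = random.Random(seed)
    pop = [rng.sample(processes, len(processes)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda o: feedback_cost(o, couplings))
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = list(parent)
            i, j = rng.sample(range(len(child)), 2)
            child[i], child[j] = child[j], child[i]   # swap mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda o: feedback_cost(o, couplings))
```

    A real cost function would weight couplings by iteration time and cost, as the DeMAID ordering does, rather than simply counting them.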

  4. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  5. The Evolution of Software and Its Impact on Complex System Design in Robotic Spacecraft Embedded Systems

    NASA Technical Reports Server (NTRS)

    Butler, Roy

    2013-01-01

    The growth in computer hardware performance, coupled with reduced energy requirements, has led to a rapid expansion of the resources available to software systems, driving them towards greater logical abstraction, flexibility, and complexity. This shift in focus from compacting functionality into a limited field towards developing layered, multi-state architectures in a grand field has both driven and been driven by the history of embedded processor design in the robotic spacecraft industry. The combinatorial growth of interprocess conditions is accompanied by benefits (concurrent development, situational autonomy, and evolution of goals) and drawbacks (late integration, non-deterministic interactions, and multifaceted anomalies) in achieving mission success, as illustrated by the case of the Mars Reconnaissance Orbiter. Approaches to optimizing the benefits while mitigating the drawbacks have taken the form of the formalization of requirements, modular design practices, extensive system simulation, and spacecraft data trend analysis. The growth of hardware capability and software complexity can be expected to continue, with future directions including stackable commodity subsystems, computer-generated algorithms, runtime reconfigurable processors, and greater autonomy.

  6. Postural complexity influences development in infants born preterm with brain injury: relating perception-action theory to 3 cases.

    PubMed

    Dusing, Stacey C; Izzo, Theresa; Thacker, Leroy R; Galloway, James Cole

    2014-10-01

    Perception-action theory suggests a cyclical relationship between movement and perceptual information. In this case series, changes in postural complexity were used to quantify an infant's action and perception during the development of early motor behaviors. Three infants born preterm with periventricular white matter injury were included. Longitudinal changes in postural complexity (approximate entropy of the center of pressure), head control, reaching, and global development, measured with the Test of Infant Motor Performance and the Bayley Scales of Infant and Toddler Development, were assessed every 0.5 to 3 months during the first year of life. All 3 infants demonstrated altered postural complexity and developmental delays. However, the timing of the altered postural complexity and the type of delays varied among the infants. For infant 1, reduced postural complexity or limited action while learning to control her head in the midline position may have contributed to her motor delay. However, her ability to adapt her postural complexity eventually may have supported her ability to learn from her environment, as reflected in her relative cognitive strength. For infant 2, limited early postural complexity may have negatively affected his learning through action, resulting in cognitive delay. For infant 3, an increase in postural complexity above typical levels was associated with declining neurological status. Postural complexity is proposed as a measure of perception and action in the postural control system during the development of early behaviors. An optimal, intermediate level of postural complexity supports the use of a variety of postural control strategies and enhances the perception-action cycle. Either excessive or reduced postural complexity may contribute to developmental delays in infants born preterm with white matter injury. © 2014 American Physical Therapy Association.

  7. Radiochemical studies of 99mTc complexes of modified cysteine ligands and bifunctional chelating agents.

    PubMed

    Pillai, M R; Kothari, K; Banerjee, S; Samuel, G; Suresh, M; Sarma, H D; Jurisson, S

    1999-07-01

    The synthesis of four novel ligands using the amino acid cysteine and its ethyl carboxylate derivative is described. The synthetic method involves a two-step procedure, wherein the intermediate Schiff base formed by the condensation of the amino group of the cysteine substrate and salicylaldehyde is reduced to give the target ligands. The intermediates and the final products were characterized by high resolution nuclear magnetic resonance spectroscopy. Complexation studies of the ligands with 99mTc were optimized using stannous tartrate as the reducing agent under varying reaction conditions. The complexes were characterized using standard quality control techniques such as thin layer chromatography, paper electrophoresis, and paper chromatography. Lipophilicities of the complexes were estimated by solvent extraction into chloroform. Substantial changes in net charge and lipophilicity of the 99mTc complexes were observed on substituting the carboxylic acid functionality in ligands I and II with the ethyl carboxylate groups (ligands III and IV). All the ligands formed 99mTc complexes in high yield. Whereas the complexes with ligands I and II were observed to be hydrophilic in nature and not extractable into CHCl3, ligands III and IV resulted in neutral and lipophilic 99mTc complexes. The 99mTc complex with ligand II was not stable and on storage formed a hydrophilic and nonextractable species. The biodistribution of the complexes of ligands I and II showed that they cleared predominantly through the kidneys, whereas the complexes with ligands III and IV were excreted primarily through the hepatobiliary system. No significant brain uptake was observed with the 99mTc complexes with ligands III and IV despite their favorable properties of neutrality and lipophilicity. These ligands offer potential for use as bifunctional chelating agents.

  8. Complexity reduction of biochemical rate expressions.

    PubMed

    Schmidt, Henning; Madsen, Mads F; Danø, Sune; Cedersund, Gunnar

    2008-03-15

    The current trend in dynamical modelling of biochemical systems is to construct more and more mechanistically detailed and thus complex models. The complexity is reflected in the number of dynamic state variables and parameters, as well as in the complexity of the kinetic rate expressions. However, a greater level of complexity, or level of detail, does not necessarily imply better models, or a better understanding of the underlying processes. Data often does not contain enough information to discriminate between different model hypotheses, and such overparameterization makes it hard to establish the validity of the various parts of the model. Consequently, there is an increasing demand for model reduction methods. We present a new reduction method that reduces complex rational rate expressions, such as those often used to describe enzymatic reactions. The method is a novel term-based identifiability analysis, which is easy to use and allows for user-specified reductions of individual rate expressions in complete models. The method is one of the first methods to meet the classical engineering objective of improved parameter identifiability without losing the systems biology demand of preserved biochemical interpretation. The method has been implemented in the Systems Biology Toolbox 2 for MATLAB, which is freely available from http://www.sbtoolbox2.org. The Supplementary Material contains scripts that show how to use it by applying the method to the example models, discussed in this article.
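
    As a schematic illustration of why reduction improves identifiability (this is the classical limiting case, not the paper's term-based algorithm): if the available data only cover substrate concentrations well below the Michaelis constant, a Michaelis-Menten rate carries two parameters of which only the ratio is identifiable, and the reduced expression removes the redundancy:

```latex
v = \frac{V_{\max}\, S}{K_m + S}
\;\xrightarrow{\;S \ll K_m\;}\;
v \approx \frac{V_{\max}}{K_m}\, S = k\,S,
\qquad k = \frac{V_{\max}}{K_m}
```

    Fitting $k$ alone is well-posed, whereas fitting $V_{\max}$ and $K_m$ separately from low-$S$ data is not.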

  9. Networks consolidation program: Maintenance and Operations (M&O) staffing estimates

    NASA Technical Reports Server (NTRS)

    Goodwin, J. P.

    1981-01-01

    The Mark IV-A consolidates deep space and highly elliptical Earth orbiter (HEEO) mission tracking and implements centralized control and monitoring at the deep space communications complexes (DSCC). One of the objectives of the network design is to reduce maintenance and operations (M&O) costs. To determine whether the system design meets this objective, an M&O staffing model for Goldstone was developed and used to estimate the staffing levels required to support the Mark IV-A configuration. The study was performed for the Goldstone complex, and the program office translated these estimates for the overseas complexes to derive the network estimates.

  10. A new initiating system based on [(SiMes)Ru(PPh3)(Ind)Cl2] combined with azo-bis-isobutyronitrile in the polymerization and copolymerization of styrene and methyl methacrylate.

    PubMed

    Al-Majid, Abdullah M; Shamsan, Waseem Sharaf; Al-Odayn, Abdel-Basit Mohammed; Nahra, Fady; Aouak, Taieb; Nolan, Steven P

    2017-01-01

    The homopolymerization and copolymerization of styrene and methyl methacrylate were initiated for the first time by the combination of azo-bis-isobutyronitrile (AIBN) with the [(SiMes)Ru(PPh3)(Ind)Cl2] complex. The reactions were successfully carried out, on a large scale, in the presence of this complex at 80 °C. It was concluded from the data obtained that the association of AIBN with the ruthenium complex considerably reduces transfer reactions and leads to controlled radical polymerization and well-defined polymers.

  11. Using arborescences to estimate hierarchicalness in directed complex networks

    PubMed Central

    2018-01-01

    Complex networks are a useful tool for the understanding of complex systems. One of the emerging properties of such systems is their tendency to form hierarchies: networks can be organized in levels, with nodes in each level exerting control on the ones beneath them. In this paper, we focus on the problem of estimating how hierarchical a directed network is. We propose a structural argument: a network has a strong top-down organization if we need to delete only few edges to reduce it to a perfect hierarchy—an arborescence. In an arborescence, all edges point away from the root and there are no horizontal connections, both characteristics we desire in our idealization of what a perfect hierarchy requires. We test our arborescence score in synthetic and real-world directed networks against the current state of the art in hierarchy detection: agony, flow hierarchy and global reaching centrality. These tests highlight that our arborescence score is intuitive and we can visualize it; it is able to better distinguish between networks with and without a hierarchical structure; it agrees the most with the literature about the hierarchy of well-studied complex systems; and it is not just a score, but it provides an overall scheme of the underlying hierarchy of any directed complex network. PMID:29381761
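
    The structural argument can be made concrete with a simple counting proxy (a lower bound only, not the paper's full arborescence score, which must also handle cycles and connectivity): in an arborescence every non-root node has exactly one incoming edge, so every additional incoming edge must be deleted. The node and edge names below are hypothetical.

```python
def excess_incoming_edges(nodes, edges):
    """Lower bound on the edges that must be deleted to reach an arborescence:
    each node may keep at most one incoming edge (the root keeps none)."""
    indegree = {v: 0 for v in nodes}
    for _src, dst in edges:
        indegree[dst] += 1
    return sum(max(d - 1, 0) for d in indegree.values())
```

    A network that needs few such deletions relative to its size already resembles a top-down hierarchy.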

  12. Natural Pyrrhotite as a Catalyst in Prebiotic Chemical Evolution

    PubMed Central

    López Ibáñez de Aldecoa, Alejandra; Velasco Roldán, Francisco; Menor-Salván, César

    2013-01-01

    The idea of an autotrophic organism as the first living being on Earth leads to the hypothesis of a protometabolic, complex chemical system. In one of the main hypotheses, the first metabolic systems emerged from the interaction between sulfide minerals and/or soluble iron-sulfide complexes and fluids rich in inorganic precursors, which are reduced and derived from crustal or mantle activity. Within this context, the possible catalytic role of pyrrhotite, one of the most abundant sulfide minerals, in biomimetic redox and carbon fixation reactions was studied. Our results showed that pyrrhotite, under simulated hydrothermal conditions, could catalyze the pyruvate synthesis from lactate and that a dynamic system formed by coupling iron metal and iron-sulfur species in an electrochemical cell could promote carbon fixation from thioacetate esters. PMID:25369819

  13. Complexity Management Using Metrics for Trajectory Flexibility Preservation and Constraint Minimization

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Shen, Ni; Wing, David J.

    2011-01-01

    The growing demand for air travel is increasing the need for mitigating air traffic congestion and complexity problems, which are already at high levels. At the same time new surveillance, navigation, and communication technologies are enabling major transformations in the air traffic management system, including net-based information sharing and collaboration, performance-based access to airspace resources, and trajectory-based rather than clearance-based operations. The new system will feature different schemes for allocating tasks and responsibilities between the ground and airborne agents and between the human and automation, with potential capacity and cost benefits. Therefore, complexity management requires new metrics and methods that can support these new schemes. This paper presents metrics and methods for preserving trajectory flexibility that have been proposed to support a trajectory-based approach for complexity management by airborne or ground-based systems. It presents extensions to these metrics as well as to the initial research conducted to investigate the hypothesis that using these metrics to guide user and service provider actions will naturally mitigate traffic complexity. The analysis showed promising results in that: (1) Trajectory flexibility preservation mitigated traffic complexity as indicated by inducing self-organization in the traffic patterns and lowering traffic complexity indicators such as dynamic density and traffic entropy. (2)Trajectory flexibility preservation reduced the potential for secondary conflicts in separation assurance. (3) Trajectory flexibility metrics showed potential application to support user and service provider negotiations for minimizing the constraints imposed on trajectories without jeopardizing their objectives.

  14. Reducing the cognitive workload - Trouble managing power systems

    NASA Technical Reports Server (NTRS)

    Manner, David B.; Liberman, Eugene M.; Dolce, James L.; Mellor, Pamela A.

    1993-01-01

    The complexity of space-based systems makes monitoring them and diagnosing their faults taxing for human beings. When a problem arises, immediate attention and quick resolution is mandatory. To aid humans in these endeavors we have developed an automated advisory system. Our advisory expert system, Trouble, incorporates the knowledge of the power system designers for Space Station Freedom. Trouble is designed to be a ground-based advisor for the mission controllers in the Control Center Complex at Johnson Space Center (JSC). It has been developed at NASA Lewis Research Center (LeRC) and tested in conjunction with prototype flight hardware contained in the Power Management and Distribution testbed and the Engineering Support Center, ESC, at LeRC. Our work will culminate with the adoption of these techniques by the mission controllers at JSC. This paper elucidates how we have captured power system failure knowledge, how we have built and tested our expert system, and what we believe its potential uses are.

  15. Failure of Local Thermal Equilibrium in Quantum Friction

    NASA Astrophysics Data System (ADS)

    Intravaia, F.; Behunin, R. O.; Henkel, C.; Busch, K.; Dalvit, D. A. R.

    2016-09-01

    Recent progress in manipulating atomic and condensed matter systems has instigated a surge of interest in nonequilibrium physics, including many-body dynamics of trapped ultracold atoms and ions, near-field radiative heat transfer, and quantum friction. Under most circumstances the complexity of such nonequilibrium systems requires a number of approximations to make theoretical descriptions tractable. In particular, it is often assumed that spatially separated components of a system thermalize with their immediate surroundings, although the global state of the system is out of equilibrium. This powerful assumption reduces the complexity of nonequilibrium systems to the local application of well-founded equilibrium concepts. While this technique appears to be consistent for the description of some phenomena, we show that it fails for quantum friction by underestimating by approximately 80% the magnitude of the drag force. Our results show that the correlations among the components of driven, but steady-state, quantum systems invalidate the assumption of local thermal equilibrium, calling for a critical reexamination of this approach for describing the physics of nonequilibrium systems.

  16. A System for Planning Ahead

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A software system that uses artificial intelligence techniques to help with complex Space Shuttle scheduling at Kennedy Space Center is commercially available. Stottler Henke Associates, Inc. (SHAI), is marketing its automatic scheduling system, the Automated Manifest Planner (AMP), to industries that must plan and project changes many different times before the tasks are executed. The system creates optimal schedules while reducing manpower costs. Using information entered into the system by expert planners, the system automatically makes scheduling decisions based upon resource limitations and other constraints. It provides a constraint authoring system for adding other constraints to the scheduling process as needed. AMP is adaptable to assist with a variety of complex scheduling problems in manufacturing, transportation, business, architecture, and construction. AMP can benefit vehicle assembly plants, batch processing plants, semiconductor manufacturing, printing and textiles, surface and underground mining operations, and maintenance shops. For most of SHAI's commercial sales, the company obtains a service contract to customize AMP to a specific domain and then issues the customer a user license.

  17. Discrete event simulation as a tool in optimization of a professional complex adaptive system.

    PubMed

    Nielsen, Anders Lassen; Hilwig, Helmer; Kissoon, Niranjan; Teelucksingh, Surujpal

    2008-01-01

    Similar urgent needs for improvement of health care systems exist in the developed and developing world. The culture and the organization of an emergency department in developing countries can best be described as a professional complex adaptive system, where each agent (employee) is ignorant of the behavior of the system as a whole; no one understands the entire system. Each agent's action is based on the state of the system at the moment (i.e., lack of medicine, unavailable laboratory investigations, lack of beds, and lack of staff in certain functions). An important question is how one can improve the emergency service within the given constraints. The use of simulation is one new approach to studying issues amenable to improvement. Discrete event simulation was used to simulate part of the patient flow in an emergency department. A simple model was built using a prototyping approach. The simulation showed that a minor rotation among the nurses could reduce the mean number of visitors who had to be referred to alternative flows within the hospital from 87 to 37 on a daily basis, with a mean utilization of the staff between 95.8% (the nurses) and 87.4% (the doctors). We conclude that, even faced with resource constraints and a lack of accessible data, discrete event simulation is a tool that can be used successfully to study the consequences of changes in very complex and self-organizing professional complex adaptive systems.
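
    A minimal discrete event simulation of the kind described can be built with a priority queue of timed events. The sketch below models a single triage queue with Poisson arrivals, exponential service, and referral to alternative flows when the queue is full; the parameters and flow are invented for illustration and are far simpler than the emergency department model in the study.

```python
import heapq
import random

def simulate_ed(n_nurses, arrivals_per_hour, mean_service_min,
                hours=8, max_queue=5, seed=1):
    """Single-queue ED sketch: returns (patients served, patients referred
    elsewhere because the queue was full)."""
    rng = random.Random(seed)
    events = []                       # heap of (time in minutes, event kind)
    t = 0.0
    while True:                       # pre-schedule Poisson arrivals
        t += rng.expovariate(arrivals_per_hour / 60.0)
        if t >= hours * 60:
            break
        heapq.heappush(events, (t, "arrival"))
    free, queued, served, referred = n_nurses, 0, 0, 0
    while events:
        now, kind = heapq.heappop(events)
        if kind == "arrival":
            if free > 0:              # a nurse is available: start service
                free -= 1
                heapq.heappush(events,
                               (now + rng.expovariate(1.0 / mean_service_min),
                                "done"))
            elif queued < max_queue:  # wait in the queue
                queued += 1
            else:                     # queue full: refer elsewhere
                referred += 1
        else:                         # service completion
            served += 1
            if queued > 0:            # pull the next patient from the queue
                queued -= 1
                heapq.heappush(events,
                               (now + rng.expovariate(1.0 / mean_service_min),
                                "done"))
            else:
                free += 1
    return served, referred
```

    With two nurses, twelve arrivals per hour, and a 15-minute mean service time the system is overloaded, so some patients are referred on; the model makes such capacity and staffing effects cheap to explore before changing the real department.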

  18. A user-friendly earth system model of low complexity: the ESCIMO system dynamics model of global warming towards 2100

    NASA Astrophysics Data System (ADS)

    Randers, Jorgen; Golüke, Ulrich; Wenstøp, Fred; Wenstøp, Søren

    2016-11-01

    We have made a simple system dynamics model, ESCIMO (Earth System Climate Interpretable Model), which runs on a desktop computer in seconds and is able to reproduce the main output from more complex climate models. ESCIMO represents the main causal mechanisms at work in the Earth system and is able to reproduce the broad outline of climate history from 1850 to 2015. We have run many simulations with ESCIMO to 2100 and beyond. In this paper we present the effects of introducing in 2015 six possible global policy interventions that cost around USD 1000 billion per year - around 1 % of world GDP. We tentatively conclude (a) that these policy interventions can at most reduce the global mean surface temperature - GMST - by up to 0.5 °C in 2050 and up to 1.0 °C in 2100 relative to no intervention. The exception is injection of aerosols into the stratosphere, which can reduce the GMST by more than 1.0 °C in a decade but creates other serious problems. We also conclude (b) that relatively cheap human intervention can keep global warming in this century below +2 °C relative to preindustrial times. Finally, we conclude (c) that run-away warming is unlikely to occur in this century but is likely to occur in the longer run. The ensuing warming is slow, however. In ESCIMO, it takes several hundred years to lift the GMST to +3 °C above preindustrial times through gradual self-reinforcing melting of the permafrost. We call for research to test whether more complex climate models support our tentative conclusions from ESCIMO.
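
    The flavor of a low-complexity system dynamics climate model can be conveyed with a zero-dimensional energy-balance stock integrated by Euler steps, C dT/dt = F - lambda*T. The parameter values below are textbook-style illustrations, not ESCIMO's calibrated values.

```python
def temperature_response(forcing, years, heat_capacity=14.0, feedback=1.3,
                         dt=0.1):
    """Euler-integrate C dT/dt = F - lambda*T for a constant forcing F.

    forcing:       F, in W/m^2
    heat_capacity: C, in W*yr/m^2/K (ocean mixed-layer scale)
    feedback:      lambda, in W/m^2/K; equilibrium warming is F/lambda
    Returns the temperature anomaly in K after the given number of years.
    """
    temperature = 0.0
    for _ in range(int(years / dt)):
        temperature += dt * (forcing - feedback * temperature) / heat_capacity
    return temperature
```

    For a constant 3.7 W/m^2 forcing (roughly doubled CO2), the run approaches the equilibrium value F/lambda of about 2.85 K with an e-folding time C/lambda of about 11 years; ESCIMO couples many such stocks and flows, which is why it still runs in seconds.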

  19. Toward a Definition of Complexity for Quantum Field Theory States.

    PubMed

    Chapman, Shira; Heller, Michal P; Marrochio, Hugo; Pastawski, Fernando

    2018-03-23

    We investigate notions of complexity of states in continuous many-body quantum systems. We focus on Gaussian states which include ground states of free quantum field theories and their approximations encountered in the context of the continuous version of the multiscale entanglement renormalization ansatz. Our proposal for quantifying state complexity is based on the Fubini-Study metric. It leads to counting the number of applications of each gate (infinitesimal generator) in the transformation, subject to a state-dependent metric. We minimize the defined complexity with respect to momentum-preserving quadratic generators which form su(1,1) algebras. On the manifold of Gaussian states generated by these operations, the Fubini-Study metric factorizes into hyperbolic planes with minimal complexity circuits reducing to known geodesics. Despite working with quantum field theories far outside the regime where Einstein gravity duals exist, we find striking similarities between our results and those of holographic complexity proposals.
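
    In the standard notation for such proposals, the Fubini-Study line element on normalized states, and the resulting complexity of a path generated by G(σ) (with ∂σ|ψ⟩ = -iG(σ)|ψ⟩), read as follows; the paper's construction further weights individual gates, so this is only the unweighted backbone:

```latex
ds_{FS}^{2} = \langle d\psi \,|\, d\psi \rangle
            - \langle d\psi \,|\, \psi \rangle \langle \psi \,|\, d\psi \rangle ,
\qquad
\mathcal{C} = \min_{G(\sigma)} \int d\sigma \,
  \sqrt{\langle \psi | G^{2} | \psi \rangle
      - \langle \psi | G | \psi \rangle^{2}} .
```

    The integrand is the variance of the instantaneous generator, so minimal-complexity circuits are geodesics of this metric on the space of states.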

  20. Scientific and technical complex for modeling, researching and testing of rocket-space vehicles’ electric power installations

    NASA Astrophysics Data System (ADS)

    Bezruchko, Konstantin; Davidov, Albert

    2009-01-01

    In the given article scientific and technical complex for modeling, researching and testing of rocket-space vehicles' power installations which was created in Power Source Laboratory of National Aerospace University "KhAI" is described. This scientific and technical complex gives the opportunity to replace the full-sized tests on model tests and to reduce financial and temporary inputs at modeling, researching and testing of rocket-space vehicles' power installations. Using the given complex it is possible to solve the problems of designing and researching of rocket-space vehicles' power installations efficiently, and also to provide experimental researches of physical processes and tests of solar and chemical batteries of rocket-space complexes and space vehicles. Scientific and technical complex also allows providing accelerated tests, diagnostics, life-time control and restoring of chemical accumulators for rocket-space vehicles' power supply systems.

  1. Toward a Definition of Complexity for Quantum Field Theory States

    NASA Astrophysics Data System (ADS)

    Chapman, Shira; Heller, Michal P.; Marrochio, Hugo; Pastawski, Fernando

    2018-03-01

    We investigate notions of complexity of states in continuous many-body quantum systems. We focus on Gaussian states which include ground states of free quantum field theories and their approximations encountered in the context of the continuous version of the multiscale entanglement renormalization ansatz. Our proposal for quantifying state complexity is based on the Fubini-Study metric. It leads to counting the number of applications of each gate (infinitesimal generator) in the transformation, subject to a state-dependent metric. We minimize the defined complexity with respect to momentum-preserving quadratic generators which form su(1,1) algebras. On the manifold of Gaussian states generated by these operations, the Fubini-Study metric factorizes into hyperbolic planes with minimal complexity circuits reducing to known geodesics. Despite working with quantum field theories far outside the regime where Einstein gravity duals exist, we find striking similarities between our results and those of holographic complexity proposals.

  2. Assimilation of glider and mooring data into a coastal ocean model

    NASA Astrophysics Data System (ADS)

    Jones, Emlyn M.; Oke, Peter R.; Rizwi, Farhan; Murray, Lawrence M.

    We have applied an ensemble optimal interpolation (EnOI) data assimilation system to a high resolution coastal ocean model of south-east Tasmania, Australia. The region is characterised by a complex coastline with water masses influenced by riverine input and the interaction between two offshore current systems. Using a large static ensemble to estimate the system's background error covariance, data from a coastal observing network of fixed moorings and a Slocum glider are assimilated into the model at daily intervals. We demonstrate that the EnOI algorithm can successfully correct a biased high resolution coastal model. In areas with dense observations, the assimilation scheme reduces the RMS difference between the model and independent GHRSST observations by 90%, while the domain-wide RMS difference is reduced by a more modest 40%. Our findings show that errors introduced by surface forcing and boundary conditions can be identified and reduced by a relatively sparse observing array using an inexpensive ensemble-based data assimilation system.
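    The EnOI analysis step summarized above has a compact linear-algebra form: the forecast is corrected by a Kalman-type gain built from a static ensemble's covariance. The sketch below is a generic textbook implementation, not the authors' code; the state size, observation operator, error covariances, and scaling factor `alpha` are illustrative assumptions.

```python
import numpy as np

def enoi_update(x_f, anomalies, H, R, y, alpha=0.5):
    """One EnOI analysis step (generic sketch).

    x_f       : (n,) forecast state
    anomalies : (n, m) static ensemble anomalies (mean removed)
    H         : (p, n) observation operator
    R         : (p, p) observation-error covariance
    y         : (p,) observations
    alpha     : scaling applied to the static background covariance
    """
    m = anomalies.shape[1]
    B = alpha * anomalies @ anomalies.T / (m - 1)    # static background covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)     # Kalman-type gain
    return x_f + K @ (y - H @ x_f)                   # analysis state

# toy example: 4-variable state, 2 observed components (all sizes are assumptions)
rng = np.random.default_rng(0)
anoms = rng.standard_normal((4, 30))
anoms -= anoms.mean(axis=1, keepdims=True)
H = np.array([[1., 0., 0., 0.],
              [0., 0., 1., 0.]])
x_a = enoi_update(np.zeros(4), anoms, H, 0.1 * np.eye(2), np.array([1.0, -0.5]))
```

    Because the ensemble is static, the background covariance `B` never changes between analysis cycles, which is what makes EnOI far cheaper than a full ensemble Kalman filter.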

  3. The QuakeSim Project: Numerical Simulations for Active Tectonic Processes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry

    2004-01-01

    In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.

  4. Virtual Control Policy for Binary Ordered Resources Petri Net Class.

    PubMed

    Rovetto, Carlos A; Concepción, Tomás J; Cano, Elia Esther

    2016-08-18

    Prevention and avoidance of deadlocks in sensor networks that use the wormhole routing algorithm is an active research domain. Diverse control policies address this problem, and our approach adds a new method. In this paper we present a virtual control policy for the new specialized Petri net subclass called Binary Ordered Resources Petri Net (BORPN). Essentially, it is an ordinary class constructed from various state machines that share unitary resources in a complex form, which allows branching and joining of processes. The reduced structure of this new class allows analysis of the entire system's behavior, a task that is otherwise prohibitive for large systems because of their complexity and routing algorithms.

  5. A preliminary investigation of cryogenic CO2 capture utilizing a reverse Brayton Cycle

    NASA Astrophysics Data System (ADS)

    Yuan, L. C.; Pfotenhauer, J. M.; Qiu, L. M.

    2014-01-01

    Utilizing CO2 capture and storage (CCS) technologies is a significant way to reduce carbon emissions from coal fired power plants. Cryogenic CO2 capture (CCC) is an innovative and promising CO2 capture technology, which has an apparent energy and environmental advantage compared to alternatives. A process of capturing CO2 from the flue gas of a coal-fired electrical power plant by cryogenically desublimating CO2 has been discussed and demonstrated theoretically. However, pressurizing the inlet flue gas to reduce the energy penalty for the cryogenic process will lead to a more complex system. In this paper, a modified CCC system utilizing a reverse Brayton Cycle is proposed, and the energy penalties of the two systems are compared theoretically.

  6. Economic networks: the new challenges.

    PubMed

    Schweitzer, Frank; Fagiolo, Giorgio; Sornette, Didier; Vega-Redondo, Fernando; Vespignani, Alessandro; White, Douglas R

    2009-07-24

    The current economic crisis illustrates a critical need for new and fundamental understanding of the structure and dynamics of economic networks. Economic systems are increasingly built on interdependencies, implemented through trans-national credit and investment networks, trade relations, or supply chains that have proven difficult to predict and control. We need, therefore, an approach that stresses the systemic complexity of economic networks and that can be used to revise and extend established paradigms in economic theory. This will facilitate the design of policies that reduce conflicts between individual interests and global efficiency, as well as reduce the risk of global failure by making economic networks more robust.

  7. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems-based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems-based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.
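    A standard value-of-information metric that design-process decisions of this kind can draw on is the expected value of perfect information (EVPI). The sketch below is a generic decision-theoretic illustration; the payoff matrix and scenario probabilities are invented for the example, and the paper's specific metric may differ.

```python
import numpy as np

def expected_value_of_perfect_information(payoffs, probs):
    """EVPI: how much resolving scenario uncertainty is worth (generic sketch).

    payoffs : (n_scenarios, n_decisions) payoff of each design decision per scenario
    probs   : (n_scenarios,) scenario probabilities
    """
    payoffs = np.asarray(payoffs, dtype=float)
    probs = np.asarray(probs, dtype=float)
    best_without_info = (probs @ payoffs).max()      # commit to one decision up front
    best_with_info = probs @ payoffs.max(axis=1)     # choose the best decision per scenario
    return best_with_info - best_without_info

# two model-fidelity scenarios, two design choices (illustrative numbers)
evpi = expected_value_of_perfect_information([[10, 4], [2, 8]], [0.5, 0.5])
# with info: 0.5*10 + 0.5*8 = 9; without: max(6, 6) = 6; EVPI = 3
```

    A refinement of the design process chain is worth pursuing only when its cost is below the value of the information it provides.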

  8. Implementation in the midst of complexity: Using ethnography to study health care-associated infection prevention and control.

    PubMed

    Knobloch, Mary Jo; Thomas, Kevin V; Patterson, Erin; Zimbric, Michele L; Musuuza, Jackson; Safdar, Nasia

    2017-10-01

    Contextual factors associated with health care settings make reducing health care-associated infections (HAIs) a complex task. The aim of this article is to highlight how ethnography can assist in understanding contextual factors that support or hinder the implementation of evidence-based practices for reducing HAIs. We conducted a review of ethnographic studies specifically related to HAI prevention and control in the last 5 years (2012-2017). Twelve studies specific to HAIs and ethnographic methods were found. Researchers used various methods, with video-reflexive sessions used in 6 of the 12 studies. Ethnography was used to understand variation in data reporting, identify barriers to adherence, explore patient perceptions of isolation practices, and highlight the influence of physical design on infection prevention practices. The term ethnography was used to describe varied research methods. Most studies were conducted outside the United States, and authors indicate insights gained using ethnographic methods (whether observations, interviews, or reflexive video recording) as beneficial to unraveling the complexities of HAI prevention. Ethnography is well-suited for HAI prevention, especially video-reflexive ethnography, for activating patients and clinicians in infection control work. In this era of increasing pressure to reduce HAIs within complex work systems, ethnographic methods can promote understanding of contextual factors and may expedite the translation of evidence into practice. Published by Elsevier Inc.

  9. Texting during stair negotiation and implications for fall risk.

    PubMed

    Hashish, Rami; Toney-Bolger, Megan E; Sharpe, Sarah S; Lester, Benjamin D; Mulliken, Adam

    2017-10-01

    Walking requires the integration of the sensory and motor systems. Cognitive distractions have been shown to interfere with negotiation of complex walking environments, especially in populations at greater risk for falls (e.g. the elderly). With the pervasiveness of mobile messaging and the recent introduction of augmented reality mobile gaming, it is increasingly important to understand how distraction associated with the simultaneous use of a mobile device impacts navigation of the complex walking environments experienced in daily life. In this study, we investigated how gait kinematics were altered when participants performed a texting task during step negotiation. Twenty participants (13 females, 7 males) performed a series of walking trials involving a step-deck obstacle, consisting of at least 3 texting trials and 3 non-texting trials. When texting, participants ascended more slowly and demonstrated reduced dual-step foot toe clearance. Participants similarly descended more slowly when texting and demonstrated reduced single-step foot heel clearance as well as reduced dual-step foot fore-aft heel clearance. These data support the conclusion that texting during stair negotiation results in changes to gait kinematics that may increase the potential for gait disruptions, falls, and injury. Further research should examine the effect texting has on performing other common complex locomotor tasks, actual fall risk, and the patterns of resulting injury rate and severity when negotiating complex environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Reducing nurses' workload using a computerized nursing support system linked to the hospital information system.

    PubMed

    Ito, C; Satoh, I; Michiya, H; Kitayama, Y; Miyazaki, K; Ota, S; Satoh, H; Sakurai, T; Shirato, H; Miyasaka, K

    1997-01-01

    A computerised nursing support system (CNSS) linked to the hospital information system (HIS) was developed and has been in use for one year, in order to reduce the workload of nurses. CNSS consists of (1) a hand-held computer for each nurse, (2) desk-top computers in the nurses' station and doctors' rooms, (3) a data server, and (4) an interface with the main hospital information system. Nurses enter vital signs, food intake and other information about the patients into the hand-held computer at the bed-side. The information is then sent automatically to the CNSS data server, which also receives patients' details (prescribed medicines etc.) from the HIS. Nurses and doctors can see all the information on the desk-top and hand-held computers. This system was introduced in May 1995 into a university hospital ward with 40 beds. A questionnaire was completed by 23 nurses before and after the introduction of CNSS. The mean time required to post vital data was significantly reduced from 121 seconds to 54 seconds (p < 0.01). After three months 30% of nurses felt CNSS had reduced their workload, while 30% felt it had complicated their work; after five months 70% noted a reduction and 0% reported that CNSS had made their work more complex. The study therefore concludes that the interface between a computerised nursing support system and the hospital information system reduced the workload of nurses.

  11. Medicare home health payment reform may jeopardize access for clinically complex and socially vulnerable patients.

    PubMed

    Rosati, Robert J; Russell, David; Peng, Timothy; Brickner, Carlin; Kurowski, Daniel; Christopher, Mary Ann; Sheehan, Kathleen M

    2014-06-01

    The Affordable Care Act directed Medicare to update its home health prospective payment system to reflect more recent data on costs and use of services-an exercise known as rebasing. As a result, the Centers for Medicare and Medicaid Services will reduce home health payments 3.5 percent per year in the period 2014-17. To determine the impact that these reductions could have on beneficiaries using home health care, we examined the Medicare reimbursement margins and the use of services in a national sample of 96,621 episodes of care provided by twenty-six not-for-profit home health agencies in 2011. We found that patients with clinically complex conditions and social vulnerability factors, such as living alone, had substantially higher service delivery costs than other home health patients. Thus, the socially vulnerable patients with complex conditions represent less profit (lower-to-negative Medicare margins) for home health agencies. This financial disincentive could reduce such patients' access to care as Medicare payments decline. Policy makers should consider the unique characteristics of these patients and ensure their continued access to Medicare's home health services when planning rebasing and future adjustments to the prospective payment system. Project HOPE—The People-to-People Health Foundation, Inc.

  12. Fabrication and optimization of fast disintegrating tablets employing interpolymeric chitosan-alginate complex and chitin as novel superdisintegrants.

    PubMed

    Goel, Honey; Tiwary, Ashok K; Rana, Vikas

    2011-01-01

    The objective of the present work was to optimize the formulation of fast disintegrating tablets (FDTs) of ondansetron HCl containing novel superdisintegrants, possessing sufficient mechanical strength and disintegration time comparable to those containing crospovidone or croscarmellose sodium. The FDTs were formulated using a novel superdisintegrant (chitosan-alginate (1:1) interpolymer complex and chitin) to achieve a sweet tasting disintegrating system. The results revealed that chitin (5-20%) increased the porosity and decreased the disintegration time (DT) of the tablets. At higher concentrations chitin maintained tablet porosity even at 5.5 kg crushing strength. Ondansetron HCl was found to antagonize the wicking action of glycine. Further, evaluation of the mechanism of disintegration revealed that glycine transported the aqueous medium to different parts of the tablets while the chitosan-alginate complex swelled up due to transfer of moisture from glycine. This phenomenon resulted in breakage of the tablet within seconds. For preparing optimized FDTs, the reduced model equations generated from Box-Behnken design (BBD) were solved after substituting the known disintegration time of FDTs containing superdisintegrants in the reduced model equations. The results suggested that the excipient system under investigation not only improved the disintegration time but also made it possible to prepare FDTs with higher crushing strength as compared to tablets containing known superdisintegrants.

  13. Nanoparticles: Nanoscale Systems for Medical Applications

    NASA Astrophysics Data System (ADS)

    Wadkins, David Allen

    The goal of this project was to develop a series of nano platforms for single cell analysis and drug delivery. Nanoparticles are a promising option to improve our medical therapies by controlling biodistribution and pharmacokinetics of therapeutics. Nanosystems also offer significant opportunity to improve current imaging modalities. The systems developed during this thesis work can be foundations for developing advanced therapies for obesity and improving our fundamental understanding of single cell behavior. The first of the two systems we attempted to create was a drug delivery system that could selectively target adipose tissue to deliver uncoupling agents and drive browning of adipose tissue and associated weight loss. Protonophores have a history of significant toxic side effects in cardiac and neuronal tissues, but a recently discovered protonophore, BAM-15, has been shown to have reduced cytotoxicity. We hypothesized that the altered biodistribution of BAM-15 encapsulated in a nanoparticle could provide systemic weight loss with minimized side effects. The second system developed utilized quantum dots to create a fluorescent barcode that could be repeatedly identified using quantitative fluorescent emission readings. This platform would allow for the tracking of individual cells, allowing repeat interrogation across time and space in complex multicellular environments. Ultimately this work demonstrates the process and complexity involved in developing nanoparticulate systems meant to interact with incredibly complex intracellular environments.

  14. Providing data science support for systems pharmacology and its implications to drug discovery.

    PubMed

    Hart, Thomas; Xie, Lei

    2016-01-01

    The conventional one-drug-one-target-one-disease drug discovery process has been less successful in tackling multi-genic, multi-faceted complex diseases. Systems pharmacology has emerged as a new discipline to tackle the current challenges in drug discovery. The goal of systems pharmacology is to transform huge, heterogeneous, and dynamic biological and clinical data into interpretable and actionable mechanistic models for decision making in drug discovery and patient treatment. Thus, big data technology and data science will play an essential role in systems pharmacology. This paper critically reviews the impact of three fundamental concepts of data science on systems pharmacology: similarity inference, overfitting avoidance, and disentangling causality from correlation. The authors then discuss recent advances and future directions in applying the three concepts of data science to drug discovery, with a focus on proteome-wide context-specific quantitative drug target deconvolution and personalized adverse drug reaction prediction. Data science will facilitate reducing the complexity of systems pharmacology modeling, detecting hidden correlations between complex data sets, and distinguishing causation from correlation. The power of data science can only be fully realized when integrated with mechanism-based multi-scale modeling that explicitly takes into account the hierarchical organization of biological systems from nucleic acid to proteins, to molecular interaction networks, to cells, to tissues, to patients, and to populations.

  15. Life without complex I: proteome analyses of an Arabidopsis mutant lacking the mitochondrial NADH dehydrogenase complex

    PubMed Central

    Fromm, Steffanie; Senkler, Jennifer; Eubel, Holger; Peterhänsel, Christoph; Braun, Hans-Peter

    2016-01-01

    The mitochondrial NADH dehydrogenase complex (complex I) is of particular importance for the respiratory chain in mitochondria. It is the major electron entry site for the mitochondrial electron transport chain (mETC) and therefore of great significance for mitochondrial ATP generation. We recently described an Arabidopsis thaliana double-mutant lacking the genes encoding the carbonic anhydrases CA1 and CA2, which both form part of a plant-specific ‘carbonic anhydrase domain’ of mitochondrial complex I. The mutant lacks complex I completely. Here we report extended analyses for systematically characterizing the proteome of the ca1ca2 mutant. Using various proteomic tools, we show that lack of complex I causes reorganization of the cellular respiration system. Reduced electron entry into the respiratory chain at the first segment of the mETC leads to induction of complexes II and IV as well as alternative oxidase. Increased electron entry at later segments of the mETC requires an increase in oxidation of organic substrates. This is reflected by higher abundance of proteins involved in glycolysis, the tricarboxylic acid cycle and branched-chain amino acid catabolism. Proteins involved in the light reaction of photosynthesis, the Calvin cycle, tetrapyrrole biosynthesis, and photorespiration are clearly reduced, contributing to the significant delay in growth and development of the double-mutant. Finally, enzymes involved in defense against reactive oxygen species and stress symptoms are much induced. These together with previously reported insights into the function of plant complex I, which were obtained by analysing other complex I mutants, are integrated in order to comprehensively describe ‘life without complex I’. PMID:27122571

  16. Quantum Gauss-Jordan Elimination and Simulation of Accounting Principles on Quantum Computers

    NASA Astrophysics Data System (ADS)

    Diep, Do Ngoc; Giang, Do Hoang; Van Minh, Nguyen

    2017-06-01

    The paper is devoted to a version of Quantum Gauss-Jordan Elimination and its applications. In the first part, we construct the Quantum Gauss-Jordan Elimination (QGJE) Algorithm and estimate the complexity of computation of Reduced Row Echelon Form (RREF) of N × N matrices. The main result asserts that QGJE has a computation time of order 2^(N/2). The second part is devoted to a new idea of simulation of accounting by quantum computing. We first expose the actual accounting principles in a pure mathematics language. Then, we simulate the accounting principles on quantum computers. We show that all accounting actions are exhausted by the described basic actions. The main problems of accounting are reduced to a system of linear equations in the economic model of Leontief. In this simulation, we use our constructed Quantum Gauss-Jordan Elimination to solve the problems, and the quantum computation is a square-root order faster than its classical counterpart.
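    For comparison, the classical counterpart of QGJE, plain Gauss-Jordan elimination to reduced row echelon form, can be sketched as follows. This is an illustrative classical implementation, not the quantum algorithm of the paper; exact rational arithmetic is used so the result stays clean.

```python
from fractions import Fraction

def rref(matrix):
    """Classical Gauss-Jordan elimination to reduced row echelon form."""
    A = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(A), len(A[0])
    pivot_row = 0
    for col in range(cols):
        # find a pivot in this column at or below pivot_row
        pivot = next((r for r in range(pivot_row, rows) if A[r][col] != 0), None)
        if pivot is None:
            continue
        A[pivot_row], A[pivot] = A[pivot], A[pivot_row]   # swap pivot row up
        inv = A[pivot_row][col]
        A[pivot_row] = [x / inv for x in A[pivot_row]]    # scale pivot entry to 1
        for r in range(rows):                             # clear the column elsewhere
            if r != pivot_row and A[r][col] != 0:
                factor = A[r][col]
                A[r] = [a - factor * b for a, b in zip(A[r], A[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return A

# example: solve the linear system x + 2y = 5, 3x + 4y = 11 via its augmented matrix
result = rref([[1, 2, 5], [3, 4, 11]])
# result → [[1, 0, 1], [0, 1, 2]], i.e. x = 1, y = 2
```

    Reducing an augmented matrix to RREF in this way is exactly the "system of linear equations" step that the paper accelerates with quantum computation.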

  17. Neuropsychiatry of complex visual hallucinations.

    PubMed

    Mocellin, Ramon; Walterfang, Mark; Velakoulis, Dennis

    2006-09-01

    To describe the phenomenology and pathophysiology of complex visual hallucinations (CVH) in various organic states, in particular Charles Bonnet syndrome and peduncular hallucinosis. Three cases of CVH in the setting of pontine infarction, thalamic infarction and temporoparietal epileptiform activity are presented and the available psychiatric, neurological and biological literature on the structures of the central nervous system involved in producing hallucinatory states is reviewed. Complex visual hallucinations can arise from a variety of processes involving the retinogeniculocalcarine tract, or ascending brainstem modulatory structures. The cortical activity responsible for hallucinations results from altered or reduced input into these regions, or a loss of ascending inhibition of their afferent pathways. A significant degree of overlap exists between the concepts of Charles Bonnet syndrome and peduncular hallucinosis. The fluidity of these eponymous syndromes reduces their validity and meaning, and may result in an inappropriate attribution of the underlying pathology. An understanding of how differing pathologies may produce CVH allows for the appropriate tailoring of treatment, depending on the site and nature of the lesion and content of perceptual disturbance.

  18. A study on the applications of AI in finishing of additive manufacturing parts

    NASA Astrophysics Data System (ADS)

    Fathima Patham, K.

    2017-06-01

    Artificial intelligence and computer simulation are powerful technological tools for solving complex problems in the manufacturing industries. Additive Manufacturing is one of the powerful manufacturing techniques that provide design flexibility to products. Products with complex shapes are manufactured directly, without the need for machining or tooling, using Additive Manufacturing. However, the main drawback of components produced using Additive Manufacturing processes is the quality of their surfaces. This study aims to minimize the defects caused during Additive Manufacturing with the aid of Artificial Intelligence. The developed AI system has three layers, each trying to eliminate or minimize production errors. The first layer optimizes the digitization of the 3D CAD model of the product and hence reduces staircase errors. The second layer optimizes the 3D printing machine parameters in order to eliminate the warping effect. The third layer helps to choose a surface finishing technique suitable for the printed component based on the degree of complexity of the product and the material. The efficiency of the developed AI system was examined on functional parts such as gears.

  19. System and process for pulsed multiple reaction monitoring

    DOEpatents

    Belov, Mikhail E

    2013-05-17

    A new pulsed multiple reaction monitoring process and system are disclosed that use a pulsed ion injection mode for use in conjunction with triple-quadrupole instruments. The pulsed injection mode approach reduces background ion noise at the detector, increases amplitude of the ion signal, and includes a unity duty cycle that provides a significant sensitivity increase for reliable quantitation of proteins/peptides present at attomole levels in highly complex biological mixtures.

  20. Adaptive Self-Tuning Networks

    NASA Astrophysics Data System (ADS)

    Knox, H. A.; Draelos, T.; Young, C. J.; Lawry, B.; Chael, E. P.; Faust, A.; Peterson, M. G.

    2015-12-01

    The quality of automatic detections from seismic sensor networks depends on a large number of data processing parameters that interact in complex ways. The largely manual process of identifying effective parameters is painstaking and does not guarantee that the resulting controls are the optimal configuration settings. Yet, achieving superior automatic detection of seismic events is closely related to these parameters. We present an automated sensor tuning (AST) system that learns near-optimal parameter settings for each event type using neuro-dynamic programming (reinforcement learning) trained with historic data. AST learns to test the raw signal against all event-settings and automatically self-tunes to an emerging event in real-time. The overall goal is to reduce the number of missed legitimate event detections and the number of false event detections. Reducing false alarms early in the seismic pipeline processing will have a significant impact on this goal. Applicable both for existing sensor performance boosting and new sensor deployment, this system provides an important new method to automatically tune complex remote sensing systems. Systems tuned in this way will achieve better performance than is currently possible by manual tuning, and with much less time and effort devoted to the tuning process. With ground truth on detections in seismic waveforms from a network of stations, we show that AST increases the probability of detection while decreasing false alarms.
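    As a deliberately simplified stand-in for the neuro-dynamic-programming tuner described above, an epsilon-greedy bandit over a discrete set of candidate detector settings conveys the core idea of learning parameters from historic, ground-truthed data. The reward function and candidate settings below are illustrative assumptions, not the AST system's actual design.

```python
import random

def tune_parameters(settings, evaluate, n_trials=2000, eps=0.1, seed=0):
    """Epsilon-greedy search over candidate detector parameter settings.

    `evaluate` scores one setting on historic ground-truth data, e.g. a reward
    of detections minus false alarms (here, any callable taking (setting, rng)).
    """
    rng = random.Random(seed)
    totals = {s: 0.0 for s in settings}
    counts = {s: 0 for s in settings}
    for _ in range(n_trials):
        if rng.random() < eps or not any(counts.values()):
            s = rng.choice(settings)                                     # explore
        else:
            s = max(settings, key=lambda a: totals[a] / max(counts[a], 1))  # exploit
        totals[s] += evaluate(s, rng)
        counts[s] += 1
    return max(settings, key=lambda a: totals[a] / max(counts[a], 1))

# toy evaluation: noisy reward peaked at a detection threshold of 0.5 (assumed shape)
def noisy_score(threshold, rng):
    return 1.0 - abs(threshold - 0.5) + rng.gauss(0, 0.1)

best = tune_parameters([0.1, 0.3, 0.5, 0.7, 0.9], noisy_score)
```

    The real system learns per-event-type settings and tunes online as events emerge, but the learn-from-reward loop above is the common core.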

  1. Modeling bed load transport and step-pool morphology with a reduced-complexity approach

    NASA Astrophysics Data System (ADS)

    Saletti, Matteo; Molnar, Peter; Hassan, Marwan A.; Burlando, Paolo

    2016-04-01

    Steep mountain channels are complex fluvial systems, where classical methods developed for lowland streams fail to capture the dynamics of sediment transport and bed morphology. Estimates of sediment transport based on average conditions carry more than one order of magnitude of uncertainty because of the wide grain-size distribution of the bed material, the small relative submergence of coarse grains, the episodic character of sediment supply, and the complex boundary conditions. Most notably, bed load transport is modulated by the structure of the bed, where grains are imbricated in steps and similar bedforms and are therefore much more stable than predicted. In this work we propose a new model based on a reduced-complexity (RC) approach focused on the reproduction of the step-pool morphology. In our 2-D cellular-automaton model, entrainment, transport and deposition of particles are considered via intuitive rules based on physical principles. A parsimonious set of parameters allows control of the behavior of the system, and the basic processes can be considered in a deterministic or stochastic way. The probability of entrainment of grains (and, as a consequence, particle travel distances and resting times) is a function of flow conditions and bed topography. Sediment input is fed at the upper boundary of the channel at a constant or variable rate. Our model yields realistic results in terms of longitudinal bed profiles and sediment transport trends. Phases of aggradation and degradation can be observed in the channel even under a constant input, and the memory of the morphology can be quantified with long-range persistence indicators. Sediment yield at the channel outlet shows intermittency as observed in natural streams. Steps are self-formed in the channel and their stability is tested against the model parameters. Our results show the potential of RC models as complementary tools to more sophisticated models. They provide a realistic description of complex morphological systems and help to better identify the key physical principles that rule their dynamics.
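    A minimal 1-D analogue of such a reduced-complexity rule set can be sketched as follows. This is illustrative only: the paper's model is 2-D with richer entrainment physics, and the probabilities, hop length, and boundary treatment here are assumptions chosen for brevity. Mass is conserved: every supplied grain either remains on the bed or exits at the outlet.

```python
import numpy as np

def rc_step(z, p_base=0.1, slope_factor=0.5, rng=None):
    """One sweep of a minimal 1-D reduced-complexity bed model (illustrative).

    Entrainment probability grows with the local downstream drop; an entrained
    grain hops one cell downstream, or leaves the domain at the outlet. The
    downstream boundary is treated as a free outfall (elevation zero).
    """
    rng = rng or np.random.default_rng()
    z = z.copy()
    grains_out = 0
    for i in range(len(z)):
        drop = z[i] - z[i + 1] if i + 1 < len(z) else z[i]
        p = min(1.0, max(0.0, p_base + slope_factor * drop))
        if z[i] > 0 and rng.random() < p:   # stochastic entrainment
            z[i] -= 1                       # erode one grain from this cell
            if i + 1 < len(z):
                z[i + 1] += 1               # deposit it one cell downstream
            else:
                grains_out += 1             # grain exits at the outlet
    return z, grains_out

# constant sediment supply at the upstream boundary
rng = np.random.default_rng(42)
z = np.zeros(50, dtype=int)
supplied = 0
total_out = 0
for t in range(500):
    z[0] += 1
    supplied += 1
    z, out = rc_step(z, rng=rng)
    total_out += out
```

    Even in this toy version, the outlet flux `out` is intermittent from step to step despite the constant supply, echoing the intermittent sediment yield the abstract describes.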

  2. Targeted Genome Editing Using DNA-Free RNA-Guided Cas9 Ribonucleoprotein for CHO Cell Engineering.

    PubMed

    Shin, Jongoh; Lee, Namil; Cho, Suhyung; Cho, Byung-Kwan

    2018-01-01

    Recent advances in the CRISPR/Cas9 system have dramatically facilitated genome engineering in various cell systems. Among the protocols, the direct delivery of the Cas9-sgRNA ribonucleoprotein (RNP) complex into cells is an efficient approach to increase genome editing efficiency. This method uses purified Cas9 protein and in vitro transcribed sgRNA to edit the target gene without vector DNA. We have applied the RNP complex to CHO cell engineering to obtain desirable phenotypes and to reduce unintended insertional mutagenesis and off-target effects. Here, we describe our routine methods for RNP complex-mediated gene deletion including the protocols to prepare the purified Cas9 protein and the in vitro transcribed sgRNA. Subsequently, we also describe a protocol to confirm the edited genomic positions using the T7E1 enzymatic assay and next-generation sequencing.

  3. PEM-PCA: a parallel expectation-maximization PCA face recognition architecture.

    PubMed

    Rujirakul, Kanokmon; So-In, Chakchai; Arnonkijpanich, Banchar

    2014-01-01

    Principal component analysis or PCA has been traditionally used as one of the feature extraction techniques in face recognition systems, yielding high accuracy while requiring a small number of features. However, the covariance matrix and eigenvalue decomposition stages cause high computational complexity, especially for a large database. Thus, this research presents an alternative approach utilizing an Expectation-Maximization algorithm to reduce the determinant matrix manipulation, resulting in the reduction of the stages' complexity. To improve the computational time, a novel parallel architecture was employed to utilize the benefits of parallelization of matrix computation during the feature extraction and classification stages, including parallel preprocessing, and their combinations, a so-called Parallel Expectation-Maximization PCA architecture. Compared with a traditional PCA and its derivatives, the results indicate lower complexity with an insignificant difference in recognition precision, leading to high-speed face recognition systems with speed-ups of over nine and three times relative to PCA and Parallel PCA, respectively.
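    The EM-for-PCA idea underlying PEM-PCA alternates an expectation step (infer latent coordinates given the current loadings) with a maximization step (refit the loadings), avoiding an explicit eigendecomposition of the covariance matrix. Below is a minimal serial sketch in the style of Roweis' EM algorithm for PCA, without the paper's parallel architecture; the matrix sizes and iteration count are illustrative assumptions.

```python
import numpy as np

def em_pca(Y, k, n_iter=100, seed=0):
    """EM algorithm for PCA (serial sketch).

    Y : (d, n) data matrix, assumed mean-centred
    k : number of principal components to recover
    Returns an orthonormal (d, k) basis spanning the principal subspace.
    """
    rng = np.random.default_rng(seed)
    C = rng.standard_normal((Y.shape[0], k))      # random initial loading matrix
    for _ in range(n_iter):
        X = np.linalg.solve(C.T @ C, C.T @ Y)     # E-step: latent coordinates
        C = Y @ X.T @ np.linalg.inv(X @ X.T)      # M-step: refit loadings
    Q, _ = np.linalg.qr(C)                        # orthonormalise the basis
    return Q

# toy check: data generated from a 2-D subspace embedded in 5 dimensions
rng = np.random.default_rng(1)
basis = rng.standard_normal((5, 2))
Y = basis @ rng.standard_normal((2, 200))
Q = em_pca(Y, 2)
```

    Each iteration costs only matrix products against small k × k systems, which is why the approach parallelizes well compared with a full eigendecomposition.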

  4. Utilizing the GentleWave® System for Debridement of Undetected Apical Anatomy.

    PubMed

    Ford, Michael W

    2018-03-01

    Debriding and disinfecting complex anatomies within the root canal system pose a major challenge during root canal therapy. Even with current chemomechanical techniques, debris and bacterial remnants are commonly left behind, which are generally believed to increase the risk of endodontic failure. This case details the use of a new technique to debride complex apical anatomy in a maxillary molar. A 48-year-old female presented to the clinic with a chief complaint of increasing pain in her tooth. Clinical examination of the right first maxillary molar (#3) revealed moderate sensitivity to percussion and mild sensitivity to palpation. A pulpal diagnosis of symptomatic irreversible pulpitis and a periapical diagnosis of symptomatic apical periodontitis were made. Mechanical instrumentation was performed using rotary file size #25/.04 for the mesiobuccal and distobuccal canals and size #25/.06 for the palatal canal to create a fluid path and enable obturation of the root canal system following the GentleWave® Procedure. The GentleWave Procedure was completed using Multisonic Ultracleaning™ for complete debridement and disinfection of the root canal system. The tooth was obturated using a warm vertical continuous wave obturation technique. Postoperative radiographs revealed complex anatomy within the apical third that was undetected both during pre-operative radiography and mechanical instrumentation. The palatal canal exhibited a complex apical delta with multiple points of exit, and the mesiobuccal canal revealed an undetected lateral canal within the apical third that had a separate and distinct egress. Conclusion and clinical significance: It is important for the clinician to debride and disinfect complex anatomy within the root canal system to reduce the risk of endodontic failure. This case report highlights the clinical significance of utilizing the GentleWave Procedure for detecting complex apical anatomy during endodontic therapy.

  5. Spacecraft Parachute Recovery System Testing from a Failure Rate Perspective

    NASA Technical Reports Server (NTRS)

    Stewart, Christine E.

    2013-01-01

    Spacecraft parachute recovery systems, especially those with a parachute cluster, require testing to identify and reduce failures. This is especially important when the spacecraft in question is human-rated. Due to the recent effort to make spaceflight affordable, the importance of determining a minimum requirement for testing has increased. The number of tests required to achieve a mature design, with a relatively constant failure rate, can be estimated from a review of previous complex spacecraft recovery systems. Examination of the Apollo parachute testing and the Shuttle Solid Rocket Booster recovery chute system operation will clarify at which point in those programs the system reached maturity. This examination will also clarify the risks inherent in not performing a sufficient number of tests prior to operation with humans on-board. When looking at complex parachute systems used in spaceflight landing systems, a pattern begins to emerge regarding the minimum amount of testing required to wring out the failure modes and reduce the failure rate of the parachute system to an acceptable level for human spaceflight. Not only is a sufficient number of system-level tests required, but also the ability to update the design as failure modes are found, in order to drive the failure rate of the system down to an acceptable level. In addition, sufficient data and images are necessary to identify incipient failure modes or to identify failure causes when a system failure occurs. To demonstrate the need for sufficient system-level testing before an acceptable failure rate is reached, the Apollo Earth Landing System (ELS) test program and the Shuttle Solid Rocket Booster Recovery System failure history will be examined, and some experiences with the Orion Capsule Parachute Assembly System will be noted.

  6. ENVIRONMENTAL CONSEQUENCES OF LAND USE CHANGE: ACCOUNTING FOR COMPLEXITY WITH AGENT-BASED MODELS

    EPA Science Inventory

    The effects of people on ecosystems and the impacts of ecosystem services on human well-being are being viewed increasingly as an integrated system. Demographic and economic pressures change a variety of ecological indicators, which can then result in reduced quality of ecosystem...

  7. Research Technology

    NASA Image and Video Library

    2004-04-15

    Harnessing the Sun's energy through Solar Thermal Propulsion will propel vehicles through space by significantly reducing weight, complexity, and cost while boosting performance over current conventional upper stages. Another solar powered system, solar electric propulsion, demonstrates ion propulsion is suitable for long duration missions. Pictured is an artist's concept of space flight using solar thermal propulsion.

  8. A Hierarchic System for Information Usage.

    ERIC Educational Resources Information Center

    Lu, John; Markham, David

    This paper demonstrates an approach which enables one to reduce in a systematic way the immense complexity of a large body of knowledge. This approach provides considerable insight into what is known and unknown in a given academic field by systematically and pragmatically ordering the information. As a case study, the authors selected…

  9. A Complex Systems View of Sepsis: Implications for Nursing

    DTIC Science & Technology

    2013-02-01

    resuscitation resulting from disease progression and requiring vasopressor therapy. Ultimately, the onset of multiple organ failure is the result of loss of...Kattwinkel J, et al. Mortality reduction by heart rate characteristic monitoring in very low birth weight neonates: a randomized trial. J Pediatr. 2011;159(6

  10. Elements of Engagement: A Model of Teacher Interactions via Professional Learning Networks

    ERIC Educational Resources Information Center

    Krutka, Daniel G.; Carpenter, Jeffrey P.; Trust, Torrey

    2016-01-01

    In recent years, many educators have turned to participatory online affinity spaces for professional growth with peers who are more accessible because of reduced temporal and spatial constraints. Specifically, professional learning networks (PLNs) are "uniquely personalized, complex systems of interactions consisting of people, resources, and…

  11. Channel Model Optimization with Reflection Residual Component for Indoor MIMO-VLC System

    NASA Astrophysics Data System (ADS)

    Chen, Yong; Li, Tengfei; Liu, Huanlin; Li, Yichao

    2017-12-01

    A fast channel modeling method is studied to solve the problem of computing the reflection channel gain for multiple-input multiple-output visible light communication (MIMO-VLC) systems. Because the computational complexity grows rapidly with the number of reflections, no more than three reflections are taken into consideration in VLC. We treat a higher-order reflection link as a composition of corresponding line-of-sight links, and we first present a reflection residual component to characterize higher-order reflections (more than two reflections). We present computer simulation results for the point-to-point channel impulse response, received optical power, and received signal-to-noise ratio. Based on theoretical analysis and simulation results, the proposed method can effectively reduce the computational complexity of higher-order reflections in channel modeling.
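The line-of-sight links that the abstract composes into higher-order reflection links follow the standard Lambertian DC-gain model (Kahn-Barry). The paper's reflection residual component itself is not reproduced here; this sketch only shows the LOS building block, with all parameter values illustrative.

```python
import math

def los_gain(m, area, d, phi, psi, psi_fov):
    """Lambertian line-of-sight DC channel gain for a VLC link:
    H = (m + 1) * A / (2 * pi * d^2) * cos^m(phi) * cos(psi),
    and zero outside the receiver field of view."""
    if abs(psi) > psi_fov:
        return 0.0
    return (m + 1) * area / (2 * math.pi * d * d) * math.cos(phi) ** m * math.cos(psi)

# example: a 1 cm^2 detector directly under the LED, at 1 m and at 2 m
g_near = los_gain(m=1, area=1e-4, d=1.0, phi=0.0, psi=0.0, psi_fov=math.pi / 3)
g_far = los_gain(m=1, area=1e-4, d=2.0, phi=0.0, psi=0.0, psi_fov=math.pi / 3)
```

The inverse-square falloff (g_far is one quarter of g_near) is why higher-order reflection contributions shrink quickly, which motivates approximating them with a residual term instead of exhaustive ray accounting.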

  12. Laccases as palladium oxidases.

    PubMed

    Mekmouche, Yasmina; Schneider, Ludovic; Rousselot-Pailley, Pierre; Faure, Bruno; Simaan, A Jalila; Bochot, Constance; Réglier, Marius; Tron, Thierry

    2015-02-01

    The first example of a coupled catalytic system involving an enzyme and a palladium(II) catalyst competent for the aerobic oxidation of alcohols under mild conditions is described. In the absence of dioxygen, the fungal laccase LAC3 is reduced by a palladium(0) species, as evidenced by the UV/Vis and ESR spectra of the enzyme. During the oxidation of veratryl alcohol performed in water, at room temperature and atmospheric pressure, LAC3 regenerates the palladium catalyst, is reduced, and catalyzes the four-electron reduction of dioxygen to water with no loss of enzyme activity. The association of a laccase with a water-soluble palladium complex results in a 7-fold increase in the catalytic efficiency of the complex. This is the first step in the design of a family of renewable palladium catalysts for aerobic oxidation.

  13. CPM Signals for Satellite Navigation in the S and C Bands.

    PubMed

    Xue, Rui; Sun, Yanbo; Zhao, Danfeng

    2015-06-05

    Frequency allocations in the L band suitable for global navigation satellite system (GNSS) services are getting crowded, and system providers face an ever tougher job when they try to bring in new signals and services while maintaining radio frequency compatibility. With the successive opening of the S and C bands to GNSS service, multi-band combined navigation is predicted to become a key technology for future high-precision positioning navigation systems, and a single modulation scheme satisfying the requirements in each band is a promising solution for reducing user terminal complexity. A universal modulation scheme based on the continuous phase modulation (CPM) family, suitable for the demands of these bands, is proposed. This paper also puts forward two specific CPM signals for the S and C bands, respectively. The proposed modulation schemes, together with existing candidates, are then comprehensively evaluated. Simulation results show that the proposed CPM signals can not only satisfy the compatibility constraints in the different bands and reduce user terminal complexity, but also provide superior performance in terms of tracking accuracy, multi-path mitigation and anti-jamming compared to other candidate modulation schemes.
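The defining property of the CPM family named in the abstract, a continuous phase and therefore a constant envelope, is easy to demonstrate. The sketch below generates baseband samples of the simplest family member (binary 1REC CPM, i.e. CPFSK); it is a generic illustration, not the paper's specific S- or C-band waveforms, and the modulation index and sampling values are illustrative.

```python
import cmath
import math

def cpm_baseband(symbols, h=0.5, sps=8):
    """Baseband samples of a binary 1REC CPM (CPFSK) signal: the phase
    advances linearly by pi*h*a_k over each symbol, so the phase is
    continuous across symbol boundaries and the envelope is exactly constant."""
    samples, phase = [], 0.0
    for a in symbols:
        for k in range(sps):
            samples.append(cmath.exp(1j * (phase + math.pi * h * a * k / sps)))
        phase += math.pi * h * a            # accumulated phase at symbol end
    return samples

s = cpm_baseband([1, -1, 1, 1, -1], h=0.5, sps=8)
```

The constant envelope is what lets user terminals use efficient saturated (nonlinear) amplifiers, one reason a single CPM scheme across bands can reduce terminal complexity.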

  14. Condition-based diagnosis of mechatronic systems using a fractional calculus approach

    NASA Astrophysics Data System (ADS)

    Gutiérrez-Carvajal, Ricardo Enrique; Flávio de Melo, Leonimer; Maurício Rosário, João; Tenreiro Machado, J. A.

    2016-07-01

    While fractional calculus (FC) is as old as integer calculus, its application has been mainly restricted to mathematics. However, many real systems are better described using FC equations than with integer models. FC is a suitable tool for describing systems characterised by their fractal nature, long-term memory and chaotic behaviour. It is a promising methodology for failure analysis and modelling, since the behaviour of a failing system depends on factors that increase the model's complexity. This paper explores the proficiency of FC in modelling complex behaviour by tuning only a few parameters. This work proposes a novel two-step strategy for diagnosis, first modelling common failure conditions and, second, by comparing these models with real machine signals and using the difference to feed a computational classifier. Our proposal is validated using an electrical motor coupled with a mechanical gear reducer.

  15. Integrated geometry and grid generation system for complex configurations

    NASA Technical Reports Server (NTRS)

    Akdag, Vedat; Wulf, Armin

    1992-01-01

    A grid generation system was developed that enables grid generation for complex configurations. The system called ICEM/CFD is described and its role in computational fluid dynamics (CFD) applications is presented. The capabilities of the system include full computer aided design (CAD), grid generation on the actual CAD geometry definition using robust surface projection algorithms, interfacing easily with known CAD packages through common file formats for geometry transfer, grid quality evaluation of the volume grid, coupling boundary condition set-up for block faces with grid topology generation, multi-block grid generation with or without point continuity and block to block interface requirement, and generating grid files directly compatible with known flow solvers. The interactive and integrated approach to the problem of computational grid generation not only substantially reduces manpower time but also increases the flexibility of later grid modifications and enhancements which is required in an environment where CFD is integrated into a product design cycle.

  16. Understanding Water-Stress Responses in Soybean Using Hydroponics System—A Systems Biology Perspective

    PubMed Central

    Tripathi, Prateek; Rabara, Roel C.; Shulaev, Vladimir; Shen, Qingxi J.; Rushton, Paul J.

    2015-01-01

    Deleterious changes in environmental conditions such as water stress bring physiological and biochemical changes in plants, which result in crop loss. Thus, combating water stress is important for crop improvement to meet the needs of a growing population. The utilization of a hydroponics system for growing plants is questionable to some researchers, as it does not represent actual field conditions. However, to address a complex problem like water stress, we must utilize a simpler growing condition, like the hydroponics system, wherein every input given to the plants can be controlled. Even with the advent of high-throughput technologies, it remains challenging to address all levels of the genetic machinery, whether gene, protein, metabolite, or promoter. Thus, using a system of reduced complexity like hydroponics can certainly direct us toward the right candidates, if not completely resolve the issue. PMID:26734044

  17. Diamond and diamond-like carbon MEMS

    NASA Astrophysics Data System (ADS)

    Luo, J. K.; Fu, Y. Q.; Le, H. R.; Williams, J. A.; Spearing, S. M.; Milne, W. I.

    2007-07-01

    To generate complex cartilage/bone tissues, scaffolds must possess several structural features that are difficult to create using conventional scaffold design/fabrication technologies. Successful cartilage/bone regeneration depends on the ability to assemble chondrocytes/osteoblasts into three-dimensional (3D) scaffolds. Therefore, we developed a 3D scaffold fabrication system that applies the axiomatic approach to our microstereolithography system. The new system offers a reduced machine size by minimizing the optical components, and shows that the design matrix is decoupled. This analysis identified the key factors affecting microstructure fabrication and an improved scaffold fabrication system was constructed. The results demonstrate that precise, predesigned 3D structures can be fabricated. Using this 3D scaffold, cell adhesion behavior was observed. The use of 3D scaffolds might help determine key factors in the study of cell behavior in complex environments and could eventually lead to the optimal design of scaffolds for the regeneration of various tissues, such as cartilage and bone.

  18. Encoding Schemes For A Digital Optical Multiplier Using The Modified Signed-Digit Number Representation

    NASA Astrophysics Data System (ADS)

    Lasher, Mark E.; Henderson, Thomas B.; Drake, Barry L.; Bocker, Richard P.

    1986-09-01

    The modified signed-digit (MSD) number representation offers fully parallel, carry-free addition. An MSD adder has been described by the authors. This paper describes how the adder can be used in a tree structure to implement an optical multiplication algorithm. Three different optical schemes, involving position, polarization, and intensity encoding, are proposed for realizing the trinary logic system. When configured in the generic multiplier architecture, these schemes yield the combinatorial logic necessary to carry out the multiplication algorithm. The optical systems are essentially three-dimensional arrangements composed of modular units. This modularity is important for design considerations, while the parallelism and noninterfering communication channels of optical systems are important from the standpoint of reduced complexity. The authors have also designed electronic hardware to demonstrate and model the combinatorial logic required to carry out the algorithm. The electronic and proposed optical systems will be compared in terms of complexity and speed.
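The carry-free addition underlying the adder can be sketched in software. This is a generic two-stage MSD addition rule (transfer digits decided by looking only at the next-lower position), not the authors' optical encoding; digit lists are least-significant-first and all names are illustrative.

```python
import random

def msd_to_int(digits):
    """Value of a modified signed-digit number; digits are -1/0/+1,
    least-significant first."""
    return sum(d * (1 << i) for i, d in enumerate(digits))

def msd_add(x, y):
    """Carry-free MSD addition: a transfer/interim-sum pass that looks only
    at the next-lower position, then a fully parallel digit-wise sum. No
    carry ever propagates more than one position, which is what makes a
    tree-structured multiplier built from such adders fast."""
    n = max(len(x), len(y)) + 1
    x = x + [0] * (n - len(x))
    y = y + [0] * (n - len(y))
    s = [x[i] + y[i] for i in range(n)]        # position sums in -2..2
    t = [0] * (n + 1)                          # transfer into each position
    w = [0] * n                                # interim sums
    for i in range(n):
        lower = s[i - 1] if i > 0 else 0
        if s[i] == 2:
            t[i + 1], w[i] = 1, 0
        elif s[i] == -2:
            t[i + 1], w[i] = -1, 0
        elif s[i] == 1:
            t[i + 1], w[i] = (1, -1) if lower >= 0 else (0, 1)
        elif s[i] == -1:
            t[i + 1], w[i] = (0, -1) if lower >= 0 else (-1, 1)
    # second stage is independent per digit: w[i] + t[i] stays in -1..1
    return [w[i] + t[i] for i in range(n)] + [t[n]]
```

Because the second stage needs no carry chain, all result digits can be produced simultaneously, the property the optical schemes exploit.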

  19. Pricing strategy in a dual-channel and remanufacturing supply chain system

    NASA Astrophysics Data System (ADS)

    Jiang, Chengzhi; Xu, Feng; Sheng, Zhaohan

    2010-07-01

    This article addresses the pricing strategy problems in a supply chain system where the manufacturer sells original products and remanufactured products via indirect retailer channels and direct Internet channels. Due to the complexity of that system, agent technologies that provide a new way for analysing complex systems are used for modelling. Meanwhile, in order to reduce the computational load of the search procedure for optimal prices and profits, a learning search algorithm is designed and implemented within the multi-agent supply chain model. The simulation results show that the proposed model can find the optimal prices of original products and remanufactured products in both channels, which lead to optimal profits for the manufacturer and the retailer. It is also found that the optimal profits are increased by introducing the direct channel and remanufacturing. Furthermore, the effects of customer preference, direct channel cost and remanufactured unit cost on optimal prices and profits are examined.

  20. Porting of EPICS to Real Time UNIX, and Usage Ported EPICS for FEL Automation

    NASA Astrophysics Data System (ADS)

    Salikova, Tatiana

    This article describes the concepts and mechanisms used in porting the EPICS (Experimental Physics and Industrial Control System) code to the UNIX operating system platform. Without destroying the EPICS architecture, the new features of EPICS provide support for the real-time operating system LynxOS/x86 and equipment produced by INP (Budker Institute of Nuclear Physics). Application of the ported EPICS reduces the cost of the software and hardware used for automation of the FEL (Free Electron Laser) complex.

  1. Chemical detection and laser wavelength stabilization employing spectroscopic absorption via laser compliance voltage sensing

    DOEpatents

    Taubman, Matthew S.; Phillips, Mark C.

    2016-01-12

    Systems and methods are disclosed that provide a direct indication of the presence and concentration of an analyte within the external cavity of a laser device that employ the compliance voltage across the laser device. The systems can provide stabilization of the laser wavelength. The systems and methods can obviate the need for an external optical detector, an external gas cell, or other sensing region and reduce the complexity and size of the sensing configuration.

  2. Applying Risk Management to Reduce The Overall Time In Lay-Up While Increasing the Cost Effectiveness of a Nimitz (CVN 68) Class Aircraft Carrier in Dry Dock During the Execution Phase of a Refueling and Complex Overhaul

    DTIC Science & Technology

    2009-03-01

    operational availability and modernization capability. Subject terms: Systems Engineering Process, Risk Management. Master of Science in Systems Engineering thesis, Naval Postgraduate School, March 2009 (137 pages). Author: Kiah Bernard Rahming; Thesis Advisor: Professor Gary O. Langford; Second Reader: Dr. Paul V. Shebalin; Dr. David H. Olwell, Chairman, Department of Systems

  3. A systems biology approach to studying Tai Chi, physiological complexity and healthy aging: design and rationale of a pragmatic randomized controlled trial.

    PubMed

    Wayne, Peter M; Manor, Brad; Novak, Vera; Costa, Madelena D; Hausdorff, Jeffrey M; Goldberger, Ary L; Ahn, Andrew C; Yeh, Gloria Y; Peng, C-K; Lough, Matthew; Davis, Roger B; Quilty, Mary T; Lipsitz, Lewis A

    2013-01-01

    Aging is typically associated with progressive multi-system impairment that leads to decreased physical and cognitive function and reduced adaptability to stress. Due to its capacity to characterize complex dynamics within and between physiological systems, the emerging field of complex systems biology and its array of quantitative tools show great promise for improving our understanding of aging, monitoring senescence, and providing biomarkers for evaluating novel interventions, including promising mind-body exercises, that treat age-related disease and promote healthy aging. An ongoing, two-arm randomized clinical trial is evaluating the potential of Tai Chi mind-body exercise to attenuate age-related loss of complexity. A total of 60 Tai Chi-naïve healthy older adults (aged 50-79) are being randomized to either six months of Tai Chi training (n=30), or to a waitlist control receiving unaltered usual medical care (n=30). Our primary outcomes are complexity-based measures of heart rate, standing postural sway and gait stride interval dynamics assessed at 3 and 6 months. Multiscale entropy and detrended fluctuation analysis are used as entropy- and fractal-based measures of complexity, respectively. Secondary outcomes include measures of physical and psychological function and tests of physiological adaptability also assessed at 3 and 6 months. Results of this study may lead to novel biomarkers that help us monitor and understand the physiological processes of aging and explore the potential benefits of Tai Chi and related mind-body exercises for healthy aging. Copyright © 2012 Elsevier Inc. All rights reserved.
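The multiscale entropy (MSE) outcome named in the abstract is built from sample entropy computed on coarse-grained copies of a signal. The sketch below shows both building blocks with conventional settings (m=2, r=0.2); it is a generic illustration, not the trial's analysis pipeline, and all names are illustrative.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r): -ln(A/B), where B and A count pairs of
    templates of length m and m+1 matching within tolerance r * std(x).
    Lower values indicate more regular (less complex) dynamics."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += int(np.sum(dist <= tol))
        return c
    return -np.log(count_matches(m + 1) / count_matches(m))

def coarse_grain(x, scale):
    """Non-overlapping window averages: the per-scale series used by MSE."""
    n = len(x) // scale
    return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(0)
white = rng.standard_normal(2000)                      # irregular signal
regular = np.sin(np.linspace(0, 40 * np.pi, 2000))     # highly regular signal
```

Running `sample_entropy` on `coarse_grain(x, s)` for s = 1, 2, 3, … yields the MSE curve; an irregular signal such as `white` scores far higher at scale 1 than the periodic `regular` signal.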

  4. Development of Boundary Condition Independent Reduced Order Thermal Models using Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Raghupathy, Arun; Ghia, Karman; Ghia, Urmila

    2008-11-01

    Compact Thermal Models (CTMs) to represent IC packages have traditionally been developed using the DELPHI-based (DEvelopment of Libraries of PHysical models for an Integrated design) methodology. The drawbacks of this method are presented, and an alternative method is proposed. A reduced-order model that provides the complete thermal information accurately with fewer computational resources can be used effectively in system-level simulations. Proper Orthogonal Decomposition (POD), a statistical method, can be used to reduce the number of degrees of freedom or variables in the computations for such a problem. POD along with the Galerkin projection allows us to create reduced-order models that reproduce the characteristics of the system with a considerable reduction in computational resources while maintaining a high level of accuracy. The goal of this work is to show that this method can be applied to obtain a boundary-condition-independent reduced-order thermal model for complex components. The methodology is applied to the 1D transient heat equation.

  5. Model reduction by trimming for a class of semi-Markov reliability models and the corresponding error bound

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Palumbo, Daniel L.

    1991-01-01

    Semi-Markov processes have proved to be an effective and convenient tool to construct models of systems that achieve reliability by redundancy and reconfiguration. These models are able to depict complex system architectures and to capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a model and a computational problem. Techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs were written to help the reliability analyst produce models of complex systems. This method, trimming, is easy to implement and the error bound easy to compute. Hence, the method lends itself to inclusion in an automatic model generator.
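The trimming idea, dropping very-low-probability transitions and bounding the resulting error, can be illustrated on a small discrete-time Markov reliability model. This is a generic sketch, not the paper's semi-Markov trimming method or its error bound; the duplex-system states, probabilities, and the simple bound used here are all illustrative assumptions.

```python
def step(dist, P):
    """One step of a discrete-time Markov reliability model: dist <- dist * P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# hypothetical duplex system: state 0 = both units up, 1 = one up after
# reconfiguration, 2 = system failed (absorbing)
p_fail, p_miss = 1e-4, 1e-2   # per-step unit failure / failed reconfiguration
P_full = [
    [1 - 2 * p_fail, 2 * p_fail * (1 - p_miss), 2 * p_fail * p_miss],
    [0.0,            1 - p_fail,                p_fail],
    [0.0,            0.0,                       1.0],
]
# "trimmed" model: the rare direct path 0 -> failed is removed and its
# probability mass folded back into staying in state 0
P_trim = [
    [1 - 2 * p_fail * (1 - p_miss), 2 * p_fail * (1 - p_miss), 0.0],
    [0.0,                           1 - p_fail,                p_fail],
    [0.0,                           0.0,                       1.0],
]
steps = 1000
d_full, d_trim = [1.0, 0.0, 0.0], [1.0, 0.0, 0.0]
for _ in range(steps):
    d_full, d_trim = step(d_full, P_full), step(d_trim, P_trim)
# trimming can only delay failure, so it underestimates unreliability by
# at most the total trimmed transition mass over the horizon
bound = steps * 2 * p_fail * p_miss
```

The point of an analytic bound like `bound` is the one the abstract makes: a safety argument can still be made from the smaller trimmed model.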

  6. Use of paired simple and complex models to reduce predictive bias and quantify uncertainty

    NASA Astrophysics Data System (ADS)

    Doherty, John; Christensen, Steen

    2011-12-01

    Modern environmental management and decision-making is based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, this reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promulgates good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights into the costs of model simplification, and into how some of these costs may be reduced. It then describes a methodology for paired model usage through which predictive bias of a simplified model can be detected and corrected, and postcalibration predictive uncertainty can be quantified. The methodology is demonstrated using a synthetic example based on groundwater modeling environments commonly encountered in northern Europe and North America.

  7. Effect of organic complexing agents on the interactions of Cs(+), Sr(2+) and UO(2)(2+) with silica and natural sand.

    PubMed

    Reinoso-Maset, Estela; Worsfold, Paul J; Keith-Roach, Miranda J

    2013-05-01

    Sorption processes play a key role in controlling radionuclide migration through subsurface environments and can be affected by the presence of anthropogenic organic complexing agents found at contaminated sites. The effect of these complexing agents on radionuclide-solid phase interactions is not well known. Therefore the aim of this study was to examine the processes by which EDTA, NTA and picolinate affect the sorption kinetics and equilibria of Cs(+), Sr(2+) and UO2(2+) onto natural sand. The caesium sorption rate and equilibrium were unaffected by the complexing agents. Strontium however showed greater interaction with EDTA and NTA in the presence of desorbed matrix cations than geochemical modelling predicted, with SrNTA(-) enhancing sorption and SrEDTA(2-) showing lower sorption than Sr(2+). Complexing agents reduced UO2(2+) sorption to silica and enhanced the sorption rate in the natural sand system. Elevated concentrations of picolinate reduced the sorption of Sr(2+) and increased the sorption rate of UO2(2+), demonstrating the potential importance of this complexing agent. These experiments provide a direct comparison of the sorption behaviour of Cs(+), Sr(2+) and UO2(2+)onto natural sand and an assessment of the relative effects of EDTA, NTA and picolinate on the selected elements. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Computation of the spectrum of spatial Lyapunov exponents for the spatially extended beam-plasma systems and electron-wave devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hramov, Alexander E.; Saratov State Technical University, Politechnicheskaja str., 77, Saratov 410054; Koronovskii, Alexey A.

    2012-08-15

    The spectrum of Lyapunov exponents is a powerful tool for the analysis of complex system dynamics. In the general framework of nonlinear dynamics, a number of numerical techniques have been developed to obtain the spectrum of Lyapunov exponents for the complex temporal behavior of systems with a few degrees of freedom. Unfortunately, these methods cannot be applied directly to the analysis of complex spatio-temporal dynamics of plasma devices, which are characterized by an infinite phase space, since they are spatially extended active media. In the present paper, we propose a method for the calculation of the spectrum of the spatial Lyapunov exponents (SLEs) for spatially extended beam-plasma systems. The calculation technique is applied to the analysis of chaotic spatio-temporal oscillations in three different beam-plasma models: (1) a simple plasma Pierce diode, (2) coupled Pierce diodes, and (3) an electron-wave system with a backward electromagnetic wave. We find an excellent agreement between the system dynamics and the behavior of the spectrum of the spatial Lyapunov exponents. Along with the proposed method, possible problems of SLE calculation are also discussed. It is shown that for a wide class of spatially extended systems, the set of quantities included in the system state for SLE calculation can be reduced using an appropriate feature of the plasma systems.
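For readers unfamiliar with the quantity being generalized, the classical temporal Lyapunov exponent for a low-dimensional system is easy to compute. The sketch below uses the logistic map, not the paper's spatially extended beam-plasma models (the SLE method itself is not reproduced here); parameter values are illustrative.

```python
import math

def lyapunov_logistic(r, x0=0.1, n=100000, burn=1000):
    """Largest (temporal) Lyapunov exponent of the logistic map
    x -> r*x*(1-x), estimated as the orbit average of log|f'(x)|,
    with f'(x) = r*(1 - 2x)."""
    x = x0
    for _ in range(burn):                 # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return acc / n

lam_chaotic = lyapunov_logistic(4.0)      # theory: ln 2, approx 0.6931
lam_periodic = lyapunov_logistic(3.2)     # stable 2-cycle: negative exponent
```

A positive exponent (exponential divergence of nearby orbits) signals chaos, a negative one regular motion; the SLE spectrum of the paper extends this diagnostic to fields rather than scalar states.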

  9. Optimization of wastewater treatment plant operation for greenhouse gas mitigation.

    PubMed

    Kim, Dongwook; Bowen, James D; Ozelkan, Ertunga C

    2015-11-01

    This study deals with the determination of optimal operation of a wastewater treatment system for minimizing greenhouse gas emissions, operating costs, and pollution loads in the effluent. To do this, an integrated performance index combining the three objectives was established to assess system performance. The ASMN_G model was used to perform system optimization aimed at determining a set of operational parameters that can satisfy the three different objectives. The complex nonlinear optimization problem was solved using the Nelder-Mead simplex optimization algorithm. A sensitivity analysis was performed to identify the operational parameters most influential on system performance. The results obtained from the optimization simulations for six scenarios demonstrated that there are apparent trade-offs among the three conflicting objectives. The best optimized system simultaneously reduced greenhouse gas emissions by 31%, reduced operating cost by 11%, and improved effluent quality by 2% compared to the base case operation. Copyright © 2015 Elsevier Ltd. All rights reserved.
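The optimization setup of the abstract, a derivative-free Nelder-Mead search over a weighted multi-objective index, can be sketched as follows. This is not the ASMN_G model: the two "operational parameters", the three index terms, and the weights below are stand-in assumptions for illustration only.

```python
def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Minimal Nelder-Mead simplex minimizer (reflect / expand / contract /
    shrink), a simplified variant of the derivative-free method named in
    the abstract."""
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):                       # initial simplex around x0
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst, second = simplex[0], simplex[-1], simplex[-2]
        if abs(f(worst) - f(best)) < tol:
            break
        c = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [2 * c[i] - worst[i] for i in range(n)]
        if f(refl) < f(best):
            expd = [3 * c[i] - 2 * worst[i] for i in range(n)]
            simplex[-1] = expd if f(expd) < f(refl) else refl
        elif f(refl) < f(second):
            simplex[-1] = refl
        else:
            cont = [(c[i] + worst[i]) / 2 for i in range(n)]
            if f(cont) < f(worst):
                simplex[-1] = cont
            else:                            # shrink toward the best vertex
                simplex = [best] + [[(p[i] + best[i]) / 2 for i in range(n)]
                                    for p in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

# hypothetical integrated performance index over two illustrative
# operational parameters (a DO setpoint and a recycle ratio)
def performance_index(x):
    do_set, recycle = x
    ghg = (do_set - 2.0) ** 2                # stand-in GHG emission term
    cost = 0.5 * (recycle - 1.0) ** 2        # stand-in operating cost term
    quality = 0.2 * (do_set - 2.5) ** 2      # stand-in effluent quality term
    return 0.4 * ghg + 0.3 * cost + 0.3 * quality

opt = nelder_mead(performance_index, [0.0, 0.0])
```

Because the real objective is evaluated through a plant simulation, a derivative-free simplex search of this kind is a natural fit: it only ever needs function values, never gradients.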

  10. New spectrophotometric methods for the determinations of hydrogen sulfide present in the samples of lake water, industrial effluents, tender coconut, sugarcane juice and egg

    NASA Astrophysics Data System (ADS)

    Shyla, B.; Nagendrappa, G.

    2012-10-01

    The new methods work on the principle that iron(III) is reduced to iron(II) by hydrogen sulfide with catechol and p-toluidine (system 1) or by hydrogen sulfide alone (system 2), in acidic medium, followed by complexation of the reduced iron with 1,10-phenanthroline (λmax 510 nm). The other two methods are based on redox reactions between electrolytically generated manganese(III) sulfate, taken in excess, and hydrogen sulfide, followed by the unreacted oxidant oxidizing diphenylamine (λmax 570 nm, system 3) or barium diphenylamine sulphonate (λmax 540 nm, system 4). The increase/decrease in the color intensity of the dye products of systems 1 and 2 or 3 and 4 is proportional to the concentration of hydrogen sulfide, with quantification ranges of 0.035-1.40 μg ml-1 and 0.14-1.40 μg ml-1, respectively.
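Quantification in methods like these rests on a linear calibration curve of absorbance against concentration (the Beer-Lambert regime). The sketch below fits such a line by least squares; the standards and the slope/intercept values are synthetic illustrations, not data from the paper.

```python
def fit_calibration(conc, absorbance):
    """Least-squares line A = m*c + b for a spectrophotometric calibration
    curve; unknown samples are then quantified as c = (A - b) / m."""
    n = len(conc)
    sx, sy = sum(conc), sum(absorbance)
    sxx = sum(c * c for c in conc)
    sxy = sum(c * a for c, a in zip(conc, absorbance))
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

# synthetic standards spanning the stated 0.035-1.40 ug/mL range
conc = [0.035, 0.35, 0.70, 1.05, 1.40]
absorb = [0.30 * c + 0.01 for c in conc]   # hypothetical slope and intercept
m, b = fit_calibration(conc, absorb)
```

With the fitted `m` and `b`, an unknown's hydrogen sulfide concentration follows from its measured absorbance at the system's λmax.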

  11. Using NASA's Space Launch System to Enable Game Changing Science Mission Designs

    NASA Technical Reports Server (NTRS)

    Creech, Stephen D.

    2013-01-01

    NASA's Marshall Space Flight Center is directing efforts to build the Space Launch System (SLS), a heavy-lift rocket that will help restore U.S. leadership in space by carrying the Orion Multi-Purpose Crew Vehicle and other important payloads far beyond Earth orbit. Its evolvable architecture will allow NASA to begin with Moon fly-bys and then go on to transport humans or robots to distant places such as asteroids, Mars, and the outer solar system. Designed to reduce spacecraft complexity, the SLS rocket will provide improved mass margins, radiation mitigation, and reduced mission durations. These capabilities offer attractive advantages for ambitious missions such as a Mars sample return, by reducing infrastructure requirements, cost, and schedule. For example, if an evolved expendable launch vehicle (EELV) were used for a proposed mission to investigate the Saturn system, a complicated trajectory would be required, with several gravity-assist planetary fly-bys to achieve the necessary outbound velocity. The SLS rocket, using significantly higher C3 energies, can more quickly and effectively take the mission directly to its destination, reducing trip times and cost. As this paper will report, the SLS rocket will launch payloads of unprecedented mass and volume, such as monolithic telescopes and in-space infrastructure. Thanks to its ability to co-manifest large payloads, it also can accomplish complex missions in fewer launches. Future analyses will include reviews of alternate mission concepts and detailed evaluations of SLS figures of merit, helping the new rocket revolutionize science mission planning and design for years to come.

  12. Multimodal ophthalmic imaging using swept source spectrally encoded scanning laser ophthalmoscopy and optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Malone, Joseph D.; El-Haddad, Mohamed T.; Tye, Logan A.; Majeau, Lucas; Godbout, Nicolas; Rollins, Andrew M.; Boudoux, Caroline; Tao, Yuankai K.

    2016-03-01

    Scanning laser ophthalmoscopy (SLO) and optical coherence tomography (OCT) benefit clinical diagnostic imaging in ophthalmology by enabling in vivo noninvasive en face and volumetric visualization of retinal structures, respectively. Spectrally encoding methods enable confocal imaging through fiber optics and reduce system complexity. Previous applications in ophthalmic imaging include spectrally encoded confocal scanning laser ophthalmoscopy (SECSLO) and a combined SECSLO-OCT system for image guidance, tracking, and registration. However, spectrally encoded imaging suffers from speckle noise because each spectrally encoded channel is effectively monochromatic. Here, we demonstrate in vivo human retinal imaging using a swept source spectrally encoded scanning laser ophthalmoscope and OCT (SS-SESLO-OCT) at 1060 nm. SS-SESLO-OCT uses a shared 100 kHz Axsun swept source and shared scanner and imaging optics, and both channels are detected simultaneously on a shared, dual-channel high-speed digitizer. SESLO illumination and detection were performed using the single mode core and multimode inner cladding of a double clad fiber coupler, respectively, to preserve lateral resolution while improving collection efficiency and reducing speckle contrast at the expense of confocality. Concurrent en face SESLO and cross-sectional OCT images were acquired with 1376 x 500 pixels at 200 frames-per-second. Our system design is compact and uses a shared light source, imaging optics, and digitizer, which reduces overall system complexity and ensures inherent co-registration between the SESLO and OCT FOVs. En face SESLO images acquired concurrently with OCT cross-sections enable lateral motion tracking and three-dimensional volume registration, with broad applications in multivolume OCT averaging, image mosaicking, and intraoperative instrument tracking.

  13. Learning reduced kinetic Monte Carlo models of complex chemistry from molecular dynamics.

    PubMed

    Yang, Qian; Sing-Long, Carlos A; Reed, Evan J

    2017-08-01

    We propose a novel statistical learning framework for automatically and efficiently building reduced kinetic Monte Carlo (KMC) models of large-scale elementary reaction networks from data generated by a single or few molecular dynamics simulations (MD). Existing approaches for identifying species and reactions from molecular dynamics typically use bond length and duration criteria, where bond duration is a fixed parameter motivated by an understanding of bond vibrational frequencies. In contrast, we show that for highly reactive systems, bond duration should be a model parameter that is chosen to maximize the predictive power of the resulting statistical model. We demonstrate our method on a high temperature, high pressure system of reacting liquid methane, and show that the learned KMC model is able to extrapolate more than an order of magnitude in time for key molecules. Additionally, our KMC model of elementary reactions enables us to isolate the most important set of reactions governing the behavior of key molecules found in the MD simulation. We develop a new data-driven algorithm to reduce the chemical reaction network which can be solved either as an integer program or efficiently using L1 regularization, and compare our results with simple count-based reduction. For our liquid methane system, we discover that rare reactions do not play a significant role in the system, and find that less than 7% of the approximately 2000 reactions observed from molecular dynamics are necessary to reproduce the molecular concentration over time of methane. The framework described in this work paves the way towards a genomic approach to studying complex chemical systems, where expensive MD simulation data can be reused to contribute to an increasingly large and accurate genome of elementary reactions and rates.
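The L1-regularized network reduction can be illustrated as a sparse regression: select the few reaction rates that suffice to reproduce an observed concentration signal. The synthetic design matrix and "true" reactions below are invented for illustration; the paper builds its models from MD data, not random numbers.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Toy stand-in for the paper's setting: n_obs observed concentration-change
# samples of one key molecule, each a linear combination of n_rxn candidate
# reaction contributions. Only a few reactions truly matter.
n_obs, n_rxn = 200, 50
X = rng.normal(size=(n_obs, n_rxn))          # per-reaction contributions
true_rates = np.zeros(n_rxn)
true_rates[[3, 17, 42]] = [2.0, -1.5, 1.0]   # the few important reactions
y = X @ true_rates + 0.01 * rng.normal(size=n_obs)

# L1 regularization drives unimportant reaction rates to exactly zero,
# yielding a reduced network -- an alternative to count-based pruning.
model = Lasso(alpha=0.05).fit(X, y)
kept = np.flatnonzero(np.abs(model.coef_) > 1e-6)
print(f"kept {kept.size} of {n_rxn} candidate reactions:", kept)
```

The `alpha` knob plays the role of the reduction threshold: larger values prune more aggressively, analogous to keeping only the small fraction of observed reactions needed to reproduce methane concentrations.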

  14. Learning reduced kinetic Monte Carlo models of complex chemistry from molecular dynamics

    PubMed Central

    Sing-Long, Carlos A.

    2017-01-01

    We propose a novel statistical learning framework for automatically and efficiently building reduced kinetic Monte Carlo (KMC) models of large-scale elementary reaction networks from data generated by a single or few molecular dynamics simulations (MD). Existing approaches for identifying species and reactions from molecular dynamics typically use bond length and duration criteria, where bond duration is a fixed parameter motivated by an understanding of bond vibrational frequencies. In contrast, we show that for highly reactive systems, bond duration should be a model parameter that is chosen to maximize the predictive power of the resulting statistical model. We demonstrate our method on a high temperature, high pressure system of reacting liquid methane, and show that the learned KMC model is able to extrapolate more than an order of magnitude in time for key molecules. Additionally, our KMC model of elementary reactions enables us to isolate the most important set of reactions governing the behavior of key molecules found in the MD simulation. We develop a new data-driven algorithm to reduce the chemical reaction network which can be solved either as an integer program or efficiently using L1 regularization, and compare our results with simple count-based reduction. For our liquid methane system, we discover that rare reactions do not play a significant role in the system, and find that less than 7% of the approximately 2000 reactions observed from molecular dynamics are necessary to reproduce the molecular concentration over time of methane. The framework described in this work paves the way towards a genomic approach to studying complex chemical systems, where expensive MD simulation data can be reused to contribute to an increasingly large and accurate genome of elementary reactions and rates. PMID:28989618

  15. Learning reduced kinetic Monte Carlo models of complex chemistry from molecular dynamics

    DOE PAGES

    Yang, Qian; Sing-Long, Carlos A.; Reed, Evan J.

    2017-06-19

    Here, we propose a novel statistical learning framework for automatically and efficiently building reduced kinetic Monte Carlo (KMC) models of large-scale elementary reaction networks from data generated by a single or few molecular dynamics simulations (MD). Existing approaches for identifying species and reactions from molecular dynamics typically use bond length and duration criteria, where bond duration is a fixed parameter motivated by an understanding of bond vibrational frequencies. In contrast, we show that for highly reactive systems, bond duration should be a model parameter that is chosen to maximize the predictive power of the resulting statistical model. We demonstrate our method on a high temperature, high pressure system of reacting liquid methane, and show that the learned KMC model is able to extrapolate more than an order of magnitude in time for key molecules. Additionally, our KMC model of elementary reactions enables us to isolate the most important set of reactions governing the behavior of key molecules found in the MD simulation. We develop a new data-driven algorithm to reduce the chemical reaction network which can be solved either as an integer program or efficiently using L1 regularization, and compare our results with simple count-based reduction. For our liquid methane system, we discover that rare reactions do not play a significant role in the system, and find that less than 7% of the approximately 2000 reactions observed from molecular dynamics are necessary to reproduce the molecular concentration over time of methane. Furthermore, we describe a framework in this work that paves the way towards a genomic approach to studying complex chemical systems, where expensive MD simulation data can be reused to contribute to an increasingly large and accurate genome of elementary reactions and rates.

  16. Optimal space-time attacks on system state estimation under a sparsity constraint

    NASA Astrophysics Data System (ADS)

    Lu, Jingyang; Niu, Ruixin; Han, Puxiao

    2016-05-01

    System state estimation in the presence of an adversary that injects false information into sensor readings has attracted much attention in wide application areas, such as target tracking with compromised sensors, secure monitoring of dynamic electric power systems, secure driverless cars, and radar tracking and detection in the presence of jammers. From a malicious adversary's perspective, the optimal strategy for attacking a multi-sensor dynamic system over sensors and over time is investigated. It is assumed that the system defender can perfectly detect the attacks and identify and remove sensor data once they are corrupted by false information injected by the adversary. With this in mind, the adversary's goal is to maximize the covariance matrix of the system state estimate by the end of the attack period under a sparse attack constraint, such that the adversary can only attack the system a few times over time and over sensors. The sparsity assumption is due to the adversary's limited resources and his/her intention to reduce the chance of being detected by the system defender. This becomes an integer programming problem, and its optimal solution by exhaustive search is intractable due to prohibitive complexity, especially for a system with a large number of sensors and over a large number of time steps. Several suboptimal solutions, such as those based on greedy search and dynamic programming, are proposed to find the attack strategies. Examples and numerical results are provided in order to illustrate the effectiveness and the reduced computational complexities of the proposed attack strategies.
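A minimal sketch of the greedy suboptimal strategy, assuming a scalar random-walk system whose attacked measurements are detected and discarded by the defender; the dynamics, noise values, and attack budget below are invented for illustration, not the paper's setup.

```python
import itertools

# Scalar random-walk state with S sensors over T time steps; an attacked
# (time, sensor) measurement is detected by the defender and discarded.
T, S = 5, 3
Q, R = 0.1, 0.5        # process / measurement noise variances (assumed)
P0 = 1.0               # initial estimation error variance

def final_variance(attacked):
    """Kalman-filter error variance after T steps when `attacked` slots are lost."""
    P = P0
    for t in range(T):
        P = P + Q                              # prediction step
        for s in range(S):
            if (t, s) not in attacked:
                P = 1.0 / (1.0 / P + 1.0 / R)  # scalar measurement update
    return P

# Greedy suboptimal strategy: with a sparse budget of k attacks, repeatedly
# add the (time, sensor) slot that most increases the defender's final
# error variance, instead of the intractable exhaustive search.
k, attacked = 4, set()
slots = set(itertools.product(range(T), range(S)))
for _ in range(k):
    best = max(slots - attacked, key=lambda a: final_variance(attacked | {a}))
    attacked.add(best)
print(sorted(attacked), final_variance(attacked))
```

Each greedy step costs one covariance evaluation per remaining slot, so the total cost is O(k·T·S) evaluations versus the combinatorial cost of exhaustive search over all size-k subsets.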

  17. Loss of Pink1 modulates synaptic mitochondrial bioenergetics in the rat striatum prior to motor symptoms: concomitant complex I respiratory defects and increased complex II-mediated respiration.

    PubMed

    Stauch, Kelly L; Villeneuve, Lance M; Purnell, Phillip R; Ottemann, Brendan M; Emanuel, Katy; Fox, Howard S

    2016-12-01

    Mutations in PTEN-induced putative kinase 1 (Pink1), a mitochondrial serine/threonine kinase, cause a recessive inherited form of Parkinson's disease (PD). Pink1 deletion in rats results in a progressive PD-like phenotype, characterized by significant motor deficits starting at 4 months of age. Despite the evidence of mitochondrial dysfunction, the pathogenic mechanism underlying disease due to Pink1-deficiency remains obscure. Striatal synaptic mitochondria from 3-month-old Pink1-deficient rats were characterized using bioenergetic and mass spectrometry (MS)-based proteomic analyses. Striatal synaptic mitochondria from Pink1-deficient rats exhibit decreased complex I-driven respiration and increased complex II-mediated respiration compared with wild-type rats. MS-based proteomics revealed 69 of the 811 quantified mitochondrial proteins were differentially expressed between Pink1-deficient rats and controls. Down-regulation of several electron carrier proteins, which shuttle electrons to reduce ubiquinone at complex III, in the Pink1-knockouts suggests disruption of the linkage between fatty acid, amino acid, and choline metabolism and the mitochondrial respiratory system. These results suggest that complex II activity is increased to compensate for loss of electron transfer mechanisms due to reduced complex I activity and loss of electron carriers within striatal nerve terminals early during disease progression. This may contribute to the pathogenesis of PD. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Performance Enhancement of Radial Distributed System with Distributed Generators by Reconfiguration Using Binary Firefly Algorithm

    NASA Astrophysics Data System (ADS)

    Rajalakshmi, N.; Padma Subramanian, D.; Thamizhavel, K.

    2015-03-01

    The extent of real power loss and voltage deviation associated with overloaded feeders in radial distribution system can be reduced by reconfiguration. Reconfiguration is normally achieved by changing the open/closed state of tie/sectionalizing switches. Finding optimal switch combination is a complicated problem as there are many switching combinations possible in a distribution system. Hence optimization techniques are finding greater importance in reducing the complexity of reconfiguration problem. This paper presents the application of firefly algorithm (FA) for optimal reconfiguration of radial distribution system with distributed generators (DG). The algorithm is tested on IEEE 33 bus system installed with DGs and the results are compared with binary genetic algorithm. It is found that binary FA is more effective than binary genetic algorithm in achieving real power loss reduction and improving voltage profile and hence enhancing the performance of radial distribution system. Results are found to be optimum when DGs are added to the test system, which proved the impact of DGs on distribution system.
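A minimal sketch of a binary firefly search over switch states, assuming a toy Hamming-distance loss in place of a real load-flow computation; the target configuration, population size, and parameter values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy surrogate: each bit is the open/closed state of a tie/sectionalizing
# switch, and "real power loss" is lowest for one hypothetical target
# configuration. A real implementation would run a load-flow solver here.
TARGET = np.array([1, 0, 1, 1, 0, 0, 1, 0])

def power_loss(bits):
    return int(np.sum(bits != TARGET))   # stand-in for load-flow losses

def binary_firefly(n_fireflies=12, n_iter=60, beta0=1.0, gamma=0.1):
    pop = rng.integers(0, 2, size=(n_fireflies, TARGET.size))
    for _ in range(n_iter):
        loss = np.array([power_loss(f) for f in pop])
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if loss[j] < loss[i]:          # j is brighter: i moves toward j
                    r = np.sum(pop[i] != pop[j])          # Hamming distance
                    beta = beta0 * np.exp(-gamma * r ** 2)  # attractiveness
                    # copy j's bits with probability beta, then mutate lightly
                    move = rng.random(TARGET.size) < beta
                    pop[i] = np.where(move, pop[j], pop[i])
                    flip = rng.random(TARGET.size) < 0.05
                    pop[i] = np.where(flip, 1 - pop[i], pop[i])
    loss = np.array([power_loss(f) for f in pop])
    return pop[np.argmin(loss)], int(loss.min())

best, best_loss = binary_firefly()
print(best, best_loss)
```

The brightest firefly never moves within an iteration, so the best loss is non-increasing; the sigmoid/probabilistic bit-copying is what distinguishes the binary variant from the continuous FA.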

  19. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    NASA Astrophysics Data System (ADS)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data is sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability problem of a class of reaction-diffusion complex dynamical systems with time delay. By using the matrix inequality technique and the Lyapunov method, synchronization conditions for the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally rules out Zeno behavior. Finally, a numerical example is given to verify the obtained results.

  20. Rapid Evolution of a Coadapted Gene Complex: Evidence from the Segregation Distorter (Sd) System of Meiotic Drive in Drosophila Melanogaster

    PubMed Central

    Palopoli, M. F.; Wu, C. I.

    1996-01-01

    Segregation Distorter (SD) is a system of meiotic drive found in natural populations of Drosophila melanogaster. Males heterozygous for an SD second chromosome and a normal homologue (SD(+)) produce predominantly SD-bearing sperm. The coadapted gene complex responsible for this transmission advantage spans the second chromosome centromere, consisting of three major and several minor interacting loci. To investigate the evolutionary history of this system, we surveyed levels of polymorphism and divergence at six genes that together encompass this pericentromeric region and span seven map units. Interestingly, there was no discernible divergence between SD and SD(+) chromosomes for any of these molecular markers. Furthermore, SD chromosomes harbored much less polymorphism than did SD(+) chromosomes. The results suggest that the SD system evolved recently, swept to appreciable frequencies worldwide, and carried with it the entire second chromosome centromeric region (roughly 10% of the genome). Despite its well-documented genetic complexity, this coadapted system appears to have evolved on a time scale that is much shorter than can be gauged using nucleotide substitution data. Finally, the large genomic region hitchhiking with SD indicates that a multilocus, epistatically selected system could affect the levels of DNA polymorphism observed in regions of reduced recombination. PMID:8844155

  1. Improved dense trajectories for action recognition based on random projection and Fisher vectors

    NASA Astrophysics Data System (ADS)

    Ai, Shihui; Lu, Tongwei; Xiong, Yudian

    2018-03-01

    As an important application of intelligent monitoring systems, action recognition in video has become a very important research area of computer vision. In order to improve the accuracy of action recognition in video with improved dense trajectories, an advanced encoding method is introduced that combines Fisher vectors with random projection. The method reduces the dimensionality of the characteristic trajectories by projecting the high-dimensional trajectory descriptors into a low-dimensional subspace via random projection, after defining and analyzing a Gaussian mixture model. A GMM-FV hybrid model is introduced to encode the trajectory feature vectors and reduce dimension. The computational complexity is reduced by the random projection, which compresses the Fisher coding vector. Finally, a linear SVM classifier is used to predict labels. We tested the algorithm on the UCF101 and KTH datasets. Compared with some existing algorithms, the results showed that the method not only reduces the computational complexity but also improves the accuracy of action recognition.
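The dimensionality-reduction step can be sketched as a Gaussian random projection of high-dimensional Fisher vectors; the descriptor size, GMM component count, and target dimension below are assumptions for illustration, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# A GMM Fisher vector over D-dim trajectory descriptors with K components
# has dimension 2*K*D (gradients w.r.t. means and variances) -- large.
K, D = 64, 96
fv_dim, target_dim = 2 * K * D, 512
fisher_vectors = rng.normal(size=(10, fv_dim))   # hypothetical encoded clips

# Random projection: a dense Gaussian matrix scaled by 1/sqrt(target_dim)
# approximately preserves pairwise distances (Johnson-Lindenstrauss), so
# the cheap linear SVM can be trained in the low-dimensional space instead.
proj = rng.normal(size=(fv_dim, target_dim)) / np.sqrt(target_dim)
reduced = fisher_vectors @ proj
print(fisher_vectors.shape, "->", reduced.shape)
```

Because the projection is data-independent, it costs a single matrix multiply per clip, which is the source of the complexity reduction claimed above.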

  2. Analysis of Complex Valve and Feed Systems

    NASA Technical Reports Server (NTRS)

    Ahuja, Vineet; Hosangadi, Ashvin; Shipman, Jeremy; Cavallo, Peter; Dash, Sanford

    2007-01-01

    A numerical framework for analysis of complex valve systems supports testing of propulsive systems by simulating key valve and control system components in the test loop. In particular, it is designed to enhance the analysis capability in terms of identifying system transients and quantifying the valve response to these transients. This system has analysis capability for simulating valve motion in complex systems operating in diverse flow regimes ranging from compressible gases to cryogenic liquids. A key feature is the hybrid, unstructured framework with sub-models for grid movement and phase change, including cryogenic cavitation. The multi-element unstructured framework offers improved predictions of valve performance characteristics under steady conditions for structurally complex valves such as the pressure regulator valve. Unsteady simulations of valve motion using this computational approach have been carried out for various valves in operation at Stennis Space Center, such as the split-body valve, the 10-in. (approx. 25.4 cm) LOX (liquid oxygen) valve, and the 4-in. (approx. 10 cm) Y-pattern valve (liquid nitrogen). Such simulations make use of variable grid topologies, thereby permitting solution accuracy and resolving important flow physics in the seat region of the moving valve. Advantages of this software include a possible reduction in testing costs incurred due to disruptions relating to unexpected flow transients or malfunctioning of valve/flow control systems. Prediction of the flow anomalies leading to system vibrations, flow resonance, and valve stall can help in valve scheduling and significantly reduce the need for activation tests. This framework has been evaluated for its ability to predict performance metrics like flow coefficient for cavitating venturis and valve coefficient curves, and could be a valuable tool in predicting and understanding anomalous behavior of system components at rocket propulsion testing and design sites.

  3. Fire detection system using random forest classification for image sequences of complex background

    NASA Astrophysics Data System (ADS)

    Kim, Onecue; Kang, Dong-Joong

    2013-06-01

    We present a fire alarm system based on image processing that detects fire accidents in various environments. To reduce false alarms that frequently appeared in earlier systems, we combined image features including color, motion, and blinking information. We specifically define the color conditions of fires in hue, saturation and value, and RGB color space. Fire features are represented as intensity variation, color mean and variance, motion, and image differences. Moreover, blinking fire features are modeled by using crossing patches. We propose an algorithm that classifies patches into fire or nonfire areas by using random forest supervised learning. We design an embedded surveillance device made with acrylonitrile butadiene styrene housing for stable fire detection in outdoor environments. The experimental results show that our algorithm works robustly in complex environments and is able to detect fires in real time.
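The patch classification step can be sketched with scikit-learn's random forest; the five-feature patch representation and the synthetic fire/non-fire clusters below are hypothetical stand-ins for the paper's color, motion, and blinking features.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

# Hypothetical per-patch features in the spirit of the paper: color mean,
# color variance, intensity variation, motion magnitude, frame difference.
def make_patches(n, fire):
    base = (np.array([0.8, 0.3, 0.6, 0.5, 0.4]) if fire
            else np.array([0.3, 0.1, 0.1, 0.1, 0.05]))
    return base + 0.05 * rng.normal(size=(n, 5))

# Synthetic labeled training set: 200 fire patches, 200 non-fire patches.
X = np.vstack([make_patches(200, True), make_patches(200, False)])
y = np.array([1] * 200 + [0] * 200)

# Random forest supervised learning classifies patches as fire / non-fire;
# combining several weak feature cues this way is what suppresses the
# false alarms of single-cue detectors.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
probe = make_patches(5, True)
print(clf.predict(probe))
```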

  4. A linguistic geometry for space applications

    NASA Technical Reports Server (NTRS)

    Stilman, Boris

    1994-01-01

    We develop a formal theory, the so-called Linguistic Geometry, in order to discover the inner properties of human expert heuristics that were successful in a certain class of complex control systems, and to apply them to different systems. This research relies on the formalization of the search heuristics of highly skilled human experts, which allow for the decomposition of a complex system into a hierarchy of subsystems, and thus solve intractable problems by reducing the search. The hierarchy of subsystems is represented as a hierarchy of formal attribute languages. This paper includes a formal survey of Linguistic Geometry and a new example of the solution of an optimization problem for space robotic vehicles. This example includes the actual generation of the hierarchy of languages and some details of trajectory generation, and demonstrates the drastic reduction of search in comparison with conventional search algorithms.

  5. Phase from defocus

    NASA Astrophysics Data System (ADS)

    Mandula, Ondrej; Allier, Cédric; Hervé, Lionel; Denarier, Eric; Fourest-Lieuvin, Anne; Gory-Fauré, Sylvie; Vinit, Angélique; Morales, Sophie

    2018-02-01

    We present a simple and compact phase imaging microscope for long-term observation of non-absorbing biological samples such as unstained cells in nutritive media. The phase image is obtained from a single defocused image taken with a standard wide-field microscope. Using a semi-coherent light source allows us to computationally re-focus the image post-acquisition and recover both the phase and transmission of the complex specimen. The simplicity of the system reduces both its cost and its physical size, and allows long-term observation of samples directly in a standard biological incubator. The low cost of the system can contribute to the democratization of science by allowing laboratories with constrained budgets to perform complex long-term biological experiments. In this proceeding we present several results taken with our prototype and discuss the possibilities and limitations of our system.

  6. Case Management for Patients with Complex Multimorbidity: Development and Validation of a Coordinated Intervention between Primary and Hospital Care

    PubMed Central

    Giménez-Campos, María Soledad; Villar-López, Julia; Faubel-Cava, Raquel; Donat-Castelló, Lucas; Valdivieso-Martínez, Bernardo; Soriano-Melchor, Elisa; Bahamontes-Mulió, Amparo; García-Gómez, Juan M.

    2017-01-01

    In the past few years, healthcare systems have been facing a growing demand related to the high prevalence of chronic diseases. Case management programs have emerged as an integrated care approach for the management of chronic disease. Nevertheless, there is little scientific evidence on the impact of using a case management program for patients with complex multimorbidity regarding hospital resource utilisation. We evaluated an integrated case management intervention set up by community-based care at outpatient clinics with nurse case managers from a telemedicine unit. The hypothesis to be tested was whether improved continuity of care resulting from the integration of community-based and hospital services reduced the use of hospital resources amongst patients with complex multimorbidity. A retrospective cohort study was performed using a sample of 714 adult patients admitted to the program between January 2012 and January 2015. We found a significant decrease in the number of emergency room visits, unplanned hospitalizations, and length of stay, and an expected increase in the home care hospital-based episodes. These results support the hypothesis that case management interventions can reduce the use of unplanned hospital admissions when applied to patients with complex multimorbidity. PMID:28970745

  7. Comparison of energy interaction parameters for the complexation of Pr(III) with glutathione reduced (GSH) in absence and presence of Zn(II) in aqueous and aquated organic solvents using 4f-4f transition spectra as PROBE

    NASA Astrophysics Data System (ADS)

    Singh, Th. David; Sumitra, Ch.; Yaiphaba, N.; Devi, H. Debecca; Devi, M. Indira; Singh, N. Rajmuhon

    2005-04-01

    The coordination chemistry of glutathione reduced (GSH) is of great importance as it acts as an excellent model system for the binding of metal ions. The GSH complexation with metal ions is involved in the toxicology of different metal ions. Its coordination behaviour differs for soft and hard metal ions because of the structure of GSH and its different potential binding sites. In our work we have studied two chemically dissimilar metal ions, viz. Pr(III), which prefers hard donor sites like carboxylic groups, and Zn(II), a soft metal ion which prefers peptide-NH and sulphydryl groups. The absorption difference and comparative absorption spectroscopy involving 4f-4f transitions of the heterobimetallic complexation of GSH with Pr(III) and Zn(II) have been explored in aqueous and aquated organic solvents. The variations in the energy parameters like Slater-Condon (FK), Racah (EK) and Lande (ξ4f), the nephelauxetic parameter (β) and the bonding parameter (b1/2) are computed to explain the nature of complexation.

  8. High pressure homogenization to improve the stability of casein - hydroxypropyl cellulose aqueous systems.

    PubMed

    Ye, Ran; Harte, Federico

    2014-03-01

    The effect of high pressure homogenization on the improvement of the stability of hydroxypropyl cellulose (HPC) and micellar casein mixtures was investigated. HPC with two molecular weights (80 and 1150 kDa) and micellar casein were mixed in water to a concentration leading to phase separation (0.45% w/v HPC and 3% w/v casein) and immediately subjected to high pressure homogenization ranging from 0 to 300 MPa, in 100 MPa increments. The various dispersions were evaluated for stability, particle size, turbidity, protein content, and viscosity over a period of two weeks, and by Scanning Transmission Electron Microscopy (STEM) at the end of the storage period. The stability of casein-HPC complexes was enhanced with increasing homogenization pressure, especially for the complex containing high molecular weight HPC. The apparent particle size of the complexes was reduced from ~200 nm to ~130 nm when using 300 MPa, corresponding to the sharp decrease in absorbance when compared to the non-homogenized controls. High pressure homogenization reduced the viscosity of HPC-casein complexes regardless of the molecular weight of HPC, and STEM images revealed aggregates consistent with nano-scale protein-polysaccharide interactions.

  9. High pressure homogenization to improve the stability of casein - hydroxypropyl cellulose aqueous systems

    PubMed Central

    Ye, Ran; Harte, Federico

    2013-01-01

    The effect of high pressure homogenization on the improvement of the stability of hydroxypropyl cellulose (HPC) and micellar casein mixtures was investigated. HPC with two molecular weights (80 and 1150 kDa) and micellar casein were mixed in water to a concentration leading to phase separation (0.45% w/v HPC and 3% w/v casein) and immediately subjected to high pressure homogenization ranging from 0 to 300 MPa, in 100 MPa increments. The various dispersions were evaluated for stability, particle size, turbidity, protein content, and viscosity over a period of two weeks, and by Scanning Transmission Electron Microscopy (STEM) at the end of the storage period. The stability of casein-HPC complexes was enhanced with increasing homogenization pressure, especially for the complex containing high molecular weight HPC. The apparent particle size of the complexes was reduced from ~200 nm to ~130 nm when using 300 MPa, corresponding to the sharp decrease in absorbance when compared to the non-homogenized controls. High pressure homogenization reduced the viscosity of HPC-casein complexes regardless of the molecular weight of HPC, and STEM images revealed aggregates consistent with nano-scale protein-polysaccharide interactions. PMID:24159250

  10. Faster Finances

    NASA Technical Reports Server (NTRS)

    1976-01-01

    TRW has applied the Apollo checkout procedures to retail-store and bank-transaction systems, as well as to control systems for electric power transmission grids -- reducing the chance of power blackouts. Automatic checkout equipment for Apollo Spacecraft is one of the most complex computer systems in the world. Used to integrate extensive Apollo checkout procedures from manufacture to launch, it has spawned major advances in computer systems technology. Store and bank credit system has caused significant improvement in speed and accuracy of transactions, credit authorization, and inventory control. A similar computer service called "Validata" is used nationwide by airlines, airline ticket offices, car rental agencies, and hotels.

  11. Reduced-Order Structure-Preserving Model for Parallel-Connected Three-Phase Grid-Tied Inverters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Brian B; Purba, Victor; Jafarpour, Saber

    Next-generation power networks will contain large numbers of grid-connected inverters satisfying a significant fraction of system load. Since each inverter model has a relatively large number of dynamic states, it is impractical to analyze complex system models where the full dynamics of each inverter are retained. To address this challenge, we derive a reduced-order structure-preserving model for parallel-connected grid-tied three-phase inverters. Here, each inverter in the system is assumed to have a full-bridge topology and an LCL filter at the point of common coupling, and the control architecture for each inverter includes a current controller, a power controller, and a phase-locked loop for grid synchronization. We outline a structure-preserving reduced-order inverter model with lumped parameters for the setting where the parallel inverters are each designed such that the filter components and controller gains scale linearly with the power rating. By structure preserving, we mean that the reduced-order three-phase inverter model is also composed of an LCL filter, a power controller, a current controller, and a PLL. We show that the system of parallel inverters can be modeled exactly as one aggregated inverter unit, and this equivalent model has the same number of dynamical states as any individual inverter in the system. Numerical simulations validate the reduced-order model.
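The linear-scaling aggregation rule can be sketched numerically: if every filter component and controller gain scales linearly with the power rating, N parallel units collapse exactly into one N-times-rated inverter. The base parameter values and the scaling of each quantity below are illustrative assumptions, not the paper's numbers.

```python
# Hypothetical illustration of the scaling rule described above.
def scale_inverter(base, rating_kw, base_kw=10.0):
    """Parameters of an inverter rated at rating_kw, scaled from a base unit."""
    k = rating_kw / base_kw
    return {
        "L_f": base["L_f"] / k,    # filter inductance shrinks with rating
        "C_f": base["C_f"] * k,    # filter capacitance grows with rating
        "kp_i": base["kp_i"] / k,  # current-controller gain scales too
    }

base = {"L_f": 1.0e-3, "C_f": 24e-6, "kp_i": 5.0}       # assumed 10 kW unit
units = [scale_inverter(base, 10.0) for _ in range(4)]  # four parallel 10 kW

# Aggregation: parallel inductances combine as L/N, capacitances as N*C,
# and the lumped gains divide by N -- exactly one 40 kW inverter under
# the linear scaling rule, with the same number of dynamic states.
agg = {"L_f": units[0]["L_f"] / len(units),
       "C_f": units[0]["C_f"] * len(units),
       "kp_i": units[0]["kp_i"] / len(units)}
print(agg == scale_inverter(base, 40.0))  # → True
```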

  12. A Review of Accident Modelling Approaches for Complex Critical Sociotechnical Systems

    DTIC Science & Technology

    2008-01-01

    several Signals Passed at Danger (SPADs) but a signal sighting committee was not convened 4. An inspection of the signalling system at Watford ... Junction was never carried out 5. The driver did not know of the reduced overlap between signals 6. The driver had committed SPADs recently and ... cause. One philosophical approach to causation views counterfactual dependence as the key to the explanation of causal facts: for example, events c (the

  13. Rapid architecture alternative modeling (RAAM): A framework for capability-based analysis of system of systems architectures

    NASA Astrophysics Data System (ADS)

    Iacobucci, Joseph V.

    The research objective for this manuscript is to develop a Rapid Architecture Alternative Modeling (RAAM) methodology to enable traceable Pre-Milestone A decision making during the conceptual phase of design of a system of systems. Rather than following current trends that place an emphasis on adding more analysis, which tends to increase the complexity of the decision-making problem, RAAM improves on current methods by reducing both runtime and model creation complexity. RAAM draws upon principles from computer science, system architecting, and domain specific languages to enable the automatic generation and evaluation of architecture alternatives. For example, both mission dependent and mission independent metrics are considered. Mission dependent metrics are determined by the performance of systems accomplishing a task, such as Probability of Success. In contrast, mission independent metrics, such as acquisition cost, are solely determined and influenced by the other systems in the portfolio. RAAM also leverages advances in parallel computing to significantly reduce runtime by defining executable models that are readily amenable to parallelization. This allows the use of cloud computing infrastructures such as Amazon's Elastic Compute Cloud and the PASTEC cluster operated by the Georgia Institute of Technology Research Institute (GTRI). Also, the amount of data that can be generated when fully exploring the design space can quickly exceed the typical capacity of computational resources at the analyst's disposal. To counter this, specific algorithms and techniques are employed. Streaming algorithms and recursive architecture alternative evaluation algorithms are used that reduce computer memory requirements. Lastly, a domain specific language is created to provide a reduction in the computational time of executing the system of systems models.
A domain specific language is a small, usually declarative language that offers expressive power focused on a particular problem domain by establishing an effective means to communicate the semantics from the RAAM framework. These techniques make it possible to include diverse multi-metric models within the RAAM framework in addition to system and operational level trades. A canonical example was used to explore the uses of the methodology. The canonical example contains all of the features of a full system of systems architecture analysis study but uses fewer tasks and systems. Using RAAM with the canonical example it was possible to consider both system and operational level trades in the same analysis. Once the methodology had been tested with the canonical example, a Suppression of Enemy Air Defenses (SEAD) capability model was developed. Due to the sensitive nature of analyses on that subject, notional data was developed. The notional data has similar trends and properties to realistic Suppression of Enemy Air Defenses data. RAAM was shown to be traceable and provided a mechanism for a unified treatment of a variety of metrics. The SEAD capability model demonstrated lower computer runtimes and reduced model creation complexity as compared to methods currently in use. To determine the usefulness of the implementation of the methodology on current computing hardware, RAAM was tested with system of system architecture studies of different sizes. This was necessary since system of systems may be called upon to accomplish thousands of tasks. It has been clearly demonstrated that RAAM is able to enumerate and evaluate the types of large, complex design spaces usually encountered in capability based design, oftentimes providing the ability to efficiently search the entire decision space. The core algorithms for generation and evaluation of alternatives scale linearly with expected problem sizes. 
The SEAD capability model outputs prompted the discovery of a new issue: the data storage and manipulation requirements for an analysis. Two strategies were developed to counter large data sizes: the use of portfolio views and top 'n' analysis. This proved the usefulness of the RAAM framework and methodology during Pre-Milestone A capability-based analysis. (Abstract shortened by UMI.)
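
The "top 'n' analysis" strategy mentioned above can be sketched with a standard streaming pattern. This is an illustrative Python sketch, not the RAAM implementation; all names and the synthetic score stream are invented for the example: a bounded min-heap retains only the n best-scoring alternatives, so memory stays constant no matter how large the enumerated design space grows.

```python
import heapq

def top_n(alternatives, n):
    """alternatives: iterable of (score, name); returns the n highest-scoring
    entries in descending order, using O(n) memory for the whole stream."""
    heap = []  # min-heap holding the current best n candidates
    for score, name in alternatives:
        if len(heap) < n:
            heapq.heappush(heap, (score, name))
        elif score > heap[0][0]:
            # new candidate beats the worst retained one: swap it in
            heapq.heapreplace(heap, (score, name))
    return sorted(heap, reverse=True)

# Synthetic stream of 10,000 scored architecture alternatives
stream = ((s * 37 % 101, f"arch-{s}") for s in range(10000))
best = top_n(stream, 3)
```

Because the generator is consumed lazily, the full design space is never materialized, which mirrors the memory-reduction goal of the streaming algorithms described in the abstract.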

  14. POD Model Reconstruction for Gray-Box Fault Detection

    NASA Technical Reports Server (NTRS)

    Park, Han; Zak, Michail

    2007-01-01

    Proper orthogonal decomposition (POD) is the mathematical basis of a method of constructing low-order mathematical models for the "gray-box" fault-detection algorithm that is a component of a diagnostic system known as beacon-based exception analysis for multi-missions (BEAM). POD has been successfully applied to reduce computational complexity by generating simple models that can be used for control and simulation of complex systems such as fluid flows. In the present application to BEAM, POD brings the same benefits to automated diagnosis. BEAM is a method of real-time or offline automated diagnosis of a complex dynamic system. The gray-box approach makes it possible to utilize incomplete or approximate knowledge of the dynamics of the system that one seeks to diagnose. In the gray-box approach, a deterministic model of the system is used to filter a time series of system sensor data to remove the deterministic components of the time series from further examination. What is left after the filtering operation is a time series of residual quantities that represent the unknown (or at least unmodeled) aspects of the behavior of the system. Stochastic modeling techniques are then applied to the residual time series. The procedure for detecting abnormal behavior of the system then becomes one of looking for statistical differences between the residual time series and the predictions of the stochastic model.
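
The residual-filtering step of the gray-box approach can be illustrated in a few lines. This is a hedged sketch, not BEAM code; the model, signals, and threshold rule are invented for the example: a deterministic model prediction is subtracted from the sensor series, and a fault is flagged when the residual spread exceeds its nominal level.

```python
import math
import statistics

def residuals(sensor, model):
    """Remove the deterministic component predicted by `model` from the series."""
    return [y - model(t) for t, y in enumerate(sensor)]

def is_anomalous(resid, nominal_sigma, k=3.0):
    """Flag the window if residual spread exceeds k times its nominal level."""
    return statistics.pstdev(resid) > k * nominal_sigma

model = lambda t: math.sin(0.1 * t)  # assumed-known deterministic dynamics
healthy = [math.sin(0.1 * t) + 0.01 * ((-1) ** t) for t in range(200)]
faulty = [math.sin(0.1 * t) + 0.5 * ((-1) ** t) for t in range(200)]
```

In BEAM the residual series would instead feed a stochastic model; the simple spread test here stands in for that statistical comparison.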

  15. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO standard Business Process Model and Notation (BPMN) 2.X, a system-independent graphical process control notation accepted across disciplines is provided, allowing process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of business process management systems (BPMSs) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process models. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  16. Using TENS for pain control: the state of the evidence

    PubMed Central

    Vance, Carol GT; Dailey, Dana L; Rakel, Barbara A; Sluka, Kathleen A

    2014-01-01

    Transcutaneous electrical nerve stimulation (TENS) is a nonpharmacological intervention that activates a complex neuronal network to reduce pain by activating descending inhibitory systems in the central nervous system to reduce hyperalgesia. The evidence for TENS efficacy is conflicting and requires not only description but also critique. Population-specific systematic reviews and meta-analyses are emerging, indicating that both high-frequency (HF) and low-frequency (LF) TENS provide analgesia, specifically when applied at a strong, nonpainful intensity. The purpose of this article is to provide a critical review of the latest basic science and clinical evidence for TENS. Additional research is necessary to determine whether TENS has effects specific to mechanical stimuli and/or beyond reduction of pain, and whether it will improve activity levels, function, and quality of life. PMID:24953072

  17. MCG measurement in the environment of active magnetic shield.

    PubMed

    Yamazaki, K; Kato, K; Kobayashi, K; Igarashi, A; Sato, T; Haga, A; Kasai, N

    2004-11-30

    MCG (magnetocardiography) measurement by a SQUID gradiometer was attempted with only active magnetic shielding (active shielding). A three-axis canceling-coil active shielding system, in which three 16-10-16-turn coil sets were placed in orthogonal directions, produces a homogeneous magnetic field in a considerable volume surrounding the center. Fluxgate sensors were used as the reference sensors of the system. The system can reduce environmental magnetic noise at low frequencies of less than a few Hz, at 50 Hz, and at 150 Hz. Reducing such disturbances stabilizes biomagnetic measurement conditions for SQUIDs in the absence of magnetically shielded rooms (MSRs). After filtering and averaging the MCG data measured by a first-order SQUID gradiometer with only the active shielding during the daytime, the QRS complex and T wave were clearly resolved.

  18. Distributed intrusion detection system based on grid security model

    NASA Astrophysics Data System (ADS)

    Su, Jie; Liu, Yahui

    2008-03-01

    Grid computing has developed rapidly with the development of network technology, and it can solve the problem of large-scale complex computing by sharing large-scale computing resources. In a grid environment, we can realize a distributed and load-balanced intrusion detection system. This paper first discusses the security mechanism in grid computing and the function of PKI/CA in the grid security system, then gives the application of grid computing characteristics in a distributed intrusion detection system (IDS) based on an artificial immune system. Finally, it gives a distributed intrusion detection system based on the grid security system that can reduce processing delay and maintain detection rates.

  19. Entropy, complexity, and Markov diagrams for random walk cancer models.

    PubMed

    Newton, Paul K; Mason, Jeremy; Hurt, Brian; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Kuhn, Peter

    2014-12-19

    The notion of entropy is used to compare the complexity associated with 12 common cancers based on metastatic tumor distribution autopsy data. We characterize power-law distributions, entropy, and Kullback-Leibler divergence associated with each primary cancer as compared with data for all cancer types aggregated. We then correlate entropy values with other measures of complexity associated with Markov chain dynamical systems models of progression. The Markov transition matrix associated with each cancer is associated with a directed graph model where nodes are anatomical locations where a metastatic tumor could develop, and edge weightings are transition probabilities of progression from site to site. The steady-state distribution corresponds to the autopsy data distribution. Entropy correlates well with the overall complexity of the reduced directed graph structure for each cancer and with a measure of systemic interconnectedness of the graph, called graph conductance. The models suggest that grouping cancers according to their entropy values, with skin, breast, kidney, and lung cancers being prototypical high entropy cancers, stomach, uterine, pancreatic and ovarian being mid-level entropy cancers, and colorectal, cervical, bladder, and prostate cancers being prototypical low entropy cancers, provides a potentially useful framework for viewing metastatic cancer in terms of predictability, complexity, and metastatic potential.
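
The steady-state-plus-entropy computation described here is easy to reproduce on a toy chain. The following Python sketch uses a hypothetical 3-site transition matrix (not the paper's anatomical-site data): power iteration yields the stationary distribution, whose Shannon entropy serves as the complexity measure.

```python
import math

def steady_state(P, iters=500):
    """Stationary distribution of a row-stochastic matrix by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy(pi):
    """Shannon entropy of a probability distribution (natural log)."""
    return -sum(p * math.log(p) for p in pi if p > 0)

# Hypothetical 3-site transition probabilities (rows sum to 1)
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.3, 0.3, 0.4]]
pi = steady_state(P)
H = entropy(pi)
```

In the paper's framing, pi would be matched to the autopsy tumor-distribution data and H compared across primary cancer types.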

  20. Entropy, complexity, and Markov diagrams for random walk cancer models

    NASA Astrophysics Data System (ADS)

    Newton, Paul K.; Mason, Jeremy; Hurt, Brian; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Kuhn, Peter

    2014-12-01

    The notion of entropy is used to compare the complexity associated with 12 common cancers based on metastatic tumor distribution autopsy data. We characterize power-law distributions, entropy, and Kullback-Leibler divergence associated with each primary cancer as compared with data for all cancer types aggregated. We then correlate entropy values with other measures of complexity associated with Markov chain dynamical systems models of progression. The Markov transition matrix associated with each cancer is associated with a directed graph model where nodes are anatomical locations where a metastatic tumor could develop, and edge weightings are transition probabilities of progression from site to site. The steady-state distribution corresponds to the autopsy data distribution. Entropy correlates well with the overall complexity of the reduced directed graph structure for each cancer and with a measure of systemic interconnectedness of the graph, called graph conductance. The models suggest that grouping cancers according to their entropy values, with skin, breast, kidney, and lung cancers being prototypical high entropy cancers, stomach, uterine, pancreatic and ovarian being mid-level entropy cancers, and colorectal, cervical, bladder, and prostate cancers being prototypical low entropy cancers, provides a potentially useful framework for viewing metastatic cancer in terms of predictability, complexity, and metastatic potential.

  1. Copper Complex in Poly(vinyl chloride) as a Nitric Oxide-Generating Catalyst for the Control of Nitrifying Bacterial Biofilms.

    PubMed

    Wonoputri, Vita; Gunawan, Cindy; Liu, Sanly; Barraud, Nicolas; Yee, Lachlan H; Lim, May; Amal, Rose

    2015-10-14

    In this study, catalytic generation of nitric oxide by a copper(II) complex embedded within a poly(vinyl chloride) matrix in the presence of nitrite (source of nitric oxide) and ascorbic acid (reducing agent) was shown to effectively control the formation and dispersion of nitrifying bacteria biofilms. Amperometric measurements indicated increased and prolonged generation of nitric oxide with the addition of the copper complex when compared to that with nitrite and ascorbic acid alone. The effectiveness of the copper complex-nitrite-ascorbic acid system for biofilm control was quantified using protein analysis, which showed enhanced biofilm suppression when the copper complex was used in comparison to that with nitrite and ascorbic acid treatment alone. Confocal laser scanning microscopy (CLSM) and LIVE/DEAD staining revealed a reduction in cell surface coverage without a loss of viability with the copper complex and up to 5 mM of nitrite and ascorbic acid, suggesting that the nitric oxide generated from the system inhibits proliferation of the cells on surfaces. Induction of nitric oxide production by the copper complex system also triggered the dispersal of pre-established biofilms. However, the addition of a high concentration of nitrite and ascorbic acid to a pre-established biofilm induced bacterial membrane damage and strongly decreased the metabolic activity of planktonic and biofilm cells, as revealed by CLSM with LIVE/DEAD staining and intracellular adenosine triphosphate measurements, respectively. This study highlights the utility of the catalytic generation of nitric oxide for the long-term suppression and removal of nitrifying bacterial biofilms.

  2. Game Changing: NASA's Space Launch System and Science Mission Design

    NASA Technical Reports Server (NTRS)

    Creech, Stephen D.

    2013-01-01

    NASA's Marshall Space Flight Center (MSFC) is directing efforts to build the Space Launch System (SLS), a heavy-lift rocket that will carry the Orion Multi-Purpose Crew Vehicle (MPCV) and other important payloads far beyond Earth orbit (BEO). Its evolvable architecture will allow NASA to begin with Moon fly-bys and then go on to transport humans or robots to distant places such as asteroids and Mars. Designed to simplify spacecraft complexity, the SLS rocket will provide improved mass margins and radiation mitigation, and reduced mission durations. These capabilities offer attractive advantages for ambitious missions such as a Mars sample return, by reducing infrastructure requirements, cost, and schedule. For example, if an evolved expendable launch vehicle (EELV) were used for a proposed mission to investigate the Saturn system, a complicated trajectory would be required - with several gravity-assist planetary fly-bys - to achieve the necessary outbound velocity. The SLS rocket, using significantly higher C3 energies, can more quickly and effectively take the mission directly to its destination, reducing trip time and cost. As this paper will report, the SLS rocket will launch payloads of unprecedented mass and volume, such as "monolithic" telescopes and in-space infrastructure. Thanks to its ability to co-manifest large payloads, it also can accomplish complex missions in fewer launches. Future analyses will include reviews of alternate mission concepts and detailed evaluations of SLS figures of merit, helping the new rocket revolutionize science mission planning and design for years to come.

  3. Game changing: NASA's space launch system and science mission design

    NASA Astrophysics Data System (ADS)

    Creech, S. D.

    NASA's Marshall Space Flight Center (MSFC) is directing efforts to build the Space Launch System (SLS), a heavy-lift rocket that will carry the Orion Multi-Purpose Crew Vehicle (MPCV) and other important payloads far beyond Earth orbit (BEO). Its evolvable architecture will allow NASA to begin with Moon fly-bys and then go on to transport humans or robots to distant places such as asteroids and Mars. Designed to simplify spacecraft complexity, the SLS rocket will provide improved mass margins and radiation mitigation, and reduced mission durations. These capabilities offer attractive advantages for ambitious missions such as a Mars sample return, by reducing infrastructure requirements, cost, and schedule. For example, if an evolved expendable launch vehicle (EELV) were used for a proposed mission to investigate the Saturn system, a complicated trajectory would be required - with several gravity-assist planetary fly-bys - to achieve the necessary outbound velocity. The SLS rocket, using significantly higher characteristic energies (C3), can more quickly and effectively take the mission directly to its destination, reducing trip time and cost. As this paper will report, the SLS rocket will launch payloads of unprecedented mass and volume, such as "monolithic" telescopes and in-space infrastructure. Thanks to its ability to co-manifest large payloads, it also can accomplish complex missions in fewer launches. Future analyses will include reviews of alternate mission concepts and detailed evaluations of SLS figures of merit, helping the new rocket revolutionize science mission planning and design for years to come.

  4. Purification and characterization of a cellulolytic multienzyme complex produced by Neocallimastix patriciarum J11.

    PubMed

    Wang, Hui-Chang; Chen, Yo-Chia; Hseu, Ruey-Shyang

    2014-08-22

    Understanding the roles of the components of the multienzyme complex of the anaerobic cellulase system, acting on complex substrates, is crucial to the development of efficient cellulase systems for industrial applications such as converting lignocellulose to sugars for bioethanol production. In this study, we purified the multienzyme complex of Neocallimastix patriciarum J11 from a broth through cellulose affinity purification. The multienzyme complex is composed of at least 12 proteins, based on sodium dodecyl sulfate polyacrylamide gel electrophoresis. Eight of these constituents demonstrated β-glucanase activity on zymogram analysis. The multienzyme complex contained scaffoldins responsible for the assembly of the cellulolytic components. The levels and subunit ratio of the multienzyme complex from N. patriciarum J11 might have been affected by the carbon sources utilized, whereas the components of the complexes were consistent. The trypsin-digested peptides of six proteins were matched to the sequences of cellulases originating from rumen fungi, based on identification through liquid chromatography/mass spectrometry, revealing that at least three types of cellulase, including one endoglucanase and two exoglucanases, could be found in the multienzyme complex of N. patriciarum J11. The cellulolytic subunits could synergistically hydrolyze both the internal bonds and the reducing and nonreducing ends of cellulose. Based on our research, our findings are the first to depict the composition of the multienzyme complex produced by N. patriciarum J11, and this complex is composed of scaffoldin and three types of cellulase. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. A continuum theory for multicomponent chromatography modeling.

    PubMed

    Pfister, David; Morbidelli, Massimo; Nicoud, Roger-Marc

    2016-05-13

    A continuum theory is proposed for modeling multicomponent chromatographic systems under linear conditions. The model is based on the description of complex mixtures, possibly involving tens or hundreds of solutes, by a continuum. The present approach is shown to be very efficient when dealing with a large number of similar components that present close elution behaviors and whose individual analytical characterization is impossible. Moreover, approximating complex mixtures by continuous distributions of solutes reduces the required number of model parameters to the few that are specific to the characterization of the selected continuous distributions. Therefore, within the framework of the continuum theory, the simulation of large multicomponent systems is simplified and the computational effectiveness of the chromatographic model is thus dramatically improved. Copyright © 2016 Elsevier B.V. All rights reserved.
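
The parameter reduction claimed here can be illustrated with the simplest possible continuum. In this hedged sketch (a Gaussian distribution of retention times is an assumption for illustration, not the paper's model), an entire multicomponent mixture is described by two parameters, a mean and a spread, instead of one retention parameter per solute.

```python
import math

def elution_profile(t, mean_rt, sigma_rt, total_mass=1.0):
    """Detector signal at time t for a Gaussian continuum of solutes:
    two distribution parameters replace per-component retention data."""
    z = (t - mean_rt) / sigma_rt
    return total_mass * math.exp(-0.5 * z * z) / (sigma_rt * math.sqrt(2 * math.pi))

# Chromatogram of the whole mixture sampled at integer times 0..24
signal = [elution_profile(t, mean_rt=12.0, sigma_rt=2.0) for t in range(25)]
```

A mixture of hundreds of closely eluting solutes would otherwise require hundreds of individual retention parameters; the continuum collapses them into the distribution's parameters.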

  6. The Goldstone Energy Project

    NASA Technical Reports Server (NTRS)

    Bartos, K. P.

    1978-01-01

    The Goldstone Energy Project was established in 1974 to investigate ways in which the Goldstone Deep Space Complex in California could be made partly or completely energy self-sufficient, especially through the use of solar- and wind-derived energy resources. Ways in which energy could be conserved at the Complex were also studied. Findings included data on both wind and solar energy. Obstacles to demonstrating energy self-sufficiency are: (1) operation and maintenance costs of solar energy systems are estimated to be much higher than those of conventional energy systems, (2) initial capital costs of present-day technology solar collectors are high and are compounded by low collector efficiency, and (3) no significant market force exists to create the necessary industry to reduce costs through mass production and broad open-market competition.

  7. Information processing using a single dynamical node as complex system

    PubMed Central

    Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.

    2011-01-01

    Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
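
The single-node-with-delay idea can be caricatured in a few lines of Python. This is a minimal sketch under assumed parameters, not the paper's electronic implementation: one tanh nonlinearity is time-multiplexed over a delay line whose taps act as virtual network nodes, and a linear readout (omitted here) would be trained on the collected states.

```python
import math

def reservoir_states(inputs, n_virtual=20, eta=0.5, gamma=0.05):
    """Collect virtual-node states of a single nonlinear node with delayed
    feedback. eta scales the feedback, gamma the masked input drive."""
    delay = [0.0] * n_virtual  # delay line holding the virtual node states
    states = []
    for u in inputs:
        new = []
        for k in range(n_virtual):
            # one shared nonlinearity, driven by its own delayed state
            # plus the input weighted by a fixed binary mask
            mask = 1.0 if k % 2 == 0 else -1.0
            new.append(math.tanh(eta * delay[k] + gamma * mask * u))
        delay = new
        states.append(list(delay))
    return states

states = reservoir_states([math.sin(0.2 * t) for t in range(100)])
```

The mask breaks the symmetry between virtual nodes so they respond differently to the same input, which is what gives the single physical node the expressive power of a network.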

  8. Design of an intelligent flight instrumentation unit using embedded RTOS

    NASA Astrophysics Data System (ADS)

    Estrada-Marmolejo, R.; García-Torales, G.; Torres-Ortega, H. H.; Flores, J. L.

    2011-09-01

    Micro unmanned aerial vehicles (MUAVs) must calculate their spatial position to control flight dynamics, which is done by inertial measurement units (IMUs). MEMS inertial sensors have made it possible to reduce the size and power consumption of such units. Commonly, the flight instrumentation operates independently of the main processor. This work presents an instrumentation block design that reduces the size and power consumption of the complete MUAV system. This is done by coupling the inertial sensors to the main processor without any intermediate level of processing. Using a real-time operating system (RTOS) reduces the number of intermediate components, increasing MUAV reliability. One advantage is the possibility of controlling several different sensors with a single communication bus. This feature of the MEMS sensors makes a smaller and less complex MUAV design possible.

  9. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, databases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  10. Mars Sample Return Using Commercial Capabilities: Propulsive Entry, Descent and Landing

    NASA Technical Reports Server (NTRS)

    Lemke, Lawrence G.; Gonzales, Andrew A.; Huynh, Loc C.

    2014-01-01

    Mars Sample Return (MSR) is the highest priority science mission for the next decade, as recommended by the recent Decadal Survey of Planetary Science. The objective of the study was to determine whether emerging commercial capabilities can be integrated into such a mission. The premise of the study is that commercial capabilities can be more efficient than previously described systems, and that by using fewer systems and fewer or less extensive launches, overall mission cost can be reduced. This presentation describes an EDL technique using planned upgrades to the Dragon capsule to perform a supersonic retropropulsion entry - the Red Dragon concept. Landed payload capability meets mission requirements for an MSR architecture that reduces complexity.

  11. Robotic aortic surgery.

    PubMed

    Duran, Cassidy; Kashef, Elika; El-Sayed, Hosam F; Bismuth, Jean

    2011-01-01

    Surgical robotics was first utilized to facilitate neurosurgical biopsies in 1985, and it has since found application in orthopedics, urology, gynecology, and cardiothoracic, general, and vascular surgery. Surgical assistance systems provide intelligent, versatile tools that augment the physician's ability to treat patients by eliminating hand tremor and enabling dexterous operation inside the patient's body. Surgical robotics systems have enabled surgeons to treat otherwise untreatable conditions while also reducing morbidity and error rates, shortening operative times, reducing radiation exposure, and improving overall workflow. These capabilities have begun to be realized in two important realms of aortic vascular surgery, namely, flexible robotics for exclusion of complex aortic aneurysms using branched endografts, and robot-assisted laparoscopic aortic surgery for occlusive and aneurysmal disease.

  12. Variations in task constraints shape emergent performance outcomes and complexity levels in balancing.

    PubMed

    Caballero Sánchez, Carla; Barbado Murillo, David; Davids, Keith; Moreno Hernández, Francisco J

    2016-06-01

    This study investigated the extent to which specific interacting constraints of performance might increase or decrease the emergent complexity in a movement system, and whether this could affect the relationship between observed movement variability and the central nervous system's capacity to adapt to perturbations during balancing. Fifty-two healthy volunteers performed eight trials in which different performance constraints were manipulated: task difficulty (three levels) and visual biofeedback conditions (with and without the center of pressure (COP) displacement and a target displayed). Balance performance was assessed using COP-based measures: mean velocity magnitude (MVM) and bivariate variable error (BVE). To assess the complexity of the COP, fuzzy entropy (FE) and detrended fluctuation analysis (DFA) were computed. ANOVAs showed that MVM and BVE increased when task difficulty increased. During biofeedback conditions, individuals showed higher MVM but lower BVE at the easiest level of task difficulty. Overall, higher FE and lower DFA values were observed when biofeedback was available. On the other hand, FE decreased and DFA increased as the difficulty level increased in the presence of biofeedback. However, when biofeedback was not available, the opposite trend in FE and DFA values was observed. Regardless of changes to task constraints and the variable investigated, balance performance was positively related to complexity in every condition. The data revealed how the specificity of task constraints can result in an increase or decrease in the complexity emerging in a neurobiological system during balance performance.

  13. A review on prognostic techniques for non-stationary and non-linear rotating systems

    NASA Astrophysics Data System (ADS)

    Kan, Man Shan; Tan, Andy C. C.; Mathew, Joseph

    2015-10-01

    The field of prognostics has attracted significant interest from the research community in recent times. Prognostics enables the prediction of failures in machines resulting in benefits to plant operators such as shorter downtimes, higher operation reliability, reduced operations and maintenance cost, and more effective maintenance and logistics planning. Prognostic systems have been successfully deployed for the monitoring of relatively simple rotating machines. However, machines and associated systems today are increasingly complex. As such, there is an urgent need to develop prognostic techniques for such complex systems operating in the real world. This review paper focuses on prognostic techniques that can be applied to rotating machinery operating under non-linear and non-stationary conditions. The general concept of these techniques, the pros and cons of applying these methods, as well as their applications in the research field are discussed. Finally, the opportunities and challenges in implementing prognostic systems and developing effective techniques for monitoring machines operating under non-stationary and non-linear conditions are also discussed.

  14. Model-order reduction of lumped parameter systems via fractional calculus

    NASA Astrophysics Data System (ADS)

    Hollkamp, John P.; Sen, Mihir; Semperlotti, Fabio

    2018-04-01

    This study investigates the use of fractional order differential models to simulate the dynamic response of non-homogeneous discrete systems and to achieve efficient and accurate model order reduction. The traditional integer order approach to the simulation of non-homogeneous systems dictates the use of numerical solutions and often imposes stringent compromises between accuracy and computational performance. Fractional calculus provides an alternative approach where complex dynamical systems can be modeled with compact fractional equations that not only can still guarantee analytical solutions, but can also enable high levels of order reduction without compromising on accuracy. Different approaches are explored in order to transform the integer order model into a reduced order fractional model able to match the dynamic response of the initial system. Analytical and numerical results show that, under certain conditions, an exact match is possible and the resulting fractional differential models have both a complex and frequency-dependent order of the differential operator. The implications of this type of approach for both model order reduction and model synthesis are discussed.
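
    Fractional-order operators like those used in the reduced models can be evaluated numerically. Below is a generic Grünwald-Letnikov discretization sketched in Python; it is a textbook scheme, not the authors' method, and the step size h is an illustrative choice:

```python
import math

def gl_fractional_derivative(f, t, alpha, h=1e-3):
    """Grunwald-Letnikov estimate of the order-alpha derivative of f at t.

    D^alpha f(t) ~ h^(-alpha) * sum_k w_k f(t - k h), with
    w_0 = 1 and the recurrence w_k = w_{k-1} * (k - 1 - alpha) / k.
    """
    n = int(t / h)
    w, acc = 1.0, f(t)
    for k in range(1, n + 1):
        w *= (k - 1 - alpha) / k      # next binomial weight
        acc += w * f(t - k * h)
    return acc / h ** alpha
```

    For alpha = 1 the weights collapse to a backward difference, and for f(t) = t with alpha = 0.5 the estimate converges to the known half-derivative 2*sqrt(t/pi).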

  15. Deep FIFO Surge Buffer

    NASA Technical Reports Server (NTRS)

    Temple, Gerald; Siegel, Marc; Amitai, Zwie

    1991-01-01

    First-in/first-out (FIFO) buffer temporarily stores short surges of data generated by data-acquisition system at excessively high rate and releases data at lower rate suitable for processing by computer. Size and complexity reduced, while capacity enhanced, by use of newly developed, sophisticated integrated circuits and by "byte-folding" scheme that doubles effective depth and data rate.
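
    The byte-folding idea can be modeled as packing two incoming bytes into one wider FIFO word, so a given word depth stores twice as many bytes. This is a hypothetical Python sketch for illustration only; the hardware described is dedicated logic, not software:

```python
from collections import deque

class FoldedFifo:
    """Toy byte-folding surge buffer: two 8-bit bytes fold into one
    16-bit word, so a FIFO of W words holds 2*W bytes."""

    def __init__(self, depth_words):
        self.q = deque(maxlen=depth_words)
        self._pending = None              # first byte of the current pair

    def push_byte(self, b):
        if self._pending is None:
            self._pending = b
        else:
            if len(self.q) == self.q.maxlen:
                raise OverflowError("surge buffer full")
            self.q.append((self._pending << 8) | b)  # fold two bytes per word
            self._pending = None

    def pop_bytes(self):
        """Unfold one word back into its two bytes, in arrival order."""
        w = self.q.popleft()
        return (w >> 8) & 0xFF, w & 0xFF
```
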

  16. Direct Spectroscopy in Hollow Optical Fibers with Fiber-Based Optical Frequency Combs

    DTIC Science & Technology

    2015-07-09

    scheme is that the generation of carrier-envelope offset frequency f0 can be avoided, which reduces the system complexity. However, a high performance RF...Peterson, "Saturated absorption in acetylene and hydrogen cyanide in hollow-core photonic bandgap fibers," Opt. Express 13, 10475-10482 (2005). 56. C

  17. Formulation and closure of compressible turbulence equations in the light of kinetic theory

    NASA Technical Reports Server (NTRS)

    Tsuge, S.; Sagara, K.

    1976-01-01

    Fluid-dynamic moment equations, based on a kinetic hierarchy system, are derived governing the interaction between turbulent and thermal fluctuations. The kinetic theory is shown to reduce the inherent complexity of the conventional formalism of compressible turbulence theory and to minimize arbitrariness in formulating the closure condition.

  18. The Use of Percolating Filters in Teaching Ecology.

    ERIC Educational Resources Information Center

    Gray, N. F.

    1982-01-01

    Using percolating filters (components of sewage treatment process) reduces problems of organization, avoids damage to habitats, and provides a local study site for field work or rapid collection of biological material throughout the year. Component organisms are easily identified and the habitat can be studied as a simple or complex system.

  19. Solar Thermal Propulsion Concept

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Harnessing the Sun's energy through Solar Thermal Propulsion will propel vehicles through space by significantly reducing weight, complexity, and cost while boosting performance over current conventional upper stages. Another solar powered system, solar electric propulsion, demonstrates that ion propulsion is suitable for long duration missions. Pictured is an artist's concept of space flight using solar thermal propulsion.

  20. The Exploration Water Recovery System

    NASA Technical Reports Server (NTRS)

    ORourke, Mary Jane E.; Carter, Layne; Holder, Donald W.; Tomes, Kristin M.

    2006-01-01

    The Exploration Water Recovery System is designed towards fulfillment of NASA's Vision for Space Exploration, which will require elevation of existing technologies to higher levels of optimization. This new system, designed for application to the Exploration infrastructure, presents a novel combination of proven air and water purification technologies. The integration of unit operations is modified from that of the current state-of-the-art water recovery system so as to optimize treatment of the various waste water streams, contaminant loads, and flow rates. Optimization is achieved primarily through the removal of volatile organic contaminants from the vapor phase prior to their absorption into the liquid phase. In the current state-of-the-art system, the water vapor in the cabin atmosphere is condensed, and the volatile organic contaminants present in that atmosphere are absorbed into the aqueous phase. Removal of contaminants then occurs via catalytic oxidation in the liquid phase. Oxidation kinetics, however, dictate that removal of volatile organic contaminants from the vapor phase can inherently be more efficient than their removal from the aqueous phase. Taking advantage of this efficiency reduces the complexity of the water recovery system. This reduction in system complexity is accompanied by reductions in the weight, volume, power, and resupply requirements of the system. Vapor compression distillation technology is used to treat the urine, condensate, and hygiene waste streams. This contributes to the reduction in resupply, as incorporation of vapor compression distillation technology at this point in the process reduces reliance on the expendable ion exchange and adsorption media used in the current state-of-the-art water recovery system. Other proven technologies that are incorporated into the Exploration Water Recovery System include the Trace Contaminant Control System and the Volatile Removal Assembly.

  1. Storm water runoff measurements of copper from a naturally patinated roof and from a parking space. Aspects on environmental fate and chemical speciation.

    PubMed

    Odnevall Wallinder, I; Hedberg, Y; Dromberg, P

    2009-12-01

    Release of copper from a naturally aged copper roof on a shopping centre building in a suburban site of Stockholm has been measured during different rain events after its interaction with the internal drainage system and storm drains made of cast iron and concrete. Concentrations of copper removed by means of urban storm water from a nearby parking space have been determined for comparison. Predictions and measurements of the chemical speciation of released copper are discussed compared to the total concentration, and to threshold values for freshwater and drinking water. The results clearly illustrate that the major part of the released copper from the roof is already retained during transport through the internal drainage system of the building, a pathway that also changes the chemical speciation of released copper and its bioavailable fraction. Most copper not retained by cast iron and concrete surfaces was strongly complexed to organic matter. The median concentration of free cupric ions and weak copper complexes was less than, or within the range of, reported no-effect concentrations (NOECs) of copper in surface waters. The parking space contributed significantly higher and time-dependent concentrations of total copper compared to measured concentrations of copper from the roof after the interaction with the drainage system. Most copper in the surface runoff water was strongly complexed with organic matter, hence reducing the bioavailable fraction significantly to concentrations within the NOEC range. Dilution with other sources of urban storm water will reduce the released concentration of copper even further. The results illustrate that the internal drainage system and the storm drains made of cast iron and concrete already act as efficient sinks for released copper, which means that any installation of additional infiltration devices is redundant.

  2. Ultra-smooth finishing of aspheric surfaces using CAST technology

    NASA Astrophysics Data System (ADS)

    Kong, John; Young, Kevin

    2014-06-01

    Growing applications for astronomical ground-based adaptive systems and airborne telescope systems demand complex optical surface designs combined with ultra-smooth finishing. The use of more sophisticated and accurate optics, especially aspheric ones, allows for shorter optical trains with smaller sizes and a reduced number of components. This in turn reduces fabrication and alignment time and costs. These aspheric components include the following: steep surfaces with large aspheric departures; more complex surface feature designs like stand-alone off-axis-parabola (OAP) and free form optics that combine surface complexity with a requirement for ultra-high smoothness; as well as special optic materials such as lightweight silicon carbide (SiC) for airborne systems. Various fabrication technologies for finishing ultra-smooth aspheric surfaces are progressing to meet these growing and demanding challenges, especially Magnetorheological Finishing (MRF) and ion-milling. These methods have demonstrated some good success as well as a certain level of limitations. Amongst them, computer-controlled asphere surface-finishing technology (CAST), developed by Precision Asphere Inc. (PAI), plays an important role in a cost effective manufacturing environment and has successfully delivered numerous products for the applications mentioned above. One of the most recent successes is the Gemini Planet Imager (GPI), the world's most powerful planet-hunting instrument, with critical aspheric components (seven OAPs and free form optics) made using CAST technology. GPI showed off its first images in a press release on January 7, 2014. This paper reviews features of today's technologies in handling the ultra-smooth aspheric optics, especially the capabilities of CAST on these challenging products. As examples, three groups of aspheres deployed in astronomical optics systems, both polished and finished using CAST, will be discussed in detail.

  3. Hyperspherical Sparse Approximation Techniques for High-Dimensional Discontinuity Detection

    DOE PAGES

    Zhang, Guannan; Webster, Clayton G.; Gunzburger, Max; ...

    2016-08-04

    This work proposes a hyperspherical sparse approximation framework for detecting jump discontinuities in functions in high-dimensional spaces. The need for a novel approach results from the theoretical and computational inefficiencies of well-known approaches, such as adaptive sparse grids, for discontinuity detection. Our approach constructs the hyperspherical coordinate representation of the discontinuity surface of a function. Then sparse approximations of the transformed function are built in the hyperspherical coordinate system, with values at each point estimated by solving a one-dimensional discontinuity detection problem. Due to the smoothness of the hypersurface, the new technique can identify jump discontinuities with significantly reduced computational cost, compared to existing methods. Several approaches are used to approximate the transformed discontinuity surface in the hyperspherical system, including adaptive sparse grid and radial basis function interpolation, discrete least squares projection, and compressed sensing approximation. Moreover, hierarchical acceleration techniques are also incorporated to further reduce the overall complexity. In conclusion, rigorous complexity analyses of the new methods are provided, as are several numerical examples that illustrate the effectiveness of our approach.
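
    The hyperspherical representation underlying this framework is the standard coordinate transform. A minimal round-trip sketch in Python, using the generic textbook formulas; the paper's discontinuity-surface construction and sparse approximation layers are not reproduced here:

```python
import numpy as np

def to_hyperspherical(x):
    """Cartesian -> (r, theta_1 .. theta_{d-1}) in d dimensions."""
    x = np.asarray(x, dtype=float)
    r = np.linalg.norm(x)
    thetas = []
    for i in range(len(x) - 1):
        tail = np.linalg.norm(x[i:])
        # each angle in [0, pi]; clip guards against rounding at the poles
        thetas.append(np.arccos(np.clip(x[i] / tail, -1.0, 1.0)) if tail > 0 else 0.0)
    if x[-1] < 0:                        # last angle spans [0, 2*pi)
        thetas[-1] = 2 * np.pi - thetas[-1]
    return r, thetas

def to_cartesian(r, thetas):
    """Inverse transform: x_i = r * cos(theta_i) * prod of earlier sines."""
    x, s = [], 1.0
    for th in thetas:
        x.append(r * s * np.cos(th))
        s *= np.sin(th)
    x.append(r * s)
    return np.array(x)
```
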

  4. Low-Cost Synthesis of Smart Biocompatible Graphene Oxide Reduced Species by Means of GFP.

    PubMed

    Masullo, Tiziana; Armata, Nerina; Pendolino, Flavio; Colombo, Paolo; Lo Celso, Fabrizio; Mazzola, Salvatore; Cuttitta, Angela

    2016-02-01

    The aim of this work is the engineering of biocompatible complex systems composed of an inorganic part and a bio part. Graphene oxide (GO) and/or graphite oxide (GtO) were considered as potential substrates for linkage of a protein such as the Anemonia sulcata recombinant green fluorescent protein (rAsGFP). The complex system is obtained through a reduction process between GO/GtO and rAsGFP, achieving an environmentally friendly biosynthesis. Spectroscopic measurements support the formation of reduced species. In particular, photoluminescence shows a change in the activity of the protein when a bond is formed, highlighted by a loss of the maximum emission signal of rAsGFP and a redshift of the maximum absorption peak of the GO/GtO species. Moreover, the hemolysis assay reveals a lower value in the presence of less oxidized graphene species, providing evidence for a biocompatible material. This singular aspect can be approached as a promising method for circulating pharmaceutical preparations via intravenous administration in the field of drug delivery.

  5. Adaptive System Modeling for Spacecraft Simulation

    NASA Technical Reports Server (NTRS)

    Thomas, Justin

    2011-01-01

    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how someone can create accurate system models from sensor (telemetry) data. The purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models. The software also provides the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: it creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior; automatically updates/calibrates system models using the latest streaming sensor data; creates device-specific models that capture the exact behavior of devices of the same type; adapts to evolving systems; and can reduce computational complexity (faster simulations).

  6. Decreased complexity of glucose dynamics in diabetes: evidence from multiscale entropy analysis of continuous glucose monitoring system data.

    PubMed

    Chen, Jin-Long; Chen, Pin-Fan; Wang, Hung-Ming

    2014-07-15

    Parameters of glucose dynamics recorded by the continuous glucose monitoring system (CGMS) could help in the control of glycemic fluctuations, which is important in diabetes management. Multiscale entropy (MSE) analysis has recently been developed to measure the complexity of physical and physiological time sequences. A reduced MSE complexity index indicates the increased repetition patterns of the time sequence, and, thus, a decreased complexity in this system. No study has investigated the MSE analysis of glucose dynamics in diabetes. This study was designed to compare the complexity of glucose dynamics between the diabetic patients (n = 17) and the control subjects (n = 13), who were matched for sex, age, and body mass index via MSE analysis using the CGMS data. Compared with the control subjects, the diabetic patients revealed a significant increase (P < 0.001) in the mean (diabetic patients 166.0 ± 10.4 vs. control subjects 93.3 ± 1.5 mg/dl), the standard deviation (51.7 ± 4.3 vs. 11.1 ± 0.5 mg/dl), and the mean amplitude of glycemic excursions (127.0 ± 9.2 vs. 27.7 ± 1.3 mg/dl) of the glucose levels; and a significant decrease (P < 0.001) in the MSE complexity index (5.09 ± 0.23 vs. 7.38 ± 0.28). In conclusion, the complexity of glucose dynamics is decreased in diabetes. This finding implies the reactivity of glucoregulation is impaired in the diabetic patients. Such impairment presenting as an increased regularity of glycemic fluctuating pattern could be detected by MSE analysis. Thus, the MSE complexity index could potentially be used as a biomarker in the monitoring of diabetes.
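
    Multiscale entropy as used in this study combines coarse-graining with sample entropy. A compact Python sketch of a simplified variant: the tolerance r is recomputed from each coarse-grained series, and the parameter values m = 2, r = 0.15 are illustrative defaults rather than the authors' settings:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.15):
    """SampEn(m, r) of a 1-D series, with r given as a fraction of the SD."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def match_pairs(mm):
        # embed the series in mm dimensions and count matching template pairs
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)
        # Chebyshev distance; subtract self-matches, halve for unordered pairs
        return (np.sum(d <= tol) - len(emb)) / 2

    b, a = match_pairs(m), match_pairs(m + 1)
    return -np.log(a / b)

def multiscale_entropy(x, max_scale=5, m=2, r=0.15):
    """MSE curve: SampEn of block-averaged (coarse-grained) series."""
    x = np.asarray(x, dtype=float)
    out = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)
        out.append(sample_entropy(coarse, m, r))
    return out
```

    A reduced MSE complexity index, as reported for the diabetic group, corresponds to lower sample entropy summed across scales.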

  7. Dynamic building risk assessment theoretic model for rainstorm-flood utilization ABM and ABS

    NASA Astrophysics Data System (ADS)

    Lai, Wenze; Li, Wenbo; Wang, Hailei; Huang, Yingliang; Wu, Xuelian; Sun, Bingyun

    2015-12-01

    Floods are among the natural disasters causing the worst losses in the world. Flood disaster risk needs to be assessed so that flood losses can be reduced. Practical disaster management work needs dynamic building risk results. The rainstorm flood disaster system is a typical complex system. From the view of complex system theory, flood disaster risk is the interaction result of hazard-affected objects, rainstorm flood hazard factors, and hazard environments. Agent-based modeling (ABM) is an important tool for complex system modeling. A rainstorm-flood building risk dynamic assessment method (RFBRDAM) was proposed using ABM in this paper. The interior structures and procedures of the different agents in the proposed method were designed. On the NetLogo platform, the proposed method was implemented to assess building risk changes during the rainstorm flood disaster in the Huaihe River Basin using agent-based simulation (ABS). The results indicated that the proposed method can dynamically assess building risk over the whole process of a rainstorm flood disaster. The results of this paper can provide a new approach for dynamic flood disaster building risk assessment and flood disaster management.

  8. Phosphate Detection through a Cost-Effective Carbon Black Nanoparticle-Modified Screen-Printed Electrode Embedded in a Continuous Flow System.

    PubMed

    Talarico, Daria; Cinti, Stefano; Arduini, Fabiana; Amine, Aziz; Moscone, Danila; Palleschi, Giuseppe

    2015-07-07

    An automatable flow system for the continuous and long-term monitoring of the phosphate level has been developed using an amperometric detection method based on the use of a miniaturized sensor. This method is based on the monitoring of an electroactive complex obtained by the reaction between phosphate and molybdate that is consequently reduced at the electrode surface. The use of a screen-printed electrode modified with carbon black nanoparticles (CBNPs) leads to the quantification of the complex at low potential, because CBNPs are capable of electrocatalytically enhancing the phosphomolybdate complex reduction at +125 mV versus Ag/AgCl without fouling problems. The developed system also incorporates reagents and waste storage and is connected to a portable potentiostat for rapid detection and quantification of phosphate. Main analytical parameters, such as working potential, reagent concentration, type of cell, and flow rate, were evaluated and optimized. This system was characterized by a low detection limit (6 μM). Interference studies were carried out. Good recovery percentages, between 89 and 131.5%, were achieved in different water sources, highlighting its suitability for field measurements.

  9. Nonsomatotopic organization of the higher motor centers in octopus.

    PubMed

    Zullo, Letizia; Sumbre, German; Agnisola, Claudio; Flash, Tamar; Hochner, Binyamin

    2009-10-13

    Hyperredundant limbs with a virtually unlimited number of degrees of freedom (DOFs) pose a challenge for both biological and computational systems of motor control. In the flexible arms of the octopus, simplification strategies have evolved to reduce the number of controlled DOFs. Motor control in the octopus nervous system is hierarchically organized. A relatively small central brain integrates a huge amount of visual and tactile information from the large optic lobes and the peripheral nervous system of the arms and issues commands to lower motor centers controlling the elaborated neuromuscular system of the arms. This unique organization raises new questions on the organization of the octopus brain and whether and how it represents the rich movement repertoire. We developed a method of brain microstimulation in freely behaving animals and stimulated the higher motor centers, the basal lobes, thus inducing discrete and complex sets of movements. As stimulation strength increased, complex movements were recruited from basic components shared by different types of movement. We found no stimulation site where movements of a single arm or body part could be elicited. Discrete and complex components have no central topographical organization but are distributed over wide regions.

  10. Single surgeon's experience with laparoscopic versus robotic partial nephrectomy: perioperative outcomes/complications and influence of tumor characteristics on choice of therapy.

    PubMed

    Lee, Nora G; Zampini, Anna; Tuerk, Ingolf

    2012-10-01

    Laparoscopic (LPN) and robotic partial nephrectomy (RPN) may offer similar advantages for nephron-sparing surgery (NSS). We evaluated the perioperative outcomes and complications of LPN versus RPN and sought to evaluate if one technique may have more favorable outcomes over another based on tumor characteristics. All patients who underwent LPN and RPN by a single surgeon were retrospectively reviewed. The surgeon almost exclusively performed LPN from February 2009 to January 2011 and RPN from January 2011 to January 2012. Patient demographics, tumor characteristics, perioperative outcomes, short term renal functional data, and complications were reviewed. Operative time (OT), warm ischemia time (WIT), and estimated blood loss (EBL) were evaluated for each technique when tumor characteristics were divided by size, location, distance to collecting system, and overall tumor complexity based on nephrometry scoring. Of 39 laparoscopic cases and 30 robotic cases, there were no significant differences in perioperative outcomes, short term renal functional data, or complications between the two groups except for WIT which was shorter in the LPN group (p = 0.006). For medium complexity tumors, OT was less for LPN compared to RPN (p = 0.04); for high complexity tumors, EBL was reduced for RPN compared to LPN cases (p = 0.003). When tumor characteristics were individualized, LPN may be superior to RPN for WIT for small, anterior and exophytic tumors, and tumors located > 5 mm from the collecting system. LPN and RPN appear more equivocal for WIT in posteriorly located tumors. Reduced EBL may be a benefit with RPN for larger tumors. Although WIT was less in patients undergoing LPN compared to RPN, perioperative outcomes and complications remain similar. RPN may be beneficial for approaching more difficult, posterior tumors, whereas LPN may be a better technique for WIT for simple, accessible renal tumors. Reduced EBL may be a benefit for RPN for highly complex tumors.

  11. Absence of Complex I Implicates Rearrangement of the Respiratory Chain in European Mistletoe.

    PubMed

    Senkler, Jennifer; Rugen, Nils; Eubel, Holger; Hegermann, Jan; Braun, Hans-Peter

    2018-05-21

    The mitochondrial oxidative phosphorylation (OXPHOS) system, which is based on the presence of five protein complexes, is in the very center of cellular ATP production. Complexes I to IV are components of the respiratory electron transport chain that drives proton translocation across the inner mitochondrial membrane. The resulting proton gradient is used by complex V (the ATP synthase complex) for the phosphorylation of ADP. Occurrence of complexes I to V is highly conserved in eukaryotes, with exceptions being restricted to unicellular parasites that take up energy-rich compounds from their hosts. Here we present biochemical evidence that the European mistletoe (Viscum album), an obligate semi-parasite living on branches of trees, has a highly unusual OXPHOS system. V. album mitochondria completely lack complex I and have greatly reduced amounts of complexes II and V. At the same time, the complexes III and IV form remarkably stable respiratory supercomplexes. Furthermore, complexome profiling revealed the presence of 150 kDa complexes that include type II NAD(P)H dehydrogenases and an alternative oxidase. Although the absence of complex I genes in mitochondrial genomes of mistletoe species has recently been reported, this is the first biochemical proof that these genes have not been transferred to the nuclear genome and that this respiratory complex indeed is not assembled. As a consequence, the whole respiratory chain is remodeled. Our results demonstrate that, in the context of parasitism, multicellular life can cope with lack of one of the OXPHOS complexes and give new insights into the life strategy of mistletoe species. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. 
Conclusion We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation when the trait or response being studied is located at higher levels of organization, is in a different module, or is influenced by other modules. However, when the examination of the conserved process occurs at the same level of organization or in the same module, and hence is subject to study solely by reductionism, then extrapolation is possible. PMID:22963674

  13. Reduced-Order Structure-Preserving Model for Parallel-Connected Three-Phase Grid-Tied Inverters: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Brian B; Purba, Victor; Jafarpour, Saber

    Given that next-generation infrastructures will contain large numbers of grid-connected inverters and these interfaces will be satisfying a growing fraction of system load, it is imperative to analyze the impacts of power electronics on such systems. However, since each inverter model has a relatively large number of dynamic states, it would be impractical to execute complex system models where the full dynamics of each inverter are retained. To address this challenge, we derive a reduced-order structure-preserving model for parallel-connected grid-tied three-phase inverters. Here, each inverter in the system is assumed to have a full-bridge topology, LCL filter at the point of common coupling, and the control architecture for each inverter includes a current controller, a power controller, and a phase-locked loop for grid synchronization. We outline a structure-preserving reduced-order inverter model for the setting where the parallel inverters are each designed such that the filter components and controller gains scale linearly with the power rating. By structure preserving, we mean that the reduced-order three-phase inverter model is also composed of an LCL filter, a power controller, current controller, and PLL. That is, we show that the system of parallel inverters can be modeled exactly as one aggregated inverter unit and this equivalent model has the same number of dynamical states as an individual inverter in the paralleled system. Numerical simulations validate the reduced-order models.
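
    The linear power-rating scaling stated in the abstract implies a simple parameter aggregation: for n identical parallel branches, inductances divide by n and capacitances multiply by n, so the aggregate reproduces the parallel admittance. A toy check of that scaling in Python; `branch_admittance` is a simplified hypothetical stand-in, not the paper's full LCL/controller model:

```python
def aggregate_filter(n, L1, C, L2):
    """Aggregate filter parameters for n identical parallel inverters whose
    components scale linearly with power rating."""
    return L1 / n, C * n, L2 / n

def branch_admittance(w, L, C):
    """Toy admittance of one inductive path plus one shunt capacitor at
    angular frequency w (illustrative check of the scaling only)."""
    return 1.0 / (1j * w * L) + 1j * w * C
```

    The check below confirms that the aggregate single unit presents exactly n times the admittance of one branch, i.e. the same terminal behavior as the n paralleled units.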

  14. A combining rule calculation of the ground-state van der Waals potentials of the magnesium rare-gas complexes

    NASA Astrophysics Data System (ADS)

    Saidi, Samah; Alharzali, Nissrin; Berriche, Hamid

    2017-04-01

    The potential energy curves and spectroscopic constants of the ground state of the Mg-Rg (Rg = He, Ne, Ar, Kr, and Xe) van der Waals complexes are generated by the Tang-Toennies potential model and a set of derived combining rules. The parameters of the model are calculated from the potentials of the homonuclear magnesium and rare-gas dimers. The predicted spectroscopic constants are comparable to other available theoretical and experimental results, except in the case of Mg-He, for which there are large differences between the various determinations. Moreover, in order to reveal relative differences between species more obviously, we calculated the reduced potentials of these five systems. The curves are clumped closely together, but at intermediate range the Mg-He reduced potential is clearly very different from the others.
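
    The Tang-Toennies model referred to here has a standard closed form, V(R) = A exp(-bR) - sum_n f_{2n}(bR) C_{2n} / R^{2n}, with damping f_{2n}(x) = 1 - exp(-x) sum_{k=0..2n} x^k / k!. A minimal Python sketch of that generic formula with toy parameters; the paper's combining rules for A, b, and the C_{2n} are not reproduced:

```python
import math

def tt_damping(n2, x):
    """Tang-Toennies damping f_{2n}(x) = 1 - e^{-x} * sum_{k=0}^{2n} x^k / k!."""
    s = sum(x ** k / math.factorial(k) for k in range(n2 + 1))
    return 1.0 - math.exp(-x) * s

def tt_potential(R, A, b, C):
    """V(R) = A e^{-bR} - sum over dispersion terms; C maps 2n -> C_{2n}."""
    rep = A * math.exp(-b * R)                         # Born-Mayer repulsion
    disp = sum(tt_damping(n2, b * R) * Cn / R ** n2    # damped dispersion
               for n2, Cn in C.items())
    return rep - disp
```

    The damping factors go smoothly from 0 at short range (switching the dispersion off inside the repulsive wall) to 1 at long range, which is what produces the van der Waals well.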

  15. GSK3β and aging liver

    PubMed Central

    Jin, Jingling; Wang, Guo-Li; Timchenko, Lubov; Timchenko, Nikolai A

    2009-01-01

    The loss of regenerative capacity of tissues is one of the major characteristics of aging. The liver represents a powerful system for investigations of the mechanisms by which aging reduces the regenerative capacity of tissues. Studies within the last five years revealed a critical role of epigenetic silencing in the inhibition of liver proliferation in old mice. These studies have shown that a number of cell cycle proteins are silenced in livers of old mice by the C/EBPα-HDAC1-Brm complex and that the old liver fails to reduce the complex and activate these genes in response to a proliferative stimulus such as partial hepatectomy. The complex modifies histone H3 on the promoters of c-myc and FoxM1B in a manner that prevents expression of these genes. Despite this progress, little is known about the mechanisms by which aging causes this epigenetic silencing. We have recently discovered signal transduction pathways that operate upstream of the C/EBPα-HDAC1-Brm complex. These pathways involve communications of growth hormone, GSK3β and cyclin D3. In addition to the liver, the GH-GSK3β-cyclin D3 pathway is also changed with age in lung, brain and adipose tissues. We suggest that other age-associated alterations in these tissues might be mediated by the reduced levels of GSK3β and by elevation of cyclin D3. In this review, we summarize these new data and discuss the role of such alterations in the development of the aging phenotype in the liver and in other tissues. PMID:20157540

  16. Image acquisition system using on sensor compressed sampling technique

    NASA Astrophysics Data System (ADS)

    Gupta, Pravir Singh; Choi, Gwan Seong

    2018-01-01

    Advances in CMOS technology have made high-resolution image sensors possible. These image sensors pose significant challenges in terms of the amount of raw data generated, energy efficiency, and frame rate. This paper presents a design methodology for an imaging system and a simplified image sensor pixel design to be used in the system so that the compressed sensing (CS) technique can be implemented easily at the sensor level. This results in significant energy savings as it not only cuts the raw data rate but also reduces transistor count per pixel; decreases pixel size; increases fill factor; simplifies analog-to-digital converter, JPEG encoder, and JPEG decoder design; decreases wiring; and reduces the decoder size by half. Thus, CS has the potential to increase the resolution of image sensors for a given technology and die size while significantly decreasing the power consumption and design complexity. We show that it has potential to reduce power consumption by about 23% to 65%.

  17. Impact of the single point of access referral system to reduce waiting times and improve clinical outcomes in an assistive technology service.

    PubMed

    Hosking, Jonathan; Gibson, Colin

    2016-07-01

    The introduction of a single point referral system that prioritises clients depending on case complexity and overcomes the need for re-admittance to a waiting list via a review system has been shown to significantly reduce maximum waiting times for a Posture and Mobility (Special Seating) Service, from 102.0 ± 24.33 weeks to 19.2 ± 8.57 weeks (p = 0.015). Using this service model, linear regression revealed a statistically significant improvement in the performance outcome of prescribed seating solutions with shorter Episode of Care completion times (p = 0.023). In addition, the number of Episodes of Care completed per annum was significantly related to the Episode of Care completion time (p = 0.019). In conclusion, it may be advantageous to apply this service model to other assistive technology services in order to reduce waiting times and improve clinical outcomes.

  18. System architectures for telerobotic research

    NASA Technical Reports Server (NTRS)

    Harrison, F. Wallace

    1989-01-01

    Several activities related to the definition and creation of telerobotic systems are described. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.

  19. Dynamic resource allocation in a hierarchical multiprocessor system: A preliminary study

    NASA Technical Reports Server (NTRS)

    Ngai, Tin-Fook

    1986-01-01

    An integrated system approach to dynamic resource allocation is proposed. Some of the problems in dynamic resource allocation and the relationship of these problems to system structures are examined. A general dynamic resource allocation scheme is presented. A hierarchical system architecture which dynamically maps between processor structure and programs at multiple levels of instantiation is described. Simulation experiments were conducted to study dynamic resource allocation on the proposed system. Preliminary evaluation based on simple dynamic resource allocation algorithms indicates that with the proposed system approach, the complexity of dynamic resource management could be significantly reduced while achieving reasonably effective dynamic resource allocation.

  20. Implications of complex adaptive systems theory for interpreting research about health care organizations.

    PubMed

    Jordon, Michelle; Lanham, Holly Jordan; Anderson, Ruth A; McDaniel, Reuben R

    2010-02-01

    Data about health care organizations (HCOs) are not useful until they are interpreted. Such interpretations are influenced by the theoretical lenses used by the researcher. Our purpose was to suggest the usefulness of theories of complex adaptive systems (CASs) in guiding research interpretation. Specifically, we addressed two questions: (1) What are the implications for interpreting research observations in HCOs of the fact that we are observing relationships among diverse agents? (2) What are the implications for interpreting research observations in HCOs of the fact that we are observing relationships among agents that learn? We defined diversity and learning and the implications of the non-linear relationships among agents from a CAS perspective. We then identified some common analytical practices that were problematic and may lead to conceptual and methodological errors. Then we described strategies for interpreting the results of research observations. We suggest that the task of interpreting research observations of HCOs could be improved if researchers take into account that the systems they study are CASs with non-linear relationships among diverse, learning agents. Our analysis points out how interpretation of research results might be shaped by the fact that HCOs are CASs. We described how learning is, in fact, the result of interactions among diverse agents and that learning can, by itself, reduce or increase agent diversity. We encouraged researchers to be persistent in their attempts to reason about complex systems and learn to attend not only to structures, but also to processes and functions of complex systems.

  1. Reduze - Feynman integral reduction in C++

    NASA Astrophysics Data System (ADS)

    Studerus, C.

    2010-07-01

    Reduze is a computer program for reducing Feynman integrals to master integrals employing a Laporta algorithm. The program is written in C++ and uses classes provided by the GiNaC library to perform the simplifications of the algebraic prefactors in the system of equations. Reduze offers the possibility to run reductions in parallel. Program summary: Program title: Reduze. Catalogue identifier: AEGE_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGE_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: yes. No. of lines in distributed program, including test data, etc.: 55 433. No. of bytes in distributed program, including test data, etc.: 554 866. Distribution format: tar.gz. Programming language: C++. Computer: All. Operating system: Unix/Linux. Number of processors used: problem dependent; more than one is possible but not arbitrarily many. RAM: depends on the complexity of the system. Classification: 4.4, 5. External routines: CLN (http://www.ginac.de/CLN/), GiNaC (http://www.ginac.de/). Nature of problem: solving large systems of linear equations with Feynman integrals as unknowns and rational polynomials as prefactors. Solution method: a Gauss/Laporta algorithm to solve the system of equations. Restrictions: limitations depend on the complexity of the system (number of equations, number of kinematic invariants). Running time: depends on the complexity of the system.
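
    The elimination step at the heart of a Laporta-style reduction can be sketched in a few lines. The toy below uses Python's exact `fractions` arithmetic in place of the CLN/GiNaC rationals Reduze relies on, and the three-integral system is hypothetical and far simpler than the integration-by-parts identities the program actually targets.

```python
from fractions import Fraction

def gauss_reduce(rows):
    """Bring a system of linear relations to reduced row-echelon form with
    exact rational arithmetic (a toy stand-in for the Gauss/Laporta
    elimination Reduze performs on integration-by-parts identities).

    Each row holds Fraction coefficients over an ordered basis of integrals;
    after elimination, unknowns without a pivot are the master integrals."""
    rows = [list(r) for r in rows]
    pivots, row, col, ncols = [], 0, 0, len(rows[0])
    while row < len(rows) and col < ncols:
        piv = next((i for i in range(row, len(rows)) if rows[i][col] != 0), None)
        if piv is None:
            col += 1
            continue
        rows[row], rows[piv] = rows[piv], rows[row]
        inv = Fraction(1) / rows[row][col]
        rows[row] = [c * inv for c in rows[row]]          # normalize pivot row
        for i in range(len(rows)):                        # eliminate the column
            if i != row and rows[i][col] != 0:
                f = rows[i][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[row])]
        pivots.append(col)
        row += 1
        col += 1
    return rows, pivots

# Two hypothetical relations among three integrals I0, I1, I2:
F = Fraction
rows, pivots = gauss_reduce([[F(1), F(2), F(1)], [F(0), F(1), F(3)]])
# pivots == [0, 1]: I0 and I1 are expressed in terms of the single master I2.
```

    The exact-arithmetic point matters: with rational prefactors there is no rounding, so the reduction to masters is an identity, not an approximation.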

  2. Positioning infrastructure and technologies for low-carbon urbanization

    NASA Astrophysics Data System (ADS)

    Chester, Mikhail V.; Sperling, Josh; Stokes, Eleanor; Allenby, Braden; Kockelman, Kara; Kennedy, Christopher; Baker, Lawrence A.; Keirstead, James; Hendrickson, Chris T.

    2014-10-01

    The expected urbanization of the planet in the coming century coupled with aging infrastructure in developed regions, increasing complexity of man-made systems, and pressing climate change impacts have created opportunities for reassessing the role of infrastructure and technologies in cities and how they contribute to greenhouse gas (GHG) emissions. Modern urbanization is predicated on complex, increasingly coupled infrastructure systems, and energy use continues to be largely met from fossil fuels. Until energy infrastructures evolve away from carbon-based fuels, GHG emissions are critically tied to the urbanization process. Further complicating the challenge of decoupling urban growth from GHG emissions are lock-in effects and interdependencies. This paper synthesizes state-of-the-art thinking for transportation, fuels, buildings, water, electricity, and waste systems and finds that GHG emissions assessments tend to view these systems as static and isolated from social and institutional systems. Despite significant understanding of methods and technologies for reducing infrastructure-related GHG emissions, physical, institutional, and cultural constraints continue to work against us, pointing to knowledge gaps that must be addressed. This paper identifies three challenge themes to improve our understanding of the role of infrastructure and technologies in urbanization processes and position these increasingly complex systems for low-carbon growth. The challenges emphasize how we can reimagine the role of infrastructure in the future and how people, institutions, and ecological systems interface with infrastructure.

  3. KBGIS-II: A knowledge-based geographic information system

    NASA Technical Reports Server (NTRS)

    Smith, Terence; Peuquet, Donna; Menon, Sudhakar; Agarwal, Pankaj

    1986-01-01

    The architecture and working of a recently implemented Knowledge-Based Geographic Information System (KBGIS-II), designed to satisfy several general criteria for the GIS, are described. The system has four major functions, including query answering, learning, and editing. The main query finds constrained locations for spatial objects that are describable in a predicate-calculus based spatial object language. The main search procedures include a family of constraint-satisfaction procedures that use a spatial object knowledge base to search efficiently for complex spatial objects in large, multilayered spatial databases. These databases are represented in quadtree form. The search strategy is designed to reduce the computational cost of search in the average case. The learning capabilities of the system include the addition of new locations of complex spatial objects to the knowledge base as queries are answered, and the ability to inductively learn definitions of new spatial objects from examples. The new definitions are added to the knowledge base by the system. The system is performing all its designated tasks successfully. Future reports will relate performance characteristics of the system.

  4. Failure of local thermal equilibrium in quantum friction

    DOE PAGES

    Intravaia, Francesco; Behunin, Ryan; Henkel, Carsten; ...

    2016-09-01

    Recent progress in manipulating atomic and condensed matter systems has instigated a surge of interest in nonequilibrium physics, including many-body dynamics of trapped ultracold atoms and ions, near-field radiative heat transfer, and quantum friction. Under most circumstances the complexity of such nonequilibrium systems requires a number of approximations to make theoretical descriptions tractable. In particular, it is often assumed that spatially separated components of a system thermalize with their immediate surroundings, although the global state of the system is out of equilibrium. This powerful assumption reduces the complexity of nonequilibrium systems to the local application of well-founded equilibrium concepts. While this technique appears to be consistent for the description of some phenomena, we show that it fails for quantum friction, underestimating the magnitude of the drag force by approximately 80%. Here, our results show that the correlations among the components of driven, but steady-state, quantum systems invalidate the assumption of local thermal equilibrium, calling for a critical reexamination of this approach for describing the physics of nonequilibrium systems.

  5. Thermo-mechanical analysis of ITER first mirrors and its use for the ITER equatorial visible/infrared wide angle viewing system optical design.

    PubMed

    Joanny, M; Salasca, S; Dapena, M; Cantone, B; Travère, J M; Thellier, C; Fermé, J J; Marot, L; Buravand, O; Perrollaz, G; Zeile, C

    2012-10-01

    ITER first mirrors (FMs), as the first components of most ITER optical diagnostics, will be exposed to high plasma radiation flux and neutron load. To reduce FMs heating and the optical surface deformation induced during ITER operation, the use of suitable materials and a cooling system is foreseen. Calculations carried out on different materials and FMs designs and geometries (100 mm and 200 mm) show that the use of CuCrZr and TZM, together with a complex integrated cooling system, can efficiently limit FMs heating and reduce their optical surface deformation under plasma radiation flux and neutron load. These investigations were used to evaluate, for the ITER equatorial port visible/infrared wide angle viewing system, the impact of changes in FMs properties during operation on the instrument's main optical performance. The results obtained are presented and discussed.

  6. Floating Offshore WTG Integrated Load Analysis & Optimization Employing a Tuned Mass Damper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez Tsouroukdissian, Arturo; Lackner, Matt; Cross-Whiter, John

    2015-09-25

    Floating offshore wind turbines (FOWTs) present complex design challenges due to the coupled dynamics of the platform motion, mooring system, and turbine control systems in response to wind and wave loading. This can lead to higher extreme and fatigue loads than in a comparable fixed-bottom or onshore system. Previous research [1] has shown the potential to reduce extreme and fatigue loads on FOWTs using tuned mass dampers (TMDs) for structural control. This project aims to reduce maximum loads using passive TMDs located at the tower top during extreme storm events, when grid-supplied power for other control systems may not be available. The Alstom Haliade 6 MW wind turbine is modelled on the Glosten PelaStar tension-leg platform (TLP). The primary objectives of this project are to provide a preliminary assessment of the load reduction potential of passive TMDs on real wind turbine and TLP designs.

  7. PAPR reduction based on tone reservation scheme for DCO-OFDM indoor visible light communications.

    PubMed

    Bai, Jurong; Li, Yong; Yi, Yang; Cheng, Wei; Du, Huimin

    2017-10-02

    High peak-to-average power ratio (PAPR) leads to out-of-band power and in-band distortion in direct current-biased optical orthogonal frequency division multiplexing (DCO-OFDM) systems. In order to effectively reduce the PAPR with faster convergence and lower complexity, this paper proposes a tone reservation based scheme that combines the signal-to-clipping noise ratio (SCR) procedure with the least squares approximation (LSA) procedure. In the proposed scheme, the transmitter of the DCO-OFDM indoor visible light communication (VLC) system is designed to transform the PAPR-reduced signal into a real-valued positive OFDM signal without doubling the transmission bandwidth. Moreover, the communication distance and the light emitting diode (LED) irradiance angle are taken into consideration in the evaluation of the system bit error rate (BER). The PAPR reduction efficiency of the proposed scheme is remarkable for DCO-OFDM indoor VLC systems.
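
    The quantity this scheme targets can be made concrete with a short sketch: a naive inverse DFT builds the discrete-time OFDM symbol, and the PAPR is the ratio of peak to mean power. This shows only the metric, not the SCR/LSA tone-reservation procedure of the paper; the subcarrier vectors are invented for illustration.

```python
import cmath
import math

def idft(X):
    """Naive O(N^2) inverse DFT producing the discrete-time OFDM symbol."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def papr_db(x):
    """Peak-to-average power ratio of a discrete-time symbol, in dB."""
    powers = [abs(v) ** 2 for v in x]
    return 10.0 * math.log10(max(powers) / (sum(powers) / len(powers)))

# Worst case: all subcarriers phase-aligned -> an impulse in time, with
# PAPR = 10*log10(N) dB; a single active subcarrier gives a constant
# envelope and 0 dB PAPR.
worst = papr_db(idft([1.0] * 8))       # ~9.03 dB for N = 8
flat = papr_db(idft([1.0, 0, 0, 0]))   # 0 dB
```

    Tone reservation spends a few reserved subcarriers to cancel the time-domain peaks, pulling `worst` down toward `flat` without distorting the data-bearing tones.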

  8. Mathematical Modeling of Dual Layer Shell Type Recuperation System for Biogas Dehumidification

    NASA Astrophysics Data System (ADS)

    Gendelis, S.; Timuhins, A.; Laizans, A.; Bandeniece, L.

    2015-12-01

    The main aim of the current paper is to create a mathematical model for a dual layer shell type recuperation system, which allows reducing the heat losses from the biomass digester and the water content of the biogas without any additional mechanical or chemical components. The idea of this system is to reduce the temperature of the outflowing gas by creating a two-layered counter-flow heat exchanger around the walls of the biogas digester, thus increasing the thermal resistance and the gas temperature and causing condensation on the colder surface. A complex mathematical model, including surface condensation, is developed for this type of biogas dehumidifier, and a parameter study is carried out over a wide range of parameters. The model is reduced to the 1D case to make numerical calculations faster. It is shown that the latent heat of condensation is very important for the total heat balance and that the condensation rate depends strongly on the insulation between layers and the outside temperature. The modelling results allow finding optimal geometrical parameters for a known gas flow and predicting the condensation rate for different system setups and seasons.

  9. Heme versus non-heme iron-nitroxyl {FeN(H)O}⁸ complexes: electronic structure and biologically relevant reactivity.

    PubMed

    Speelman, Amy L; Lehnert, Nicolai

    2014-04-15

    Researchers have completed extensive studies on heme and non-heme iron-nitrosyl complexes, which are labeled {FeNO}(7) in the Enemark-Feltham notation, but they have had very limited success in producing corresponding, one-electron reduced, {FeNO}(8) complexes where a nitroxyl anion (NO(-)) is formally bound to an iron(II) center. These complexes, and their protonated iron(II)-NHO analogues, are proposed key intermediates in nitrite (NO2(-)) and nitric oxide (NO) reducing enzymes in bacteria and fungi. In addition, HNO is known to have a variety of physiological effects, most notably in the cardiovascular system. HNO may also serve as a signaling molecule in mammals. For these functions, iron-containing proteins may mediate the production of HNO and serve as receptors for HNO in vivo. In this Account, we highlight recent key advances in the preparation, spectroscopic characterization, and reactivity of ferrous heme and non-heme nitroxyl (NO(-)/HNO) complexes that have greatly enhanced our understanding of the potential biological roles of these species. Low-spin (ls) heme {FeNO}(7) complexes (S = 1/2) can be reversibly reduced to the corresponding {FeNO}(8) species, which are stable, diamagnetic compounds. Because the reduction is ligand (NO) centered in these cases, it occurs at extremely negative redox potentials that are at the edge of the biologically feasible range. Interestingly, the electronic structures of ls-{FeNO}(7) and ls-{FeNO}(8) species are strongly correlated with very similar frontier molecular orbitals (FMOs) and thermodynamically strong Fe-NO bonds. In contrast, high-spin (hs) non-heme {FeNO}(7) complexes (S = 3/2) can be reduced at relatively mild redox potentials. Here, the reduction is metal-centered and leads to a paramagnetic (S = 1) {FeNO}(8) complex. The increased electron density at the iron center in these species significantly decreases the covalency of the Fe-NO bond, making the reduced complexes highly reactive. 
In the absence of steric bulk, monomeric high-spin {FeNO}(8) complexes decompose rapidly. Notably, in a recently prepared, dimeric [{FeNO}(7)]2 species, we observed that reduction leads to rapid N-N bond formation and N2O generation, which directly models the reactivity of flavodiiron NO reductases (FNORs). We have also made key progress in the preparation and stabilization of corresponding HNO complexes, {FeNHO}(8), using both heme and non-heme ligand sets. In both cases, we have taken advantage of sterically bulky coligands to stabilize these species. ls-{FeNO}(8) complexes are basic and easily form corresponding ls-{FeNHO}(8) species, which, however, decompose rapidly via disproportionation and H2 release. Importantly, we recently showed that we can suppress this reaction via steric protection of the bound HNO ligand. As a result, we have demonstrated that ls-{FeNHO}(8) model complexes are stable and amenable to spectroscopic characterization. Neither ls-{FeNO}(8) nor ls-{FeNHO}(8) model complexes are active for N-N coupling, and hence seem unsuitable as reactive intermediates in nitric oxide reductases (NORs). Hs-{FeNO}(8) complexes are more basic than their hs-{FeNO}(7) precursors, but their electronic structures and reactivity are not as well characterized.

  10. SQL is Dead; Long-live SQL: Relational Database Technology in Science Contexts

    NASA Astrophysics Data System (ADS)

    Howe, B.; Halperin, D.

    2014-12-01

    Relational databases are often perceived as a poor fit in science contexts: Rigid schemas, poor support for complex analytics, unpredictable performance, significant maintenance and tuning requirements --- these idiosyncrasies often make databases unattractive in science contexts characterized by heterogeneous data sources, complex analysis tasks, rapidly changing requirements, and limited IT budgets. In this talk, I'll argue that although the value proposition of typical relational database systems is weak in science, the core ideas that power relational databases have become incredibly prolific in open source science software, and are emerging as a universal abstraction for both big data and small data. In addition, I'll talk about two open source systems we are building to "jailbreak" the core technology of relational databases and adapt them for use in science. The first is SQLShare, a Database-as-a-Service system supporting collaborative data analysis and exchange by reducing database use to an Upload-Query-Share workflow with no installation, schema design, or configuration required. The second is Myria, a service that supports much larger scale data and complex analytics, and supports multiple back end systems. Finally, I'll describe some of the ways our collaborators in oceanography, astronomy, biology, fisheries science, and more are using these systems to replace script-based workflows for reasons of performance, flexibility, and convenience.

  11. A special purpose silicon compiler for designing supercomputing VLSI systems

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Murugavel, P.; Kamakoti, V.; Shankarraman, M. J.; Rangarajan, S.; Mallikarjun, M.; Karthikeyan, B.; Prabhakar, T. S.; Satish, V.; Venkatasubramaniam, P. R.

    1991-01-01

    Design of general/special purpose supercomputing VLSI systems for numeric algorithm execution involves tackling two important aspects, namely their computational and communication complexities. Development of software tools for designing such systems is itself complex, so a novel design methodology has to be developed. Designing such complex systems requires a special purpose silicon compiler in which: the computational and communication structures of different numeric algorithms are taken into account to simplify the silicon compiler design; the approach is macrocell based; and the software tools at different levels (from the algorithm down to the VLSI circuit layout) are integrated. In this paper a special purpose silicon (SPS) compiler based on PACUBE macrocell VLSI arrays for designing supercomputing VLSI systems is presented. It is shown that turn-around time and silicon real estate are reduced relative to silicon compilers based on PLAs, SLAs, and gate arrays. The first two silicon compiler characteristics mentioned above enable the SPS compiler to perform systolic mapping (at the macrocell level) of algorithms whose computational structures are of GIPOP (generalized inner product outer product) form. Direct systolic mapping on PLAs, SLAs, and gate arrays is very difficult as they are micro-cell based. A novel GIPOP processor is under development using this special purpose silicon compiler.

  12. Designing Flight-Deck Procedures

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Wiener, L.; Shafto, Mike (Technical Monitor)

    1995-01-01

    A complex human-machine system consists of more than merely one or more human operators and a collection of hardware components. In order to operate a complex system successfully, the human-machine system must be supported by an organizational infrastructure of operating concepts, rules, guidelines, and documents. The coherency of such operating concepts, in terms of consistency and logic, is vitally important for the efficiency and safety of any complex system. In high-risk endeavors such as aircraft operations, space flight, nuclear power production, manufacturing process control, and military operations, it is essential that such support be flawless, as the price of operational error can be high. When operating rules are not adhered to, or the rules are inadequate for the task at hand, not only will the system's goals be thwarted, but there may also be tragic human and material consequences. To ensure safe and predictable operations, support to the operators, in this case flight crews, often comes in the form of standard operating procedures. These provide the crew with step-by-step guidance for carrying out their operations. Standard procedures do indeed promote uniformity, but they do so at the risk of reducing the role of human operators to a lower level. Management, however, must recognize the danger of over-procedurization, which fails to exploit one of the most valuable assets in the system, the intelligent operator who is "on the scene." The alert system designer and operations manager recognize that there cannot be a procedure for everything, and the time will come in which the operators of a complex system will face a situation for which there is no written procedure. Procedures, whether executed by humans or machines, have their place, but so does human cognition.

  13. Pixel-based OPC optimization based on conjugate gradients.

    PubMed

    Ma, Xu; Arce, Gonzalo R

    2011-01-31

    Optical proximity correction (OPC) methods are resolution enhancement techniques (RET) used extensively in the semiconductor industry to improve the resolution and pattern fidelity of optical lithography. In pixel-based OPC (PBOPC), the mask is divided into small pixels, each of which is modified during the optimization process. Two critical issues in PBOPC are the required computational complexity of the optimization process, and the manufacturability of the optimized mask. Most current OPC optimization methods apply the steepest descent (SD) algorithm to improve image fidelity augmented by regularization penalties to reduce the complexity of the mask. Although simple to implement, the SD algorithm converges slowly. The existing regularization penalties, however, fall short in meeting the mask rule check (MRC) requirements often used in semiconductor manufacturing. This paper focuses on developing OPC optimization algorithms based on the conjugate gradient (CG) method which exhibits much faster convergence than the SD algorithm. The imaging formation process is represented by the Fourier series expansion model which approximates the partially coherent system as a sum of coherent systems. In order to obtain more desirable manufacturability properties of the mask pattern, a MRC penalty is proposed to enlarge the linear size of the sub-resolution assistant features (SRAFs), as well as the distances between the SRAFs and the main body of the mask. Finally, a projection method is developed to further reduce the complexity of the optimized mask pattern.
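
    The convergence argument in this record, that conjugate gradients beat steepest descent on ill-conditioned objectives, is easiest to see on a toy quadratic. The sketch below is a plain CG solver for a symmetric positive-definite system, not the paper's PBOPC image-fidelity cost; the 2x2 matrix is invented for illustration.

```python
def conjugate_gradient(A, b, x=None, iters=50, tol=1e-10):
    """Minimize 0.5*x^T A x - b^T x for symmetric positive-definite A.

    On an n-dimensional quadratic, CG terminates in at most n steps, which
    is the convergence advantage over steepest descent that the abstract
    exploits (toy objective; the real PBOPC cost is an image-fidelity
    functional over mask pixels)."""
    n = len(b)
    x = list(x) if x is not None else [0.0] * n
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(a * c for a, c in zip(u, v))
    r = [bi - ri for bi, ri in zip(b, matvec(A, x))]  # residual b - Ax
    p = list(r)                                       # initial search direction
    rs = dot(r, r)
    for _ in range(iters):
        if rs < tol:
            break
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)                       # exact line search
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]  # conjugate direction
        rs = rs_new
    return x

# 2x2 SPD system: CG reaches the minimizer x = [1/11, 7/11] in <= 2 steps.
x = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

    Steepest descent on the same problem zigzags with a rate governed by the condition number, which is why the paper replaces SD with CG for the pixel-based mask optimization.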

  14. Reduced Basis and Stochastic Modeling of Liquid Propellant Rocket Engine as a Complex System

    DTIC Science & Technology

    2015-07-02

    additions, the approach will be extended to a real-gas system so that it can be used to investigate model multi-element liquid rocket combustors in a... Sirignano (2010). In the following discussion, we examine the various conservation principles for the gas and liquid phases. The hyperbolic nature of the... conservation equations for the gas and liquid phases. Mass conservation of individual chemical species or of individual classes of liquid droplets will

  15. Entry, Descent, and Landing With Propulsive Deceleration

    NASA Technical Reports Server (NTRS)

    Palaszewski, Bryan

    2012-01-01

    The future exploration of the Solar System will require innovations in transportation and the use of entry, descent, and landing (EDL) systems at many planetary landing sites. The cost of space missions has always been prohibitive, and using the natural atmospheres of planets and their moons for entry, descent, and landing can reduce the cost, mass, and complexity of these missions. This paper describes some of the EDL ideas for planetary entry and surveys the overall technologies for EDL that may be attractive for future Solar System missions.

  16. Nonparametric method for failures diagnosis in the actuating subsystem of aircraft control system

    NASA Astrophysics Data System (ADS)

    Terentev, M. N.; Karpenko, S. S.; Zybin, E. Yu; Kosyanchuk, V. V.

    2018-02-01

    In this paper we design a nonparametric method for failure diagnosis in the aircraft control system that uses only measurements of the control signals and the aircraft states. It doesn't require a priori information about the aircraft model parameters, training, or statistical calculations, and is based on an analytical nonparametric one-step-ahead state prediction approach. This makes it possible to predict the behavior of unidentified and failed dynamic systems, to weaken the requirements on control signals, and to reduce the diagnostic time and problem complexity.
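
    The general idea of one-step-ahead residual diagnosis can be sketched briefly. The snippet below is a loose analogue only: it assumes a known scalar predictor, whereas the paper's construction is analytical and nonparametric and needs no model at all; the actuator dynamics and threshold are invented for illustration.

```python
def one_step_residuals(states, controls, predictor):
    """Compare measured states with one-step-ahead predictions; a sustained
    jump in the residual signals a failure.

    `predictor(x_k, u_k) -> x_{k+1}` stands in for the paper's analytical
    nonparametric predictor (hypothetical stand-in)."""
    return [abs(x_next - predictor(x, u))
            for x, u, x_next in zip(states, controls, states[1:])]

def diagnose(residuals, threshold):
    """Indices of steps where the prediction residual exceeds the threshold."""
    return [k for k, r in enumerate(residuals) if r > threshold]

# Hypothetical scalar actuator: x_{k+1} = 0.9*x + u while healthy; the
# actuator 'sticks' (ignores u) from step 3 onward.
healthy = lambda x, u: 0.9 * x + u
xs, us = [1.0], [0.5, 0.5, 0.5, 0.5, 0.5]
for k, u in enumerate(us):
    xs.append(healthy(xs[-1], u) if k < 3 else 0.9 * xs[-1])

bad_steps = diagnose(one_step_residuals(xs, us, healthy), threshold=0.1)
# bad_steps == [3, 4]: the residual flags exactly the post-failure steps.
```

    Because only measured signals enter the residual, the same flagging logic applies however the one-step prediction is obtained, which is the property the paper exploits.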

  17. Endoscopic, single-catheter treatment of Dandy-Walker syndrome hydrocephalus: technical case report and review of treatment options.

    PubMed

    Sikorski, Christian W; Curry, Daniel J

    2005-01-01

    Optimal treatment for hydrocephalus related to Dandy-Walker syndrome (DWS) remains elusive. Patients with DWS-related hydrocephalus often require combinations of shunting systems to effectively drain both the supratentorial ventricles and the posterior fossa cyst. We describe an endoscopic technique whereby a frontally placed, single-catheter shunting system effectively drained both the supratentorial and infratentorial compartments. This reduces the complexity and potential risk associated with the combined shunting systems required by so many patients with DWS-related hydrocephalus. Copyright 2005 S. Karger AG, Basel.

  18. A methodology based on reduced complexity algorithm for system applications using microprocessors

    NASA Technical Reports Server (NTRS)

    Yan, T. Y.; Yao, K.

    1988-01-01

    The paper considers a methodology on the analysis and design of a minimum mean-square error criterion linear system incorporating a tapped delay line (TDL) where all the full-precision multiplications in the TDL are constrained to be powers of two. A linear equalizer based on the dispersive and additive noise channel is presented. This microprocessor implementation with optimized power of two TDL coefficients achieves a system performance comparable to the optimum linear equalization with full-precision multiplications for an input data rate of 300 baud.
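
    The power-of-two constraint in this record can be sketched concretely: if every tap weight is a signed power of two, each multiply in the tapped delay line reduces to a shift on a microprocessor. The quantizer below simply rounds each tap independently, whereas the paper optimizes the exponents jointly against the mean-square error criterion; the tap values are invented for illustration.

```python
import math

def nearest_power_of_two(c):
    """Quantize a tap weight to the nearest signed power of two (zero kept),
    so each TDL multiply becomes a shift (illustrative per-tap rounding,
    not the paper's joint optimization)."""
    if c == 0.0:
        return 0.0
    e = round(math.log2(abs(c)))
    return math.copysign(2.0 ** e, c)

def tdl_filter(taps, samples):
    """Tapped-delay-line output y[n] = sum_k taps[k] * x[n-k]."""
    out = []
    for n in range(len(samples)):
        out.append(sum(t * samples[n - k]
                       for k, t in enumerate(taps) if n - k >= 0))
    return out

# Hypothetical full-precision equalizer taps and their power-of-two versions:
full = [0.9, -0.26, 0.06]
pow2 = [nearest_power_of_two(t) for t in full]  # [1.0, -0.25, 0.0625]
```

    The residual question, which the paper answers for a 300 baud dispersive channel, is whether the constrained taps still achieve near-optimal equalization performance.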

  19. The endocrine system and sarcopenia: potential therapeutic benefits.

    PubMed

    McIntire, Kevin L; Hoffman, Andrew R

    2011-12-01

    Age related muscle loss, known as sarcopenia, is a major factor in disability, loss of mobility and quality of life in the elderly. There are many proposed mechanisms of age-related muscle loss that include the endocrine system. A variety of hormones regulate growth, development and metabolism throughout the lifespan. Hormone activity may change with age as a result of reduced hormone secretion or decreased tissue responsiveness. This review will focus on the complex interplay between the endocrine system, aging and skeletal muscle and will present possible benefits of therapeutic interventions for sarcopenia.

  20. Structured grid technology to enable flow simulation in an integrated system environment

    NASA Astrophysics Data System (ADS)

    Remotigue, Michael Gerard

    An application-driven Computational Fluid Dynamics (CFD) environment needs flexible and general tools to effectively solve complex problems in a timely manner. In addition, reusable, portable, and maintainable specialized libraries will aid in rapidly developing integrated systems or procedures. The presented structured grid technology enables the flow simulation for complex geometries by addressing grid generation, grid decomposition/solver setup, solution, and interpretation. Grid generation is accomplished with the graphical, arbitrarily-connected, multi-block structured grid generation software system (GUM-B) developed and presented here. GUM-B is an integrated system comprised of specialized libraries for the graphical user interface and graphical display coupled with a solid-modeling data structure that utilizes a structured grid generation library and a geometric library based on Non-Uniform Rational B-Splines (NURBS). A presented modification of the solid-modeling data structure provides the capability for arbitrarily-connected regions between the grid blocks. The presented grid generation library provides algorithms that are reliable and accurate. GUM-B has been utilized to generate numerous structured grids for complex geometries in hydrodynamics, propulsors, and aerodynamics. The versatility of the libraries that compose GUM-B is also displayed in a prototype to automatically regenerate a grid for a free-surface solution. Grid decomposition and solver setup is accomplished with the graphical grid manipulation and repartition software system (GUMBO) developed and presented here. GUMBO is an integrated system comprised of specialized libraries for the graphical user interface and graphical display coupled with a structured grid-tools library. The described functions within the grid-tools library reduce the possibility of human error during decomposition and setup for the numerical solver by accounting for boundary conditions and connectivity. 
GUMBO is linked, through a flow solver interface, to the parallel UNCLE code to provide load-balancing tools and solver setup. Weeks of boundary condition and connectivity specification and validation have been reduced to hours. The UNCLE flow solver is utilized for the solution of the flow field. To accelerate convergence toward a quick engineering answer, a full multigrid (FMG) approach coupled with UNCLE, which is a full approximation scheme (FAS), is presented. The prolongation operators used in the FMG-FAS method are compared. The procedure is demonstrated on a marine propeller in incompressible flow. Interpretation of the solution is accomplished by vortex feature detection. Regions of "Intrinsic Swirl" are located by interrogating the velocity gradient tensor for complex eigenvalues. The "Intrinsic Swirl" parameter is visualized on a solution of a marine propeller to determine whether any vortical features are captured. The libraries and the structured grid technology presented herein are flexible and general enough to tackle a variety of complex applications. This technology has significantly enhanced the capability of ERC personnel to effectively calculate solutions for complex geometries.
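
The swirl test described above — flagging regions where the velocity gradient tensor has a complex-conjugate eigenvalue pair — can be sketched in a few lines. The example tensors below are illustrative assumptions, not data from the propeller solution.

```python
import numpy as np

def has_intrinsic_swirl(grad_u, tol=1e-9):
    """True if the 3x3 velocity gradient tensor has a complex-conjugate
    eigenvalue pair, i.e. the local streamline pattern spirals."""
    return bool(np.any(np.abs(np.linalg.eigvals(grad_u).imag) > tol))

# Solid-body rotation about z (swirling) vs. simple shear (not swirling)
rotation = np.array([[0.0, -1.0, 0.0],
                     [1.0,  0.0, 0.0],
                     [0.0,  0.0, 0.0]])
shear = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0],
                  [0.0, 0.0, 0.0]])
```

The rotation tensor has eigenvalues 0 and ±i and is flagged as swirling; the shear tensor's eigenvalues are all real (zero), so it is not.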

  1. Protocol for a mixed methods study of hospital readmissions: sensemaking in Veterans Health Administration healthcare system in the USA.

    PubMed

    Penney, Lauren S; Leykum, Luci K; Noël, Polly; Finley, Erin P; Lanham, Holly Jordan; Pugh, Jacqueline

    2018-04-07

    Effective delivery of healthcare in complex systems requires managing interdependencies between professions and organisational units. Reducing 30-day hospital readmissions may be one of the most complex tasks that a healthcare system can undertake. We propose that these less than optimal outcomes are related to difficulties managing the complex interdependencies among organisational units and to a lack of effective sensemaking among individuals and organisational units regarding how best to coordinate patient needs. This is a mixed method, multistepped study. We will conduct in-depth qualitative organisational case studies in 10 Veterans Health Administration facilities (6 with improving and 4 with worsening readmission rates), focusing on relationships, sensemaking and improvisation around care transition processes intended to reduce early readmissions. Data will be gathered through multiple methods (eg, chart reviews, surveys, interviews, observations) and analysed using analytic memos, qualitative coding and statistical analyses. We will construct an agent-based model based on those results to explore the influence of sensemaking and specific care transition processes on early readmissions. Ethical approval has been obtained through the Institutional Review Board of the University of Texas Health Science Center at San Antonio (approval number: 14-258 hour). We will disseminate our findings in manuscripts in peer-reviewed journals, professional conferences and through short reports back to participating entities and stakeholders. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. Application of superconducting technology to earth-to-orbit electromagnetic launch systems

    NASA Technical Reports Server (NTRS)

    Hull, J. R.; Carney, L. M.

    1988-01-01

    Benefits may occur by incorporating superconductors, both existing and those currently under development, in one or more parts of a large-scale electromagnetic launch (EML) system that is capable of delivering payloads from the surface of the Earth to space. The use of superconductors for many of the EML components results in lower system losses; consequently, reductions in the size and number of energy storage devices are possible. Applied high-temperature superconductivity may eventually enable novel design concepts for energy distribution and switching. All of these technical improvements have the potential to reduce system complexity and lower payload launch costs.

  3. Technology assessment--who is getting stuck, anyway?

    PubMed

    Bayne, C G

    1997-10-01

    Some 13% to 62% of all injuries reported to hospital occupational health workers are traceable to phlebotomy procedures. However, the selection of a needleless system is complex. The informed manager seeks answers to the following questions: (1) Do needleless systems reduce the risk of seroconversion to bloodborne pathogens? (Answer yes.) (2) Does the use of a needleless system affect patients' risk of catheter sepsis? (Answer no.) and (3) What about chemical compatibility with the newer materials used in needleless systems? (New variables require more studies.) The author lists references, manufacturers and some of the chemicals to which some manufacturers have exposed their devices.

  4. Intelligent Energy Systems As a Modern Basis For Improving Energy Efficiency

    NASA Astrophysics Data System (ADS)

    Vidyaev, Igor G.; Ivashutenko, Alexandr S.; Samburskaya, Maria A.

    2017-01-01

    This work presents data on the share of energy costs in the cost structure for different countries. Information is provided on reducing the use of energy resources by introducing intelligent control systems in industrial enterprises. The structure and use of such intelligent systems in the energy industry are considered. It is shown that constructing an intelligent system should be the strategic direction for the development of the distribution grid complex, implying four main areas for improvement: intellectualization of equipment, management, communication and automation.

  5. Future Data Communication Architectures for Safety Critical Aircraft Cabin Systems

    NASA Astrophysics Data System (ADS)

    Berkhahn, Sven-Olaf

    2012-05-01

    The cabin of modern aircraft is subject to increasing demands for fast reconfiguration and hence flexibility. These demands require studies of new network architectures and technologies for the electronic cabin systems that also consider weight and cost reductions as well as safety constraints. Two major approaches are under consideration to reduce the complex and heavy wiring harness: a so-called hybrid data bus technology, which allows several electronic cabin systems with different safety and security requirements to share the same data bus; and the application of wireless data transfer technologies for electronic cabin systems.

  6. Proteomics-Based Analysis of Protein Complexes in Pluripotent Stem Cells and Cancer Biology.

    PubMed

    Sudhir, Putty-Reddy; Chen, Chung-Hsuan

    2016-03-22

    A protein complex consists of two or more proteins that are linked together through protein-protein interactions. The proteins show stable/transient and direct/indirect interactions within the protein complex or between the protein complexes. Protein complexes are involved in regulation of most of the cellular processes and molecular functions. The delineation of protein complexes is important to expand our knowledge of proteins' functional roles in physiological and pathological conditions. The genetic yeast-2-hybrid method has been extensively used to characterize protein-protein interactions. Alternatively, a biochemical-based affinity purification coupled with mass spectrometry (AP-MS) approach has been widely used to characterize the protein complexes. In the AP-MS method, a protein complex of a target protein of interest is purified using a specific antibody or an affinity tag (e.g., DYKDDDDK peptide (FLAG) and polyhistidine (His)) and is subsequently analyzed by means of MS. Tandem affinity purification, a two-step purification system, coupled with MS has been widely used mainly to reduce contaminants. We review here a general principle for AP-MS-based characterization of protein complexes and we explore several protein complexes identified in pluripotent stem cell biology and cancer biology as examples.

  7. Proteomics-Based Analysis of Protein Complexes in Pluripotent Stem Cells and Cancer Biology

    PubMed Central

    Sudhir, Putty-Reddy; Chen, Chung-Hsuan

    2016-01-01

    A protein complex consists of two or more proteins that are linked together through protein–protein interactions. The proteins show stable/transient and direct/indirect interactions within the protein complex or between the protein complexes. Protein complexes are involved in regulation of most of the cellular processes and molecular functions. The delineation of protein complexes is important to expand our knowledge of proteins' functional roles in physiological and pathological conditions. The genetic yeast-2-hybrid method has been extensively used to characterize protein-protein interactions. Alternatively, a biochemical-based affinity purification coupled with mass spectrometry (AP-MS) approach has been widely used to characterize the protein complexes. In the AP-MS method, a protein complex of a target protein of interest is purified using a specific antibody or an affinity tag (e.g., DYKDDDDK peptide (FLAG) and polyhistidine (His)) and is subsequently analyzed by means of MS. Tandem affinity purification, a two-step purification system, coupled with MS has been widely used mainly to reduce contaminants. We review here a general principle for AP-MS-based characterization of protein complexes and we explore several protein complexes identified in pluripotent stem cell biology and cancer biology as examples. PMID:27011181

  8. Operation of passive membrane systems for drinking water treatment.

    PubMed

    Oka, P A; Khadem, N; Bérubé, P R

    2017-05-15

    The widespread adoption of submerged hollow fibre ultrafiltration (UF) for drinking water treatment is currently hindered by the complexity and cost of these membrane systems, especially in small/remote communities. Most of the complexity is associated with auxiliary fouling control measures, which include backwashing, air sparging and chemical cleaning. Recent studies have demonstrated that sustained operation without fouling control measures is possible, but little is known regarding the conditions under which extended operation can be sustained with minimal to no fouling control measures. The present study investigated the contribution of different auxiliary fouling control measures to the permeability that can be sustained, with the intent of minimizing the mechanical and operational complexity of submerged hollow fiber UF membrane systems while maximizing their throughput capacity. Sustained conditions could be achieved without backwashing, air sparging or chemical cleaning (i.e. passive operation), indicating that these fouling control measures can be eliminated, substantially simplifying the mechanical and operational complexity of submerged hollow fiber UF systems. The adoption of hydrostatic pressure (i.e. gravity) to provide the driving force for permeation further reduced the system complexity. Approximately 50% of the organic material in the raw water was removed during treatment. The sustained passive operation and effective removal of organic material was likely due to the microbial community that established itself on the membrane surface. The permeability that could be sustained was however only approximately 20% of that which can be maintained with fouling control measures. Retaining a small amount of air sparging (i.e. a few minutes daily) and incorporating a daily 1-h relaxation (i.e. permeate flux interruption) period prior to sparging more than doubled the permeability that could be sustained. 
Neither the approach used to interrupt the permeate flux nor that developed to draw air into the system for sparging using gravity adds substantial mechanical or operational complexity to the system. The high throughput capacity that can be sustained by eliminating all but a couple of simple fouling control measures makes passive membrane systems ideally suited to providing high quality water, especially where access to financial resources, technical expertise and/or electrical power is limited. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Reducing the Complexity Gap: Expanding the Period of Human Nurturance

    ERIC Educational Resources Information Center

    Kiel, L. Douglas

    2014-01-01

    Socio-techno-cultural reality, in the current historical era, evolves at a faster rate than do the human brain or human institutions. This reality creates a "complexity gap" that reduces human and institutional capacities to adapt to the challenges of late modernity. New insights from the neurosciences may help to reduce the complexity gap.…

  10. Prenatal Nicotine Exposure Disrupts Infant Neural Markers of Orienting.

    PubMed

    King, Erin; Campbell, Alana; Belger, Aysenil; Grewen, Karen

    2018-06-07

    Prenatal nicotine exposure (PNE) from maternal cigarette smoking is linked to developmental deficits, including impaired auditory processing, language, generalized intelligence, attention, and sleep. Fetal brain undergoes massive growth, organization, and connectivity during gestation, making it particularly vulnerable to neurotoxic insult. Nicotine binds to nicotinic acetylcholine receptors, which are extensively involved in growth, connectivity, and function of developing neural circuitry and neurotransmitter systems. Thus, PNE may have long-term impact on neurobehavioral development. The purpose of this study was to compare the auditory K-complex, an event-related potential reflective of auditory gating, sleep preservation and memory consolidation during sleep, in infants with and without PNE and to relate these neural correlates to neurobehavioral development. We compared brain responses to an auditory paired-click paradigm in 3- to 5-month-old infants during Stage 2 sleep, when the K-complex is best observed. We measured component amplitude and delta activity during the K-complex. Infants with PNE demonstrated significantly smaller amplitude of the N550 component and reduced delta-band power within elicited K-complexes compared to nonexposed infants and also were less likely to orient with a head turn to a novel auditory stimulus (bell ring) when awake. PNE may impair auditory sensory gating, which may contribute to disrupted sleep and to reduced auditory discrimination and learning, attention re-orienting, and/or arousal during wakefulness reported in other studies. Links between PNE and reduced K-complex amplitude and delta power may represent altered cholinergic and GABAergic synaptic programming and possibly reflect early neural bases for PNE-linked disruptions in sleep quality and auditory processing. These may pose significant disadvantage for language acquisition, attention, and social interaction necessary for academic and social success.

  11. Activity of Tobramycin against Cystic Fibrosis Isolates of Burkholderia cepacia Complex Grown as Biofilms.

    PubMed

    Kennedy, Sarah; Beaudoin, Trevor; Yau, Yvonne C W; Caraher, Emma; Zlosnik, James E A; Speert, David P; LiPuma, John J; Tullis, Elizabeth; Waters, Valerie

    2016-01-01

    Pulmonary infection with Burkholderia cepacia complex in cystic fibrosis (CF) patients is associated with more-rapid lung function decline and earlier death than in CF patients without this infection. In this study, we used confocal microscopy to visualize the effects of various concentrations of tobramycin, achievable with systemic and aerosolized drug administration, on mature B. cepacia complex biofilms, both in the presence and absence of CF sputum. After 24 h of growth, biofilm thickness was significantly reduced by exposure to 2,000 μg/ml of tobramycin for Burkholderia cepacia, Burkholderia multivorans, and Burkholderia vietnamiensis; 200 μg/ml of tobramycin was sufficient to reduce the thickness of Burkholderia dolosa biofilm. With a more mature 48-h biofilm, significant reductions in thickness were seen with tobramycin at concentrations of ≥100 μg/ml for all Burkholderia species. In addition, an increased ratio of dead to live cells was observed in comparison to control with tobramycin concentrations of ≥200 μg/ml for B. cepacia and B. dolosa (24 h) and ≥100 μg/ml for Burkholderia cenocepacia and B. dolosa (48 h). Although sputum significantly increased biofilm thickness, tobramycin concentrations of 1,000 μg/ml were still able to significantly reduce biofilm thickness of all B. cepacia complex species with the exception of B. vietnamiensis. In the presence of sputum, 1,000 μg/ml of tobramycin significantly increased the dead-to-live ratio only for B. multivorans compared to control. In summary, although killing is attenuated, high-dose tobramycin can effectively decrease the thickness of B. cepacia complex biofilms, even in the presence of sputum, suggesting a possible role as a suppressive therapy in CF. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  12. Interspecies Systems Biology Uncovers Metabolites Affecting C. elegans Gene Expression and Life History Traits

    PubMed Central

    Watson, Emma; MacNeil, Lesley T.; Ritter, Ashlyn D.; Yilmaz, L. Safak; Rosebrock, Adam P.; Caudy, Amy A.; Walhout, Albertha J. M.

    2014-01-01

    Diet greatly influences gene expression and physiology. In mammals, elucidating the effects and mechanisms of individual nutrients is challenging due to the complexity of both the animal and its diet. Here we used an interspecies systems biology approach with Caenorhabditis elegans and two of its bacterial diets, Escherichia coli and Comamonas aquatica, to identify metabolites that affect the animal's gene expression and physiology. We identify vitamin B12 as the major dilutable metabolite provided by Comamonas aq. that regulates gene expression, accelerates development and reduces fertility, but does not affect lifespan. We find that vitamin B12 has a dual role in the animal: it affects development and fertility via the methionine/S-adenosylmethionine (SAM) cycle, and it breaks down the short-chain fatty acid propionic acid, preventing its toxic buildup. Our interspecies systems biology approach provides a paradigm for understanding complex interactions between diet and physiology. PMID:24529378

  13. Digital tanlock loop architecture with no delay

    NASA Astrophysics Data System (ADS)

    Al-Kharji AL-Ali, Omar; Anani, Nader; Al-Araji, Saleh; Al-Qutayri, Mahmoud; Ponnapalli, Prasad

    2012-02-01

    This article proposes a new architecture for a digital tanlock loop which eliminates the time-delay block. The π/2 (rad) phase shift relationship between the two channels, which is generated by the delay block in the conventional time-delay digital tanlock loop (TDTL), is preserved using two quadrature sampling signals for the loop channels. The proposed system outperformed the original TDTL architecture when both systems were tested with a frequency shift keying input signal. The new system demonstrated better linearity and acquisition speed as well as improved noise performance compared with the original TDTL architecture. Furthermore, the removal of the time-delay block enables all processing to be performed digitally, which reduces the implementation complexity. Both the original TDTL and the new architecture without the delay block were modelled and simulated using MATLAB/Simulink. Implementation issues for both architectures, including complexity and their relation to the simulations, are also addressed.
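
The tanlock principle common to both architectures — recovering the phase error from two channels held in quadrature via an arctangent — can be illustrated briefly. This is a sketch under assumed ideal sampling, not the TDTL hardware design.

```python
import numpy as np

def tan_phase_detector(i_sample, q_sample):
    """With the two loop channels pi/2 rad apart, I = A*cos(theta) and
    Q = A*sin(theta), so the phase error theta is recovered as
    atan2(Q, I), linear over (-pi, pi] and independent of amplitude A."""
    return np.arctan2(q_sample, i_sample)

theta = 0.3                      # assumed true phase error (rad)
i, q = np.cos(theta), np.sin(theta)
err = tan_phase_detector(i, q)   # recovers theta regardless of amplitude
```

This linearity of the arctangent characteristic is what gives tanlock loops their wide lock range; the article's contribution is obtaining the quadrature pair by sampling rather than by a delay block.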

  14. NASA's Space Launch System (SLS) Program: Mars Program Utilization

    NASA Technical Reports Server (NTRS)

    May, Todd A.; Creech, Stephen D.

    2012-01-01

    NASA's Space Launch System is being designed for safe, affordable, and sustainable human and scientific exploration missions beyond Earth's orbit (BEO), as directed by the NASA Authorization Act of 2010 and NASA's 2011 Strategic Plan. This paper describes how the SLS can dramatically change the Mars program's science and human exploration capabilities and objectives. Specifically, through its high-velocity change (delta V) and payload capabilities, SLS enables Mars science missions of unprecedented size and scope. By providing direct trajectories to Mars, SLS eliminates the need for complicated gravity-assist missions around other bodies in the solar system, reducing mission time, complexity, and cost. SLS's large payload capacity also allows for larger, more capable spacecraft or landers with more instruments, which can eliminate the need for complex packaging or "folding" mechanisms. By offering this capability, SLS can enable more science to be done more quickly than would be possible through other delivery mechanisms using longer mission times.
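
The delta-V argument can be made concrete with the Tsiolkovsky rocket equation. The numbers below are hypothetical, chosen only to illustrate the mass-ratio sensitivity; they are not SLS performance figures.

```python
import math

def delta_v(isp_s, m0_kg, mf_kg, g0=9.80665):
    """Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m0 / mf)."""
    return isp_s * g0 * math.log(m0_kg / mf_kg)

# Hypothetical stage: 450 s Isp, 700 t wet mass, 120 t dry mass + payload
dv = delta_v(450.0, 700_000.0, 120_000.0)   # roughly 7.8 km/s
```

Because delta-V grows only logarithmically with the mass ratio, a launcher with a much larger payload capacity can send heavier spacecraft on direct, faster trajectories instead of relying on gravity assists.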

  15. Joint channel estimation and multi-user detection for multipath fading channels in DS-CDMA systems

    NASA Astrophysics Data System (ADS)

    Wu, Sau-Hsuan; Kuo, C.-C. Jay

    2002-11-01

    The technique of joint blind channel estimation and multiple access interference (MAI) suppression for an asynchronous code-division multiple-access (CDMA) system is investigated in this research. To identify and track dispersive time-varying fading channels, and to avoid the phase ambiguity that comes with second-order statistics approaches, a sliding-window scheme using the expectation-maximization (EM) algorithm is proposed. The complexity of joint channel equalization and symbol detection for all users increases exponentially with system loading and channel memory. The situation is exacerbated if strong inter-symbol interference (ISI) exists. To reduce the complexity and the number of samples required for channel estimation, a blind multiuser detector is developed. Together with multi-stage interference cancellation using soft outputs provided by this detector, our algorithm can track fading channels with no phase ambiguity even when channel gains attenuate close to zero.
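
The phase ambiguity of second-order approaches mentioned above follows from the fact that second-order statistics are unchanged by a global phase rotation of the channel. A small numerical check, with an assumed toy channel, makes this explicit:

```python
import numpy as np

rng = np.random.default_rng(0)
s = rng.choice([-1.0, 1.0], size=4096)    # BPSK symbol stream
h = np.array([0.9, 0.4j, 0.1])            # assumed toy multipath channel
x = np.convolve(s, h)                     # noiseless received signal
x_rot = x * np.exp(1j * 0.7)              # same channel, global phase rotation

def acf(x, lags=5):
    """Sample autocorrelation E[x*(n) x(n+l)] for l = 0 .. lags-1."""
    return np.array([np.vdot(x[:len(x) - l], x[l:]) for l in range(lags)]) / len(x)

# Identical second-order statistics: the phase rotation is unobservable,
# which is why second-order blind estimators leave the phase ambiguous
same = np.allclose(acf(x), acf(x_rot))
```

Higher-order or likelihood-based schemes such as the EM approach in the abstract resolve this because their statistics are not invariant under the rotation.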

  16. Linear-algebraic bath transformation for simulating complex open quantum systems

    DOE PAGES

    Huh, Joonsuk; Mostame, Sarah; Fujita, Takatoshi; ...

    2014-12-02

    In studying open quantum systems, the environment is often approximated as a collection of non-interacting harmonic oscillators, a configuration also known as the star-bath model. It is also well known that the star-bath can be transformed into a nearest-neighbor interacting chain of oscillators. The chain-bath model has been widely used in renormalization group approaches. The transformation can be obtained by recursion relations or orthogonal polynomials. Based on a simple linear algebraic approach, we propose a bath partition strategy to reduce the system-bath coupling strength. As a result, the non-interacting star-bath is transformed into a set of weakly coupled multiple parallel chains. Furthermore, the transformed bath model allows complex problems to be practically implemented on quantum simulators, and it can also be employed in various numerical simulations of open quantum dynamics.
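
The basic star-to-chain mapping referred to above can be sketched as a Lanczos tridiagonalization of the bath Hamiltonian, started from the normalized system-bath coupling vector. The frequencies and couplings below are assumptions for illustration, and this sketch produces the standard single chain, not the paper's multiple-parallel-chain partition.

```python
import numpy as np

def star_to_chain(omegas, couplings):
    """Lanczos tridiagonalization of a star bath: `omegas` are oscillator
    frequencies, `couplings` the system-mode couplings.  Returns chain
    site energies (alphas) and nearest-neighbour hoppings (betas); the
    system then couples only to chain site 0, with strength ||couplings||."""
    H = np.diag(np.asarray(omegas, dtype=float))
    q = np.asarray(couplings, dtype=float)
    q = q / np.linalg.norm(q)
    alphas, betas = [], []
    q_prev, beta = np.zeros_like(q), 0.0
    n = H.shape[0]
    for k in range(n):
        w = H @ q - beta * q_prev          # three-term Lanczos recurrence
        alpha = q @ w
        w = w - alpha * q
        alphas.append(alpha)
        beta = np.linalg.norm(w)
        if k == n - 1 or beta < 1e-12:     # Krylov space exhausted
            break
        betas.append(beta)
        q_prev, q = q, w / beta
    return np.array(alphas), np.array(betas)

# Toy 3-mode bath: the resulting chain has the same spectrum as the star
site_e, hop = star_to_chain([1.0, 2.0, 3.0], [0.1, 0.2, 0.3])
```

Because the transformation is unitary on the bath, the tridiagonal chain Hamiltonian is isospectral with the original star bath, which is easy to verify numerically for this toy example.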

  17. Engineering the object-relation database model in O-Raid

    NASA Technical Reports Server (NTRS)

    Dewan, Prasun; Vikram, Ashish; Bhargava, Bharat

    1989-01-01

    Raid is a distributed database system based on the relational model. O-Raid is an extension of the Raid system and will support complex data objects. The design of O-Raid is evolutionary and retains all features of relational database systems together with those of a general-purpose object-oriented programming language. O-Raid has several novel properties. Objects, classes, and inheritance are supported together with a predicate-based relational query language. O-Raid objects are compatible with C++ objects and may be read and manipulated by a C++ program without any 'impedance mismatch'. Relations and columns within relations may themselves be treated as objects with associated variables and methods. Relations may contain heterogeneous objects, that is, objects of more than one class in a certain column, which can individually evolve by being reclassified. Special facilities are provided to reduce the data search in a relation containing complex objects.

  18. Fluorescence enhancement of quercetin complexes by silver nanoparticles and its analytical application

    NASA Astrophysics Data System (ADS)

    Liu, Ping; Zhao, Liangliang; Wu, Xia; Huang, Fei; Wang, Minqin; Liu, Xiaodan

    2014-03-01

    It is found that the plasmon effect of silver nanoparticles (AgNPs) helps to enhance the fluorescence intensity of the quercetin (Qu) and nucleic acids system. Qu exhibited strong fluorescence enhancement when it bound to nucleic acids in the presence of AgNPs. Based on this, a sensitive method for the determination of nucleic acids was developed. The detection limits for the nucleic acids (S/N = 3) were reduced to the ng mL⁻¹ level. The interaction mechanism of the AgNPs-fish sperm DNA (fsDNA)-Qu system was also investigated in this paper. This complex system of Qu and AgNPs was also successfully used for the detection of nucleic acids in agarose gel electrophoresis analysis. Preliminary results indicated that AgNPs also helped to improve sensitivity in the fluorescence image analysis of Qu combined with cellular contents in Arabidopsis thaliana protoplasts.

  19. Expert system development for commonality analysis in space programs

    NASA Technical Reports Server (NTRS)

    Yeager, Dorian P.

    1987-01-01

    This report is a combination of foundational mathematics and software design. A mathematical model of the Commonality Analysis problem was developed and some important properties discovered. The complexity of the problem is described herein and techniques, both deterministic and heuristic, for reducing that complexity are presented. Weaknesses are pointed out in the existing software (System Commonality Analysis Tool) and several improvements are recommended. It is recommended that: (1) an expert system for guiding the design of new databases be developed; (2) a distributed knowledge base be created and maintained for the purpose of encoding the commonality relationships between design items in commonality databases; (3) a software module be produced which automatically generates commonality alternative sets from commonality databases using the knowledge associated with those databases; and (4) a more complete commonality analysis module be written which is capable of generating any type of feasible solution.

  20. Computed intraoperative navigation guidance--a preliminary report on a new technique.

    PubMed

    Enislidis, G; Wagner, A; Ploder, O; Ewers, R

    1997-08-01

    Objective: To assess the value of a computer-assisted three-dimensional guidance system (Virtual Patient System) in maxillofacial operations. Design: Laboratory and open clinical study. Setting: Teaching hospital, Austria. Patients: 6 patients undergoing various procedures including removal of foreign body (n=3) and biopsy, maxillary advancement, and insertion of implants (n=1 each). Interventions: Storage of computed tomographic (CT) pictures on an optical disc, and imposition of intraoperative video images on to these. The resulting display is shown to the surgeon on a micromonitor in his head-up display for guidance during the operations. Main outcome measures: Improved orientation during complex or minimally invasive maxillofacial procedures, making such operations easier and less traumatic. Results: Successful transferral of computed navigation technology into an operating room environment and positive evaluation of the method by the surgeons involved. Conclusion: Computer-assisted three-dimensional guidance systems have the potential for making complex or minimally invasive procedures easier to do, thereby reducing postoperative morbidity.
